path: root/vnfs/DAaaS/applications/charts/sample-spark-app/values.yaml
author     Rajamohan Raj <rajamohan.raj@intel.com>    2019-03-11 23:53:41 +0000
committer  Rajamohan Mohan Raj <rajamohan.raj@intel.com>    2019-03-13 23:40:21 +0000
commit     b94b8b3ff5f403d9460f97acb7c2a553a42498f7 (patch)
tree       b0fd552f7a24bb6c2ff912fe338369cdd73be4a3 /vnfs/DAaaS/applications/charts/sample-spark-app/values.yaml
parent     e8f7e027283f8630733fb423d834e7d828d0db11 (diff)
Helm charts for Spark and HDFS
Tasks accomplished in this patch:
https://jira.onap.org/browse/ONAPARC-445 - Create a Helm chart for the Spark on K8s operator and add it to the operator.
https://jira.onap.org/browse/ONAPARC-446 - Create Helm charts for HDFS.
https://jira.onap.org/browse/ONAPARC-447 - Create a Spark application Helm chart as part of the application package.
https://jira.onap.org/browse/ONAPARC-448 - Add Anaconda with TensorFlow, Keras, and Horovod support to the Spark image.

Change-Id: Icb4adeaa8a0aa445614f91203d7793e4e4f304c1
Issue-ID: ONAPARC-391
Signed-off-by: Rajamohan Raj <rajamohan.raj@intel.com>
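As a usage note, the defaults in the new values.yaml below can be overridden at install time with a custom values file (for example, helm install -f my-values.yaml ./sample-spark-app). The file name, image tag, and the specific overrides in this sketch are purely illustrative and not part of this patch:

# my-values.yaml (hypothetical override file, for illustration only)
# Pin the image to an assumed fixed tag instead of latest, and relax the pull policy
image: spark-tf-keras-horo:v1
imagePullPolicy: IfNotPresent
# Scale out the Spark executors beyond the single-instance default
executorInstances: 3
executorMemory: 1024m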
Diffstat (limited to 'vnfs/DAaaS/applications/charts/sample-spark-app/values.yaml')
-rw-r--r--    vnfs/DAaaS/applications/charts/sample-spark-app/values.yaml    57
1 file changed, 57 insertions(+), 0 deletions(-)
diff --git a/vnfs/DAaaS/applications/charts/sample-spark-app/values.yaml b/vnfs/DAaaS/applications/charts/sample-spark-app/values.yaml
new file mode 100644
index 00000000..afb48d67
--- /dev/null
+++ b/vnfs/DAaaS/applications/charts/sample-spark-app/values.yaml
@@ -0,0 +1,57 @@
+# Default values for sample-spark-app.
+# This is a YAML-formatted file.
+# Declare variables to be passed into your templates.
+
+
+#===========================KUBERNETES POD RELATED CONFIGs========================
+image: spark-tf-keras-horo:latest
+imagePullPolicy: Never
+restartPolicy: Never
+volumesName: test-volume
+hostpath: /tmp
+hostpathType: Directory
+
+
+
+#============================SPARK APP RELATED CONFIGs=============================
+
+nameOfTheSparkApp: spark-apache-logs2
+# Supported values: Python or Scala.
+programmingLanguageType: Scala
+modeOfSparkApp: cluster
+mainClassOfTheSparkApp: ApacheLogAnalysis
+# Can be an HTTP path, an S3 path, or a MinIO path.
+mainApplicationFileOfTheSparkApp: https://github.com/mohanraj1311/ApacheLogAnalysisJar/raw/master/analysisofapachelogs_2.11-0.1.jar
+argumentsOfTheSparkProgram:
+ - hdfs://hdfs-1-namenode-1.hdfs-1-namenode.hdfs1.svc.cluster.local:8020/data/apache-logs
+
+
+
+#============================SPARK DRIVER RELATED CONFIGs=========================
+driverCores: 0.1
+driverCoreLimit: 200m
+driverMemory: 1024m
+driverVolumeMountsName: test-volume
+driverVolumeMountPath: /tmp
+
+
+
+#============================SPARK EXECUTOR RELATED CONFIGs=======================
+executorCores: 1
+executorInstances: 1
+executorMemory: 512m
+executorVolumeMountsName: test-volume
+executorVolumeMountPath: /tmp
+
+
+
+#===========================HADOOP RELATED CONFIGs===============================
+# Name of the ConfigMap holding the HDFS client configuration.
+hadoopConfigMap: hdfs-1-config
+
+
+###################################################################################
+
+
+
+
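For context, here is a minimal sketch of how a template in this chart (e.g. templates/spark-app.yaml) might consume these values to render a SparkApplication resource for the Spark on K8s operator. The apiVersion, the CRD field layout, and the template file name are assumptions based on the operator's SparkApplication resource and may differ from the actual template shipped in this patch:

# templates/spark-app.yaml (hypothetical sketch, not the template from this patch)
apiVersion: "sparkoperator.k8s.io/v1beta1"   # assumed operator API version
kind: SparkApplication
metadata:
  name: {{ .Values.nameOfTheSparkApp }}
spec:
  type: {{ .Values.programmingLanguageType }}
  mode: {{ .Values.modeOfSparkApp }}
  image: {{ .Values.image }}
  imagePullPolicy: {{ .Values.imagePullPolicy }}
  mainClass: {{ .Values.mainClassOfTheSparkApp }}
  mainApplicationFile: {{ .Values.mainApplicationFileOfTheSparkApp }}
  arguments:
{{ toYaml .Values.argumentsOfTheSparkProgram | indent 4 }}
  hadoopConfigMap: {{ .Values.hadoopConfigMap }}
  volumes:
    - name: {{ .Values.volumesName }}
      hostPath:
        path: {{ .Values.hostpath }}
        type: {{ .Values.hostpathType }}
  driver:
    cores: {{ .Values.driverCores }}
    coreLimit: {{ .Values.driverCoreLimit }}
    memory: {{ .Values.driverMemory }}
    volumeMounts:
      - name: {{ .Values.driverVolumeMountsName }}
        mountPath: {{ .Values.driverVolumeMountPath }}
  executor:
    cores: {{ .Values.executorCores }}
    instances: {{ .Values.executorInstances }}
    memory: {{ .Values.executorMemory }}
    volumeMounts:
      - name: {{ .Values.executorVolumeMountsName }}
        mountPath: {{ .Values.executorVolumeMountPath }}

Fields such as sparkVersion and restartPolicy would typically also appear in the spec, but their exact shape depends on the operator version, so they are omitted from this sketch.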