Deploy Spark into Kubernetes Cluster


#1

Hi,

I’m a newbie in the Kubernetes & Spark environment. I’ve been asked to deploy Spark inside Kubernetes so that it can auto-scale horizontally.

The problem is, I can’t deploy the SparkPi example from the official website (https://spark.apache.org/docs/latest/running-on-kubernetes#cluster-mode).

I’ve already followed the instructions, but the pods fail to execute. Here is what happens:

  1. Already running: kubectl proxy
  2. When executing:

spark-submit \
  --master k8s://https://localhost:6445 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=5 \
  --conf spark.kubernetes.container.image=xnuxer88/spark-kubernetes-bash-test-entry:v1 \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.3.2.jar

I get the error:
Error: Could not find or load main class org.apache.spark.examples.SparkPi

  3. When I check the Docker image (by creating a container from the related image), I find the file.

Is there any missing instruction that I forgot to follow?
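In case it matters, here is a sketch of how I double-check the API server address I pass to --master (the note about kubectl proxy is an assumption based on its default port, 8001):

```shell
# Print the address the API server is reachable at; the --master value for
# spark-submit is this URL prefixed with k8s://.
if command -v kubectl >/dev/null 2>&1; then
  kubectl config view --minify -o jsonpath='{.clusters[0].cluster.server}'
fi

# Assumption: when going through `kubectl proxy`, the master would instead be
# the proxy endpoint, which defaults to plain HTTP on port 8001.
PROXY_MASTER=k8s://http://127.0.0.1:8001
```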

Please help.

Thank you.

Link : https://stackoverflow.com/questions/52623435/deploy-spark-into-kubernetes-cluster


#2

Heya!

Did you by chance see this similar issue? https://stackoverflow.com/questions/51467082/sparkpi-on-kubernetes-could-not-find-or-load-main-class

Is it possible the JAR file isn’t actually present in the image? I might try using their spark-submit command and see if that moves you along. :)
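If you want to rule that out, something like this lists what is actually baked into the image (image name and path copied from your command; overriding the entrypoint with `ls` is just so nothing else runs):

```shell
IMAGE=xnuxer88/spark-kubernetes-bash-test-entry:v1
JARS_DIR=/opt/spark/examples/jars

# List the jar directory inside the image without starting Spark; if the
# expected jar isn't here, the driver can't find the SparkPi class.
if command -v docker >/dev/null 2>&1; then
  docker run --rm --entrypoint ls "$IMAGE" "$JARS_DIR"
fi
```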


#3

Hi Jeffy,

Still the same.

Command:

spark-submit \
  --master k8s://https://localhost:6445 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=5 \
  --conf spark.kubernetes.container.image=xnuxer88/spark-kubernetes-bash-test-entry:v1 \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=default \
  https://github.com/JWebDev/spark/raw/master/spark-examples_2.11-2.3.1.jar

Thank you.


#4

Additional image:


#5


#6

Hi,

I think you should use the path local:///opt/spark/jars/abcd.jar when trying to run your Spark jobs in a Kubernetes cluster.
The thing is, the script that ships with the Spark installation copies all the jars under spark-2.3.2/examples/jars/ (e.g. abcd.jar) to the /opt/spark/jars location inside the image.
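As a sketch, here is your submit command re-pointed at that path (the jar file name is my guess based on your 2.3.2 build, so check the actual name inside your image first):

```shell
# Assumed jar location inside the image, per the note above; verify it with
#   docker run --rm --entrypoint ls <image> /opt/spark/jars
APP_JAR=local:///opt/spark/jars/spark-examples_2.11-2.3.2.jar

if command -v spark-submit >/dev/null 2>&1; then
  spark-submit \
    --master k8s://https://localhost:6445 \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=xnuxer88/spark-kubernetes-bash-test-entry:v1 \
    "$APP_JAR"
fi
```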

Also, before running this, can you build your Docker image using the script given in the installation directory/bin/docker.sh?
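For what it’s worth, in the Spark 2.3.x distribution I believe the bundled script is bin/docker-image-tool.sh; a sketch of building and pushing the image with it (SPARK_HOME and the repository name are placeholders, substitute your own):

```shell
SPARK_HOME=/opt/spark-2.3.2   # assumption: your unpacked Spark distribution
REPO=docker.io/yourrepo       # placeholder: your registry/namespace
TAG=v1

if [ -x "$SPARK_HOME/bin/docker-image-tool.sh" ]; then
  # Builds the Spark image from the distribution's bundled Dockerfile, then
  # pushes it so the Kubernetes nodes can pull it.
  "$SPARK_HOME/bin/docker-image-tool.sh" -r "$REPO" -t "$TAG" build
  "$SPARK_HOME/bin/docker-image-tool.sh" -r "$REPO" -t "$TAG" push
fi
```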