Hello all,
I am trying to run a Spark job using spark-submit with a Docker image on YARN.
I followed the instructions in the blog post provided by Cloudera at the following link:
https://blog.cloudera.com/introducing-apache-spark-on-docker-on-top-of-apache-yarn-with-cdp-datacent...
and I ran into an error that I couldn't find an answer to.
Note: I already did all the configuration required by the post.
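For reference, the YARN Docker runtime settings I applied (following the blog post) look roughly like this — the registry name and mount paths are from my own setup, so treat them as placeholders rather than exact values:

```
# yarn-site.xml: enable the Docker Linux container runtime on the NodeManagers
yarn.nodemanager.runtime.linux.allowed-runtimes=default,docker

# container-executor.cfg: [docker] section on each NodeManager
[docker]
  module.enabled=true
  docker.binary=/usr/bin/docker
  # registries that containers may be pulled from (mine plus Docker Hub's default)
  docker.trusted.registries=library,faresdev8
  # read-only bind mounts must be whitelisted here to match the
  # YARN_CONTAINER_RUNTIME_DOCKER_MOUNTS values passed to spark-submit
  docker.allowed.ro-mounts=/etc/passwd,/etc/hadoop,/opt/cloudera/parcels/,/data1/opt/cloudera/parcels/
```

My understanding is that every path listed in the spark-submit mount string has to appear in docker.allowed.ro-mounts (or rw-mounts), otherwise the container launch is rejected — please correct me if I've misread the docs.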
I ran this command:
spark-submit \
--master yarn \
--deploy-mode cluster \
--conf spark.yarn.appMasterEnv.YARN_CONTAINER_RUNTIME_TYPE=docker \
--conf spark.yarn.appMasterEnv.YARN_CONTAINER_RUNTIME_DOCKER_IMAGE=faresdev8/python3:v5 \
--conf spark.yarn.appMasterEnv.YARN_CONTAINER_RUNTIME_DOCKER_MOUNTS="/etc/passwd:/etc/passwd:ro,/etc/hadoop:/etc/hadoop:ro,/opt/cloudera/parcels/:/opt/cloudera/parcels/:ro,/data1/opt/cloudera/parcels/:/data1/opt/cloudera/parcels/:ro" \
--conf spark.executorEnv.YARN_CONTAINER_RUNTIME_TYPE=docker \
--conf spark.executorEnv.YARN_CONTAINER_RUNTIME_DOCKER_IMAGE=faresdev8/python3:v5 \
--conf spark.executorEnv.YARN_CONTAINER_RUNTIME_DOCKER_MOUNTS="/etc/passwd:/etc/passwd:ro,/etc/hadoop:/etc/hadoop:ro,/opt/cloudera/parcels/:/opt/cloudera/parcels/:ro,/data1/opt/cloudera/parcels/:/data1/opt/cloudera/parcels/:ro" \
ols.py
And this is the error I get:

Sometimes it gives me exit code 29. I don't understand what the problem is, especially since I followed the instructions properly.