Support Questions

Find answers, ask questions, and share your expertise

Can we run jobs for multiple Spark versions in an Oozie workflow?

Explorer
Are we able to run Spark 1 and Spark 2 jobs in the same cluster? Please advise.

Thanks
1 ACCEPTED SOLUTION

Super Collaborator

There is no alternative for the issue at the moment. Some people have tried to work around it by hacking the Oozie shared libs, but that has not really been successful. For now I would recommend that you stick with Spark 1 in Oozie and Spark 2 from the command line.
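For reference, a minimal Oozie Spark (1.x) action looks roughly like the sketch below. The workflow name, application class, and jar path are placeholders for illustration, not details from this thread:

```xml
<workflow-app name="spark1-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="spark-node"/>
  <action name="spark-node">
    <spark xmlns="uri:oozie:spark-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <!-- Runs against the cluster's default (Spark 1) shared libs -->
      <master>yarn-cluster</master>
      <name>MySpark1Job</name>
      <class>com.example.MyApp</class>
      <jar>${nameNode}/user/${wf:user()}/apps/myapp.jar</jar>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Spark job failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
```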

 

Wilfred


4 REPLIES

Super Collaborator

I think you meant to ask: can we run Spark 1 and Spark 2 jobs in the same cluster?

 

Simple answer: yes, you can have both installed; see the docs. You cannot, however, have different minor versions of the same major version in one cluster (e.g. 1.5 and 1.6, or 2.1 and 2.2).
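Assuming a CDH-style setup where Spark 2 ships as a separate parcel with its own `spark2-submit` launcher alongside the Spark 1 `spark-submit`, the two versions are invoked side by side like this (class names and jar paths below are hypothetical):

```shell
# Submit a Spark 1 job with the stock spark-submit (Spark 1.x client)
spark-submit --master yarn --deploy-mode cluster \
  --class com.example.App1 /path/to/app1.jar

# Submit a Spark 2 job with the separately installed spark2-submit
spark2-submit --master yarn --deploy-mode cluster \
  --class com.example.App2 /path/to/app2.jar
```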

 

Wilfred

Explorer
Yes, that is the question I meant.

I have gone through the docs.

Can you suggest any alternative for the known issue below?

https://www.cloudera.com/documentation/spark2/latest/topics/spark2_known_issues.html#ki_oozie_spark_...


Thanks

Super Collaborator

There is no alternative for the issue at the moment. Some people have tried to work around it by hacking the Oozie shared libs, but that has not really been successful. For now I would recommend that you stick with Spark 1 in Oozie and Spark 2 from the command line.

 

Wilfred

Explorer
Thanks for the quick reply.