12-20-2017 04:55 AM - last edited on 12-20-2017 07:07 AM by cjervis
I think you meant to ask whether you can run Spark 1 and Spark 2 jobs in the same cluster?
Simple answer: yes, you can have both installed side by side; see the docs. What you cannot have is two different minor versions of the same major version in one cluster (e.g. 1.5 and 1.6, or 2.1 and 2.2).
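When both are installed, each version keeps its own submit command. A minimal sketch, assuming the default Cloudera binary names (`spark-submit` for the base Spark 1 service, `spark2-submit` from the separate Spark 2 parcel) and a hypothetical job class and jar:

```shell
# Submit a job with Spark 1 (the base CDH service).
# com.example.MyJob and myjob.jar are hypothetical placeholders.
spark-submit --master yarn --class com.example.MyJob myjob.jar

# Submit the same job with Spark 2, which is installed from a
# separate parcel and ships its own spark2-submit wrapper, so the
# two versions never collide on the same command name.
spark2-submit --master yarn --class com.example.MyJob myjob.jar
```

Because the two commands resolve to different installations, the same cluster can run jobs against either version without reconfiguration.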
12-20-2017 08:26 PM
There is no workaround for the issue at the moment. Some people have tried to get around it by hacking the Oozie shared libs, but that has not really been successful. For now I would recommend that you stick with Spark 1 in Oozie and run Spark 2 from the command line.
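For reference, the Spark 1 route that does work today is the standard Oozie Spark action, which picks up the Spark 1 libraries from the Oozie shared lib. A minimal workflow sketch (the workflow name, class, and jar path are hypothetical):

```xml
<workflow-app name="spark1-job" xmlns="uri:oozie:workflow:0.5">
  <start to="spark-node"/>
  <action name="spark-node">
    <!-- The spark action resolves Spark 1 jars from the Oozie sharelib -->
    <spark xmlns="uri:oozie:spark-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <master>yarn-cluster</master>
      <name>MyJob</name>
      <class>com.example.MyJob</class>
      <jar>${nameNode}/apps/myjob.jar</jar>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Spark job failed</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

Spark 2 jobs, meanwhile, are submitted directly with `spark2-submit` outside of Oozie until proper Spark 2 action support is available.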