Created on 12-18-2017 07:17 AM - edited 09-16-2022 05:39 AM
Created 12-20-2017 06:40 PM
I think you meant to ask: can we run Spark 1 and Spark 2 jobs in the same cluster?
Simple answer: yes, you can have both installed; see the docs. What you cannot have is two different minor versions of the same major version in one cluster (e.g. 1.5 and 1.6, or 2.1 and 2.2).
Wilfred
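As a minimal illustration (the binary names assume Cloudera's Spark 2 parcel, which installs spark2-* commands alongside the stock Spark 1 spark-* ones):

    spark-submit --version     # stock Spark 1 (e.g. 1.6.x)
    spark2-submit --version    # Spark 2 from the separate parcel (e.g. 2.2.x)

A job picks its Spark version simply by which launcher it is submitted with.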
Created 12-20-2017 08:26 PM
There is no fix for this issue at the moment. Some people have tried to work around it by hacking the Oozie shared libs, but that has not really been successful. For now I would recommend that you stick with Spark 1 in Oozie and Spark 2 from the command line.
Wilfred
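A rough sketch of that split (the Oozie URL, job.properties, and the application class/jar are placeholders, and spark2-submit again assumes the Spark 2 parcel):

    # Spark 1 job: keep submitting it through the existing Oozie workflow
    oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run

    # Spark 2 job: launch it directly from the command line instead
    spark2-submit --master yarn --deploy-mode cluster \
        --class com.example.MyApp myapp.jar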