Member since: 02-19-2016 · Posts: 4 · Kudos Received: 0 · Solutions: 0
03-23-2016 06:01 AM
Hi, Parcels are part of it. But then how do I create and activate virtual envs for specific jobs?
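One common pattern (a sketch, not a definitive answer): install conda at the same path on every node via the parcel, create a per-project env there, and point PySpark at that env's interpreter per job. The env name `project_a` and the `/opt/anaconda` prefix below are hypothetical.

```shell
# Assumption: conda is installed at the same path on all cluster nodes.
# Create a per-project env (hypothetical name "project_a"):
conda create -y -n project_a python=2.7 pandas scikit-learn

# Activate it for one job only by pointing PySpark at its interpreter;
# spark.yarn.appMasterEnv.* sets the same variable on the YARN application master:
PYSPARK_PYTHON=/opt/anaconda/envs/project_a/bin/python \
spark-submit --master yarn \
  --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=/opt/anaconda/envs/project_a/bin/python \
  my_job.py
```

Because `PYSPARK_PYTHON` is set per invocation, two jobs submitted in parallel can use different envs without interfering.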
03-23-2016 05:55 AM
Hi, Like many of you, I guess, I mix PySpark jobs with regular pandas DataFrames and scikit-learn for data science. But I'm sharing the platform with many other data scientists, and we might end up with a library mess. I'd love to have conda envs per project that I could activate separately and in parallel for each job context. Is this manageable? Thanks. Yann
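If installing conda on every node isn't an option, another sketch is to ship the env with the job itself: pack it into an archive and let YARN distribute it. This assumes the `conda-pack` tool is available; the env and file names below are hypothetical.

```shell
# Assumption: conda-pack is installed and an env "project_a" exists locally.
# Pack the whole env into a relocatable archive:
conda pack -n project_a -o project_a.tar.gz

# Ship it with the job; YARN unpacks it under the alias "env" in each
# container's working directory, and PYSPARK_PYTHON points at it there:
PYSPARK_PYTHON=./env/bin/python \
spark-submit --master yarn \
  --archives project_a.tar.gz#env \
  my_job.py
```

The upside is that nothing has to be pre-installed on the workers; the downside is the archive is re-uploaded (or pulled from the distributed cache) for each application.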
02-27-2016 03:30 AM
Hi guys, Thanks for the answers. Would you recommend using Oozie and referencing the Spark jars as in this post, or using the Oozie ShareLib (which would result in a mess if I have, say, 3 Spark versions in that folder)? Cheers, Yann
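One way to sidestep a crowded ShareLib (a sketch only): keep each Spark version's jars in its own HDFS directory and point the workflow at the right one via `oozie.libpath`, disabling the system ShareLib for that job. The HDFS paths below are hypothetical.

```shell
# job.properties fragment (config sketch; paths are hypothetical):
# Use a per-version lib directory instead of the shared one:
oozie.libpath=hdfs:///user/yann/libs/spark-1.6.0
# Don't pull jars from the system ShareLib for this workflow:
oozie.use.system.libpath=false
```

Each workflow then resolves only the jars for the Spark version it was written against, so multiple versions can coexist without one shared folder mixing them.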
02-19-2016 09:02 AM
Hi, Is there a way to have various Spark versions running on the cluster, and to specify which version to use at job startup? Thanks. Cheers, Yann
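Since Spark on YARN is largely client-side, one sketch of an answer is to unpack several Spark builds side by side and select one per job via `SPARK_HOME`. The install paths and version numbers below are assumptions for illustration.

```shell
# Assumption: two Spark builds unpacked side by side on the edge node
# (hypothetical paths), both configured against the same YARN cluster.

# Job A runs on Spark 1.6.0:
export SPARK_HOME=/opt/spark-1.6.0
"$SPARK_HOME/bin/spark-submit" --master yarn my_job.py

# Job B, submitted from another shell, runs on Spark 1.5.2:
export SPARK_HOME=/opt/spark-1.5.2
"$SPARK_HOME/bin/spark-submit" --master yarn other_job.py
```

Each submission ships its own Spark runtime to YARN containers, so the versions don't conflict on the cluster side; only the edge-node installs need to be kept in order.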