Created 03-18-2016 11:32 PM
Hi,
I can SSH into my HDP cluster servers as the user 'cloudbreak', but I cannot sudo or su to root in order to install the Python modules and packages I need for PySpark. I spent time searching for answers but came up with nothing. Any help appreciated.
Created 03-18-2016 11:54 PM
Cloudbreak runs the cluster services inside Docker containers, so just SSHing into the machine is not enough.
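To make this concrete, here is a rough sketch of what you would see on the host VM. The container name shown is an example only, not taken from an actual Cloudbreak deployment; pick the right container from the `docker ps` output:

```shell
# After SSHing into the host as 'cloudbreak', the Hadoop/Spark processes
# are not on the host itself -- they live inside Docker containers.
docker ps                          # list the running containers

# Open a shell inside one of them (replace <container-id> with a real
# ID or name from the `docker ps` output above)
docker exec -it <container-id> bash
```

Note that anything you install this way lands in the container's writable layer, which is why it can be lost when the container is recreated.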
Created 03-19-2016 11:47 AM
Ahh yes. But the question still remains: how would one install Python packages in the Spark worker containers so that they survive a restart, etc.? I couldn't find any documentation on it.
Created 03-21-2016 01:50 AM
@Darren Govoni I haven't touched Cloudbreak in a while, but version 1.2 was just published and it introduces the concept of recipes: you can add your own custom scripts, jars, etc. to a cluster. I believe you can apply them to an existing cluster as well. Try it out and let us know: http://sequenceiq.com/cloudbreak-docs/latest/recipes/
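For the original question, a recipe would likely look something like the script below. This is a hypothetical sketch, not taken from the Cloudbreak docs: the script name and the package list (`numpy`, `nltk`) are placeholders for whatever your PySpark jobs need. The idea is that Cloudbreak runs the script on every node during provisioning, so the packages are present after restarts too:

```shell
#!/bin/bash
# Hypothetical post-install recipe: install Python packages cluster-wide.
# Recipes run as root on each node, which works around the missing sudo
# access for the 'cloudbreak' SSH user.
set -e

# pip may not be on a fresh HDP node image; bootstrap it if absent
command -v pip >/dev/null 2>&1 || easy_install pip

# Example packages only -- replace with what your PySpark code imports
pip install numpy nltk
```

You would then register this script as a recipe in the Cloudbreak UI (or CLI) and attach it to the host group(s) running the Spark workers, per the recipes documentation linked above.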