Created on 08-19-2016 01:37 AM - edited 09-16-2022 03:35 AM
Hi Community,
I have installed Anaconda on CentOS 6 to use IPython and Jupyter notebooks together with Spark.
When I run pyspark I get the following error:
java.io.IOException: Cannot run program "/home/hadoop/anaconda/bin/": error=13, Permission denied
This is weird, because I start pyspark from the console as user hadoop, and anaconda is in the home directory.
Also, I've set the permissions so that the user hadoop should be able to execute it:
drwxr-xr-x 3 hadoop hadoop 4096 Aug 16 23:46 bin
Anaconda in general is running, and I'm able to execute pyspark using sudo pyspark (but that's not a solution, as ipython is not available for root).
Question: what needs to be set so that the user hadoop can run pyspark using anaconda?
Thanks!!!
Created 08-19-2016 06:03 AM
I solved it myself by installing anaconda again as root in the directory /opt/anaconda and adjusting the environment variables accordingly. That fixed the issue.
Cheers
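For anyone hitting the same error: the post doesn't list the exact variables, but a typical setup after moving Anaconda to /opt/anaconda looks like the sketch below (the paths and the choice of IPython as the driver shell are assumptions based on this thread, not the poster's exact configuration). Note that these variables must point at executables, not at the bin/ directory itself; the trailing slash in the original error message ("/home/hadoop/anaconda/bin/") suggests a directory was being executed, which produces exactly an error=13 Permission denied.

```shell
# Sketch of the environment variables, assuming Anaconda was
# reinstalled under /opt/anaconda (add these to ~/.bashrc or
# /etc/profile.d/ so they apply to the hadoop user).
export PYSPARK_PYTHON=/opt/anaconda/bin/python          # Python used by Spark workers
export PYSPARK_DRIVER_PYTHON=/opt/anaconda/bin/ipython  # interactive driver shell
export PATH=/opt/anaconda/bin:$PATH                     # prefer Anaconda's binaries
```

After sourcing these, running pyspark as the hadoop user should start an IPython-backed Spark shell using the Anaconda interpreter.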
Created 08-19-2016 06:35 AM
I'm happy to see that you resolved the issue. Thank you for updating the post with the solution in case it can assist others. 🙂