Support Questions


pyspark anaconda "permission denied"

Contributor

Hi Community,

I have installed Anaconda on CentOS 6 to use IPython and Jupyter notebooks together with Spark.

When I run pyspark, I get the following error:

java.io.IOException: Cannot run program "/home/hadoop/anaconda/bin/": error=13, Permission denied

This is weird, because I start pyspark on the console as user hadoop, and Anaconda is in that user's home directory.

I have also set the permissions so that user hadoop should be able to execute it:

drwxr-xr-x   3 hadoop hadoop  4096 Aug 16 23:46 bin

Anaconda itself is working, and I am able to run pyspark via sudo pyspark (but that's not a solution, since IPython is not available for root).

Question: what needs to be set so that user hadoop can run pyspark with Anaconda?
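
Interestingly, the path in the error ends in bin/ (the directory itself), and executing a directory fails with error=13 (EACCES) no matter how the permission bits are set, so perhaps something is pointing PySpark at the directory rather than the python binary. A minimal check, assuming PySpark's standard PYSPARK_PYTHON variable is involved:

echo "$PYSPARK_PYTHON"                                   # inspect the current value
export PYSPARK_PYTHON=/home/hadoop/anaconda/bin/python   # point at the binary, not bin/
pyspark                                                  # retry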

 

Thanks!!!

1 ACCEPTED SOLUTION

Contributor

I solved it myself by reinstalling Anaconda as root into /opt/anaconda and adjusting the environment variables accordingly. That resolved the issue.
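
For anyone hitting the same problem, the adjustments might look roughly like this (a minimal sketch, assuming PySpark's standard driver variables; adapt the paths to your install):

export PATH=/opt/anaconda/bin:$PATH                     # put the new install first on PATH
export PYSPARK_PYTHON=/opt/anaconda/bin/python          # interpreter used by the workers
export PYSPARK_DRIVER_PYTHON=/opt/anaconda/bin/jupyter  # or .../bin/ipython for a plain shell
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"            # only needed with the jupyter driver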

Cheers 


2 REPLIES


Community Manager

I'm happy to see that you resolved the issue. Thank you for updating the post with the solution in case it can assist others. 🙂


Cy Jervis, Manager, Community Program