Created 02-23-2016 02:03 PM
Hi,
I got the error below:
Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, ip-internal): java.io.IOException: Cannot run program "/home/centos/apps/pyspark/venv/bin/python": error=13, Permission denied
while running:
$ export HADOOP_USER_NAME=centos
$ export PYSPARK_PYTHON=/home/centos/apps/pyspark/venv/bin/python
$ pyspark --master yarn-client
I added the user centos to the hdfs group in Hadoop.
I'm using the latest HDP 2.3.4 and CentOS 7.2.
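For context, error=13 on the interpreter usually means the user YARN launches containers as (e.g. yarn) cannot traverse one of the parent directories, even when the binary itself is world-executable. A quick way to check is to list the mode of every component along the path; this is only a sketch against a throw-away directory, since /home/centos exists only on the cluster nodes:

```shell
# Stand-in for /home/centos/apps/pyspark/venv/bin/python; on a real
# NodeManager you would run the loop against the actual path.
root=$(mktemp -d)
mkdir -p "$root/apps/pyspark/venv/bin"
touch "$root/apps/pyspark/venv/bin/python"
chmod 755 "$root/apps/pyspark/venv/bin/python"
# Every directory on the way down needs the x (traverse) bit for the
# container user, or exec() fails with error=13 (Permission denied).
p="$root/apps/pyspark/venv/bin/python"
while [ "$p" != "/" ]; do
  ls -ld "$p"
  p=$(dirname "$p")
done
```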
Created 02-23-2016 05:58 PM
I was able to fix this issue by setting /home/centos to "775" so it is traversable. @Ian Roberts
Created 02-23-2016 02:04 PM
What are the permissions on /home/centos/apps/pyspark/venv/bin/python?
Created 02-23-2016 02:10 PM
Hi @Ian Roberts,
drwxrwxrwx 4 centos centos 57 Feb 23 13:22 pyspark
These are the permissions on the pyspark venv ("777"),
and I installed the pyspark venv on every NodeManager with the same permissions.
Created 02-23-2016 02:22 PM
What are the permissions on python?
ls -ltr /home/centos/apps/pyspark/venv/bin
Created 02-23-2016 03:49 PM
Here are the permissions on python, @Ian Roberts:
-rwxrwxrwx 1 centos centos 7136 Feb 23 13:22 python
Created 10-19-2018 01:25 PM
In my case, I had previously pointed PYSPARK_PYTHON at my Anaconda installation.
Reverting it to /usr/bin/python2.7 fixed the problem.
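A minimal sketch of that change, to run before launching pyspark again (assuming /usr/bin/python2.7 is the system interpreter on your nodes):

```shell
# Point PySpark back at the system interpreter instead of Anaconda.
export PYSPARK_PYTHON=/usr/bin/python2.7
echo "$PYSPARK_PYTHON"
```

Note that PYSPARK_PYTHON must resolve to a readable, executable interpreter on every NodeManager, not just the machine you launch from.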