Created 06-02-2016 11:04 AM
File "/opt/mapr/spark/spark-1.6.1/python/lib/pyspark.zip/pyspark/mllib/__init__.py", line 25, in <module>
ImportError: No module named numpy
Created 06-02-2016 11:09 AM
numpy is missing here. Install it with pip install numpy.
Created 06-02-2016 11:11 AM
I have already installed numpy, and it works fine from the Python console. I tried setting the Python environment variable in spark-env.sh, but that did not work.
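For pointing Spark at a specific interpreter, the variables Spark reads from conf/spark-env.sh are PYSPARK_PYTHON (executors) and PYSPARK_DRIVER_PYTHON (driver), not PYTHONPATH. A minimal sketch, assuming numpy was installed for /usr/bin/python2.7:

```shell
# conf/spark-env.sh — sketch; the interpreter path is an assumption for this setup
export PYSPARK_PYTHON=/usr/bin/python2.7          # interpreter used by the executors
export PYSPARK_DRIVER_PYTHON=/usr/bin/python2.7   # interpreter used by the driver
```

Both variables must point at an interpreter that can already import numpy on every node.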
Created 06-02-2016 11:41 AM
Do you have multiple versions of Python installed on your machine, or are you working with a Python test environment (virtualenv)? What is your PYTHONPATH?
Created 06-02-2016 12:18 PM
Nope, I have only one Python, 2.7.5, and:
python: /usr/bin/python /usr/bin/python2.7 /usr/bin/python2.7-config /usr/lib/python2.7 /usr/lib64/python2.7 /etc/python /usr/include/python2.7 /usr/share/man/man1/python.1.gz
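One way to see which interpreter and module search path PySpark actually ends up with is to run a small probe from the pyspark shell (or via spark-submit). This is a generic sketch; nothing in it is specific to this cluster:

```python
# Print the running interpreter and its module search path, then try numpy.
import sys

print("interpreter:", sys.executable)
for p in sys.path:
    print("search path:", p)

try:
    import numpy
    print("numpy", numpy.__version__, "from", numpy.__file__)
except ImportError:
    print("numpy is NOT importable from this interpreter")
```

If the interpreter printed here differs from the one where you installed numpy, that mismatch is the cause of the ImportError.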
Created 09-01-2016 09:56 AM
I am facing the same problem. I have installed numpy on all the nodes, and I am running on YARN. In the directory /usr/bin I see python, python2, and python2.7, but only python2.7 is green in the listing. echo $PYTHONPATH gave me an empty string. Afterwards, I executed export PYTHONPATH=/usr/bin/python2.7 on each node, but my job submission still exits with 'No module named numpy'. Any help?
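Note that PYTHONPATH is a list of module directories that gets prepended to sys.path, not the path of an interpreter, so exporting /usr/bin/python2.7 there has no effect. A small sketch (the /tmp/mylibs directory is hypothetical) shows the semantics:

```python
# PYTHONPATH entries become sys.path entries (module search directories),
# so pointing PYTHONPATH at an interpreter binary does nothing useful.
import os
import subprocess
import sys

env = dict(os.environ, PYTHONPATH="/tmp/mylibs")  # hypothetical directory
out = subprocess.check_output(
    [sys.executable, "-c", "import sys; print(sys.path)"], env=env
)
print(b"/tmp/mylibs" in out)  # → True: the directory shows up on sys.path
```

To change which interpreter Spark launches, use PYSPARK_PYTHON instead.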
Created 09-01-2016 11:38 AM
Please check the permissions on the Python installation directories and verify that your current user has the correct access.
Also try to reproduce the scenario as the root user; as root it should work.
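A quick way to check the permission theory from the Spark user's account is to test whether that user can traverse and read the directory where numpy lives. The site-packages path below is a typical guess for this system and should be adjusted per node:

```python
# Hedged sketch: can the current user read/traverse the directory
# where numpy would be installed? (path is an assumption)
import os

site_dir = "/usr/lib/python2.7/site-packages"
readable = os.access(site_dir, os.R_OK | os.X_OK)
print(site_dir, "readable:", readable)
```

If this prints False for the user running the Spark executors, the install is root-only and the import will fail even though root can see the module.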
Created 02-10-2019 11:03 PM
As @Bhupendra Mishra indirectly pointed out, make sure to run the pip install numpy command from a root account (sudo does not suffice), after forcing the umask to 022 (umask 022) so that the read permissions cascade to the Spark (or Zeppelin) user.
Also, be aware that numpy needs to be installed on each and every worker node, and even on the master itself (depending on your component placement).
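To see why the umask matters, here is a small sketch: files created under umask 022 come out world-readable (mode 0644), which is what lets a non-root spark or zeppelin user read a numpy that root installed:

```python
# Demonstrate the effect of umask 022 on newly created files.
import os
import stat
import tempfile

old = os.umask(0o022)           # force umask 022, remember the previous value
try:
    d = tempfile.mkdtemp()
    path = os.path.join(d, "demo.txt")
    with open(path, "w") as f:  # created with mode 0666 & ~022 = 0644
        f.write("x")
    mode = stat.S_IMODE(os.stat(path).st_mode)
    print(oct(mode))            # → 0o644: group and others can read
finally:
    os.umask(old)               # restore the original umask
```

With a stricter umask like 077, the same install would produce files only root can read, reproducing the ImportError for every other user.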