ImportError: No module named pyspark_llap
Labels: Apache Hive
Created 07-19-2019 04:47 AM
When I import this: from pyspark_llap import HiveWarehouseSession
I get the following error:
ImportError: No module named pyspark_llap
How do I install this module? Is there a step-by-step user guide?
Created 10-14-2019 02:30 AM
Hello @sduraisankar93,
If you are facing this issue, it is because the pyspark_llap module (the Hive Warehouse Connector Python bindings) is not available to your Spark environment, so the import fails.
I believe you should check this documentation on how to import HWC and use it:
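As a rough sketch of how the import and session are typically used once the connector jars are on the Spark classpath (assuming an existing SparkSession named spark and that the HWC properties such as spark.sql.hive.hiveserver2.jdbc.url are already configured; the table name is a placeholder):

from pyspark_llap import HiveWarehouseSession

# Build a HiveWarehouseSession on top of the existing SparkSession
hive = HiveWarehouseSession.session(spark).build()

# Basic sanity checks against the Hive warehouse
hive.showDatabases().show()
hive.executeQuery("SELECT * FROM my_table LIMIT 10").show()  # my_table is a placeholder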
If you are using Zeppelin, please check this:
Note that the pyspark configuration may not work in Zeppelin, so a workaround is to set the following configuration (via Ambari) in the zeppelin-env.sh section:
export SPARK_SUBMIT_OPTIONS="--jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip"
Then, to start a pyspark shell on your machines, launch this command:
pyspark --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip
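Once the shell is up with those jars and zip, the import from the question should resolve. A quick check (assuming the shell's built-in SparkSession spark; the database name is just an example):

from pyspark_llap import HiveWarehouseSession  # should no longer raise ImportError
hive = HiveWarehouseSession.session(spark).build()
hive.setDatabase("default")  # example database name
hive.showTables().show()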
