Member since: 07-17-2019
Posts: 9
Kudos Received: 0
Solutions: 0
10-14-2019
02:30 AM
Hello @sduraisankar93,

If you are facing this issue, as you said, it is because you have not imported the module. You should check this documentation on how to import HWC and use it:
https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/integrating-hive/content/hive_configure_a_spark_hive_connection.html

If you are using Zeppelin, please check this:
https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/integrating-hive/content/hive_zeppelin_configuration_hivewarehouseconnector.html

Note that in Zeppelin the pyspark configuration may not work, so a workaround is to set (via Ambari) in the zeppelin-env.sh section this configuration:

export SPARK_SUBMIT_OPTIONS="--jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip"

Then, to start a pyspark shell on your machines, launch this command:

pyspark --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip
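Once the shell is up, a minimal sketch of using HWC from PySpark could look like the following (assuming the HWC Python bindings from pyspark_hwc-<version>.zip are on the path and that the HiveServer2 Interactive JDBC URL and the spark.datasource.hive.warehouse.* properties are already configured, e.g. via Ambari; the table name my_acid_table is just a placeholder):

# Minimal sketch: query a Hive managed (ACID) table through the Hive Warehouse Connector.
# Assumes HWC properties (JDBC URL, metastore URI, etc.) are already set in the Spark config.
from pyspark.sql import SparkSession
from pyspark_llap.sql.session import HiveWarehouseSession

spark = SparkSession.builder.appName("hwc-example").getOrCreate()

# Build an HWC session on top of the existing SparkSession
hive = HiveWarehouseSession.session(spark).build()

hive.setDatabase("default")        # switch to the database you want to read from
hive.showTables().show()           # list tables through HiveServer2 Interactive
df = hive.executeQuery("SELECT * FROM my_acid_table LIMIT 10")  # my_acid_table is a hypothetical table
df.show()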
07-18-2019
04:39 PM
@Shu Is Hive Warehouse Connector support for ACID tables available in HDP 3.0, or do I need to upgrade from HDP 3.0 to HDP 3.1?