Hello @sduraisankar93,
If you are facing this issue, it is because, as you said, you have not imported the module.
I believe you should check this documentation on how to import and use HWC:
https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/integrating-hive/content/hive_configure_a_spar...
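For reference, the import in question looks like this in PySpark (a minimal sketch based on that doc; it assumes the connector jar is already on the classpath and the HWC Spark properties, such as the HiveServer2 JDBC URL, are already set):

from pyspark_llap import HiveWarehouseSession
# Build an HWC session on top of the existing SparkSession (spark)
hive = HiveWarehouseSession.session(spark).build()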
If you are using Zeppelin, please check this:
https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/integrating-hive/content/hive_zeppelin_configu...
Note that in Zeppelin the pyspark configuration may not work, so a workaround is to set this configuration in the zeppelin-env.sh section (via Ambari):
export SPARK_SUBMIT_OPTIONS="--jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip"
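After saving this and restarting Zeppelin from Ambari, you can sanity-check in a %pyspark paragraph that the jar was really picked up (a quick check of my own, not from the doc):

%pyspark
# Should list the hive-warehouse-connector-assembly jar if SPARK_SUBMIT_OPTIONS was applied
print(sc.getConf().get("spark.jars", ""))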
Then, to start a pyspark shell on your machine, launch this command:
pyspark --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip
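Once the shell is up, something like this should confirm that HWC works end to end (my_db and my_table are placeholders, and executeQuery assumes Interactive Query/LLAP is enabled):

from pyspark_llap import HiveWarehouseSession
hive = HiveWarehouseSession.session(spark).build()
# List databases through HWC as a smoke test
hive.showDatabases().show()
# Read a managed Hive table (placeholder names)
hive.executeQuery("SELECT * FROM my_db.my_table LIMIT 10").show()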