
import-hive.sh gives classnotfound exceptions


New Contributor

Hi, I've set up apache-atlas-0.9-incubating locally on Ubuntu 16.04 LTS. To import Hive metadata into Atlas, I've set up a Hadoop 2.7 cluster with Apache Hive 1.2.2. Now when I execute the shell script (import-hive.sh) from atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/bin, it gives me various "ClassNotFound" (missing jars) exceptions.

I've tried to fix them by manually placing the jars under the folder atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/hook/hive/atlas-hive-plugin-impl/, but every time there is a new missing jar, so I don't think this is the right solution.

Please guide me on this urgently with a very basic-level guide, as I'm a newbie... :)

Thanks

3 REPLIES

Re: import-hive.sh gives classnotfound exceptions

Contributor

@Saba Baig

Try installing the Hive client on the Atlas server and running import-hive.sh from the Atlas server.

Hive metadata is imported using the import-hive.sh command. The script needs the Hadoop and Hive classpath jars:

* For Hadoop jars, please make sure that the environment variable HADOOP_CLASSPATH is set. Another way is to set HADOOP_HOME to point to the root directory of your Hadoop installation.
* Similarly, for Hive jars, set HIVE_HOME to the root of the Hive installation.
* Set the environment variable HIVE_CONF_DIR to the Hive configuration directory.
* Copy <atlas-conf>/atlas-application.properties to the Hive conf directory.
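Putting the steps together, a minimal sketch for a local tarball install (the /usr/local/hadoop and /usr/local/hive paths are assumptions, adjust them to your layout):

export HADOOP_HOME=/usr/local/hadoop
export HADOOP_CLASSPATH=$("$HADOOP_HOME"/bin/hadoop classpath)
export HIVE_HOME=/usr/local/hive
export HIVE_CONF_DIR="$HIVE_HOME"/conf

# copy the Atlas client config (the conf/ directory of your Atlas
# distribution, i.e. <atlas-conf> above) into the Hive conf directory
cp atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/conf/atlas-application.properties "$HIVE_CONF_DIR"/

# then run the import from the Atlas bin directory
cd atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/bin
./import-hive.sh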

Re: import-hive.sh gives classnotfound exceptions

New Contributor

@Vinod Bonthu

This is my .bashrc file (the relevant lines are sketched below). I tried to set CLASSPATH=$CLASSPATH:/usr/local/hadoop/lib/*:. but it says that it's a non-existent directory, whereas in my /usr/local/hadoop folder there is a folder named native which has files with the .so extension. Please guide me on how I can resolve my HADOOP_CLASSPATH variable?
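(A sketch; only the CLASSPATH line is the one in question, the HADOOP_HOME and PATH lines are the usual setup.)

export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
# this is the line that fails with "non-existent directory"
export CLASSPATH=$CLASSPATH:/usr/local/hadoop/lib/*:.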

Any help on setting up the Hive client on a locally installed Apache Atlas server would be highly appreciated.

Re: import-hive.sh gives classnotfound exceptions

Contributor

@Saba Baig are you using Apache Hadoop or HDP?

To get the whole Hadoop classpath, you just need to run the following command on your machine. You can copy the output and assign it to HADOOP_CLASSPATH:

hadoop classpath

Example: [root@vb-atlas-node2 ~]# hadoop classpath
/usr/hdp/2.6.0.3-8/hadoop/conf:/usr/hdp/2.6.0.3-8/hadoop/lib/*:/usr/hdp/2.6.0.3-8/hadoop/.//*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/./:/usr/hdp/2.6.0.3-8/hadoop-hdfs/lib/*:/usr/hdp/2.6.0.3-8/hadoop-hdfs/.//*:/usr/hdp/2.6.0.3-8/hadoop-yarn/lib/*:/usr/hdp/2.6.0.3-8/hadoop-yarn/.//*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/lib/*:/usr/hdp/2.6.0.3-8/hadoop-mapreduce/.//*::mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/2.6.0.3-8/tez/*:/usr/hdp/2.6.0.3-8/tez/lib/*:/usr/hdp/2.6.0.3-8/tez/conf
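To set the variable from that output in one step (for example in your .bashrc):

export HADOOP_CLASSPATH=$(hadoop classpath)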

To install the Hadoop client and Hive client, you can go to Ambari and select Hosts -> select the Atlas hostname -> Components -> Add (Hive Client, Hadoop Client).

(screenshot: Ambari host Components page)
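If you prefer to script this instead of clicking through the UI, the same can be done with Ambari's REST API. A sketch, where the cluster name c1, the host vb-atlas-node2, the server ambari-server:8080, and the admin:admin credentials are all placeholders for your own values (HDFS_CLIENT works the same way for the Hadoop client):

# register the Hive client component on the Atlas host
curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  http://ambari-server:8080/api/v1/clusters/c1/hosts/vb-atlas-node2/host_components/HIVE_CLIENT

# then ask Ambari to install it
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"HostRoles": {"state": "INSTALLED"}}' \
  http://ambari-server:8080/api/v1/clusters/c1/hosts/vb-atlas-node2/host_components/HIVE_CLIENT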
