gives classnotfound exceptions


Hi, I've set up apache-atlas-0.9-incubating locally on Ubuntu 16.04 LTS. In order to import Hive metadata into Atlas, I've set up a Hadoop 2.7 cluster with Apache Hive 1.2.2. Now when I execute the shell script (from atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/bin), it fails with various "ClassNotFound" (missing jars) exceptions.

I've tried to fix them by manually placing the missing jar under the folder atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/hook/hive/atlas-hive-plugin-impl/, but every time there is a new missing jar, so I don't think this is the right solution.

Please guide me on this urgently with a very basic-level guide, as I'm a newbie... :)



Re: gives classnotfound exceptions


@Saba Baig

Try installing the Hive client on the Atlas server and running the import from there.

Hive metadata is imported using the import script. The script needs the Hadoop and Hive classpath jars:

* For the Hadoop jars, make sure the environment variable HADOOP_CLASSPATH is set. Alternatively, set HADOOP_HOME to point to the root directory of your Hadoop installation.
* Similarly, for the Hive jars, set HIVE_HOME to the root of the Hive installation.
* Set the environment variable HIVE_CONF_DIR to the Hive configuration directory.
* Copy <atlas-conf>/ to the Hive conf directory.
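As a minimal sketch of the steps above (the install paths are assumptions for a local tarball setup and should be adjusted to your machine), the environment could be prepared and sanity-checked before running the import script:

```shell
#!/bin/sh
# Sketch of the environment the import script expects.
# The install locations below are assumptions; point them at your own setup.
export HADOOP_HOME=/usr/local/hadoop
export HIVE_HOME=/usr/local/hive
export HIVE_CONF_DIR="$HIVE_HOME/conf"

# Simple pre-flight check: report which of the required variables are set.
for name in HADOOP_HOME HIVE_HOME HIVE_CONF_DIR; do
  eval "value=\${$name}"
  if [ -n "$value" ]; then
    echo "OK: $name=$value"
  else
    echo "MISSING: $name"
  fi
done
```

Sourcing something like this (or adding it to .bashrc) before running the script avoids chasing jars one by one, since the script builds its classpath from these variables.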


Re: gives classnotfound exceptions


@Vinod Bonthu

This is my .bashrc file. Here I tried to set CLASSPATH=$CLASSPATH:/usr/local/hadoop/lib/*:. but it says that it's a non-existent directory, whereas in my /usr/local/hadoop folder there is a folder named native which has files with a .so extension. Please guide me on how I can resolve my HADOOP_CLASSPATH variable.

Any help on setting up Hive-client on locally installed Apache Atlas Server would be highly appreciated.

Re: gives classnotfound exceptions


@Saba Baig are you using Apache Hadoop or HDP?

To get the whole Hadoop classpath, you just need to run the following command on your machine. You can copy its output into HADOOP_CLASSPATH:

hadoop classpath

Example: [root@vb-atlas-node2 ~]# hadoop classpath
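A hedged sketch of wiring that command's output into the environment (the fallback jar glob is an assumption about a typical Apache Hadoop 2.x tarball layout):

```shell
#!/bin/sh
# Populate HADOOP_CLASSPATH from `hadoop classpath` when the hadoop
# command is on PATH; otherwise fall back to a jar glob.
# The fallback path is an assumption for a Hadoop 2.x tarball install.
if command -v hadoop >/dev/null 2>&1; then
  HADOOP_CLASSPATH=$(hadoop classpath)
else
  HADOOP_CLASSPATH="/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/common/lib/*"
fi
export HADOOP_CLASSPATH
echo "$HADOOP_CLASSPATH"
```

Note that in an Apache Hadoop 2.x tarball the jars generally live under share/hadoop/ rather than lib/, while lib/native holds the .so files, which would explain the "non-existing directory" error from the earlier CLASSPATH attempt.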

To install the Hadoop client and Hive client, you can go to Ambari and select Hosts -> select the Atlas hostname -> Components -> Add (Hive Client, Hadoop Client).