Hi, I've set up apache-atlas-0.9-incubating locally on Ubuntu 16.04 LTS. To import Hive metadata into Atlas, I've set up a Hadoop 2.7 cluster with Apache Hive 1.2.2. Now, when I execute the shell script (import-hive.sh) from atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/bin, it throws various "ClassNotFound" (missing jar) exceptions.
I've tried to fix them by manually placing the jars under atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/hook/hive/atlas-hive-plugin-impl/, but every time a different jar turns up missing, so I don't think this is the right solution.
Please guide me with a very basic walkthrough, as I'm a newbie... :)
Try installing the Hive client on the Atlas server and running import-hive.sh from there.
Hive metadata is imported using the import-hive.sh command. The script needs the Hadoop and Hive classpath jars:
* For the Hadoop jars, make sure the environment variable HADOOP_CLASSPATH is set. Alternatively, set HADOOP_HOME to point to the root directory of your Hadoop installation.
* Similarly, for the Hive jars, set HIVE_HOME to the root of the Hive installation.
* Set the environment variable HIVE_CONF_DIR to the Hive configuration directory.
* Copy <atlas-conf>/atlas-application.properties to the Hive conf directory.
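The steps above can be sketched as a small setup script. The installation paths below are assumptions for a typical local install (Hadoop and Hive under /usr/local, Atlas built from source as in the question); adjust them to your own layout:

```shell
# Hedged sketch of the environment setup before running import-hive.sh.
# All install paths are assumptions -- adjust them to your machine.
export HADOOP_HOME=/usr/local/hadoop         # root of the Hadoop installation
export HIVE_HOME=/usr/local/hive             # root of the Hive installation
export HIVE_CONF_DIR="$HIVE_HOME/conf"       # Hive configuration directory

# Make the Hadoop jars visible to the script
export HADOOP_CLASSPATH=$("$HADOOP_HOME/bin/hadoop" classpath)

# Copy the Atlas client config into the Hive conf directory
cp atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/conf/atlas-application.properties "$HIVE_CONF_DIR/"

# Now run the import
atlas/distro/target/apache-atlas-0.9-incubating-SNAPSHOT/bin/import-hive.sh
```

This is an environment-setup fragment, so there is nothing to unit-test; the quick sanity check is that `echo $HADOOP_CLASSPATH` prints a non-empty, colon-separated list of jar paths before you run the import.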
This is my .bashrc file. Here I tried to set CLASSPATH=$CLASSPATH:/usr/local/hadoop/lib/*:. but it says that it's a non-existent directory, even though my /usr/local/hadoop folder contains a folder named native which has files with a .so extension. Please guide me on how I can resolve my HADOOP_CLASSPATH variable.
Any help on setting up Hive-client on locally installed Apache Atlas Server would be highly appreciated.
@Saba Baig are you using apache hadoop ? or HDP ?
To get the whole Hadoop classpath, you just need to run the following command on your machine. You can copy the output and pass it to HADOOP_CLASSPATH.
Example:
[root@vb-atlas-node2 ~]# hadoop classpath
/usr/hdp/<hdp-version>/hadoop/conf:/usr/hdp/<hdp-version>/hadoop/lib/*:/usr/hdp/<hdp-version>/hadoop/.//*:/usr/hdp/<hdp-version>/hadoop-hdfs/./:/usr/hdp/<hdp-version>/hadoop-hdfs/lib/*:/usr/hdp/<hdp-version>/hadoop-hdfs/.//*:/usr/hdp/<hdp-version>/hadoop-yarn/lib/*:/usr/hdp/<hdp-version>/hadoop-yarn/.//*:/usr/hdp/<hdp-version>/hadoop-mapreduce/lib/*:/usr/hdp/<hdp-version>/hadoop-mapreduce/.//*::mysql-connector-java-5.1.17.jar:mysql-connector-java.jar:/usr/hdp/<hdp-version>/tez/*:/usr/hdp/<hdp-version>/tez/lib/*:/usr/hdp/<hdp-version>/tez/conf
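Rather than copying the long output by hand, the command's output can be captured directly with command substitution. A sketch, assuming the hadoop binary is on your PATH:

```shell
# Capture the output of `hadoop classpath` straight into HADOOP_CLASSPATH,
# instead of copy-pasting the long colon-separated list by hand.
export HADOOP_CLASSPATH=$(hadoop classpath)

# To make this survive new shells, append the same line to ~/.bashrc
# (single quotes keep the command substitution from running until login).
echo 'export HADOOP_CLASSPATH=$(hadoop classpath)' >> ~/.bashrc
```

This also answers the .bashrc question above: point the variable at the computed classpath rather than at /usr/local/hadoop/lib/*, which on Apache Hadoop holds the native .so libraries, not the jars.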
To install the Hadoop client and Hive client, you can go to Ambari and select Hosts -> select the Atlas hostname -> Components -> Add (Hive Client, Hadoop Client).