
How to add jar files through Hive View on Ambari

Contributor

Hello all,

I am very new to Hadoop, and I want to add a couple of jar files so that I can run my queries.

I tried add jar with the paths of the jars, like I do in the terminal, but I get an error:

add jar /tmp/udfs/esri-geometry-api.jar /tmp/udfs/spatial-sdk-hadoop.jar;

Error while compiling statement: FAILED: ParseException line 4:0 cannot recognize input near 'add' 'jar' '/' [ERROR_STATUS]

Since I am a newbie, I don't exactly know how to add them. The path to the jars is right, and I am able to add them successfully from the terminal through the back end.

If someone could explain in detail how to add these jar files, I would be eternally grateful.

1 ACCEPTED SOLUTION

Super Guru

@Srinivas Santhanam

Add a semicolon at the end of each statement and add one jar per statement. Also make sure that your Ambari user is mapped to an OS user that has access to the path:

add jar /tmp/udfs/esri-geometry-api.jar;

add jar /tmp/udfs/spatial-sdk-hadoop.jar;
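Before retrying in the Hive View, a quick shell check can confirm the jars exist and are readable by the OS user. This is just a sketch using the /tmp/udfs paths from the question; adjust as needed:

```shell
# Check that each jar exists and is readable by the current OS user.
for j in /tmp/udfs/esri-geometry-api.jar /tmp/udfs/spatial-sdk-hadoop.jar; do
  if [ -r "$j" ]; then
    echo "OK: $j"
  else
    echo "missing or unreadable: $j"
  fi
done
```

If a jar shows up as missing or unreadable here, ADD JAR will fail for the same reason.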

My suggestion is to place these libraries in HDFS, uploading them as an Ambari user that has HDFS privileges. That way you can access the libraries from any node with a Hive client.

Example:

add jar hdfs://whateverhostname:8020/tmp/esri/esri-geometry-api.jar;
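Getting the jars into HDFS first might look like the following sketch (the /tmp/esri target directory is a placeholder, and the hdfs dfs commands assume a configured Hadoop client run as a user with write access):

```shell
# Hypothetical upload of the local jars into HDFS so any node can reach them.
hdfs dfs -mkdir -p /tmp/esri
hdfs dfs -put /tmp/udfs/esri-geometry-api.jar /tmp/esri/
hdfs dfs -put /tmp/udfs/spatial-sdk-hadoop.jar /tmp/esri/
```

After that, the hdfs:// form of add jar above should resolve from any Hive client.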

If the response addresses your problem, don't forget to vote and accept the best answer. Thank you.


15 REPLIES

Contributor

No luck. Here is what is visible initially; I added the path below those lines and restarted, but I still get the same error.

Folder containing extra libraries required for hive compilation/execution can be controlled by:

if [ "${HIVE_AUX_JARS_PATH}" != "" ]; then
  if [ -f "${HIVE_AUX_JARS_PATH}" ]; then
    export HIVE_AUX_JARS_PATH=${HIVE_AUX_JARS_PATH}
  elif [ -d "/usr/hdp/current/hive-webhcat/share/hcatalog" ]; then
    export HIVE_AUX_JARS_PATH=/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar
  fi
elif [ -d "/usr/hdp/current/hive-webhcat/share/hcatalog" ]; then
  export HIVE_AUX_JARS_PATH=/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar
fi

export METASTORE_PORT={{hive_metastore_port}}

This is the line I added:

export HIVE_AUX_JARS_PATH=/usr/hdp/current/hive-server2/auxlib/esri-geometry-api.jar,/usr/hdp/current/hive-server2/auxlib/spatial-sdk-hadoop.jar

I am not sure whether I am making a mistake somewhere.

Should I try changing hive-site.xml or hive-env.sh directly through the terminal/shell instead? Would that make any difference?
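If the hive-site route is attempted, the usual property is hive.aux.jars.path. A sketch of what the entry could look like is below; the file paths come from the hive-env attempt above, but whether HDP prefers this property over HIVE_AUX_JARS_PATH may vary by version, so treat it as an assumption to verify:

```xml
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/hdp/current/hive-server2/auxlib/esri-geometry-api.jar,file:///usr/hdp/current/hive-server2/auxlib/spatial-sdk-hadoop.jar</value>
</property>
```

In Ambari this would normally be entered under Custom hive-site as a key/value pair rather than raw XML, followed by a service restart.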

I was referring to this blog

http://chetnachaudhari.github.io/2016-02-16/how-to-add-auxiliary-jars-in-hive/

Super Guru

A manual change won't work, because when you restart the service through Ambari the configs get overwritten. I believe the only remaining option is adding it in hive-site.xml through Ambari and restarting the service.

Contributor

I don't see where I can add or edit the property through Ambari, in either Advanced hive-site or Custom hive-site.

Expert Contributor

Contributor

Hi,

Before posting this question I did refer to the above link, and I managed to upload the jar files through the Files view into /tmp/udf. But I don't want to create a temporary function; I just want to add the couple of jars, because the data is geospatial and they are needed for any query to run.

Thanks
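If the goal is to avoid re-running add jar in every session, Hive (0.13 and later) also supports permanent functions whose jars are fetched from HDFS automatically. A hedged sketch follows; the function name, class name, and HDFS paths are assumptions to check against the ESRI spatial-framework-for-hadoop documentation:

```sql
-- Sketch: a permanent function backed by jars stored in HDFS.
-- Class name and paths are illustrative, not confirmed from the thread.
CREATE FUNCTION st_geomfromtext AS 'com.esri.hadoop.hive.ST_GeomFromText'
  USING JAR 'hdfs:///tmp/esri/esri-geometry-api.jar',
        JAR 'hdfs:///tmp/esri/spatial-sdk-hadoop.jar';
```

A permanent function registers once in the metastore, so every new session can use it without an add jar statement.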
