Created on 03-21-2017 09:44 PM - edited 09-16-2022 04:18 AM
Where do I update HADOOP_USER_CLASSPATH_FIRST=true and HADOOP_CLASSPATH in Cloudera Manager so that it has a cluster-wide effect?
I manually changed the setting in /etc/hadoop/conf/hadoop-env.sh, but I don't see any effect. Basically, I want to prepend to the classpath when starting HiveServer2 and the metastore. This works fine on HDP when these parameters are set in hadoop-env.sh via Ambari.
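For reference, a minimal sketch of what such a hadoop-env.sh snippet does (the /opt/custom/jars path is a hypothetical placeholder):

```shell
# Hypothetical snippet for hadoop-env.sh: prepend custom jars to the classpath.
# HADOOP_USER_CLASSPATH_FIRST=true tells the hadoop launcher script to place
# HADOOP_CLASSPATH entries before its bundled jars instead of after them.
export HADOOP_USER_CLASSPATH_FIRST=true
export HADOOP_CLASSPATH="/opt/custom/jars/*:$HADOOP_CLASSPATH"
echo "$HADOOP_CLASSPATH"   # the custom directory now comes first
```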
Created 03-23-2017 04:13 AM
Created on 03-23-2017 08:40 AM - edited 03-23-2017 09:36 AM
I also want to override jars when I run a MapReduce job, i.e. when I run the wordcount example using hadoop as well as yarn. Where do I place these variables in Cloudera Manager?
I tried the following option:
CM > Yarn > Configuration > search for "YARN (MR2 Included) Service Environment Advanced Configuration Snippet (Safety Valve)"
This did not help. It still picks up the old jars.
Created 03-23-2017 09:48 AM
Created 03-23-2017 10:10 AM
Which client environment?
I changed the following option:
CM > Yarn > Configuration > Gateway Client Environment Advanced Configuration Snippet (Safety Valve) for hadoop-env.sh
It has no effect. It still picks up the default library, not the one I am trying to override.
The Service Environment Advanced Configuration snippet works fine in the case of Hive.
Created 03-23-2017 10:21 AM
My bad. I forgot to deploy the stale client configuration after making the changes.
CM > Yarn > Configuration > Gateway Client Environment Advanced Configuration Snippet (Safety Valve) works fine. I am able to override the default jars.
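For anyone landing here later, the value that goes into that safety valve field is just the environment lines themselves; a sketch, with a hypothetical jar path:

```shell
# Hypothetical contents for the Gateway Client Environment safety valve
# (appended to hadoop-env.sh on deploy of the client configuration):
HADOOP_USER_CLASSPATH_FIRST=true
HADOOP_CLASSPATH=/opt/custom/jars/*:$HADOOP_CLASSPATH
```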
Created 02-01-2019 02:39 PM
@Harsh J How would I do this for just one job?
I tried the settings below, but they are not working. The issue is that I want to use a version of Jersey which I bundled into my fat jar; however, the gateway node has an older version of that jar, and a class gets loaded from there, resulting in a NoSuchMethodException. My application is not a MapReduce job; I run it using hadoop jar, on 5.14.4.
export HADOOP_USER_CLASSPATH_FIRST=true
export HADOOP_CLASSPATH=/projects/poc/test/config:$HADOOP_CLASSPATH
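One way to scope this to a single job rather than exporting it in the shell profile is to set the variables on the command line of that one invocation. A sketch, where `sh -c 'echo …'` stands in for the actual `hadoop jar myapp.jar …` command:

```shell
# Sketch: env vars scoped to a single command only (here 'sh -c' is a
# stand-in for 'hadoop jar myapp.jar ...'); they do not persist in the
# parent shell after the command returns.
HADOOP_USER_CLASSPATH_FIRST=true \
HADOOP_CLASSPATH="/projects/poc/test/config" \
sh -c 'echo "$HADOOP_USER_CLASSPATH_FIRST $HADOOP_CLASSPATH"'
```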