Configuration setting via Cloudera Manager to set HADOOP_CLASSPATH and HADOOP_USER_CLASSPATH_FIRST
Created on ‎03-21-2017 09:44 PM - edited ‎09-16-2022 04:18 AM
Where do I update HADOOP_USER_CLASSPATH_FIRST=true and HADOOP_CLASSPATH in Cloudera Manager so that it has a cluster-wide effect?
I manually changed the setting in /etc/hadoop/conf/hadoop-env.sh; however, I don't see any effect. Basically, I want to prepend to the classpath when starting HiveServer2 and the metastore. This works fine on HDP when these parameters are set in hadoop-env.sh via Ambari.
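For reference, a minimal sketch of the kind of change made by hand in /etc/hadoop/conf/hadoop-env.sh (the jar path below is a placeholder, not taken from the original post):
# prepend custom jars ahead of the bundled ones for any process sourcing this file
export HADOOP_USER_CLASSPATH_FIRST=true
export HADOOP_CLASSPATH=/opt/custom/jars/*:$HADOOP_CLASSPATH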
Created ‎03-23-2017 04:13 AM
CM > Hive > Configuration > search for "Hive Service Environment Advanced Configuration Snippet (Safety Valve)"
and put the environment settings you want to apply in there.
Cloudera Manager will then ask you to restart the Hive service. This will affect both HS2 and HMS.
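As a hedged illustration (the jar path is a placeholder, and whether an existing $HADOOP_CLASSPATH is expanded in this field is an assumption worth verifying), an environment safety valve generally takes one KEY=VALUE pair per line rather than shell export statements:
HADOOP_USER_CLASSPATH_FIRST=true
HADOOP_CLASSPATH=/opt/custom/jars/*:$HADOOP_CLASSPATH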
Created on ‎03-23-2017 08:40 AM - edited ‎03-23-2017 09:36 AM
I also want to override jars when I run a MapReduce job, i.e. when I run the wordcount example using hadoop as well as yarn. Where do I place these variables in Cloudera Manager?
I tried the following option:
CM > YARN > Configuration > search for "YARN (MR2 Included) Service Environment Advanced Configuration Snippet (Safety Valve)"
This did not help. It still picks up the old jars.
Created ‎03-23-2017 09:48 AM
The Service Environment valve only affects the service's own roles; for clients, use the equivalent "Client Environment" valve.
FWIW, the Cloudera Manager introduction page goes over the terminology it
uses to classify different layers of configuration:
https://www.cloudera.com/documentation/enterprise/latest/topics/cm_intro_primer.html#concept_wfj_tny....
In your case, clients would mean gateways.
P.S. It is typically a bad idea to override system jar versions, as the software has been compiled against the versions it ships with. Any incompatible change to methods, or to the availability of classes, in the version you force the classloader to pick will cause runtime failures (NoSuchMethodError, NoClassDefFoundError, ClassNotFoundException, etc.).
P.P.S. If overriding some pre-included library is the prime purpose, try using Maven shading with namespace relocation:
https://maven.apache.org/plugins/maven-shade-plugin/examples/class-relocation.html
Created ‎03-23-2017 10:10 AM
Which client environment?
I changed the following option:
CM > YARN > Configuration > Gateway Client Environment Advanced Configuration Snippet (Safety Valve) for hadoop-env.sh
It has no effect. It still picks the default library, not the one I am trying to override.
The Service Environment Advanced Configuration works fine in the case of Hive.
Created ‎03-23-2017 10:21 AM
My bad. I forgot to deploy the stale configuration after making the changes.
CM > YARN > Configuration > Gateway Client Environment Advanced Configuration Snippet (Safety Valve) works fine. I am able to override the default jars.
Created ‎03-23-2017 10:21 AM
You need to do a client configuration redeploy to have it go to your local gateway configuration:
https://www.youtube.com/watch?v=4S9H3wftM_0
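One way to verify that the redeploy actually landed on the gateway host (assuming the default client configuration location that Cloudera Manager manages through alternatives):
# check that the override is now present in the deployed client config
grep HADOOP_CLASSPATH /etc/hadoop/conf/hadoop-env.sh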
Created ‎02-01-2019 02:39 PM
@Harsh J How would I do this just for one job?
I tried using the setting below, but it is not working. The issue is that I want to use a version of Jersey that I bundled into my fat jar; however, the gateway node has an older version of that jar, and it loads a class from there, resulting in a NoSuchMethodException. My application is not a MapReduce job; I run it using hadoop jar on 5.14.4.
export HADOOP_USER_CLASSPATH_FIRST=true
export HADOOP_CLASSPATH=/projects/poc/test/config:$HADOOP_CLASSPATH
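A hedged sketch of scoping this to a single invocation rather than the whole gateway (the jar name and main class below are placeholders; note that to override a bundled library, the classpath entry may need to point at the replacement jar itself rather than a config directory, which is an assumption about the setup here):
# these variables apply only to the one client JVM started by this hadoop command
HADOOP_USER_CLASSPATH_FIRST=true \
HADOOP_CLASSPATH=/projects/poc/test/config:$HADOOP_CLASSPATH \
hadoop jar my-fat-app.jar com.example.Main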
