
Update JDK from 1.7 to 1.8.0_121 on HDP


I want to update the JDK for HDP 2.4 from the standard 1.7 to the current 1.8. I have tried several approaches, a few of which include:

1) Updating the JAVA_HOME and PATH variables in /root/.bashrc to point at the 1.8 JDK location. This works well when I compile a jar locally on 1.8 and bring it to HDP to run on the newly updated JDK; there is no version mismatch.
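For reference, the .bashrc change described above might look like the sketch below. The install path is an assumption; substitute wherever the 1.8 JDK was actually extracted:

```shell
# Hypothetical install location for JDK 1.8.0_121; adjust to your actual path.
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_121
# Put the new JDK's binaries ahead of any older java on the PATH.
export PATH="$JAVA_HOME/bin:$PATH"
```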

2) However, any hadoop jar compiled locally on 1.8 and brought to HDP still fails with a version mismatch, because hadoop-client and other components have their configuration set to /usr/lib/jvm/java. To correct that, I manually changed the symlink at /usr/lib/jvm/ and pointed it as:

cd /usr/lib/jvm/

ln -s $JAVA_HOME java

However, this causes problems when I try to restart HDP: the WebHCat server fails to start.

I have also tried alternatives --config java, pointing it to my custom JDK. However, this doesn't seem to help much in updating the JDK either.
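For context, selecting a custom JDK with alternatives usually needs an --install step first, so the new binary is registered before it can be chosen. A minimal sketch, assuming the 1.8 JDK lives at /usr/lib/jvm/jdk1.8.0_121 (the registration commands are shown commented out because they must run as root):

```shell
# Hypothetical JDK location; adjust to where your 1.8 JDK is installed.
NEW_JDK=/usr/lib/jvm/jdk1.8.0_121
JAVA_BIN="$NEW_JDK/bin/java"

# As root, register the new binary with a high priority, then select it:
#   alternatives --install /usr/bin/java java "$JAVA_BIN" 200
#   alternatives --set java "$JAVA_BIN"
echo "would register: $JAVA_BIN"
```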

Could you please respond as soon as possible?


Super Mentor

@Aditya Kumar Roy

Are you using Ambari to manage your HDP 2.4 cluster?

If you are using Ambari, then changing the JDK version is much simpler. Extract the desired JDK version to the same location on every host, then run "ambari-server setup" on the Ambari server and choose the right JDK path.

# ambari-server setup
Do you want to change Oracle JDK [y/n] (n)? y
[3] - Custom JDK : 3

Then choose Custom JDK, and verify or add the custom JDK path on all hosts in the cluster.
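The "extract the JDK to the same location on every host" step can be scripted from one node. A rough sketch, where the host names and tarball name are placeholders and RUN defaults to a dry run that only prints the scp/ssh commands instead of executing them:

```shell
# Placeholder host list and tarball; substitute your own cluster details.
HOSTS="node1 node2 node3"
TARBALL=jdk-8u121-linux-x64.tar.gz
DEST=/usr/lib/jvm

RUN=echo   # change to RUN="" to actually run the scp/ssh commands
for h in $HOSTS; do
  $RUN scp "$TARBALL" "$h:/tmp/$TARBALL"
  $RUN ssh "$h" "mkdir -p $DEST && tar -xzf /tmp/$TARBALL -C $DEST"
done
```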

Super Mentor

@Aditya Kumar Roy

Also, you mentioned that the "alternatives" option did not work for you. Can you please let us know what issue you faced while using the following command? It should work, and should change the default "java" to point to the correct binary.

 /usr/sbin/alternatives --config java


Hi @Jay SenSharma,

Thanks for all the support. I have already tried both of the steps you mentioned several times in the past. I tried them once more today, but the result was no different. Updating the Ambari JDK and running /usr/sbin/alternatives --config java let me run a normal java jar compiled on 1.8 on HDP. However, in the case of a hadoop jar, I have noticed one thing: in hadoop-client/conf/ and several other configuration files, JAVA_HOME is hard-coded as /usr/lib/jvm/java.
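One way to confirm where JAVA_HOME is hard-coded is to grep the conf directory. The snippet below demonstrates the idea against a throwaway temp copy, so it is safe to run anywhere; on the cluster you would grep the real hadoop-client/conf directory instead:

```shell
# Build a throwaway conf dir that mimics the hard-coded setting described above.
conf=$(mktemp -d)
printf 'export JAVA_HOME=/usr/lib/jvm/java\n' > "$conf/hadoop-env.sh"

# On a real host you would run something like:  grep -rn 'JAVA_HOME' /etc/hadoop/conf
matches=$(grep -rn 'JAVA_HOME' "$conf")
echo "$matches"
rm -rf "$conf"
```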

Now, under /usr/lib/jvm, java is a symlink to java_sdk:

java -> /etc/alternatives/java_sdk

When I trace back through /etc/alternatives/, I find that java_sdk is a symlink to:

java_sdk -> /usr/lib/jvm/java-1.7.0-openjdk.x86_64

And all this after repeating all the steps mentioned above. How can I change this path to get my hadoop jars running properly? I have tried pointing the /usr/lib/jvm/java symlink to my ${JAVA_HOME}, but this doesn't help, since hadoop runs as the yarn user in the hadoop group. How do I update all the permissions successfully?
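To see exactly where a chain like this ends up, the symlinks can be traced with readlink -f. The demo below rebuilds a chain shaped like the one described (java -> java_sdk -> JDK directory) inside a temp directory so it runs anywhere; on the cluster you would simply run readlink -f /usr/lib/jvm/java:

```shell
# Recreate a chain like java -> /etc/alternatives/java_sdk -> <JDK dir> in a temp dir.
tmp=$(mktemp -d)
mkdir -p "$tmp/jvm/java-1.7.0-openjdk.x86_64" "$tmp/etc-alternatives"
ln -s "$tmp/jvm/java-1.7.0-openjdk.x86_64" "$tmp/etc-alternatives/java_sdk"
ln -s "$tmp/etc-alternatives/java_sdk" "$tmp/jvm/java"

# readlink -f follows every link in the chain to the final target.
resolved=$(readlink -f "$tmp/jvm/java")
echo "$resolved"
```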