To upgrade the JDK across all HDP services on all nodes using Ambari, the following instructions have proven helpful. Note that this requires downtime on ALL HDP components.

a. Use 'Stop All' from the Actions button on the main Ambari dashboard to shut down all HDP components.

b. Using clush, pdsh, or your cluster scripting tool of choice, confirm that ALL Java processes are fully stopped across ALL cluster nodes (excluding Ambari), killing any orphaned processes that Ambari was unable to stop.
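The per-node check in step (b) can be sketched as follows. This is a minimal sketch: the pdsh host list shown in the comment is an assumption, and excluding Ambari's own JVMs via a simple `grep -v ambari` is a heuristic to adapt to your environment.

```shell
# Filter `ps -ef` output for java processes that are not Ambari's own JVMs.
# Fan this out across the cluster with pdsh/clush, e.g. (host list assumed):
#   pdsh -w node[01-04] "ps -ef | grep '[j]ava' | grep -v ambari"
stray_java() {
    # '[j]ava' keeps grep from matching its own process entry.
    grep '[j]ava' | grep -v ambari
}

ps -ef | stray_java || echo "no stray java processes found"
```

Any line this prints is a leftover JVM that should be killed before reconfiguring Ambari.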

c. Run the following on the Ambari master node (as root) to stop the ambari-agent and ambari-server processes:
service ambari-agent stop 
service ambari-server stop 

d. Run the following on the Ambari server master node (as root), following the prompts (an example session is shown below):
ambari-server setup
e.g.: 
[root@sandbox conf]# ambari-server setup 
Using python /usr/bin/python2.6 
Setup ambari-server 
Checking SELinux... 
SELinux status is 'disabled' 
Customize user account for ambari-server daemon [y/n] (n)? n 
Adjusting ambari-server permissions and ownership... 
Checking firewall status... 
Checking JDK... 
Do you want to change Oracle JDK [y/n] (n)? y 
[1] Oracle JDK 1.8 + Java Cryptography Extension (JCE) Policy Files 8 
[2] Oracle JDK 1.7 + Java Cryptography Extension (JCE) Policy Files 7 
[3] Custom JDK 
============================================================================== 
Enter choice (1): 3 
WARNING: JDK must be installed on all hosts and JAVA_HOME must be valid on all hosts. 
WARNING: JCE Policy files are required for configuring Kerberos security. If you plan to use Kerberos, please make sure JCE Unlimited Strength Jurisdiction Policy Files are valid on all hosts. 
Path to JAVA_HOME: {Enter your exact full path to JDK base location} 
Validating JDK on Ambari Server...done. 
Completing setup... 
Configuring database... 
Enter advanced database configuration [y/n] (n)? n 
Configuring database... 
Default properties detected. Using built-in database. 
Configuring ambari database... 
Checking PostgreSQL... 
Configuring local database... 
Connecting to local database...done. 
Configuring PostgreSQL... 
Backup for pg_hba found, reconfiguration not required 
Extracting system views... 
........ 
Adjusting ambari-server permissions and ownership... 
Ambari Server 'setup' completed successfully. 
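Before entering the path at the `Path to JAVA_HOME:` prompt, it is worth confirming that the path is a real JDK on every host, since setup only validates it on the Ambari server itself. A minimal per-node check (the example JDK path is an assumption; substitute your actual location and fan out with pdsh/clush):

```shell
# Return 0 if the given path looks like a valid JAVA_HOME (has an
# executable bin/java), 1 otherwise.
validate_java_home() {
    if [ -x "$1/bin/java" ]; then
        echo "valid: $1"
    else
        echo "invalid: $1"
        return 1
    fi
}

# Example path is an assumption; substitute your actual JDK base location.
validate_java_home /usr/lib/jvm/java-1.8.0-openjdk || true
```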

e. Start the ambari-server and ambari-agent processes:
service ambari-server start 
service ambari-agent start 

f. Use 'Start All' from the Actions button on the main Ambari dashboard to start all HDP components.

g. Confirm with 'ps -ef' that the processes for all HDP components are using the new Java location.
e.g. 
ps -ef | grep HiveServer2 | grep -v grep 
hive      95143      1  0 15:26 ?        00:00:37 /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.91.x86_64/bin/java ...
ps -ef | grep ResourceManager | grep -v grep 
yarn     116722      1  1 15:48 ?        00:03:22 /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.91.x86_64/bin/java...
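Rather than grepping for each component one by one, step (g) can be swept in a single pass by listing the distinct java binaries behind all running processes; after the restart, every HDP daemon should resolve to the new JDK path. A sketch, again meant to be fanned out per node with pdsh/clush:

```shell
# Print the distinct java executables (first field of the command line)
# backing any running process; pipe `ps` output through it on each node.
java_binaries() {
    awk '$1 ~ /\/bin\/java$/ {print $1}' | sort -u
}

ps -eo args | java_binaries
```

If this prints more than one path, some process is still running on the old JDK.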

Because Options 1 and 2 are hardcoded to older, problematic JDK versions on various Ambari releases, it is advised to use Option 3 (Custom JDK) only.

https://hortonworks.jira.com/browse/BUG-44244

https://hortonworks.jira.com/browse/BUG-45794

Comments

Hi,

I have created an HDP 2.4 VM on Azure and set up a local instance. The NiFi component is down due to a Java version mismatch error: HDP 2.4 with NiFi 1.1.1 expects JDK 1.8, but the installed version is JDK 1.7. I have followed your steps to change the JDK version, and it still reports version 1.7.

Java version

[root@sandbox nifi]# java -version
java version "1.7.0_95"
OpenJDK Runtime Environment (rhel-2.6.4.0.el6_7-x86_64 u95-b00)
OpenJDK 64-Bit Server VM (build 24.95-b01, mixed mode)

2016-12-15 00:09:17,285 ERROR [main] org.apache.nifi.NiFi Failure to launch NiFi due to java.lang.UnsupportedClassVersionError: org/apache/nifi/atlas/reporting/AtlasFlowReportingTask : Unsupported major.minor version 52.0
java.lang.UnsupportedClassVersionError: org/apache/nifi/atlas/reporting/AtlasFlowReportingTask : Unsupported major.minor version 52.0

I would appreciate it if you could suggest a solution for this.
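One thing worth checking in this situation (not confirmed in the thread, just a common cause): the `java` binary on PATH is typically managed by the system's alternatives mechanism and is independent of the JAVA_HOME handed to `ambari-server setup`, so `java -version` can keep reporting 1.7 even after Ambari's daemons move to 1.8. Resolving the symlink shows which build the shell is actually using:

```shell
# `java -version` reports whatever binary is first on PATH, which is
# independent of the JAVA_HOME given to ambari-server setup.
# Resolving the symlink shows the real JDK build behind it.
resolve_on_path() {
    readlink -f "$(command -v "$1")"
}

if command -v java >/dev/null 2>&1; then
    resolve_on_path java
else
    echo "no java on PATH"
fi
```

If this still points into a 1.7 tree, update the system alternatives (or PATH) in addition to the Ambari-side JAVA_HOME.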

Last update: 11-20-2015 09:10 PM