
How to change the default Java version?

New Contributor

I am new to Hadoop and have just started learning it, but I have some beginner problems with the setup. I had HDP 2.2 with JDK 1.7 and tried to execute a MapReduce job compiled with JDK 1.8. When I realized that HDP 2.2 does not support JDK 1.8, I upgraded to HDP 2.3. However, when I run java -version, I still get Java 1.7. Following the HDP directions, I used ambari-server setup to download JDK 1.8, but even after that java -version still shows 1.7. How do I make JDK 1.8 the default?

1 ACCEPTED SOLUTION


You might have to change the current Java version using:

 /usr/sbin/alternatives --config java

Note that this will change the Java version for the whole system, not just for Ambari and the HDP components.
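If the HDP services themselves still pick up the old JDK after switching the system default, the JAVA_HOME used by Hadoop is set in hadoop-env.sh. A sketch of the relevant lines, assuming the JDK 1.8 install path that ambari-server setup typically uses (verify the actual path on your host, e.g. with readlink -f $(which java)):

```shell
# In hadoop-env.sh (managed via Ambari > HDFS > Configs on an HDP cluster).
# The path below is an assumption; substitute your actual JDK 1.8 location.
export JAVA_HOME=/usr/jdk64/jdk1.8.0_40
export PATH=$JAVA_HOME/bin:$PATH
```

When the cluster is managed by Ambari, make this change through the Ambari UI rather than editing the file directly, since Ambari regenerates hadoop-env.sh on restart.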


5 REPLIES


Master Mentor

@Sairam Rachuri in your IDE, make sure your project's source level is set to 1.8.
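Beyond checking the IDE settings, you can confirm which JDK your code actually runs under by printing the runtime properties from Java itself. A minimal stand-alone check (class name is illustrative):

```java
public class JavaVersionCheck {
    public static void main(String[] args) {
        // Version of the JVM actually executing this code, e.g. "1.8.0_40"
        System.out.println("java.version = " + System.getProperty("java.version"));
        // Installation directory of that JVM
        System.out.println("java.home    = " + System.getProperty("java.home"));
    }
}
```

If java.version reported here disagrees with what java -version prints in your shell, the job is being launched with a different JVM than the one on your PATH.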

New Contributor

Thanks @Ali Bajwa and @Artem Ervits. I tried Ali's suggestion, and java -version still shows Java 1.7. However, I got a different error when I ran the MapReduce program in the terminal: it now says I don't have write access on the file/directory I am trying to write to. I need to check that.

When I run the same job in Hue, I get the message "Output directory not set in JobConf." I am researching this error now.

Master Mentor

You need to set the output directory in your job:

    // Input path: taken from the first remaining command-line argument
    FileInputFormat.addInputPath(job, new Path(otherArgs.get(0)));
    // Output path: second argument; this directory must not already exist in HDFS
    FileOutputFormat.setOutputPath(job, new Path(otherArgs.get(1)));
