Created 01-10-2016 04:59 PM
I am new to Hadoop and just started learning it, but I have a beginner's problem with the setup. I had HDP 2.2 with JDK 1.7 and tried to execute a MapReduce job compiled with JDK 1.8. I then realized that HDP 2.2 does not support JDK 1.8, so I upgraded to HDP 2.3. However, when I execute java -version, I still get Java 1.7. Following the directions in the HDP documentation, I ran ambari-server setup to download JDK 1.8, but even after that, java -version shows 1.7. How do I make JDK 1.8 the default?
Created 01-12-2016 08:14 AM
You might have to change the current Java version using:
/usr/sbin/alternatives --config java
Note that this changes the Java version for the whole system, not just for Ambari and the HDP components.
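Under the hood, alternatives simply repoints a symlink chain (/usr/bin/java -> /etc/alternatives/java -> the chosen JDK). A minimal sketch of that mechanism, using a scratch directory and stub scripts instead of real JDK installs (all paths here are illustrative):

```shell
# Two fake "JDK" installs, each with a stub java that prints its version
mkdir -p /tmp/alt-demo/jdk1.7/bin /tmp/alt-demo/jdk1.8/bin
printf '#!/bin/sh\necho 1.7\n' > /tmp/alt-demo/jdk1.7/bin/java
printf '#!/bin/sh\necho 1.8\n' > /tmp/alt-demo/jdk1.8/bin/java
chmod +x /tmp/alt-demo/jdk1.7/bin/java /tmp/alt-demo/jdk1.8/bin/java

# "java" on the PATH is just a symlink; this is what --config repoints
ln -sfn /tmp/alt-demo/jdk1.7/bin/java /tmp/alt-demo/java
/tmp/alt-demo/java    # prints 1.7

# Selecting the 1.8 entry in `alternatives --config java` amounts to this:
ln -sfn /tmp/alt-demo/jdk1.8/bin/java /tmp/alt-demo/java
/tmp/alt-demo/java    # prints 1.8
```

This is why java -version can still report 1.7 after Ambari downloads JDK 1.8: the download does not by itself repoint the system-wide symlink.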
Created 01-10-2016 07:35 PM
@Sairam Rachuri in your IDE, make sure your project's source level is set to 1.8.
Created 01-12-2016 02:25 AM
Thanks @Ali Bajwa and @Artem Ervits. I tried Ali's suggestion, but java -version still shows Java 1.7. However, I got a different error when I ran the MapReduce program in the terminal: it now says that I don't have write access to the file/directory I am trying to write to. I need to check that.
When I run the same job in Hue, I get the message 'Output directory not set in JobConf.' I am researching this error now.
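From what I have read so far, the usual fix for the write-access error is to give the submitting user a home directory in HDFS, owned by that user. Noting it here in case it helps others (the user name sairam is a placeholder for your own account; the hdfs commands would be run as the HDFS superuser):

```shell
# Typical fix, run on a cluster node (commented out: needs a live cluster):
#   sudo -u hdfs hdfs dfs -mkdir -p /user/sairam
#   sudo -u hdfs hdfs dfs -chown sairam:hdfs /user/sairam
# HDFS permissions mirror the local POSIX model (owner, group, mode bits),
# shown here on the local filesystem:
mkdir -p /tmp/hdfs-demo/user/sairam
chmod 755 /tmp/hdfs-demo/user/sairam
ls -ld /tmp/hdfs-demo/user/sairam | cut -c1-10    # drwxr-xr-x
```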
Created 01-12-2016 08:22 AM
You need to set the input and output directories on your job:
FileInputFormat.addInputPath(job, new Path(otherArgs.get(0)));
FileOutputFormat.setOutputPath(job, new Path(otherArgs.get(1)));
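For context, a minimal driver sketch showing where those two lines go. The class name WordCountDriver is illustrative, the mapper/reducer setup is elided, and compiling this requires the Hadoop client libraries on the classpath:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCountDriver.class);
        // Set your own classes here with job.setMapperClass(...)
        // and job.setReducerClass(...)
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        // Without this pair the job fails with
        // "Output directory not set in JobConf."
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

The output path must not already exist, or the job will fail with a "directory already exists" error instead.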