Upgrade to Hadoop 2.8

New Contributor

When will CDH incorporate Apache Hadoop 2.8?

7 REPLIES

Re: Upgrade to Hadoop 2.8

Rising Star

Howdy,

Thanks for reaching out on this. We're currently on Hadoop 2.6, and 2.8 is slated as "TBD." We don't rebase on minor versions very often because of the churn it introduces; instead, we backport features into CDH. Which leads me to my next question: is there a particular feature in Hadoop 2.8 you're looking to use?

Let me know when you can.
Cheers,

Josh

Re: Upgrade to Hadoop 2.8

New Contributor

Yes, I'm curious when the settings described in this JIRA will be available. Based on the latest release notes, HADOOP-12437 is not in CDH yet. This would be ideal for those of us using multihomed appliances.

https://issues.apache.org/jira/browse/HADOOP-12437
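For context, HADOOP-12437 ("Allow SecurityUtil to lookup alternate hostnames") lets Hadoop resolve its own hostname from a chosen network interface rather than the default reverse-DNS lookup, which is what makes it attractive for multihomed hosts. A minimal core-site.xml sketch of the settings it introduces; the property names follow the upstream patch, while the interface name (eth1) and nameserver address are placeholder examples:

```
<!-- core-site.xml (illustrative): resolve this node's hostname from a
     specific interface on a multihomed host, instead of reverse DNS. -->
<property>
  <name>hadoop.security.dns.interface</name>
  <value>eth1</value>
</property>
<property>
  <name>hadoop.security.dns.nameserver</name>
  <value>10.0.0.2</value>
</property>
```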


Re: Upgrade to Hadoop 2.8

Master Guru

@jabberwockkie wrote:

Yes, I'm curious when the settings described in this JIRA will be available. Based on the latest release notes, HADOOP-12437 is not in CDH yet. This would be ideal for those of us using multihomed appliances.

https://issues.apache.org/jira/browse/HADOOP-12437


While this may arrive in C6 or later, I wanted to point out that Cloudera currently (as of C5) does not support multi-homed networks, apart from a few specific configurations that have been intensively tested: https://www.cloudera.com/documentation/enterprise/release-notes/topics/rn_consolidated_pcm.html#cdh_...

Re: Upgrade to Hadoop 2.8

New Contributor

I would like the ability to use Java 9.

Java 7/8 have a 64 KB method-size limit: https://dzone.com/articles/method-size-limit-java

This limit restricts the number of variables I can use in, say, a linear model in MLlib to 500-2000, depending on how long my column names are.
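The 64 KB ceiling here is the JVM's per-method bytecode limit, which code generated at runtime (for example by Spark's codegen for wide models) can exceed once a method body grows with the number of columns. A back-of-the-envelope sketch of why the breaking point lands in the poster's 500-2000 column range; the per-column bytecode cost is a made-up illustrative constant, not measured Spark output:

```python
# JVM spec: a single method's bytecode may not exceed 64 KB.
JVM_METHOD_LIMIT = 64 * 1024

def estimated_body_size(num_columns, bytes_per_column=40):
    """Very rough estimate: assume each column adds a fixed amount of
    bytecode to one generated method (illustrative assumption)."""
    return num_columns * bytes_per_column

for n in (500, 1000, 2000, 5000):
    size = estimated_body_size(n)
    verdict = "fits" if size <= JVM_METHOD_LIMIT else "exceeds limit"
    print(f"{n:>5} columns -> ~{size} bytes ({verdict})")
```

Longer column names inflate the generated source (and constant pool), which is why the practical ceiling varies with name length.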

Re: Upgrade to Hadoop 2.8

Master Guru

@axiom wrote:

I would like the ability to use Java 9.

Java 7/8 have a 64 KB method-size limit: https://dzone.com/articles/method-size-limit-java

This limit restricts the number of variables I can use in, say, a linear model in MLlib to 500-2000, depending on how long my column names are.


JDK 9 support is not currently planned for C5. Worth remembering that JDK 9 follows Oracle's new release policy and will reach end of life for updates in March 2018: http://www.oracle.com/technetwork/java/eol-135779.html#Interfaces. This makes it infeasible to support as a server runtime, though you may try using it on clients to leverage the new language features. The same limited lifetime applies to JDK 10.

Re: Upgrade to Hadoop 2.8

Explorer

Hi,

Is there any timeline for HDFS-11047? (https://issues.apache.org/jira/browse/HDFS-11047)

Based on heap usage observations, we suspect this issue currently affects all CDH versions of Hadoop. It's affecting us greatly, so we need to know whether staying on CDH is feasible.

Thanks!

Re: Upgrade to Hadoop 2.8

Master Guru

@Chewlocka wrote:

Hi,

Is there any timeline for HDFS-11047? (https://issues.apache.org/jira/browse/HDFS-11047)

Based on heap usage observations, we suspect this issue currently affects all CDH versions of Hadoop. It's affecting us greatly, so we need to know whether staying on CDH is feasible.

Thanks!


Could you post your DataNode heap investigations as a separate topic under the Storage board, to help Engineering investigate this report? We do have a number of customers running with a lot of blocks on their DataNodes, yet their DNs do not appear to crash with OOMs (which I think is what your post implies). Or, if you have access to Support, please log a case.