
org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length

Explorer

Hello,

I am running a CDH 5.12 cluster and have some Avro files on HDFS. I wrote a Java program that should connect to HDFS and deserialize the Avro files.

The files aren't very big, only 10-20 MB each. However, whenever I run the program, it throws this exception:

org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length
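
For reference, here is a stripped-down sketch of what the program does (the NameNode URI and file path below are placeholders, not my real values):

import org.apache.avro.file.DataFileStream;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AvroHdfsReader {
    public static void main(String[] args) throws Exception {
        // Placeholder NameNode URI -- the real cluster address goes here.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020");

        FileSystem fs = FileSystem.get(conf);
        Path avroPath = new Path("/data/example.avro"); // placeholder path

        // Stream the Avro container file from HDFS and print each record.
        try (FSDataInputStream in = fs.open(avroPath);
             DataFileStream<GenericRecord> reader =
                     new DataFileStream<>(in, new GenericDatumReader<GenericRecord>())) {
            for (GenericRecord record : reader) {
                System.out.println(record);
            }
        }
    }
}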

 

I googled it and found that some people advise increasing the "ipc.maximum.data.length" parameter to 128 MB. I did that, but the error persists.

Does anyone have an idea what the problem could be? Maybe this error is just masking another problem?

 

Thank you,

Guy

3 REPLIES

Champion

@ni4ni

 

Looks like this is one of the known issues in Cloudera:

 

https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_rn_upgrade.html#concept_s...

 

Just to make sure, did you restart the NameNode after increasing ipc.maximum.data.length?

 

 

Explorer

Hi,

I did restart the NameNode, but it doesn't seem to have any effect.

 

I'm not sure I'm hitting the bug you mentioned, because I was trying to read files from a client program. Either way, I increased the value of ipc.maximum.data.length and it did not help.

 

I changed it in "HDFS Service Advanced Configuration Snippet (Safety Valve) for hdfs-site.xml". Maybe that's not the right place?

 

Thanks,

Guy

Champion

@ni4ni

 

Yes, that is not the right place. According to the link I gave above, this change belongs in core-site.xml, so search for "Cluster-wide Advanced Configuration Snippet (Safety Valve) for core-site.xml", add or modify the property as needed, and restart the NameNode.