Created on 09-11-2017 06:11 AM - edited 09-16-2022 05:13 AM
Hello
I am running a CDH 5.12 cluster and I have some avro files on HDFS. I wrote a Java program that should connect to HDFS and deserialize the avro files.
The files aren't very big, only 10-20 MB each. However, whenever I try to run my program it throws this exception:
org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length
I googled it and found that some people advised increasing the "ipc.maximum.data.length" parameter to 128 MB. I did that, but the error persists.
Does anyone have an idea what the problem could be? Maybe this is just a symptom of another problem?
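For reference, the reader is roughly along these lines (a minimal sketch using the standard Avro GenericDatumReader API; the NameNode URI, file path, and class name are placeholders, not my actual code):

import java.net.URI;
import org.apache.avro.file.DataFileStream;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AvroHdfsReader {
    public static void main(String[] args) throws Exception {
        // NameNode URI and file path are placeholders - adjust to your cluster
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode.example.com:8020"), conf);
        Path path = new Path("/data/sample.avro");

        // Open the file from HDFS and iterate over the Avro records
        try (FSDataInputStream in = fs.open(path);
             DataFileStream<GenericRecord> reader =
                     new DataFileStream<>(in, new GenericDatumReader<GenericRecord>())) {
            for (GenericRecord record : reader) {
                System.out.println(record);
            }
        }
    }
}

The exception is thrown as soon as the program tries to talk to the NameNode.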
Thank you
Guy
Created 09-11-2017 08:54 AM
Looks like this is one of the known issues in Cloudera.
Just to make sure, did you restart the NameNode after increasing ipc.maximum.data.length?
Created 09-12-2017 12:46 AM
Hi
I did restart the NameNode, but it doesn't seem to have any effect.
I'm not sure I'm hitting the bug you mentioned, because I'm trying to read files from a client program. Anyway, I increased the value of ipc.maximum.data.length and it did not help.
I changed it in "HDFS Service Advanced Configuration Snippet (Safety Valve) for hdfs-site.xml" - maybe that's not the right place?
Thanks
Guy
Created 09-12-2017 07:22 AM
Yes, that is not the right place. According to the link I gave above, this configuration change should go into core-site.xml. Search for "Cluster-wide Advanced Configuration Snippet (Safety Valve) for core-site.xml", add/modify the property as needed, and restart the NameNode.