hadoop-httpfs - Large files fail to transfer through REST

New Contributor

I'm working on a Windows service that will deliver files to HDFS using the REST API to connect to hadoop-httpfs.

 

The service works fine for files under 2 MB, but it immediately throws an exception when transferring files over 2 MB. I believe this is because the embedded Tomcat server has a default 2 MB POST limit. On a standalone Tomcat server you can override that by adding maxPostSize="-1" to the Connector, but I haven't found a place where that override works in the embedded version, or any Cloudera configuration item for it.

 

Has anyone managed to get past that 2MB limit?

1 REPLY

Re: hadoop-httpfs - Large files fail to transfer through REST

New Contributor

I had the same issue with the 2 MB limit.

Here is how I managed to solve it (in CDH 5.5.2):

1. Override the Tomcat server.xml

 

Edit /etc/hadoop-httpfs/tomcat-conf/conf/server.xml and add maxPostSize to the Connector settings, then restart hadoop-httpfs. You can confirm the edit was applied by checking /var/lib/hadoop-httpfs/tomcat-deployment/conf/server.xml.
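For reference, a minimal sketch of what the edited Connector entry could look like is below. The port and timeout values are illustrative placeholders (your server.xml will have its own attributes); the only addition this step requires is maxPostSize, where -1 disables Tomcat's POST size limit:

    <!-- In /etc/hadoop-httpfs/tomcat-conf/conf/server.xml              -->
    <!-- maxPostSize="-1" lifts Tomcat's default 2 MB POST limit        -->
    <Connector port="14000" protocol="HTTP/1.1"
               connectionTimeout="20000"
               maxPostSize="-1"/>

Restart hadoop-httpfs afterwards so the change is propagated to /var/lib/hadoop-httpfs/tomcat-deployment/conf/server.xml.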

 

2. Check your Tomcat version and apply the workaround

The issue appears in Tomcat 6.0.44.

Check your Tomcat version using /usr/lib/bigtop-tomcat/bin/version.sh.
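For example (assuming the bigtop-tomcat path above exists on your host), you can run the script and look for the server version line; on an affected install the output would include something like:

    $ /usr/lib/bigtop-tomcat/bin/version.sh | grep "Server version"
    Server version: Apache Tomcat/6.0.44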

 

According to the Tomcat 6.0.44 changelog, in response to CVE-2014-0230 Tomcat added the org.apache.coyote.MAX_SWALLOW_SIZE property, which defaults to 2 MB.

Change log:

https://tomcat.apache.org/tomcat-6.0-doc/changelog.html

 

Add the Java system property -Dorg.apache.coyote.MAX_SWALLOW_SIZE=2097152000 to increase the limit.
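A minimal sketch of one way to pass that property, assuming your installation reads CATALINA_OPTS from /etc/hadoop-httpfs/conf/httpfs-env.sh (the exact file and variable can differ between packaged and Cloudera Manager managed deployments, so treat the path below as an assumption):

    # /etc/hadoop-httpfs/conf/httpfs-env.sh -- path and variable assumed, adjust for your layout
    # Raise the swallow limit (value is in bytes; 2097152000 is roughly 2 GB)
    export CATALINA_OPTS="${CATALINA_OPTS} -Dorg.apache.coyote.MAX_SWALLOW_SIZE=2097152000"

Restart hadoop-httpfs afterwards so the embedded Tomcat picks up the new JVM option.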