E090 HDFS020 Could not write file /user/anirudh.deshpande/hive/jobs/hive-job-1002-2018-08-22_11-58/query.hql [HdfsApiException]

Hello all,

I've created an HBase-integrated table in Hive.

I was facing the above-mentioned error while creating the table, but after some time the table got created, with no change in the query or any other parameters.

But now, when I try to run the below query on the table, the job fails with a similar error.

I have a dedicated directory under /user for anirudh.deshpande.

Also, I have the parameter hadoop.proxyuser.root.hosts=* set.
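For reference, the proxyuser entries in core-site.xml look roughly like this (assuming the Ambari server runs as root; the groups entry is shown with a wildcard just for illustration):

<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>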

What might be the issue?

Attaching the exception trace for the same (capture.png).

Also attaching the Ranger configs for the same (capture1.png).

select count(*) from test_addresses;
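For context, the table was created with the Hive HBase storage handler, along these lines (the column names and mapping here are only illustrative, not the exact schema):

CREATE EXTERNAL TABLE test_addresses (key string, address string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:address")
TBLPROPERTIES ("hbase.table.name" = "test_addresses");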

Hi @Prathamesh H
Could you share the full stack trace that is shown only partially in capture.png?

Super Mentor

@Prathamesh H

It looks like your NameNode might have been going through a long GC pause, or for some reason it was not responding, as we can see the following message:

Caused by: java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
	at java.net.SocketInputStream.read(SocketInputStream.java:171)
	at java.net.SocketInputStream.read(SocketInputStream.java:141)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
	at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
	at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
	at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:877)
	at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1587)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1492)
	at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:468)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:115)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$FsPathOutputStreamRunner$1.close(WebHdfsFileSystem.java:962)
	at org.apache.ambari.view.utils.hdfs.HdfsUtil$1.run(HdfsUtil.java:51)
	at org.apache.ambari.view.utils.hdfs.HdfsUtil$1.run(HdfsUtil.java:46)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
	at org.apache.ambari.view.utils.hdfs.HdfsApi.execute(HdfsApi.java:513)
	at org.apache.ambari.view.utils.hdfs.HdfsUtil.putStringToFile(HdfsUtil.java:46)


Hive View is simply making a WebHDFS call to write the Hive query file to HDFS and is getting this error. So please check the NameNode log from around that time to understand what was going on there, or whether there was a slow network issue (or a long GC pause on the NameNode, etc.).
If you see this "Read timed out" message very frequently, then NameNode tuning might help (like tuning the heap size after reviewing the NameNode GC log), or look into network delays.
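A quick way to narrow this down (the log path and port below assume a default HDP layout; adjust for your cluster) is to check the NameNode log for JVM pause warnings and to hit WebHDFS directly:

# Look for long GC / JVM pauses reported by the NameNode's JvmPauseMonitor
grep "Detected pause in JVM or host machine" /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log

# Check WebHDFS responsiveness directly (default NameNode HTTP port 50070)
curl -i "http://<namenode-host>:50070/webhdfs/v1/user/anirudh.deshpande?op=GETFILESTATUS&user.name=anirudh.deshpande"

If the curl call itself hangs or times out, that points at the NameNode (heap/GC or network) rather than at Hive View.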
