
Hive java.io.IOException: Couldn't set up IO streams

Contributor

Hi,

We have some queries that work fine with a small set of data, but when I pull a month's worth of data, I get the following error:

java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "be-bi-secondary-528.soleocommunications.com/10.10.11.6"; destination host is: "be-bi-secondary-528.soleocommunications.com":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:773)
    at org.apache.hadoop.ipc.Client.call(Client.java:1431)
    at org.apache.hadoop.ipc.Client.call(Client.java:1358)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy16.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558)
    at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978)
    at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047)
    at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:226)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1655)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1414)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: Couldn't set up IO streams
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:791)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:373)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1493)
    at org.apache.hadoop.ipc.Client.call(Client.java:1397)
    ... 39 more
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:713)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:784)
    ... 42 more
Error launching map-reduce job
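
The last "Caused by" looks like the real failure: java.lang.OutOfMemoryError: unable to create new native thread normally means the operating system refused to create another thread, either because the per-user process/thread limit (nproc) was reached or because native memory for thread stacks ran out, rather than the Java heap being too small. For reference, this is roughly how I have been comparing the limit against the threads in use on the node (a sketch; exact output varies by OS):

    # Limits in effect for the current shell
    ulimit -n    # max open files
    ulimit -u    # max user processes (each thread counts against this)

    # Approximate number of threads owned by the current user, to compare with ulimit -u
    ps -eLf | awk -v u="$(whoami)" '$1 == u' | wc -l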

These queries used to work with large data sets before. I started seeing this problem after I upgraded HDP from 2.2.4.2 to 2.3.2.

I tried a few things people suggested online, such as increasing the ulimit (from 1024 to 64000) and increasing the map/reduce java.opts in my Hive session before running the job (from the system setting of -Xmx2867m to -Xmx10240m), but they didn't help. I also saw people talking about tuning the max data transfer threads; my system is already set to a pretty high value suggested by SmartSense.
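
One thing I am not sure about (an assumption on my part) is whether the higher ulimit actually reaches the Hadoop service accounts, since a ulimit raised in an interactive shell only applies to processes started from that shell. On Linux you can check what limits a running daemon really inherited with something like the following (<pid> is a placeholder for the NameNode or Hive process ID):

    # Limits actually inherited by a running process (replace <pid> with the real PID)
    cat /proc/<pid>/limits | grep -E 'Max processes|Max open files'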

Any help will be greatly appreciated!

Xi

1 ACCEPTED SOLUTION

Contributor

Problem solved by changing the ulimit on both the service accounts and the user accounts. 32k for open files and 64k for processes worked for me.
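
For reference, the kind of change I mean would normally go in /etc/security/limits.conf or a file under /etc/security/limits.d/ on the cluster nodes; the account names below are only examples, so substitute whatever service and user accounts run Hive/HDFS/YARN in your environment, and log back in (or restart the services) for the new limits to apply:

    # Example /etc/security/limits.d/hadoop.conf (file name and accounts are illustrative)
    hive  -  nofile  32768
    hive  -  nproc   65536
    hdfs  -  nofile  32768
    hdfs  -  nproc   65536
    yarn  -  nofile  32768
    yarn  -  nproc   65536
    # ...plus the same lines for the user account that submits the queries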


3 REPLIES


Contributor

Changing the ulimit appears to have fixed that issue. I am now presented with the issue below, but at least it is something different. Thank you, Xi, for the help.

Contributor

Hey @Montrial Harrell, we are facing this once or twice a week, and we have been working around it by restarting the Hive server. Can you let me know which limits you increased, and where, to clear this issue?