Connection reset while CREATE EXTERNAL TABLE

New Contributor

Hi!

 

I have a problem with the following statement:

 

CREATE EXTERNAL TABLE testtable
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
WITH SERDEPROPERTIES ('avro.schema.url'='hdfs://ip-10-0-1-138.eu-central-1.compute.internal/files/test/schema/testtable.avsc')
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs://ip-10-0-1-138.eu-central-1.compute.internal/files/test/avro';
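As a quick sanity check, the schema URL and the data location referenced in the statement can be verified from a shell on the cluster; the paths below are copied verbatim from the statement above.

hdfs dfs -cat hdfs://ip-10-0-1-138.eu-central-1.compute.internal/files/test/schema/testtable.avsc    # schema file should print as JSON
hdfs dfs -ls hdfs://ip-10-0-1-138.eu-central-1.compute.internal/files/test/avro                      # data directory should exist and be readable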

 

I am connected via Beeline on the same host as HiveServer2. When I execute the statement, I get the following error:

ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
INFO  : Completed executing command(queryId=hive_20180219135454_44116c51-57ba-4cb6-a33b-76869a96cbca); Time taken: 118.062 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset (state=08S01,code=1)

Before that I got a Java heap space error, so I increased the HiveServer2 heap memory from 8 GB to 12 GB.
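(How the heap is set depends on the setup; on a plain Apache Hive installation it would roughly correspond to something like the following in hive-env.sh, while Cloudera Manager exposes it as the HiveServer2 Java heap size role setting. The exact mechanism here is an assumption.)

# hive-env.sh (assumed configuration mechanism; adjust to your distribution)
export HADOOP_HEAPSIZE=12288    # heap in MB, i.e. the 12 GB mentioned above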

 

What causes the error?

4 REPLIES

Champion

A transport exception is generic; it just means the Thrift client could not communicate with the server.

Check whether you have enough resources (vCores/memory) available; you can see this in the YARN web UI, for example.

Check whether the Linux host where you have deployed HiveServer2 has enough free memory.

Check whether you can SSH into the host where HiveServer2 is running.

Also confirm that HiveServer2 is up and running with green status.

How many roles are on the host where you have deployed HiveServer2?

Can you provide the full stack trace from the HiveServer2 log? Also check whether any OutOfMemoryError is being thrown again, since you had a heap issue before; a few example commands are sketched below.
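Commands along these lines answer most of the points above; the log path is only a guess and should be adjusted to wherever your HiveServer2 log actually lives.

yarn node -list                                   # node health and available YARN resources
free -g                                           # free memory on the HiveServer2 host, in GB
grep -i OutOfMemoryError /var/log/hive/*.log*     # hypothetical log location, adjust as needed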

New Contributor

I checked the points you mentioned.

 

The server on which HiveServer2 is running has 32 GB of RAM and hosts the DataNode, NodeManager, and HiveServer2 roles.

 

The cluster has 24 vCores and 52 GB of RAM.

 

SSH access to all servers in the cluster works.

 

HiveServer2 is running. Creating normal tables and inserting data into them works without any problems, for example with a simple statement like the one sketched below (table and column names are just illustrative).
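-- illustrative only; table and column names are made up
CREATE TABLE plain_test (id INT, name STRING);
INSERT INTO plain_test VALUES (1, 'a');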

 

Here is the full stack trace from the log:

 

2018-02-20 07:58:26,741 WARN  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient: [HiveServer2-Background-Pool: Thread-39]: MetaStoreClient lost connection. Attempting to reconnect.
org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
	at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:147)
	at org.apache.thrift.protocol.TBinaryProtocol.writeString(TBinaryProtocol.java:202)
	at org.apache.hadoop.hive.metastore.api.FieldSchema$FieldSchemaStandardScheme.write(FieldSchema.java:532)
	at org.apache.hadoop.hive.metastore.api.FieldSchema$FieldSchemaStandardScheme.write(FieldSchema.java:476)
	at org.apache.hadoop.hive.metastore.api.FieldSchema.write(FieldSchema.java:414)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor$StorageDescriptorStandardScheme.write(StorageDescriptor.java:1461)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor$StorageDescriptorStandardScheme.write(StorageDescriptor.java:1288)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor.write(StorageDescriptor.java:1154)
	at org.apache.hadoop.hive.metastore.api.Table$TableStandardScheme.write(Table.java:1596)
	at org.apache.hadoop.hive.metastore.api.Table$TableStandardScheme.write(Table.java:1408)
	at org.apache.hadoop.hive.metastore.api.Table.write(Table.java:1262)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args$create_table_with_environment_context_argsStandardScheme.write(ThriftHiveMetastore.java:29575)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args$create_table_with_environment_context_argsStandardScheme.write(ThriftHiveMetastore.java:29530)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args.write(ThriftHiveMetastore.java:29470)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:71)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_create_table_with_environment_context(ThriftHiveMetastore.java:1077)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1068)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2135)
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:732)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:720)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:105)
	at com.sun.proxy.$Proxy19.createTable(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2067)
	at com.sun.proxy.$Proxy19.createTable(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:783)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4155)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:309)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1978)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1691)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1423)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1207)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1202)
	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:237)
	at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:88)
	at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:293)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
	at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:306)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketException: Connection reset
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:115)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:145)
	... 55 more
2018-02-20 07:58:27,747 WARN  org.apache.thrift.transport.TIOStreamTransport: [HiveServer2-Background-Pool: Thread-39]: Error closing output stream.
java.net.SocketException: Socket closed
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:118)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
	at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110)
	at org.apache.thrift.transport.TSocket.close(TSocket.java:235)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:546)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:324)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:98)
	at com.sun.proxy.$Proxy19.createTable(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2067)
	at com.sun.proxy.$Proxy19.createTable(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:783)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4155)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:309)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1978)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1691)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1423)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1207)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1202)
	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:237)
	at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:88)
	at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:293)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
	at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:306)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
2018-02-20 07:58:27,747 INFO  hive.metastore: [HiveServer2-Background-Pool: Thread-39]: Closed a connection to metastore, current connections: 1
2018-02-20 07:58:27,747 INFO  hive.metastore: [HiveServer2-Background-Pool: Thread-39]: Trying to connect to metastore with URI thrift://ip-10-0-1-138.eu-central-1.compute.internal:9083
2018-02-20 07:58:27,748 INFO  hive.metastore: [HiveServer2-Background-Pool: Thread-39]: Opened a connection to metastore, current connections: 2
2018-02-20 07:58:27,750 INFO  hive.metastore: [HiveServer2-Background-Pool: Thread-39]: Connected to metastore.
2018-02-20 07:58:28,069 ERROR hive.ql.exec.DDLTask: [HiveServer2-Background-Pool: Thread-39]: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:789)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4155)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:309)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1978)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1691)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1423)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1207)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1202)
	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:237)
	at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:88)
	at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:293)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
	at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:306)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
	at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:147)
	at org.apache.thrift.protocol.TBinaryProtocol.writeString(TBinaryProtocol.java:202)
	at org.apache.hadoop.hive.metastore.api.FieldSchema$FieldSchemaStandardScheme.write(FieldSchema.java:532)
	at org.apache.hadoop.hive.metastore.api.FieldSchema$FieldSchemaStandardScheme.write(FieldSchema.java:476)
	at org.apache.hadoop.hive.metastore.api.FieldSchema.write(FieldSchema.java:414)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor$StorageDescriptorStandardScheme.write(StorageDescriptor.java:1461)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor$StorageDescriptorStandardScheme.write(StorageDescriptor.java:1288)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor.write(StorageDescriptor.java:1154)
	at org.apache.hadoop.hive.metastore.api.Table$TableStandardScheme.write(Table.java:1596)
	at org.apache.hadoop.hive.metastore.api.Table$TableStandardScheme.write(Table.java:1408)
	at org.apache.hadoop.hive.metastore.api.Table.write(Table.java:1262)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args$create_table_with_environment_context_argsStandardScheme.write(ThriftHiveMetastore.java:29575)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args$create_table_with_environment_context_argsStandardScheme.write(ThriftHiveMetastore.java:29530)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args.write(ThriftHiveMetastore.java:29470)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:71)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_create_table_with_environment_context(ThriftHiveMetastore.java:1077)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1068)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2135)
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:732)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:720)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:105)
	at com.sun.proxy.$Proxy19.createTable(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2067)
	at com.sun.proxy.$Proxy19.createTable(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:783)
	... 21 more
Caused by: java.net.SocketException: Connection reset
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:115)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:145)
	... 55 more

2018-02-20 07:58:28,070 ERROR org.apache.hadoop.hive.ql.Driver: [HiveServer2-Background-Pool: Thread-39]: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
2018-02-20 07:58:28,070 INFO  org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-39]: </PERFLOG method=Driver.execute start=1519113384488 end=1519113508070 duration=123582 from=org.apache.hadoop.hive.ql.Driver>
2018-02-20 07:58:28,070 INFO  org.apache.hadoop.hive.ql.Driver: [HiveServer2-Background-Pool: Thread-39]: Completed executing command(queryId=hive_20180220075656_f7de14bb-4b32-4641-9332-7f9a877d7719); Time taken: 123.582 seconds
2018-02-20 07:58:28,070 INFO  org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-39]: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2018-02-20 07:58:28,076 INFO  org.apache.hadoop.hive.ql.log.PerfLogger: [HiveServer2-Background-Pool: Thread-39]: </PERFLOG method=releaseLocks start=1519113508070 end=1519113508076 duration=6 from=org.apache.hadoop.hive.ql.Driver>
2018-02-20 07:58:28,112 ERROR org.apache.hive.service.cli.operation.Operation: [HiveServer2-Background-Pool: Thread-39]: Error running hive query: 
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
	at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:400)
	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:239)
	at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:88)
	at org.apache.hive.service.cli.operation.SQLOperation$3$1.run(SQLOperation.java:293)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
	at org.apache.hive.service.cli.operation.SQLOperation$3.run(SQLOperation.java:306)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:789)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4155)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:309)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1978)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1691)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1423)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1207)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1202)
	at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:237)
	... 11 more
Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
	at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:147)
	at org.apache.thrift.protocol.TBinaryProtocol.writeString(TBinaryProtocol.java:202)
	at org.apache.hadoop.hive.metastore.api.FieldSchema$FieldSchemaStandardScheme.write(FieldSchema.java:532)
	at org.apache.hadoop.hive.metastore.api.FieldSchema$FieldSchemaStandardScheme.write(FieldSchema.java:476)
	at org.apache.hadoop.hive.metastore.api.FieldSchema.write(FieldSchema.java:414)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor$StorageDescriptorStandardScheme.write(StorageDescriptor.java:1461)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor$StorageDescriptorStandardScheme.write(StorageDescriptor.java:1288)
	at org.apache.hadoop.hive.metastore.api.StorageDescriptor.write(StorageDescriptor.java:1154)
	at org.apache.hadoop.hive.metastore.api.Table$TableStandardScheme.write(Table.java:1596)
	at org.apache.hadoop.hive.metastore.api.Table$TableStandardScheme.write(Table.java:1408)
	at org.apache.hadoop.hive.metastore.api.Table.write(Table.java:1262)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args$create_table_with_environment_context_argsStandardScheme.write(ThriftHiveMetastore.java:29575)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args$create_table_with_environment_context_argsStandardScheme.write(ThriftHiveMetastore.java:29530)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_args.write(ThriftHiveMetastore.java:29470)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:71)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_create_table_with_environment_context(ThriftHiveMetastore.java:1077)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1068)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2135)
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:97)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:732)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:720)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:105)
	at com.sun.proxy.$Proxy19.createTable(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2067)
	at com.sun.proxy.$Proxy19.createTable(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:783)
	... 21 more
Caused by: java.net.SocketException: Connection reset
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:115)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:145)
	... 55 more
2018-02-20 07:58:28,134 INFO  org.apache.hive.service.cli.operation.OperationManager: [HiveServer2-Handler-Pool: Thread-36]: Closing operation: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=303f24a4-d34a-4ec3-9844-bb88ae7e4759]
2018-02-20 07:58:55,739 INFO  org.apache.hive.service.CompositeService: [HiveServer2-Handler-Pool: Thread-36]: Session closed, SessionHandle [1f8918ed-337b-42de-8741-eec6deaa5f97], current sessions:0
2018-02-20 07:58:55,740 INFO  org.apache.hive.service.cli.session.HiveSessionImpl: [HiveServer2-Handler-Pool: Thread-36]: Operation log session directory is deleted: /var/log/hive/operation_logs/1f8918ed-337b-42de-8741-eec6deaa5f97
2018-02-20 07:58:55,750 INFO  org.apache.hadoop.hive.ql.session.SessionState: [HiveServer2-Handler-Pool: Thread-36]: Deleted directory: /tmp/hive/hive/1f8918ed-337b-42de-8741-eec6deaa5f97 on fs with scheme hdfs
2018-02-20 07:58:55,754 INFO  org.apache.hadoop.hive.ql.session.SessionState: [HiveServer2-Handler-Pool: Thread-36]: Deleted directory: /tmp/hive/1f8918ed-337b-42de-8741-eec6deaa5f97 on fs with scheme file
2018-02-20 07:58:55,755 WARN  org.apache.thrift.transport.TIOStreamTransport: [HiveServer2-Handler-Pool: Thread-36]: Error closing output stream.
java.net.SocketException: Socket closed
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:118)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
	at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110)
	at org.apache.thrift.transport.TSocket.close(TSocket.java:235)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:546)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:105)
	at com.sun.proxy.$Proxy19.close(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2067)
	at com.sun.proxy.$Proxy19.close(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.close(Hive.java:357)
	at org.apache.hadoop.hive.ql.metadata.Hive.access$000(Hive.java:153)
	at org.apache.hadoop.hive.ql.metadata.Hive$1.remove(Hive.java:173)
	at org.apache.hadoop.hive.ql.metadata.Hive.closeCurrent(Hive.java:326)
	at org.apache.hadoop.hive.ql.session.SessionState.close(SessionState.java:1552)
	at org.apache.hive.service.cli.session.HiveSessionImpl.close(HiveSessionImpl.java:628)
	at org.apache.hive.service.cli.session.HiveSessionImplwithUGI.close(HiveSessionImplwithUGI.java:94)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
	at com.sun.proxy.$Proxy20.close(Unknown Source)
	at org.apache.hive.service.cli.session.SessionManager.closeSession(SessionManager.java:405)
	at org.apache.hive.service.cli.CLIService.closeSession(CLIService.java:223)
	at org.apache.hive.service.cli.thrift.ThriftCLIService.CloseSession(ThriftCLIService.java:462)
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$CloseSession.getResult(TCLIService.java:1273)
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$CloseSession.getResult(TCLIService.java:1258)
	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
2018-02-20 07:58:55,755 INFO  hive.metastore: [HiveServer2-Handler-Pool: Thread-36]: Closed a connection to metastore, current connections: 1

Champion

Does the path in your LOCATION clause reside on S3?

 

hdfs://ip-10-0-1-138.eu-central-1.compute.internal/files/test/avro
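For comparison (the bucket name below is hypothetical), an HDFS-backed and an S3-backed location differ only in the URI scheme:

LOCATION 'hdfs://ip-10-0-1-138.eu-central-1.compute.internal/files/test/avro'    -- what the statement above uses
LOCATION 's3a://some-bucket/files/test/avro'                                     -- what an S3-backed table would use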

New Contributor

No, the files reside on HDFS, which runs on the cluster nodes themselves.