TTransportException when writing data with HiveBolt in a Storm topology


I am running a Storm topology on a kerberized HDP cluster. The topology writes data to Hive using the HiveBolt from org.apache.storm.hive.bolt. However, whenever the bolt tries to open a streaming connection to the metastore, the worker log shows the following exception:
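For reference, my bolt is wired up roughly along these lines (a minimal sketch with anonymized values: the table name, column names, keytab path, and principal below are placeholders, not my actual configuration):

```java
import org.apache.storm.hive.bolt.HiveBolt;
import org.apache.storm.hive.bolt.mapper.DelimitedRecordHiveMapper;
import org.apache.storm.hive.common.HiveOptions;
import org.apache.storm.tuple.Fields;

public class HiveBoltConfig {

    public static HiveBolt buildHiveBolt() {
        // Maps incoming tuple fields onto the columns of the target ACID table.
        DelimitedRecordHiveMapper mapper = new DelimitedRecordHiveMapper()
                .withColumnFields(new Fields("id", "value"));

        // Metastore URI matches the one in the log below; db/table are placeholders.
        HiveOptions options = new HiveOptions(
                "thrift://host2:9083", "default", "my_table", mapper)
                .withTxnsPerBatch(10)
                .withBatchSize(1000)
                .withIdleTimeout(10)
                // On a kerberized cluster the bolt needs a keytab and principal
                // so the workers can authenticate to the metastore.
                .withKerberosKeytab("/etc/security/keytabs/storm.headless.keytab")
                .withKerberosPrincipal("storm-cluster@EXAMPLE.COM");

        return new HiveBolt(options);
    }
}
```

With a configuration of this shape, the failure occurs as soon as HiveWriter opens the connection; the full worker log follows.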

org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3802) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3788) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:503) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:282) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:188) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:406) [stormjar.jar:?]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [?:1.8.0_161]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [?:1.8.0_161]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1564) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:92) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:138) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:124) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:295) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:291) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4792) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257) [stormjar.jar:?]
	at com.google.common.cache.LocalCache.get(LocalCache.java:4000) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4789) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:291) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:266) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:544) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:312) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
2018-03-30 09:11:43.898 h.metastore [INFO] Connected to metastore.
2018-03-30 09:11:43.898 h.log [ERROR] Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_databases(ThriftHiveMetastore.java:746) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_databases(ThriftHiveMetastore.java:733) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1116) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:469) [stormjar.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_161]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_161]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:174) [stormjar.jar:?]
	at com.sun.proxy.$Proxy42.isOpen(Unknown Source) [?:?]
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:269) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:544) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:312) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
2018-03-30 09:11:43.899 h.log [ERROR] Converting exception to MetaException
2018-03-30 09:11:43.899 h.metastore [WARN] Evicted client has non-zero user count: 1
2018-03-30 09:11:43.899 h.metastore [WARN] Non-zero user count preventing client tear down: users=1 expired=true
2018-03-30 09:11:43.900 h.metastore [INFO] Trying to connect to metastore with URI thrift://host2:9083
2018-03-30 09:11:43.915 o.a.s.d.executor [INFO] Processing received message FOR 9 TUPLE: source: __acker:11, stream: __ack_ack, id: {}, [-3886153567507180489 306]
2018-03-30 09:11:43.916 o.a.s.d.executor [INFO] SPOUT Acking message -3886153567507180489 {topic-partition=output1-0, offset=18, numFails=0}
2018-03-30 09:11:43.925 h.metastore [WARN] set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3802) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3788) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:503) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:282) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:188) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:406) [stormjar.jar:?]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [?:1.8.0_161]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [?:1.8.0_161]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1564) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:92) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:138) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:124) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:295) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:291) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4792) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257) [stormjar.jar:?]
	at com.google.common.cache.LocalCache.get(LocalCache.java:4000) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4789) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:291) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:273) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:544) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:312) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
2018-03-30 09:11:43.926 h.metastore [INFO] Connected to metastore.
2018-03-30 09:11:43.926 h.metastore [WARN] Unexpected increment of user count beyond one: 2 HCatClient: thread: 80 users=2 expired=false closed=false
2018-03-30 09:11:43.927 h.log [ERROR] Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_databases(ThriftHiveMetastore.java:746) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_databases(ThriftHiveMetastore.java:733) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1116) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:469) [stormjar.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_161]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_161]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:174) [stormjar.jar:?]
	at com.sun.proxy.$Proxy42.isOpen(Unknown Source) [?:?]
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:269) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:544) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:315) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
2018-03-30 09:11:43.927 h.log [ERROR] Converting exception to MetaException
2018-03-30 09:11:43.927 h.metastore [WARN] Evicted client has non-zero user count: 2
2018-03-30 09:11:43.927 h.metastore [WARN] Non-zero user count preventing client tear down: users=2 expired=true
2018-03-30 09:11:43.927 h.metastore [WARN] Non-zero user count preventing client tear down: users=1 expired=true
2018-03-30 09:11:43.928 h.metastore [INFO] Trying to connect to metastore with URI thrift://host2:9083
2018-03-30 09:11:43.943 h.metastore [WARN] set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3802) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3788) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:503) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:282) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:188) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:406) [stormjar.jar:?]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [?:1.8.0_161]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [?:1.8.0_161]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1564) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:92) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:138) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:124) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:295) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:291) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4792) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257) [stormjar.jar:?]
	at com.google.common.cache.LocalCache.get(LocalCache.java:4000) [stormjar.jar:?]
	at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4789) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:291) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:273) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:544) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:315) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
2018-03-30 09:11:43.945 h.metastore [INFO] Connected to metastore.
2018-03-30 09:11:43.945 o.a.h.h.m.RetryingMetaStoreClient [WARN] MetaStoreClient lost connection. Attempting to reconnect (1 of 1) after 1s. getTable
org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
	at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_table(ThriftHiveMetastore.java:1269) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1260) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1304) ~[stormjar.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_161]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_161]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:174) [stormjar.jar:?]
	at com.sun.proxy.$Proxy42.getTable(Unknown Source) [?:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.checkEndPoint(HiveEndPoint.java:332) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:316) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
Caused by: java.net.SocketException: Broken pipe (Write failed)
	at java.net.SocketOutputStream.socketWrite0(Native Method) ~[?:1.8.0_161]
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111) ~[?:1.8.0_161]
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155) ~[?:1.8.0_161]
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[?:1.8.0_161]
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[?:1.8.0_161]
	at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159) ~[stormjar.jar:?]
	... 29 more
2018-03-30 09:11:44.947 o.a.t.t.TIOStreamTransport [WARN] Error closing output stream.
java.net.SocketException: Socket closed
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:118) ~[?:1.8.0_161]
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155) ~[?:1.8.0_161]
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[?:1.8.0_161]
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[?:1.8.0_161]
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158) ~[?:1.8.0_161]
	at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110) [stormjar.jar:?]
	at org.apache.thrift.transport.TSocket.close(TSocket.java:235) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:567) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDown(HiveClientCache.java:508) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDownIfUnused(HiveClientCache.java:498) [stormjar.jar:?]
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.close(HiveClientCache.java:483) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:365) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:168) [stormjar.jar:?]
	at com.sun.proxy.$Proxy42.getTable(Unknown Source) [?:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.checkEndPoint(HiveEndPoint.java:332) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:316) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
2018-03-30 09:11:44.950 h.metastore [INFO] Trying to connect to metastore with URI thrift://host2:9083
2018-03-30 09:11:44.981 h.metastore [WARN] set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3802) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3788) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:503) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:370) [stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:168) [stormjar.jar:?]
	at com.sun.proxy.$Proxy42.getTable(Unknown Source) [?:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.checkEndPoint(HiveEndPoint.java:332) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:316) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
2018-03-30 09:11:44.982 h.metastore [INFO] Connected to metastore.
2018-03-30 09:11:44.983 o.a.h.h.s.HiveEndPoint [WARN] Unable to check the endPoint: {metaStoreUri='thrift://host2:9083', database='default', table='tablename', partitionVals=[] }
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1275) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1261) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1304) ~[stormjar.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_161]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_161]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:174) ~[stormjar.jar:?]
	at com.sun.proxy.$Proxy42.getTable(Unknown Source) ~[?:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.checkEndPoint(HiveEndPoint.java:332) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:316) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) [stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) [stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) [stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) [stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
2018-03-30 09:11:44.984 o.a.s.h.b.HiveBolt [ERROR] Exception occurred 
java.lang.reflect.UndeclaredThrowableException: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1887) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) ~[stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) ~[stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) ~[stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) ~[stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
Caused by: org.apache.hive.hcatalog.streaming.InvalidTable: Invalid table db:default, table:tablename: null
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.checkEndPoint(HiveEndPoint.java:335) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:316) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) ~[stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) ~[stormjar.jar:?]
	... 8 more
Caused by: org.apache.thrift.transport.TTransportException
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1275) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1261) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1304) ~[stormjar.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_161]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_161]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:174) ~[stormjar.jar:?]
	at com.sun.proxy.$Proxy42.getTable(Unknown Source) ~[?:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.checkEndPoint(HiveEndPoint.java:332) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:316) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) ~[stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) ~[stormjar.jar:?]
	... 8 more
2018-03-30 09:11:44.990 o.a.s.d.executor [ERROR] 
java.lang.reflect.UndeclaredThrowableException: null
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1887) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:196) ~[stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:271) ~[stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$6.call(HiveWriter.java:267) ~[stormjar.jar:?]
	at org.apache.storm.hive.common.HiveWriter$11.call(HiveWriter.java:419) ~[stormjar.jar:?]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_161]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_161]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_161]
Caused by: org.apache.hive.hcatalog.streaming.InvalidTable: Invalid table db:default, table:tablename: null
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.checkEndPoint(HiveEndPoint.java:335) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:316) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) ~[stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) ~[stormjar.jar:?]
	... 8 more
Caused by: org.apache.thrift.transport.TTransportException
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[stormjar.jar:?]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[stormjar.jar:?]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[stormjar.jar:?]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1275) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1261) ~[stormjar.jar:?]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1304) ~[stormjar.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_161]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_161]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_161]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_161]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:174) ~[stormjar.jar:?]
	at com.sun.proxy.$Proxy42.getTable(Unknown Source) ~[?:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.checkEndPoint(HiveEndPoint.java:332) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:316) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint.access$000(HiveEndPoint.java:62) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:202) ~[stormjar.jar:?]
	at org.apache.hive.hcatalog.streaming.HiveEndPoint$1.run(HiveEndPoint.java:197) ~[stormjar.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_161]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_161]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869) ~[stormjar.jar:?]
	... 8 more



A tcpdump showed that, when connecting to the Hive metastore, the topology performs set_ugi as user storm, even though the HiveBolt is configured with a specific user's Kerberos principal and keytab. The metastore server then returns "Invalid status -128":

.....#{...E..~............
C]9^C]\;........set_ugi...........storm............hadoop.
15:39:00.344583 IP 10.0.1.2.9083 > 10.0.1.3.37546: Flags [.], ack 51, win 220, options [nop,nop,TS val 1130191942 ecr 1130183006], length 0
E..4"$@.@..t
...
...#{....~....w.....S.....
C]\FC]9^
15:39:00.344671 IP 10.0.1.2.9083 > 10.0.1.3.37546: Flags [P.], seq 1:25, ack 51, win 220, options [nop,nop,TS val 1130191942 ecr 1130183006], length 24
E..L"%@.@..[
...
...#{....~....w.....k.....
C]\FC]9^.....Invalid status -128
15:39:00.346427 IP 10.0.1.3.37546 > 10.0.1.2.9083: Flags [.], ack 25, win 221, options [nop,nop,TS val 1130183007 ecr 1130191942], length 0
E..4'.@.@..}
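
The plaintext set_ugi call in the capture points at the likely root cause: the Thrift metastore client only sends set_ugi when it is running without SASL, i.e. when hive.metastore.sasl.enabled is false or unset in the configuration it loads. That suggests the Storm worker is not picking up the cluster's hive-site.xml at all, and so falls back to an unsecured connection that the kerberized metastore rejects. A minimal client-side hive-site.xml fragment the worker would need on its classpath might look like this (the principal value is an assumed example, not taken from this cluster):

<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>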

This is the setup for the HiveBolt in the topology:

// Map tuple fields to Hive table columns by position.
DelimitedRecordHiveMapper mapper = new DelimitedRecordHiveMapper()
        .withColumnFields(new Fields(columnNames));
HiveOptions hiveOptions = new HiveOptions(
    properties.getProperty("HIVE_METASTORE_URL"),
    properties.getProperty("HIVE_DATABASE_NAME"),
    properties.getProperty("HIVE_TABLE_NAME"),
    mapper
);
// Credentials the bolt should use when connecting to the metastore.
hiveOptions.withKerberosKeytab(properties.getProperty("HIVE_USER_KEYTAB"));
hiveOptions.withKerberosPrincipal(properties.getProperty("HIVE_USER_PRINCIPAL"));
builder.setBolt("HiveBolt", new HiveBolt(hiveOptions), 2)
       .setNumTasks(2)
       .shuffleGrouping("DataProcessingBolt");

These are the hive-site.xml properties from my cluster that may be relevant to this issue:

<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.security.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator</value>
</property>
<property>
  <name>hive.security.authorization.enabled</name>
  <value>false</value>
</property>
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory</value>
</property>
<property>
  <name>hive.security.metastore.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.HadoopDefaultMetastoreAuthenticator</value>
</property>
<property>
  <name>hive.security.metastore.authorization.auth.reads</name>
  <value>true</value>
</property>
<property>
  <name>hive.security.metastore.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider</value>
</property>
<property>
  <name>hive.server2.allow.user.substitution</name>
  <value>true</value>
</property>
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
</property>
<property>
  <name>hive.server2.use.SSL</name>
  <value>false</value>
</property>
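
Note that these are server-side settings; the Storm workers additionally need the Hadoop client configuration on their classpath, because UserGroupInformation only switches to Kerberos when hadoop.security.authentication is set in core-site.xml (a later reply in this thread reports that bundling core-site.xml and hdfs-site.xml got past the set_ugi failure). A sketch of the relevant core-site.xml fragment, assuming a standard kerberized HDP setup:

<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>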
4 REPLIES

Rising Star
@Desislav Arashev

What version of storm-hive is your topology using, and what version of Storm is your cluster running? Can you paste the code where you create the topology, along with the full log of the worker running the HiveBolt?

Contributor

@Desislav Arashev
I am also running into the same issue. Did you find a way to solve it? I'd appreciate your help.

Contributor

I was able to get past that issue after including core-site.xml and hdfs-site.xml.
But now I am hitting a different error: the bolt uses the right keytab and principal that I specified in HiveOptions, but fails to authenticate with the supplied keytab. Here is the exception:
java.io.IOException: Login failure for <myprincipal> from <keytab>: javax.security.auth.login.LoginException: Checksum failed
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:1098) ~[stormjar.jar:?]

Contributor

@pshah Any clue on how to resolve this?