
set_ugi() not successful error

Explorer

I implemented a Sqoop export (using HCatalog tables) in an Oozie workflow but got the error below. I also copied hive-site.xml into the HDFS code directory from the server (path /etc/hive/conf/), but that didn't help. Do I need to pass or add any special parameters in hive-site.xml?

2019-03-08 23:23:03,838 [main] INFO  hive.metastore  - Trying to connect to metastore with URI thrift://xxxx:1000
2019-03-08 23:23:03,875 [main] INFO  hive.metastore  - Opened a connection to metastore, current connections: 1
2019-03-08 23:23:03,964 [main] WARN  hive.metastore  - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3827)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3813)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:490)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:252)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:187)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:303)
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227)
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:224)
	at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4767)
	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
	at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
	at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
	at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:224)
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:200)
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:558)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:348)
	at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:426)
	at org.apache.sqoop.manager.OracleManager.exportTable(OracleManager.java:465)
	at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
	at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:187)
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:170)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:81)
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:51)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:235)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2019-03-08 23:23:03,971 [main] INFO  hive.metastore  - Connected to metastore.
2019-03-08 23:23:03,982 [main] ERROR hive.log  - Got exception: org.apache.thrift.transport.TTransportException null
org.apache.thrift.transport.TTransportException
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_databases(ThriftHiveMetastore.java:730)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_databases(ThriftHiveMetastore.java:717)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1083)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:333)
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:203)
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:558)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:348)
	at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:426)
	at org.apache.sqoop.manager.OracleManager.exportTable(OracleManager.java:465)
	at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
	at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:187)
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:170)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:81)
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:51)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:235)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2019-03-08 23:23:03,982 [main] ERROR hive.log  - Converting exception to MetaException
2019-03-08 23:23:03,989 [main] WARN  org.apache.thrift.transport.TIOStreamTransport  - Error closing output stream.
java.net.SocketException: Socket closed
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:118)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
	at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110)
	at org.apache.thrift.transport.TSocket.close(TSocket.java:235)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:554)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDown(HiveClientCache.java:369)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDownIfUnused(HiveClientCache.java:359)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.close(HiveClientCache.java:349)
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:206)
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:558)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:348)
	at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:426)
	at org.apache.sqoop.manager.OracleManager.exportTable(OracleManager.java:465)
	at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:80)
	at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:187)
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:170)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:81)
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:51)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:235)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)


4 Replies

Super Guru
From the warning message:

set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException

This issue can occur when the hive-site.xml used by the job is out of sync with the server's configuration.

Could you please confirm that the hive-site.xml passed to the job is up to date and is the correct file?
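
One quick way to confirm is to diff the copy staged in HDFS against the live server copy; the HDFS path below is a placeholder for your workflow's app directory:

    # Any output from diff means the job is staging a stale hive-site.xml
    diff <(hdfs dfs -cat /user/<your-user>/apps/sqoop-export/hive-site.xml) /etc/hive/conf/hive-site.xml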

Have you also checked on the HMS side? What does the HMS server log say? Is your HiveServer2 working correctly with HMS? What about the Hive CLI?

Cheers

Explorer

Thanks for the response. Is there a way to check whether hive-site.xml is up to date? I picked up hive-site.xml from the server (path /etc/hive/conf/), placed it in the HDFS code directory, and passed the XML file name in workflow.xml.
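
For reference, the workflow ships the file to the action roughly like this (a minimal sketch; host names, the connect string, and table names are placeholders):

    <action name="sqoop-export">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>export --connect jdbc:oracle:thin:@//dbhost:1521/SVC --table MY_TABLE --hcatalog-table my_hcat_table</command>
            <!-- Ships hive-site.xml from the workflow's HDFS app directory
                 into the action's working directory on the cluster -->
            <file>hive-site.xml</file>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>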


New Contributor

Hi!
I have the same issue. Did you find any solution?

New Contributor

Try these two approaches:

1. Increase the Java Heap Size of HiveServer2

  1. Go to the Hive service.
  2. Click the Configuration tab.
  3. Search for "Java Heap Size of HiveServer2 in Bytes" and increase the value.
  4. Click Save Changes.
  5. Restart HiveServer2.


2. In Hive > Configs > Custom hive-site.xml, add the following:

      hive.metastore.event.db.notification.api.auth=false
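
If you maintain hive-site.xml by hand rather than through the UI, the same setting is the standard property block:

    <property>
        <name>hive.metastore.event.db.notification.api.auth</name>
        <value>false</value>
    </property>

Restart the Hive services afterwards so the change takes effect.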