Created 11-14-2018 03:32 PM
2018-11-14T20:36:33,075 INFO [main] org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool - Creating metastore client for PreUpgradeTool
2018-11-14T20:36:33,106 INFO [main] hive.metastore - Trying to connect to metastore with URI thrift://sjdcdlake02.np1.ril.com:9083
2018-11-14T20:36:33,328 INFO [main] hive.metastore - Opened a connection to metastore, current connections: 1
2018-11-14T20:36:33,329 INFO [main] hive.metastore - Connected to metastore.
2018-11-14T20:36:34,533 INFO [main] hive.metastore - Trying to connect to metastore with URI thrift://sjdcdlake02.np1.ril.com:9083
2018-11-14T20:36:34,548 INFO [main] hive.metastore - Opened a connection to metastore, current connections: 2
2018-11-14T20:36:34,549 INFO [main] hive.metastore - Connected to metastore.
2018-11-14T20:36:36,572 ERROR [main] org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool - PreUpgradeTool failed
org.apache.hadoop.hive.metastore.api.MetaException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby
	at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:87)
	at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:2015)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1404)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4137)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1137)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:866)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:53086) ~[hive-metastore-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:53063) ~[hive-metastore-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:52994) ~[hive-metastore-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86) ~[libthrift-0.9.3.jar:0.9.3]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1507) ~[hive-metastore-2.1.0.2.6.4.0-91.jar:2.1.0.2.6.4.0-91]
Created 07-16-2020 11:58 AM
We encountered this error during our upgrade.
Users and a third-party tool had created external Hive tables whose locations referenced individual NameNode hosts rather than the HA nameservice name. Those locations had to be corrected; see the sketch below.
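For reference, one way to correct such a table is to repoint its location at the HA nameservice with standard HiveQL. This is a minimal sketch under assumed names: the database, table, partition, and nameservice (db1.ext_table, dt, nameservice1) are placeholders and must be replaced with values from your own metastore and hdfs-site.xml.

-- Inspect the current location first (placeholder table name).
DESCRIBE FORMATTED db1.ext_table;

-- Repoint the external table from a single NameNode host to the HA nameservice.
ALTER TABLE db1.ext_table
  SET LOCATION 'hdfs://nameservice1/apps/hive/warehouse/db1.db/ext_table';

-- Partitioned tables store a location per partition, so each affected
-- partition needs the same correction (placeholder partition spec).
ALTER TABLE db1.ext_table PARTITION (dt='2018-11-14')
  SET LOCATION 'hdfs://nameservice1/apps/hive/warehouse/db1.db/ext_table/dt=2018-11-14';

If many tables are affected, the Hive metatool (hive --service metatool -updateLocation <new-loc> <old-loc>) is meant to rewrite filesystem roots in the metastore in bulk; verify its availability and exact syntax for your Hive version before relying on it.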
Created 07-16-2020 03:23 PM
First, I would advise you to start a new thread rather than updating one that is no longer being followed, and please share the logs as well.
Created 07-16-2020 04:05 PM
My issue is resolved. I wanted to provide an update to the thread in case it is found later through a search.
Created 07-16-2020 10:08 PM
I'm happy to see you resolved your issue. Can you please mark the appropriate reply as the solution? As you mentioned, it will make it easier for others to find the answer in the future.
Regards,
Vidya Sargur