Member since: 12-28-2015
Posts: 3
Kudos Received: 3
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1963 | 12-28-2015 09:06 PM
01-13-2016 07:45 PM
2 Kudos
I have successfully set up a Hadoop cluster with HDFS HA enabled. I did a manual failover of the active NameNode (nn1), and the standby NameNode (nn2) became active as expected; nn1 is now the standby. All HDFS write and read operations work fine from the hdfs command line. However, the error below pops up when using Hive (create table). I know switching nn1 back to active solves the issue, but I am looking for a workaround that doesn't require this manual operation. Thanks in advance.

FAILED: SemanticException MetaException(message:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby
at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:87)
at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:1915)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1407)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4501)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:961)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:835)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2141)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2135)
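For reference, a rough sketch of the state checks and manual failback described above, plus one direction worth checking (an assumption on my part, not something confirmed in this thread): whether the Hive metastore still records its filesystem root against nn1's address instead of the logical HA nameservice. The nameservice hdfs://mycluster and the nn1 RPC address below are placeholders.

# Check which NameNode is active and which is standby (nn1/nn2 as in the post)
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2

# The manual failback the post would like to avoid
hdfs haadmin -failover nn2 nn1

# Assumption: if the metastore filesystem root points at nn1 directly rather than
# the logical nameservice, Hive keeps contacting the standby after a failover.
# The metatool can show and, if needed, rewrite that root.
hive --service metatool -listFSRoot
hive --service metatool -updateLocation hdfs://mycluster hdfs://nn1:8020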
Labels:
- Apache Hadoop
- Apache Hive
- HDFS
- Security
12-31-2015 09:00 PM
Hi @R M, the tasks failed because they were idle for 300 seconds. It looks like the FAILED tasks had issues pulling the data. It would be great if you could post the full logs of a failed task; pull the logs with DEBUG as the log level.
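A minimal sketch of how those logs could be gathered, assuming the query runs as a MapReduce job on YARN (the query file, application ID, and output file names are placeholders; a Tez job would use tez.task.log.level instead):

# Rerun the query with DEBUG task logging
hive -hiveconf mapreduce.map.log.level=DEBUG -hiveconf mapreduce.reduce.log.level=DEBUG -f query.hql

# Aggregate the full container logs for the failed application
yarn logs -applicationId application_1451000000000_0001 > failed_task_logs.txt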
12-28-2015 09:06 PM
1 Kudo
It looks like the expected format for labeled points is different from what you have. I ran the statement below to understand the format, and I guess each row should be in the format shown below.

scala> val pos = LabeledPoint(1.0, Vectors.dense(1.0, 0.0, 3.0))
pos: org.apache.spark.mllib.regression.LabeledPoint = (1.0,[1.0,0.0,3.0])
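For comparison, a labeled point can also wrap a sparse vector. A small spark-shell sketch (the feature values are made up) showing both forms and how the pieces read back:

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

// Dense form, as in the statement above
val pos = LabeledPoint(1.0, Vectors.dense(1.0, 0.0, 3.0))

// Sparse form: vector of size 3 with non-zero entries at indices 0 and 2
val neg = LabeledPoint(0.0, Vectors.sparse(3, Array(0, 2), Array(1.0, 3.0)))

// Label and features are plain fields on the case class
println(pos.label)    // 1.0
println(neg.features) // (3,[0,2],[1.0,3.0])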