
HBase shell error

Contributor

/usr/local/Hbase/bin# ./hbase shell
Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.java.invokers.RubyToJavaInvoker (file:/usr/local/Hbase/lib/jruby-complete-1.6.8.jar) to method java.lang.Object.registerNatives()
WARNING: Please consider reporting this to the maintainers of org.jruby.java.invokers.RubyToJavaInvoker
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
ArgumentError: wrong number of arguments (0 for 1)
method_added at file:/usr/local/Hbase/lib/jruby-complete-1.6.8.jar!/builtin/javasupport/core_ext/object.rb:10
method_added at file:/usr/local/Hbase/lib/jruby-complete-1.6.8.jar!/builtin/javasupport/core_ext/object.rb:129
Pattern at file:/usr/local/Hbase/lib/jruby-complete-1.6.8.jar!/builtin/java/java.util.regex.rb:2
(root) at file:/usr/local/Hbase/lib/jruby-complete-1.6.8.jar!/builtin/java/java.util.regex.rb:1
require at org/jruby/RubyKernel.java:1062
(root) at file:/usr/local/Hbase/lib/jruby-complete-1.6.8.jar!/builtin/java/java.util.regex.rb:42
(root) at /usr/local/Hbase/bin/../bin/hirb.rb:38
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/Hbase/lib/phoenix-5.0.0-HBase-2.0-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Hbase/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

1 ACCEPTED SOLUTION

Master Mentor

@Manoj690 

 

This looks like an HDFS permissions issue: you are executing the command as the root user, while the inode ownership is hive:hadoop. See below:

Error: Error while processing statement: FAILED: Execution Error, return
code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:Got exception:
org.apache.hadoop.security.AccessControlException Permission denied:
user=root, access=EXECUTE,
inode="/warehouse/tablespace/managed/hive":hive:hadoop:drwx------
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)

Can you do the following while logged in on the CLI as root:

# su - hive

$ hive

Then execute your CREATE TABLE statement; it should succeed. If not, please share the log.
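For reference, a rough sketch of the full sequence (the warehouse path is taken from the error above; adjust it if yours differs):

# Confirm the ownership and permissions reported in the error
hdfs dfs -ls /warehouse/tablespace/managed

# Switch to the hive user and retry the DDL from the Hive prompt
su - hive
hive
# hive> CREATE TABLE IF NOT EXISTS emp (eid int, name string, salary int) ...;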


15 REPLIES

Master Collaborator

@Manoj690 What is the Java version?

Contributor
java -version
java version "11.0.4" 2019-07-16 LTS
Java(TM) SE Runtime Environment 18.9 (build 11.0.4+10-LTS)
Java HotSpot(TM) 64-Bit Server VM 18.9 (build 11.0.4+10-LTS, mixed mode)

Master Collaborator

@Manoj690 JDK 11 is not yet tested; refer to the document below:

 

https://hbase.apache.org/book.html#java

Contributor
So which HBase version do I need to install?

Master Collaborator

@Manoj690 JDK 8 is supported with the current HBase version.
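As a rough sketch (the JDK 8 path below is only an example; adjust it to wherever JDK 8 is installed on your machine), you can point this HBase install at JDK 8 via conf/hbase-env.sh:

# In /usr/local/Hbase/conf/hbase-env.sh
# Point HBase at a JDK 8 installation (example path only; adjust to your system)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

# Then, from /usr/local/Hbase, restart HBase and retry the shell
./bin/stop-hbase.sh && ./bin/start-hbase.sh
./bin/hbase shell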

Contributor
is_disabled 'emp'

ERROR: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is
not online on gaian-lap386.com,16020,1573102321915
at
org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionByEncodedName(HRegionServer.java:3273)
at
org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3250)
at
org.apache.hadoop.hbase.regionserver.RSRpcServices.getRegion(RSRpcServices.java:1414)
at
org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2446)
at
org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:41998)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:131)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)

Is named table disabled? For example:
hbase> is_disabled 't1'
hbase> is_disabled 'ns1:t1'

Master Collaborator

It looks like you are now able to open the HBase shell.

 

If the table is disabled, the "is_disabled 't1'" command will return true.

ERROR: org.apache.hadoop.hbase.NotServingRegionException: hbase:meta,,1 is
not online on gaian-lap386.com,16020,1573102321915

Check whether the region server on gaian-lap386.com is up and running.
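A few quick checks, as a sketch (the log path is an assumption; on an Ambari/HDP-managed install the locations will differ):

# On gaian-lap386.com: is the HRegionServer JVM running?
jps | grep HRegionServer

# From the HBase shell, list live servers and region assignments
# hbase> status 'detailed'

# Tail the region server log for errors (path is an assumption; adjust to your install)
tail -n 100 /usr/local/Hbase/logs/hbase-*-regionserver-*.log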

 

 

Contributor
CREATE TABLE IF NOT EXISTS emp ( eid int, name String,salary int)
. . . . . . . . . . . . . . . . . . . . . . .> COMMENT 'Emp Details'
. . . . . . . . . . . . . . . . . . . . . . .> ROW FORMAT DELIMITED
. . . . . . . . . . . . . . . . . . . . . . .> FIELDS TERMINATED BY '\t'
. . . . . . . . . . . . . . . . . . . . . . .> LINES TERMINATED BY '\n'
. . . . . . . . . . . . . . . . . . . . . . .> STORED AS TEXTFILE;
INFO : Compiling
command(queryId=hive_20191107162111_3eabc23c-ec62-44f2-aa14-fcaefb1f3e2d):
CREATE TABLE IF NOT EXISTS emp ( eid int, name String,salary int)
COMMENT 'Emp Details'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling
command(queryId=hive_20191107162111_3eabc23c-ec62-44f2-aa14-fcaefb1f3e2d);
Time taken: 0.863 seconds
INFO : Executing
command(queryId=hive_20191107162111_3eabc23c-ec62-44f2-aa14-fcaefb1f3e2d):
CREATE TABLE IF NOT EXISTS emp ( eid int, name String,salary int)
COMMENT 'Emp Details'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got
exception: org.apache.hadoop.security.AccessControlException Permission
denied: user=root, access=EXECUTE,
inode="/warehouse/tablespace/managed/hive":hive:hadoop:drwx------
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:315)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:242)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:606)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1806)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1824)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:681)
at
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:114)
at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3106)
at
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1154)
at
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:966)
at
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
)
INFO : Completed executing
command(queryId=hive_20191107162111_3eabc23c-ec62-44f2-aa14-fcaefb1f3e2d);
Time taken: 25.329 seconds
Error: Error while processing statement: FAILED: Execution Error, return
code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:Got exception:
org.apache.hadoop.security.AccessControlException Permission denied:
user=root, access=EXECUTE,
inode="/warehouse/tablespace/managed/hive":hive:hadoop:drwx------
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:315)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:242)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:606)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1806)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1824)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:681)
at
org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:114)
at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3106)
at
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1154)
at
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:966)
at
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
) (state=08S01,code=1)

Master Mentor

@Manoj690 

 

This looks like an HDFS permissions issue: you are executing the command as the root user, while the inode ownership is hive:hadoop. See below:

Error: Error while processing statement: FAILED: Execution Error, return
code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:Got exception:
org.apache.hadoop.security.AccessControlException Permission denied:
user=root, access=EXECUTE,
inode="/warehouse/tablespace/managed/hive":hive:hadoop:drwx------
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)

Can you do the following while logged in on the CLI as root:

# su - hive

$ hive

Then execute your CREATE TABLE statement; it should succeed. If not, please share the log.