Support Questions

Find answers, ask questions, and share your expertise

Getting an error while creating a database using the Hive CLI on an HDP 3.0 four-node cluster

Explorer

0: jdbc:hive2://dpysydirbd201.sl.bluecloud.ib> create database test;
INFO : Compiling command(queryId=hive_20180731025445_cecd377d-4927-4566-8c2b-6305ee07d798): create database test
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling command(queryId=hive_20180731025445_cecd377d-4927-4566-8c2b-6305ee07d798); Time taken: 0.021 seconds
INFO : Executing command(queryId=hive_20180731025445_cecd377d-4927-4566-8c2b-6305ee07d798): create database test
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/warehouse/tablespace/external/hive":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:261)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkDefaultEnforcer(RangerHdfsAuthorizer.java:512)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:305)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1850)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1834)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1784)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:7767)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2217)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1659)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:872)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:818)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1688)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2678)
)
INFO : Completed executing command(queryId=hive_20180731025445_cecd377d-4927-4566-8c2b-6305ee07d798); Time taken: 0.019 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hive, access=WRITE, inode="/warehouse/tablespace/external/hive":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:261)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkDefaultEnforcer(RangerHdfsAuthorizer.java:512)
at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:305)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1850)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1834)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1784)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:7767)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2217)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1659)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:872)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:818)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1688)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2678)
) (state=08S01,code=1)
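For context, the denial above is a WRITE check for user=hive on /warehouse/tablespace/external/hive, which the trace shows is owned by hdfs:hdfs with mode drwxr-xr-x, so only the hdfs user can write there. A quick diagnostic (a sketch, assuming it is run from any cluster node with an HDFS client) is to confirm the ownership of the path itself:

```shell
# Show the directory entry itself (-d), not its contents; the owner, group,
# and mode columns should match the ones quoted in the error message.
hdfs dfs -ls -d /warehouse/tablespace/external/hive
```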

2 ACCEPTED SOLUTIONS

Super Collaborator

@Birendra Singh The problem is that the hive user is unable to create the database directory under the default warehouse directory, i.e. /warehouse/tablespace/managed/hive.

By default the directory is owned by the hive user and the hadoop group. You can run the HDFS command below as the hdfs user and then try to create the database again:

hdfs dfs -chown -R hive:hadoop /warehouse/tablespace/managed/hive
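A fuller sketch of this fix (assuming sudo access to the hdfs superuser on a node with the HDFS client installed; the external warehouse path is included because the original error referenced it, which may or may not apply to your cluster):

```shell
# Change ownership of the managed warehouse directory to hive:hadoop,
# running as the hdfs superuser via sudo.
sudo -u hdfs hdfs dfs -chown -R hive:hadoop /warehouse/tablespace/managed/hive

# The original error pointed at the external warehouse path, so the same
# change may be needed there as well.
sudo -u hdfs hdfs dfs -chown -R hive:hadoop /warehouse/tablespace/external/hive

# Verify the new ownership before retrying CREATE DATABASE.
hdfs dfs -ls -d /warehouse/tablespace/managed/hive
```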

PS: Please accept the answer if you find it correct.


Explorer

Hi, yes, it is resolved. Thank you.


6 REPLIES

Super Collaborator

@Birendra Singh The problem is that the hive user is unable to create the database directory under the default warehouse directory, i.e. /warehouse/tablespace/managed/hive.

By default the directory is owned by the hive user and the hadoop group. You can run the HDFS command below as the hdfs user and then try to create the database again:

hdfs dfs -chown -R hive:hadoop /warehouse/tablespace/managed/hive

PS: Please accept the answer if you find it correct.

New Contributor

I am new to Hive and I am getting this same error. Could you please tell me where to execute this command?

Explorer

Hi, yes, it is resolved. Thank you.

Super Collaborator

@Birendra Singh Curious to know if the problem was solved?

Explorer

@Chiran Ravani

After doing what you mentioned, I am still unable to create a table.

Please find the error message below:


create table patient(Patient_Id int,
. . . . . . . . . . . . . . . . . . . . . > Full_Name string,
. . . . . . . . . . . . . . . . . . . . . > SSN string,
. . . . . . . . . . . . . . . . . . . . . > Email string,
. . . . . . . . . . . . . . . . . . . . . > Phone_no string,
. . . . . . . . . . . . . . . . . . . . . > Gender string,
. . . . . . . . . . . . . . . . . . . . . > Addr_line1 string,
. . . . . . . . . . . . . . . . . . . . . > Addr_line2 string,
. . . . . . . . . . . . . . . . . . . . . > Addr_line3 string,
. . . . . . . . . . . . . . . . . . . . . > City string,
. . . . . . . . . . . . . . . . . . . . . > Country string,
. . . . . . . . . . . . . . . . . . . . . > Race string,
. . . . . . . . . . . . . . . . . . . . . > Drug1 string,
. . . . . . . . . . . . . . . . . . . . . > Drug2 string,
. . . . . . . . . . . . . . . . . . . . . > ICD_Code string)
. . . . . . . . . . . . . . . . . . . . . > ROW FORMAT DELIMITED
. . . . . . . . . . . . . . . . . . . . . > FIELDS TERMINATED BY ','
. . . . . . . . . . . . . . . . . . . . . > LINES TERMINATED BY '\n'
. . . . . . . . . . . . . . . . . . . . . > STORED AS TEXTFILE ;

INFO : Compiling command(queryId=hive_20190322085132_34c73f65-38bf-4445-8580-44a263066a55): create table patient(Patient_Id int,
Full_Name string,
SSN string,
Email string,
Phone_no string,
Gender string,
Addr_line1 string,
Addr_line2 string,
Addr_line3 string,
City string,
Country string,
Race string,
Drug1 string,
Drug2 string,
ICD_Code string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
STORED AS TEXTFILE
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling command(queryId=hive_20190322085132_34c73f65-38bf-4445-8580-44a263066a55); Time taken: 0.064 seconds
INFO : Executing command(queryId=hive_20190322085132_34c73f65-38bf-4445-8580-44a263066a55): create table patient(Patient_Id int,
Full_Name string,
SSN string,
Email string,
Phone_no string,
Gender string,
Addr_line1 string,
Addr_line2 string,
Addr_line3 string,
City string,
Country string,
Race string,
Drug1 string,
Drug2 string,
ICD_Code string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
STORED AS TEXTFILE
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=dm_user, access=EXECUTE, inode="/warehouse/tablespace/managed/hive/dm_dev.db":hive:hadoop:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:315)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:242)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:606)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1799)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1817)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:674)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:114)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3091)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1154)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:966)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:872)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:818)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1688)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2678)
)
INFO : Completed executing command(queryId=hive_20190322085132_34c73f65-38bf-4445-8580-44a263066a55); Time taken: 0.102 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=dm_user, access=EXECUTE, inode="/warehouse/tablespace/managed/hive/dm_dev.db":hive:hadoop:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:315)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:242)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:606)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1799)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1817)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:674)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:114)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3091)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1154)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:966)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:872)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:818)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1688)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2678)
) (state=08S01,code=1)
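For reference, this second failure differs from the first: the denied access is EXECUTE for user=dm_user on /warehouse/tablespace/managed/hive/dm_dev.db, which is owned hive:hadoop with mode drwxrwx---, so only the hive user and members of the hadoop group can traverse it. A hypothetical check and one possible remedy are sketched below; whether OS-level group membership or a Ranger policy is the right fix depends entirely on the cluster's security configuration, and the usermod step assumes shell-based group mapping is in effect:

```shell
# See which groups HDFS resolves for dm_user; group-based access to the
# directory above requires membership in "hadoop".
hdfs groups dm_user

# One option, assuming OS-level group mapping: add dm_user to the hadoop
# group on the NameNode host. Granting access via a Ranger policy is the
# usual alternative on a Ranger-enabled cluster.
sudo usermod -aG hadoop dm_user
```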

Explorer

I'm also getting the same issue, but in my case, instead of my username, it shows the user as Anonymous. Can anyone suggest how I should resolve this?