Member since: 06-28-2022
Posts: 9
Kudos Received: 0
Solutions: 0
10-09-2023
12:49 AM
Hi Hassan, please check whether any of the YARN instances are down. If not, do a rolling restart of YARN, then run the same command again and check whether the message persists. Secondly, can you share the roles you have assigned to these 3 nodes?
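To check the YARN instances from the command line, something like the following can be run on a cluster host (a sketch assuming the `yarn` CLI is on the PATH; the node ID is a placeholder):

```shell
# List every NodeManager with its state; anything not RUNNING
# (e.g. LOST, UNHEALTHY, DECOMMISSIONED) points at a down instance.
yarn node -list -all

# Drill into one node's status/health report (substitute a real node ID
# from the list above, e.g. host:port).
yarn node -status <node-id>
```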
02-02-2023
05:30 AM
When I open the Hue dashboard, I get the error "Could not connect to hostname:10000" and "Error loading databases". Likewise, when I try to run any query on Hive it gives:

ImpalaRuntimeException: Error making 'createDatabase' RPC to Hive Metastore:
CAUSED BY: MetaException: Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=impala, access=WRITE, inode="/user":esadmin:supergroup:drwxrwxr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:400)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:256)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:194)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1846)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1830)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1789)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:60)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3130)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1116)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:696)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1726)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
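The stack trace says the `impala` user lacks WRITE access on `/user`, which is owned by `esadmin:supergroup`. One common remedy is to create a home directory for `impala` and hand it over, run as the HDFS superuser. A hedged sketch (the superuser name `hdfs` and the `/user/impala` path are assumptions, not confirmed cluster settings):

```shell
# Create impala's home directory under /user (as the HDFS superuser,
# commonly 'hdfs' -- adjust to your cluster).
sudo -u hdfs hdfs dfs -mkdir -p /user/impala

# Give the impala user ownership so Metastore-driven mkdirs succeed.
sudo -u hdfs hdfs dfs -chown impala:impala /user/impala

# Verify the resulting owner and permissions.
sudo -u hdfs hdfs dfs -ls /user
```

If databases are meant to land somewhere other than `/user`, fixing the Hive warehouse/database location setting may be the better route than loosening `/user` itself.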
11-12-2022
01:09 PM
@hassan-ki5 This looks like a typical CM database connection issue. Can you check and compare the entries in /etc/cloudera-scm-server/db.properties:

com.cloudera.cmf.db.type=[oracle/mysql/postgresql]
com.cloudera.cmf.db.host=localhost
com.cloudera.cmf.db.name=scm
com.cloudera.cmf.db.user=scm
com.cloudera.cmf.db.setupType=EXTERNAL
com.cloudera.cmf.db.password=scm

Ensure the db.password, db.name, and db.user values are correct. Since you seem to be running MySQL, can you check this page: CM using MySQL.
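A quick way to confirm those db.properties values actually work is to connect with them directly. A minimal sketch, assuming the example values above (host `localhost`, user/password/database `scm`) rather than your real ones:

```shell
# Connect with the exact credentials from db.properties; a successful
# "SELECT 1" proves the values are valid and the DB is reachable.
mysql -h localhost -u scm -pscm scm -e 'SELECT 1;'
```

If this fails with an access-denied or connection-refused error, fix the credentials or the MySQL service first, then restart cloudera-scm-server.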
11-10-2022
10:39 PM
Hi @pajoshi, I have tried generating new certificates and have already inserted the signed .crt files, but I am getting the same issue.
11-03-2022
07:53 AM
Hi @hassan-ki5, from the error it is clear that certificate verification has failed. This is most likely because your SSL certificates have already expired. You will need to renew your certificates to make this work. Note that services keep running on expired certificates until they are restarted. Thank you.
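Certificate expiry can be checked quickly with openssl's `-checkend` flag. A self-contained sketch (it generates a throwaway self-signed cert under /tmp purely for demonstration; in practice, point `-in` at your actual .crt file):

```shell
# Demo only: create a throwaway self-signed cert valid for 1 day.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout /tmp/demo.key -out /tmp/demo.crt -days 1 2>/dev/null

# Show the expiry date of the certificate.
openssl x509 -in /tmp/demo.crt -noout -enddate

# -checkend 0 exits 0 iff the cert is still valid at this moment.
openssl x509 -in /tmp/demo.crt -noout -checkend 0 \
  && echo "certificate still valid" \
  || echo "certificate EXPIRED - renew it and restart the services"
```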