Member since: 03-19-2018
Posts: 47
Kudos Received: 2
Solutions: 0
07-11-2019
02:15 PM
@BiggieSmalls, The error you show indicates that the certificate and key files specified for Hue are not in the expected PEM format. The key file needs to contain the key in base64 between -----BEGIN ENCRYPTED PRIVATE KEY----- and -----END ENCRYPTED PRIVATE KEY-----. The certificate file needs to contain the certificate in base64 between -----BEGIN CERTIFICATE----- and -----END CERTIFICATE-----. The "no start line" error from the openssl libraries means that no BEGIN line could be found, so make sure your ssl_certificate and ssl_private_key files contain the headers above.
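As a quick sanity check (the file paths below are only placeholders for whatever you configured as ssl_certificate and ssl_private_key), openssl can confirm the files really are PEM-encoded:

# The first line of each file should literally be a -----BEGIN ...----- header
head -1 /path/to/hue-cert.pem /path/to/hue-key.pem

# Should print the certificate details instead of a "no start line" error
openssl x509 -in /path/to/hue-cert.pem -noout -text

# Should parse the key (prompting for the passphrase if it is encrypted)
openssl pkey -in /path/to/hue-key.pem -noout

If either command fails, the file is likely DER-encoded or otherwise malformed and needs to be re-exported or converted to PEM.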
02-26-2019
02:14 AM
Hello @Johnny_Bach, As per the Enterprise Datasheet, the Express edition doesn't support "Advanced Features". Below is the screenshot for your reference. Hope that helps.
02-04-2019
03:18 AM
I faced the same issue. It turned out to be caused by Auto-TLS being enabled, which is a feature of the Enterprise edition only. That is not obvious from the setup tutorial.
11-21-2018
10:52 PM
@Tomas79, Thank you for the inputs
10-10-2018
01:30 PM
MEM_LIMIT is a hard limit on the amount of memory a query can use and cannot be re-negotiated during execution. If the default mem_limit you set does not suffice, you can either increase the default or set the mem_limit query option to a higher value for that one query.
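A minimal sketch of the per-query override, assuming impala-shell is available (the 10g value and the table name are only example placeholders, size them for your workload):

# Raise the limit for just this session/statement, then run the query
impala-shell -q "set mem_limit=10g; select count(*) from my_table;"

The same SET MEM_LIMIT statement can also be issued from Hue or an interactive impala-shell session before running the query, so the cluster-wide default stays untouched.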
09-25-2018
12:36 AM
@mmm286, in case your issue is resolved, could you please update the thread with the details and mark it as solved.
09-13-2018
09:10 AM
No, restarting the Cloudera Manager server and agents to pick up the new certificates should not affect any of the Hadoop cluster services. Regards, Jim
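For reference, on package-based installs the restart is typically done with the service units below; the exact unit names can vary by install method and OS, so treat this as a sketch:

# On the Cloudera Manager server host
sudo systemctl restart cloudera-scm-server

# On every managed host
sudo systemctl restart cloudera-scm-agent

A normal agent restart leaves the running Hadoop role processes alone; the agents simply reconnect to the server once it is back up.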
09-13-2018
07:06 AM
I agree with @Tomas79 on restarting the services for the new certificates to take effect.
06-20-2018
12:30 AM
We are using CDH 5.14.0, and I found that our components (HDFS, YARN, HBase) would restart because of the same issue. The exception looks like this:

java.io.IOException: Cannot run program "stat": error=11, Resource temporarily unavailable
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:551)
    at org.apache.hadoop.util.Shell.run(Shell.java:507)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:789)
    at org.apache.hadoop.fs.HardLink.getLinkCount(HardLink.java:218)
    at org.apache.hadoop.hdfs.server.datanode.ReplicaInfo.breakHardLinksIfNeeded(ReplicaInfo.java:265)
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:1177)
    at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:1148)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.<init>(BlockReceiver.java:210)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:675)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:169)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:106)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:246)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: error=11, Resource temporarily unavailable
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
    at java.lang.ProcessImpl.start(ProcessImpl.java:134)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
    ... 13 more
2018-06-20 02:05:54,797 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DataNode is out of memory. Will retry in 30 seconds.
java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:717)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:154)
    at java.lang.Thread.run(Thread.java:748)

Also, I noted that Cloudera Manager sets the ulimit for us. Here is our config:

if [ $(id -u) -eq 0 ]; then
  # Max number of open files
  ulimit -n 32768
  # Max number of child processes and threads
  ulimit -u 65536
  # Max locked memory
  ulimit -l unlimited
fi

PS: our machine has 72 cores and 250 GB of RAM. Could you help me understand what causes the native thread creation to fail?
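In case it helps anyone diagnose the same error, a rough way to check whether a process/thread limit is actually being hit on the DataNode host (the pgrep pattern and paths are only examples and may need adjusting for your environment):

# PID of the DataNode process
DN_PID=$(pgrep -f org.apache.hadoop.hdfs.server.datanode.DataNode | head -1)

# Limits the running process actually inherited (check "Max processes")
cat /proc/$DN_PID/limits

# Number of threads the DataNode currently has
ps -o nlwp= -p $DN_PID

# System-wide ceilings on threads and PIDs
cat /proc/sys/kernel/threads-max /proc/sys/kernel/pid_max

If the inherited "Max processes" value is much lower than the ulimit -u 65536 you expect, the process was probably started before the limit applied, or the limit is being set for a different user than the one running the DataNode.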