
Unable to start ZKFC

Explorer

While trying to start ZKFC from Ambari, it fails with the error below:

resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s 
/bin/bash -c 'ulimit -c unlimited ; /usr/hdp/3.1.0.0-78/hadoop/bin/hdfs --config /usr/hdp/3.1.0.0-78/hadoop/conf
--daemon start zkfc'' returned 1. ERROR: Cannot set priority of zkfc process 2866

Does anybody know a solution?

Thanks in advance.

 

2 Replies

Master Guru

@ManjunathK Can you check the jars in the Hadoop classpath directory /usr/hdp/3.1.0.0-78/hadoop-hdfs/ on the problematic node and see if there are any 0-byte files? If so, copy all the jars from a working node and then start the services.
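A quick way to do that check is with `find`. This is a minimal sketch, assuming the HDP 3.1.0.0-78 install path from the error message; adjust the directory for your version:

```shell
# Look for truncated (0-byte) jars, which indicate a corrupt install.
# The path below is the HDP hadoop-hdfs lib directory from this thread;
# change it to match your own stack version.
HDFS_LIB="${HDFS_LIB:-/usr/hdp/3.1.0.0-78/hadoop-hdfs}"

# -size 0 matches files that are exactly zero bytes.
find "$HDFS_LIB" -name '*.jar' -size 0 -print
```

Any path printed is a jar that should be replaced with a good copy from a working node.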


Cheers!

Explorer

I recently faced this same issue on one of our environments.

Error:

returned 1. ERROR: Cannot set priority of zkfc process 24167

When I dug deeper into the ZKFC log, I found this error:

ERROR org.apache.hadoop.ha.ZKFailoverController: Unable to start failover controller. Parent znode does not exist.
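You can confirm that the parent znode is really missing before formatting. A hedged sketch using the ZooKeeper CLI; the `zkCli.sh` path is typical for an HDP install and may differ on your cluster, and `/hadoop-ha` is the default parent znode unless `ha.zookeeper.parent-znode` was changed:

```shell
# Connect to ZooKeeper and list the HA parent znode.
# If ZKFC reports "Parent znode does not exist", this ls will fail
# with "Node does not exist: /hadoop-ha".
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server localhost:2181 ls /hadoop-ha
```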

I applied the solution of formatting the ZKFC znode, and once that was done I restarted the services and it worked fine:

bin/hdfs zkfc -formatZK
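For completeness, the steps above can be sketched as follows. This assumes ZooKeeper is up and that you run the commands as the `hdfs` service user; in an Ambari-managed cluster you would normally stop and restart ZKFC from the Ambari UI instead of with `--daemon`:

```shell
# 1. Recreate the HA parent znode in ZooKeeper (run on a NameNode host).
#    -formatZK prompts before overwriting an existing znode.
sudo -u hdfs hdfs zkfc -formatZK

# 2. Restart the failover controller so it picks up the new znode.
sudo -u hdfs hdfs --daemon start zkfc
```

Note that `-formatZK` only touches the ZooKeeper state used for failover coordination; it does not format HDFS data.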