Support Questions

Find answers, ask questions, and share your expertise

Can't start service cloudera-scm-server | Permission denied /etc/init.d/cloudera-scm-server

New Contributor

I am unable to start the cloudera-scm-server service due to a permission denied error. I am trying to set up Spark 2.0 in the Cloudera QuickStart VM. After updating to Java 8, I try to start the server, but the error below occurs. Any help will be much appreciated.




Expert Contributor

Please restart the service with root privileges, e.g.

# sudo service cloudera-scm-server restart

Note that force_start has a different purpose than helping over permissions issues.
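Since the thread title reports a permission denial on the init script itself, it may also be worth checking that the script still has its execute bit after the Java update. A minimal sketch (the path is taken from the thread title, and `755` is only the usual mode for init scripts, so adjust as needed):

```shell
# Check the init script named in the error; restore the execute bit if it was lost.
SCRIPT=/etc/init.d/cloudera-scm-server
if [ -e "$SCRIPT" ] && [ ! -x "$SCRIPT" ]; then
    sudo chmod 755 "$SCRIPT"    # rwxr-xr-x, the typical mode for init scripts
fi
# Then see whether the service reacts at all (ignore the exit code here).
sudo service cloudera-scm-server status || true
```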

New Contributor

Sorry, this did not work: restart, start, and force_start all fail with the same error.

Super Collaborator

Hi @Abhipreyo ,


Did you try as the root user, or with sudo as @gzigldrum suggested? What is the CM version?




Li Wang, Technical Solution Manager

Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.


New Contributor

I tried with sudo. Nevertheless, the issue is resolved now: after rebooting three or four times, it went away.


But now I am back to my end goal, which is to set up Spark2 in Cloudera. After completing the configuration, I try to add the Spark2 service, and during First Run I get the error given below:


5:28:57.606 PM FATAL DataNode

Exception in secureMain
java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) is greater than zero and native code is not available.
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(
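The exception itself points at the cause: the DataNode refuses to start because dfs.datanode.max.locked.memory is set above zero while the native Hadoop libraries (libhadoop) are not loadable. A common workaround is to set the value back to zero; a sketch of the raw hdfs-site.xml form is below (in Cloudera Manager I believe this same property is exposed as the DataNode setting "Maximum Memory Used for Caching", which you would set to 0 through the UI instead):

```
<property>
  <!-- Must be 0 when native code (libhadoop) is unavailable -->
  <name>dfs.datanode.max.locked.memory</name>
  <value>0</value>
</property>
```

You can also check whether the native libraries load at all with `hadoop checknative -a`.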



New Contributor

Can anyone help me here?

Expert Contributor

Can you please copy the output of the following commands here, to clarify the permissions problem:


# namei -l /var/log/cloudera-scm-server/cloudera-scm-server.out
# namei -l /var/run/

New Contributor

Here it goes. This issue is resolved now, but I can't reach the level where I can start using Spark2 in Cloudera. The issue description is given earlier in this thread.

