
org.apache.ambari.server.AmbariException: sudo


New Contributor
WARN [Server Action Executor Worker 3355] ServerActionExecutor:497 - Task #3355 failed to complete execution due to thrown exception: org.apache.ambari.server.AmbariException: sudo: sorry, you must have a terminal to run sudo


org.apache.ambari.server.AmbariException: sudo: sorry, you must have a terminal to execute sudo


6 replies

Re: org.apache.ambari.server.AmbariException: sudo

Super Mentor

@Elvis Zhang

Are you running Ambari as a non-root user?

If so, then you should refer to: https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.18/bk_ambari-security/content/_how_to_configu...

Also, please check the sudo permissions in your sudoers file (/etc/sudoers, edited via "visudo"). Example:

# sudo visudo

## Allow root to run any commands anywhere
root  ALL=(ALL)  ALL
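For a non-root Ambari Server, the sudoers file also needs entries for the non-root account itself. A minimal sketch, assuming the account is named "ambari" (the exact command list depends on your Ambari/HDP version, so treat the doc linked above as authoritative):

```
## /etc/sudoers fragment -- always edit via "sudo visudo".
## Assumption: the Ambari Server runs as the "ambari" user.
## Allow that user to switch to the service accounts without a password:
ambari ALL=(ALL) NOPASSWD:SETENV: /bin/su hdfs *, /bin/su ambari-qa *
```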


Sudo Defaults - Ambari Server : https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-security/content/sudo_defaults_se...

If sudo is not properly set up, the following error will be seen when the "Configure Ambari Identity" stage fails:
stderr: 
sudo: no tty present and no askpass program specified

stdout:
Server action failed
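That "no tty present" message usually comes from the requiretty setting in /etc/sudoers: server actions run without a terminal, so sudo refuses. A minimal sketch of the fix, assuming the non-root user is "ambari":

```
## /etc/sudoers fragment (edit with "sudo visudo")
## Either comment out a global requiretty ...
# Defaults    requiretty
## ... or exempt only the ambari user:
Defaults:ambari !requiretty
```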



Re: org.apache.ambari.server.AmbariException: sudo

New Contributor

Yes, I changed the user to ambari and that error is gone. But another question occurred: Hadoop's DataNode and NameNode can't be started.

resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config 
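For context on that command line: ambari-sudo.sh is the agent's wrapper that escalates to root when the agent itself runs as a non-root user, which is why the sudoers setup still matters after switching users. A rough, hypothetical sketch of what such a wrapper does (the real script ships with the agent and has more logic):

```shell
# Write a toy ambari-sudo.sh-style wrapper to a scratch path (assumption:
# run the command directly when already root, otherwise prepend sudo).
cat > /tmp/ambari-sudo-sketch.sh <<'EOF'
#!/bin/sh
if [ "$(id -u)" -eq 0 ]; then
    exec "$@"          # already root: run the command as-is
else
    exec sudo -n "$@"  # non-root: escalate; -n fails fast instead of prompting
fi
EOF
chmod +x /tmp/ambari-sudo-sketch.sh
```

Running `/tmp/ambari-sudo-sketch.sh whoami` on a correctly configured non-root host should print `root`; if the sudoers rules are wrong it fails immediately, which is exactly the class of error in this thread.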

Re: org.apache.ambari.server.AmbariException: sudo

Super Mentor

@Elvis Zhang

As you mentioned, the DataNode and NameNode can't be started now.

- Does that mean you are getting an error/exception in the DataNode/NameNode logs?

- Or in the ambari-server.log? Can you please share the complete log along with the respective stack trace?

- Apart from the NN and DN, are you able to start other components (like ZooKeeper, etc.)?

- Apart from the ambari-server, have you set up the "sudoer" permissions properly for every ambari-agent as well, as mentioned in: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-security/content/how_to_configure...

Re: org.apache.ambari.server.AmbariException: sudo

New Contributor

1. Not all of the services fail to start.

(screenshot attached: 14118-image-208.png)

2. I have set up the "sudoer" permissions properly for every ambari-agent as well, as mentioned in "ambari_agent_for_non-root.html".

3. ZooKeeper can start.

4. DataNode start error logs below:

stderr: /var/lib/ambari-agent/data/errors-3731.txt

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 174, in <module>
    DataNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 720, in restart
    self.start(env, upgrade_type=upgrade_type)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 61, in start
    datanode(action="start")
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_datanode.py", line 68, in datanode
    create_log_dir=True
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/utils.py", line 269, in service
    Execute(daemon_cmd, not_if=process_id_exists_command, environment=hadoop_env_exports)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start datanode'' returned 1. starting datanode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-datanode-hadoop-namenode-1.out

stdout: /var/lib/ambari-agent/data/output-3731.txt

2017-03-28 15:59:11,350 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.0.0-169
2017-03-28 15:59:11,352 - Checking if need to create versioned

Re: org.apache.ambari.server.AmbariException: sudo

Super Mentor

@Elvis Zhang

This error seems to be occurring on the DataNode side: the start command appears to be triggered properly from the Ambari side but then fails, so looking at the following file might help:

/var/log/hadoop/hdfs/hadoop-hdfs-datanode-hadoop-namenode-1.out
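The .out file often contains only the "starting datanode" line; the matching .log file in the same directory usually holds the real stack trace. A small runnable sketch of how to surface the failure lines (it uses a throwaway file so it runs anywhere; on the real node, point the grep at the actual hadoop-hdfs-datanode .log file):

```shell
# Simulate a DataNode log with a typical failure line (assumption: the real
# cause could be a clusterID mismatch, permissions, bad data dirs, etc.).
LOG=/tmp/hadoop-hdfs-datanode-example.log
printf '%s\n' \
  'INFO  datanode.DataNode: starting datanode' \
  'ERROR datanode.DataNode: java.io.IOException: Incompatible clusterIDs' > "$LOG"

# Surface only the failure lines -- run the same grep on the real .log file:
grep -iE 'error|exception|fatal|denied' "$LOG"
```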


Re: org.apache.ambari.server.AmbariException: sudo

Super Mentor

@Elvis Zhang Also, it would be good to see whether you can start the DataNode manually without any issue, using the following command:

# su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start datanode'
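If the manual start returns cleanly but the DataNode still dies, the pid file tells you whether the JVM actually stayed up. A hedged sketch (the pid-file path is an assumption based on common HDP defaults under /var/run/hadoop/hdfs; on a machine without a DataNode it simply reports "not running"):

```shell
# Check whether a DataNode JVM is alive via its pid file (path assumed).
PIDFILE=/var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid
if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
    echo "DataNode is running (pid $(cat "$PIDFILE"))"
else
    echo "DataNode is not running -- check the .log file for the cause"
fi
```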
