Support Questions


Zeppelin install in Ambari fails with error: Nothing to do

Rising Star

This is a single-node HDP cluster (not the sandbox). I am installing Zeppelin on the same node using Ambari and got this error:

 File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install python-pip' returned 1. Error: Nothing to do

Any idea how to work around this?

1 ACCEPTED SOLUTION

Rising Star

I think it is failing because pip is already installed on this machine.
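A quick way to test that theory on the node (plain shell, nothing HDP-specific; the comments describe one possible explanation, not a confirmed root cause):

```shell
# Quick sanity check: is a pip binary already on this node's PATH?
# If pip was installed outside of yum/rpm (e.g. via get-pip.py), yum
# may find no package to act on and report "Nothing to do".
if command -v pip >/dev/null 2>&1; then
    echo "pip found at: $(command -v pip)"
else
    echo "pip not found on PATH"
fi
```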


11 REPLIES

Expert Contributor

Which version of the HDP sandbox is this? Try using the latest HDP 2.4.

http://hortonworks.com/products/hdp/

Expert Contributor

Refer to this page on how to install the Python pip tools manually.

http://www.liquidweb.com/kb/how-to-install-pip-on-centos-7/
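For reference, the linked steps amount to roughly the following (assuming a CentOS/RHEL 7 host with root access and internet connectivity; EPEL is the repo that ships python-pip):

```shell
# Manual pip install on CentOS/RHEL 7 (run as root).
# Assumes the EPEL repository is reachable from this host.
yum -y install epel-release   # enable the EPEL repo, which provides python-pip
yum -y install python-pip     # install pip itself
pip --version                 # confirm pip is now on the PATH
```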

Rising Star

It is not a sandbox but a standalone cluster. It is indeed 2.4.

Expert Contributor

OK, this is failing on the Python pip tools. Did you find a way to install them successfully?

Rising Star

I think it is failing because pip is already installed on this machine.

Rising Star

I did a manual install instead of using Ambari. It went through fine.


Could you please post more of the logs?

Rising Star

I need help with Zeppelin. I am getting resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install python-pip' returned 1. Error: Nothing to do (below is my log from Ambari). Zeppelin is installed on a cluster, not the sandbox. Thanks.

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 235, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 54, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 404, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 49, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install python-pip' returned 1. Error: Nothing to do

stdout:

2016-04-19 23:09:14,624 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.0.0-169
2016-04-19 23:09:14,624 - Checking if need to create versioned conf dir /etc/hadoop/2.4.0.0-169/0
2016-04-19 23:09:14,624 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-04-19 23:09:14,647 - call returned (1, '/etc/hadoop/2.4.0.0-169/0 exist already', '')
2016-04-19 23:09:14,647 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-04-19 23:09:14,669 - checked_call returned (0, '/usr/hdp/2.4.0.0-169/hadoop/conf -> /etc/hadoop/2.4.0.0-169/0')
2016-04-19 23:09:14,669 - Ensuring that hadoop has the correct symlink structure
2016-04-19 23:09:14,670 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-04-19 23:09:14,671 - Group['hadoop'] {}
2016-04-19 23:09:14,672 - Group['users'] {}
2016-04-19 23:09:14,673 - Group['zeppelin'] {}
2016-04-19 23:09:14,673 - Group['knox'] {}
2016-04-19 23:09:14,673 - Group['spark'] {}
2016-04-19 23:09:14,673 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-04-19 23:09:14,674 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,675 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,675 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-04-19 23:09:14,676 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,677 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,677 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,678 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,679 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,679 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,680 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-04-19 23:09:14,681 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,681 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,682 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-04-19 23:09:14,683 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,683 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,684 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,685 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,685 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-04-19 23:09:14,686 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-04-19 23:09:14,690 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-04-19 23:09:14,694 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-04-19 23:09:14,695 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2016-04-19 23:09:14,695 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-04-19 23:09:14,696 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-04-19 23:09:14,701 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-04-19 23:09:14,701 - Group['hdfs'] {}
2016-04-19 23:09:14,701 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-04-19 23:09:14,702 - Directory['/etc/hadoop'] {'mode': 0755}
2016-04-19 23:09:14,714 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-04-19 23:09:14,714 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-04-19 23:09:14,728 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.0.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-04-19 23:09:14,735 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.4]\nname=HDP-2.4\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.0.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-04-19 23:09:14,735 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-04-19 23:09:14,740 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-04-19 23:09:14,740 - Package['unzip'] {}
2016-04-19 23:09:14,867 - Skipping installation of existing package unzip
2016-04-19 23:09:14,867 - Package['curl'] {}
2016-04-19 23:09:14,906 - Skipping installation of existing package curl
2016-04-19 23:09:14,906 - Package['hdp-select'] {}
2016-04-19 23:09:14,945 - Skipping installation of existing package hdp-select
2016-04-19 23:09:15,189 - Execute['find /var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package -iname "*.sh" | xargs chmod +x'] {}
2016-04-19 23:09:15,197 - Execute['echo platform.linux_distribution:Red Hat Enterprise Linux Server+7.2+Maipo'] {}
2016-04-19 23:09:15,201 - Package['gcc-gfortran'] {}
2016-04-19 23:09:15,332 - Skipping installation of existing package gcc-gfortran
2016-04-19 23:09:15,333 - Package['blas-devel'] {}
2016-04-19 23:09:15,372 - Skipping installation of existing package blas-devel
2016-04-19 23:09:15,373 - Package['lapack-devel'] {}
2016-04-19 23:09:15,412 - Skipping installation of existing package lapack-devel
2016-04-19 23:09:15,412 - Package['python-devel'] {}
2016-04-19 23:09:15,452 - Skipping installation of existing package python-devel
2016-04-19 23:09:15,453 - Package['python-pip'] {}
2016-04-19 23:09:15,492 - Installing package python-pip ('/usr/bin/yum -d 0 -e 0 -y install python-pip')
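The stdout above shows every prerequisite package already present and the run dying only on python-pip. Since Ambari invokes yum with -d 0 -e 0 (minimal debug and error output), running the same query by hand on the node shows why yum has nothing to do (commands assume a CentOS/RHEL 7 node with the same repo setup as in the log):

```shell
# Run the failing step by hand with full output instead of -d 0 -e 0:
yum list installed python-pip   # already installed as an RPM?
yum list available python-pip   # offered by any enabled repo?
yum repolist enabled            # which repos yum can actually see
```

If python-pip shows up in neither list, no enabled repo provides it, which matches yum's "Error: Nothing to do".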

Rising Star

Instead of installing via Ambari, try installing it on the cluster manually. The steps are in the link below, under the heading "Installing Zeppelin manually":

http://hortonworks.com/hadoop-tutorial/apache-zeppelin-hdp-2-4/

I did that and was able to install Zeppelin successfully.
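For anyone landing here later, the manual route has this rough shape (the download URL and install prefix are illustrative placeholders; the tutorial above has the exact commands for HDP 2.4):

```shell
# Sketch of a manual (non-Ambari) Zeppelin install; the tarball URL
# and /opt prefix are placeholders, not the tutorial's exact values.
wget <zeppelin-binary-tarball-url>            # fetch a Zeppelin binary release
tar -xzf zeppelin-*-bin-all.tgz -C /opt       # unpack, e.g. under /opt
cd /opt/zeppelin-*
cp conf/zeppelin-env.sh.template conf/zeppelin-env.sh   # point Zeppelin at HDP's Spark/Hadoop here
bin/zeppelin-daemon.sh start                  # serves the notebook UI on port 8080 by default
```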