Created 06-09-2016 01:23 AM
I followed the steps in
At the deployment step, I got the following error:
stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 235, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 36, in install
    import params
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/params.py", line 65, in <module>
    fline = open(spark_home + "/RELEASE").readline().rstrip()
IOError: [Errno 2] No such file or directory: u'/usr/hdp/current/spark-client//RELEASE'

stdout:
2016-06-08 13:38:24,720 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258
2016-06-08 13:38:24,720 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0
2016-06-08 13:38:24,720 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-06-08 13:38:24,740 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '')
2016-06-08 13:38:24,741 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-06-08 13:38:24,760 - checked_call returned (0, '')
2016-06-08 13:38:24,760 - Ensuring that hadoop has the correct symlink structure
2016-06-08 13:38:24,761 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-06-08 13:38:24,762 - Group['spark'] {}
2016-06-08 13:38:24,763 - Group['ranger'] {}
2016-06-08 13:38:24,763 - Group['zeppelin'] {}
2016-06-08 13:38:24,764 - Group['hadoop'] {}
2016-06-08 13:38:24,764 - Group['users'] {}
2016-06-08 13:38:24,764 - Group['knox'] {}
2016-06-08 13:38:24,764 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,765 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,765 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-06-08 13:38:24,766 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,766 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,767 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-06-08 13:38:24,768 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger']}
2016-06-08 13:38:24,768 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-06-08 13:38:24,769 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,769 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,770 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-06-08 13:38:24,770 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,771 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,771 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,772 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,772 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,773 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,773 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-08 13:38:24,774 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-06-08 13:38:24,775 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-06-08 13:38:24,779 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-06-08 13:38:24,779 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2016-06-08 13:38:24,780 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-06-08 13:38:24,781 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-06-08 13:38:24,785 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-06-08 13:38:24,785 - Group['hdfs'] {}
2016-06-08 13:38:24,785 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-06-08 13:38:24,786 - FS Type:
2016-06-08 13:38:24,786 - Directory['/etc/hadoop'] {'mode': 0755}
2016-06-08 13:38:24,797 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-06-08 13:38:24,797 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-06-08 13:38:24,808 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-06-08 13:38:24,814 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.4]\nname=HDP-2.4\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-06-08 13:38:24,815 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-06-08 13:38:24,818 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-06-08 13:38:24,818 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-06-08 13:38:24,934 - Skipping installation of existing package unzip
2016-06-08 13:38:24,934 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-06-08 13:38:24,947 - Skipping installation of existing package curl
2016-06-08 13:38:24,947 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-06-08 13:38:24,960 - Skipping installation of existing package hdp-select
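For reference, the failing call in params.py (line 65 in the traceback) simply reads the first line of Spark's RELEASE marker through the /usr/hdp/current/spark-client symlink. A quick way to see whether that symlink exists on the affected node and which Spark build it points to (these checks are my suggestion, not part of the original post):

# ls -ld /usr/hdp/current/spark-client
# hdp-select status spark-client

If the symlink is missing or points at a directory that does not exist, the Zeppelin install script will fail with exactly the IOError shown above.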
Created 06-09-2016 01:31 AM
@Wayne Vovil Are you running this on CentOS? If so, try this (found in an HCC post here):
Created 06-09-2016 02:52 AM
Can you make sure you have the Spark client installed on the node you are trying to install Zeppelin on? If so, make sure this file exists on that node: /usr/hdp/current/spark-client/RELEASE
This is the error causing the failure on your setup:
IOError: [Errno 2] No such file or directory: u'/usr/hdp/current/spark-client//RELEASE'
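To confirm both points on the Zeppelin node, something like the following should do (a sketch for CentOS assuming the standard HDP layout; these commands are not from the thread):

# rpm -qa | grep -i spark
# ls -l /usr/hdp/current/spark-client/RELEASE

If the first command returns nothing, the Spark client packages are not installed on that host; if the second fails with "No such file or directory", the Zeppelin install will keep hitting the IOError above.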
On my 2.4.2.0-195 setup, this is what I see in the contents of that file:
# cat /usr/hdp/current/spark-client/RELEASE
Spark 1.6.1.2.4.2.0-195 built for Hadoop 2.7.1.2.4.2.0-195
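If the Spark client turns out to be missing on that host, one common remedy (not stated in this thread) is to add the Spark Client component to the host through the Ambari UI (Hosts, select the host, then add the Spark Client component) and retry the Zeppelin install, which should put /usr/hdp/current/spark-client/RELEASE in place.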
Created 06-10-2016 06:31 PM
@Wayne Vovil did you try this?