stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 37, in <module>
    AfterInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 31, in hook
    setup_stack_symlinks()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 52, in setup_stack_symlinks
    stack_select.select_all(version)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 135, in select_all
    Execute(command, only_if = only_if_command)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.6 | tail -1`' returned 1.

ERROR: set command takes 2 parameters, instead of 1

usage: distro-select [-h] [<command>] [<package>] [<version>]

Set the selected version of HDP.

positional arguments:
  <command>  One of set, status, versions, or packages
  <package>  the package name to set
  <version>  the HDP version to set

optional arguments:
  -h, --help      show this help message and exit
  -r, --rpm-mode  if true checks if there is symlink exists and creates the symlink if it doesn't

Commands:
  set      : set the package to a specified version
  status   : show the version of the package
  versions : show the currently installed versions
  packages : show the individual package names

stdout:

2017-04-04 15:46:58,004 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-04-04 15:46:58,019 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-04-04 15:46:58,021 - Group['livy'] {}
2017-04-04 15:46:58,022 - Group['spark'] {}
2017-04-04 15:46:58,022 - Group['hadoop'] {}
2017-04-04 15:46:58,023 - Group['users'] {}
2017-04-04 15:46:58,023 - Group['knox'] {}
2017-04-04 15:46:58,023 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,024 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,025 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,026 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-04 15:46:58,026 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,027 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,028 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-04 15:46:58,029 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,030 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,030 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-04 15:46:58,031 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,032 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,033 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,033 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,034 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,035 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,036 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,037 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-04 15:46:58,037 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-04 15:46:58,039 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-04-04 15:46:58,046 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-04-04 15:46:58,046 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-04-04 15:46:58,047 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-04 15:46:58,049 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-04-04 15:46:58,055 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-04-04 15:46:58,055 - Group['hdfs'] {}
2017-04-04 15:46:58,056 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2017-04-04 15:46:58,057 - FS Type:
2017-04-04 15:46:58,057 - Directory['/etc/hadoop'] {'mode': 0755}
2017-04-04 15:46:58,082 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-04-04 15:46:58,083 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-04-04 15:46:58,100 - Initializing 2 repositories
2017-04-04 15:46:58,100 - Repository['HDP-2.6'] {'base_url': 'http://HD-PRD-C1-SM01/install/custom/repos/HDP-2.6.0.3', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-04-04 15:46:58,110 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.6]\nname=HDP-2.6\nbaseurl=http://HD-PRD-C1-SM01/install/custom/repos/HDP-2.6.0.3\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-04-04 15:46:58,111 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://HD-PRD-C1-SM01/install/custom/repos/HDP-UTILS-1.1.0.21', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-04-04 15:46:58,115 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://HD-PRD-C1-SM01/install/custom/repos/HDP-UTILS-1.1.0.21\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-04-04 15:46:58,116 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:58,285 - Skipping installation of existing package unzip
2017-04-04 15:46:58,285 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:58,301 - Skipping installation of existing package curl
2017-04-04 15:46:58,301 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:58,316 - Skipping installation of existing package hdp-select
2017-04-04 15:46:58,615 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-04-04 15:46:58,616 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-04-04 15:46:58,660 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-04-04 15:46:58,687 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
2017-04-04 15:46:58,764 - checked_call returned (0, '2.6.0.3-8', '')
2017-04-04 15:46:58,775 - Package['hadoop_2_6_0_3_8'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:58,949 - Skipping installation of existing package hadoop_2_6_0_3_8
2017-04-04 15:46:58,951 - Package['hadoop_2_6_0_3_8-client'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:58,967 - Skipping installation of existing package hadoop_2_6_0_3_8-client
2017-04-04 15:46:58,968 - Package['snappy'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:58,984 - Skipping installation of existing package snappy
2017-04-04 15:46:58,986 - Package['snappy-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:59,001 - Skipping installation of existing package snappy-devel
2017-04-04 15:46:59,003 - Package['hadoop_2_6_0_3_8-libhdfs'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:59,018 - Skipping installation of existing package hadoop_2_6_0_3_8-libhdfs
2017-04-04 15:46:59,020 - Package['libtirpc-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-04-04 15:46:59,035 - Skipping installation of existing package libtirpc-devel
2017-04-04 15:46:59,039 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2017-04-04 15:46:59,047 - File['/etc/security/limits.d/hdfs.conf'] {'content': Template('hdfs.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2017-04-04 15:46:59,047 - XmlConfig['hadoop-policy.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...}
2017-04-04 15:46:59,064 - Generating config: /usr/hdp/current/hadoop-client/conf/hadoop-policy.xml
2017-04-04 15:46:59,064 - File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2017-04-04 15:46:59,077 - XmlConfig['ssl-client.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...}
2017-04-04 15:46:59,090 - Generating config: /usr/hdp/current/hadoop-client/conf/ssl-client.xml
2017-04-04 15:46:59,091 - File['/usr/hdp/current/hadoop-client/conf/ssl-client.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2017-04-04 15:46:59,099 - Directory['/usr/hdp/current/hadoop-client/conf/secure'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'cd_access': 'a'}
2017-04-04 15:46:59,100 - XmlConfig['ssl-client.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf/secure', 'configuration_attributes': {}, 'configurations': ...}
2017-04-04 15:46:59,113 - Generating config: /usr/hdp/current/hadoop-client/conf/secure/ssl-client.xml
2017-04-04 15:46:59,114 - File['/usr/hdp/current/hadoop-client/conf/secure/ssl-client.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2017-04-04 15:46:59,122 - XmlConfig['ssl-server.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...}
2017-04-04 15:46:59,135 - Generating config: /usr/hdp/current/hadoop-client/conf/ssl-server.xml
2017-04-04 15:46:59,136 - File['/usr/hdp/current/hadoop-client/conf/ssl-server.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2017-04-04 15:46:59,145 - XmlConfig['hdfs-site.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {u'final': {u'dfs.support.append': u'true', u'dfs.datanode.data.dir': u'true', u'dfs.namenode.http-address': u'true', u'dfs.namenode.name.dir': u'true', u'dfs.webhdfs.enabled': u'true', u'dfs.datanode.failed.volumes.tolerated': u'true'}}, 'configurations': ...}
2017-04-04 15:46:59,159 - Generating config: /usr/hdp/current/hadoop-client/conf/hdfs-site.xml
2017-04-04 15:46:59,159 - File['/usr/hdp/current/hadoop-client/conf/hdfs-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2017-04-04 15:46:59,222 - XmlConfig['core-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'mode': 0644, 'configuration_attributes': {u'final': {u'fs.defaultFS': u'true'}}, 'owner': 'hdfs', 'configurations': ...}
2017-04-04 15:46:59,235 - Generating config: /usr/hdp/current/hadoop-client/conf/core-site.xml
2017-04-04 15:46:59,235 - File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2017-04-04 15:46:59,270 - File['/usr/hdp/current/hadoop-client/conf/slaves'] {'content': Template('slaves.j2'), 'owner': 'hdfs'}
2017-04-04 15:46:59,277 - Directory['/hadoop/hdfs/namenode'] {'owner': 'hdfs', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2017-04-04 15:46:59,277 - Directory['/opt/hadoop/hdfs/namenode'] {'owner': 'hdfs', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2017-04-04 15:46:59,278 - Directory['/usr/hdp/hadoop/hdfs/namenode'] {'owner': 'hdfs', 'create_parents': True, 'group': 'hadoop', 'mode': 0755, 'cd_access': 'a'}
2017-04-04 15:46:59,278 - Creating directory Directory['/usr/hdp/hadoop/hdfs/namenode'] since it doesn't exist.
2017-04-04 15:46:59,278 - Changing owner for /usr/hdp/hadoop/hdfs/namenode from 0 to hdfs
2017-04-04 15:46:59,278 - Changing group for /usr/hdp/hadoop/hdfs/namenode from 0 to hadoop
2017-04-04 15:46:59,279 - Directory['/var/hadoop/hdfs/namenode'] {'owner': 'hdfs', 'group': 'hadoop', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2017-04-04 15:46:59,279 - Skipping setting up secure ZNode ACL for HFDS as it's supported only for NameNode HA mode.
2017-04-04 15:46:59,620 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-04-04 15:46:59,621 - Executing hdp-select set all on 2.6
2017-04-04 15:46:59,621 - Execute['ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.6 | tail -1`'] {'only_if': 'ls -d /usr/hdp/2.6*'}

Command failed after 1 tries
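The ERROR line is the key clue: `hdp-select set` expects two parameters (`all` and a version), but received only one. The version is supplied by the backtick substitution `` `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.6 | tail -1` ``, which suggests that pipeline printed nothing on this host, so the substitution expanded to an empty string. A minimal sketch of that mechanism in plain shell (no hdp-select needed; `count_args` is a hypothetical stand-in, and the `printf` lines simulate what `hdp-select versions` might print):

```shell
#!/bin/sh
# Hypothetical helper: report how many positional parameters "set" would see.
count_args() { echo $#; }

# Case 1: the pipeline finds an installed 2.6 build, e.g. "2.6.0.3-8".
# The substitution is non-empty, so "set" gets 2 parameters.
version="$(printf '2.6.0.3-8\n' | grep '^2.6' | tail -1)"
count_args all $version   # prints: 2

# Case 2: no line starts with 2.6, the substitution is empty, and after
# word splitting "set" gets only 1 parameter -- matching the reported
# "ERROR: set command takes 2 parameters, instead of 1".
version="$(printf '2.5.0.0-1245\n' | grep '^2.6' | tail -1)"
count_args all $version   # prints: 1
```

Note that `$version` is deliberately unquoted, mirroring how the empty backtick result vanishes inside the real command line. Running `/usr/bin/hdp-select versions` manually on the failing node, and checking `ls -d /usr/hdp/2.6*`, would confirm whether any 2.6 version is actually registered for `hdp-select` to select.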