stderr: /var/lib/ambari-agent/data/errors-252.txt

2018-04-21 00:30:44,596 - Stack selector path does not exist
Command aborted. Reason: 'Server considered task failed and automatically aborted it'

stdout: /var/lib/ambari-agent/data/output-252.txt

2018-04-21 00:30:41,433 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-04-21 00:30:41,442 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-04-21 00:30:41,487 - Group['hdfs'] {}
2018-04-21 00:30:41,526 - Group['hadoop'] {}
2018-04-21 00:30:41,527 - Group['nifi'] {}
2018-04-21 00:30:41,528 - Group['users'] {}
2018-04-21 00:30:41,529 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,532 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,537 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,539 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-04-21 00:30:41,541 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['nifi'], 'uid': None}
2018-04-21 00:30:41,543 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-04-21 00:30:41,558 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,561 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-04-21 00:30:41,567 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,589 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,591 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,592 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,593 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-04-21 00:30:41,594 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-21 00:30:41,632 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-04-21 00:30:41,703 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-04-21 00:30:41,704 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-04-21 00:30:41,706 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-21 00:30:41,708 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-21 00:30:41,710 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-04-21 00:30:41,737 - call returned (0, '1012')
2018-04-21 00:30:41,738 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1012'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-04-21 00:30:41,754 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1012'] due to not_if
2018-04-21 00:30:41,755 - Group['hdfs'] {}
2018-04-21 00:30:41,756 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-04-21 00:30:41,757 - FS Type:
2018-04-21 00:30:41,758 - Directory['/etc/hadoop'] {'mode': 0755}
2018-04-21 00:30:41,789 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-04-21 00:30:41,797 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-04-21 00:30:41,940 - Repository['HDF-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDF/centos7-ppc/3.x/updates/3.0.3.0/', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': ''}
2018-04-21 00:30:41,988 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7-ppc/3.x/updates/3.0.3.0/\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-04-21 00:30:41,989 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-04-21 00:30:41,991 - Repository['HDP-2.6-repo-1'] {'append_to_file': True, 'base_url': 'http://192.168.1.8/repo/HDP/centos7/2.6.3.0-235/', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': ''}
2018-04-21 00:30:42,002 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7-ppc/3.x/updates/3.0.3.0/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://192.168.1.8/repo/HDP/centos7/2.6.3.0-235/\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-04-21 00:30:42,003 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-04-21 00:30:42,004 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://192.168.1.8/repo/HDP-UTILS-1.1.0.21/', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': ''}
2018-04-21 00:30:42,025 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7-ppc/3.x/updates/3.0.3.0/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://192.168.1.8/repo/HDP/centos7/2.6.3.0-235/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://192.168.1.8/repo/HDP-UTILS-1.1.0.21/\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-04-21 00:30:42,026 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-04-21 00:30:42,030 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-04-21 00:30:42,450 - Skipping installation of existing package unzip
2018-04-21 00:30:42,451 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-04-21 00:30:42,481 - Skipping installation of existing package curl
2018-04-21 00:30:42,481 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-04-21 00:30:42,562 - Skipping installation of existing package hdp-select
2018-04-21 00:30:42,592 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-04-21 00:30:42,615 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.
2018-04-21 00:30:44,596 - Stack selector path does not exist
2018-04-21 00:30:44,596 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-04-21 00:30:45,072 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-04-21 00:30:45,108 - Package['nifi_3_0_*'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-04-21 00:30:45,649 - Installing package nifi_3_0_* ('/usr/bin/yum -d 0 -e 0 -y install 'nifi_3_0_*'')
2018-04-21 00:32:11,066 - Execution of '/usr/bin/yum -d 0 -e 0 -y install 'nifi_3_0_*'' returned 1. Error: Nothing to do
2018-04-21 00:32:11,067 - Failed to install package nifi_3_0_*. Executing '/usr/bin/yum clean metadata'
2018-04-21 00:32:12,266 - Retrying to install package nifi_3_0_* after 30 seconds
Command aborted. Reason: 'Server considered task failed and automatically aborted it'
Command failed after 1 tries
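The actual failure is the last step: `yum install 'nifi_3_0_*'` returned "Error: Nothing to do", meaning no package matching that pattern is visible in any enabled repository. One thing worth checking from the log itself: the HDF-3.0-repo-1 baseurl that was written points at a `centos7-ppc` (PowerPC) repository; if the agent host is not a ppc machine, that repo will contain no installable NiFi packages. A minimal diagnostic sketch, assuming the paths and repo id shown in the log above, to run on the failing agent host:

```shell
#!/bin/sh
# Report this host's CPU architecture; it must match the arch in the
# HDF repo baseurl (the log shows centos7-ppc, i.e. PowerPC).
uname -m

# Show the baseurls Ambari wrote (file path taken from the log above).
grep -H baseurl /etc/yum.repos.d/ambari-hdp-1.repo 2>/dev/null \
    || echo "repo file not found on this host"

# Ask yum what it can actually see for the NiFi package pattern
# (guarded so the script is a no-op on hosts without yum).
if command -v yum >/dev/null 2>&1; then
    yum -d 0 -e 0 list available 'nifi_3_0_*' \
        || echo "no nifi_3_0_* package visible in enabled repos"
fi
```

If `uname -m` reports x86_64 while the baseurl says `centos7-ppc`, correcting the HDF repository URL for the host's architecture in Ambari's repository settings and retrying the install would be the next step to try.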