stderr: /var/lib/ambari-agent/data/errors-1795.txt

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/NIFI/package/scripts/master.py", line 199, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.3/services/NIFI/package/scripts/master.py", line 174, in start
    Execute('cat '+params.bin_dir+'/nifi.pid'+" | grep pid | sed 's/pid=\(\.*\)/\\1/' > " + status_params.nifi_pid_file)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'cat /opt/nifi-1.1.1.0-12/bin/nifi.pid | grep pid | sed 's/pid=\(\.*\)/\1/' > /var/run/nifi/nifi.pid' returned 1.
/bin/bash: /var/run/nifi/nifi.pid: No such file or directory
cat: /opt/nifi-1.1.1.0-12/bin/nifi.pid: No such file or directory

stdout: /var/lib/ambari-agent/data/output-1795.txt

2016-01-30 14:19:11,861 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.3.4.0-3485
2016-01-30 14:19:11,861 - Checking if need to create versioned conf dir /etc/hadoop/2.3.4.0-3485/0
2016-01-30 14:19:11,861 - call['conf-select create-conf-dir --package hadoop --stack-version 2.3.4.0-3485 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-01-30 14:19:11,883 - call returned (1, '/etc/hadoop/2.3.4.0-3485/0 exist already', '')
2016-01-30 14:19:11,883 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.3.4.0-3485 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-01-30 14:19:11,904 - checked_call returned (0, '/usr/hdp/2.3.4.0-3485/hadoop/conf -> /etc/hadoop/2.3.4.0-3485/0')
2016-01-30 14:19:11,905 - Ensuring that hadoop has the correct symlink structure
2016-01-30 14:19:11,905 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-01-30 14:19:12,017 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.3.4.0-3485
2016-01-30 14:19:12,017 - Checking if need to create versioned conf dir /etc/hadoop/2.3.4.0-3485/0
2016-01-30 14:19:12,017 - call['conf-select create-conf-dir --package hadoop --stack-version 2.3.4.0-3485 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-01-30 14:19:12,038 - call returned (1, '/etc/hadoop/2.3.4.0-3485/0 exist already', '')
2016-01-30 14:19:12,038 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.3.4.0-3485 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-01-30 14:19:12,058 - checked_call returned (0, '/usr/hdp/2.3.4.0-3485/hadoop/conf -> /etc/hadoop/2.3.4.0-3485/0')
2016-01-30 14:19:12,059 - Ensuring that hadoop has the correct symlink structure
2016-01-30 14:19:12,059 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-01-30 14:19:12,060 - Group['hadoop'] {}
2016-01-30 14:19:12,061 - Group['nifi'] {}
2016-01-30 14:19:12,061 - Group['users'] {}
2016-01-30 14:19:12,061 - User['zookeeper'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-01-30 14:19:12,062 - User['ams'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-01-30 14:19:12,062 - User['ambari-qa'] {'gid': 'hadoop', 'groups': [u'users']}
2016-01-30 14:19:12,063 - User['kafka'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-01-30 14:19:12,063 - User['hdfs'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-01-30 14:19:12,064 - User['yarn'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-01-30 14:19:12,064 - User['nifi'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-01-30 14:19:12,065 - User['mapred'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-01-30 14:19:12,065 - User['hbase'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2016-01-30 14:19:12,066 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-01-30 14:19:12,067 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-01-30 14:19:12,071 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-01-30 14:19:12,071 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2016-01-30 14:19:12,072 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-01-30 14:19:12,073 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-01-30 14:19:12,077 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-01-30 14:19:12,077 - Group['hdfs'] {'ignore_failures': False}
2016-01-30 14:19:12,077 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', u'hdfs']}
2016-01-30 14:19:12,078 - Directory['/etc/hadoop'] {'mode': 0755}
2016-01-30 14:19:12,090 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-01-30 14:19:12,090 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-01-30 14:19:12,101 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2016-01-30 14:19:12,109 - Skipping Execute[('setenforce', '0')] due to only_if
2016-01-30 14:19:12,109 - Directory['/var/log/hadoop'] {'owner': 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
2016-01-30 14:19:12,111 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
2016-01-30 14:19:12,111 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True, 'cd_access': 'a'}
2016-01-30 14:19:12,115 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2016-01-30 14:19:12,116 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2016-01-30 14:19:12,117 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2016-01-30 14:19:12,124 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
2016-01-30 14:19:12,124 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2016-01-30 14:19:12,128 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2016-01-30 14:19:12,132 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2016-01-30 14:19:12,312 - File['/opt/nifi-1.1.1.0-12/conf/nifi.properties'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi'}
2016-01-30 14:19:12,315 - File['/opt/nifi-1.1.1.0-12/conf/bootstrap.conf'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi'}
2016-01-30 14:19:12,319 - File['/opt/nifi-1.1.1.0-12/conf/logback.xml'] {'owner': 'nifi', 'content': InlineTemplate(...), 'group': 'nifi'}
2016-01-30 14:19:12,320 - Execute['echo pid file /var/run/nifi/nifi.pid'] {}
2016-01-30 14:19:12,323 - Execute['echo JAVA_HOME=/usr/jdk64/jdk1.8.0_60'] {}
2016-01-30 14:19:12,325 - Execute['export JAVA_HOME=/usr/jdk64/jdk1.8.0_60;/opt/nifi-1.1.1.0-12/bin/nifi.sh start >> /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2016-01-30 14:19:15,380 - Execute['cat /opt/nifi-1.1.1.0-12/bin/nifi.pid | grep pid | sed 's/pid=\(\.*\)/\1/' > /var/run/nifi/nifi.pid'] {}
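Reading the stderr above, the Execute at 14:19:15,380 fails for two distinct reasons: the directory /var/run/nifi does not exist, so bash cannot open the redirect target /var/run/nifi/nifi.pid, and /opt/nifi-1.1.1.0-12/bin/nifi.pid has not been written yet, either because NiFi is still bootstrapping three seconds after nifi.sh start or because it failed to start at all (check /var/log/nifi/nifi-setup.log). Below is a minimal sketch of a hardened version of that pid-extraction step; the 30-second wait loop, the nifi:nifi ownership, and running it as root are illustrative assumptions, not part of the stock service script.

    # Sketch only: assumes root privileges and that nifi.sh eventually
    # writes bin/nifi.pid; neither is guaranteed by the service script.
    mkdir -p /var/run/nifi && chown nifi:nifi /var/run/nifi

    # nifi.sh start returns before the pid file exists, so poll for it
    # (30 s is an assumed upper bound, not a documented one).
    for i in $(seq 1 30); do
        [ -f /opt/nifi-1.1.1.0-12/bin/nifi.pid ] && break
        sleep 1
    done

    # \(.*\) captures the pid digits directly; the original \(\.*\) matches
    # only literal dots and strips the pid= prefix by an empty-match accident.
    grep pid /opt/nifi-1.1.1.0-12/bin/nifi.pid \
        | sed 's/pid=\(.*\)/\1/' > /var/run/nifi/nifi.pid

In practice the quickest unblock for this particular trace is simply creating /var/run/nifi (owned by the nifi user) before restarting the service from Ambari, since that missing directory is the first thing the failing command trips over.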