
Can't install NiFi in HDP Sandbox 2.6.4 from Ambari

New Contributor

I am running the HDP sandbox (version 2.6.4) in Docker on a VirtualBox CentOS VM.

In Ambari, I go to Add Service, select NiFi, and click through the prompts until I see this error at the 'Configurations' step:

Error (Tez): tez.tez-ui.history-url.base: Value should be set for tez.tez-ui.history-url.base

I proceed anyway.
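(For reference, this Tez warning can normally be cleared by giving the property a value before continuing. A sketch using Ambari's bundled configs.sh script; the cluster name "Sandbox" and the Tez View URL pattern below are assumptions to verify against your own instance, not values from this thread:)

# Set tez.tez-ui.history-url.base in tez-site via Ambari's config script
# (cluster name "Sandbox" and the view URL pattern are assumptions; verify both)
/var/lib/ambari-server/resources/scripts/configs.sh set localhost Sandbox tez-site \
    tez.tez-ui.history-url.base "http://<ambari-host>:8080/#/main/view/TEZ/tez_cluster_instance"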

Then, after I click Deploy:

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/NIFI/package/scripts/master.py", line 131, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/NIFI/package/scripts/master.py", line 50, in install
    Execute('tar -xf '+params.temp_file+' -C '+ params.nifi_dir +' >> ' + params.nifi_log_file, user=params.nifi_user)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'tar -xf /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -C /opt/nifi-1.1.0.2.1.2.0-10-bin >> /var/log/nifi/nifi-setup.log' returned 2.
tar: This does not look like a tar archive
gzip: stdin: unexpected end of file
tar: Child returned status 1
tar: Error is not recoverable: exiting now

stdout:

2018-02-14 21:23:46,486 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-02-14 21:23:46,487 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-02-14 21:23:46,489 - Group['livy'] {}
2018-02-14 21:23:46,491 - Group['spark'] {}
2018-02-14 21:23:46,491 - Group['ranger'] {}
2018-02-14 21:23:46,491 - Group['hdfs'] {}
2018-02-14 21:23:46,491 - Group['zeppelin'] {}
2018-02-14 21:23:46,492 - Group['hadoop'] {}
2018-02-14 21:23:46,492 - Group['nifi'] {}
2018-02-14 21:23:46,492 - Group['users'] {}
2018-02-14 21:23:46,492 - Group['knox'] {}
2018-02-14 21:23:46,493 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,500 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,501 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,503 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,504 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-02-14 21:23:46,505 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,506 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-02-14 21:23:46,506 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger'], 'uid': None}
2018-02-14 21:23:46,507 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-02-14 21:23:46,509 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-02-14 21:23:46,510 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,510 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,511 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,512 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-02-14 21:23:46,513 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,514 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,515 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-02-14 21:23:46,516 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,517 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,518 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,519 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,520 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,520 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-02-14 21:23:46,521 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-14 21:23:46,524 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-02-14 21:23:46,553 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-02-14 21:23:46,553 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-02-14 21:23:46,554 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-14 21:23:46,556 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-02-14 21:23:46,556 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-02-14 21:23:46,584 - call returned (0, '1002')
2018-02-14 21:23:46,584 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-02-14 21:23:46,610 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] due to not_if
2018-02-14 21:23:46,611 - Group['hdfs'] {}
2018-02-14 21:23:46,611 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
2018-02-14 21:23:46,612 - FS Type:
2018-02-14 21:23:46,612 - Directory['/etc/hadoop'] {'mode': 0755}
2018-02-14 21:23:46,648 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-02-14 21:23:46,649 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-02-14 21:23:46,672 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-02-14 21:23:46,681 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-14 21:23:46,685 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-02-14 21:23:46,685 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/centos6/2.x/updates/2.6.4.0 is not created due to its tags: set(['GPL'])
2018-02-14 21:23:46,686 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-02-14 21:23:46,689 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-02-14 21:23:46,689 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-02-14 21:23:46,690 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-14 21:23:46,896 - Skipping installation of existing package unzip
2018-02-14 21:23:46,896 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-14 21:23:46,914 - Skipping installation of existing package curl
2018-02-14 21:23:46,914 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-14 21:23:46,926 - Skipping installation of existing package hdp-select
2018-02-14 21:23:46,927 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-02-14 21:23:46,928 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.
2018-02-14 21:23:47,132 - Directory['/var/run/nifi'] {'owner': 'nifi', 'group': 'nifi'}
2018-02-14 21:23:47,133 - Directory['/var/log/nifi'] {'owner': 'nifi', 'group': 'nifi'}
2018-02-14 21:23:47,134 - Execute['touch /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2018-02-14 21:23:47,213 - Creating /opt/nifi-1.1.0.2.1.2.0-10-bin
2018-02-14 21:23:47,213 - Execute['rm -rf /opt/nifi-1.1.0.2.1.2.0-10-bin'] {'ignore_failures': True}
2018-02-14 21:23:47,224 - Directory['/opt/nifi-1.1.0.2.1.2.0-10-bin'] {'owner': 'nifi', 'group': 'nifi'}
2018-02-14 21:23:47,224 - Creating directory Directory['/opt/nifi-1.1.0.2.1.2.0-10-bin'] since it doesn't exist.
2018-02-14 21:23:47,225 - Changing owner for /opt/nifi-1.1.0.2.1.2.0-10-bin from 0 to nifi
2018-02-14 21:23:47,225 - Changing group for /opt/nifi-1.1.0.2.1.2.0-10-bin from 0 to nifi
2018-02-14 21:23:47,225 - Execute['tar -xf /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -C /opt/nifi-1.1.0.2.1.2.0-10-bin >> /var/log/nifi/nifi-setup.log'] {'user': 'nifi'}
2018-02-14 21:23:47,260 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-02-14 21:23:47,263 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.

Command failed after 1 tries

Please let me know what additional information is necessary.

Thank you in advance,

Tommy

6 REPLIES

Re: Can't install NiFi in HDP Sandbox 2.6.4 from Ambari

New Contributor

I have the same problem. The size of /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz is 0 because wget couldn't download the file, so the tar -xf extraction failed.

I searched and found a reply on another post saying the server hosting this file may be down, but I have been trying to reinstall for many days and still get the same problem.
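For anyone who wants to reproduce this, a minimal check from the sandbox shell (the /tmp path comes from the Ambari log above; the download URL appears in a later reply in this thread):

# Is the downloaded archive empty? 0 bytes means the wget step failed
ls -l /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz

# A good download should be reported as gzip compressed data, not "empty"
file /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz

# Probe the URL without saving anything; a 404 means the file is gone from the mirror
wget --spider https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz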

Help pleaseeee.

Re: Can't install NiFi in HDP Sandbox 2.6.4 from Ambari

New Contributor
@Chuarkai Prohmpatarapant

Depending on what you want to do, you could do what I did.

I am working in VirtualBox with CentOS, so I downloaded the HDF sandbox startup script for Docker and ran it in the VM; NiFi comes preinstalled and works fine.

So a working solution is to use the HDF sandbox. But I would still like to know whether installing HDF/NiFi (I am never quite sure whether HDF and NiFi are synonymous) onto the HDP sandbox is possible.
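Once the HDF sandbox container is up, you can sanity-check that NiFi is reachable. A rough sketch; the 9090 UI port is an assumption based on HDF sandbox defaults, so verify it against the output of your own deploy script:

# The HDF sandbox container should show up here
docker ps

# Print just the HTTP status of the NiFi UI (port 9090 is assumed; adjust if your sandbox maps a different one)
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:9090/nifi/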

Re: Can't install NiFi in HDP Sandbox 2.6.4 from Ambari

New Contributor

HDF comes with NiFi preinstalled, but this case is about installing NiFi on the HDP sandbox, as described in the tutorial.

Re: Can't install NiFi in HDP Sandbox 2.6.4 from Ambari

Expert Contributor
@glupu, can you take a look at this? At first I thought it was a memory issue, but it seems the link to https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz is broken.

File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -O /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz -a /var/log/nifi/nifi-setup.log' returned 8.
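For reference, wget exit code 8 means the server issued an error response (here a 404, as the next reply shows). The failure is easy to reproduce by hand:

# Re-run the exact download from the install script and capture wget's exit status
wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz \
    -O /tmp/nifi-1.1.0.2.1.2.0-10-bin.tar.gz
echo "wget exit code: $?"    # 8 = server issued an error response

# Or fetch only the HTTP status line without downloading anything
curl -sI https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz | head -n 1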

Re: Can't install NiFi in HDP Sandbox 2.6.4 from Ambari

New Contributor

Yes, the link is broken. If you run wget from the command line, you get:

# wget https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz
--2018-04-06 13:07:00--  https://public-repo-1.hortonworks.com/HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz
Resolving public-repo-1.hortonworks.com (public-repo-1.hortonworks.com)... 13.32.14.146, 13.32.14.230, 13.32.14.164, ...
Connecting to public-repo-1.hortonworks.com (public-repo-1.hortonworks.com)|13.32.14.146|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2018-04-06 13:07:01 ERROR 404: Not Found.

And if you open the URL in a browser:

This XML file does not appear to have any style information associated with it. The document tree is shown below.
<Error>
<Code>NoSuchKey</Code>
<Message>The specified key does not exist.</Message>
<Key>HDF/2.1.2.0/nifi-1.1.0.2.1.2.0-10-bin.tar.gz</Key>
<RequestId>2530B9077A50BDCE</RequestId>
<HostId>
2JMQs01QCEreVQu039RRS7KoSyN8u5lUSXvZ6AFuAJwgrwDBIYRqRWOX70aVQEMv5IcH1T6JCZI=
</HostId>
</Error>
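Until the mirror is fixed, one possible workaround (a sketch, not an official fix) is to find where the NiFi service scripts define the download URL and point it at an archive that still exists. The scripts directory below is inferred from the traceback in the original post, and the Apache archive hosts the upstream nifi-1.1.0 release rather than the HDF-patched nifi-1.1.0.2.1.2.0-10 build, so treat both as assumptions to verify:

# Find the hard-coded download URL in the NiFi service scripts (path inferred from the traceback)
grep -rn "public-repo-1.hortonworks.com" \
    /var/lib/ambari-agent/cache/stacks/HDP/2.6/services/NIFI/package/scripts/

# The upstream Apache release is still archived; note the version string differs
# from the HDF build, so directory names in the install script may need editing too
wget https://archive.apache.org/dist/nifi/1.1.0/nifi-1.1.0-bin.tar.gz -O /tmp/nifi-1.1.0-bin.tar.gz

Because the upstream tarball unpacks to a different directory name than the HDF build, expect to adjust the paths the script derives from the version string before retrying the install.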
