Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

NiFi installation failed

Contributor

Hi everyone,

I have a 4-node HDP cluster installed. I am trying to install NiFi using Ambari, and it is throwing an error.

Steps followed:

1) I have upgraded Ambari. The current versions are Ambari 2.7, HDP 2.6.1.0, and HDF 3.1.2.0.

I upgraded Ambari using the link below:

https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.2.0/installing-hdf-on-hdp/content/hdf-upgrade-a...

2) Installed the management pack (mpack) using ambari-server (roughly as sketched below), and I am able to see NiFi in my Ambari "Add Service" list.
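
For reference, an HDF management pack is typically installed with the ambari-server install-mpack command; a minimal sketch follows (the tarball path is a placeholder, not the exact file used in this thread):

# placeholder path: use the HDF mpack tarball that matches your target HDF version
sudo ambari-server install-mpack --mpack=/path/to/hdf-ambari-mpack-<version>.tar.gz --verbose
# restart so Ambari picks up the new service definitions
sudo ambari-server restart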

Here is the problem: when I try to add the NiFi service on one of the nodes in the cluster, it throws an error.

I am attaching the error output here.

std_err

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi.py", line 231, in <module>
    Master().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/nifi.py", line 56, in install
    import params
  File "/var/lib/ambari-agent/cache/common-services/NIFI/1.0.0/package/scripts/params.py", line 284, in <module>
    for host in config['clusterHostInfo']['zookeeper_hosts']:
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/config_dictionary.py", line 73, in __getattr__
    raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'zookeeper_hosts' was not found in configurations dictionary!

std_out

2018-10-24 19:33:00,075 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-10-24 19:33:00,089 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-10-24 19:33:00,096 - Group['livy'] {}
2018-10-24 19:33:00,097 - Group['spark'] {}
2018-10-24 19:33:00,097 - Group['hdfs'] {}
2018-10-24 19:33:00,098 - Group['zeppelin'] {}
2018-10-24 19:33:00,098 - Group['hadoop'] {}
2018-10-24 19:33:00,099 - Group['nifi'] {}
2018-10-24 19:33:00,099 - Group['users'] {}
2018-10-24 19:33:00,099 - Group['knox'] {}
2018-10-24 19:33:00,101 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,108 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,110 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,112 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,117 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-10-24 19:33:00,119 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2018-10-24 19:33:00,121 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['nifi'], 'uid': None}
2018-10-24 19:33:00,123 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2018-10-24 19:33:00,128 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2018-10-24 19:33:00,131 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2018-10-24 19:33:00,132 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,138 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2018-10-24 19:33:00,140 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,142 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,147 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,149 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,151 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2018-10-24 19:33:00,153 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-10-24 19:33:00,158 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-24 19:33:00,160 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-10-24 19:33:00,167 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-10-24 19:33:00,168 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-10-24 19:33:00,169 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-24 19:33:00,171 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-24 19:33:00,172 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-10-24 19:33:00,187 - call returned (0, '1015')
2018-10-24 19:33:00,188 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-10-24 19:33:00,195 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] due to not_if
2018-10-24 19:33:00,195 - Group['hdfs'] {}
2018-10-24 19:33:00,196 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2018-10-24 19:33:00,197 - FS Type: HDFS
2018-10-24 19:33:00,197 - Directory['/etc/hadoop'] {'mode': 0755}
2018-10-24 19:33:00,236 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2018-10-24 19:33:00,239 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-10-24 19:33:00,272 - Repository['HDP-UTILS-2.6.1.0-129'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-2.6.1.0-129', 'mirror_list': ''}
2018-10-24 19:33:00,290 - File['/etc/yum.repos.d/HDP-2.6.1.0-129.repo'] {'content': '[HDP-UTILS-2.6.1.0-129]\nname=HDP-UTILS-2.6.1.0-129\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-10-24 19:33:00,291 - Writing File['/etc/yum.repos.d/HDP-2.6.1.0-129.repo'] because contents don't match
2018-10-24 19:33:00,291 - Repository['HDP-2.6.1.0-129'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.1.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-2.6.1.0-129', 'mirror_list': ''}
2018-10-24 19:33:00,301 - File['/etc/yum.repos.d/HDP-2.6.1.0-129.repo'] {'content': '[HDP-UTILS-2.6.1.0-129]\nname=HDP-UTILS-2.6.1.0-129\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6.1.0-129]\nname=HDP-2.6.1.0-129\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.1.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-10-24 19:33:00,301 - Writing File['/etc/yum.repos.d/HDP-2.6.1.0-129.repo'] because contents don't match
2018-10-24 19:33:00,302 - Repository['HDF-2.6.1.0-129'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.1.2.0', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-2.6.1.0-129', 'mirror_list': ''}
2018-10-24 19:33:00,311 - File['/etc/yum.repos.d/HDP-2.6.1.0-129.repo'] {'content': '[HDP-UTILS-2.6.1.0-129]\nname=HDP-UTILS-2.6.1.0-129\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6.1.0-129]\nname=HDP-2.6.1.0-129\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDF-2.6.1.0-129]\nname=HDF-2.6.1.0-129\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.1.2.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-10-24 19:33:00,311 - Writing File['/etc/yum.repos.d/HDP-2.6.1.0-129.repo'] because contents don't match
2018-10-24 19:33:00,312 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-10-24 19:33:00,530 - Skipping installation of existing package unzip
2018-10-24 19:33:00,534 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-10-24 19:33:00,558 - Skipping installation of existing package curl
2018-10-24 19:33:00,558 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-10-24 19:33:00,587 - Skipping installation of existing package hdp-select
2018-10-24 19:33:00,601 - The repository with version 2.6.1.0-129 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-10-24 19:33:00,621 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.
2018-10-24 19:33:01,101 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-10-24 19:33:01,151 - The repository with version 2.6.1.0-129 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-10-24 19:33:01,201 - Skipping stack-select on NIFI because it does not exist in the stack-select package structure.

Command failed after 1 tries
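
For reference, the stderr above shows params.py failing on config['clusterHostInfo']['zookeeper_hosts'], i.e. the install command Ambari sent to the agent did not contain any ZooKeeper host entries. One way to see what the cluster reports for ZooKeeper is the Ambari REST API; a minimal sketch, where the host, port, credentials, and cluster name are placeholders:

# list the hosts Ambari knows as ZOOKEEPER_SERVER instances (placeholders in angle brackets)
curl -s -u admin:admin \
  'http://<ambari-host>:8080/api/v1/clusters/<cluster-name>/services/ZOOKEEPER/components/ZOOKEEPER_SERVER?fields=host_components/HostRoles/host_name'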

Please help me troubleshoot this issue.

Thanks in advance....

1 ACCEPTED SOLUTION


Hi @kanna k ,

I think the problem is a version mismatch; see https://supportmatrix.hortonworks.com/.

Ambari 2.7 only supports HDP 3.0 and HDF 3.2, which is why you are hitting this error.

Refer to: https://docs.hortonworks.com/HDPDocuments/Ambari-2.7.0.0/bk_ambari-installation/content/determine_pr...

Because Ambari is not fully compatible with your HDP version, the installation fails.
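
To double-check the versions in play before retrying, something like the following should work (a minimal sketch; output format varies by release):

# print the Ambari server version
ambari-server --version
# list the HDP stack versions registered on this host
hdp-select versions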


3 REPLIES


Contributor

Thank you @Akhil S Naik

I have a follow-up question.

I've upgraded to Ambari 2.7.

1) Do I need to upgrade HDP if I want to install HDF 3.2, or can I proceed with the HDF installation directly without upgrading HDP?

Please let me know which option is possible.

Thanks in advance....

Contributor

Thank you so much @Akhil S Naik

I have changed the mpack from HDF 3.1 to HDF 3.2.

My NiFi installation completed without upgrading HDP.
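
For anyone landing here later, swapping the management pack generally looks something like the following (a minimal sketch; the tarball path is a placeholder and must point at the HDF 3.2 mpack that matches your repository):

# upgrade the installed HDF mpack in place (placeholder path)
sudo ambari-server upgrade-mpack --mpack=/path/to/hdf-ambari-mpack-<3.2.x>.tar.gz --verbose
# restart Ambari so the new stack/service definitions are loaded
sudo ambari-server restart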