<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Failure deploying cluster to localhost - zookeeper in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Failure-deploying-cluster-to-localhost-zookeeper/m-p/190802#M80631</link>
    <description>Failure deploying an HDF 3.1.2.0-7 cluster to localhost on Ubuntu 16: ZooKeeper and NiFi certificate-auth errors during the Ambari install. See the item below for the full post and logs.</description>
    <pubDate>Tue, 21 Apr 2026 12:16:14 GMT</pubDate>
    <dc:creator>ronlabau</dc:creator>
    <dc:date>2026-04-21T12:16:14Z</dc:date>
    <item>
      <title>Failure deploying cluster to localhost - zookeeper</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Failure-deploying-cluster-to-localhost-zookeeper/m-p/190802#M80631</link>
      <description>&lt;P&gt;Hi, I am getting failures when deploying to localhost. I am mainly trying to test out NiFi, and I am getting failures on ZooKeeper and the NiFi certificate auth. I did add a password for the Advanced nifi-ambari-ssl-config, but it still seems to fail. The logs are below, and a screenshot of the failures is attached.&lt;/P&gt;&lt;P&gt;I believe my server setup is good: Ubuntu 16, build 3.1.2.0-7.&lt;/P&gt;&lt;P&gt;After I launched the wizard and went through the install options, it gave me two warnings about things I had missed in my server setup: installing/enabling NTP and disabling THP. I also changed the hostname to localhost.&lt;/P&gt;&lt;P&gt;It seems like ZooKeeper might be the problem, but I'm not sure.&lt;/P&gt;&lt;P&gt;I tried the fix in this article, but I don't have a /usr/hdp/ folder:&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.hortonworks.com/questions/33519/could-not-determine-hdp-version-for-component-zook.html" target="_blank"&gt;https://community.hortonworks.com/questions/33519/could-not-determine-hdp-version-for-component-zook.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I am following this tutorial as closely as I can:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.2/bk_installing-hdf/content/ch_install-ambari.html" target="_blank"&gt;https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.2/bk_installing-hdf/content/ch_install-ambari.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Thanks for any help,&lt;/P&gt;&lt;P&gt;Ron&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,796 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -&amp;gt; 3.1&lt;/P&gt;&lt;P&gt;User Group mapping (user_group) is missing in the hostLevelParams&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,799 - Group['hadoop'] {}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,800 - Group['nifi'] {}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,800 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,800 - call['/var/lib/ambari-agent/tmp/changeUid.sh zookeeper'] {}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,806 - call returned (0, '1001')&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,807 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1001}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,807 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,808 - call['/var/lib/ambari-agent/tmp/changeUid.sh ams'] {}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,813 - call returned (0, '1002')&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,814 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1002}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,814 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,815 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,815 - call['/var/lib/ambari-agent/tmp/changeUid.sh nifi'] {}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,821 - call returned (0, '1004')&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,821 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1004}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,821 - 
File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,822 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,826 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,837 - Repository['HDF-3.1-repo-4'] {'append_to_file': False, 'base_url': '&lt;A href="http://public-repo-1.hortonworks.com/HDF/ubuntu16/3.x/updates/3.1.2.0" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDF/ubuntu16/3.x/updates/3.1.2.0&lt;/A&gt;', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdf-4', 'mirror_list': None}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,843 - File['/tmp/tmp0JlPhQ'] {'content': 'deb &lt;A href="http://public-repo-1.hortonworks.com/HDF/ubuntu16/3.x/updates/3.1.2.0" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDF/ubuntu16/3.x/updates/3.1.2.0&lt;/A&gt; HDF main'}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,843 - Writing File['/tmp/tmp0JlPhQ'] because contents don't match&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,843 - File['/tmp/tmphevGen'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdf-4.list')}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,843 - Writing File['/tmp/tmphevGen'] because contents don't match&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,844 - File['/etc/apt/sources.list.d/ambari-hdf-4.list'] {'content': StaticFile('/tmp/tmp0JlPhQ')}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,844 - Writing File['/etc/apt/sources.list.d/ambari-hdf-4.list'] because contents don't match&lt;/P&gt;&lt;P&gt;2018-07-12 
19:33:25,844 - checked_call[['apt-get', 'update', '-qq', '-o', u'Dir::Etc::sourcelist=sources.list.d/ambari-hdf-4.list', '-o', 'Dir::Etc::sourceparts=-', '-o', 'APT::Get::List-Cleanup=0']] {'sudo': True, 'quiet': False}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,975 - checked_call returned (0, '')&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,976 - Repository['HDP-UTILS-1.1.0.21-repo-4'] {'append_to_file': True, 'base_url': '&lt;A href="http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16&lt;/A&gt;', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdf-4', 'mirror_list': None}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,978 - File['/tmp/tmpdUcPvV'] {'content': 'deb &lt;A href="http://public-repo-1.hortonworks.com/HDF/ubuntu16/3.x/updates/3.1.2.0" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDF/ubuntu16/3.x/updates/3.1.2.0&lt;/A&gt; HDF main\ndeb &lt;A href="http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16&lt;/A&gt; HDP-UTILS main'}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,978 - Writing File['/tmp/tmpdUcPvV'] because contents don't match&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,978 - File['/tmp/tmpQlo4lv'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdf-4.list')}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,979 - Writing File['/tmp/tmpQlo4lv'] because contents don't match&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,979 - File['/etc/apt/sources.list.d/ambari-hdf-4.list'] {'content': StaticFile('/tmp/tmpdUcPvV')}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,979 - Writing File['/etc/apt/sources.list.d/ambari-hdf-4.list'] because contents don't match&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:25,979 - checked_call[['apt-get', 'update', '-qq', '-o', 
u'Dir::Etc::sourcelist=sources.list.d/ambari-hdf-4.list', '-o', 'Dir::Etc::sourceparts=-', '-o', 'APT::Get::List-Cleanup=0']] {'sudo': True, 'quiet': False}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,207 - checked_call returned (0, 'W: &lt;A href="http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16/dists/HDP-UTILS/InRelease:" target="_blank"&gt;http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16/dists/HDP-UTILS/InRelease:&lt;/A&gt; Signature by key DF52ED4F7A3A5882C0994C66B9733A7A07513CAD uses weak digest algorithm (SHA1)')&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,207 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,221 - Skipping installation of existing package unzip&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,221 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,234 - Skipping installation of existing package curl&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,234 - Package['hdf-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,246 - Skipping installation of existing package hdf-select&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,285 - call[('ambari-python-wrap', u'/usr/bin/hdf-select', 'versions')] {}&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,301 - call returned (1, 'Traceback (most recent call last):\nFile "/usr/bin/hdf-select", line 403, in &amp;lt;module&amp;gt;\nprintVersions()\nFile "/usr/bin/hdf-select", line 248, in printVersions\nfor f in os.listdir(root):\nOSError: [Errno 2] No such file or directory: \'/usr/hdf\'')&lt;/P&gt;&lt;P&gt;2018-07-12 19:33:26,425 - Could not determine stack version for component zookeeper by calling '/usr/bin/hdf-select status zookeeper &amp;gt; /tmp/tmpgMWFwS'. 
Return Code: 1, Output: ERROR: Invalid package - zookeeper&lt;/P&gt;&lt;P&gt;Packages:&lt;/P&gt;&lt;P&gt;accumulo-client&lt;/P&gt;&lt;P&gt;accumulo-gc&lt;/P&gt;&lt;P&gt;accumulo-master&lt;/P&gt;&lt;P&gt;accumulo-monitor&lt;/P&gt;&lt;P&gt;accumulo-tablet&lt;/P&gt;&lt;P&gt;accumulo-tracer&lt;/P&gt;&lt;P&gt;atlas-client&lt;/P&gt;&lt;P&gt;atlas-server&lt;/P&gt;&lt;P&gt;falcon-client&lt;/P&gt;&lt;P&gt;falcon-server&lt;/P&gt;&lt;P&gt;flume-server&lt;/P&gt;&lt;P&gt;hadoop-client&lt;/P&gt;&lt;P&gt;hadoop-hdfs-datanode&lt;/P&gt;&lt;P&gt;hadoop-hdfs-journalnode&lt;/P&gt;&lt;P&gt;hadoop-hdfs-namenode&lt;/P&gt;&lt;P&gt;hadoop-hdfs-nfs3&lt;/P&gt;&lt;P&gt;hadoop-hdfs-portmap&lt;/P&gt;&lt;P&gt;hadoop-hdfs-secondarynamenode&lt;/P&gt;&lt;P&gt;hadoop-httpfs&lt;/P&gt;&lt;P&gt;hadoop-mapreduce-historyserver&lt;/P&gt;&lt;P&gt;hadoop-yarn-nodemanager&lt;/P&gt;&lt;P&gt;hadoop-yarn-resourcemanager&lt;/P&gt;&lt;P&gt;hadoop-yarn-timelineserver&lt;/P&gt;&lt;P&gt;hbase-client&lt;/P&gt;&lt;P&gt;hbase-master&lt;/P&gt;&lt;P&gt;hbase-regionserver&lt;/P&gt;&lt;P&gt;hive-metastore&lt;/P&gt;&lt;P&gt;hive-server2&lt;/P&gt;&lt;P&gt;hive-server2-hive2&lt;/P&gt;&lt;P&gt;hive-webhcat&lt;/P&gt;&lt;P&gt;kafka-broker&lt;/P&gt;&lt;P&gt;knox-server&lt;/P&gt;&lt;P&gt;livy-server&lt;/P&gt;&lt;P&gt;mahout-client&lt;/P&gt;&lt;P&gt;nifi&lt;/P&gt;&lt;P&gt;nifi-registry&lt;/P&gt;&lt;P&gt;oozie-client&lt;/P&gt;&lt;P&gt;oozie-server&lt;/P&gt;&lt;P&gt;phoenix-client&lt;/P&gt;&lt;P&gt;phoenix-server&lt;/P&gt;&lt;P&gt;ranger-admin&lt;/P&gt;&lt;P&gt;ranger-kms&lt;/P&gt;&lt;P&gt;ranger-tagsync&lt;/P&gt;&lt;P&gt;ranger-usersync&lt;/P&gt;&lt;P&gt;registry&lt;/P&gt;&lt;P&gt;slider-client&lt;/P&gt;&lt;P&gt;spark-client&lt;/P&gt;&lt;P&gt;spark-historyserver&lt;/P&gt;&lt;P&gt;spark-thriftserver&lt;/P&gt;&lt;P&gt;spark2-client&lt;/P&gt;&lt;P&gt;spark2-historyserver&lt;/P&gt;&lt;P&gt;spark2-thriftserver&lt;/P&gt;&lt;P&gt;sqoop-client&lt;/P&gt;&lt;P&gt;sqoop-server&lt;/P&gt;&lt;P&gt;storm-client&lt;/P&gt;&lt;P&gt;storm-nimbus&lt;/P&
gt;&lt;P&gt;storm-supervisor&lt;/P&gt;&lt;P&gt;streamline&lt;/P&gt;&lt;P&gt;zeppelin-server&lt;/P&gt;&lt;P&gt;zookeeper-client&lt;/P&gt;&lt;P&gt;zookeeper-server&lt;/P&gt;&lt;P&gt;Aliases:&lt;/P&gt;&lt;P&gt;accumulo-server&lt;/P&gt;&lt;P&gt;all&lt;/P&gt;&lt;P&gt;client&lt;/P&gt;&lt;P&gt;hadoop-hdfs-server&lt;/P&gt;&lt;P&gt;hadoop-mapreduce-server&lt;/P&gt;&lt;P&gt;hadoop-yarn-server&lt;/P&gt;&lt;P&gt;hive-server&lt;/P&gt;&lt;BR /&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="screen-shot-2018-07-12-at-34553-pm.png" style="width: 999px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/6266i5B14F4640AB65209/image-size/large?v=v2&amp;amp;px=999" role="button" title="screen-shot-2018-07-12-at-34553-pm.png" alt="screen-shot-2018-07-12-at-34553-pm.png" /&gt;&lt;/span&gt;</description>
      <pubDate>Tue, 21 Apr 2026 12:16:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Failure-deploying-cluster-to-localhost-zookeeper/m-p/190802#M80631</guid>
      <dc:creator>ronlabau</dc:creator>
      <dc:date>2026-04-21T12:16:14Z</dc:date>
    </item>
    <item>
      <title>Re: Failure deploying cluster to localhost - zookeeper</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Failure-deploying-cluster-to-localhost-zookeeper/m-p/190803#M80632</link>
      <description>&lt;P&gt;Well, I took an image of the server just after setup, tried again on a new Ubuntu server, and it worked! I think NiFi is running fine; I'm about to check it out.&lt;/P&gt;&lt;P&gt;I skipped installing Ambari Metrics. I'm not sure what the difference is, because I didn't change much other than that. Oh, one part did complain about disk space when I went back and tried to reinstall, so I bumped the disk up from 8 to 32 GB.&lt;/P&gt;&lt;P&gt;ZooKeeper and NiFi seem to have installed OK.&lt;/P&gt;</description>
      <pubDate>Fri, 13 Jul 2018 03:26:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Failure-deploying-cluster-to-localhost-zookeeper/m-p/190803#M80632</guid>
      <dc:creator>ronlabau</dc:creator>
      <dc:date>2018-07-13T03:26:21Z</dc:date>
    </item>
  </channel>
</rss>

