<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Question: I am trying to install HDP and hit an error at the "Confirm Hosts" step. I do not understand where the problem comes from. Has anyone seen this error or have an idea how to solve it? The log is in the description. (Archives of Support Questions, Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-install-HDP-I-got-a-problem-on-the-step-quot/m-p/213966#M84437</link>
    <description>&lt;P&gt;HDP installation fails at the "Confirm Hosts" step with "Registration with the server failed." The full host-registration log is included in the first post below.&lt;/P&gt;</description>
    <pubDate>Thu, 18 Oct 2018 19:25:53 GMT</pubDate>
    <dc:creator>amina_meziani21</dc:creator>
    <dc:date>2018-10-18T19:25:53Z</dc:date>
    <item>
      <title>I am trying to install HDP and hit an error at the "Confirm Hosts" step. I do not understand where the problem comes from. Has anyone seen this error or have an idea how to solve it? The log is in the description.</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-install-HDP-I-got-a-problem-on-the-step-quot/m-p/213966#M84437</link>
      <description>&lt;PRE&gt;==========================
Creating target directory...
==========================

Command start time 2018-10-18 10:04:53

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:53

==========================
Copying ambari sudo script...
==========================

Command start time 2018-10-18 10:04:53

scp /var/lib/ambari-server/ambari-sudo.sh
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:54

==========================
Copying common functions script...
==========================

Command start time 2018-10-18 10:04:54

scp /usr/lib/python2.6/site-packages/ambari_commons
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:54

==========================
Copying create-python-wrap script...
==========================

Command start time 2018-10-18 10:04:54

scp /var/lib/ambari-server/create-python-wrap.sh
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:54

==========================
Copying OS type check script...
==========================

Command start time 2018-10-18 10:04:54

scp /usr/lib/python2.6/site-packages/ambari_server/os_check_type.py
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:55

==========================
Running create-python-wrap script...
==========================

Command start time 2018-10-18 10:04:55

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:55

==========================
Running OS type check...
==========================

Command start time 2018-10-18 10:04:55
Cluster primary/cluster OS family is ubuntu16 and local/current OS family is ubuntu16

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:55

==========================
Checking 'sudo' package on remote host...
==========================

Command start time 2018-10-18 10:04:55

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:56

==========================
Copying repo file to 'tmp' folder...
==========================

Command start time 2018-10-18 10:04:56

scp /etc/apt/sources.list.d/ambari.list
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:56

==========================
Moving file to repo dir...
==========================

Command start time 2018-10-18 10:04:56

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:56

==========================
Changing permissions for ambari.repo...
==========================

Command start time 2018-10-18 10:04:56

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:56

==========================
Update apt cache of repository...
==========================

Command start time 2018-10-18 10:04:56

Get:1 &lt;A href="http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3" target="_blank"&gt;http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3&lt;/A&gt; Ambari InRelease [7,394 B]
Ign:1 &lt;A href="http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3" target="_blank"&gt;http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3&lt;/A&gt; Ambari InRelease
Hit:2 &lt;A href="http://archive.canonical.com/ubuntu" target="_blank"&gt;http://archive.canonical.com/ubuntu&lt;/A&gt; xenial InRelease
Fetched 7,394 B in 0s (27.0 kB/s)

Reading package lists... Done

W: GPG error: &lt;A href="http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3" target="_blank"&gt;http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3&lt;/A&gt; Ambari InRelease: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY B9733A7A07513CAD
W: The repository 'http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3 Ambari InRelease' is not signed.
N: Data from such a repository can't be authenticated and is therefore potentially dangerous to use.
N: See apt-secure(8) manpage for repository creation and user configuration details.

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:57

==========================
Copying setup script file...
==========================

Command start time 2018-10-18 10:04:57

scp /usr/lib/python2.6/site-packages/ambari_server/setupAgent.py
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:57

==========================
Running setup agent script...
==========================

Command start time 2018-10-18 10:04:57
('INFO 2018-10-18 10:05:30,349 ExitHelper.py:56 - Performing cleanup before exiting...
INFO 2018-10-18 10:05:30,825 main.py:145 - loglevel=logging.INFO
INFO 2018-10-18 10:05:30,825 main.py:145 - loglevel=logging.INFO
INFO 2018-10-18 10:05:30,825 main.py:145 - loglevel=logging.INFO
INFO 2018-10-18 10:05:30,826 DataCleaner.py:39 - Data cleanup thread started
INFO 2018-10-18 10:05:30,829 DataCleaner.py:120 - Data cleanup started
INFO 2018-10-18 10:05:30,829 DataCleaner.py:122 - Data cleanup finished
INFO 2018-10-18 10:05:30,849 PingPortListener.py:50 - Ping port listener started on port: 8670
INFO 2018-10-18 10:05:30,850 main.py:436 - Connecting to Ambari server at &lt;A href="https://master-am.c.deploiementhadoop.internal:8440" target="_blank"&gt;https://master-am.c.deploiementhadoop.internal:8440&lt;/A&gt; (192.168.0.3)
INFO 2018-10-18 10:05:30,850 NetUtil.py:67 - Connecting to &lt;A href="https://master-am.c.deploiementhadoop.internal:8440/ca" target="_blank"&gt;https://master-am.c.deploiementhadoop.internal:8440/ca&lt;/A&gt;
INFO 2018-10-18 10:05:30,900 main.py:446 - Connected to Ambari server master-am.c.deploiementhadoop.internal
INFO 2018-10-18 10:05:30,901 threadpool.py:58 - Started thread pool with 3 core threads and 20 maximum threads
WARNING 2018-10-18 10:05:30,902 AlertSchedulerHandler.py:280 - [AlertScheduler] /var/lib/ambari-agent/cache/alerts/definitions.json not found or invalid. No alerts will be scheduled until registration occurs.
INFO 2018-10-18 10:05:30,902 AlertSchedulerHandler.py:175 - [AlertScheduler] Starting &amp;lt;ambari_agent.apscheduler.scheduler.Scheduler object at 0x7f9c56b89bd0&amp;gt;; currently running: False
INFO 2018-10-18 10:05:30,908 hostname.py:98 - Read public hostname \'w1-am.c.deploiementhadoop.internal\' using socket.getfqdn()
INFO 2018-10-18 10:05:30,916 Hardware.py:174 - Some mount points were ignored: /run, /dev/shm, /run/lock, /sys/fs/cgroup, /run/user/1001, /run/user/0
INFO 2018-10-18 10:05:30,927 Facter.py:202 - Directory: \'/etc/resource_overrides\' does not exist - it won\'t be used for gathering system resources.
INFO 2018-10-18 10:05:31,057 Controller.py:170 - Registering with w1-am.c.deploiementhadoop.internal (192.168.0.4) (agent=\'{"hardwareProfile": {"kernel": "Linux", "domain": "c.deploiementhadoop.internal", "physicalprocessorcount": 1, "kernelrelease": "4.15.0-1021-gcp", "uptime_days": "0", "memorytotal": 3781816, "swapfree": "0.00 GB", "memorysize": 3781816, "osfamily": "ubuntu", "swapsize": "0.00 GB", "processorcount": 1, "netmask": "255.255.255.255", "timezone": "UTC", "hardwareisa": "x86_64", "memoryfree": 2957172, "operatingsystem": "ubuntu", "kernelmajversion": "4.15", "kernelversion": "4.15.0", "macaddress": "42:01:C0:A8:00:04", "operatingsystemrelease": "16.04", "ipaddress": "192.168.0.4", "hostname": "w1-am", "uptime_hours": "3", "fqdn": "w1-am.c.deploiementhadoop.internal", "id": "root", "architecture": "x86_64", "selinux": false, "mounts": [{"available": "1879756", "used": "0", "percent": "0%", "device": "udev", "mountpoint": "/dev", "type": "devtmpfs", "size": "1879756"}, {"available": "149950580", "used": "2442596", "percent": "2%", "device": "/dev/sda1", "mountpoint": "/", "type": "ext4", "size": "152409560"}], "hardwaremodel": "x86_64", "uptime_seconds": "10847", "interfaces": "ens4,lo"}, "currentPingPort": 8670, "prefix": "/var/lib/ambari-agent/data", "agentVersion": "2.5.0.3", "agentEnv": {"transparentHugePage": "madvise", "hostHealth": {"agentTimeStampAtReporting": 1539857131055, "activeJavaProcs": [], "liveServices": [{"status": "Healthy", "name": "ntp or chrony", "desc": ""}]}, "reverseLookup": true, "alternatives": [], "umask": "18", "firewallName": "ufw", "stackFoldersAndFiles": [], "existingUsers": [], "firewallRunning": false}, "timestamp": 1539857130932, "hostname": "w1-am.c.deploiementhadoop.internal", "responseId": -1, "publicHostname": "w1-am.c.deploiementhadoop.internal"}\')
INFO 2018-10-18 10:05:31,057 NetUtil.py:67 - Connecting to &lt;A href="https://master-am.c.deploiementhadoop.internal:8440/connection_info" target="_blank"&gt;https://master-am.c.deploiementhadoop.internal:8440/connection_info&lt;/A&gt;
INFO 2018-10-18 10:05:31,107 security.py:93 - SSL Connect being called.. connecting to the server
', None)

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:05:33

Registering with the server... &lt;/PRE&gt;&lt;P&gt;Registration with the server failed.&lt;/P&gt;</description>
      <pubDate>Thu, 18 Oct 2018 19:25:53 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-install-HDP-I-got-a-problem-on-the-step-quot/m-p/213966#M84437</guid>
      <dc:creator>amina_meziani21</dc:creator>
      <dc:date>2018-10-18T19:25:53Z</dc:date>
    </item>
    <item>
      <title>Re: I am trying to install HDP and hit an error at the "Confirm Hosts" step. Has anyone seen this error or have an idea how to solve it? The log is in the description.</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-install-HDP-I-got-a-problem-on-the-step-quot/m-p/213967#M84438</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/97566/aminameziani21.html" nodeid="97566"&gt;@Amin Meziani&lt;/A&gt; ,&lt;/P&gt;&lt;P&gt;IT seems you are installing in Ubuntu and  facing the issue mentioned in here : &lt;A href="https://community.hortonworks.com/content/supportkb/48912/how-to-install-the-hortonworks-gpg-key-on-ubuntu.html" target="_blank"&gt;https://community.hortonworks.com/content/supportkb/48912/how-to-install-the-hortonworks-gpg-key-on-ubuntu.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Can you please See if the solution works and revert.&lt;/P&gt;&lt;P&gt;Please accept this answer if this solution worked for you.&lt;/P&gt;</description>
      <pubDate>Fri, 19 Oct 2018 12:37:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-install-HDP-I-got-a-problem-on-the-step-quot/m-p/213967#M84438</guid>
      <dc:creator>akhilsnaik</dc:creator>
      <dc:date>2018-10-19T12:37:46Z</dc:date>
    </item>
    <item>
      <title>Re: I am trying to install HDP and hit an error at the "Confirm Hosts" step. Has anyone seen this error or have an idea how to solve it? The log is in the description.</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-install-HDP-I-got-a-problem-on-the-step-quot/m-p/213968#M84439</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/18735/asnaik.html" nodeid="18735"&gt;@Akhil S Naik&lt;/A&gt; , thank you for your. it works.&lt;/P&gt;</description>
      <pubDate>Thu, 25 Oct 2018 20:50:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-install-HDP-I-got-a-problem-on-the-step-quot/m-p/213968#M84439</guid>
      <dc:creator>amina_meziani21</dc:creator>
      <dc:date>2018-10-25T20:50:36Z</dc:date>
    </item>
  </channel>
</rss>

