Archives of Support Questions (Read Only)

This is an archived board kept for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

I am trying to install HDP and I hit a problem at the "Confirm Hosts" step. I don't understand where the problem comes from. Has anyone run into the same error, or have an idea of how to solve it? Can you help me, please? The log is below.

==========================
Creating target directory...
==========================

Command start time 2018-10-18 10:04:53

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:53

==========================
Copying ambari sudo script...
==========================

Command start time 2018-10-18 10:04:53

scp /var/lib/ambari-server/ambari-sudo.sh
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:54

==========================
Copying common functions script...
==========================

Command start time 2018-10-18 10:04:54

scp /usr/lib/python2.6/site-packages/ambari_commons
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:54

==========================
Copying create-python-wrap script...
==========================

Command start time 2018-10-18 10:04:54

scp /var/lib/ambari-server/create-python-wrap.sh
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:54

==========================
Copying OS type check script...
==========================

Command start time 2018-10-18 10:04:54

scp /usr/lib/python2.6/site-packages/ambari_server/os_check_type.py
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:55

==========================
Running create-python-wrap script...
==========================

Command start time 2018-10-18 10:04:55

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:55

==========================
Running OS type check...
==========================

Command start time 2018-10-18 10:04:55
Cluster primary/cluster OS family is ubuntu16 and local/current OS family is ubuntu16

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:55

==========================
Checking 'sudo' package on remote host...
==========================

Command start time 2018-10-18 10:04:55

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:56

==========================
Copying repo file to 'tmp' folder...
==========================

Command start time 2018-10-18 10:04:56

scp /etc/apt/sources.list.d/ambari.list
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:56

==========================
Moving file to repo dir...
==========================

Command start time 2018-10-18 10:04:56

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:56

==========================
Changing permissions for ambari.repo...
==========================

Command start time 2018-10-18 10:04:56

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:56

==========================
Update apt cache of repository...
==========================

Command start time 2018-10-18 10:04:56

Get:1 http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3 Ambari InRelease [7,394 B]
Ign:1 http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3 Ambari InRelease
Hit:2 http://archive.canonical.com/ubuntu xenial InRelease
Fetched 7,394 B in 0s (27.0 kB/s)
Reading package lists... Done

W: GPG error: http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3 Ambari InRelease: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY B9733A7A07513CAD
W: The repository 'http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.5.0.3 Ambari InRelease' is not signed.
N: Data from such a repository can't be authenticated and is therefore potentially dangerous to use.
N: See apt-secure(8) manpage for repository creation and user configuration details.

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:57

==========================
Copying setup script file...
==========================

Command start time 2018-10-18 10:04:57

scp /usr/lib/python2.6/site-packages/ambari_server/setupAgent.py
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:04:57

==========================
Running setup agent script...
==========================

Command start time 2018-10-18 10:04:57
('INFO 2018-10-18 10:05:30,349 ExitHelper.py:56 - Performing cleanup before exiting...
INFO 2018-10-18 10:05:30,825 main.py:145 - loglevel=logging.INFO
INFO 2018-10-18 10:05:30,825 main.py:145 - loglevel=logging.INFO
INFO 2018-10-18 10:05:30,825 main.py:145 - loglevel=logging.INFO
INFO 2018-10-18 10:05:30,826 DataCleaner.py:39 - Data cleanup thread started
INFO 2018-10-18 10:05:30,829 DataCleaner.py:120 - Data cleanup started
INFO 2018-10-18 10:05:30,829 DataCleaner.py:122 - Data cleanup finished
INFO 2018-10-18 10:05:30,849 PingPortListener.py:50 - Ping port listener started on port: 8670
INFO 2018-10-18 10:05:30,850 main.py:436 - Connecting to Ambari server at https://master-am.c.deploiementhadoop.internal:8440 (192.168.0.3)
INFO 2018-10-18 10:05:30,850 NetUtil.py:67 - Connecting to https://master-am.c.deploiementhadoop.internal:8440/ca
INFO 2018-10-18 10:05:30,900 main.py:446 - Connected to Ambari server master-am.c.deploiementhadoop.internal
INFO 2018-10-18 10:05:30,901 threadpool.py:58 - Started thread pool with 3 core threads and 20 maximum threads
WARNING 2018-10-18 10:05:30,902 AlertSchedulerHandler.py:280 - [AlertScheduler] /var/lib/ambari-agent/cache/alerts/definitions.json not found or invalid. No alerts will be scheduled until registration occurs.
INFO 2018-10-18 10:05:30,902 AlertSchedulerHandler.py:175 - [AlertScheduler] Starting <ambari_agent.apscheduler.scheduler.Scheduler object at 0x7f9c56b89bd0>; currently running: False
INFO 2018-10-18 10:05:30,908 hostname.py:98 - Read public hostname \'w1-am.c.deploiementhadoop.internal\' using socket.getfqdn()
INFO 2018-10-18 10:05:30,916 Hardware.py:174 - Some mount points were ignored: /run, /dev/shm, /run/lock, /sys/fs/cgroup, /run/user/1001, /run/user/0
INFO 2018-10-18 10:05:30,927 Facter.py:202 - Directory: \'/etc/resource_overrides\' does not exist - it won\'t be used for gathering system resources.
INFO 2018-10-18 10:05:31,057 Controller.py:170 - Registering with w1-am.c.deploiementhadoop.internal (192.168.0.4) (agent=\'{"hardwareProfile": {"kernel": "Linux", "domain": "c.deploiementhadoop.internal", "physicalprocessorcount": 1, "kernelrelease": "4.15.0-1021-gcp", "uptime_days": "0", "memorytotal": 3781816, "swapfree": "0.00 GB", "memorysize": 3781816, "osfamily": "ubuntu", "swapsize": "0.00 GB", "processorcount": 1, "netmask": "255.255.255.255", "timezone": "UTC", "hardwareisa": "x86_64", "memoryfree": 2957172, "operatingsystem": "ubuntu", "kernelmajversion": "4.15", "kernelversion": "4.15.0", "macaddress": "42:01:C0:A8:00:04", "operatingsystemrelease": "16.04", "ipaddress": "192.168.0.4", "hostname": "w1-am", "uptime_hours": "3", "fqdn": "w1-am.c.deploiementhadoop.internal", "id": "root", "architecture": "x86_64", "selinux": false, "mounts": [{"available": "1879756", "used": "0", "percent": "0%", "device": "udev", "mountpoint": "/dev", "type": "devtmpfs", "size": "1879756"}, {"available": "149950580", "used": "2442596", "percent": "2%", "device": "/dev/sda1", "mountpoint": "/", "type": "ext4", "size": "152409560"}], "hardwaremodel": "x86_64", "uptime_seconds": "10847", "interfaces": "ens4,lo"}, "currentPingPort": 8670, "prefix": "/var/lib/ambari-agent/data", "agentVersion": "2.5.0.3", "agentEnv": {"transparentHugePage": "madvise", "hostHealth": {"agentTimeStampAtReporting": 1539857131055, "activeJavaProcs": [], "liveServices": [{"status": "Healthy", "name": "ntp or chrony", "desc": ""}]}, "reverseLookup": true, "alternatives": [], "umask": "18", "firewallName": "ufw", "stackFoldersAndFiles": [], "existingUsers": [], "firewallRunning": false}, "timestamp": 1539857130932, "hostname": "w1-am.c.deploiementhadoop.internal", "responseId": -1, "publicHostname": "w1-am.c.deploiementhadoop.internal"}\')
INFO 2018-10-18 10:05:31,057 NetUtil.py:67 - Connecting to https://master-am.c.deploiementhadoop.internal:8440/connection_info
INFO 2018-10-18 10:05:31,107 security.py:93 - SSL Connect being called.. connecting to the server
', None)

Connection to w1-am.c.deploiementhadoop.internal closed.
SSH command execution finished
host=w1-am.c.deploiementhadoop.internal, exitcode=0
Command end time 2018-10-18 10:05:33

Registering with the server... 

Registration with the server failed.

1 ACCEPTED SOLUTION


Hi @Amin Meziani ,

It seems you are installing on Ubuntu and hitting the issue described here: https://community.hortonworks.com/content/supportkb/48912/how-to-install-the-hortonworks-gpg-key-on-...

Please see if that solution works and report back.

Please accept this answer if it solved the problem for you.

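For reference, the linked KB article boils down to importing the missing Hortonworks public key on each host. A minimal sketch for Ubuntu 16.04, using the key ID reported in the NO_PUBKEY error in the log above (the choice of keyserver.ubuntu.com is an assumption; any public keyserver carrying the key should work):

```shell
# Import the Hortonworks GPG key apt complained about (NO_PUBKEY B9733A7A07513CAD).
# keyserver.ubuntu.com is an assumption; substitute another public keyserver if needed.
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys B9733A7A07513CAD

# Refresh the apt cache; the Ambari repository should now verify cleanly.
sudo apt-get update
```

Run this as root (or via sudo) on every cluster host, then retry registration from the "Confirm Hosts" page.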

2 REPLIES



Hi @Akhil S Naik , thank you for your help. It works.