Member since: 09-20-2018 · Posts: 26 · Kudos Received: 1 · Solutions: 0
02-04-2019
06:49 PM
I am installing a 6-node cluster with 1 node as the Ambari server and the other 5 as agents. Ambari version: 2.7.0.0, HDP version: 3.0. I am getting an error at the installation stage, where it is unable to query hdp-select for versions. Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stack-hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
BeforeInstallHook().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
self.save_component_version_to_structured_out(self.command_name)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 222, in save_component_version_to_structured_out
stack_select_package_name = stack_select.get_package_name()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
supported_packages = get_supported_packages()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select
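The hook fails while shelling out to /usr/bin/hdp-select. A minimal standalone probe like the sketch below shows whether hdp-select is present and runnable on the agent host at all (the "versions" subcommand is used only as an illustrative read-only call, not necessarily the exact call the hook makes):

```python
# Standalone probe: is /usr/bin/hdp-select present and runnable on this host?
# The 'versions' subcommand is only a harmless read-only call for illustration.
from __future__ import print_function
import os
import subprocess

SELECTOR = "/usr/bin/hdp-select"   # path taken from the traceback above

if not os.path.isfile(SELECTOR):
    print("hdp-select is not installed at %s" % SELECTOR)
else:
    proc = subprocess.Popen([SELECTOR, "versions"],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = proc.communicate()
    print("return code: %d" % proc.returncode)
    print("stdout: %s" % out.decode("utf-8", "replace"))
    print("stderr: %s" % err.decode("utf-8", "replace"))
```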
02-04-2019
02:15 PM
Thanks @Geoffrey Shelton Okot! It worked for me!
02-03-2019
01:19 AM
I am trying to install HDP 3.0 on a 6-node cluster where 5 nodes act as Ambari agents and 1 node as the server. Here is the log file: stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER_KMS/package/scripts/kms_server.py", line 137, in <module>
KmsServer().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER_KMS/package/scripts/kms_server.py", line 51, in install
kms.setup_kms_db()
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER_KMS/package/scripts/kms.py", line 68, in setup_kms_db
copy_jdbc_connector(kms_home)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/RANGER_KMS/package/scripts/kms.py", line 359, in copy_jdbc_connector
Please run 'ambari-server setup --jdbc-db={db_name} --jdbc-driver={path_to_jdbc} on server host.'".format(params.db_flavor, params.jdk_location)
KeyError: 'db_name'
stdout:
2019-02-02 04:43:22,791 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2019-02-02 04:43:22,810 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-02-02 04:43:22,813 - Group['kms'] {}
2019-02-02 04:43:22,815 - Group['livy'] {}
2019-02-02 04:43:22,815 - Group['spark'] {}
2019-02-02 04:43:22,819 - Group['ranger'] {}
2019-02-02 04:43:22,820 - Group['hdfs'] {}
2019-02-02 04:43:22,820 - Group['hadoop'] {}
2019-02-02 04:43:22,820 - Group['users'] {}
2019-02-02 04:43:22,821 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-02-02 04:43:22,824 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-02-02 04:43:22,829 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-02-02 04:43:22,831 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-02-02 04:43:22,833 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-02-02 04:43:22,835 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2019-02-02 04:43:22,840 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-02-02 04:43:22,842 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['kms', 'hadoop'], 'uid': None}
2019-02-02 04:43:22,844 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2019-02-02 04:43:22,849 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2019-02-02 04:43:22,851 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-02-02 04:43:22,853 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-02-02 04:43:22,858 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2019-02-02 04:43:22,860 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-02-02 04:43:22,862 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-02-02 04:43:22,867 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-02-02 04:43:22,868 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-02-02 04:43:22,871 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-02-02 04:43:22,884 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-02-02 04:43:22,885 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2019-02-02 04:43:22,887 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-02-02 04:43:22,889 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-02-02 04:43:22,891 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2019-02-02 04:43:22,905 - call returned (0, '1017')
2019-02-02 04:43:22,906 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1017'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2019-02-02 04:43:22,914 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1017'] due to not_if
2019-02-02 04:43:22,915 - Group['hdfs'] {}
2019-02-02 04:43:22,916 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2019-02-02 04:43:22,917 - FS Type: HDFS
2019-02-02 04:43:22,918 - Directory['/etc/hadoop'] {'mode': 0755}
2019-02-02 04:43:22,964 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-02-02 04:43:22,966 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-02-02 04:43:23,007 - Repository['HDP-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-02-02 04:43:23,032 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-02-02 04:43:23,034 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2019-02-02 04:43:23,039 - Repository['HDP-3.0-GPL-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-02-02 04:43:23,049 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-02-02 04:43:23,050 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2019-02-02 04:43:23,050 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-02-02 04:43:23,060 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-3.0-repo-1]\nname=HDP-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.0-GPL-repo-1]\nname=HDP-3.0-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.0.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-02-02 04:43:23,060 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2019-02-02 04:43:23,061 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-02-02 04:43:23,519 - Skipping installation of existing package unzip
2019-02-02 04:43:23,519 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-02-02 04:43:23,570 - Skipping installation of existing package curl
2019-02-02 04:43:23,570 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-02-02 04:43:23,626 - Skipping installation of existing package hdp-select
2019-02-02 04:43:23,639 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2019-02-02 04:43:24,364 - Package['ranger_3_0_1_0_187-kms'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-02-02 04:43:24,819 - Skipping installation of existing package ranger_3_0_1_0_187-kms
2019-02-02 04:43:24,822 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2019-02-02 04:43:24,892 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-02-02 04:43:24,910 - Execute[('cp', '-f', u'/usr/hdp/current/ranger-kms/install.properties', u'/usr/hdp/current/ranger-kms/install-backup.properties')] {'not_if': 'ls /usr/hdp/current/ranger-kms/install-backup.properties', 'sudo': True, 'only_if': 'ls /usr/hdp/current/ranger-kms/install.properties'}
2019-02-02 04:43:24,923 - Skipping Execute[('cp', '-f', u'/usr/hdp/current/ranger-kms/install.properties', u'/usr/hdp/current/ranger-kms/install-backup.properties')] due to not_if
2019-02-02 04:43:24,924 - Password validated
2019-02-02 04:43:24,930 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
Command failed after 1 tries
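The KeyError: 'db_name' in the stderr above is raised by the error-message string itself: it uses named placeholders ({db_name}, {path_to_jdbc}) but is filled with positional arguments, so Python fails before the intended message can be printed. (The intended message asks you to run ambari-server setup --jdbc-db=<db> --jdbc-driver=<path> on the server host.) A minimal sketch of that behaviour, using placeholder values:

```python
# Minimal reproduction of the KeyError seen in the stderr above: a format
# string with *named* placeholders cannot be filled with positional arguments.
from __future__ import print_function

msg = "Please run 'ambari-server setup --jdbc-db={db_name} --jdbc-driver={path_to_jdbc} on server host.'"

try:
    # positional arguments, as in the traceback -> KeyError: 'db_name'
    msg.format("mysql", "/path/to/connector.jar")   # placeholder values
except KeyError as exc:
    print("KeyError: %s" % exc)

# keyword arguments matching the placeholders format cleanly:
print(msg.format(db_name="mysql", path_to_jdbc="/path/to/connector.jar"))
```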
Labels: Apache Ranger
12-04-2018
06:44 AM
Thanks @Akhil S Naik, the issue was resolved using the above method.
11-27-2018
02:56 PM
Hi @Akhil S Naik, I got the following result after running the command: WARNING 2018-11-27 13:42:59,507 NetUtil.py:124 - Server at https://devdp30.eng.ssn:8440 is not reachable, sleeping for 10 seconds...
INFO 2018-11-27 13:43:09,507 NetUtil.py:70 - Connecting to https://devdp30.eng.ssn:8440/ca
ERROR 2018-11-27 13:43:09,509 NetUtil.py:96 - EOF occurred in violation of protocol (_ssl.c:618)
ERROR 2018-11-27 13:43:09,509 NetUtil.py:97 - SSLError: Failed to connect. Please check openssl library versions.
Refer to: https://bugzilla.redhat.com/show_bug.cgi?id=1022468 for more details.
WARNING 2018-11-27 13:43:09,509 NetUtil.py:124 - Server at https://devdp30.eng.ssn:8440 is not reachable, sleeping for 10 seconds...
INFO 2018-11-27 13:43:19,510 main.py:439 - Connecting to Ambari server at https://devdp30.eng.ssn:8440 (172.23.222.59)
INFO 2018-11-27 13:43:19,510 NetUtil.py:70 - Connecting to https://devdp30.eng.ssn:8440/ca
ERROR 2018-11-27 13:43:19,511 NetUtil.py:96 - EOF occurred in violation of protocol (_ssl.c:618)
ERROR 2018-11-27 13:43:19,511 NetUtil.py:97 - SSLError: Failed to connect. Please check openssl library versions.
Refer to: https://bugzilla.redhat.com/show_bug.cgi?id=1022468 for more details.
WARNING 2018-11-27 13:43:19,511 NetUtil.py:124 - Server at https://devdp30.eng.ssn:8440 is not reachable, sleeping for 10 seconds...
INFO 2018-11-27 13:43:29,512 NetUtil.py:70 - Connecting to https://devdp30.eng.ssn:8440/ca
ERROR 2018-11-27 13:43:29,513 NetUtil.py:96 - EOF occurred in violation of protocol (_ssl.c:618)
ERROR 2018-11-27 13:43:29,513 NetUtil.py:97 - SSLError: Failed to connect. Please check openssl library versions.
Refer to: https://bugzilla.redhat.com/show_bug.cgi?id=1022468 for more details.
WARNING 2018-11-27 13:43:29,514 NetUtil.py:124 - Server at https://devdp30.eng.ssn:8440 is not reachable, sleeping for 10 seconds...
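The repeated "EOF occurred in violation of protocol" errors happen during the TLS handshake with the server on port 8440. A small probe like the sketch below (host and port taken from the log above; it needs Python 2.7.9+ or Python 3 for ssl.SSLContext) attempts the handshake directly and reports the negotiated cipher, which helps separate a local python/openssl problem from a server-side one:

```python
# Hypothetical TLS handshake probe against the registration port from the log
# above. Verification is disabled because the registration endpoint typically
# uses Ambari's own self-signed certificate. Requires Python 2.7.9+ or Python 3.
from __future__ import print_function
import socket
import ssl

HOST, PORT = "devdp30.eng.ssn", 8440   # values taken from the log above

context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)  # let both sides negotiate the protocol
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

sock = socket.create_connection((HOST, PORT), timeout=10)
try:
    tls = context.wrap_socket(sock)
    print("handshake ok, negotiated cipher: %s" % (tls.cipher(),))
    tls.close()
except ssl.SSLError as exc:
    print("handshake failed: %s" % exc)
finally:
    sock.close()
```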
11-27-2018
11:46 AM
The output of the command is attached. I changed the URL port to 8433, but the server is still not reachable.
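Independent of which port is configured, it is worth confirming first that the server port is reachable from the agent host at the plain TCP level. A tiny check along these lines (host and port are placeholders; substitute whatever the agent is actually configured with):

```python
# Plain TCP reachability check from the agent host to the Ambari server port.
# Host and port below are placeholders; use the values the agent is configured with.
from __future__ import print_function
import socket

HOST, PORT = "devdp30.eng.ssn", 8440

try:
    sock = socket.create_connection((HOST, PORT), timeout=5)
    print("TCP connection to %s:%d succeeded" % (HOST, PORT))
    sock.close()
except (socket.error, socket.timeout) as exc:
    print("TCP connection to %s:%d failed: %s" % (HOST, PORT, exc))
```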
11-27-2018
08:11 AM
Hi @Sampath Kumar, I have already followed all the steps stated above for the installation, but I am still getting an error while registering. devdp30.eng.ssn is the hostname I have given, and I have changed it in all the required places.
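Registration depends on the name entered in Ambari matching the FQDN each host actually reports. A quick sanity check along these lines can rule out a mismatch (the expected value is a placeholder for whatever was typed into the Ambari UI for this host):

```python
# Sanity check: does the FQDN this host reports match the name entered on the
# Ambari host-registration page? The expected value below is a placeholder.
from __future__ import print_function
import socket

expected = "host01.example.com"   # replace with the name you typed into Ambari for this host
fqdn = socket.getfqdn()           # roughly what `hostname -f` reports

print("this host reports FQDN: %s" % fqdn)
print("match" if fqdn.lower() == expected.lower()
      else "MISMATCH - registration is likely to fail")
```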
11-27-2018
06:24 AM
Hi @Akhil S Naik, I have manually installed the agent, made the changes in the ambari-agent .ini file, and restarted everything, but I am still getting the error that registration with the server failed. I changed the server name in the .ini file to the output of the hostname -f command.
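For completeness, the value the agent will actually use can be read back from that file. A small sketch, assuming the default path /etc/ambari-agent/conf/ambari-agent.ini and the usual [server] hostname entry:

```python
# Read back the server hostname the agent is configured with. The path and the
# [server]/hostname layout are the usual defaults and are assumed here.
from __future__ import print_function
try:
    import configparser                  # Python 3
except ImportError:
    import ConfigParser as configparser  # Python 2, which Ambari agents typically run

INI_PATH = "/etc/ambari-agent/conf/ambari-agent.ini"

cfg = configparser.ConfigParser()
cfg.read(INI_PATH)
print("agent will register against: %s" % cfg.get("server", "hostname"))
```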
11-27-2018
05:54 AM
OS: RHEL 7. I am still not very clear about the FQDN of the system that is needed while registering the host. The screenshot is attached.
Labels: Apache Ambari
09-24-2018
11:52 AM
@amarnath reddy pappu Could you help me figure out the error in the images below? Even after 3 days, I am not able to see any hosts when running http://ambari-server-host:8080/api/v1/hosts.
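For reference, the same endpoint can be queried from a script using only the standard library. A minimal sketch, where the credentials and server name are placeholders for your own Ambari admin account and host:

```python
# Query the /api/v1/hosts endpoint mentioned above. Credentials and the server
# name are placeholders; an empty "items" list means no hosts have registered.
from __future__ import print_function
import base64
try:
    from urllib.request import Request, urlopen   # Python 3
except ImportError:
    from urllib2 import Request, urlopen          # Python 2

URL = "http://ambari-server-host:8080/api/v1/hosts"
USER, PASSWORD = "admin", "admin"                  # placeholder credentials

req = Request(URL)
token = base64.b64encode(("%s:%s" % (USER, PASSWORD)).encode("ascii")).decode("ascii")
req.add_header("Authorization", "Basic %s" % token)

print(urlopen(req).read().decode("utf-8"))
```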