Created on 01-26-2018 12:50 AM - edited 08-17-2019 09:56 PM
Building a dev playground for HDF. Installed Ambari 2.6.0 and PostgreSQL 9.6, and did all the pre-req setup steps (creating the Ambari DB, Registry DB, SAM DB, etc.). Configured PostgreSQL for remote access per everything listed here: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.2/bk_installing-hdf-on-hdp/bk_installing-hdf-...
I've also registered the PostgreSQL JDBC driver with the Ambari server as noted here: https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.18/bk_ambari-reference/content/_using_hive_wi...
Installed ZooKeeper and Ambari Metrics, all green. Added Kafka, also green. (Note: this is a single-node system, just a playground.) Went through the install steps for Schema Registry: switched the DB type to postgres, set the password to registry as I set it in PostgreSQL, and updated the FQDN and the configuration lines as shown below. Clicking Next, it runs for a few seconds and then bombs.
I have also gone through the steps in the Best Answer for this issue to ensure I have the right Schema Registry version packages: https://community.hortonworks.com/questions/160769/there-is-an-error-in-installing-hdf-3020-in-hdp-2...
Does anyone have any idea what I'm missing or have misconfigured?
stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/registry_server.py", line 129, in <module>
    RegistryServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/registry_server.py", line 57, in install
    import params
  File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/params.py", line 173, in <module>
    connector_curl_source = format("{jdk_location}/{jdbc_driver_jar}")
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/format.py", line 95, in format
    return ConfigurationFormatter().format(format_string, args, **result)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/format.py", line 59, in format
    result_protected = self.vformat(format_string, args, all_params)
  File "/usr/lib64/python2.7/string.py", line 549, in vformat
    result = self._vformat(format_string, args, kwargs, used_args, 2)
  File "/usr/lib64/python2.7/string.py", line 571, in _vformat
    obj, arg_used = self.get_field(field_name, args, kwargs)
  File "/usr/lib64/python2.7/string.py", line 632, in get_field
    obj = self.get_value(first, args, kwargs)
  File "/usr/lib64/python2.7/string.py", line 591, in get_value
    return kwargs[key]
  File "/usr/lib/python2.6/site-packages/resource_management/core/utils.py", line 63, in __getitem__
    return self._convert_value(self._dict[name])
KeyError: 'jdbc_driver_jar'
stdout:
2018-01-25 16:18:45,596 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
User Group mapping (user_group) is missing in the hostLevelParams
2018-01-25 16:18:45,600 - Group['hadoop'] {}
2018-01-25 16:18:45,601 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,602 - call['/var/lib/ambari-agent/tmp/changeUid.sh registry'] {}
2018-01-25 16:18:45,609 - call returned (0, '1004')
2018-01-25 16:18:45,609 - User['registry'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1004}
2018-01-25 16:18:45,611 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,611 - call['/var/lib/ambari-agent/tmp/changeUid.sh zookeeper'] {}
2018-01-25 16:18:45,619 - call returned (0, '1001')
2018-01-25 16:18:45,620 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1001}
2018-01-25 16:18:45,620 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,621 - call['/var/lib/ambari-agent/tmp/changeUid.sh ams'] {}
2018-01-25 16:18:45,628 - call returned (0, '1002')
2018-01-25 16:18:45,629 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1002}
2018-01-25 16:18:45,630 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-01-25 16:18:45,630 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,631 - call['/var/lib/ambari-agent/tmp/changeUid.sh kafka'] {}
2018-01-25 16:18:45,638 - call returned (0, '1005')
2018-01-25 16:18:45,638 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1005}
2018-01-25 16:18:45,639 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,640 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-01-25 16:18:45,646 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-01-25 16:18:45,659 - Repository['HDF-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.2.0', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdf-1', 'mirror_list': None}
2018-01-25 16:18:45,665 - File['/etc/yum.repos.d/ambari-hdf-1.repo'] {'content': '[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.2.0\n\npath=/\nenabled=1\ngpgcheck=...'}
2018-01-25 16:18:45,666 - Writing File['/etc/yum.repos.d/ambari-hdf-1.repo'] because contents don't match
2018-01-25 16:18:45,667 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdf-1', 'mirror_list': None}
2018-01-25 16:18:45,669 - File['/etc/yum.repos.d/ambari-hdf-1.repo'] {'content': '[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.2.0\n\npath=/\nenabled=1\ngpgcheck=...'}
2018-01-25 16:18:45,669 - Writing File['/etc/yum.repos.d/ambari-hdf-1.repo'] because contents don't match
2018-01-25 16:18:45,670 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-25 16:18:45,741 - Skipping installation of existing package unzip
2018-01-25 16:18:45,741 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-25 16:18:45,752 - Skipping installation of existing package curl
Created 01-26-2018 12:56 AM
Interestingly enough, I was able to get Schema Registry installed: while the failure message was still up, I went into params.py and set the parameters to point directly at the JDBC driver on my box:
/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/params.py
I set these lines:
connector_curl_source = format("/usr/share/java/postgresql-jdbc.jar")
downloaded_custom_connector = format("/usr/share/java/postgresql-jdbc.jar")
Then, after saving my changes, I clicked Retry on the install and it went through just fine. Any ideas why the variable substitution that I think should have been happening didn't?
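The KeyError in the trace comes from Ambari's format() helper, which fills {placeholders} in the template from the variables defined in params.py. A minimal sketch of that behavior, using plain str.format() as a stand-in for resource_management's ConfigurationFormatter (the host name and resources URL are made up):

```python
# Ambari's format() substitutes {placeholders} from the params module's
# variables, much like str.format(). If jdbc_driver_jar was never assigned
# (it is only set for some database types), the lookup raises KeyError.
params = {"jdk_location": "http://ambari-host:8080/resources"}

template = "{jdk_location}/{jdbc_driver_jar}"
try:
    print(template.format(**params))
except KeyError as e:
    print("KeyError:", e)  # KeyError: 'jdbc_driver_jar'
```

Hard-coding the full path, as above, sidesteps the substitution entirely, which is why Retry succeeded.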
Created 01-26-2018 01:03 AM
Confirm that the Postgres JDBC driver is present at "/usr/share/java/postgresql-jdbc.jar". If not, please download the JDBC driver from: https://jdbc.postgresql.org/
Then perform these steps:
# ls /usr/share/java/postgresql-jdbc.jar
# chmod 644 /usr/share/java/postgresql-jdbc.jar
# ambari-server setup --jdbc-db=postgres --jdbc-driver=/usr/share/java/postgresql-jdbc.jar
# ambari-server restart
Created 01-26-2018 01:07 AM
Ambari sets the Postgres JDBC JAR location to the default:
Created 01-26-2018 06:01 PM
Yeah, that's what I did, except for the chmod command. I went and checked the permissions on the file and they were already set to that, so I'm not sure why it didn't work until I hard-coded the path.
Created 04-09-2018 07:55 PM
Hey Alex,
I had the same problem 🙂
It seems /var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/params.py sets the JDBC JAR for oracle and mysql, but not for postgresql.
This is missing:
if 'postgresql' == registry_storage_type:
    jdbc_driver_jar = default("/hostLevelParams/custom_postgres_jdbc_name", None)
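For context on that one-liner: default() in Ambari's resource_management library looks up a /-separated path in the command JSON and returns a fallback when the key is absent. A hypothetical stand-in (the real helper reads the agent's command config, not a passed-in dict):

```python
# Hypothetical stand-in for resource_management's default(): walk the
# command JSON along a /-separated path, returning the fallback when any
# segment is missing.
def default(path, fallback, config):
    node = config
    for part in path.strip("/").split("/"):
        if not isinstance(node, dict) or part not in node:
            return fallback
        node = node[part]
    return node

cmd = {"hostLevelParams": {"custom_postgres_jdbc_name": "postgresql-jdbc.jar"}}
print(default("/hostLevelParams/custom_postgres_jdbc_name", None, cmd))
# -> postgresql-jdbc.jar
print(default("/hostLevelParams/custom_oracle_jdbc_name", None, cmd))
# -> None
```

With the postgresql branch missing, jdbc_driver_jar is never assigned at all, which is exactly the KeyError the installer hit.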
Andre
Created 06-29-2018 06:54 PM
It is the same when you set up Postgres for the Streaming Analytics Manager.
The path is /var/lib/ambari-agent/cache/common-services/STREAMLINE/0.5.0/package/scripts/params.py
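The same one-line guard applies there. A hedged sketch for checking whether either script already carries the postgres branch (the helper is illustrative, not part of Ambari; it keys off the custom_postgres_jdbc_name lookup the fix introduces):

```python
# Hedged helper (not part of Ambari): scan a params.py source for the
# custom_postgres_jdbc_name lookup to tell whether the postgres branch
# described above is present before attempting an install.
def postgres_handled(params_source):
    return "custom_postgres_jdbc_name" in params_source

paths = [
    "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/params.py",
    "/var/lib/ambari-agent/cache/common-services/STREAMLINE/0.5.0/package/scripts/params.py",
]
for p in paths:
    try:
        with open(p) as f:
            status = "postgres handled" if postgres_handled(f.read()) else "postgres branch missing"
    except IOError:
        status = "file not found"
    print(p, "->", status)
```

Note that the agent re-syncs its cache from the server's /var/lib/ambari-server/resources/common-services copy, so a patch applied only under /var/lib/ambari-agent/cache may be overwritten.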