Member since 11-17-2017 · 6 Posts · 0 Kudos Received · 0 Solutions
04-13-2018 10:20 AM
I'd guess HDF 3.1.0.0 is also incompatible with HDP 2.6.3. The docs note that HDF 3.1.1 added to HDP 2.6.4 doesn't support Schema Registry or SAM: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.1/bk_installing-hdf-and-hdp/content/ch_add-hdf-to-hdp.html
04-03-2018 11:13 PM
I have an HDP node with Ambari server installed that is only running ZooKeeper, HDFS, YARN, and MapReduce2. I added the HDF management pack for 3.1.0.0 and wanted to install Kafka and Schema Registry. I realize Kafka is already available with HDP, but I wanted to add Schema Registry as well. When I went to add the Kafka and Schema Registry services, Schema Registry did not show up as an available service. NiFi, NiFi Registry, and the other HDF services are available, but not SAM or Schema Registry. Does anyone know why this would be?
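For anyone debugging the same thing: one way to see what the management pack actually registered is to ask Ambari's REST API which services the HDF stack definition advertises. This is only a sketch; the localhost URL, admin/admin credentials, and the 3.1 version path are assumptions for a default single-node install.

    # Sketch: list the services Ambari's HDF stack definition advertises.
    # Assumes a default install: Ambari on localhost:8080, admin/admin creds.
    import base64
    import json
    import urllib2  # the Ambari agent scripts run Python 2, hence urllib2

    url = "http://localhost:8080/api/v1/stacks/HDF/versions/3.1/services"
    req = urllib2.Request(url)
    req.add_header("Authorization", "Basic " + base64.b64encode("admin:admin"))
    req.add_header("X-Requested-By", "ambari")

    data = json.load(urllib2.urlopen(req))
    for item in data["items"]:
        # If REGISTRY (Schema Registry) and STREAMLINE (SAM) are missing here,
        # the mpack (or this HDF/HDP combination) never registered them.
        print(item["StackServices"]["service_name"])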
01-26-2018 06:01 PM
Yeah, that's what I did, except for the chmod command. I went back and checked the permissions on the file and they were already set to that, so I'm not sure why it didn't work until I hardcoded the paths.
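In case it helps the next person, this is the kind of check I mean; the 644 mode is an assumption based on the usual chmod advice for the driver jar.

    # Confirm the driver jar's mode and ownership; 0644 is an assumption
    # based on the usual chmod advice for /usr/share/java/postgresql-jdbc.jar.
    import os
    import stat

    st = os.stat("/usr/share/java/postgresql-jdbc.jar")
    print(oct(stat.S_IMODE(st.st_mode)))  # expect 0644 (world-readable)
    print(st.st_uid, st.st_gid)           # ownership matters if the mode is stricter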
01-26-2018 12:56 AM
Interestingly enough, I was able to get Schema Registry installed. While the failure message was up, I went into /var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/params.py and set the parameters to point to the JDBC driver on my box:

    connector_curl_source = format("/usr/share/java/postgresql-jdbc.jar")
    downloaded_custom_connector = format("/usr/share/java/postgresql-jdbc.jar")

After saving my changes I clicked Retry on the install and it went through just fine. Any ideas why the variable substitution that should have been happening didn't?
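For what it's worth, my understanding of why the hardcode works: Ambari's format() fills {name} placeholders from the variables available to params.py, so a literal path with no braces never triggers a lookup. A toy mimic (not the real resource_management code; the params dict here is made up) reproduces both behaviors:

    # Toy mimic of how resource_management's format() resolves placeholders
    # (not the real implementation; the params dict below is hypothetical).
    import string

    class ParamsFormatter(string.Formatter):
        def __init__(self, params):
            self.params = params

        def get_value(self, key, args, kwargs):
            # An undefined name fails exactly like the KeyError in the
            # stack trace in my original post below.
            return self.params[key]

    params = {"jdk_location": "http://ambari-host:8080/resources"}
    fmt = ParamsFormatter(params)

    print(fmt.vformat("/usr/share/java/postgresql-jdbc.jar", (), {}))  # no braces: passed through
    print(fmt.vformat("{jdk_location}/{jdbc_driver_jar}", (), {}))     # KeyError: 'jdbc_driver_jar'

So hardcoding sidesteps the substitution entirely; the real question is why jdbc_driver_jar was never defined in the first place.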
01-26-2018 12:50 AM
Building a dev playground for HDF. Installed Ambari 2.6.0 and PostgreSQL 9.6, and did all the prerequisite setup steps (creating the Ambari DB, Registry DB, SAM DB, etc.). Configured PostgreSQL for remote access, following everything listed here: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.2/bk_installing-hdf-on-hdp/bk_installing-hdf-on-hdp.pdf

I've also registered the PostgreSQL JDBC driver with the Ambari server as noted here: https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.18/bk_ambari-reference/content/_using_hive_with_postgresql.html

Installed ZooKeeper and Ambari Metrics, all green. Added Kafka, also green. Note: this is a single-node system, just a playground.

Went through the install steps for Schema Registry: switched the DB type to postgres, set the password to registry as I configured in PostgreSQL, and fixed the FQDN and the configuration lines as shown below. Clicked Next; it runs for a few seconds and then bombs. I have also gone through the steps in the Best Answer for this issue to ensure I have the right Registry version packages: https://community.hortonworks.com/questions/160769/there-is-an-error-in-installing-hdf-3020-in-hdp-26.html

Does anyone have any idea what I'm missing or have misconfigured?

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/registry_server.py", line 129, in <module>
    RegistryServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/registry_server.py", line 57, in install
    import params
  File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/params.py", line 173, in <module>
    connector_curl_source = format("{jdk_location}/{jdbc_driver_jar}")
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/format.py", line 95, in format
    return ConfigurationFormatter().format(format_string, args, **result)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/format.py", line 59, in format
    result_protected = self.vformat(format_string, args, all_params)
  File "/usr/lib64/python2.7/string.py", line 549, in vformat
    result = self._vformat(format_string, args, kwargs, used_args, 2)
  File "/usr/lib64/python2.7/string.py", line 571, in _vformat
    obj, arg_used = self.get_field(field_name, args, kwargs)
  File "/usr/lib64/python2.7/string.py", line 632, in get_field
    obj = self.get_value(first, args, kwargs)
  File "/usr/lib64/python2.7/string.py", line 591, in get_value
    return kwargs[key]
  File "/usr/lib/python2.6/site-packages/resource_management/core/utils.py", line 63, in __getitem__
    return self._convert_value(self._dict[name])
KeyError: 'jdbc_driver_jar'

stdout:

2018-01-25 16:18:45,596 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
User Group mapping (user_group) is missing in the hostLevelParams
2018-01-25 16:18:45,600 - Group['hadoop'] {}
2018-01-25 16:18:45,601 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,602 - call['/var/lib/ambari-agent/tmp/changeUid.sh registry'] {}
2018-01-25 16:18:45,609 - call returned (0, '1004')
2018-01-25 16:18:45,609 - User['registry'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1004}
2018-01-25 16:18:45,611 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,611 - call['/var/lib/ambari-agent/tmp/changeUid.sh zookeeper'] {}
2018-01-25 16:18:45,619 - call returned (0, '1001')
2018-01-25 16:18:45,620 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1001}
2018-01-25 16:18:45,620 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,621 - call['/var/lib/ambari-agent/tmp/changeUid.sh ams'] {}
2018-01-25 16:18:45,628 - call returned (0, '1002')
2018-01-25 16:18:45,629 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1002}
2018-01-25 16:18:45,630 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-01-25 16:18:45,630 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-25 16:18:45,631 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-01-25 16:18:45,646 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-01-25 16:18:45,659 - Repository['HDF-3.0-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.2.0', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdf-1', 'mirror_list': None}
2018-01-25 16:18:45,665 - File['/etc/yum.repos.d/ambari-hdf-1.repo'] {'content': '[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.2.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-25 16:18:45,666 - Writing File['/etc/yum.repos.d/ambari-hdf-1.repo'] because contents don't match
2018-01-25 16:18:45,667 - Repository['HDP-UTILS-1.1.0.21-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdf-1', 'mirror_list': None}
2018-01-25 16:18:45,669 - File['/etc/yum.repos.d/ambari-hdf-1.repo'] {'content': '[HDF-3.0-repo-1]\nname=HDF-3.0-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.0.2.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-1]\nname=HDP-UTILS-1.1.0.21-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-25 16:18:45,669 - Writing File['/etc/yum.repos.d/ambari-hdf-1.repo'] because contents don't match
2018-01-25 16:18:45,670 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-25 16:18:45,741 - Skipping installation of existing package unzip
2018-01-25 16:18:45,741 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-25 16:18:45,752 - Skipping installation of existing package curl
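One more thing worth checking, since {jdk_location} normally points at the Ambari server's /resources URL: whether the JDBC driver registration (ambari-server setup --jdbc-db=postgres --jdbc-driver=/usr/share/java/postgresql-jdbc.jar) actually left the jar where the agent can fetch it. A quick hedged check, with the hostname below as a placeholder:

    # Hedged check: can this host fetch the registered JDBC driver from the
    # Ambari server's resources URL? The hostname below is a placeholder.
    import urllib2

    url = "http://ambari.example.com:8080/resources/postgresql-jdbc.jar"
    try:
        resp = urllib2.urlopen(url)
        print("driver is being served (%s bytes)" % resp.info().get("content-length"))
    except urllib2.HTTPError as err:
        # A 404 would suggest the ambari-server setup --jdbc-driver step never
        # ran (or didn't copy the jar into /var/lib/ambari-server/resources).
        print("driver not found on the server: HTTP %d" % err.code)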