Created 01-16-2018 07:06 AM
What is the correct way of configuring a custom repository (say, using JFrog Artifactory) that mirrors the public repo, so that a blueprint install can find packages?
For instance, my /etc/yum.repos.d/ambari-hdp-1.repo is:
[HDP-2.6-repo-1]
name=HDP-2.6-repo-1
baseurl=http://myserver:8081/artifactory/hortonworks-hdp/
path=/
enabled=1
gpgcheck=0

[HDP-UTILS-1.1.0.21-repo-2]
name=HDP-UTILS-1.1.0.21-repo-2
baseurl=http://myserver:8081/artifactory/hortonworks-hdp-utils/
path=/
enabled=1
gpgcheck=0
So if I run sudo yum info hadoop-hdfs-datanode, it lists the right package.
As part of the 2.6 change in blueprints, it is required to register the stack version by registering a VDF (version definition file), following the methodology described here.
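In case it helps others, a registration call along these lines creates the repository_version (assuming the VDF is hosted somewhere the Ambari server can reach; the Artifactory path for the VDF file here is illustrative):

curl -k -u admin:admin -H 'X-Requested-By: ambari' -X POST \
  -d '{ "VersionDefinition": { "version_url": "http://myserver:8081/artifactory/vdf/hdp-2.6.xml" } }' \
  'https://fooserver:8443/api/v1/version_definitions'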
Once done, I can see my repository_version with id 1 at:
/api/v1/stacks/HDP/versions/2.6/repository_versions/1/operating_systems/redhat6/repositories/HDP-2.6

{
  "href" : "https://fooserver:8443/api/v1/stacks/HDP/versions/2.6/repository_versions/1/operating_systems/redhat6/repositories/HDP-2.6",
  "Repositories" : {
    "applicable_services" : [ ],
    "base_url" : "http://myserver:8081/artifactory/hortonworks-hdp/",
    "components" : null,
    "default_base_url" : "",
    "distribution" : null,
    "latest_base_url" : "",
    "mirrors_list" : "",
    "os_type" : "redhat6",
    "repo_id" : "HDP-2.6",
    "repo_name" : "HDP",
    "repository_version_id" : 1,
    "stack_name" : "HDP",
    "stack_version" : "2.6",
    "unique" : false
  }
}
However, after submitting the cluster creation template with "repository_version_id" : 1, the cluster creation request fails to find the repos on any of the nodes. Where exactly do I specify the version so that the blueprint install can pick it up?
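For reference, this is the shape of my cluster creation template; the blueprint name, password, and host mapping are placeholders, not my real values:

{
  "blueprint" : "my-blueprint",
  "default_password" : "changeme",
  "repository_version_id" : 1,
  "host_groups" : [
    { "name" : "host_group_1", "hosts" : [ { "fqdn" : "node1.example.com" } ] }
  ]
}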
I have also attempted to set the repo version in /var/lib/ambari-server/resources/stacks/HDP/2.6/repos/repoinfo.xml, but to no avail. The install tasks then fail on the nodes with the following agent log output:
2018-01-16 06:35:50,800 - Unable to load available packages
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 771, in load_available_packages
    self.available_packages_in_repos = pkg_provider.get_available_packages_in_repos(repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 85, in get_available_packages_in_repos
    available_packages.extend(self._get_available_packages(repo))
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 146, in _get_available_packages
    return self._lookup_packages(cmd, 'Available Packages')
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 191, in _lookup_packages
    if items[i + 2].find('@') == 0:
IndexError: list index out of range
2018-01-16 06:35:51,308 - The 'hadoop-hdfs-datanode' component did not advertise a version. This may indicate a problem with the component packaging.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 155, in <module>
    DataNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 48, in install
    import params
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/params.py", line 25, in <module>
    from params_linux import *
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py", line 391, in <module>
    lzo_packages = get_lzo_packages(stack_version_unformatted)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/get_lzo_packages.py", line 45, in get_lzo_packages
    lzo_packages += [script_instance.format_package_name("hadooplzo_${stack_version}"),
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 538, in format_package_name
    raise Fail("Cannot match package for regexp name {0}. Available packages: {1}".format(name, self.available_packages_in_repos))
resource_management.core.exceptions.Fail: Cannot match package for regexp name hadooplzo_${stack_version}. Available packages: []
2018-01-16 06:36:04,670 - Unable to load available packages
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 771, in load_available_packages
    self.available_packages_in_repos = pkg_provider.get_available_packages_in_repos(repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 85, in get_available_packages_in_repos
    available_packages.extend(self._get_available_packages(repo))
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 146, in _get_available_packages
    return self._lookup_packages(cmd, 'Available Packages')
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 191, in _lookup_packages
    if items[i + 2].find('@') == 0:
IndexError: list index out of range
2018-01-16 06:36:05,184 - The 'hadoop-hdfs-datanode' component did not advertise a version. This may indicate a problem with the component packaging.
Created 01-31-2018 07:27 AM
The responses above helped me with the problems I had; however, the right answer is that when using blueprints from version 2.6 onwards, the repositories have to be specified in the VDF file itself when it is registered. Ambari then uses that input to generate a new ambari-hdp-1.repo on each node, which is subsequently used for the install.
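For anyone hitting the same thing, here is a minimal sketch of a VDF with the repository-info section filled in, using the Artifactory URLs from the question; the release and manifest values are illustrative placeholders, not the exact ones I used:

<repository-version xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                    xsi:noNamespaceSchemaLocation="version_definition.xsd">
  <release>
    <type>STANDARD</type>
    <stack-id>HDP-2.6</stack-id>
    <version>2.6.3.0</version>  <!-- illustrative; use the build you actually mirrored -->
    <build>235</build>
    <display>HDP-2.6.3.0</display>
  </release>
  <manifest>
    <!-- one entry per service shipped in the release; abbreviated here -->
    <service id="HDFS-271" name="HDFS" version="2.7.1"/>
  </manifest>
  <available-services/>
  <repository-info>
    <os family="redhat6">
      <repo>
        <baseurl>http://myserver:8081/artifactory/hortonworks-hdp/</baseurl>
        <repoid>HDP-2.6</repoid>
        <reponame>HDP</reponame>
      </repo>
      <repo>
        <baseurl>http://myserver:8081/artifactory/hortonworks-hdp-utils/</baseurl>
        <repoid>HDP-UTILS-1.1.0.21</repoid>
        <reponame>HDP-UTILS</reponame>
      </repo>
    </os>
  </repository-info>
</repository-version>

The repository-info block is what ends up in the generated .repo file on the agents, so the baseurls there must point at your mirror.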