
Solr install fails on HDP 2.5 sandbox

Super Collaborator

I tried installing Solr via Ambari on the HDP 2.5 sandbox. I followed the steps, and I am getting errors while starting Solr. Please see the logs attached below.

STDERR:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/solr.py", line 101, in <module>
    Solr().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/solr.py", line 17, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 567, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 83, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 91, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch' returned 1. Traceback (most recent call last):
  File "/usr/bin/yum", line 29, in <module>
    yummain.user_main(sys.argv[1:], exit_code=True)
  File "/usr/share/yum-cli/yummain.py", line 298, in user_main
    errcode = main(args)
  File "/usr/share/yum-cli/yummain.py", line 146, in main
    result, resultmsgs = base.doCommands()
  File "/usr/share/yum-cli/cli.py", line 440, in doCommands
    return self.yum_cli_commands[self.basecmd].doCommand(self, self.basecmd, self.extcmds)
  File "/usr/share/yum-cli/yumcommands.py", line 211, in doCommand
    return base.installPkgs(extcmds)
  File "/usr/share/yum-cli/cli.py", line 702, in installPkgs
    self.install(pattern=arg)
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 3551, in install
    mypkgs = self.pkgSack.returnPackages(patterns=pats,
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 907, in <lambda>
    pkgSack = property(fget=lambda self: self._getSacks(),
  File "/usr/lib/python2.6/site-packages/yum/__init__.py", line 687, in _getSacks
    self.repos.populateSack(which=repos)
  File "/usr/lib/python2.6/site-packages/yum/repos.py", line 324, in populateSack
    sack.populate(repo, mdtype, callback, cacheonly)
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 165, in populate
    if self._check_db_version(repo, mydbtype):
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 223, in _check_db_version
    return repo._check_db_version(mdtype)
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1261, in _check_db_version
    repoXML = self.repoXML
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1460, in <lambda>
    repoXML = property(fget=lambda self: self._getRepoXML(),
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1452, in _getRepoXML
    self._loadRepoXML(text=self)
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1442, in _loadRepoXML
    return self._groupLoadRepoXML(text, self._mdpolicy2mdtypes())
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1418, in _groupLoadRepoXML
    self._commonRetrieveDataMD(mdtypes)
  File "/usr/lib/python2.6/site-packages/yum/yumRepo.py", line 1349, in _commonRetrieveDataMD
    os.rename(local, local + '.old.tmp')
OSError: [Errno 22] Invalid argument

STDOUT:

2016-11-06 06:20:58,169 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-11-06 06:20:58,169 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-11-06 06:20:58,170 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-11-06 06:20:58,197 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-11-06 06:20:58,198 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-11-06 06:20:58,225 - checked_call returned (0, '')
2016-11-06 06:20:58,225 - Ensuring that hadoop has the correct symlink structure
2016-11-06 06:20:58,225 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-11-06 06:20:58,226 - Group['hadoop'] {}
2016-11-06 06:20:58,228 - Group['users'] {}
2016-11-06 06:20:58,228 - Group['zeppelin'] {}
2016-11-06 06:20:58,228 - Group['solr'] {}
2016-11-06 06:20:58,228 - Group['knox'] {}
2016-11-06 06:20:58,228 - Group['ranger'] {}
2016-11-06 06:20:58,228 - Group['spark'] {}
2016-11-06 06:20:58,229 - Group['livy'] {}
2016-11-06 06:20:58,229 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,230 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-11-06 06:20:58,230 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,231 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-11-06 06:20:58,231 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,232 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,232 - User['solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,233 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,234 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,235 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger']}
2016-11-06 06:20:58,236 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,236 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,237 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,237 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,238 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,238 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,240 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-11-06 06:20:58,240 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2016-11-06 06:20:58,241 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,242 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,242 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,243 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,243 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,244 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2016-11-06 06:20:58,244 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-11-06 06:20:58,246 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-11-06 06:20:58,258 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-11-06 06:20:58,258 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2016-11-06 06:20:58,259 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-11-06 06:20:58,260 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-11-06 06:20:58,272 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-11-06 06:20:58,272 - Group['hdfs'] {}
2016-11-06 06:20:58,272 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2016-11-06 06:20:58,273 - FS Type: 
2016-11-06 06:20:58,273 - Directory['/etc/hadoop'] {'mode': 0755}
2016-11-06 06:20:58,286 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-11-06 06:20:58,287 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-11-06 06:20:58,304 - Initializing 3 repositories
2016-11-06 06:20:58,305 - Repository['HDP-2.5'] {'base_url': 'http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos6/2.x/BUILDS/2.5.0.0-1245/', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-11-06 06:20:58,313 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos6/2.x/BUILDS/2.5.0.0-1245/\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-11-06 06:20:58,314 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-11-06 06:20:58,316 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-11-06 06:20:58,317 - Repository['HDP-SOLR-2.5-100'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-SOLR-2.5-100/repos/centos6', 'action': ['create'], 'components': ['HDP-SOLR', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-SOLR', 'mirror_list': None}
2016-11-06 06:20:58,320 - File['/etc/yum.repos.d/HDP-SOLR.repo'] {'content': '[HDP-SOLR-2.5-100]\nname=HDP-SOLR-2.5-100\nbaseurl=http://public-repo-1.hortonworks.com/HDP-SOLR-2.5-100/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-11-06 06:20:58,320 - Writing File['/etc/yum.repos.d/HDP-SOLR.repo'] because contents don't match
2016-11-06 06:20:58,320 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-11-06 06:20:58,400 - Skipping installation of existing package unzip
2016-11-06 06:20:58,400 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-11-06 06:20:58,409 - Skipping installation of existing package curl
2016-11-06 06:20:58,409 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-11-06 06:20:58,421 - Skipping installation of existing package hdp-select
2016-11-06 06:20:58,589 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-11-06 06:20:58,590 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-11-06 06:20:58,590 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-11-06 06:20:58,620 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-11-06 06:20:58,620 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-11-06 06:20:58,647 - checked_call returned (0, '')
2016-11-06 06:20:58,648 - Ensuring that hadoop has the correct symlink structure
2016-11-06 06:20:58,648 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-11-06 06:20:58,650 - Version 2.5.0.0-1245 was provided as effective cluster version.  Using package version 2_5_0_0_1245
2016-11-06 06:20:58,652 - Package['lucidworks-hdpsearch'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-11-06 06:20:58,727 - Installing package lucidworks-hdpsearch ('/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch')

Command failed after 1 tries
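
For what it's worth, the failing step can be reproduced outside Ambari by running the yum command from the log directly on the sandbox. This is only a diagnostic sketch; yum clean all is a general cache reset worth trying when yum trips over its cached metadata (the os.rename OSError above), not something taken from the Ambari output:

/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch   # the exact command Ambari ran
yum clean all                                            # clear cached repo metadata, then retry the install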
1 ACCEPTED SOLUTION

Super Collaborator

The Solr installation worked once I changed the repoinfo.xml file to include the correct repo build. For some reason, under 2.5/repos/centos6 the tar.gz is named with a -centos6 suffix, while in 2.3 the -centos6 suffix is not there. Please see the screenshots.

2.5: (screenshot: 9316-screen-shot-2016-11-10-at-55729-pm.png)

2.3: (screenshot: 9317-screen-shot-2016-11-10-at-55747-pm.png)

Maybe that is the issue.
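
For anyone making the same edit, an Ambari repo entry in repoinfo.xml has roughly the following shape. This is a sketch only: the baseurl shown is the stock HDP-SOLR URL from the log above, and the corrected build path must be taken from the naming that actually exists on the repo server, as shown in the screenshots.

<os family="redhat6">
  <repo>
    <!-- Point this baseurl at the build that really exists on the server;
         for 2.5 the repo naming carries a -centos6 suffix that the stock
         entry does not expect. -->
    <baseurl>http://public-repo-1.hortonworks.com/HDP-SOLR-2.5-100/repos/centos6</baseurl>
    <repoid>HDP-SOLR-2.5-100</repoid>
    <reponame>HDP-SOLR</reponame>
  </repo>
</os>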


6 REPLIES


Contributor

@Karthik Narayanan

I am facing the same issue. Can you be more specific about where the repoinfo.xml file is located?

Thank you.

Super Collaborator

@Muhammad idrees The file is at /var/lib/ambari-server/resources/stacks/HDP/2.5/repos/repoinfo.xml. Are you trying this with 2.5?
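
If that path does not exist on a different sandbox version, the file can be located, and Ambari restarted to pick up the edit, roughly like this (assuming shell access to the VM):

find /var/lib/ambari-server/resources/stacks -name repoinfo.xml   # locate the file for the installed stack
ambari-server restart                                             # restart so Ambari rereads the repo definitions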

Contributor

Thank you, Karthik.

I am running HDP 2.6.1 with VMware, and the location you provided does not exist.

(screenshot: 42639-hdp.png)

Thank you very much.

Regards.


Contributor
(attachment: repo.txt)

@Karthik Narayanan

Thank you.

I tried 2.6.1 but still had issues, so I have now downloaded the HDP 2.5 sandbox. Can you please guide me on what to change in the /var/lib/ambari-server/resources/stacks/HDP/2.5/repos/repoinfo.xml file? I mean the exact change I need to make.

I have attached that file in this reply.

Thank you very much.