HDPSearch Install for Ambari 2.4.1 Fails - JAVA_HOME Error

Expert Contributor

I followed the instructions here to install HDPSearch using Ambari 2.4.1: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.0/bk_solr-search-installation/content/ch_hdp-s... During deployment of Solr on all nodes, I get the error below. JAVA_HOME is set correctly system-wide (/etc/environment).

/etc/environment:

PATH=$PATH:/usr/jdk64/jdk1.8.0_77/bin
export PATH

JAVA_HOME=/usr/jdk64/jdk1.8.0_77
export JAVA_HOME
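
(A note on why this can still fail: /etc/environment is only read by PAM at login, so a daemon like the Ambari agent, and the yum it spawns, never sees these variables. A minimal check, using sudo's default env_reset as a stand-in for a non-login environment:)

echo "$JAVA_HOME"                                    # what a login shell sees
sudo sh -c 'echo "JAVA_HOME=${JAVA_HOME:-<unset>}"'  # what a cleaned environment sees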
stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/solr.py", line 101, in <module>
    Solr().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/solr.py", line 17, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 567, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 83, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 91, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch' returned 1. Executing pre-install script

   Distribution found: RedHat

Checking for available disk space ...
    Available space: 14679348 KB
    Minimum space: 2717053 KB
    Minimum required disk space available

Verifying installation directories...
    Creating installation directory
    Installation directory created: /opt/lucidworks-hdpsearch

Validating user ...
    Group solr already exists
    User solr already exists

Checking java ...
====
ERROR: cannot find the java command. Install Oracle Java, adjust your PATH, or set JAVA_HOME.
====
error: %pre(lucidworks-hdpsearch-0:2.5-100.noarch) scriptlet failed, exit status 1
Error in PREIN scriptlet in rpm package lucidworks-hdpsearch-2.5-100.noarch
stdout:
2016-10-15 23:24:30,545 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-10-15 23:24:30,547 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-10-15 23:24:30,549 - call[('ambari-python-wrap', u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-10-15 23:24:30,573 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-10-15 23:24:30,573 - checked_call[('ambari-python-wrap', u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-10-15 23:24:30,598 - checked_call returned (0, '')
2016-10-15 23:24:30,598 - Ensuring that hadoop has the correct symlink structure
2016-10-15 23:24:30,599 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-15 23:24:30,600 - Group['solr'] {}
2016-10-15 23:24:30,601 - Adding group Group['solr']
2016-10-15 23:24:30,624 - Group['hadoop'] {}
2016-10-15 23:24:30,624 - Group['users'] {}
2016-10-15 23:24:30,624 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,625 - User['logsearch'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,626 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,626 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,627 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,627 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-10-15 23:24:30,628 - User['solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,628 - Adding user User['solr']
2016-10-15 23:24:30,654 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-10-15 23:24:30,655 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,655 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,656 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,656 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-10-15 23:24:30,657 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-10-15 23:24:30,659 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-10-15 23:24:30,664 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-10-15 23:24:30,664 - Group['hdfs'] {}
2016-10-15 23:24:30,665 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-10-15 23:24:30,665 - FS Type: 
2016-10-15 23:24:30,665 - Directory['/etc/hadoop'] {'mode': 0755}
2016-10-15 23:24:30,678 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-10-15 23:24:30,678 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-10-15 23:24:30,694 - Initializing 3 repositories
2016-10-15 23:24:30,695 - Repository['HDP-2.5'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.0.0/', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-10-15 23:24:30,702 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.0.0/\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-10-15 23:24:30,703 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-10-15 23:24:30,705 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-10-15 23:24:30,706 - Repository['HDP-SOLR-2.5-100'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-SOLR-2.5-100/repos/centos7/', 'action': ['create'], 'components': [u'HDP-SOLR', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-SOLR', 'mirror_list': None}
2016-10-15 23:24:30,708 - File['/etc/yum.repos.d/HDP-SOLR.repo'] {'content': InlineTemplate(...)}
2016-10-15 23:24:30,709 - Writing File['/etc/yum.repos.d/HDP-SOLR.repo'] because it doesn't exist
2016-10-15 23:24:30,709 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-15 23:24:30,845 - Skipping installation of existing package unzip
2016-10-15 23:24:30,845 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-15 23:24:30,893 - Skipping installation of existing package curl
2016-10-15 23:24:30,893 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-15 23:24:30,941 - Skipping installation of existing package hdp-select
2016-10-15 23:24:31,149 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-10-15 23:24:31,151 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-10-15 23:24:31,153 - call[('ambari-python-wrap', u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-10-15 23:24:31,177 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-10-15 23:24:31,177 - checked_call[('ambari-python-wrap', u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-10-15 23:24:31,202 - checked_call returned (0, '')
2016-10-15 23:24:31,202 - Ensuring that hadoop has the correct symlink structure
2016-10-15 23:24:31,203 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-10-15 23:24:31,204 - Version 2.5.0.0-1245 was provided as effective cluster version.  Using package version 2_5_0_0_1245
2016-10-15 23:24:31,205 - Package['lucidworks-hdpsearch'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-10-15 23:24:31,343 - Installing package lucidworks-hdpsearch ('/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch')

Command failed after 1 tries

3 Replies

Re: HDPSearch Install for Ambari 2.4.1 Fails - JAVA_HOME Error

Expert Contributor

Rommel, have you tried defining JAVA_HOME on the command line right before running the yum install command:

export JAVA_HOME=/usr/jdk64/jdk1.8.0_77
yum install lucidworks-hdpsearch
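
If you'd rather not leave the variable exported in your session, the same thing works as a one-liner that scopes it to the single command (assuming a root shell; plain sudo strips environment variables by default):

JAVA_HOME=/usr/jdk64/jdk1.8.0_77 yum -y install lucidworks-hdpsearch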

Re: HDPSearch Install for Ambari 2.4.1 Fails - JAVA_HOME Error

New Contributor

I ran into the same thing no matter what I tried.

Look at the bottom of the page you linked:

"In the case of the Java preinstall check failing, the easiest remediation is to login to each machine that Solr will be installed on, temporarily set the JAVA_HOME environmental variable, then then use yum/zypper/apt-get to install the package. For example on CentOS:

export JAVA_HOME=/usr/jdk64/jdk1.8.0_77
yum install lucidworks-hdpsearch

Once all of the prerequisite checks have been satisfied and the package is installed, you can simply click “Retry” in the Ambari Web UI to move forward and complete the installation."

Looks like this is expected at this point. I went to each node, ran the yum install, and then hit Retry in Ambari, and it finished.
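
If you have more than a few nodes, a loop over SSH saves the per-node logins. A minimal sketch, assuming passwordless root SSH and hypothetical hostnames (node1 node2 node3):

# Run the documented workaround on every Solr node
for host in node1 node2 node3; do
  ssh root@"$host" 'export JAVA_HOME=/usr/jdk64/jdk1.8.0_77; yum -y install lucidworks-hdpsearch'
done

Then click Retry in Ambari as above.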

Re: HDPSearch Install for Ambari 2.4.1 Fails - JAVA_HOME Error

Expert Contributor

I had the same problem when installing Solr. Even though I had Java on all nodes, I still could not install it on the Namenode. I realised that the Java versions (vendors?) were different on the Namenode than on the other nodes (/usr/lib/jvm/java-8-openjdk-amd64 on all nodes, /usr/jdk64/jdk1.8.0_77 on the Namenode). So I installed the same version (vendor?) on the Namenode and it worked. The link below describes my experience.

Adding Solr via Ambari
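
A quick way to spot that kind of mismatch is to compare the resolved java binary across nodes. A minimal sketch, again assuming SSH access and hypothetical hostnames:

# Print the real JDK path behind `java` on every node
for host in node1 node2 node3; do
  echo "== $host =="
  ssh "$host" 'readlink -f "$(command -v java)" || echo "no java on PATH"'
done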
