
Could not determine HDP version for component hadoop ...


New Contributor

Hi, I am installing HDP via Ambari, but something goes wrong at the last step, "Install, Start and Test"; the output is below. The OS is Ubuntu 14.04 LTS.

HDP was installed on this server several years ago, and I suspect it was not cleaned up completely, which is causing the error. I have checked the leftover folders and deleted them following

https://cwiki.apache.org/confluence/display/AMBARI/Host+Cleanup+for+Ambari+and+Stack

but it did not work.

The solution proposed on the community at

https://community.hortonworks.com/questions/33519/could-not-determine-hdp-version-for-component-zook...

did not work for me either.

Can anyone please help me resolve this problem? Thank you very much!

stderr: 
2017-08-21 20:08:11,954 - Could not determine HDP version for component hadoop-yarn-timelineserver by calling '/usr/bin/hdp-select status hadoop-yarn-timelineserver > /tmp/tmpUqPhGv'. Return Code: 1, Output: .
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 38, in <module>
    AfterInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 33, in hook
    setup_config()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 56, in setup_config
    only_if=format("ls {hadoop_conf_dir}"))
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist
 stdout:
2017-08-21 20:08:11,448 - Group['spark'] {}
2017-08-21 20:08:11,448 - Group['hadoop'] {}
2017-08-21 20:08:11,448 - Group['users'] {}
2017-08-21 20:08:11,449 - User['hive'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2017-08-21 20:08:11,449 - User['zookeeper'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2017-08-21 20:08:11,450 - User['spark'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2017-08-21 20:08:11,450 - User['ambari-qa'] {'gid': 'hadoop', 'groups': [u'users']}
2017-08-21 20:08:11,451 - User['tez'] {'gid': 'hadoop', 'groups': [u'users']}
2017-08-21 20:08:11,452 - User['hdfs'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2017-08-21 20:08:11,452 - User['yarn'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2017-08-21 20:08:11,453 - User['hcat'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2017-08-21 20:08:11,453 - User['mapred'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2017-08-21 20:08:11,454 - User['hbase'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2017-08-21 20:08:11,455 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-08-21 20:08:11,455 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-08-21 20:08:11,460 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-08-21 20:08:11,461 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2017-08-21 20:08:11,461 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-08-21 20:08:11,462 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-08-21 20:08:11,466 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-08-21 20:08:11,467 - Group['hdfs'] {'ignore_failures': False}
2017-08-21 20:08:11,467 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', u'hdfs']}
2017-08-21 20:08:11,468 - Directory['/etc/hadoop'] {'mode': 0755}
2017-08-21 20:08:11,468 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2017-08-21 20:08:11,477 - Repository['HDP-2.3'] {'base_url': 'http://202.116.42.142/zzt/HDP/ubuntu14/2.x/updates/2.3.2.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-08-21 20:08:11,482 - File['/tmp/tmpn7F75O'] {'content': 'deb http://202.116.42.142/zzt/HDP/ubuntu14/2.x/updates/2.3.2.0 HDP main'}
2017-08-21 20:08:11,483 - Writing File['/tmp/tmpn7F75O'] because contents don't match
2017-08-21 20:08:11,483 - File['/tmp/tmpa0PRJq'] {'content': StaticFile('/etc/apt/sources.list.d/HDP.list')}
2017-08-21 20:08:11,483 - Writing File['/tmp/tmpa0PRJq'] because contents don't match
2017-08-21 20:08:11,484 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://202.116.42.142/zzt/HDP-UTILS-1.1.0.20/repos/ubuntu14', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-08-21 20:08:11,485 - File['/tmp/tmppt1hrD'] {'content': 'deb http://202.116.42.142/zzt/HDP-UTILS-1.1.0.20/repos/ubuntu14 HDP-UTILS main'}
2017-08-21 20:08:11,485 - Writing File['/tmp/tmppt1hrD'] because contents don't match
2017-08-21 20:08:11,485 - File['/tmp/tmp1P7J2n'] {'content': StaticFile('/etc/apt/sources.list.d/HDP-UTILS.list')}
2017-08-21 20:08:11,486 - Writing File['/tmp/tmp1P7J2n'] because contents don't match
2017-08-21 20:08:11,486 - Package['unzip'] {}
2017-08-21 20:08:11,532 - Skipping installation of existing package unzip
2017-08-21 20:08:11,533 - Package['curl'] {}
2017-08-21 20:08:11,579 - Skipping installation of existing package curl
2017-08-21 20:08:11,580 - Package['hdp-select'] {}
2017-08-21 20:08:11,630 - Skipping installation of existing package hdp-select
2017-08-21 20:08:11,812 - Package['hadoop-2-3-.*-yarn'] {}
2017-08-21 20:08:11,874 - Skipping installation of existing package hadoop-2-3-.*-yarn
2017-08-21 20:08:11,874 - Package['hadoop-2-3-.*-mapreduce'] {}
2017-08-21 20:08:11,919 - Skipping installation of existing package hadoop-2-3-.*-mapreduce
2017-08-21 20:08:11,954 - Could not determine HDP version for component hadoop-yarn-timelineserver by calling '/usr/bin/hdp-select status hadoop-yarn-timelineserver > /tmp/tmpUqPhGv'. Return Code: 1, Output: .
2017-08-21 20:08:12,125 - Execute['ambari-sudo.sh  -H -E touch /var/lib/ambari-agent/data/hdp-select-set-all.performed ; ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.3 | tail -1`'] {'not_if': 'test -f /var/lib/ambari-agent/data/hdp-select-set-all.performed', 'only_if': 'ls -d /usr/hdp/2.3*'}
2017-08-21 20:08:12,128 - Skipping Execute['ambari-sudo.sh  -H -E touch /var/lib/ambari-agent/data/hdp-select-set-all.performed ; ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.3 | tail -1`'] due to not_if
2017-08-21 20:08:12,128 - XmlConfig['core-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'owner': 'hdfs', 'only_if': 'ls /usr/hdp/current/hadoop-client/conf', 'configurations': ...}
2017-08-21 20:08:12,142 - Generating config: /usr/hdp/current/hadoop-client/conf/core-site.xml
2017-08-21 20:08:12,142 - File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
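If I read the traceback correctly, /usr/hdp/current/hadoop-client is probably a symlink left over from the old install whose target version directory no longer exists, so the conf directory the hook needs cannot be found. A small sandboxed illustration of the state I mean (all paths under a temp dir, and the version string 2.3.2.0-2950 is just an example):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/usr/hdp/current"
# Leftover symlink pointing at a version directory that has been removed:
ln -s "$tmp/usr/hdp/2.3.2.0-2950/hadoop" "$tmp/usr/hdp/current/hadoop-client"
# The link itself still exists ...
if [ -L "$tmp/usr/hdp/current/hadoop-client" ]; then link_state="link exists"; fi
# ... but its target (and therefore the conf dir) does not:
if ls "$tmp/usr/hdp/current/hadoop-client/conf" >/dev/null 2>&1; then
  conf_state="conf present"
else
  conf_state="conf missing"
fi
echo "$link_state"
echo "$conf_state"
rm -rf "$tmp"
```

This matches the final error: the File resource fails because the parent directory /usr/hdp/current/hadoop-client/conf does not exist.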
2 Replies

Re: Could not determine HDP version for component hadoop ...

Expert Contributor

@zheng zt

Hi,

This usually happens when you uninstall HDP and then try to install it again.

Please check this link and delete everything mentioned here.

https://community.hortonworks.com/articles/97489/completely-uninstall-hdp-and-ambari.html

Also, even after this, on the screen where Ambari performs host checks, do not ignore any warnings you encounter: open the details, perform the actions mentioned there, and repeat until there are no warnings.

It should succeed then.
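One more thing worth checking, based on your log: the agent skipped 'hdp-select set all' because of its 'not_if' guard, since the marker file /var/lib/ambari-agent/data/hdp-select-set-all.performed was presumably left behind by the old install, so stale symlinks never get repaired. A sandboxed sketch of that guard logic (paths are illustrative, under a temp dir; on the real host you would remove the marker file as part of cleanup):

```shell
tmp=$(mktemp -d)
# Stand-in for /var/lib/ambari-agent/data/hdp-select-set-all.performed
marker="$tmp/hdp-select-set-all.performed"
touch "$marker"
# Same 'not_if: test -f <marker>' check the agent uses:
if test -f "$marker"; then first_run="skipped"; else first_run="ran hdp-select set all"; fi
# After removing the leftover marker, the command would run again:
rm -f "$marker"
if test -f "$marker"; then second_run="skipped"; else second_run="ran hdp-select set all"; fi
echo "$first_run"
echo "$second_run"
rm -rf "$tmp"
```

That would explain the "Skipping Execute[...] due to not_if" line in your stdout even though the symlinks are broken.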

Thanks


Re: Could not determine HDP version for component hadoop ...

New Contributor

@tsharma

Hi, thanks for your reply.

I am trying the method you referred to, but there are some differences because my OS is Ubuntu 14 while the linked article targets CentOS.
