Created 12-10-2015 11:56 AM
Hi,
The HDP install (with Ambari) fails in the step "App Timeline Server Install". The error message is:
resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh -H -E touch /var/lib/ambari-agent/data/hdp-select-set-all.performed ; ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.3 | tail -1`' returned 1.
Traceback (most recent call last):
  File "/usr/bin/hdp-select", line 378, in <module>
    printVersions()
  File "/usr/bin/hdp-select", line 235, in printVersions
    result[tuple(map(int, versionRegex.split(f)))] = f
ValueError: invalid literal for int() with base 10: 'usr'
ERROR: set command takes 2 parameters, instead of 1
usage: hdp-select [-h] [<command>] [<package>] [<version>]
Set the selected version of HDP.
positional arguments:
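For context, here is a minimal sketch of why this fails (a simplified, hypothetical reconstruction of printVersions based on the traceback above, not the shipped script): any entry under /usr/hdp that is not a version string breaks the int() conversion.

```python
import re

# Hypothetical reconstruction of the failing logic in /usr/bin/hdp-select.
# versionRegex mirrors the splitter named in the traceback.
versionRegex = re.compile(r"[-.]")

def print_versions(entries):
    result = {}
    for f in entries:
        # Anything not on this fixed skip-list is assumed to be a version
        # string like 2.3.2.0-2950, i.e. every dot/dash piece must be an int.
        if f not in [".", "..", "current", "share", "lost+found"]:
            result[tuple(map(int, versionRegex.split(f)))] = f
    return [result[k] for k in sorted(result)]

print(print_versions(["current", "2.3.2.0-2950"]))
# a stray 'usr' entry reproduces the ValueError from the traceback:
# print_versions(["current", "2.3.2.0-2950", "usr"])
```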
Does it matter if the /usr/hdp directory already exists (it is a symlink to a partition with enough space and is empty)?
Thanks Peter.
Created 12-17-2015 09:14 AM
I have added the value "usr" to the list on line 234 in script /usr/bin/hdp-select and it seems to work. I did not run any cleanup. The HDP version is 2.3.2.0-2950
Created 03-17-2016 04:31 PM
Peter, can you add the code snippet that you changed here? I'm seeing the same issue trying to install HDP 2.4.0.0-169.
Created 04-28-2016 05:29 AM
Below is the code snippet:
def printVersions():
  result = {}
  for f in os.listdir(root):
    if f not in [".", "..", "current", "share", "lost+found", "usr"]:
      result[tuple(map(int, versionRegex.split(f)))] = f
  keys = result.keys()
  keys.sort()
  for k in keys:
    print result[k]
Created 04-08-2016 12:09 AM
I found a similar issue, where /usr/hdp contained a file derby.log
The issue is that printVersions should filter based on a directory-name pattern of "(\d+\.{0,1}.*)-(\d+)" instead of a fixed exclusion list:
234   if f not in [".", "..", "current", "share", "lost+found"]:
235     result[tuple(map(int, versionRegex.split(f)))] = f
Created 04-08-2016 12:12 AM
@Peter Bartal I am not a Python person, so bear with me.
I think if the script compiled a regex first:
----
versionFilter = re.compile(r"(\d+\.{0,1}.*)-(\d+)")
if versionFilter.match(f):
    result[tuple(map(int, versionRegex.split(f)))] = f
----
That should solve the problem
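Made concrete, the suggested regex filter might look like this (a sketch only; versionRegex here mirrors the splitter already defined in /usr/bin/hdp-select, and list_versions is a made-up name for illustration):

```python
import re

# Keep only entries that look like an HDP version directory
# (e.g. 2.3.2.0-2950); everything else is silently skipped.
versionDir = re.compile(r"(\d+\.{0,1}.*)-(\d+)")
versionRegex = re.compile(r"[-.]")

def list_versions(entries):
    result = {}
    for f in entries:
        if versionDir.match(f):
            result[tuple(map(int, versionRegex.split(f)))] = f
    return [result[k] for k in sorted(result)]

print(list_versions(["current", "derby.log", "usr", "2.3.2.0-2950"]))
# ['2.3.2.0-2950']
```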
Created 05-03-2016 08:50 PM
def printVersions():
  result = {}
  for f in os.listdir(root):
    if f not in [".", "..", "current", "share", "lost+found", "docker"]:
      result[tuple(map(int, versionRegex.split(f)))] = f
  keys = result.keys()
  ....
This fixed my issue. It happened to me when I was restarting HBase to deploy a service in Ambari and the HBase client wouldn't install. In my case the error said "docker" instead of "usr", so I added "docker" to the list in the printVersions function. Thanks!
Created 05-12-2016 03:49 PM
This fixed my issue. It happened to me when I was restarting HBase to deploy a service in Ambari and the HBase client wouldn't install. In my case the error said "hadoop" instead of "usr" or "docker". I have observed that the value to exclude is whatever name the error message reports when you run this command from /var/lib/ambari-agent:
./ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.3.2.0-2950 | tail -1`
Thanks! Go to /usr/bin/ and open the script:
vi hdp-select
Change the "Print the installed packages" code to exclude the name from your error:
# Print the installed packages
def printVersions():
  result = {}
  for f in os.listdir(root):
    if f not in [".", "..", "current", "share", "lost+found", "hadoop"]:
      result[tuple(map(int, versionRegex.split(f)))] = f
  keys = result.keys()
  keys.sort()
  for k in keys:
    print result[k]
Created 06-14-2016 04:41 AM
Faced this issue while installing HDP 2.4.2 on CentOS 7.x.
The fix depends on the details in the error message.
ValueError: invalid literal for int() with base 10: 'hadoop' <-- based on this value, you need to modify the printVersions function in /usr/bin/hdp-select on all nodes.
The easy part is that this file is the same on all nodes, so you can make the change on one node and clush it to all nodes.
clush -ab md5sum /usr/bin/hdp-select
vim /usr/bin/hdp-select   # modify the function below
clush -ab -c /usr/bin/hdp-select
def printVersions():
  ......
  ......
  if f not in [".", "..", "current", "share", "lost+found", "hadoop"]:
  ......
Created 06-22-2016 10:20 AM
I had this same issue when trying to upgrade from HDP 2.3 to HDP 2.4. This is a bug (or serious weakness) in the /usr/bin/hdp-select script, which does not tolerate any other files or subdirectories under the root folder (in my case /usr/hdp). This folder should contain only two sub-folders: one named current and one named after the current HDP version.
I had placed a backup folder in there and got the same install error
ValueError: invalid literal for int() with base 10: 'backup'
when trying to install Grafana according to the Ambari upgrade documentation.
My fix (instead of modifying the script as above) was to move the backup folder to a different location (or remove it altogether if not needed).
I agree with the fix proposed by @patrick o'leary
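The move-it-out approach can be scripted. This is only a sketch (the destination path and the relocate_strays name are made up for illustration): it relocates anything under the HDP root that is neither "current" nor a version directory.

```python
import os
import re
import shutil

# Entries like 2.3.2.0-2950 are version directories and must stay put.
VERSION_DIR = re.compile(r"^\d[\d.]*-\d+$")

def relocate_strays(root, dest):
    """Move every non-version, non-'current' entry out of root into dest."""
    os.makedirs(dest, exist_ok=True)
    moved = []
    for name in os.listdir(root):
        if name != "current" and not VERSION_DIR.match(name):
            shutil.move(os.path.join(root, name), os.path.join(dest, name))
            moved.append(name)
    return sorted(moved)

# e.g. relocate_strays("/usr/hdp", "/root/hdp-strays")
```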
Created 08-05-2016 01:33 PM
This worked for me
Created 08-05-2016 01:34 PM
I just removed the extra directories, except current and the version directory, and it worked.
Created 01-10-2017 09:38 AM
Thanks. I had created a few folders under /usr/hdp and faced the same issue.
It's good practice not to create any files or folders under /usr/hdp, as the script doesn't tolerate them.
It's easier to move/create the folders somewhere else (rather than modifying the script) if required.
And that solved my issue!
Created 08-30-2016 09:39 PM
This is a known issue with the /usr/bin/hdp-select script. The script does not honor any directory under the root except "current" and the HDP stack version directories. A few HDP versions make an exception and tolerate the "share" and "lost+found" folders. Move anything other than these folders to another location and try again.
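A more defensive variant (my own sketch, not the shipped script) avoids the blacklist entirely by skipping anything that doesn't parse as a version:

```python
import re

# Instead of enumerating 'usr', 'docker', 'hadoop', ..., skip any entry
# whose dot/dash pieces are not all integers.
versionRegex = re.compile(r"[-.]")

def safe_versions(entries):
    result = {}
    for f in entries:
        try:
            key = tuple(map(int, versionRegex.split(f)))
        except ValueError:
            continue  # 'current', 'usr', 'derby.log', 'backup', ... ignored
        result[key] = f
    return [result[k] for k in sorted(result)]

print(safe_versions(["current", "usr", "docker", "2.4.0.0-169", "2.3.2.0-2950"]))
# ['2.3.2.0-2950', '2.4.0.0-169']
```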