@rizalt I hit the same bug when I used the ODP stack to deploy Ambari. After debugging, I found the bug in the ambari-agent code that runs when it tries to start Hive Server2.

Problem: the code below is supposed to put an HDFS path into the {out} parameter, but what it actually returns is not a valid HDFS URI:

163 metatool_cmd = format("hive --config {conf_dir} --service metatool")
164 cmd = as_user(format("{metatool_cmd} -listFSRoot", env={'PATH': execution_path}), params.hive_user) \
165       + format(" 2>/dev/null | grep hdfs:// | cut -f1,2,3 -d '/' | grep -v '{fs_root}' | head -1")
166 code, out = shell.call(cmd)

Agent log:

2024-04-03 07:40:48,317 - call['ambari-sudo.sh su hive -l -s /bin/bash -c 'hive --config /usr/odp/current/hive-server2/conf/ --service metatool -listFSRoot' 2>/dev/null | grep hdfs:// | cut -f1,2,3 -d '/' | grep -v 'hdfs://vm-ambari.internal.cloudapp.net:8020' | head -1'] {}
2024-04-03 07:40:58,721 - call returned (0, '07:40:53.268 [main] DEBUG org.apache.hadoop.fs.FileSystem - hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from ')
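To see why {out} ends up broken, here is a small standalone Python sketch (my own illustration, not code from the stack scripts) that emulates the grep | cut | grep -v | head pipeline on metatool output shaped like the log excerpt above. The DEBUG line is copied from the log; the -listFSRoot line is only illustrative. When the Hive metatool prints logger output on stdout, the pipeline's first surviving line is that truncated DEBUG message rather than an hdfs:// URI:

# Standalone illustration: emulate
#   grep hdfs:// | cut -f1,2,3 -d '/' | grep -v '<fs_root>' | head -1
# on metatool output that contains a DEBUG logger line.
fs_root = "hdfs://vm-ambari.internal.cloudapp.net:8020"
metatool_output = [
    # DEBUG logger line, taken from the agent log above
    "07:40:53.268 [main] DEBUG org.apache.hadoop.fs.FileSystem - hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from ",
    # a typical -listFSRoot result line; the path after the authority is only illustrative
    "hdfs://vm-ambari.internal.cloudapp.net:8020/apps/hive/warehouse",
]

candidates = []
for line in metatool_output:
    if "hdfs://" not in line:                 # grep hdfs://
        continue
    line = "/".join(line.split("/")[:3])      # cut -f1,2,3 -d '/'
    if fs_root in line:                       # grep -v '<fs_root>'
        continue
    candidates.append(line)

out = candidates[0] if candidates else ""     # head -1
print(repr(out))
# Prints the truncated DEBUG message, not an hdfs:// URI, so the later
# "metatool -updateLocation {fs_root} {out}" command gets an invalid old location.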
To fix it:

Step 1: Edit row 170 in the file "/var/lib/ambari-agent/cache/stacks/ODP/1.0/services/HIVE/package/scripts/hive_service.py" as below. I hard-coded the old path to a valid URI, which lets the start-up bypass the updateLocation step:

# cmd = format("{metatool_cmd} -updateLocation {fs_root} {out}")
cmd = format("{metatool_cmd} -updateLocation {fs_root} hdfs://oldpath")

You can see the change in the attached screenshot.
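For context, row 170 sits inside a check that only runs -updateLocation when the parsed {out} is non-empty and differs from {fs_root}, which is exactly what happens here because {out} holds the truncated DEBUG line. Below is a rough sketch of how that block looks after the edit; the surrounding condition is paraphrased and may differ slightly in your ODP version. Since hdfs://oldpath is a well-formed URI that matches no existing FS root in the metastore, the updateLocation run effectively does nothing and exits cleanly instead of failing on the garbled {out}:

# Rough sketch of the block around row 170 after the edit (assumed structure,
# not a verbatim copy of the ODP file; format, shell, Execute and params come
# from the ambari-agent helpers that hive_service.py already imports).
if code == 0 and out.strip() != "" and params.fs_root.strip() != out.strip():
    out = out.strip()
    # cmd = format("{metatool_cmd} -updateLocation {fs_root} {out}")           # original row 170
    cmd = format("{metatool_cmd} -updateLocation {fs_root} hdfs://oldpath")    # patched
    Execute(cmd, user=params.hive_user, environment={'PATH': execution_path})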
Step 2: Restart the ambari-agent:

sudo ambari-agent restart

Step 3: Restart Hive Server2 in Ambari. The service started successfully.