I am installing HDF 3.0 using Ambari Server, and I followed the steps mentioned in the documentation.
I installed the HDF 3.0 management pack (mpack), saw HDF 3.0 in the drop-down menu, and selected it. I got through the selection steps, but when it comes to actually installing the components, the installation fails with the error below. (I get pretty much the same issue with all other packages.)
I am not sure why it is referring to HDF 2.0 when I selected HDF 3.0. There is no older HDF 2.0 installation on the VM.
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDF/2.0/hooks/before-ANY/scripts/hook.py", line 35, in <module>
    BeforeAnyHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDF/2.0/hooks/before-ANY/scripts/hook.py", line 26, in hook
    import params
  File "/var/lib/ambari-agent/cache/stacks/HDF/2.0/hooks/before-ANY/scripts/params.py", line 101, in <module>
    hadoop_home = stack_select.get_hadoop_dir("home", force_latest_on_upgrade=True)
TypeError: get_hadoop_dir() got an unexpected keyword argument 'force_latest_on_upgrade'
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDF/2.0/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-143.json', '/var/lib/ambari-agent/cache/stacks/HDF/2.0/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-143.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
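The TypeError itself is a version-mismatch symptom: a cached HDF 2.0 stack script passes a keyword argument that the installed version of `stack_select.get_hadoop_dir()` no longer accepts. A minimal stand-in (not the real Ambari API, just a sketch of the mechanism):

```python
# Simplified stand-in for stack_select.get_hadoop_dir(): in newer Ambari
# releases the 'force_latest_on_upgrade' keyword was removed from its signature.
def get_hadoop_dir(name):
    return "/usr/hdf/current/%s" % name

try:
    # An old cached stack script still calls the removed keyword argument,
    # which raises the same TypeError seen in the traceback above.
    get_hadoop_dir("home", force_latest_on_upgrade=True)
except TypeError as e:
    print(e)
```

This is why the error points at the cached `stacks/HDF/2.0` scripts even though HDF 3.0 was selected: the agent is running stale hook scripts against a library whose signatures have changed.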
HDF 3.0 requires Ambari 2.5.1. I was getting this error when I loaded the mpack in an Ambari 2.4 environment.
This happened to me when I was trying to use Ambari 2.4. HDF 3.0 requires Ambari 2.5.1. I suggest checking the support matrices: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.0/bk_support-matrices/content/ch_matrices-hdf...
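Before installing the mpack, it is worth verifying that the Ambari version on the host meets the 2.5.1 minimum. A minimal sketch of the comparison (the installed version string here is an example value; substitute what `ambari-server --version` reports on your host):

```python
# Hedged sketch: compare a dotted version string against the HDF 3.0 minimum.
# 'installed' is an example value, not read from a live system.
def version_tuple(v):
    """Turn '2.5.1' into (2, 5, 1) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

MIN_AMBARI = "2.5.1"
installed = "2.4.2"  # example: what `ambari-server --version` might print

if version_tuple(installed) < version_tuple(MIN_AMBARI):
    print("Ambari %s is too old; upgrade to %s or later before installing the HDF 3.0 mpack"
          % (installed, MIN_AMBARI))
else:
    print("Ambari %s meets the HDF 3.0 minimum" % installed)
```

A tuple comparison avoids the classic pitfall of comparing version strings lexically (where "2.10" would sort before "2.5").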