
HDF MPack not appearing when installed on HDP 3.0 cluster



I am trying to add the HDF 3.1.2 MPack to my HDP 3.0.0 cluster so that I can use NiFi (along with some HDP services). No matter which way I try to add it, I cannot get it to appear in the Stack and Versions or Services sections of Ambari. I have tried adding the MPack via SSH on the master node (it installed successfully), and on a fresh cluster I have also tried registering the MPack in Cloudbreak and specifying that it be installed during cluster deployment. Each time it states the MPack has been installed successfully, but it is nowhere to be found in Ambari. I cannot find any section to add the base HDF URL as one document says to do, nor can I find a way to register a new version. Only the HDP 3.0 version is shown.
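For reference, the SSH-based install attempt described above would look roughly like the following sketch, run on the Ambari server host. The tarball URL is a placeholder (look up the exact HDF 3.1.2 MPack URL for your environment); only the `ambari-server install-mpack` and `restart` steps are the standard procedure.

```shell
# Placeholder: substitute the actual HDF 3.1.2 MPack tarball URL from the HDF docs.
MPACK_URL="<hdf-3.1.2-mpack-tarball-url>"

# Install the management pack and restart Ambari so it picks up the new stack definition.
sudo ambari-server install-mpack --mpack="$MPACK_URL" --verbose
sudo ambari-server restart
```

Even when this reports success, the HDF stack will only appear in Ambari if the Ambari version supports that MPack, which turns out to be the issue below.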

Does anyone have a suggestion as to where I'm going wrong?




Super Mentor

@Paul Norris

As per the website (accessible with a Hortonworks support subscription) and this link:

Ambari-2.7.0 is not compatible with HDF .

You might need to install Ambari- in order to support HDF- , or an earlier Ambari-2.6.x.x version.

HDF-3.2, which will be compatible with Ambari-2.7, is releasing in a few days. As of now, Ambari-2.7 supports only HDP-3.0 installation.

Reference HCC Thread:


@Jay Kumar SenSharma thanks for that information; that is disappointing. So, from the looks of this, if I am running HDP 3.0, which I think requires Ambari 2.7 (correct me if that is wrong), then there is currently no way for me to add an HDF MPack to HDP 3.0 until HDF 3.2 is released? Do you think this is correct?

When they say "in some more days," do you know whether that means this week, next week, or next month? How many days is "some more"?


