Created 03-21-2017 05:25 AM
In the tutorials there are two configurations for installing and deploying HDF and Apache Hadoop. If we install HDF using Ambari, it installs the same components. So what exactly is the difference between the two?
Created on 03-21-2017 05:31 AM - edited 08-18-2019 04:12 AM
Apache Hadoop is just one major component of HDP (Hortonworks Data Platform). HDP includes a lot of other components as well; it is a suite of components that are tested and certified to work together.
Please see the image explaining HDP in the following link: https://hortonworks.com/products/data-center/hdp/
Whereas HDF (Hortonworks DataFlow) includes the following services: NiFi, Storm, Kafka, ZooKeeper, Ambari, Ranger.
Created 12-21-2017 03:41 AM
So can we say that HDP contains the tools that process data at rest, while HDF contains the components that process real-time flow/streaming data?
HDP - used for data at rest
HDF - used for data in flow
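If it helps to picture that distinction, here is a minimal sketch of the two styles in Python. The paths, topic name, and broker address are made-up examples, not anything from HDP or HDF themselves: a batch job queries files already sitting in HDFS (data at rest), while a consumer reacts to events as they arrive on a Kafka topic (data in flow).

```python
# Data at rest (HDP style): a batch query over files already stored in HDFS,
# using PySpark. The HDFS path is a hypothetical example.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("batch-at-rest").getOrCreate()
df = spark.read.csv("hdfs:///data/clickstream/2017-03-21/", header=True)
print(df.count())  # processes the full, static dataset in one pass

# Data in flow (HDF style): a consumer that handles events as they arrive
# on a Kafka topic, using kafka-python. Topic and broker are hypothetical.
from kafka import KafkaConsumer

consumer = KafkaConsumer("clickstream", bootstrap_servers="localhost:9092")
for record in consumer:  # runs continuously, one event at a time
    print(record.value)
```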