Created 11-18-2016 07:17 PM
Hello,
This is perhaps a trivial question.
Could you please let me know what the difference is between DataFlow (HDF) and DataPlatform (HDP)?
When would you use one over the other?
Thanks,
Marcy
Created 11-18-2016 11:22 PM
HDF is best thought of as a platform for data in motion, while HDP, Hadoop, the popular Big Data platform, can in contrast be seen as handling data at rest. They are independent platforms but are often integrated; when integrated, they are deployed as separate clusters. Both are open source, and Hortonworks provides paid support for each separately.
HDF
HDF includes NiFi, Storm, and Kafka (as well as the Ambari admin console). These components ingest data from diverse sources (social media sites, log files, IoT devices, databases, etc.) and deliver it to an equally diverse range of target systems. In between, they can transform the moving content, make decisions based on it, and run analytics on it. The actual movement of data is difficult to engineer well; these components move the data and handle the many challenges of doing so under the covers, with no low-level development needed.
See: https://hortonworks.com/products/data-center/hdf/
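As a concrete illustration of data in motion, here is a minimal sketch of publishing a single event to Kafka from Java; the broker address, topic name, key, and payload below are hypothetical placeholders, not anything prescribed by HDF.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class LogEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092"); // hypothetical broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // The producer handles batching, retries, and partitioning under the covers.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("web-logs", "host42", "GET /index.html 200"));
        }
    }
}

In practice, a NiFi flow would typically replace hand-written producers like this for most ingestion work, since flows are assembled visually rather than coded.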
HDP
HDP is more commonly known as the Hadoop or Big Data platform. It has HDFS for storage, YARN for resource management, the MapReduce and Tez processing engines, the Hive SQL data warehouse, the HBase NoSQL database, and many other tools for working with Big Data (data arriving on the platform in large volumes, in a wide variety of formats, and at high velocity ... the 3 Vs). It stores this data cheaply and flexibly, and uses horizontal scaling of servers to process the 3 Vs of data in parallel in a short amount of time, in contrast to traditional databases, which face limits in working with the 3 Vs. The type of processing depends on the out-of-the-box or third-party tools used and on the use case / business case involved.
See: https://hortonworks.com/products/data-center/hdp/
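For a feel of the data-at-rest side, here is a minimal sketch of writing a file into HDFS with the Hadoop Java client; the landing path is a hypothetical placeholder, and the cluster address is assumed to come from a core-site.xml on the classpath.

import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
    public static void main(String[] args) throws Exception {
        // Reads fs.defaultFS and other settings from core-site.xml on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Hypothetical landing path; HDFS replicates the file's blocks across the cluster.
        Path target = new Path("/data/landing/sample.txt");
        try (FSDataOutputStream out = fs.create(target, true)) { // true = overwrite if present
            out.write("hello, big data\n".getBytes(StandardCharsets.UTF_8));
        }
    }
}

Once the data is in HDFS, engines such as Hive, Tez, or MapReduce can process it in parallel across the cluster.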
HDF + HDP
HDF and HDP are often integrated because HDF is an effective way to get diverse sources of data into HDP, where it can all be stored and processed in one place, to be used by data scientists for example.
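To make the hand-off concrete: in NiFi this is typically just a ConsumeKafka processor wired to a PutHDFS processor, with no code at all. The sketch below is a rough hand-rolled Java equivalent, assuming a Kafka 2.x+ client; the broker, consumer group, topic, and landing path are hypothetical placeholders.

import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaToHdfsBridge {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092"); // hypothetical HDF-side broker
        props.put("group.id", "hdfs-sink");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        FileSystem fs = FileSystem.get(new Configuration()); // HDP-side HDFS, from core-site.xml
        Path target = new Path("/data/landing/web-logs.txt"); // hypothetical landing path

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             FSDataOutputStream out = fs.create(target, true)) {
            consumer.subscribe(Collections.singletonList("web-logs"));
            // Drain one batch of in-motion events (HDF side) into at-rest storage (HDP side).
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                out.write((record.value() + "\n").getBytes(StandardCharsets.UTF_8));
            }
        }
    }
}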
If this is what you were looking for, let me know by accepting the answer; otherwise, please respond to this answer with further questions and I will follow up.