How HDP components work in depth
Labels: Apache Hadoop
Created 07-18-2018 06:46 PM
Hi Community Team,
I'm trying to dig deeper into how the HDP components work. How does one component call another, and how can I trace this myself? Could you please help me understand Hadoop in depth?
Thanks,
Vinay K
Created 07-20-2018 04:51 PM
Which component are you asking about, and what are you trying to achieve?
The components typically call each other over a combination of separate protocols:
- HDFS and YARN interact via Hadoop RPC/IPC (see the HDFS sketch below).
- Ambari Server and Agents communicate over HTTP/REST; Ambari also needs JDBC connections to its backing database (see the REST sketch below).
- Hive, HBase, and Spark can use a Thrift server; the Hive metastore uses JDBC for its backing database (see the JDBC sketch below).
- Kafka has its own binary protocol over TCP (see the producer sketch below).
I would suggest starting with a specific component for the use case(s) you want. Hadoop itself comprises only HDFS and YARN plus MapReduce.
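For example, here is a minimal sketch of the HDFS client API, whose calls are carried as Hadoop RPC/IPC requests to the NameNode. The hostname `nn-host` is a placeholder, port 8020 is the HDP default NameNode RPC port, and the `hadoop-client` dependency is assumed on the classpath:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.net.URI;

public class HdfsRpcSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Each FileSystem call below is translated into a Hadoop IPC/RPC
        // request to the NameNode ("nn-host" is a placeholder; 8020 is
        // the HDP default NameNode RPC port).
        FileSystem fs = FileSystem.get(URI.create("hdfs://nn-host:8020"), conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}
```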
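The Ambari REST API can be exercised with any HTTP client. A minimal sketch, assuming an Ambari Server at `ambari-host:8080` (the default port), a cluster named `MyCluster`, and the default `admin:admin` credentials, all of which are placeholders:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class AmbariRestSketch {
    public static void main(String[] args) throws Exception {
        // GET the services of a cluster; Ambari answers with JSON.
        URL url = new URL("http://ambari-host:8080/api/v1/clusters/MyCluster/services");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Authorization", "Basic "
                + Base64.getEncoder().encodeToString("admin:admin".getBytes()));
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```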
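A minimal sketch of the JDBC route into HiveServer2, assuming the `hive-jdbc` driver is on the classpath and a HiveServer2 instance at the placeholder `hs2-host` on port 10000 (the default):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Standard JDBC against HiveServer2; credentials depend on your
        // cluster's security setup (an unsecured cluster is assumed here).
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hs2-host:10000/default", "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```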
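And a minimal producer sketch of Kafka's own TCP protocol via the official Java client (`kafka-clients`), assuming a broker at the placeholder `broker-host` on port 6667 (the HDP default Kafka port) and an existing topic `demo-topic`:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class KafkaSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker-host:6667");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // send() speaks Kafka's binary wire protocol over TCP to the broker.
            producer.send(new ProducerRecord<>("demo-topic", "key", "hello"));
        }
    }
}
```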