How do HDP components work together?

Solved

Expert Contributor
Hi Community Team,

I'm trying to dig deeper into how the HDP components work internally.

How does one component call another?

How can I trace or observe this myself?

Could you please help me understand Hadoop in depth?

Thanks,

Vinay K

1 ACCEPTED SOLUTION


Re: How do HDP components work together?

Super Collaborator

What component are you asking about? What are you trying to achieve?

Components typically call each other over a combination of protocols:

- HDFS and YARN interact via RPC/IPC.

- Ambari Server and its Agents communicate over HTTP/REST. Ambari also needs a JDBC connection to its backing database.

- Hive, HBase, and Spark can expose Thrift servers. The Hive metastore connects to its backing database over JDBC.

- Kafka has its own TCP protocol.
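To make the last point concrete: Kafka speaks a length-prefixed binary protocol over plain TCP. Below is a minimal sketch of that framing only, not a working client; the header layout (int16 api_key, int16 api_version, int32 correlation_id, length-prefixed client_id) follows the older request header versions, and `api_key` 18 is the real ApiVersions request.

```python
import struct

def encode_request_header(api_key: int, api_version: int,
                          correlation_id: int, client_id: str) -> bytes:
    """Build a Kafka-style frame: 4-byte big-endian length prefix + header."""
    cid = client_id.encode("utf-8")
    header = struct.pack(">hhih", api_key, api_version,
                         correlation_id, len(cid)) + cid
    return struct.pack(">i", len(header)) + header

def decode_request_header(frame: bytes):
    """Parse the frame back into (size, api_key, api_version, corr_id, client_id)."""
    (size,) = struct.unpack_from(">i", frame, 0)
    api_key, api_version, corr_id, cid_len = struct.unpack_from(">hhih", frame, 4)
    client_id = frame[14:14 + cid_len].decode("utf-8")
    return size, api_key, api_version, corr_id, client_id

# api_key 18 is ApiVersions in the Kafka protocol.
frame = encode_request_header(18, 0, 1, "demo-client")
print(decode_request_header(frame))
```

A real client would write this frame to a TCP socket connected to a broker on port 9092 and read back a length-prefixed response, but the framing above is the essence of "Kafka has its own TCP protocol".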

I would suggest starting with a specific component for the use case(s) you have in mind. Hadoop itself comprises only HDFS, YARN, and MapReduce.
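If you want to see one of these interfaces first-hand, the Ambari REST API is the easiest place to start because it is plain HTTP. The sketch below only builds the request object (hostname, cluster name, and credentials are placeholders, and nothing is sent); the `/api/v1/clusters/{name}/services` endpoint and the `X-Requested-By` header Ambari requires on write operations are real.

```python
import base64
import urllib.request

AMBARI = "http://ambari.example.com:8080"  # hypothetical Ambari Server host
CLUSTER = "mycluster"                      # hypothetical cluster name

def build_services_request(user: str, password: str) -> urllib.request.Request:
    """Build (but do not send) a GET request listing a cluster's services."""
    url = f"{AMBARI}/api/v1/clusters/{CLUSTER}/services"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Basic {token}")
    # Ambari requires this header on PUT/POST/DELETE; harmless on GET.
    req.add_header("X-Requested-By", "ambari")
    return req

req = build_services_request("admin", "admin")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` against a live Ambari Server would return a JSON document describing each installed service, which is a good way to explore what the cluster is actually running.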

