<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: HDP Component working in deep</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HDP-Component-working-in-deep/m-p/186692#M80899</link>
    <description>&lt;P&gt;Which component are you asking about, and what are you trying to achieve?&lt;/P&gt;&lt;P&gt;The components typically call each other over a combination of separate protocols:&lt;/P&gt;&lt;P&gt;- HDFS and YARN interact via Hadoop RPC/IPC.&lt;/P&gt;&lt;P&gt;- The Ambari Server and Agents communicate over HTTP &amp;amp; REST; Ambari also needs JDBC connections to its backing database.&lt;/P&gt;&lt;P&gt;- Hive, HBase, and Spark can expose a Thrift server. The Hive metastore is accessed via JDBC.&lt;/P&gt;&lt;P&gt;- Kafka has its own binary protocol over TCP.&lt;/P&gt;&lt;P&gt;I would suggest starting with a specific component for the use case(s) you have in mind. Hadoop itself comprises only HDFS &amp;amp; YARN + MapReduce.&lt;/P&gt;</description>
    <pubDate>Fri, 20 Jul 2018 23:51:23 GMT</pubDate>
    <dc:creator>JordanMoore</dc:creator>
    <dc:date>2018-07-20T23:51:23Z</dc:date>
    <item>
      <title>HDP Component working in deep</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HDP-Component-working-in-deep/m-p/186691#M80898</link>
      <description>&lt;P&gt;Hi Community Team,&lt;/P&gt;&lt;P&gt;I'm trying to dig deeper into how the HDP components work.&lt;/P&gt;&lt;P&gt;How does one component call another component?&lt;/P&gt;&lt;P&gt;How can I achieve this?&lt;/P&gt;&lt;P&gt;Could you please help me understand Hadoop in depth?&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Vinay K&lt;/P&gt;</description>
      <pubDate>Thu, 19 Jul 2018 01:46:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HDP-Component-working-in-deep/m-p/186691#M80898</guid>
      <dc:creator>vinayk</dc:creator>
      <dc:date>2018-07-19T01:46:00Z</dc:date>
    </item>
    <item>
      <title>Re: HDP Component working in deep</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/HDP-Component-working-in-deep/m-p/186692#M80899</link>
      <description>&lt;P&gt;Which component are you asking about, and what are you trying to achieve?&lt;/P&gt;&lt;P&gt;The components typically call each other over a combination of separate protocols:&lt;/P&gt;&lt;P&gt;- HDFS and YARN interact via Hadoop RPC/IPC.&lt;/P&gt;&lt;P&gt;- The Ambari Server and Agents communicate over HTTP &amp;amp; REST; Ambari also needs JDBC connections to its backing database.&lt;/P&gt;&lt;P&gt;- Hive, HBase, and Spark can expose a Thrift server. The Hive metastore is accessed via JDBC.&lt;/P&gt;&lt;P&gt;- Kafka has its own binary protocol over TCP.&lt;/P&gt;&lt;P&gt;I would suggest starting with a specific component for the use case(s) you have in mind. Hadoop itself comprises only HDFS &amp;amp; YARN + MapReduce.&lt;/P&gt;</description>
      <pubDate>Fri, 20 Jul 2018 23:51:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/HDP-Component-working-in-deep/m-p/186692#M80899</guid>
      <dc:creator>JordanMoore</dc:creator>
      <dc:date>2018-07-20T23:51:23Z</dc:date>
    </item>
  </channel>
</rss>

