Member since: 04-12-2019
Posts: 105
Kudos Received: 3
Solutions: 7
My Accepted Solutions
Title | Views | Posted
---|---|---
| 3607 | 05-28-2019 07:41 AM
| 2185 | 05-28-2019 06:49 AM
| 1790 | 12-20-2018 10:54 AM
| 1276 | 06-27-2018 09:05 AM
| 6907 | 06-27-2018 09:02 AM
03-23-2022
03:05 AM
@iamfromsky as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.
02-02-2022
08:15 AM
Spark and Hive use separate catalogs to access SparkSQL or Hive tables in HDP 3.0 and later. The Spark catalog contains tables created by Spark; the Hive catalog contains tables created by Hive. By default, the standard Spark APIs access tables in the Spark catalog. To access tables in the Hive catalog, edit the metastore.catalog.default property in hive-site.xml (set the property value to 'hive' instead of 'spark').

Config file path: $SPARK_HOME/conf/hive-site.xml

Before the change:

  <property>
    <name>metastore.catalog.default</name>
    <value>spark</value>
  </property>

After the change:

  <property>
    <name>metastore.catalog.default</name>
    <value>hive</value>
  </property>
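If you need to apply this change across many nodes, it can be scripted instead of edited by hand. A minimal sketch using Python's standard xml.etree module (the property layout follows the hive-site.xml format shown above; the helper name and file path are illustrative, not official tooling):

```python
import xml.etree.ElementTree as ET

def set_catalog_default(hive_site_path, value="hive"):
    """Set metastore.catalog.default in a hive-site.xml file in place."""
    tree = ET.parse(hive_site_path)
    root = tree.getroot()
    for prop in root.findall("property"):
        if prop.findtext("name") == "metastore.catalog.default":
            # Property exists: overwrite its value.
            prop.find("value").text = value
            break
    else:
        # Property not present yet: append it to <configuration>.
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = "metastore.catalog.default"
        ET.SubElement(prop, "value").text = value
    tree.write(hive_site_path)
```

Run it against $SPARK_HOME/conf/hive-site.xml on each node, then restart the Spark services so the new catalog setting takes effect.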
04-07-2021
03:02 AM
Huge thanks. It works for me.
10-30-2019
03:30 AM
Hi, To understand what the YARN application is doing, check the application logs of that particular YARN application. If the job has not completed, also check the ResourceManager logs to see whether it is stuck with any errors. Thanks, Arun
10-06-2019
06:56 AM
Hi, I'm facing the same issue. Has anyone had any luck?
08-08-2019
03:36 PM
Hey Harsh, Thanks for responding. As multiple clients are requesting data from HBase, at some point users sometimes don't get data, or EOF exceptions or connection interruptions occur. We are not able to track the requested data or the size of the input and output data sent to the end user. Regards, Vinay K
12-20-2018
10:54 AM
Hi, I have resolved the issue. I performed all of the following steps on the node where I was facing the problem. First, I cleaned the cache by moving it from /tmp to a temporary directory. Then I moved all keytabs from /etc/security/keytabs/ to another temporary directory, and finally I restarted ambari-agent on that node. Then I tried to regenerate the keytabs, which succeeded. Now I have resumed the upgrade.
12-12-2018
09:41 AM
Thanks @Geoffrey Shelton Okot. I have also found the problem. Whenever we install or upgrade HDP, we specify the repository paths of HDP and HDP-UTILS in the UI, and Ambari accordingly creates repos on all agents named HDP-2.6-repo-51 and HDP-UTILS-1.1.0.22-repo-51. But I had also created HDP and HDP-UTILS repositories manually on all nodes, and all HDP packages had been installed from the manual repository path. When I was starting services, the HBase client and other clients looked for the HDP-2.6-repo-51 repository to install from, which was not available. I have now disabled the manual repository and reinstalled the client packages manually. It's working fine.
12-12-2018
10:45 AM
@Vinay Please log in and accept the answer if you find this helpful. Thanks
07-20-2018
04:51 PM
What component are you asking about? What are you trying to achieve? The components typically call each other over combinations of separate protocols.
- HDFS and YARN interact via RPC/IPC.
- Ambari Server and Agents communicate over HTTP & REST. Ambari also needs JDBC connections to its backing database.
- Hive, HBase, and Spark can use a Thrift server. The Hive metastore uses JDBC.
- Kafka has its own TCP protocol.
I would suggest starting with a specific component for the use case(s) you have in mind. Hadoop itself comprises only HDFS & YARN + MapReduce.
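To illustrate the Ambari HTTP/REST side of the list above, here is a minimal sketch that builds (but does not send) a basic-auth request against Ambari's documented REST root, /api/v1/clusters, using only the Python standard library. The hostname, port, and credentials are placeholders:

```python
import base64
import urllib.request

def build_ambari_request(host, user, password, path="/api/v1/clusters"):
    """Build (but do not send) a basic-auth GET request to the Ambari REST API."""
    url = f"http://{host}:8080{path}"  # 8080 is Ambari's default HTTP port
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url)
    # Ambari uses HTTP Basic authentication; the X-Requested-By header
    # is required by Ambari on modifying requests (PUT/POST/DELETE).
    req.add_header("Authorization", f"Basic {credentials}")
    req.add_header("X-Requested-By", "ambari")
    return req

# Placeholder values; call urllib.request.urlopen(req) to actually send it.
req = build_ambari_request("ambari-server.example.com", "admin", "admin")
```

The same pattern applies to any of Ambari's REST endpoints; only the path changes.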