Member since: 05-13-2017
Posts: 42
Kudos Received: 0
Solutions: 0
08-11-2017
11:09 AM
@Chiranjeevi Nimmala, what are the group permissions on the HDFS location where this Hive table is stored? The 'hdpmasters' group should have permission on the HDFS path backing the Hive table. Also check whether the properties below are set:
webhcat.proxyuser.root.groups = *
webhcat.proxyuser.root.hosts = *
Regards, Fahim
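For illustration, a quick way to check this from a shell (a sketch; the warehouse path below is a hypothetical example, substitute the actual location of the table):

# Show the owner, group, and permission bits on the table directory
hdfs dfs -ls -d /apps/hive/warehouse/mydb.db/mytable

# If needed, hand the directory to the hdpmasters group and allow group read/execute
hdfs dfs -chgrp -R hdpmasters /apps/hive/warehouse/mydb.db/mytable
hdfs dfs -chmod -R g+rx /apps/hive/warehouse/mydb.db/mytable

The two webhcat.proxyuser.root.* properties are normally set in webhcat-site.xml (for example through Ambari) and typically need a WebHCat restart to take effect.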
08-11-2017
07:14 AM
@Narasimha K, a 401 error usually points to authentication. Does the ResourceManager Web UI require authentication? Can you check from the logs who is trying to access the ResourceManager Web UI? Regards, Fahim
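As a rough way to confirm whether authentication is what is returning the 401, you could probe the UI directly and scan the ResourceManager log; the host name, port, and log path below are assumptions for a typical HDP layout:

# See what status code the ResourceManager UI returns without credentials
curl -s -o /dev/null -w "%{http_code}\n" http://rm-host.example.com:8088/cluster

# On a kerberized cluster, retry with SPNEGO after obtaining a ticket
kinit your_principal
curl --negotiate -u : -s -o /dev/null -w "%{http_code}\n" http://rm-host.example.com:8088/cluster

# Look for authentication failures and the calling user in the RM log
grep -i "auth" /var/log/hadoop-yarn/yarn/*resourcemanager*.log | tail -20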
08-10-2017
06:16 PM
@HEMANTH KUMAR RATAKONDA Can you please let us know how you determined that the table is corrupted and needs repair? That will help. Regards, Fahim
08-10-2017
01:33 PM
@ANSARI FAHEEM AHMED, can you check the memory assigned to the Spark History Server and see whether you can increase it? A second thought: check the number of applications being shown in the History Server and what retention limit is assigned to it. Hope this helps. Regards, Fahim
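For reference, a sketch of where these two knobs usually live (the values below are assumptions; on an Ambari-managed cluster they are edited through the Spark configuration screens and need a History Server restart):

# spark-env.sh: heap size for the Spark History Server daemon
export SPARK_DAEMON_MEMORY=2g

# spark-defaults.conf: how many application UIs the History Server retains in memory
#   spark.history.retainedApplications  50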
08-10-2017
07:31 AM
Hi Hemanth, what I am assuming is that you have already created the Hive table and are trying to read it from Spark. What I am suggesting is that you can also create a Hive table from Spark using Spark SQL. Try to create a small Hive table from Spark and read it back. That will prove that your Spark functionality is working correctly with Hive, and that the issue is with the specific table you posted in the comment. Regards, Fahim
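For example, a minimal round trip from the command line (a sketch; the table name test_spark_hive and the spark-sql CLI are assumptions, the same statements can be run through sqlContext.sql in spark-shell, and exact DDL/DML support depends on the Spark version in use):

# Create a small Hive table from Spark SQL and put one row into it
spark-sql -e "CREATE TABLE IF NOT EXISTS test_spark_hive (id INT, name STRING)"
spark-sql -e "INSERT INTO TABLE test_spark_hive VALUES (1, 'check')"

# Read it back; if this works, Spark's Hive integration itself is healthy
spark-sql -e "SELECT * FROM test_spark_hive"

If the round trip succeeds, the problem is specific to the original table (its format, SerDe, or permissions) rather than to the Spark-to-Hive setup.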
08-09-2017
12:20 PM
Hi, can you try to save a sample table to Hive from Spark? Then try to re-read the table and see if you are able to read it. Regards, Fahim
08-08-2017
11:55 AM
@mqureshi, thanks for the answer. It makes sense. Regards, Fahim
08-04-2017
01:41 PM
Is it possible to assign YARN queues to specific Hadoop tools via the Capacity Scheduler? For example, a Hive queue would only serve requests from Hive clients, and a Spark queue would only serve requests from Spark clients. To put the question another way: is it possible to map Hadoop components to YARN queues? Regards, Fahim
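For illustration, a sketch of how this mapping is commonly approached in practice: the queues themselves are defined in capacity-scheduler.xml (the queue names 'hive' and 'spark' below are assumptions), and each component is then pointed at, or restricted to, its queue:

# Hive on Tez: route sessions to the 'hive' queue
# (hive.server2.tez.default.queues=hive in hive-site, or per connection:)
beeline -u jdbc:hive2://hs2-host.example.com:10000 --hiveconf tez.queue.name=hive

# Spark: submit applications to the 'spark' queue
spark-submit --queue spark --class com.example.App app.jar

# Enforcement side: yarn.scheduler.capacity.root.<queue>.acl_submit_applications in
# capacity-scheduler.xml restricts which users or groups may submit to each queue.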
Labels:
- Apache Hadoop
- Apache Hive
- Apache Spark
08-04-2017
01:20 PM
Hi Hdave, you should pick up one tool at a time, for example Hive. Below are the high-level steps:
1. Upload your CSV files from your local system to HDFS. Hint: use the put command.
2. Launch Hive. Hint: Beeline.
3. Create a Hive table matching the CSV columns.
4. Load the CSV file into the table.
5. Query the table from the Hive CLI or Beeline.
A sketch of these steps with example commands is given below. Once this is done, please pick up another tool and try the same. Regards, Fahim
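As a concrete sketch of those steps (the file names, table name, and HiveServer2 URL below are assumptions; adjust them to your cluster):

# 1. Upload the CSV from the local file system to HDFS
hdfs dfs -mkdir -p /user/hdave/data
hdfs dfs -put /tmp/sample.csv /user/hdave/data/

# 2-3. Launch Beeline and create a table matching the CSV columns
beeline -u jdbc:hive2://hs2-host.example.com:10000 -e "CREATE TABLE sample_csv (id INT, name STRING, city STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE"

# 4. Load the CSV file into the table
beeline -u jdbc:hive2://hs2-host.example.com:10000 -e "LOAD DATA INPATH '/user/hdave/data/sample.csv' INTO TABLE sample_csv"

# 5. Query the table
beeline -u jdbc:hive2://hs2-host.example.com:10000 -e "SELECT * FROM sample_csv LIMIT 10"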
05-13-2017
07:52 PM
Hi Avijeet Dash, Apache Zeppelin 0.7.0 is packaged and integrated with the new HDP 2.6 release. It includes new security features for Zeppelin.