Created on 06-29-2017 10:34 PM - edited 09-16-2022 04:52 AM
I have downloaded the Cloudera QuickStart 5.10 VM for VirtualBox, but it is not loading Hive data into Spark:
import org.apache.spark.sql.hive.HiveContext

val hiveObj = new HiveContext(sc)   // sc is the SparkContext provided by spark-shell
import hiveObj.implicits._

hiveObj.refreshTable("db.table")    // if you have upgraded your Hive metastore, do this to refresh the table
val sample = hiveObj.sql("select * from db.table").collect()
sample.foreach(println)
Still I am getting the error "table not found" (it is not accessing the Hive metastore).
What should I do? Can anyone please help me?
(In the Cloudera quickstart VM we are unable to copy hive-site.xml into spark/conf.)
Created on 07-04-2017 11:36 PM - edited 07-04-2017 11:37 PM
Now that's the reason it says table not found, mate. Will dig more and come back to you; we have almost narrowed it down.
Could you check whether you have hive-site.xml and hdfs-site.xml in your Spark conf folder, /etc/spark/conf/?
If not, just cp those XML files into /etc/spark/conf/, restart Spark, and fire it again; let's see.
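A minimal sketch of that copy, assuming the CDH quickstart default paths (/etc/hive/conf/ and /etc/hadoop/conf/ as the sources) and a user with sudo rights:

sudo cp /etc/hive/conf/hive-site.xml /etc/spark/conf/
sudo cp /etc/hadoop/conf/hdfs-site.xml /etc/spark/conf/
spark-shell   # relaunch so the new metastore configuration is picked up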
Created on 07-12-2017 04:37 AM - edited 07-12-2017 04:38 AM
Thank you Guna.
I linked the Hive configuration file into the Spark conf folder as:
ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml
It started working only after a restart.
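For anyone verifying the fix, a quick check (assuming the quickstart defaults, where spark-shell exposes sqlContext backed by the Hive metastore once hive-site.xml is present) could be:

ls -l /etc/spark/conf/hive-site.xml                         # should point at /etc/hive/conf/hive-site.xml
echo 'sqlContext.sql("show tables").show()' | spark-shell   # Hive tables should now be listed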
Created 07-12-2017 05:45 AM
Hurray ! 🙂 @hadoopSparkZen
Created 03-28-2018 01:17 PM
@csguna and @hadoopSparkZen you guys have saved my day. Thanks to both of you 🙂
Created 02-23-2019 02:35 AM
Hi, I am also facing the same issue: not able to load a Hive table into Spark.
I tried to copy the XML files into the Spark conf folder, but I get "permission denied". I also tried to change the permissions on the folder, but that is not working either.
Using Cloudera VM 5.12.
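In case it helps, /etc/spark/conf/ on the quickstart VM is typically root-owned, so a plain cp fails; a minimal sketch assuming the default cloudera user has sudo rights:

sudo ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml
spark-shell   # relaunch so the Hive metastore configuration is picked up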