Created 04-04-2019 01:25 PM
I use Ambari 2.7 + HDP 3.0, with data sourced from Kafka. Code:
val conf = new SparkConf().setMaster(masterUrl).setAppName("UserEnterAndLeave")
conf.set("spark.task.maxFailures", "3")
val sc = new SparkContext(conf)
val ssc = new StreamingContext(sc, Seconds(10))

import spark.implicits._
import spark.sql

sql("use wx")
When the Spark Streaming job starts, it fails with ERROR: database "wx" not found.
The Hive database "wx" has already been created.
Please help, thanks.
Created 04-04-2019 03:56 PM
@geniusbaibai Starting with HDP 3.0, Spark cannot directly access Hive.
You'd need to use HWC (Hive Warehouse Connector): https://community.hortonworks.com/articles/223626/integrating-apache-hive-with-apache-spark-hive-war...
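For reference, reading from Hive through HWC looks roughly like the sketch below; it assumes the HWC jar is on the classpath, the HiveServer2 Interactive JDBC URL and related HWC settings from the linked article are configured on the SparkSession, and some_table is just a placeholder name:

import com.hortonworks.hwc.HiveWarehouseSession

// Build an HWC session on top of the existing SparkSession (spark).
// Requires spark.sql.hive.hiveserver2.jdbc.url and the other HWC
// settings described in the linked article to be set in the Spark conf.
val hive = HiveWarehouseSession.session(spark).build()

// Switch to the Hive database mentioned in this thread.
hive.setDatabase("wx")

// Run the query through HiveServer2 Interactive; returns a DataFrame.
val df = hive.executeQuery("SELECT * FROM some_table LIMIT 10")  // some_table is a placeholder
df.show()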
Created 04-08-2019 03:25 AM
But HWC does not work with Spark Streaming; it still cannot find database "wx". I have already tested with HWC. Is there any other way?
Created 04-08-2019 03:28 AM
HWC only supports queries; inserting into a table also did not succeed. It is not compatible with Spark Streaming and only supports Spark SQL.
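For context, the insert path being discussed presumably goes through HWC's batch write data source, roughly as in this sketch; the sample DataFrame and the table name wx.user_events are made-up placeholders, and as noted above this is a Spark SQL batch write, not a DStream sink:

import com.hortonworks.hwc.HiveWarehouseSession
import spark.implicits._  // spark is the existing SparkSession

// Placeholder DataFrame standing in for one micro-batch of Kafka data.
val eventDF = Seq(("user1", 1554350000L), ("user2", 1554350010L))
  .toDF("user_id", "event_ts")

// Batch write through the Hive Warehouse Connector data source.
eventDF.write
  .format(HiveWarehouseSession.HIVE_WAREHOUSE_CONNECTOR)
  .option("table", "wx.user_events")  // placeholder table name
  .save()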
Created 04-05-2019 03:32 PM
I think you need to enable Hive support when you create the Spark session, using the hive.metastore.uris value from the hive-site.xml where your Hive metastore is configured. Then you should be able to see the existing Hive tables from Spark.
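A minimal sketch of that suggestion, assuming thrift://metastore-host:9083 is a placeholder for the hive.metastore.uris value from your hive-site.xml:

import org.apache.spark.sql.SparkSession

// thrift://metastore-host:9083 is a placeholder; use the real
// hive.metastore.uris value from hive-site.xml.
val spark = SparkSession.builder()
  .appName("UserEnterAndLeave")
  .config("hive.metastore.uris", "thrift://metastore-host:9083")
  .enableHiveSupport()
  .getOrCreate()

// With Hive support enabled, existing Hive databases should be visible.
spark.sql("show databases").show()
spark.sql("use wx")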
Created 04-08-2019 06:06 AM
.enableHiveSupport()
Enabling this does not help; I still get the error that database "wx" is not found, because I am using Spark Streaming, not Spark SQL. You can test it; it doesn't work.