I use Ambari 2.7 + HDP 3.0; the data source is Kafka. Code:
val conf = new SparkConf().setMaster(masterUrl).setAppName("UserEnterAndLeave")
conf.set("spark.task.maxFailures", "3")
val sc = new SparkContext(conf)
val ssc = new StreamingContext(sc, Seconds(10))
import spark.implicits._
import spark.sql
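For context, this is roughly how the Kafka source for such a job is usually wired up with the spark-streaming-kafka-0-10 integration. This is a sketch, not the original poster's code: the broker address, consumer group id, and topic name below are placeholders.

```scala
// Hedged sketch: Kafka direct stream for the StreamingContext `ssc` above.
// Broker address, group.id, and topic "events" are placeholders.
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "broker1:9092",            // placeholder broker
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "user-enter-leave",                 // placeholder group id
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  PreferConsistent,
  Subscribe[String, String](Seq("events"), kafkaParams) // placeholder topic
)
```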
When the Spark Streaming job starts, it fails with ERROR: database "wx" not found, even though the Hive database "wx" has been created.
@geniusbaibai Starting with HDP 3.0, Spark cannot directly access Hive.
You'd need to use HWC (Hive Warehouse Connector): https://community.hortonworks.com/articles/223626/integrating-apache-hive-with-apache-spark-hive-war...
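For reference, reading a Hive table through HWC looks roughly like this. This is a sketch under assumptions: it requires the HWC jar on the classpath and the connector settings (e.g. `spark.sql.hive.hiveserver2.jdbc.url`) already configured on the cluster, and the table name `events` is a placeholder.

```scala
// Hedged sketch of the HWC (Hive Warehouse Connector) API on HDP 3.0.
// Assumes the HWC jar and its spark-submit configs are in place;
// "events" is a placeholder table name in the "wx" database.
import com.hortonworks.hwc.HiveWarehouseSession

val hive = HiveWarehouseSession.session(spark).build()
hive.setDatabase("wx")                               // select the Hive database
val df = hive.executeQuery("SELECT * FROM events")   // read via HiveServer2/LLAP
df.show()
```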
But HWC does not work with Spark Streaming; I still get "database "wx" not found". I have already tested with HWC. Is there any other way?
HWC only supports queries; inserting into a table did not succeed either. It is not compatible with Spark Streaming, only with Spark SQL.
I think you need to enable Hive support when you create the SparkSession, passing the hive.metastore.uris value from the hive-site.xml of the host where your Hive metastore runs. Then you should be able to see the existing Hive tables from Spark.
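The suggestion above can be sketched as follows. The metastore host and port are placeholders; the real value should come from hive-site.xml.

```scala
// Hedged sketch: SparkSession with Hive support enabled.
// "thrift://metastore-host:9083" is a placeholder; copy the real
// hive.metastore.uris value from your cluster's hive-site.xml.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("UserEnterAndLeave")
  .config("hive.metastore.uris", "thrift://metastore-host:9083")
  .enableHiveSupport()
  .getOrCreate()

// If the metastore is reachable, existing Hive databases should be listed:
spark.sql("SHOW DATABASES").show()
```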
Enabling that does not help; I still get "database "wx" not found", because I am using Spark Streaming, not Spark SQL. You can test it yourself; it doesn't work.