Member since
11-15-2018
4
Posts
0
Kudos Received
0
Solutions
11-21-2018
09:48 AM
Thank you, this works for me. 🙂
11-20-2018
08:38 PM
Thank you for your reply. May I know what the refresh command is? And can I see the table in Hive only after I close the Spark application?
11-16-2018
08:48 AM
Below is my code:

import sqlContext.implicits._
import org.apache.spark.sql

val eBayText = sc.textFile("/user/cloudera/spark/servicesDemo.csv")
val hospitalDataText = sc.textFile("/user/cloudera/spark/servicesDemo.csv")
val header = hospitalDataText.first()
val hospitalData = hospitalDataText.filter(a => a != header)

case class Services(uhid: String, locationid: String, doctorid: String)

val hData = hospitalData.map(_.split(",")).map(p => Services(p(0), p(1), p(2)))
val hosService = hData.toDF()
hosService.write.format("parquet").mode(org.apache.spark.sql.SaveMode.Append).save("/user/hive/warehouse/hosdata")

This code created a 'hosdata' folder at the specified path, which contains the data in Parquet format. But when I went to Hive to check whether the table got created, I was not able to see any table named 'hosdata'. So I ran the commands below:

hosService.write.mode("overwrite").saveAsTable("hosData")
sqlContext.sql("show tables").show

which shows me this result:

+--------------------+-----------+
|           tableName|isTemporary|
+--------------------+-----------+
|             hosdata|      false|
+--------------------+-----------+

But again, when I check in Hive, I cannot see the table 'hosdata'. Could anyone let me know what step I am missing?
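[A hedged sketch, not a confirmed diagnosis: one common cause of this symptom is that Spark, lacking hive-site.xml on its classpath, writes saveAsTable metadata to its own local metastore rather than Hive's, so Hive never learns about the table even though the Parquet files exist. Since the data is already at /user/hive/warehouse/hosdata, one way to make it queryable from Hive is to declare an external table over that directory; the column names below are assumed from the Services case class in the post.]

-- Run in the Hive shell; path and columns taken from the post above.
CREATE EXTERNAL TABLE hosdata (
  uhid STRING,
  locationid STRING,
  doctorid STRING
)
STORED AS PARQUET
LOCATION '/user/hive/warehouse/hosdata';

[An external table only registers metadata over the existing files, so dropping it later would not delete the Parquet data.]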
Labels:
Apache Spark