Archives of Support Questions (Read Only)

This is an archived, read-only board kept for historical reference. Information and links may no longer be available or relevant. To ask a new question, please post a new topic on the appropriate active board.

How do I create an ORC Hive table from Spark?

Expert Contributor

I'm currently using Spark 1.4, and I'm loading some data into a DataFrame via the JDBC data source:

val jdbcDF = sqlContext.load("jdbc", options)

How can I save the jdbcDF DataFrame to a Hive table using the ORC file format?
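One way to do this in Spark 1.4 is through the DataFrameWriter API with the ORC format, which requires a HiveContext. A minimal sketch, assuming `sc` is an existing SparkContext and the JDBC options and table name are illustrative (this needs a running Spark/Hive environment, so it is not directly testable here):

    // Assumes `sc` is an existing SparkContext; connection details are hypothetical.
    import org.apache.spark.sql.SaveMode
    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc)

    val options = Map(
      "url"     -> "jdbc:postgresql://dbhost:5432/mydb", // hypothetical JDBC URL
      "dbtable" -> "source_table"                        // hypothetical source table
    )
    val jdbcDF = hiveContext.load("jdbc", options)

    // Save the DataFrame as an ORC-backed Hive table.
    jdbcDF.write
      .format("orc")
      .mode(SaveMode.Overwrite)
      .saveAsTable("orc_table")

Note that `saveAsTable` registers the table in the Hive metastore only if the HiveContext is actually configured against it (i.e. `hive-site.xml` is on Spark's classpath).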

1 ACCEPTED SOLUTION
12 REPLIES

Expert Contributor

@Brandon Wilson I tried your suggestion; it creates the Hive table, but I get this error:

org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table.

It does not load any data into the table, though. Do you have any idea how to solve this?
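When `saveAsTable` fails with "Unable to alter table", one common workaround is to split the two steps: write the ORC files to an HDFS path first, then register an external Hive table over that location. A hedged sketch, assuming `jdbcDF` from above and with the path and column list purely illustrative (the columns must match the DataFrame's actual schema):

    // Hypothetical workaround: write ORC data to HDFS, then point an
    // external Hive table at it. Path and schema below are illustrative.
    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc)

    // Step 1: write the DataFrame as ORC files to an HDFS path.
    jdbcDF.write.format("orc").save("/tmp/orc_data")

    // Step 2: register an external table over the written files.
    hiveContext.sql(
      """CREATE EXTERNAL TABLE IF NOT EXISTS orc_table (id INT, name STRING)
        |STORED AS ORC
        |LOCATION '/tmp/orc_data'""".stripMargin)

Because the table is external, dropping it later leaves the ORC files in place.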


Divya,

Which user account is used when the DataFrame creates the external Hive table?

And which user account do you use when you try to see the table in Hive? (Did you use the Hive CLI, Beeline, or some ODBC tool?)

Expert Contributor

@vshukla I am logging in as the hdfs user on the HDP 2.3.2 sandbox,

and using the same account to see tables in Hive. Yes, I am using the Hive CLI, and I also browsed the HDFS files through Ambari. I couldn't see any tables created.
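If the tables Spark creates never show up in the Hive CLI, a frequent cause is that Spark is not talking to the same metastore as Hive: when `hive-site.xml` is missing from Spark's classpath, the HiveContext falls back to a local Derby metastore. A small diagnostic sketch, assuming `sc` is an existing SparkContext (this needs a live Spark environment, so it is not directly testable here):

    // Sketch: list the tables Spark's HiveContext can see. If this list
    // differs from SHOW TABLES in the Hive CLI, Spark is likely using a
    // local Derby metastore instead of the sandbox's Hive metastore
    // (check that hive-site.xml is on Spark's classpath).
    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc)
    hiveContext.sql("SHOW TABLES").show()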