HiveContext is not reading the schema of an ORC file
Labels: Apache Hive, Apache Spark
Created 08-02-2016 07:26 PM
When I run the following:
val df1 = sqlContext.read.format("orc").load(myPath)
df1.columns.map(m => println(m))
The columns print as '_col0', '_col1', '_col2', etc., instead of their real names such as 'empno', 'name', and 'deptno'.
When I 'describe mytable' in Hive it prints the column names correctly, but when I run 'orcfiledump' it shows _col0, _col1, _col2 as well. Do I have to specify 'schema on read' or something? If so, how do I do that in Spark/Scala?
hive --orcfiledump /apps/hive/warehouse/mydb.db/mytable1
.....
fieldNames:"_col0"
fieldNames:"_col1"
fieldNames:"_col2"
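In the meantime, the only workaround I've found is a positional rename after the path-based load (a sketch, assuming the column order in the files matches my CREATE TABLE; 'renamed' is just my variable name):

// Rename the positional _colN columns to the names from the CREATE TABLE.
// This assumes the on-disk column order matches the table definition.
val df1 = sqlContext.read.format("orc").load(myPath)
val renamed = df1.toDF("empno", "name", "deptno")
renamed.columns.foreach(println)   // prints: empno, name, deptno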
As suggested elsewhere, I've added '--files' BEFORE '--jars', as follows:
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class xxx.xxxx.MyDriver \
  --files hive-site.xml \
  --jars datanucleus-api-jdo-3.2.6.jar,datanucleus-core-3.2.10.jar,datanucleus-rdbms-3.2.9.jar \
  --name MyDriver \
  --num-executors 1 \
  --driver-memory 1g \
  --executor-memory 1g \
  --executor-cores 1 \
  ./my-utils-1.0-SNAPSHOT.jar
Note: I created the table as follows:
create table mydb.mytable1 (empno int, name VARCHAR(20), deptno int) stored as orc;
Note: This is not a duplicate of this issue (Hadoop ORC file - How it works - How to fetch metadata), because that answer tells me to use Hive, and I am already using HiveContext as follows:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
By the way, I am using my own hive-site.xml, which contains the following:
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://sandbox.hortonworks.com:9083</value>
  </property>
</configuration>
Created 08-02-2016 09:50 PM
Hi @Jay Ch
Great question. You must register your df1 as a temporary table, like this:
val table = sqlContext.read.format("orc").load("/apps/hive/warehouse/yourtable")
table.registerTempTable("yourtable")
and then run:
val tester = sqlContext.sql("select * from yourtable")
tester.columns
You'll get the actual column names.
Created 08-02-2016 11:48 PM
Not sure I understand the answer. I need to run "select * from yourtable" to get the column names populated? Perhaps "select * from yourtable limit 1"? I can try this, but shouldn't the column names be populated from the metastore as soon as I do a 'load'?
Created 08-03-2016 02:21 AM
Tried it, but it doesn't work. The columns are still '_col0', '_col1', '_col2'.
Created 08-03-2016 09:35 PM
I figured out what the problem was. It was the way I was creating the test data. I was under the impression that if I ran the following commands:
create table mydb.mytable1 (empno int, name VARCHAR(20), deptno int) stored as orc;
INSERT INTO mydb.mytable1(empno, name, deptno) VALUES (1, 'EMP1',100);
INSERT INTO mydb.mytable1(empno, name, deptno) VALUES (2, 'EMP2',50);
INSERT INTO mydb.mytable1(empno, name, deptno) VALUES (3, 'EMP3',200);
data would be created in the ORC format at /apps/hive/warehouse/mydb.db/mytable1.
Turns out that's not the case. Even though I specified 'stored as orc', the INSERT statements didn't save the column information. Not sure if that's expected behavior. In any case, it all works now. Apologies for the confusion, but hopefully this will help someone in the future :-)
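As a side note, here is a minimal sketch of producing the same rows from Spark itself (assuming Spark 1.5+ with a HiveContext for ORC support; the output path is just an example). ORC files written by Spark's DataFrame writer carry the DataFrame's column names, so a later path-based load should see them:

import sqlContext.implicits._

// Example output path (not from the original post).
val outPath = "/tmp/mytable1_orc"

// Build the same three rows with real column names and write them as ORC.
val data = Seq((1, "EMP1", 100), (2, "EMP2", 50), (3, "EMP3", 200))
  .toDF("empno", "name", "deptno")
data.write.format("orc").save(outPath)

// A path-based load should now show empno, name, deptno.
sqlContext.read.format("orc").load(outPath).columns.foreach(println)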
Created 01-23-2017 06:42 PM
I have the same issue.
Created 03-01-2017 02:41 PM
Hi,
You should use
val df = hiveContext.read.table("table_name")
instead. Reading the table by name pulls the schema from the metastore rather than from the ORC file footers, so the columns are displayed properly.
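A minimal sketch of the difference, using the table from the original post (expected output taken from what's reported above):

// Loading by path takes the schema from the ORC footers,
// which Hive wrote with internal names:
val byPath = hiveContext.read.format("orc")
  .load("/apps/hive/warehouse/mydb.db/mytable1")
byPath.columns.foreach(println)    // _col0, _col1, _col2

// Reading by table name resolves the schema through the metastore:
val byTable = hiveContext.read.table("mydb.mytable1")
byTable.columns.foreach(println)   // empno, name, deptno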
