
Error creating truck mileage table: org.apache.hive.service.cli.HiveSQLException

New Contributor

org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10004]: Line 3:32 Invalid table alias or column reference 'jun13_miles': (possible column names are: column1, column2, column3

2 REPLIES

New Contributor

I have removed most of the rows and trimmed the CREATE statement:

CREATE TABLE truck_mil STORED AS ORC AS
SELECT truckid, driverid, rdate, miles, gas, miles / gas mpg FROM trucks
LATERAL VIEW stack(54,
'jun13',jun13_miles,jun13_gas,
'may13',may13_miles,may13_gas,
'jan09',jan09_miles,jan09_gas )
dummyalias AS rdate, miles, gas;

Here is the original SQL statement from the web:

CREATE TABLE truck_mil STORED AS ORC AS
SELECT truckid, driverid, rdate, miles, gas, miles / gas mpg FROM trucks
LATERAL VIEW stack(54,
'jun13',jun13_miles,jun13_gas,
'may13',may13_miles,may13_gas,
'jan09',jan09_miles,jan09_gas )
dummyalias AS rdate, miles, gas;

Super Guru
@Siva

In stack, keep all values enclosed in single quotes if they are not integers. Unquoted non-numeric values are treated as column references, which is why Hive reports "Invalid table alias or column reference 'jun13_miles'" when your trucks table only has columns like column1, column2, column3.

I enclosed all the values in single quotes and executed the SELECT query shown below.

hive# select dummyalias.* from (select 0) t LATERAL VIEW stack(3,
'jun13','jun13_miles','jun13_gas',
'may13','may13_miles','may13_gas',
'jan09','jan09_miles','jan09_gas')
dummyalias AS rdate, miles, gas;
+-------------------+-------------------+-----------------+--+
| dummyalias.rdate  | dummyalias.miles  | dummyalias.gas  |
+-------------------+-------------------+-----------------+--+
| jun13             | jun13_miles       | jun13_gas       |
| may13             | may13_miles       | may13_gas       |
| jan09             | jan09_miles       | jan09_gas       |
+-------------------+-------------------+-----------------+--+
3 rows selected (0.106 seconds)

If our stack values are all integers, then we don't need to enclose them in single quotes:

Hive# select tf.* from (select 0) t lateral view stack(2,10,10,10,20) tf as col0,col1;
+-------+-------+--+
| col0  | col1  |
+-------+-------+--+
| 10    | 10    |
| 10    | 20    |
+-------+-------+--+
2 rows selected (0.092 seconds)

As you can see in the above example, all the values are integers, so we haven't enclosed them in single quotes, and we still got results back with 2 columns.

Here is what I tried:

Hive# CREATE TABLE default.truck_mil STORED AS ORC AS
SELECT cast(null as string) truckid, cast(null as string) driverid, rdate, miles, gas, cast(null as string) mpg from (select 0) t LATERAL VIEW stack(3,
'jun13','jun13_miles','jun13_gas',
'may13','may13_miles','may13_gas',
'jan09','jan09_miles','jan09_gas')
dummyalias as rdate, miles, gas;

Stack description (from the Hive LanguageManual UDF):

Return type: T1, ..., Tn/r
Name (signature): stack(int r, T1 V1, ..., Tn/r Vn)
Description: Breaks up n values V1, ..., Vn into r rows. Each row will have n/r columns. r must be a constant.

As you can see in the above stack, I'm using 3 as the row count (stack(3, ...)) and passing 9 values ('jun13' ... 'jan09_gas'), which breaks into 9 values / 3 rows = 3 columns.
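As another small illustration of the same n/r rule (a sketch only, with made-up literal numbers rather than your data), 6 values with r = 2 give 6/2 = 3 columns per row:

hive# select tf.* from (select 0) t LATERAL VIEW stack(2,
'jan09', 100, 10,
'feb09', 200, 20)
tf AS rdate, miles, gas;

This returns 2 rows (jan09 and feb09), each with the 3 columns rdate, miles, and gas.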

once i created the table and tried to select data from the table

Hive# select * from default.truck_mil;
+----------+-----------+--------+--------------+------------+-------+--+
| truckid  | driverid  | rdate  |    miles     |    gas     |  mpg  |
+----------+-----------+--------+--------------+------------+-------+--+
| NULL     | NULL      | jun13  | jun13_miles  | jun13_gas  | NULL  |
| NULL     | NULL      | may13  | may13_miles  | may13_gas  | NULL  |
| NULL     | NULL      | jan09  | jan09_miles  | jan09_gas  | NULL  |
+----------+-----------+--------+--------------+------------+-------+--+
3 rows selected (0.205 seconds)

Enclose your stack values in single quotes and then try to create the table. Take my CREATE TABLE statement above as a reference query.
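For example, applied to the trimmed statement from your question, it would look something like the sketch below (untested against your data; note that with only 3 value groups listed the first stack argument should be 3 rather than 54 per the n/r rule above, that truckid and driverid must actually exist as columns in your trucks table, and that dividing the quoted string literals will simply give a NULL mpg, as in my result above):

CREATE TABLE truck_mil STORED AS ORC AS
SELECT truckid, driverid, rdate, miles, gas, miles / gas mpg FROM trucks
LATERAL VIEW stack(3,
'jun13', 'jun13_miles', 'jun13_gas',
'may13', 'may13_miles', 'may13_gas',
'jan09', 'jan09_miles', 'jan09_gas')
dummyalias AS rdate, miles, gas;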

References:

https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-stack(values)

If the answer helped to resolve your issue, click the Accept button below to accept the answer. That would be a great help to community users looking for a quick solution to these kinds of errors.
