
HIVE : Error while creating table


hive> CREATE EXTERNAL TABLE fix_map
    > (tag MAP<INT, STRING>)
    > ROW FORMAT DELIMITED
    > COLLECTION ITEMS TERMINATED BY '1'
    > FIELDS TERMINATED BY '2'
    > MAP KEYS TERMINATED BY '='
    > LOCATION '/user/pooja/fix/';

NoViableAltException(105@[1704:103: ( tableRowFormatMapKeysIdentifier )?])
    at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
    at org.antlr.runtime.DFA.predict(DFA.java:116)
    at org.apache.hadoop.hive.ql.parse.HiveParser.rowFormatDelimited(HiveParser.java:30427)
    at org.apache.hadoop.hive.ql.parse.HiveParser.tableRowFormat(HiveParser.java:30662)
    at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:4683)
    at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2144)
    at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1398)
    at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1036)
    at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:199)
    at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:404)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:322)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:975)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1040)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
FAILED: ParseException line 5:0 cannot recognize input near 'FIELDS' 'TERMINATED' 'BY' in serde properties specification
hive>

Hi Gurus,

I am facing the above error while creating an external table. Could you please help me resolve the issue?

Regards,

Pooja

6 REPLIES

@Pooja Sahu

Try the below CREATE statement:

CREATE EXTERNAL TABLE fix_map
(tag MAP<INT, STRING>)
ROW FORMAT DELIMITED
COLLECTION ITEMS TERMINATED BY '1'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '2'
MAP KEYS TERMINATED BY '='
LOCATION '/user/pooja/fix/';
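For reference, Hive's ROW FORMAT DELIMITED clause accepts its sub-clauses only in a fixed order (FIELDS, then COLLECTION ITEMS, then MAP KEYS, then LINES), and the clause itself may appear only once, which is what the parser is objecting to. A minimal sketch with the original delimiters reordered to satisfy the grammar would be:

CREATE EXTERNAL TABLE fix_map
(tag MAP<INT, STRING>)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '2'
COLLECTION ITEMS TERMINATED BY '1'
MAP KEYS TERMINATED BY '='
LOCATION '/user/pooja/fix/';

Whether '1' and '2' or the control characters '\002' and '\001' are the right delimiters depends on the data; the working statement later in this thread uses the octal escapes.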


Hi Sindhu,

I am able to create the table with the below statement, but when I select tag it returns only NULL.

CREATE EXTERNAL TABLE fix_map1
(tag MAP<INT, STRING>)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\002'
COLLECTION ITEMS TERMINATED BY '\001'
MAP KEYS TERMINATED BY '='
LOCATION '/user/pooja/fix/';

hive> select tag[10] from fix_map1;
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1480605451535_0011, Tracking URL = http://localhost:8088/proxy/application_1480605451535_0011/
Kill Command = /usr/lib/hadoop-2.2.0/bin/hadoop job -kill job_1480605451535_0011
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2016-12-02 18:45:05,423 Stage-1 map = 0%, reduce = 0%
2016-12-02 18:45:15,600 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 2.11 sec
MapReduce Total cumulative CPU time: 2 seconds 110 msec
Ended Job = job_1480605451535_0011
MapReduce Jobs Launched:
Job 0: Map: 1   Cumulative CPU: 2.11 sec   HDFS Read: 6998 HDFS Write: 75 SUCCESS
Total MapReduce CPU Time Spent: 2 seconds 110 msec
OK
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
NULL
Time taken: 22.676 seconds, Fetched: 25 row(s)
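If every key lookup on the map returns NULL even though the table was created, it is worth checking whether the file on HDFS actually contains the \001/\002 control characters or only a printable stand-in for them. One way to eyeball the raw data without leaving the Hive prompt (the path below is the one used in this thread) is:

hive> dfs -cat /user/pooja/fix/*;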

Regards,

Pooja


Hi Sindhu,

Thanks for the response.

I tried the command but am now getting the below error.

hive> CREATE EXTERNAL TABLE fix_map
    > (tag MAP<INT, STRING>)
    > ROW FORMAT DELIMITED
    > COLLECTION ITEMS TERMINATED BY '1'
    > ROW FORMAT DELIMITED
    > FIELDS TERMINATED BY '2'
    > MAP KEYS TERMINATED BY '='
    > LOCATION '/user/pooja/fix/';
NoViableAltException(217@[1704:103: ( tableRowFormatMapKeysIdentifier )?])
    at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
    at org.antlr.runtime.DFA.predict(DFA.java:116)
    at org.apache.hadoop.hive.ql.parse.HiveParser.rowFormatDelimited(HiveParser.java:30427)
    at org.apache.hadoop.hive.ql.parse.HiveParser.tableRowFormat(HiveParser.java:30662)
    at org.apache.hadoop.hive.ql.parse.HiveParser.createTableStatement(HiveParser.java:4683)
    at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:2144)
    at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1398)
    at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1036)
    at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:199)
    at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:404)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:322)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:975)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1040)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:792)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
FAILED: ParseException line 5:0 cannot recognize input near 'ROW' 'FORMAT' 'DELIMITED' in serde properties specification


@Pooja Sahu Can you share a couple of rows of the data set?


@Jean-Philippe Player

Please find the attachment.

test.txt

Regards,

Pooja


@Pooja Sahu In the source file, the original '\001' character code has been replaced with the string representation "^A". One way to process the file is to convert it back to \001:

-- Read the raw FIX lines as plain strings from the existing location
CREATE EXTERNAL TABLE fix_raw (line string)
ROW FORMAT DELIMITED
LOCATION '/user/pooja/fix/';

-- Target table holding the parsed tag map
CREATE TABLE fix_map (tag MAP<STRING, STRING>)
STORED AS ORC;

-- Convert the literal "^A" text back to \001, then split each line into a map
INSERT INTO TABLE fix_map
SELECT str_to_map(replace(line, '^A', '\001'), '\001', '=') tag FROM fix_raw;

-- query
SELECT tag['49'] FROM fix_map;
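If it helps, the same parsing can be sanity-checked directly against fix_raw before loading the ORC table; this is just an illustrative query built from the functions and tag number already used above:

-- parse a few raw lines on the fly and look up tag 49
SELECT str_to_map(replace(line, '^A', '\001'), '\001', '=')['49'] AS tag_49
FROM fix_raw
LIMIT 5;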