
Hive DDL for complex CSV type


Hi folks,

Could you please help me work out the Hive DDL for the following type of record? The source is a CSV file that contains JSON:

Header ==> field1, field2, field3, field4
Data ====> value1, value2, {'field31':'value31','field32':'value32'}, value4

Thanks!


Re: Hive DDL for complex CSV type

Expert Contributor

Create the table with appropriate delimiters, as shown in the CREATE statement below:

# create table complextype(
    field1 string,
    field2 string,
    field3 struct<field31:string,field32:string>,
    field4 string)
  row format delimited
    fields terminated by ','
    collection items terminated by '$'
  stored as textfile;
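
If the keys inside the nested field are not fixed, a map<string,string> column is another option. A minimal sketch, assuming the same '$' delimiter between entries and a ':' delimiter between each key and its value (the table name complextype_map is only illustrative):

-- map variant: field3 holds entries written as key:value, separated by '$'
# create table complextype_map(
    field1 string,
    field2 string,
    field3 map<string,string>,
    field4 string)
  row format delimited
    fields terminated by ','
    collection items terminated by '$'
    map keys terminated by ':'
  stored as textfile;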

Create the CSV file using the same delimiters.

In the sample CSV data below, fields are terminated by a comma (,) and the members of the complex field are terminated by a dollar sign ($):

# hadoop fs -cat /tmp/complex.csv
value1,value2,value31$value32,value4
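
The question's rows contain the JSON snippet rather than this '$'-delimited form, so they have to be rewritten first. One possible way to do that inside Hive itself is sketched below; the staging table complextype_raw, the path /tmp/original.csv and the regular expressions are assumptions that only cover the exact sample layout shown in the question:

-- staging table: each raw line lands in a single string column
# create table complextype_raw(line string) stored as textfile;
# load data inpath '/tmp/original.csv' into table complextype_raw;

-- pull the plain fields out with regexp_extract and build the struct with named_struct
# insert into table complextype
  select
    regexp_extract(line, '^([^,]*),', 1)                          as field1,
    regexp_extract(line, '^[^,]*,\\s*([^,]*),', 1)                as field2,
    named_struct(
      'field31', regexp_extract(line, "'field31':'([^']*)'", 1),
      'field32', regexp_extract(line, "'field32':'([^']*)'", 1))  as field3,
    regexp_extract(line, ',\\s*([^,]*)$', 1)                      as field4
  from complextype_raw;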

Load and query the data:

# load data inpath '/tmp/complex.csv' into table complextype;
# select * from complextype;
+---------------------+---------------------+--------------------------------------------+---------------------+--+
| complextype.field1  | complextype.field2  |             complextype.field3             | complextype.field4  |
+---------------------+---------------------+--------------------------------------------+---------------------+--+
| value1              | value2              | {"field31":"value31","field32":"value32"}  | value4              |
+---------------------+---------------------+--------------------------------------------+---------------------+--+
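
Individual members of the struct can then be read with dot notation; for the sample row this returns value31 and value32:

# select field3.field31, field3.field32 from complextype;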

If my answer helps solve your problem, please accept it; it might help others in the community.
