
sqoop export after adding new columns to mysql table


Explorer

Until yesterday I had a table in MySQL with two columns, `id` and `name`. The data looked like this:

 

1,Raj
2,Jack

 

I imported this data into HDFS yesterday as a plain file. Today we added a new column called `salary` to the MySQL table, so it now looks like this:

 

1,Raj
2,Jack
3,Jill,2000
4,Nick,3000

 

Now I have done an incremental import on this table, again as a file.
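For reference, an incremental append import along these lines would produce that layout (the connection string, credentials, table name, and paths here are placeholders, not the actual ones):

sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username user --password pass \
  --table employees \
  --target-dir /user/hadoop/employees \
  --incremental append \
  --check-column id \
  --last-value 2

With --incremental append, Sqoop only pulls rows whose `id` is greater than the given --last-value, which is why the new rows land in a separate part file.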

 

The part-m-00000 file contains:

 

1,Raj
2,Jack

 

The part-m-00001 file contains:

 

3,Jill,2000
4,Nick,3000

 

Now I have created a new table in MySQL with the same schema as the original table: columns `id`, `name`, and `salary`.

 

When I do a sqoop export, only the last 2 rows get inserted into the new MySQL table.
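For context, the export was along these lines (again with placeholder connection details and names):

sqoop export \
  --connect jdbc:mysql://dbhost/mydb \
  --username user --password pass \
  --table employees_new \
  --export-dir /user/hadoop/employees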

 

How can I get all the rows inserted into the new table?

 

3 REPLIES

Re: sqoop export after adding new columns to mysql table

Champion

@sanjeev20

 

You may have to alter the table in Hive (maybe using avro-tools, etc.) before you go for the incremental import, because it looks like the Hive table is still referring to the old metadata. If everything is good in Hive, then try the export.
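For example, if the data were a Hive table, the new column could be added with something like this before re-importing (the table name and column type here are assumptions):

hive -e "ALTER TABLE employees ADD COLUMNS (salary INT);"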

 

Your result:

1,Raj
2,Jack
3,Jill,2000
4,Nick,3000

 

Expected result (note the trailing comma after the 2nd column, marking the empty `salary` field):

1,Raj,
2,Jack,
3,Jill,2000
4,Nick,3000

Re: sqoop export after adding new columns to mysql table

Explorer

@saranvisa

 

As mentioned, the import of the MySQL table was done as a plain file, not as a Hive table.

 


Re: sqoop export after adding new columns to mysql table

Champion

I think you got a `Can't parse input data` error when you tried to export to the new table.

 

The only way to get around this error is the following.

Delete the HDFS folder holding the table data you previously imported, including the incremental import.

Perform a fresh import to HDFS. You will now have nulls inside the rows in HDFS.

Then, when you perform the export to the new table, the data will be persisted without any error.

You can also replace the nulls with any string you like, using the options below; a combined sketch of the whole sequence follows them.

 

For string columns:

--null-string '\\N'

For non-string columns:

--null-non-string '\\N'
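Putting it together, a sketch of the whole sequence (paths, connection details, and table names are placeholders). Note that on the export side the matching --input-null-string / --input-null-non-string options tell Sqoop how to read the nulls back:

hadoop fs -rm -r /user/hadoop/employees

sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username user --password pass \
  --table employees \
  --target-dir /user/hadoop/employees \
  --null-string '\\N' \
  --null-non-string '\\N'

sqoop export \
  --connect jdbc:mysql://dbhost/mydb \
  --username user --password pass \
  --table employees_new \
  --export-dir /user/hadoop/employees \
  --input-null-string '\\N' \
  --input-null-non-string '\\N'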