
DataFrame saved to HDFS as Parquet: error when using Sqoop export to write the data back to MySQL


Hello,

I'm working with Spark 1.6.

I read a table from MySQL with Sqoop import and process it with SQLContext as a DataFrame.

I save the result in Parquet format in HDFS and verify it in Hue (screenshot attached: Saved.PNG).

As you can see, Spark creates a _metadata file. But to use Sqoop export to put the data back into MySQL, I need a .metadata directory with the schema (and simply renaming the file doesn't work).
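For context, the export step I'm attempting looks roughly like this. This is a sketch, not the exact command: the host, database, user, table name, and HDFS path are placeholders.

```shell
# Sqoop export sketch -- connection details, table, and path are placeholders.
# Sqoop reads Parquet through the Kite SDK, which expects a .metadata
# directory in the export dir; the _metadata file Spark writes is a
# different thing (Parquet footer summary), so Kite does not recognize it.
sqoop export \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username myuser -P \
  --table mytable \
  --export-dir /user/hive/warehouse/mytable
```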

How can I do it?

Thank you.

