
Where did Sqoop2 put my imported data?

Explorer

Hi folks,

 

With Abe's help (thank you!), I've now got Sqoop talking to SQL Server via JDBC, and the Sqoop job to import a table SEEMS to be working - no errors in the log, and successful job completion shows in the Job Browser.  However, I can't find the data!  The File Browser doesn't show me any new files, and the new directory I asked Sqoop to deposit the data in isn't there.  So... where is my imported data?  Any assistance would be greatly appreciated!
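
If it helps, the same check from a shell on the cluster turns up nothing either.  hadoop fs is the stock HDFS command line; the paths below are just my user directory and the target directory I gave the job, so substitute your own:

# Look for the directory the import job was supposed to create
hadoop fs -ls /user/warren.lastname/Fetch2

# Check the parent directory for anything new
hadoop fs -ls /user/warren.lastname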

 

Thanks,

 

ws

1 ACCEPTED SOLUTION

Explorer

The solution appears to be using hadoop fs -chown / hadoop fs -chmod to adjust the default ownership and permissions in HDFS to something your user can write to.
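
For anyone landing here later: the commands were along these lines, run as the HDFS superuser (shown with sudo; the user name and paths are from my setup, so adjust to yours):

# Create the user's home directory in HDFS if it doesn't already exist
sudo -u hdfs hadoop fs -mkdir /user/warren.lastname

# Hand ownership to the user so jobs running as that user can write there
sudo -u hdfs hadoop fs -chown warren.lastname /user/warren.lastname

# Give it a sane mode (owner can write, everyone else can read)
sudo -u hdfs hadoop fs -chmod 755 /user/warren.lastname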


6 REPLIES

Explorer

Bump 🙂

 

Anyone?  Or is this a really really stupid question?  Don't noobs get one free one?

Expert Contributor

Could you provide Sqoop2 logs with verbose enabled?
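
If you are running it from the Sqoop2 shell rather than Hue, verbose mode can be switched on for the session before starting the job (this is the standard client option, though the exact syntax may vary between versions; sqoop:000> is the shell's prompt):

sqoop:000> set option --name verbose --value true

With that set, the client prints full stack traces and server responses, which should show where the import is actually trying to write.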

Explorer

Thanks for responding, Abe.  New info from the log:

 

Cannot access: /user/warren.lastname/Fetch2. Note: You are a Hue admin but not a HDFS superuser (which is "hdfs").
[Errno 2] File /user/warren.lastname/Fetch2 not found

 

Fetch2 is the name of the directory I asked Sqoop to dump the data into (it did not already exist).

 

So it looks like I need some hdfs permissions, but I can't seem to find where to set them in the Hue interface.  Could you provide some guidance on this?
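
For what it's worth, I gather the ownership and mode can at least be inspected from a shell with the standard HDFS listing commands (the paths here are mine):

# Who owns the home directories, and with what mode?
hadoop fs -ls /user

# Does my home directory exist at all?
hadoop fs -ls /user/warren.lastname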

 

Many thanks!

 

ws

 

Expert Contributor

You can set permissions through Hue via the File Browser. There should be a Chmod/Chown button above the file listing.

Are you logged in as warren.lastname user?

Would you be able to run the job with the "verbose" option from the command line? Or, possibly, provide the Sqoop logs and task logs to see what happened?

Explorer

> You can set permissions through Hue via the file browser. There should be a Chmod/Chown button above the file listing.

 

The directory is never created, so there are no permissions to mess with.


> Are you logged in as warren.lastname user?

 

Yes.


> Would you be able to run the job with the "verbose" option from the command line? Or, possibly, provide the Sqoop logs and task logs to see what happened?

 

I think my previous post identifies the error - lack of HDFS permissions.  This is strange, since this is a stock Cloudera install from a couple of weeks ago and I haven't messed with any of the settings.  If this really is the problem, how do I set HDFS permissions?  I'm an Admin user as warren.lastname, but apparently that isn't quite enough?

 

Thanks,

 

ws
