Created on 06-20-2014 07:07 AM - edited 09-16-2022 02:00 AM
Hi folks,
With Abe's help (thank you!), I've now got sqoop talking to SQL Server via jdbc, and the sqoop job to import a table SEEMS to be working - no errors in the log, successful job completion showing in the Job Browser. However, I can't find the data! The file browser doesn't show me any new files, and the new directory I asked sqoop to deposit the data in isn't there. So... where is my imported data? Any assistance would be greatly appreciated!
Thanks,
ws
Created 06-23-2014 04:37 AM
Bump 🙂
Anyone? Or is this a really really stupid question? Don't noobs get one free one?
Created on 06-23-2014 11:18 AM - edited 06-23-2014 11:20 AM
Could you provide Sqoop2 logs with verbose enabled?
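For reference, in the Sqoop2 client shell that would look something like this (depending on your version the shell is launched with sqoop2-shell or sqoop.sh client, and the job id 1 below is just a placeholder for yours):

$ sqoop2-shell
sqoop:000> set option --name verbose --value true
sqoop:000> start job --jid 1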
Created 06-24-2014 10:41 AM
Thanks for responding, Abe. New info from the log:
Cannot access: /user/warren.lastname/Fetch2. Note: You are a Hue admin but not a HDFS superuser (which is "hdfs").
[Errno 2] File /user/warren.lastname/Fetch2 not found
Fetch2 is the directory name I asked sqoop to dump the data into (it did not exist).
So it looks like I need some hdfs permissions, but I can't seem to find where to set them in the Hue interface. Could you provide some guidance on this?
Many thanks!
ws
Created 06-24-2014 01:35 PM
You can set permissions through Hue via the file browser. There should be a Chmod/Chown button above the file listing.
Are you logged in as warren.lastname user?
Would you be able to run the job with the "verbose" option from the command line? Or, possibly, provide the Sqoop logs and task logs to see what happened?
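You can also check from a terminal whether your home directory exists and who owns it; something along these lines (the path is based on the username in your log):

$ hadoop fs -ls /user
$ hadoop fs -ls /user/warren.lastname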
Created 06-25-2014 06:03 AM
> You can set permissions through Hue via the file browser. There should be a Chmod/Chown button above the file listing.
The directory is never created, so there are no permissions to mess with.
> Are you logged in as warren.lastname user?
Yes.
> Would you be able to run the job with the "verbose" option from the command line? Or, possibly, provide the Sqoop logs and task logs to see what happened?
I think my previous post identifies the error: lack of HDFS permissions. That's strange, since this is a stock Cloudera install from a couple of weeks ago and I haven't touched any of the settings. If this really is the problem, how do I set HDFS permissions? I'm logged in as warren.lastname, a Hue admin, but apparently that isn't quite enough?
Thanks,
ws
Created 07-08-2014 06:03 AM
The solution appears to be using hadoop fs -chown / hadoop fs -chmod to adjust the default permissions in HDFS to something your user can write to.
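For anyone hitting the same thing, here's roughly what that looks like, run as the hdfs superuser (the directory and username below are from my setup, so substitute your own):

$ sudo -u hdfs hadoop fs -mkdir /user/warren.lastname
$ sudo -u hdfs hadoop fs -chown warren.lastname /user/warren.lastname
$ sudo -u hdfs hadoop fs -chmod 755 /user/warren.lastname

Once the home directory is writable, the import can create /user/warren.lastname/Fetch2 on its own.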