Member since: 06-18-2014
Posts: 26
Kudos Received: 0
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3087 | 07-08-2014 06:03 AM |
07-09-2014 12:42 PM
Yup, you're right - sorry, I was looking at the table creation pages, not the sqoop import pages. The SQOOP-777 issue seems to reference column and record delimiters. Is my single-quote problem one of these? I would think not, since a single quote is being inserted at the beginning AND the end of the column value. It's behaving as if it's enclosing the column value in single quotes, rather than acting as a field or record delimiter.
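For anyone hitting the same single-quote behavior: since this import is a single column (one field per line), one workaround is to strip the enclosing quotes while merging the part files. A rough sketch, with made-up directory and file names:

```
# Merge the sqoop2 part files, stripping the leading and trailing single
# quotes from each line, then write the result back into HDFS.
# Paths here are hypothetical.
hadoop fs -mkdir -p /user/warren.lastname/Assets_clean
hadoop fs -cat '/user/warren.lastname/Assets/part-m-*' \
  | sed -e "s/^'//" -e "s/'\$//" \
  | hadoop fs -put - /user/warren.lastname/Assets_clean/assets.txt
```

This only works because the import is a single column; with multiple quoted fields per line you'd need a proper CSV-aware step instead.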
07-09-2014 11:43 AM
Thanks, Abe. I assume I'd have to do this from the sqoop2 command line? I don't see any mapper or loader options in Hue. Another minor snag: the text files sqoop2 created contain the correct data, but strings are single-quoted, and when I import the file(s) the single quotes become part of the data. How would I get around that?
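For reference, the sqoop2 client shell (Sqoop 1.99.x; exact command names vary by version, so treat this as a sketch) is where per-job settings can be edited outside of Hue:

```
# Start the sqoop2 client shell (launcher name depends on packaging:
# sqoop2, or sqoop.sh client) and edit an existing job interactively.
# Host, port, and job id below are placeholders.
sqoop2
sqoop:000> set server --host sqoop2.example.com --port 12000 --webapp sqoop
sqoop:000> show job --all
sqoop:000> update job --jid 1
```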
07-09-2014 11:28 AM
Yup, that did it. Thanks Abe! It created a bunch of files, named part-m-00001, part-m-00002, etc. What I really want is a single file, so I can easily create a table from it. Of course I can hack the thing into one file with the hadoop fs -cat command, but is there an easier way to import the data sqoop2 retrieved?
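For what it's worth, hadoop fs -getmerge does the concatenation in one step; a sketch with made-up paths:

```
# Concatenate every file under the job's output directory into one local
# file, then push the merged file back into HDFS.
hadoop fs -getmerge /user/warren.lastname/Assets assets.txt
hadoop fs -mkdir -p /user/warren.lastname/Assets_merged
hadoop fs -put assets.txt /user/warren.lastname/Assets_merged/
```

Alternatively, a Hive external table pointed at the output directory will read all of the part-m-* files together, so merging may not be needed at all.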
07-09-2014 06:25 AM
Sure, here's the MS SQL create script:

CREATE TABLE [dbo].[Assets](
    [AssetId] [uniqueidentifier] NOT NULL,
    [Value] [nvarchar](max) NULL,
    [LastModifiedDate] [datetime] NOT NULL,
    [LastSyncronizedDate] [datetime] NULL,
    [SyncId] [nvarchar](128) NULL,
    CONSTRAINT [PK_dbo.Assets] PRIMARY KEY CLUSTERED ([AssetId] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
              ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
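For context, a rough Hive-side equivalent of that table (the type mappings are my assumption: uniqueidentifier and nvarchar become STRING, datetime becomes TIMESTAMP):

```
# Hypothetical Hive DDL mirroring the SQL Server schema above;
# table and column names are illustrative only.
hive -e "
CREATE TABLE assets (
  asset_id              STRING,
  asset_value           STRING,
  last_modified_date    TIMESTAMP,
  last_syncronized_date TIMESTAMP,
  sync_id               STRING
)"
```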
07-08-2014 10:43 AM
com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting from a character string to uniqueidentifier

Hi Folks,

I'm attempting to use sqoop2 from Hue to import a single column from one database table in MS SQL Server. That column is defined as nvarchar(max) and contains a JSON document, which starts and ends with curly braces {...}. The sqoop2 job fails with:

com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting from a character string to uniqueidentifier

I suspect it's trying to convert the string value to a UID simply because it starts and ends with curly braces, but hey, I'm just guessing. I see in the sqoop (version 1) docs (Table 3, "Parameters for overriding mapping") that you can override the default conversions using:

| Argument | Description |
|---|---|
| --map-column-java <mapping> | Override mapping from SQL to Java type for configured columns. |
| --map-column-hive <mapping> | Override mapping from SQL to Hive type for configured columns. |

How can I use these overrides in sqoop2 via Hue? Or another way? Or am I off on the wrong path here?

Thanks,
ws
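For comparison, in sqoop (version 1) those overrides are plain command-line arguments, which sqoop2's Hue UI doesn't expose. A hedged sketch (server, database, and credentials are placeholders):

```
# Sqoop 1 syntax: import only the Value column and force it to a Java
# String so the driver never attempts a uniqueidentifier conversion.
# Single mapper since the uniqueidentifier key can't be used for splits.
sqoop import \
  --connect 'jdbc:sqlserver://dbhost:1433;databaseName=MyDb' \
  --username sqoop_user --password '...' \
  --table Assets \
  --columns Value \
  --map-column-java Value=String \
  --num-mappers 1 \
  --target-dir /user/warren.lastname/Assets
```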
Labels:
- Apache Hive
- Apache Sqoop
- Cloudera Hue
07-08-2014 06:03 AM
The solution appears to be using hadoop fs -chown / -chmod to adjust the default permissions in HDFS to something you can write to.
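Concretely, something along these lines (a sketch; run where the cluster's hdfs superuser is available, with the username from this thread):

```
# As the HDFS superuser, create the user's home directory and hand
# ownership to the account submitting the sqoop jobs.
sudo -u hdfs hadoop fs -mkdir -p /user/warren.lastname
sudo -u hdfs hadoop fs -chown warren.lastname /user/warren.lastname
sudo -u hdfs hadoop fs -chmod 755 /user/warren.lastname
```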
06-25-2014 06:03 AM
> You can set permissions through Hue via the file browser. There should be a Chmod/Chown button above the file listing.

The directory is never created, so there are no permissions to mess with.

> Are you logged in as the warren.lastname user?

Yes.

> Would you be able to run the job with the "verbose" option from the command line? Or, possibly, provide the Sqoop logs and task logs to see what happened?

I think my previous post identifies the error: lack of HDFS permissions. This is strange, since this is a stock Cloudera install from a couple of weeks ago and I haven't messed with any of the settings. If this really is the problem, how do I set HDFS permissions? I'm an Admin user as warren.lastname, but apparently that isn't quite enough?

Thanks,
ws
06-24-2014 10:41 AM
Thanks for responding, Abe. New info from the log:

Cannot access: /user/warren.lastname/Fetch2. Note: You are a Hue admin but not a HDFS superuser (which is "hdfs"). [Errno 2] File /user/warren.lastname/Fetch2 not found

Fetch2 is the directory name I asked sqoop to dump the data into (it did not exist). So it looks like I need some HDFS permissions, but I can't seem to find where to set them in the Hue interface. Could you provide some guidance on this?

Many thanks!
ws
06-23-2014 04:37 AM
Bump 🙂 Anyone? Or is this a really really stupid question? Don't noobs get one free one?
06-20-2014 07:07 AM
Hi folks,

With Abe's help (thank you!), I've now got sqoop talking to SQL Server via JDBC, and the sqoop job to import a table SEEMS to be working - no errors in the log, and a successful job completion showing in the Job Browser. However, I can't find the data! The file browser doesn't show me any new files, and the new directory I asked sqoop to deposit the data in isn't there. So... where is my imported data? Any assistance would be greatly appreciated!

Thanks,
ws
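One quick check worth doing from a shell (paths assume the defaults) is to list your HDFS home directory recursively, since sqoop's output normally lands under the submitting user's home directory:

```
# List everything under the user's HDFS home directory to see whether
# the import wrote anything at all. Username taken from this thread.
hadoop fs -ls -R /user/warren.lastname
```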
Labels:
- Apache Sqoop