Member since: 11-04-2019
Posts: 26
Kudos Received: 1
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1804 | 11-25-2019 12:30 AM
11-27-2019 04:29 AM
@EricL I tried -map-column-java Settings=String and the data type came through as string in Hive. The data does not look corrupted, but the hexadecimal representation of the value is not fully retained: the leading 0x prefix is dropped.
Command used:
sqoop import --driver 'com.microsoft.sqlserver.jdbc.SQLServerDriver' --connect <> --connection-manager 'org.apache.sqoop.manager.SQLServerManager' --username <> -password <> --table 'my_table' --as-parquetfile --delete-target-dir --target-dir /user/test/ -map-column-java Settings=String --m 1
Result of the above command:
- Column data type in source: image
- Column value in source: 0xFFffDeDJBF.......dDf (a hexadecimal value)
- Column data type in Hive: string
- Column value in Hive: FFffDeDJBF.......dDf (the 0x prefix is not retained)
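If downstream consumers only miss the literal 0x prefix, one low-tech sketch is to prepend it at read time in Hive. The table and column names below (testc, settings) and the connection URL are placeholders, not the actual ones:

```bash
# Prepend the dropped 0x prefix at query time (names are hypothetical).
beeline -u 'jdbc:hive2://<host>:10000' -e \
  "SELECT concat('0x', settings) AS settings_hex FROM testc LIMIT 10;"
```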
11-27-2019 04:20 AM
@EricL I tried the above. --map-column-hive is not making any impact: the column data type is taken as string in Hive after the import, and the data is still corrupted. Using --hive-import and --hive-table along with it helped in getting the data type as binary in Hive, but the data is still corrupted.
Command used:
sqoop import --driver 'com.microsoft.sqlserver.jdbc.SQLServerDriver' --connect <> --connection-manager 'org.apache.sqoop.manager.SQLServerManager' --username <> -password <> --hive-import --hive-table testc --as-parquetfile --delete-target-dir --target-dir '/user/test/' --query "select * FROM my_table where \$CONDITIONS; " --m 1 --map-column-hive Settings=binary
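To pin down where the corruption happens, one debugging sketch is to checksum the same row's LOB on both sides and compare the digests. The GUID literal, connection details, and table names are placeholders, and Hive's md5() requires Hive 1.3.0/2.0.0 or later:

```bash
# SQL Server side: hash the LOB via sqoop eval (placeholders throughout).
sqoop eval --connect '<jdbc-url>' --username '<user>' -P \
  --query "SELECT HASHBYTES('MD5', CAST(Settings AS varbinary(max))) FROM my_table WHERE GUID = '<some-guid>'"

# Hive side: hash the imported column and compare against the digest above.
beeline -u 'jdbc:hive2://<host>:10000' -e \
  "SELECT md5(settings) FROM testc WHERE guid = '<some-guid>';"
```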
11-26-2019 06:03 AM
@VidyaSargur Is there any Sqoop expert who can throw some light on this issue?
11-25-2019 04:08 AM
@pmohan Does this help with columns having the data type image? Converting the image data type to binary is again leading to data corruption. Please see the issue highlighted here: https://community.cloudera.com/t5/Support-Questions/SQOOP-import-of-quot-image-quot-data-type-into-hive/td-p/283584
11-25-2019 12:30 AM
1 Kudo
Using the --query option helped.
Solution:
sqoop import --connect <> --username <> -password <> --as-parquetfile --delete-target-dir --target-dir /hdfs/dir/ --query "select * from [DB_SCHEMA_NAME].[table_name] where \$CONDITIONS;" --m 1
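This works because square brackets are SQL Server's quoted-identifier syntax, so --query can address names that --table cannot. One caveat, offered as a sketch: if you raise the mapper count above 1 with a free-form query, Sqoop also requires --split-by (the split column below is illustrative, borrowed from the command further down this page):

```bash
# Free-form query with parallel mappers; --split-by is mandatory here.
sqoop import --connect '<jdbc-url>' --username '<user>' -P \
  --as-parquetfile --delete-target-dir --target-dir /hdfs/dir/ \
  --query "select * from [DB_SCHEMA_NAME].[table_name] where \$CONDITIONS" \
  --split-by user_id --m 4
```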
11-21-2019 12:06 AM
I am trying to import a table from MSSQL that has a few columns with the data type "image".
The values in these columns are in binary format (LOBs).
Example:
- Database type: MSSQL
- Column name: Settings
- Data type: image
- Value of the column: (0x1F8B0800000400EDBD07601C499625262FD7E0CC188CDE692EC1D69472329AB2A81CA6501320CCE74A10880604010ED9D....)
When I import using Sqoop, the column is automatically taken as string in Hive, but the data is corrupted.
Command used:
sqoop import --driver 'com.microsoft.sqlserver.jdbc.SQLServerDriver' --connect 'jdbc:sqlserver://IP:PORT;database=DB;' --connection-manager 'org.apache.sqoop.manager.SQLServerManager' --username <> -password <> --as-parquetfile --delete-target-dir --target-dir '/user/test/' --query "select GUID,Name,cast(Settings AS binary) AS Settings FROM my_table_name where \$CONDITIONS; " --m 1
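One likely culprit in the command above, offered as an assumption to verify rather than a confirmed diagnosis: in SQL Server, CAST(... AS binary) with no length defaults to binary(30), which silently truncates a LOB. Casting to varbinary(max) preserves the full value:

```bash
# Same import, but casting the image column to varbinary(max) so the
# LOB is not truncated to 30 bytes (connection details are placeholders).
sqoop import --driver 'com.microsoft.sqlserver.jdbc.SQLServerDriver' \
  --connect 'jdbc:sqlserver://IP:PORT;database=DB;' \
  --connection-manager 'org.apache.sqoop.manager.SQLServerManager' \
  --username '<user>' -P \
  --as-parquetfile --delete-target-dir --target-dir '/user/test/' \
  --query "select GUID, Name, CAST(Settings AS varbinary(max)) AS Settings FROM my_table_name where \$CONDITIONS" \
  --m 1
```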
Labels:
- Apache Sqoop
11-10-2019 11:13 PM
@Khanna Command used:
sqoop import --driver "com.microsoft.sqlserver.jdbc.SQLServerDriver" --connect "jdbc:sqlserver://server:port;database=db_name;" --connection-manager "org.apache.sqoop.manager.SQLServerManager" --username <> -password <> --table 'table_name' --as-parquetfile --delete-target-dir --target-dir /user/test/axe/ --temporary-rootdir /user/test2/ --m 4 --split-by user_id
Before the MapReduce job starts:
hadoop fs -ls -R /user/test/
/user/test/axe
/user/test/axe/.metadata
/user/test/axe/.metadata/descriptor.properties
/user/test/axe/.metadata/schema.avsc
/user/test/axe/.metadata/schemas
/user/test/axe/.metadata/schemas/1.avsc
When the MapReduce job starts:
hadoop fs -ls -R /user/test/
/user/test/.temp
/user/test/.temp/job_1571067970221_0156
/user/test/.temp/job_1571067970221_0156/mr
/user/test/.temp/job_1571067970221_0156/mr/job_1571067970221_0156
/user/test/.temp/job_1571067970221_0156/mr/job_1571067970221_0156/.metadata
/user/test/.temp/job_1571067970221_0156/mr/job_1571067970221_0156/.metadata/descriptor.properties
/user/test/.temp/job_1571067970221_0156/mr/job_1571067970221_0156/.metadata/schema.avsc
/user/test/.temp/job_1571067970221_0156/mr/job_1571067970221_0156/.metadata/schemas
/user/test/.temp/job_1571067970221_0156/mr/job_1571067970221_0156/.metadata/schemas/1.avsc
/user/test/axe
/user/test/axe/.metadata
/user/test/axe/.metadata/descriptor.properties
/user/test/axe/.metadata/schema.avsc
/user/test/axe/.metadata/schemas
/user/test/axe/.metadata/schemas/1.avsc
Once the import is complete:
hadoop fs -ls -R /user/test/
/user/test/axe
/user/test/axe/.metadata
/user/test/axe/.metadata/descriptor.properties
/user/test/axe/.metadata/schema.avsc
/user/test/axe/.metadata/schemas
/user/test/axe/.metadata/schemas/1.avsc
/user/test/axe/.signals
/user/test/axe/.signals/unbounded
/user/test/axe/679dadfc-3657-410b-a805-6c98b8d1720b.parquet
/user/test/axe/923bbf06-35d6-4156-8df3-a6e53ebf00f7.parquet
/user/test/axe/aef24eac-15dd-4ebc-a61a-68265d53a320.parquet
/user/test/axe/f903a079-87b3-48a0-bea6-aa02d92c5aac.parquet
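Since the listings show .temp being created under /user/test/ (the parent of the target) even with --temporary-rootdir set, one workaround sketch, with paths of your own choosing, is to import into an isolated staging directory and move the finished dataset into place:

```bash
# Import into a staging directory whose parent nothing else reads from,
# then move the completed dataset to its final location (paths are examples).
sqoop import --connect '<jdbc-url>' --username '<user>' -P \
  --table 'table_name' --as-parquetfile \
  --delete-target-dir --target-dir /user/staging/axe/ \
  --m 4 --split-by user_id \
&& hadoop fs -rm -r -f /user/test/axe \
&& hadoop fs -mv /user/staging/axe /user/test/axe
```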
11-10-2019 10:53 PM
Sqoop import fails while importing tables that have spaces and special characters in the table name.
Command used:
sqoop import --connect <> --username <> --password <> --table "employee details information" --as-parquetfile --delete-target-dir --target-dir /hdfs/dir/ --m 1
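Per the accepted solution further up this page, the fix that eventually worked was a free-form --query with SQL Server bracket identifiers around the awkward name; a sketch (the dbo schema is an assumption):

```bash
# Quote the space-containing table name with bracket identifiers via --query.
sqoop import --connect '<jdbc-url>' --username '<user>' -P \
  --as-parquetfile --delete-target-dir --target-dir /hdfs/dir/ \
  --query "select * from [dbo].[employee details information] where \$CONDITIONS" \
  --m 1
```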
Labels:
- Apache Sqoop
11-10-2019 10:51 PM
@Khanna I tried it, but the .temp directory is still getting created at the target directory that I specify.
11-05-2019 10:23 PM
@Shelton The target-dir is specified so that the generated files are placed in that directory. The problem I am facing is with the temporary directory (.temp) that gets created at runtime (i.e., when the MapReduce job is initiated) at the target-dir. What I am looking for is a way to change the location of this temporary (.temp) directory.