Member since 12-19-2017 · 8 Posts · 0 Kudos Received · 0 Solutions
01-26-2018
07:29 PM
Hi @Predrag Minovic, I am using Hortonworks 2.5.3. In my Sqoop job, the incremental.last.value in the Sqoop metastore is updated even when the job fails:
18/01/26 14:26:50 ERROR tool.ImportTool: Merge MapReduce job failed!
18/01/26 14:26:50 INFO tool.ImportTool: Saving incremental import state to the metastore
18/01/26 14:26:51 INFO tool.ImportTool: Updated data for job: job1
When I run the same job again, it takes the updated value as the last value.
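If it helps, the saved state can be inspected and reset by hand with the `sqoop job` tool. A minimal sketch, assuming a saved job named `job1`; the connection string, table, column, and directory names below are placeholders:

```shell
# Inspect the job definition stored in the metastore,
# including the current incremental.last.value.
sqoop job --show job1

# If the value advanced past a failed run, one workaround is to
# delete the job and recreate it with the correct --last-value.
sqoop job --delete job1
sqoop job --create job1 -- import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table MYTABLE \
  --incremental lastmodified \
  --check-column UPDATED_AT \
  --last-value "2018-01-25 00:00:00" \
  --merge-key ID \
  --target-dir /data/mytable
```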
12-21-2017
05:55 PM
Hi @Geoffrey Shelton Okot, I am using HDP 2.5.3 and Sqoop 1.4.6. I am trying to import tables from an Oracle DB into HDFS with Sqoop. One of my tables has BLOB data in it. While importing it incrementally I am also using --merge-key with the primary key, but I am getting a NullPointerException: Error: java.lang.NullPointerException
at org.apache.sqoop.lib.BlobRef.writeInternal(BlobRef.java:98)
at org.apache.sqoop.lib.LobRef.write(LobRef.java:307)
at org.apache.sqoop.lib.LobSerializer.writeBlob(LobSerializer.java:38)
at com.cloudera.sqoop.lib.LobSerializer.writeBlob(LobSerializer.java:41)
at QueryResult.write(QueryResult.java:343)
at org.apache.sqoop.mapreduce.MergeRecord.write(MergeRecord.java:124)
I am not sure how to proceed. Please help me resolve the issue. Thanks, Jyothi
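A workaround sometimes suggested for this stack trace is to keep the merge path away from externalized BlobRef objects, either by forcing LOBs inline or by mapping the column to a plain Java type. A sketch under those assumptions; connection string, table, column names, and the column `BLOB_COL` are placeholders:

```shell
# Option 1: raise the inline-LOB threshold (bytes) so BLOBs up to
# that size are stored inline rather than as external LOB files.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table MYTABLE \
  --inline-lob-limit 16777216 \
  --incremental lastmodified \
  --check-column UPDATED_AT \
  --merge-key ID \
  --target-dir /data/mytable

# Option 2: map the BLOB column to a plain Java String so Sqoop
# skips the BlobRef code path entirely (only safe if the BLOB
# content is actually text).
sqoop import ... --map-column-java BLOB_COL=String
```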
12-21-2017
04:17 PM
Hi @Bala Vignesh N V, I have another table in the Oracle DB that has a single-column primary key and also contains BLOB data. When I try a Sqoop import with --merge-key for this table as well, it gives the same NullPointerException. I think the issue is the combination of BLOB data and --merge-key: I am passing the primary-key column as the merge key, but it still throws the exception. I have no clue why. Please help! Thanks, Jyothi
12-20-2017
07:58 PM
Hi @Krishna Srinivas, I am facing the same issue: I have to specify a composite primary key as the merge key, but it gives an error. Can you please explain, with an example, how to achieve the answer you mentioned above? That would help me understand more clearly. Thanks in advance!
12-20-2017
07:54 PM
Hi @Bala Vignesh N V, actually my table in the Oracle DB has a composite primary key, but in --merge-key I am specifying only one column, so it throws the NullPointerException. The problem is that --merge-key will not accept a composite primary key, so I am stuck there. Can you please help if you have any idea how to deal with this? Thanks, Jyothi
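Since --merge-key takes a single column, one possible workaround is to synthesize a single key from the composite columns in a free-form query and merge on that. A sketch, assuming key columns KEY1/KEY2 and placeholder connection details (note that free-form queries require the literal $CONDITIONS token and an explicit --split-by):

```shell
# Derive a single synthetic key (MERGE_ID) from the composite
# primary key, then use it as the merge key.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --query "SELECT t.*, t.KEY1 || '_' || t.KEY2 AS MERGE_ID \
           FROM MYTABLE t WHERE \$CONDITIONS" \
  --split-by KEY1 \
  --incremental lastmodified \
  --check-column UPDATED_AT \
  --merge-key MERGE_ID \
  --target-dir /data/mytable
```

The concatenation separator is only safe if it cannot occur inside the key values themselves.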
12-19-2017
09:24 PM
Hi @Prabhat Ratnala @Geoffrey Shelton Okot @Sandeep Nemuri, I am facing the same issue too, except that I am trying to import the data into HDFS. Please let me know if you found a solution for this. Thanks!
12-19-2017
07:54 PM
Hi Team, I have a table in an Oracle DB with one column holding BLOB data. With Sqoop import, the BLOB content is stored as a binary value in HDFS. I want to see the original content of the BLOB data. How can I do that? Please help. Thanks in advance!
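One way to get at the original bytes is to import into a binary-preserving format such as Avro and then dump records with the avro-tools utility (assumed to be installed). A sketch with placeholder connection details, table, and paths:

```shell
# Import preserving binary fidelity: Avro stores the BLOB
# column as a bytes field rather than an encoded string.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table MYTABLE \
  --as-avrodatafile \
  --target-dir /data/mytable_avro

# Pull one part file locally and inspect records as JSON;
# the BLOB column appears as an escaped byte string.
hdfs dfs -get /data/mytable_avro/part-m-00000.avro .
avro-tools tojson part-m-00000.avro | head
```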
12-19-2017
06:08 PM
Hi @ksuresh @SupportKB, I am trying to import a table with BLOB data from Oracle to HDFS using Sqoop, and I am getting the following error: Error: java.lang.NullPointerException
at org.apache.sqoop.lib.BlobRef.writeInternal(BlobRef.java:98)
at org.apache.sqoop.lib.LobRef.write(LobRef.java:307)
at org.apache.sqoop.lib.LobSerializer.writeBlob(LobSerializer.java:38)
at com.cloudera.sqoop.lib.LobSerializer.writeBlob(LobSerializer.java:41)
at QueryResult.write(QueryResult.java:191)
at org.apache.sqoop.mapreduce.MergeRecord.write(MergeRecord.java:124)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:98)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:82)
Thanks in advance!