Member since: 11-21-2017
Posts: 70
Kudos Received: 5
Solutions: 0
08-13-2018
02:55 PM
I can run this without a sqoop job, and also without the ACCOUNT_CURRENCY_CODE in ('GHS', 'UGX') condition. Surely it's not a permissions issue.
08-13-2018
10:19 AM
Hi @ASIF Khan, have you got a solution for this?
08-04-2018
09:27 PM
Hi @Vinicius Higa Murakami, there are no updates. I am getting full-row duplicates, so I think it's a problem with the Sqoop tool. And I can't make use of merge, because I don't have a PK on the table. Sqoop import with lastmodified is not giving consistent results: sometimes it imports the whole day's records instead of only the records since the last import. Anyway, thanks a lot.
08-04-2018
09:17 PM
Hi, my Sqoop job is as follows:

sqoop job --create KHL_SOE_FINANCIAL_TRAN_JOB -- import \
  --options-file '/home/hdfs/sqoopimport/DBConnections/connectionDetails.txt' \
  --password-file 'hdfs://ssehdp101.metmom.mmih.biz:8020/passwd/psw.txt' \
  --query "select * from ad.soe_financial_tran where ACCOUNT_CURRENCY_CODE in ('GHS', 'UGX') and \$CONDITIONS" \
  -m 1 \
  --incremental lastmodified --check-column POSTED_AT --last-value '2018-08-02 21:11:31.759' \
  --hcatalog-home /usr/hdp/current/hive-webhcat \
  --hcatalog-database SNDPD --hcatalog-table SND_SOE_FINANCIAL_TRAN \
  --hcatalog-storage-stanza 'stored as orcfile'

If I run this job, I get the following error:

ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
at org.apache.hadoop.fs.FileSystem.fixRelativePart(FileSystem.java:2254)
at org.apache.hadoop.hdfs.DistributedFileSystem.fixRelativePart(DistributedFileSystem.java:2512)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1437)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1447)
at org.apache.sqoop.tool.ImportTool.initIncrementalConstraints(ImportTool.java:320)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:498)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:243)
at org.apache.sqoop.tool.JobTool.run(JobTool.java:298)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)

The connection details are correct; if I run the same import without a sqoop job, I can import fine. Thanks in advance.
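Editor's note: one hedged guess at this NullPointerException (not confirmed from the trace alone) is corrupt or incomplete saved-job state, since the failure is in Sqoop's incremental-constraint check of a filesystem path before the import even starts. A quick way to narrow it down is to inspect what the saved job actually stored, and recreate it if the state looks wrong:

```shell
# Inspect the saved job definition; look for odd or missing values,
# especially incremental.last.value and any directory-related properties.
sqoop job --show KHL_SOE_FINANCIAL_TRAN_JOB

# If the saved state looks corrupt, drop the job and recreate it
# with the original --create command.
sqoop job --delete KHL_SOE_FINANCIAL_TRAN_JOB
```

Note also that the Sqoop user guide lists several options as unsupported for HCatalog jobs (e.g. --target-dir, --append), and lastmodified-mode incremental imports combined with --hcatalog-table are a known rough edge, so the combination itself may be the trigger here.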
Labels:
- Apache Hadoop
- Apache HCatalog
- Apache Sqoop
07-30-2018
06:43 PM
Hi @Vinicius Higa Murakami, thanks for the response. The lastmodified --last-value is taken from the sqoop job's saved state only; if I give it manually there won't be any issue. In my source DB, new records are added at 2018-07-29 01:20:08 and my sqoop import runs at 2018-07-29 15:10:08.234980. The next source load is at 2018-07-30 01:30:08 and my sqoop import runs at 2018-07-30 14:10:08.234980; this time it imports the 2018-07-29 source records as well as the 2018-07-30 records. And it's not every time: sometimes it imports only the 2018-07-30 records. My import statement is as follows:

sqoop job --create PACKAGE_EVENT_AUTOBOOST_SETUP_AMOUNT_JOB -- import \
  --options-file '/home/hdfs/sqoopimport/DBConnections/connectionDetails.txt' \
  --password-file 'hdfs://ssehdp101.metmom.mmih.biz:8020/passwd/psw.txt' \
  --table REPORT.PACKAGE_EVENT_AUTOBOOST \
  --incremental lastmodified --check-column LOAD_AT -m 1 \
  --hcatalog-home /usr/hdp/current/hive-webhcat \
  --hcatalog-database SNDPD --hcatalog-table report_PACKAGE \
  --hcatalog-storage-stanza 'stored as orcfile'

sqoop job --exec PACKGE_EVENT_AUTOBOOST_SETUP_AMOUNT_JOB
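Editor's note: lastmodified mode re-imports every row whose check column is greater than or equal to the saved last-value, and the saved value is based on when the job ran, so a stale or early saved value will re-pull the previous load's rows. One way to see which case applies (a sketch; it assumes the job name above) is to compare the saved value against the LOAD_AT times of the duplicated rows:

```shell
# Print the saved job definition and pull out the stored last-value
# (sqoop job --show lists properties such as incremental.last.value).
sqoop job --show PACKAGE_EVENT_AUTOBOOST_SETUP_AMOUNT_JOB | grep -i last
```

If the saved value predates the previous source load, the re-import of that load is expected behavior rather than a bug.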
07-23-2018
08:14 PM
Hi, I'm importing tables from DB2 to HCatalog with the lastmodified option, in ORC format. Sometimes I am getting duplicate records: some tables import properly, but some tables get duplicates. What might be the problem? Thank you.
Labels:
- Apache HCatalog
- Apache Hive
- Apache Sqoop
07-03-2018
02:34 PM
How do I run a sqoop job? My sqoop job name is Inc_dat; how do I run it using Oozie?
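Editor's note: from the command line, a saved job is run with `sqoop job --exec`. A minimal sketch, assuming the job name Inc_dat from the question:

```shell
# Confirm the saved job exists, then run it by name.
sqoop job --list
sqoop job --exec Inc_dat
```

Under Oozie, a sqoop action can carry the same arguments (`job --exec Inc_dat` in the action's command). One caveat worth hedging on: saved jobs live in the Sqoop metastore, which by default is local to the user's home directory, and Oozie launchers run on arbitrary cluster nodes, so a shared metastore reachable from those nodes is typically needed for this to work.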
06-26-2018
11:45 AM
Hi @Anjali Shevadkar, I am also facing the same problem; if you find any solution, please share it.
06-04-2018
09:56 AM
Have you done this? I'm also trying to do the same. If you have done it, please help me.
05-09-2018
03:08 PM
How do I import to Hive? If I import directly to Hive, I get the following error; I can import to HDFS without problems.

Error: Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

Sqoop version is 1.4.6. Does sqoop import to Hive work through Oozie?
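Editor's note: a frequent cause of SqoopMain exiting with code 1 on a Hive import under Oozie (an assumption here; the launcher log would confirm) is that the action cannot see hive-site.xml or the Hive jars. A sketch of the usual checks, with hypothetical HDFS paths that would need adjusting to the actual workflow application directory:

```shell
# Hypothetical paths: adjust to your workflow's HDFS application directory.
# Make hive-site.xml available to the Sqoop action's classpath.
hdfs dfs -put -f /etc/hive/conf/hive-site.xml /user/hdfs/apps/sqoop_wf/lib/

# Verify the Oozie sharelib actually contains the sqoop/hive jars.
oozie admin -shareliblist sqoop
```

If the sharelib is missing the Hive jars, the full launcher log (not just the exit code) usually shows a ClassNotFoundException that identifies which ones.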