Member since: 11-21-2017
Posts: 70
Kudos Received: 5
Solutions: 0
03-29-2019
01:51 PM
Hi, when I run the following command I am getting an EOFException:
hadoop distcp hdfs://ssehdp101.biz:8020/user/radasar/test.txt hdfs://10.2.27.50:8020/test
Is there any issue in migrating data from HDP 2.6 to 3.1?
Invalid arguments: End of File Exception between local host is: "ssehdp101.metmom.mmih.biz/10.1.18.26"; destination host is: "10.248.27.50":8020; : java.io.EOFException; For more details see: http://wiki.apache.org/hadoop/EOFException usage: distcp OPTIONS [source_path...] <target_path>
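A note for anyone hitting the same thing: an EOFException from distcp between clusters on different major versions is often an RPC incompatibility, and the usual workaround is to read the source over WebHDFS instead of the hdfs:// RPC port. A minimal sketch, assuming WebHDFS is enabled on the default HTTP port 50070 on the source cluster and the command is run from the newer (destination) cluster; the paths simply reuse the ones above:
hadoop distcp webhdfs://ssehdp101.biz:50070/user/radasar/test.txt hdfs://10.2.27.50:8020/test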
09-19-2018
12:07 PM
Hi @bhagan, my metastore is the default one; I didn't configure anything, as it came with the Sqoop installation in the cluster, and my cluster is not the Sandbox. What would the JDBC string be in that case?
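For reference, if the shared Sqoop metastore service is running (started with sqoop metastore), the connect string is normally the HSQLDB URL pointing at it; 16000 is the default metastore port and the hostname below is only a placeholder:
sqoop job --list --meta-connect jdbc:hsqldb:hsql://<metastore-host>:16000/sqoop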
08-13-2018
02:55 PM
I am able to run this without a saved sqoop job, and also without the ACCOUNT_CURRENCY_CODE in ('GHS', 'UGX') condition, so it is surely not a permissions issue.
08-13-2018
10:19 AM
Hi @ASIF Khan, have you found a solution for this?
08-07-2018
02:08 PM
Hi @simran kaur, I also want to run my sqoop jobs in Oozie. Can you help me get this done? What will the --meta-connect string be?
08-04-2018
09:27 PM
Hi @Vinicius Higa Murakami, there are no updates; I am getting full-row duplicates, so I think it is a problem with the sqoop tool. And I don't have a PK in the table to make use of merge. Sqoop import with lastmodified is not giving consistent results; sometimes it imports the whole day's records instead of only the records since the last import. Anyway, thanks a lot.
08-04-2018
09:17 PM
Hi, my Sqoop job is as follows:
sqoop job --create KHL_SOE_FINANCIAL_TRAN_JOB -- import --options-file '/home/hdfs/sqoopimport/DBConnections/connectionDetails.txt' --password-file 'hdfs://ssehdp101.metmom.mmih.biz:8020/passwd/psw.txt' --query "select * from ad.soe_financial_tran where ACCOUNT_CURRENCY_CODE in ('GHS', 'UGX') and \$CONDITIONS" -m 1 --incremental lastmodified --check-column POSTED_AT --last-value '2018-08-02 21:11:31.759' --hcatalog-home /usr/hdp/current/hive-webhcat --hcatalog-database SNDPD --hcatalog-table SND_SOE_FINANCIAL_TRAN --hcatalog-storage-stanza 'stored as orcfile'
If I run this job I get the following error:
ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
at org.apache.hadoop.fs.FileSystem.fixRelativePart(FileSystem.java:2254)
at org.apache.hadoop.hdfs.DistributedFileSystem.fixRelativePart(DistributedFileSystem.java:2512)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1437)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1447)
at org.apache.sqoop.tool.ImportTool.initIncrementalConstraints(ImportTool.java:320)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:498)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:243)
at org.apache.sqoop.tool.JobTool.run(JobTool.java:298)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Connection details are correct; if I run the same import without a saved sqoop job, I am able to import. Thanks in advance.
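One possible explanation, offered only as a guess: with --incremental lastmodified, Sqoop wants an HDFS output directory it can inspect (and later merge into), and with a free-form --query plus HCatalog there is no such directory, so the path it checks ends up null and the saved job dies with this NullPointerException; as far as I can tell, later Sqoop releases reject lastmodified together with --hcatalog-table with an explicit error. A sketch of a workaround under that assumption: land the increment in a plain HDFS staging directory and load it into the HCatalog/ORC table as a separate step. The staging path is made up:
sqoop job --create KHL_SOE_FINANCIAL_TRAN_JOB -- import \
  --options-file '/home/hdfs/sqoopimport/DBConnections/connectionDetails.txt' \
  --password-file 'hdfs://ssehdp101.metmom.mmih.biz:8020/passwd/psw.txt' \
  --query "select * from ad.soe_financial_tran where ACCOUNT_CURRENCY_CODE in ('GHS', 'UGX') and \$CONDITIONS" \
  -m 1 --incremental lastmodified --check-column POSTED_AT --last-value '2018-08-02 21:11:31.759' \
  --target-dir /staging/khl_soe_financial_tran --append
# --append only adds new files to the staging directory; loading/deduplicating into the ORC table happens downstream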
07-30-2018
06:43 PM
Hi @Vinicius Higa Murakami, thanks for the response. The --last-value is taken from the Sqoop job itself; if I supply it manually there is no issue. In my source DB new records are loaded at 2018-07-29 01:20:08 and my sqoop import runs at 2018-07-29 15:10:08.234980; the next source load is at 2018-07-30 01:30:08 and my sqoop import runs at 2018-07-30 14:10:08.234980. In that run it imports the 2018-07-29 source records as well as the 2018-07-30 records, and it is not consistent; sometimes it imports only the 2018-07-30 records. My import statement is as follows:
sqoop job --create PACKAGE_EVENT_AUTOBOOST_SETUP_AMOUNT_JOB -- import --options-file '/home/hdfs/sqoopimport/DBConnections/connectionDetails.txt' --password-file 'hdfs://ssehdp101.metmom.mmih.biz:8020/passwd/psw.txt' --table REPORT.PACKAGE_EVENT_AUTOBOOST --incremental lastmodified --check-column LOAD_AT -m 1 --hcatalog-home /usr/hdp/current/hive-webhcat --hcatalog-database SNDPD --hcatalog-table report_PACKAGE --hcatalog-storage-stanza 'stored as orcfile'
sqoop job --exec PACKGE_EVENT_AUTOBOOST_SETUP_AMOUNT_JOB
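When the behaviour differs between manual runs and the saved job, it can help to check what last-value the job has actually stored before each run; a quick check, assuming the job lives in the default local metastore:
sqoop job --show PACKAGE_EVENT_AUTOBOOST_SETUP_AMOUNT_JOB | grep -i incremental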
07-23-2018
08:14 PM
Hi, I am importing tables from DB2 to HCatalog with the lastmodified option in ORC format. Sometimes I am getting duplicate records; some tables import properly, but some tables get duplicates. What might be the problem? Thank you.
07-03-2018
02:34 PM
How do I run a sqoop job? My sqoop job name is Inc_dat; how do I run this using Oozie?
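For what it's worth, the usual pattern is to give the Oozie sqoop action the command "job --exec Inc_dat" (plus --meta-connect if the job lives in a shared metastore) and then submit the workflow from the shell; a minimal sketch, with the Oozie host as a placeholder:
# submit the workflow that contains the sqoop action
oozie job -oozie http://<oozie-host>:11000/oozie -config job.properties -run
# check its status afterwards
oozie job -oozie http://<oozie-host>:11000/oozie -info <job-id>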
06-26-2018
11:45 AM
Hi @Anjali Shevadkar, I am also facing the same problem; if you find any solution, please share.
06-04-2018
09:56 AM
Have you done this? I am also trying to do it. If you have, please help me with it.
05-09-2018
03:08 PM
How do I import to Hive? If I import directly to Hive I get the following error (I am able to import to HDFS):
Error: Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
The Sqoop version is 1.4.6; does sqoop import into Hive work through Oozie?
04-20-2018
10:52 AM
Hi @schhabra, thanks for the response, and good day. Details are as attached.
job.properties:
nameNode=hdfs://ssehdp101.biz:8020
jobTracker=ssehdp102.biz:8050
queueName=Process
examplesRoot=ravi
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib/lib_20170922104734/
oozie.wf.rerun.failnodes=true
oozie.wf.application.path=${nameNode}/user/${user.name}/ravi/workflow.xml
oozie.action.sharelib.for.sqoop=sqoop,hive
Attachments: resourcemanager-logs.txt, wfl-logs.txt
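Since oozie.action.sharelib.for.sqoop=sqoop,hive is set here, one thing worth verifying is that the Oozie sharelib really contains the sqoop and hive jars (plus the JDBC driver the action needs); a quick check from the shell, with the Oozie URL as a placeholder:
oozie admin -oozie http://<oozie-host>:11000/oozie -shareliblist
oozie admin -oozie http://<oozie-host>:11000/oozie -shareliblist sqoop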
04-17-2018
07:30 AM
Hi Kuldeep, good day. Details are as attached: workflow.xml, wfl-logs.txt, resourcemanager-logs.txt.
job.properties:
nameNode=hdfs://ssehdp101.biz:8020
jobTracker=ssehdp102.biz:8050
queueName=Process
examplesRoot=ravi
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib/lib_20170922104734/
oozie.wf.rerun.failnodes=true
oozie.wf.application.path=${nameNode}/user/${user.name}/ravi/workflow.xml
oozie.action.sharelib.for.sqoop=sqoop,hive
04-16-2018
12:59 PM
While running a Sqoop action in Oozie I am getting the following error: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
What may be the reason? When I check in the Resource Manager it shows Success.
04-16-2018
11:28 AM
Hi Simran, I am also facing the same problem; do you have a solution for this?
04-16-2018
11:03 AM
Hi Salvator, I am facing the same problem; did you find any solution for this?
02-28-2018
02:40 PM
I have an HDP cluster and now I want to install HDF as well. Can I do it on the HDP cluster, or do I need a separate cluster?
02-27-2018
11:44 AM
Hi @Harald Berghoff, I am using crontab only for scheduling the jobs. I tried your way as well, but it prompts for a password. How do I supply the password from a separate script, and how do I handle errors? If you don't mind, could I have a script capable of handling errors and security?
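On the password prompt: the cleanest way to avoid embedding a password in a script is usually key-based authentication, after which sftp run from cron no longer prompts at all. A sketch, assuming you can log in to the SFTP host once interactively and that the remote side allows public-key authentication; the host name is a placeholder:
# generate a key pair once (no passphrase, so cron can use it unattended)
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa
# copy the public key to the SFTP server (prompts for the password this one time)
ssh-copy-id ayosftpuser@<sftp-host>
# after this, sftp from scripts and cron connects without a password
sftp ayosftpuser@<sftp-host>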
02-27-2018
07:48 AM
Hi @Bala Vignesh N V, thanks for the solution; I need to implement it. In the case of shell, how do I do that? With manual interaction I can get the files, but I want to automate this. My manual process is as follows:
step 1: sftp ayosftpuser@IPaddress (enter password)
step 2: cd /sourcedir
step 3: in the above directory a new directory is created every day, and some files are dropped into it: get -Pr 2018-02-26 then bye
step 4: hadoop fs -put -f 2018-02-26 /destination
I need to automate this.
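A sketch of how those four manual steps could be automated in one cron-driven script, assuming key-based login is already set up (so no password prompt) and that the daily directory is named after today's date; the host and local landing path are placeholders:
#!/bin/bash
# fetch today's SFTP drop directory and push it to HDFS
set -e
DAY=$(date +%F)            # e.g. 2018-02-26
LOCAL_DIR=/data/landing
cd "$LOCAL_DIR"

# non-interactive sftp: the same commands typed manually, fed as a batch on stdin
sftp -b - ayosftpuser@<sftp-host> <<EOF
cd /sourcedir
get -Pr $DAY
bye
EOF

# load the downloaded directory into HDFS, overwriting if it already exists
hadoop fs -put -f "$DAY" /destination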
02-27-2018
07:05 AM
Hi @Harald Berghoff, thanks for the solution. NiFi is not in my cluster, so I have to do this using shell only; otherwise I could go for Flume. In the case of shell, how do I do that? With manual interaction I can get the files, but I want to automate this. My manual process is as follows:
step 1: sftp ayosftpuser@IPaddress (enter password)
step 2: cd /sourcedir
step 3: in the above directory a new directory is created every day, and some files are dropped into it: get -Pr 2018-02-26 then bye
step 4: hadoop fs -put -f 2018-02-26 /destination
I need to automate this.
02-20-2018
11:29 AM
1 Kudo
I want to get FTP files into HDFS. On the FTP server, files are created in a date directory for every day, and I need to automate this job. What would be the best way to do this?
02-15-2018
01:52 PM
I am doing an incremental import from DB2 to HDFS (based on a last-modified column). I tried to import directly to Hive, but it throws an error that Hive does not support incremental import, so I tried HDFS instead. I have used --merge-key, but I am getting duplicates and I don't see any difference with and without the --merge-key option. Is there any option for incremental importing without duplicates?
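In case it helps anyone later: --merge-key only deduplicates when the lastmodified import finds an existing target directory to merge into, and the key must uniquely identify a row. When that does not work out, the standalone merge tool can be run over the old and new directories afterwards; a sketch only, with every path, class name and key column being a placeholder rather than something from this job:
sqoop merge \
  --new-data /staging/mytable_incr \
  --onto /warehouse/mytable \
  --target-dir /warehouse/mytable_merged \
  --jar-file /tmp/sqoop-gen/mytable.jar \
  --class-name mytable \
  --merge-key id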
01-24-2018
08:23 AM
We import the updated row, but we already imported that row in an earlier import, so now we have both rows. How can we avoid this?
01-23-2018
12:28 PM
@ssharma Thank you, I will try this.
01-23-2018
11:33 AM
I have an Ambari 2.5.2 cluster and I didn't find any Oozie view in it. Is it possible to create an Oozie view in Ambari? If so, what are the steps?
01-22-2018
10:21 AM
My HBase is going down with the following error:
Connection failed: [Errno 111] Connection refused to ssehdp101.metmom.mmih.biz:16000
My sqoop jobs are also not running; when I run the sqoop job --list command I get the following error:
java.sql.SQLException: error in script file line: 904 org.hsqldb.HsqlException: Violation of unique constraint SQOOP_SESSIONS_UNQ: duplicate value(s) for column(s) JOB_NAME,PROPNAME,PROPCLASS
What is the problem?
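That HSQLDB error usually means the file-based Sqoop metastore script has ended up with a duplicate row, so every command that replays it fails. A hedged sketch of how one might inspect it, assuming the default private metastore location under the home directory of the user running sqoop; take a backup before touching anything:
# back up the metastore files first
cp -a ~/.sqoop ~/.sqoop.bak
# look at the line the exception points to (line 904) and its neighbours
sed -n '900,910p' ~/.sqoop/metastore.db.script
# duplicate SQOOP_SESSIONS inserts around that line can then be removed by hand,
# after which sqoop job --list should replay the script cleanly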