Member since: 11-21-2017
Posts: 70
Kudos Received: 5
Solutions: 0
07-03-2018 02:34 PM
How to run a Sqoop job? My Sqoop job name is Inc_dat; how do I run it using Oozie?
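A minimal sketch of one way to do this, assuming the saved job Inc_dat already exists in the Sqoop metastore and that the Oozie workflow wraps it in a Sqoop action whose command is "job --exec Inc_dat"; host names and file paths below are placeholders:

# Verify the saved job exists and runs from the command line first
sqoop job --list
sqoop job --exec Inc_dat

# Then submit the Oozie workflow that wraps the same command in a Sqoop action;
# job.properties points at the workflow.xml containing that action
oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run

If the job is stored in a shared metastore, the action's command may also need a --meta-connect URL so the Oozie launcher can find it.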
05-09-2018 03:08 PM
How to import to Hive? If I import directly to Hive I get the following error; I am able to import to HDFS. Error: Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]. Sqoop version is 1.4.6. Does Sqoop import into Hive work through Oozie?
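For reference, a plain command-line Hive import looks roughly like the sketch below; the connection string, credentials and table name are placeholders. When the same command runs inside an Oozie Sqoop action, hive-site.xml and the Hive client jars usually also have to be visible to the action (for example via the Oozie sharelib or the workflow's lib directory), otherwise SqoopMain tends to exit with code 1.

# Placeholder connection details; replace with your own
sqoop import \
  --connect jdbc:mysql://db-host/sales \
  --username sqoop_user -P \
  --table orders \
  --hive-import \
  --hive-table orders \
  -m 1
# -P prompts for the database password; use --password-file for non-interactive runs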
04-16-2018 11:28 AM
Hi Simran, I am also facing the same problem. Is there a solution for this?
04-16-2018 11:03 AM
Hi Salvator, I am facing the same problem. Did you find any solution for this?
02-28-2018 02:40 PM
I have an HDP cluster and now I want to install HDF as well. Can I install it on the HDP cluster, or do I need a separate cluster?
02-27-2018 11:44 AM
Hi @Harald Berghoff, I am using crontab only for scheduling the jobs. I tried your way as well, but it prompts for a password. How do I supply the password from a separate script, and how do I handle errors? If you don't mind, could I have a script that properly handles errors and security?
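A minimal sketch of one common approach, assuming key-based SSH authentication can be set up for the SFTP account so that no password prompt appears; host names, key paths and the schedule are placeholders:

# One-time setup: generate a key pair and install the public key on the SFTP server
ssh-keygen -t rsa -b 4096 -f ~/.ssh/sftp_key -N ""
ssh-copy-id -i ~/.ssh/sftp_key.pub ayosftpuser@sftp-host

# Crontab entry: run the transfer script every day at 01:00 and keep a log for troubleshooting
# 0 1 * * * /opt/scripts/sftp_to_hdfs.sh >> /var/log/sftp_to_hdfs.log 2>&1

Inside the script itself, set -euo pipefail makes it stop on the first failing step, so the cron log shows exactly where a run went wrong.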
02-27-2018 07:48 AM
Hi @Bala Vignesh N V, thanks for the solution. I need to implement it; in the case of shell, how do I do that? I am getting the files through manual interaction, but I want to automate this. My manual process is generally as follows:
step 1: sftp ayosftpuser@IPaddress (enter the password)
step 2: cd /sourcedir
step 3: in the above directory a new directory is created every day and some files are dropped into it, so: get -Pr 2018-02-26, then bye
step 4: hadoop fs -put -f 2018-02-26 /destination
I need to automate this.
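A minimal sketch of a shell script that automates these four steps, assuming key-based authentication is already in place for ayosftpuser (otherwise sftp will still prompt for a password); the host name, staging directory and HDFS destination are placeholders:

#!/bin/bash
# Pull yesterday's dated directory from the SFTP server and push it to HDFS
set -euo pipefail

SFTP_USER="ayosftpuser"
SFTP_HOST="sftp-host"              # replace with the real host/IP
DAY=$(date -d "yesterday" +%F)     # e.g. 2018-02-26
LOCAL_DIR="/tmp/sftp_staging"
HDFS_DEST="/destination"

mkdir -p "${LOCAL_DIR}"
cd "${LOCAL_DIR}"

# Steps 1-3: fetch the dated directory recursively in batch mode
sftp -b - "${SFTP_USER}@${SFTP_HOST}" <<EOF
cd /sourcedir
get -Pr ${DAY}
bye
EOF

# Step 4: push to HDFS, overwriting an earlier partial copy if present
hadoop fs -put -f "${DAY}" "${HDFS_DEST}"

# Clean up the local staging copy only after the HDFS put succeeded
rm -rf "${LOCAL_DIR:?}/${DAY}"
echo "$(date): ${DAY} copied to ${HDFS_DEST}"

Scheduling this script from crontab (or an Oozie shell action) then removes the manual interaction entirely.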
02-27-2018 07:05 AM
Hi @Harald Berghoff, thanks for the solution. NiFi is not in my cluster, so I have to do it using shell only; otherwise I could go for Flume. In the case of shell, how do I do that? I am getting the files through manual interaction, but I want to automate this. My manual process is generally as follows:
step 1: sftp ayosftpuser@IPaddress (enter the password)
step 2: cd /sourcedir
step 3: in the above directory a new directory is created every day and some files are dropped into it, so: get -Pr 2018-02-26, then bye
step 4: hadoop fs -put -f 2018-02-26 /destination
I need to automate this.
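One shell option is to drive the fetch from a small batch file passed to sftp -b; a hedged variant sketch, with the host name and batch file path as placeholders and the dated directory substituted each day:

# Build the day's batch file (hypothetical path /opt/scripts/daily.batch)
DAY=$(date -d "yesterday" +%F)
printf 'cd /sourcedir\nget -Pr %s\nbye\n' "${DAY}" > /opt/scripts/daily.batch

# Fetch, then push to HDFS only if the fetch succeeded
sftp -b /opt/scripts/daily.batch ayosftpuser@sftp-host \
  && hadoop fs -put -f "${DAY}" /destination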