Member since: 03-28-2016
Posts: 99
Kudos Received: 9
Solutions: 7

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2396 | 05-08-2018 06:39 AM
 | 1301 | 04-27-2018 10:12 AM
 | 3213 | 09-11-2017 01:07 PM
 | 22488 | 03-14-2017 10:00 AM
 | 6222 | 02-10-2017 08:40 AM
03-14-2017 04:58 AM
This article will help you implement Kerberos and add a Kerberos principal: https://sqoop.apache.org/docs/1.99.7/security/AuthenticationAndAuthorization.html
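As a minimal sketch of the principal-creation step on an MIT KDC (the host name, realm, and keytab path below are placeholders, not values from the linked article):

```bash
# Create a service principal for the Sqoop server (host and realm are placeholders).
kadmin.local -q "addprinc -randkey sqoop/host.example.com@EXAMPLE.COM"

# Export its key to a keytab the Sqoop server can read.
kadmin.local -q "xst -k /etc/security/keytabs/sqoop.service.keytab sqoop/host.example.com@EXAMPLE.COM"

# Verify the keytab contents.
klist -kt /etc/security/keytabs/sqoop.service.keytab
```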
03-10-2017 05:43 AM
I don't think that Sqoop supports importing from Hive or exporting to Hive directly; it is intended as a bridge between HDFS and an RDBMS. However, you should be able to do the following. From within Hive, run: insert overwrite local directory '/home/user/staging' row format delimited fields terminated by ',' select * from table; This command saves the result of the select on the table to a file on your local system; you can then export the CSV to Oracle.
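If you then want to push the staged data into Oracle with Sqoop, one hedged variant is to stage into an HDFS directory instead of a local one, since Sqoop's export reads from HDFS; the JDBC URL, credentials, table name, and paths below are placeholders:

```bash
# Stage the Hive table as delimited text on HDFS (directory path is a placeholder).
hive -e "insert overwrite directory '/user/user1/staging'
         row format delimited fields terminated by ','
         select * from mytable;"

# Export the staged files into an Oracle table (JDBC URL and credentials are placeholders).
sqoop export \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCLPDB1 \
  --username scott -P \
  --table TARGET_TABLE \
  --export-dir /user/user1/staging \
  --input-fields-terminated-by ','
```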
03-07-2017 10:26 AM
This link may help you understand the file systems available on EMR: http://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-plan-file-systems.html
02-28-2017 11:06 AM
@satya gaurav The directory path is where your Hive data resides; if you point it at an empty folder, the table will be empty, so you cannot change it if you want the same data to be loaded into the table. The columns showing NULL are because the number of columns you defined in the schema may be more than the number of fields in the delimited data, as illustrated below.
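As a tiny illustration of the NULL case (the table name, path, and data are made up for this example):

```bash
# A data file with only two fields per line.
hdfs dfs -mkdir -p /user/user1/demo
echo "1,alice" | hdfs dfs -put - /user/user1/demo/data.csv

# An external table declaring three columns over that directory:
# the third column (age) has no matching field in the data, so it comes back NULL.
hive -e "
create external table if not exists demo_t (id int, name string, age int)
row format delimited fields terminated by ','
location '/user/user1/demo';
select * from demo_t;"
```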
02-15-2017 06:13 AM
Check the directories entry assigned in the HDFS config; Ambari will consider only the space on these directories.
02-15-2017 05:51 AM
Adding to @Kuldeep Kulkarni's comments: also make sure that the appropriate driver jar is present in the Oozie shared lib location on HDFS; see the sketch below.
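A rough sketch of what that looks like (the sharelib timestamp directory, jar name, and Oozie URL below are placeholders; check your own cluster's paths):

```bash
# List what is currently in the sqoop sharelib (Oozie URL is a placeholder).
oozie admin -oozie http://oozie-host:11000/oozie -shareliblist sqoop

# Copy the JDBC driver jar into the sqoop sharelib directory on HDFS
# (the lib_<timestamp> directory name varies per cluster).
hdfs dfs -put mysql-connector-java.jar /user/oozie/share/lib/lib_20170215/sqoop/

# Tell Oozie to pick up the new sharelib contents.
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate
```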
02-14-2017 05:17 AM
Once the Ambari setup is done, you can achieve everything else in an automated way.
02-14-2017 05:03 AM
@Angelo Alexander If you are using the "--target-dir" option with Sqoop, then it is just a location on HDFS where your output sits; you can create an external Hive table on top of the target-dir location. Or you can use the --create-hive-table option (together with --hive-import) with Sqoop to directly create a table in Hive. A rough sketch of both approaches is below.
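Both options sketched, with the JDBC URL, credentials, column types, and paths as placeholders:

```bash
# Option 1: import to a plain HDFS directory, then lay an external Hive table over it.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username user1 -P \
  --table orders \
  --target-dir /user/user1/orders \
  --fields-terminated-by ','

hive -e "create external table orders_ext (id int, amount double)
         row format delimited fields terminated by ','
         location '/user/user1/orders';"

# Option 2: let Sqoop create and load the Hive table directly.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username user1 -P \
  --table orders \
  --hive-import --create-hive-table --hive-table orders
```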
02-14-2017 04:56 AM
@Angelo Alexander If you are trying the "sqoop" command, you have to run it from the Linux CLI on a node where the Sqoop client is installed, and the proper JDBC driver should be copied to the Sqoop lib location. See the example below.
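For example (the driver jar name, the Sqoop lib path shown for an HDP install, and the connection details are assumptions; adjust them to your environment):

```bash
# Copy the JDBC driver into the Sqoop client's lib directory (path varies by install).
cp mysql-connector-java.jar /usr/hdp/current/sqoop-client/lib/

# Quick connectivity check from the node where the Sqoop client is installed.
sqoop list-tables \
  --connect jdbc:mysql://dbhost/sales \
  --username user1 -P
```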
02-13-2017 02:52 PM
@omkar pathallapalli Just add the SQL Server JDBC jar (sqljdbc42.jar) to the Sqoop lib folder and try to sqoop using the syntax provided above by Sunile. A rough sketch is below.
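A minimal sketch of that (the lib path, host, database, and table names are placeholders; sqljdbc42.jar is the jar mentioned above):

```bash
# Drop the SQL Server JDBC driver into the Sqoop lib directory (path varies by install).
cp sqljdbc42.jar /usr/hdp/current/sqoop-client/lib/

# Then sqoop against SQL Server (connection details are placeholders).
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=sales" \
  --username user1 -P \
  --table orders \
  --target-dir /user/user1/orders
```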