Member since: 02-18-2016
Posts: 72
Kudos Received: 19
Solutions: 7
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 977 | 07-10-2017 04:10 PM |
| | 2053 | 07-10-2017 04:01 PM |
| | 5047 | 04-25-2017 05:01 PM |
| | 5355 | 03-02-2017 06:35 PM |
| | 6855 | 12-20-2016 02:13 PM |
12-19-2016
08:45 PM
Can you check the ZooKeeper log to see what messages it reported?
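For example, assuming the default HDP log location (the path may differ on your install):

```
# tail the ZooKeeper server log and look for recent errors
tail -n 200 /var/log/zookeeper/zookeeper.log
grep -iE "error|exception" /var/log/zookeeper/zookeeper.log | tail -n 20
```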
12-19-2016
08:14 PM
It looks to me like you are running Hadoop on Windows and are missing winutils.exe. Can you try this: download winutils.exe from http://public-repo-1.hortonworks.com/hdp-win-alpha/winutils.exe, then set your HADOOP_HOME environment variable at the OS level to the directory whose bin folder contains winutils.exe.
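A minimal sketch of that setup from a Windows command prompt, assuming you place winutils.exe under C:\hadoop\bin (the folder path is just an example):

```
rem download winutils.exe into C:\hadoop\bin first, then point HADOOP_HOME
rem at the parent of the bin folder and add bin to the PATH
setx HADOOP_HOME "C:\hadoop"
setx PATH "%PATH%;C:\hadoop\bin"
```

Open a new command prompt afterwards so the new variables take effect.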
12-19-2016
06:44 PM
It looks to me like the content in ZooKeeper is not synchronized, since you cannot get the updated information from ZooKeeper on that node. You might need to fix ZooKeeper first.
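One way to check each node's sync status is ZooKeeper's four-letter-word commands; a sketch, assuming the default client port 2181 (zk-host-1 and zk-host-2 are placeholders):

```
# "Mode:" in the output shows leader/follower/standalone for each node
echo stat | nc zk-host-1 2181
echo stat | nc zk-host-2 2181
# a healthy server answers "imok"
echo ruok | nc zk-host-1 2181
```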
12-19-2016
03:33 PM
1 Kudo
@Asma Dhaouadi You mentioned that you received errors while the job was running; if there are errors, they might prevent the correct files from being created. Could you please share what kind of errors it produced?
11-08-2016
03:58 PM
It really depends on what your relational database is. Some relational databases do not support incremental imports. Otherwise, the link above provided by bhagan should be sufficient.
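For databases that do support it, a minimal sketch of an incremental append import; the connect string, table, and column names are placeholders:

```
sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username user --password xxxxx \
  --table orders \
  --incremental append \
  --check-column order_id \
  --last-value 0 \
  --target-dir /user/me/orders
```

On later runs, --last-value should be the highest order_id already imported; Sqoop prints that value at the end of each run.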
08-30-2016
03:10 PM
1 Kudo
@SBandaru Can you create the Hive table ahead of time, then run: sqoop import -libjars /usr/hdp/current/sqoop-server/lib/ --connect jdbc:teradata://example/Database=ABCDE --connection-manager org.apache.sqoop.teradata.TeradataConnManager --username user --password xxxxx --table cricket_location_store --split-by storeid --target-dir /apps/hive/warehouse/user.db/cricket_location_store
08-25-2016
08:54 PM
It doesn't work if you use "sqoop import --create-hive-table ......", but it does work if you use "sqoop create-hive-table .......". That way you only create the table based on the Teradata schema, without ingesting the data at the same time. The last time I used TDCH, I had to create the table in Hive first and then move the data over with TDCH.
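A sketch of that two-step workflow; the host, database, and table names are placeholders:

```
# step 1: create the Hive table from the Teradata schema (no data moved)
sqoop create-hive-table \
  --connect jdbc:teradata://tdhost/Database=MYDB \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username user --password xxxxx \
  --table my_table \
  --hive-table my_table

# step 2: import the data into the table's warehouse directory
sqoop import \
  --connect jdbc:teradata://tdhost/Database=MYDB \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username user --password xxxxx \
  --table my_table \
  --target-dir /apps/hive/warehouse/my_table
```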
08-25-2016
07:22 PM
2 Kudos
As of TDCH 1.4.2, the last version I used, --create-hive-table as a parameter of "sqoop import" doesn't work, though you can use "sqoop create-hive-table" to create the Hive table based on the schema from Teradata. Keep in mind that Sqoop for Teradata uses TDCH as its internal driver, so some parameters that are supported in Sqoop are not supported in TDCH.
08-19-2016
09:03 PM
3 Kudos
I assume the TD you are referring to is Teradata. According to the TDCH tutorial, "When the source table is not partitioned, the Teradata split.by.partition source plugin creates a temporary partitioned staging table and executes an INSERT-SELECT to move data from the source table into the stage table." So, if you want multiple mappers, the table needs to be split into a partitioned staging table, and thus "to enable the creation of staging table, the split.by.partition plugin requires that the associated Teradata user has ‘create table’ and ‘create view’ privileges as well as free perm space equivalent to the size of the source table available."
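For reference, a sketch of such an import with the TDCH command-line tool; the jar path, class name, host, and table names below are assumptions based on TDCH 1.4.x and may differ on your install:

```
hadoop jar /usr/lib/tdch/1.4/lib/teradata-connector-1.4.2.jar \
  com.teradata.connector.common.tool.ConnectorImportTool \
  -url jdbc:teradata://tdhost/database=MYDB \
  -username user -password xxxxx \
  -jobtype hdfs -fileformat textfile \
  -method split.by.partition \
  -nummappers 8 \
  -sourcetable my_table \
  -targetpaths /tmp/my_table
```

With split.by.partition and a non-partitioned source table, each of the 8 mappers then reads from the temporary partitioned staging table described above.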
08-18-2016
01:31 PM
@da li Sorry, I didn't look at your original post carefully. It seems that you are using Sqoop 2 on HDP 2.4, but HDP 2.4 currently supports only Sqoop 1.4.6. Installing Sqoop 2 on HDP 2.4 will obviously cause incompatibility issues. Did you install it yourself?
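A quick way to confirm which Sqoop the cluster actually ships with:

```
# prints the Sqoop version bundled with the HDP install
sqoop version
```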