Member since: 04-11-2016
Posts: 535
Kudos Received: 148
Solutions: 77

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 7468 | 09-17-2018 06:33 AM
 | 1821 | 08-29-2018 07:48 AM
 | 2730 | 08-28-2018 12:38 PM
 | 2115 | 08-03-2018 05:42 AM
 | 1973 | 07-27-2018 04:00 PM
08-28-2018
12:38 PM
@Samant Thakur This is a limitation on the Sqoop side: operations on non-transactional Informix tables are not supported. Refer to the link below to enable transaction logging on Informix, then retry the Sqoop job. https://www.ibm.com/support/knowledgecenter/SSGU8G_12.1.0/com.ibm.sqlt.doc/ids_sqt_279.htm
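For reference, a rough sketch of switching a database to buffered logging with the Informix ondblog utility (the database name below is a placeholder, and my understanding is that a level-0 backup is needed for the change to take effect, so please verify against the linked IBM documentation):
# Switch the database to buffered logging (placeholder database name)
ondblog buf stores_demo
# Take the level-0 archive that applies the logging change
ontape -s -L 0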
08-28-2018
10:51 AM
1 Kudo
@Eugene Mogilevsky Can you check the HS2 logs and ZooKeeper logs for issues? It seems like ZooKeeper is unable to connect to HS2.
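As a quick check, you can also verify whether HS2 has registered itself in ZooKeeper (assuming the default discovery namespace 'hiveserver2'; the ZooKeeper host below is a placeholder):
# Connect to ZooKeeper and list the registered HiveServer2 instances
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server zkhost:2181
ls /hiveserver2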
08-27-2018
05:07 PM
@Jai C It seems like Sqoop exported '0' records and the mapper failed. Check the application log for errors and share the complete error stack.
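To pull the complete application log, you can use the YARN CLI (the application ID below is a placeholder; take the real one from the failed Sqoop job output):
# Fetch the aggregated logs for the failed application
yarn logs -applicationId application_1535000000000_0001 > sqoop_export_app.log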
08-27-2018
11:26 AM
@Jai C As mentioned in the Sqoop Jira, export into a bucketed Hive table is not supported. To export into the Hive table, recreate it without the 'clustered by' clause, as sketched below.
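A minimal sketch of that workaround in HiveQL ('test_unbucketed' and the column names are hypothetical):
-- Recreate the table without the CLUSTERED BY clause
CREATE TABLE default.test_unbucketed (col1 int, col2 string) STORED AS ORC;
-- Copy the existing data across
INSERT INTO TABLE default.test_unbucketed SELECT col1, col2 FROM default.test_bucket;
Sqoop export can then point at default.test_unbucketed instead.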
08-27-2018
10:26 AM
@Jai C The error is not related to the Windows authentication; it occurs because the Hive/HCatalog table is a bucketed table, which is not supported by Sqoop export. You can verify this by running the following from Hive CLI / Beeline, the keyword to look for being 'clustered by':
show create table <database>.<table>;
For example:
hive> show create table default.test_bucket;
OK
CREATE TABLE `default.test_bucket`(
`col1` int,
`col2` string)
CLUSTERED BY (
col1)
INTO 3 BUCKETS
ROW FORMAT SERDE
'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION
'hdfs://xxx.com:8020/apps/hive/warehouse/test_bucket'
TBLPROPERTIES (
'numFiles'='13',
'numRows'='0',
'rawDataSize'='0',
'totalSize'='8883',
'transactional'='true',
'transient_lastDdlTime'='1505206092')
Time taken: 0.786 seconds, Fetched: 21 row(s)
08-27-2018
10:09 AM
@Jai C Below is a sample export command that works. In your case, could you share the error you are seeing?
sqoop export --connect "jdbc:jtds:sqlserver://IE11WIN7:1433;useNTLMv2=true;domain=IE11WIN7;databaseName=default_db" \
  --table "test_table_view" \
  --hcatalog-database default --hcatalog-table t1 \
  --columns col2,col3 \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username IEUser --password 'Passw0rd!' \
  --update-mode allowinsert --verbose
08-27-2018
09:49 AM
@Jai C Based on the following Jira, unfortunately, importing directly into bucketed Hive tables does not seem to be supported yet: https://issues.apache.org/jira/browse/SQOOP-1889 So you would have to import the data into an intermediate table and then insert it into the bucketed table, as sketched below. Please accept the answer if this helped.
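A rough sketch of that two-step flow (the connect string, table names, and database names are placeholders):
# Step 1: import into a plain, non-bucketed staging table via HCatalog
sqoop import --connect jdbc:mysql://dbhost/source_db --table src_table \
  --hcatalog-database default --hcatalog-table staging_table
Then, from Hive:
-- Step 2: move the rows into the bucketed table
INSERT INTO TABLE default.bucketed_table SELECT * FROM default.staging_table;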
08-16-2018
07:36 AM
@Saravanan Muthiah When using hive-jdbc-standalone*.jar, apart from hadoop-common*.jar, below are the other dependent jars required:
libthrift-0.9.0.jar
httpclient-4.2.5.jar
httpcore-4.2.5.jar
commons-logging-1.1.3.jar
hive-common.jar
slf4j-api-1.7.5.jar
hive-metastore.jar
hive-service.jar
hadoop-common.jar
hive-jdbc.jar
guava-11.0.2.jar
Please add these jars to the classpath on the client and try again.
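For reference, a sketch of launching a JDBC client with those jars on the classpath (the client class name and jar versions are placeholders; on Windows the separator would be ';' instead of ':'):
# Run the client with all dependent jars on the classpath
java -cp "hive-jdbc-standalone.jar:hadoop-common.jar:libthrift-0.9.0.jar:httpclient-4.2.5.jar:httpcore-4.2.5.jar:commons-logging-1.1.3.jar:hive-common.jar:slf4j-api-1.7.5.jar:hive-metastore.jar:hive-service.jar:hive-jdbc.jar:guava-11.0.2.jar:." com.example.HiveJdbcClient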
08-16-2018
07:31 AM
1 Kudo
@Nagarajan Jayaraman Hive supports time in the form of the timestamp datatype; hence the error, since "time" is not a recognized type. If you need to hold only the value 0930 as openhour, then the apt datatype would be string. If instead you can hold data like "2018-10-15 09:30:00" as openhour, then you could use timestamp as the datatype, and the select query would be like: select cast("2018-10-15 09:30:00" as timestamp) from test_table; See the Hive documentation for more details on the timestamp datatype.
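A minimal sketch of both options (table and column names are placeholders):
-- Option 1: keep '0930' as-is in a string column
CREATE TABLE test_table (openhour string);
-- Option 2: store a full timestamp and cast on read
CREATE TABLE test_table_ts (openhour timestamp);
SELECT cast('2018-10-15 09:30:00' as timestamp) FROM test_table_ts;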
08-16-2018
05:56 AM
@Sudharsan Ganeshkumar You need to run it under Beeline / Hive CLI, as below:
[root@xxx ~]# su - hive
[hive@xxx ~]$ hive
Logging initialized using configuration in file:/etc/hive/2.5.3.0-37/0/hive-log4j.properties
hive> show create table flight_orc;
OK
CREATE TABLE `flight_orc`(
`flightnum` string,
`arrtime` int,
`deptime` int,
`crsarrtime` int,
`crsdeptime` int,
`arrdelay` int,
`depdelay` int,
`airtime` int)
ROW FORMAT SERDE
'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION
'hdfs://xxx.com:8020/apps/hive/warehouse/flight_orc'
TBLPROPERTIES (
'numFiles'='3',
'numRows'='1000000',
'rawDataSize'='110912400',
'totalSize'='397173299',
'transient_lastDdlTime'='1516645288')
Time taken: 4.901 seconds, Fetched: 23 row(s)
hive>
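The Beeline equivalent would be along these lines (the JDBC URL is a placeholder for your HS2 host):
beeline -u "jdbc:hive2://xxx.com:10000/default" -e "show create table flight_orc;"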