Member since: 01-25-2016
Posts: 345
Kudos Received: 86
Solutions: 25
My Accepted Solutions
Title | Views | Posted
---|---|---
| 5011 | 10-20-2017 06:39 PM
| 3539 | 03-30-2017 06:03 AM
| 2594 | 02-16-2017 04:55 PM
| 16113 | 02-01-2017 04:38 PM
| 1146 | 01-24-2017 08:36 PM
07-01-2016
03:25 PM
@ed day A fully qualified domain name (FQDN) is the complete domain name for a specific computer, or host, on the Internet. The FQDN consists of two parts: the hostname and the domain name, e.g. hostname.<companyName>.com. Any name in that form should work without issue. For example, if your hostname is hdp234, use an FQDN like hdp234.abc.com.
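A quick way to verify the name (a minimal sketch, assuming a Linux host; the IP address and hdp234.abc.com below are placeholders):
# Print the FQDN the host currently reports; it should come back as hostname.domain
hostname -f
# If it does not resolve, map the host's IP to the FQDN and short hostname in /etc/hosts, for example:
# 192.168.1.10   hdp234.abc.com   hdp234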
06-30-2016
05:14 PM
@Venkata Chinna Here is the HDP 2.4 supported OS link: HDP2.4 OS
06-29-2016
04:05 PM
@Antony Shajin Lucas What changes are you trying to make in conf.empty? Hue installation link: Hue Link. As per the HDP documentation, make the configuration changes in /etc/hue/conf/hue.ini and then re-start the Hue service. Command to re-start: service hue restart (or service hue stop followed by service hue start). Check that the hue.ini file, the folder permissions, and the directory location look as expected. The logs are located at /var/log/hue/error.log and /var/log/hue/runcpserver.log. I hope this helps.
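A minimal sequence for the edit-and-restart steps above, assuming the standard HDP package layout (paths taken from the post):
# Edit the Hue configuration
vi /etc/hue/conf/hue.ini
# Re-start the Hue service
service hue restart    # or: service hue stop && service hue start
# Watch the logs for startup errors
tail -f /var/log/hue/error.log /var/log/hue/runcpserver.log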
06-27-2016
04:51 AM
@Ethan Hsieh I think 2.3.1 is the latest version: gsonlink
06-22-2016
06:04 AM
Thanks Predrag. It's working with "append" mode and an INT column as the "check-column". Another quick question: do we need to pass "--last-value" every time, based on the previous run's "Upper bound value: ####"?
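One way to avoid passing --last-value by hand is a saved Sqoop job: the job metastore records the last imported value after each run and reuses it on the next execution. This is only a sketch, reusing the connection details from the earlier posts; the job name item_incr and the check column item_id are placeholders:
# Create a saved incremental-import job (Sqoop stores the last-value after each run)
sqoop job --create item_incr -- import \
  --connect jdbc:teradata://abc/DATABASE=hadoop,CHARSET=UTF8 \
  --driver "com.teradata.jdbc.TeraDriver" \
  --username hadoop -P \
  --table item \
  --target-dir /diva/item_text \
  --incremental append \
  --check-column item_id \
  --last-value 0 \
  -m 1
# Each run resumes from the stored last-value automatically
sqoop job --exec item_incr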
06-22-2016
05:07 AM
The script below runs fine, so there is no issue with the timestamp when running the query against Teradata; the problem seems to be with the incremental import and the milliseconds format in the upper bound value.
sqoop import \
--connect jdbc:teradata://abc/DATABASE=hadoop,CHARSET=UTF8 \
--driver "com.teradata.jdbc.TeraDriver" \
--username hadoop \
--password 00000 \
--query "select * from text where dw_load_ts >='2016-01-01 00:00:00' AND \$CONDITIONS" \
--target-dir /diva/text \
--m 1
06-22-2016
04:37 AM
Changed the incremental mode to lastmodified ("--incremental lastmodified") in the sqoop script, but I am still getting an error:
16/06/22 04:35:16 ERROR manager.SqlManager: SQL exception accessing current timestamp: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.00.00.20] [Error 3706] [SQLState 42000] Syntax error: expected something between '(' and ')'.
java.sql.SQLException: [Teradata Database] [TeraJDBC 15.00.00.20] [Error 3706] [SQLState 42000] Syntax error: expected something between '(' and ')'.
ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Could not get current time from database
06-22-2016
04:29 AM
I'm thinking that a '0000-00-00 00:00:00' date value is stored in our db but can't be handled by the driver.
06-22-2016
04:22 AM
Hi, I'm trying to run Sqoop incremental imports from Teradata to HDFS but I'm running into issues with the timestamp. My sqoop command:
sqoop import \
--connect jdbc:teradata://PH/DATABASE=hadoop,CHARSET=UTF8 \
--driver "com.teradata.jdbc.TeraDriver" \
--username hadoop \
--password Hadoop \
--table item \
--target-dir /diva/item_text \
--m 1 \
--fields-terminated-by '|' \
--incremental append \
--check-column dw_modify_ts \
--last-value '2013-01-01 02:00:00'
Error message:
Error: java.io.IOException: SQLException in nextKeyValue
Caused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.00.00.20] [Error 6760] [SQLState HY000] Invalid timestamp.
Log message:
-----------------------------------
16/06/22 04:13:43 INFO tool.ImportTool: Incremental import based on column dw_modify_ts
16/06/22 04:13:43 INFO tool.ImportTool: Lower bound value: '2013-01-01 02:00:00'
16/06/22 04:13:43 INFO tool.ImportTool: Upper bound value: '2013-07-10 22:56:01.0'
16/06/22 04:13:43 INFO mapreduce.ImportJobBase: Beginning import of e2_item_category_text
As the log above shows, Sqoop includes fractional seconds in the upper bound value ('2013-07-10 22:56:01.0'). I suspect this milliseconds format is causing the problem while importing data from Teradata. Can you suggest anything if you have an idea?
Labels:
- Apache Sqoop
06-21-2016
04:35 PM
It seems to be a permission issue: the user is not able to write the tmp files in /tmp/hive/. Grant write permissions for the user and try again.
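A minimal sketch of the permission fix, assuming /tmp/hive is the default Hive scratch directory on HDFS; <username> is a placeholder:
# Check current ownership and permissions on the scratch directory
hdfs dfs -ls /tmp/hive
# Either give the user's session directory to that user (replace <username>) ...
hdfs dfs -chown -R <username>:hdfs /tmp/hive/<username>
# ... or open the scratch directory with the sticky bit, like /tmp
hdfs dfs -chmod -R 1777 /tmp/hive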