Created on 11-22-2016 09:08 AM - last edited on 08-27-2019 07:28 AM by cjervis
We are importing a table from MySQL to HDFS using Sqoop incremental import, and creating a Hive external table on top of the imported file.
Initially the complete source table imports fine when the last-modified value for the timestamp column is set to 0, but when I provide the last-value as the latest timestamp, it throws the error below.
Error: java.lang.RuntimeException: Can't parse input data: 'FTP'
    at acct.__loadFromFields(acct.java:2804)
    at acct.parse(acct.java:2452)
    at org.apache.sqoop.mapreduce.MergeTextMapper.map(MergeTextMapper.java:53)
    at org.apache.sqoop.mapreduce.MergeTextMapper.map(MergeTextMapper.java:34)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.NumberFormatException: For input string: "FTP"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Long.parseLong(Long.java:589)
    at java.lang.Long.valueOf(Long.java:803)
    at acct.__loadFromFields(acct.java:2666)
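For context, a lastmodified incremental import with a merge key (which is what brings MergeTextMapper into the trace above) is roughly of the shape sketched below. This is only an illustration; the connection string, table, and column names are placeholders, not the actual command:

# Hedged sketch of a lastmodified incremental import; all names are placeholders.
sqoop import \
  --connect jdbc:mysql://dbhost/sourcedb \
  --username sqoopuser -P \
  --table acct \
  --target-dir /user/hive/warehouse/acct_ext \
  --incremental lastmodified \
  --check-column last_upd_ts \
  --last-value "2016-11-22 00:00:00" \
  --merge-key acct_id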
Created 11-22-2016 09:10 AM
Both mysql table and hive external table both have same number of columns
Created 01-17-2017 06:44 PM
The issue is with the data: Sqoop is unable to parse it. Please try the below arguments with the sqoop command and let me know:
--input-lines-terminated-by '\n' --input-null-string "\\\\N" --input-null-non-string "\\\\N"
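For example, appended to an import command these would look as follows (the connection string and table name are placeholders; the \\N arguments tell Sqoop how NULLs are encoded in the input):

# Hedged sketch: the suggested parsing/null-handling arguments in context.
sqoop import \
  --connect jdbc:mysql://dbhost/sourcedb \
  --username sqoopuser -P \
  --table acct \
  --target-dir /user/hive/warehouse/acct_ext \
  --input-lines-terminated-by '\n' \
  --input-null-string "\\\\N" \
  --input-null-non-string "\\\\N"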
Created 08-06-2017 06:41 PM
I am facing the same issue. I added the arguments as mentioned, but it still fails the same way.
Created 08-06-2017 06:42 PM
I am facing the same issue too. Is your issue resolved? If yes, please let me know the solution.
Created 12-18-2017 07:35 PM
I think that if, when you first imported your data, the fields-terminated-by value was something other than a comma (,), then you need to pass the same delimiter argument during the incremental load as well.
If you are still unable to solve it, let me know.
Created 08-21-2019 01:15 AM
How do I add that variable? In my case I am using ^ as the delimiter.
Could you please give me an example of how I should use the variable in this instance?
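A minimal sketch of what that might look like, assuming the rest of the job stays the same (connection, table, and column names are placeholders); the caret is single-quoted so the shell passes it through literally:

# Hedged sketch: repeating the '^' field delimiter on the incremental import.
sqoop import \
  --connect jdbc:mysql://dbhost/sourcedb \
  --username sqoopuser -P \
  --table acct \
  --target-dir /user/hive/warehouse/acct_ext \
  --fields-terminated-by '^' \
  --incremental lastmodified \
  --check-column last_upd_ts \
  --last-value "2019-08-21 00:00:00" \
  --merge-key acct_id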
Created 08-25-2019 09:00 PM
Hi,
It looks like you are trying to import the data and it is failing on the delimiter.
Let's break the testing into 2 jobs: first import the data into HDFS, and once this works we will move to the next step.
Can you please try to run the sqoop command with the incremental options and import just to HDFS (see the sketch after the options below)?
Option 1: If this works fine, then we will go on to the next step.
Option 2: If this doesn't work, then can you please share the sqoop command and the console output so we can debug further?
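A minimal HDFS-only incremental import for this test might look like the following; again, every name here is a placeholder:

# Hedged sketch: step 1, incremental import to a plain HDFS directory (no Hive).
sqoop import \
  --connect jdbc:mysql://dbhost/sourcedb \
  --username sqoopuser -P \
  --table acct \
  --target-dir /tmp/acct_test \
  --fields-terminated-by '^' \
  --incremental lastmodified \
  --check-column last_upd_ts \
  --last-value "2019-08-25 00:00:00"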
Regards
Nitish
Created 08-26-2019 05:55 AM
Created 08-26-2019 07:48 AM
Hi,
So only the Hive import is causing the issue, right?
Can you share the output of the below command, run from Hive?
show create table <tablename>;
NOTE: I am assuming that the earlier (first-time) sqoop import you ran imported with the same delimiter.
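For example, from a shell (the table name is a placeholder), the clause to check in the output is FIELDS TERMINATED BY, which should match the delimiter used in the sqoop import:

# Hedged sketch: inspect the external table's DDL for its field delimiter.
hive -e "SHOW CREATE TABLE acct;"
# In the output, look for a clause like:
#   ROW FORMAT DELIMITED
#     FIELDS TERMINATED BY '^'
# If this does not match the delimiter the sqoop import wrote, Hive will
# misparse the rows.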
Regards
Nitish