Export Hive Parquet table data to Teradata: Index outof Boundary error
Labels:
- Apache Hive
- Apache Sqoop
Created 03-18-2020 01:04 PM
I'm trying to export data from an HDFS location to Teradata. I have created a table with the same schema in Teradata.
Export Command:
sqoop export --connect jdbc:teradata://teradataserver/Database=dbname --username xxxx --password xxxx --table teradataTbl --export-dir /hdfs/parquet/files/path/
Exception:
2020-03-18 14:32:00,754 ERROR [IPC Server handler 3 on 41836] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1584475869533_13501_m_000002_0 - exited : com.teradata.connector.common.exception.ConnectorException: index outof boundary
at com.teradata.connector.teradata.converter.TeradataConverter.convert(TeradataConverter.java:179)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:111)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:70)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:670)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:134)
at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:122)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Created 03-18-2020 04:04 PM
Can you please share the following info:
- the DDL of the table in Teradata
- some sample data from the file in HDFS
It looks to me like there is some issue in the data. I suggest you narrow it down by gradually reducing the number of columns exported from the file, to see which column (and which rows) is actually causing the problem.
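For example, one way to do that (all table, column, and path names below are placeholders, not your real schema) is to build a smaller staging table in Hive with just a couple of columns, export that, and keep adding columns back until the export fails again:

hive -e "CREATE TABLE dbname.export_stage STORED AS PARQUET AS SELECT col1, col2 FROM dbname.source_tbl LIMIT 1000;"

sqoop export --connect jdbc:teradata://teradataserver/Database=dbname --username xxxx --password xxxx --table teradataStageTbl --export-dir /hdfs/path/of/export_stage/

(the matching narrow table has to exist in Teradata first). If parquet-tools is available on your cluster, you can also compare the file schema against the Teradata DDL and eyeball a few rows with:

parquet-tools schema /hdfs/parquet/files/path/<one-part-file>.parquet
parquet-tools head /hdfs/parquet/files/path/<one-part-file>.parquet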
Cheers
Eric
Created 03-20-2020 09:37 AM
Thanks EricL. At least I know it will work.
Created 03-21-2020 09:35 AM
Still struggling with this... See the exception stack below:
2020-03-21 12:27:31,694 ERROR [IPC Server handler 10 on 45536] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1584785234978_6403_m_000000_0 - exited : com.teradata.connector.common.exception.ConnectorException: index outof boundary
at com.teradata.connector.teradata.converter.TeradataConverter.convert(TeradataConverter.java:179)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:111)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:70)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:670)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:134)
at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:122)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Created 03-22-2020 03:12 PM
Did you mean that you got the same error while trying to export the sample data you provided earlier?
Have you tried updating your driver, in case it is outdated?
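For example, something like this (the paths are assumptions and will differ depending on where Sqoop and the connector are installed on your cluster) should show which Teradata connector and JDBC driver jars the job is picking up:

ls /usr/lib/sqoop/lib | grep -i -e teradata -e terajdbc
ls /var/lib/sqoop | grep -i -e teradata -e terajdbc

If the connector or the terajdbc4 jar there is several versions old, it may be worth dropping in the latest release from Teradata and re-running the export.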
Cheers
Eric
