<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Sqoop : Teradata to HDFS using AVRO file format not working in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150770#M40596</link>
    <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/5078/pvillard.html" nodeid="5078"&gt;@Pierre Villard&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I am getting the below error now:&lt;/P&gt;&lt;P&gt;Error: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter&lt;/P&gt;&lt;P&gt;I have avro-mapred-1.7.5-hadoop2.jar and avro-1.7.5.jar in my $SQOOP_HOME/lib.&lt;/P&gt;&lt;P&gt;Please help.&lt;/P&gt;</description>
    <pubDate>Wed, 14 Sep 2016 17:31:38 GMT</pubDate>
    <dc:creator>arkaprova</dc:creator>
    <dc:date>2016-09-14T17:31:38Z</dc:date>
    <item>
      <title>Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150768#M40594</link>
      <description>&lt;P&gt;I am getting the below error when trying to import from Teradata to HDFS.&lt;/P&gt;&lt;P&gt;Sqoop command:&lt;/P&gt;&lt;PRE&gt;sqoop import --connection-manager org.apache.sqoop.teradata.TeradataConnManager --connect jdbc:teradata://**.***.***.**/DATABASE=***** --username ****** --password ***** --table employee --target-dir /home/****/tera_to_hdfs125 --as-avrodatafile -m 1&lt;/PRE&gt;&lt;P&gt;16/09/14 11:56:22 ERROR teradata.TeradataSqoopImportHelper: Exception running Teradata import job
com.teradata.connector.common.exception.ConnectorException: no Avro schema is found for type mapping
  at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:142)
  at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:58)
  at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
  at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
  at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
  at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
  at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
  at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
  at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
  at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
  at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
16/09/14 11:56:22 INFO teradata.TeradataSqoopImportHelper: Teradata import job completed with exit code 1
16/09/14 11:56:22 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Exception running Teradata import job
  at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:373)
  at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
  at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
  at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
  at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
  at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
  at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
  at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
  at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
Caused by: com.teradata.connector.common.exception.ConnectorException: &lt;STRONG&gt;no Avro schema is found for type mapping&lt;/STRONG&gt;
  at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:142)
  at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:58)
  at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
  ... 9 more&lt;/P&gt;&lt;P&gt;Please help.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Arkaprova&lt;/P&gt;</description>
      <pubDate>Wed, 14 Sep 2016 13:50:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150768#M40594</guid>
      <dc:creator>arkaprova</dc:creator>
      <dc:date>2016-09-14T13:50:43Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150769#M40595</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;Based on this documentation:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_HortonworksConnectorForTeradata/content/ch_HortonworksConnectorForTeradata.html#ch_HortonworksConnectorForTeradata-Appendix-Options-Sqoop" target="_blank"&gt;https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2/bk_HortonworksConnectorForTeradata/content/ch_HortonworksConnectorForTeradata.html#ch_HortonworksConnectorForTeradata-Appendix-Options-Sqoop&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I think you need to:&lt;/P&gt;&lt;P&gt;- "&lt;STRONG&gt;Note:&lt;/STRONG&gt; If you will run Avro jobs, download avro-mapred-1.7.4-hadoop2.jar and place it under $SQOOP_HOME/lib."&lt;/P&gt;&lt;P&gt;- Pass the Avro schema of the data you want to import via the 'avroschemafile' option. This is a connector-specific argument, so you would need to do something like:&lt;/P&gt;&lt;PRE&gt;sqoop import --connection-manager org.apache.sqoop.teradata.TeradataConnManager --connect jdbc:teradata://**.***.***.**/DATABASE=***** --username ****** --password ***** --table employee --target-dir /home/****/tera_to_hdfs125 --as-avrodatafile -m 1 -- --avroschemafile &amp;lt;schema&amp;gt;&lt;/PRE&gt;&lt;P&gt;Hope this helps.&lt;/P&gt;</description>
      <pubDate>Wed, 14 Sep 2016 15:18:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150769#M40595</guid>
      <dc:creator>pvillard</dc:creator>
      <dc:date>2016-09-14T15:18:31Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150770#M40596</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/5078/pvillard.html" nodeid="5078"&gt;@Pierre Villard&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I am getting the below error now:&lt;/P&gt;&lt;P&gt;Error: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter&lt;/P&gt;&lt;P&gt;I have avro-mapred-1.7.5-hadoop2.jar and avro-1.7.5.jar in my $SQOOP_HOME/lib.&lt;/P&gt;&lt;P&gt;Please help.&lt;/P&gt;</description>
      <pubDate>Wed, 14 Sep 2016 17:31:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150770#M40596</guid>
      <dc:creator>arkaprova</dc:creator>
      <dc:date>2016-09-14T17:31:38Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150771#M40597</link>
      <description>&lt;P&gt;Do you have a full stack trace that you could share? What is your schema (maybe some types are not yet supported with Teradata connector depending of the version)?&lt;/P&gt;</description>
      <pubDate>Wed, 14 Sep 2016 17:56:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150771#M40597</guid>
      <dc:creator>pvillard</dc:creator>
      <dc:date>2016-09-14T17:56:12Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150772#M40598</link>
      <description>&lt;P&gt;Below is the full stack trace.&lt;/P&gt;&lt;P&gt;16/09/14 15:49:10 INFO mapreduce.Job: Running job: job_1473774257007_0002
16/09/14 15:49:19 INFO mapreduce.Job: Job job_1473774257007_0002 running in uber mode : false
16/09/14 15:49:19 INFO mapreduce.Job:  map 0% reduce 0%
16/09/14 15:49:22 INFO mapreduce.Job: Task Id : attempt_1473774257007_0002_m_000000_0, Status : FAILED
Error: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143&lt;/P&gt;&lt;P&gt;16/09/14 15:49:25 INFO mapreduce.Job: Task Id : attempt_1473774257007_0002_m_000000_1, Status : FAILED
Error: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
16/09/14 15:49:29 INFO mapreduce.Job: Task Id : attempt_1473774257007_0002_m_000000_2, Status : FAILED
Error: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
16/09/14 15:49:35 INFO mapreduce.Job:  map 100% reduce 0%
16/09/14 15:49:36 INFO mapreduce.Job: Job job_1473774257007_0002 failed with state FAILED due to: Task failed task_1473774257007_0002_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0&lt;/P&gt;&lt;P&gt;16/09/14 15:49:36 INFO mapreduce.Job: Counters: 12
  Job Counters
  Failed map tasks=4
  Launched map tasks=4
  Other local map tasks=3
  Data-local map tasks=1
  Total time spent by all maps in occupied slots (ms)=8818
  Total time spent by all reduces in occupied slots (ms)=0
  Total time spent by all map tasks (ms)=8818
  Total vcore-seconds taken by all map tasks=8818
  Total megabyte-seconds taken by all map tasks=18059264
  Map-Reduce Framework
  CPU time spent (ms)=0
  Physical memory (bytes) snapshot=0
  Virtual memory (bytes) snapshot=0
16/09/14 15:49:36 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at:  1473848376584
16/09/14 15:49:37 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at:  1473848376584
16/09/14 15:49:37 INFO processor.TeradataInputProcessor: the total elapsed time of input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 0s
16/09/14 15:49:37 INFO teradata.TeradataSqoopImportHelper: Teradata import job completed with exit code 1
16/09/14 15:49:37 ERROR tool.ImportTool: Error during import: Import Job failed&lt;/P&gt;&lt;P&gt;Schema:&lt;/P&gt;&lt;P&gt;{
"type" : "record",
"namespace" : "avronamespace",
"name" : "Employee",
"fields" : [
{ "name" : "Id" , "type" : "string" },
{ "name" : "Name" , "type" : "string" }
]
}&lt;/P&gt;&lt;P&gt;Also, my concern is why an Avro schema file is required here. I am trying to import data from Teradata to HDFS using the Avro file format. Please help.&lt;/P&gt;</description>
      <pubDate>Wed, 14 Sep 2016 18:35:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150772#M40598</guid>
      <dc:creator>arkaprova</dc:creator>
      <dc:date>2016-09-14T18:35:18Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150773#M40599</link>
      <description>&lt;BLOCKQUOTE&gt;"When Avro data is stored in a file, its schema is stored with
	it, so that files may be processed later by any program."
&lt;/BLOCKQUOTE&gt;&lt;P&gt;I believe the schema is required so that it is stored with the data you import into HDFS.&lt;/P&gt;&lt;P&gt;Could you run the following command to get more details about the error?&lt;/P&gt;&lt;PRE&gt;yarn logs -applicationId application_1473774257007_0002&lt;/PRE&gt;</description>
      <pubDate>Wed, 14 Sep 2016 18:42:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150773#M40599</guid>
      <dc:creator>pvillard</dc:creator>
      <dc:date>2016-09-14T18:42:42Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150774#M40600</link>
      <description>&lt;P&gt;Below is from the YARN log:&lt;/P&gt;&lt;P&gt;2016-09-14 15:49:29,345 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
        at org.apache.avro.mapreduce.AvroKeyRecordWriter.&amp;lt;init&amp;gt;(AvroKeyRecordWriter.java:53)
        at org.apache.avro.mapreduce.AvroKeyOutputFormat$RecordWriterFactory.create(AvroKeyOutputFormat.java:78)
        at org.apache.avro.mapreduce.AvroKeyOutputFormat.getRecordWriter(AvroKeyOutputFormat.java:104)
        at com.teradata.connector.hdfs.HdfsAvroOutputFormat.getRecordWriter(HdfsAvroOutputFormat.java:49)
        at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.&amp;lt;init&amp;gt;(ConnectorOutputFormat.java:89)
        at com.teradata.connector.common.ConnectorOutputFormat.getRecordWriter(ConnectorOutputFormat.java:38)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.&amp;lt;init&amp;gt;(MapTask.java:647)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
2016-09-14 15:49:29,351 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics system...
2016-09-14 15:49:29,351 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system stopped.
2016-09-14 15:49:29,352 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system shutdown complete.
End of LogType:syslog&lt;/P&gt;</description>
      <pubDate>Wed, 14 Sep 2016 22:17:07 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150774#M40600</guid>
      <dc:creator>arkaprova</dc:creator>
      <dc:date>2016-09-14T22:17:07Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150775#M40601</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/12830/bigdataarkaprova.html" nodeid="12830"&gt;@Arkaprova Saha&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I'm not sure about the &lt;STRONG&gt;--connection-manager&lt;/STRONG&gt; option, but I have successfully performed a sqoop import from Teradata to AVRO using Teradata's JDBC driver as follows:&lt;/P&gt;&lt;PRE&gt;sqoop import --driver com.teradata.jdbc.TeraDriver \
--connect 'jdbc:teradata://****/DATABASE=****' \
--username **** --password **** \
--table MyTable \
--target-dir /****/****/**** \
--as-avrodatafile \
--num-mappers 1&lt;/PRE&gt;&lt;P&gt;Just ensure that the JDBC driver, &lt;STRONG&gt;terajdbc4.jar&lt;/STRONG&gt;, is in your &lt;STRONG&gt;$SQOOP_LIB&lt;/STRONG&gt; folder. For me, on HDP 2.4 that is &lt;STRONG&gt;/usr/hdp/current/sqoop-client/lib&lt;/STRONG&gt;.&lt;/P&gt;</description>
      <pubDate>Thu, 15 Sep 2016 06:04:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150775#M40601</guid>
      <dc:creator>StevenONeill</dc:creator>
      <dc:date>2016-09-15T06:04:38Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150776#M40602</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/10843/c8sol.html" nodeid="10843"&gt;@Steven O'Neill&lt;/A&gt; Thanks a lot &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;. This is working for me.&lt;/P&gt;</description>
      <pubDate>Thu, 15 Sep 2016 13:56:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150776#M40602</guid>
      <dc:creator>arkaprova</dc:creator>
      <dc:date>2016-09-15T13:56:48Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop : Teradata to HDFS using AVRO file format not working</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150777#M40603</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/5078/pvillard.html" nodeid="5078"&gt;@Pierre Villard&lt;/A&gt;&lt;P&gt;This is working with the -Dmapreduce.job.user.classpath.first=true option. Thanks a lot.&lt;/P&gt;</description>
      <pubDate>Wed, 21 Sep 2016 18:08:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Teradata-to-HDFS-using-AVRO-file-format-not-working/m-p/150777#M40603</guid>
      <dc:creator>arkaprova</dc:creator>
      <dc:date>2016-09-21T18:08:56Z</dc:date>
    </item>
  </channel>
</rss>

