<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Sqoop import of Avro file from Teradata to HDFS in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171173#M57838</link>
    <description>&lt;P&gt;Does this mean that with HDP 2.5+ we support the date type in Avro 1.8+? Because that would be awesome, Venkat.&lt;/P&gt;</description>
    <pubDate>Sun, 26 Mar 2017 09:41:54 GMT</pubDate>
    <dc:creator>aervits</dc:creator>
    <dc:date>2017-03-26T09:41:54Z</dc:date>
    <item>
      <title>Sqoop import of Avro file from Teradata to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171169#M57834</link>
      <description>&lt;P&gt;I am trying to copy an Avro schema file from Teradata to HDFS using Sqoop, but the import job is failing with the error below:&lt;/P&gt;&lt;PRE&gt;sqoop import --libjars "SQOOP_HOME/lib/avro-mapred-1.7.5-hadoop2.jar,SQOOP_HOME/lib/avro-mapred-1.7.4-hadoop2.jar,SQOOP_HOME/lib/paranamer-2.3.jar" --connect jdbc:teradata://xx.xx.xx.xxx/Database=xxxx --connection-manager org.apache.sqoop.teradata.TeradataConnManager --username xxx --password xxx --table xx --target-dir xx --as-avrodatafile -m 1 -- --usexview --accesslock --avroschemafile xx.avsc
INFO impl.YarnClientImpl: Submitted application application_1455051872611_0127
INFO mapreduce.Job: The url to track the job: &lt;A href="http://teradata-sqoop-ks-re-sec-4.novalocal:8088/proxy/application_1455051872611_0127/" target="_blank"&gt;http://teradata-sqoop-ks-re-sec-4.novalocal:8088/proxy/application_1455051872611_0127/&lt;/A&gt;
INFO mapreduce.Job: Running job: job_1455051872611_0127
INFO mapreduce.Job: Job job_1455051872611_0127 running in uber mode : false
INFO mapreduce.Job:  map 0% reduce 0%
INFO mapreduce.Job:  map 100% reduce 0%
INFO mapreduce.Job: Task Id : attempt_1455051872611_0127_m_000000_0, Status : FAILED
Error: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
 INFO mapreduce.Job:  map 0% reduce 0%
 INFO mapreduce.Job: Task Id : attempt_1455051872611_0127_m_000000_1, Status : FAILED
Error: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
INFO mapreduce.Job: Task Id : attempt_1455051872611_0127_m_000000_2, Status : FAILED
Error: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
INFO mapreduce.Job:  map 100% reduce 0%
INFO mapreduce.Job: Job job_1455051872611_0127 failed with state FAILED due to: Task failed task_1455051872611_0127_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
.

.

.
INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at:  1455147607714
INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at:  1455147607714
INFO processor.TeradataInputProcessor: the total elapsed time of input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 0s
INFO teradata.TeradataSqoopImportHelper: Teradata import job completed with exit code 1
ERROR tool.ImportTool: Error during import: Import Job failed&lt;/PRE&gt;
&lt;PRE&gt;FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
	at org.apache.avro.mapreduce.AvroKeyRecordWriter.&amp;lt;init&amp;gt;(AvroKeyRecordWriter.java:53)
	at org.apache.avro.mapreduce.AvroKeyOutputFormat$RecordWriterFactory.create(AvroKeyOutputFormat.java:78)
	at org.apache.avro.mapreduce.AvroKeyOutputFormat.getRecordWriter(AvroKeyOutputFormat.java:104)
	at com.teradata.connector.hdfs.HdfsAvroOutputFormat.getRecordWriter(HdfsAvroOutputFormat.java:49)
	at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.&amp;lt;init&amp;gt;(ConnectorOutputFormat.java:89)
	at com.teradata.connector.common.ConnectorOutputFormat.getRecordWriter(ConnectorOutputFormat.java:38)
	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.&amp;lt;init&amp;gt;(MapTask.java:647)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
&lt;/PRE&gt;</description>
      <pubDate>Thu, 23 Mar 2017 05:23:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171169#M57834</guid>
      <dc:creator>ksuresh</dc:creator>
      <dc:date>2017-03-23T05:23:14Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import of Avro file from Teradata to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171170#M57835</link>
      <description>&lt;P&gt;Found this on the Hortonworks Teradata Connector support &lt;A href="https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.6/bk_HortonworksConnectorForTeradata/content/ch_HortonworksConnectorForTeradata.html"&gt;doc&lt;/A&gt;:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;If you will run Avro jobs, download avro-mapred-1.7.4-hadoop2.jar and place it under $SQOOP_HOME/lib.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;I had two versions of the Avro jars in my $SQOOP_HOME/lib; after removing all of them except avro-mapred-1.7.4-hadoop2.jar, the import succeeded.&lt;/P&gt;</description>
      <pubDate>Thu, 23 Mar 2017 07:53:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171170#M57835</guid>
      <dc:creator>ksuresh</dc:creator>
      <dc:date>2017-03-23T07:53:31Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import of Avro file from Teradata to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171171#M57836</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/245/ksuresh.html" nodeid="245"&gt;@ksuresh&lt;/A&gt;, thanks for posting your resolution. I work on the Sqoop documentation and will look into whether we need to add a note about removing the other jars from /lib.&lt;/P&gt;</description>
      <pubDate>Fri, 24 Mar 2017 23:22:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171171#M57836</guid>
      <dc:creator>bandalora</dc:creator>
      <dc:date>2017-03-24T23:22:27Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import of Avro file from Teradata to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171172#M57837</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/245/ksuresh.html" nodeid="245"&gt;@ksuresh&lt;/A&gt;&lt;P&gt;Thanks for catching the issue in the doc. You can either remove the conflicting Avro jar files (Sqoop in HDP 2.5 and 2.6 ships with Avro 1.8.0 jar files) or add the following to the Sqoop command line and run:&lt;/P&gt;&lt;P style="display: inline !important;"&gt;sqoop import -Dmapreduce.job.user.classpath.first=true &amp;lt;rest of the arguments&amp;gt;&lt;/P&gt;&lt;P&gt;Beverley, yes, it would be good to mention this in the docs.&lt;/P&gt;</description>
      <pubDate>Sat, 25 Mar 2017 02:44:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171172#M57837</guid>
      <dc:creator>vranganathan</dc:creator>
      <dc:date>2017-03-25T02:44:44Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import of Avro file from Teradata to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171173#M57838</link>
      <description>&lt;P&gt;Does this mean that with HDP 2.5+ we support the date type in Avro 1.8+? Because that would be awesome, Venkat.&lt;/P&gt;</description>
      <pubDate>Sun, 26 Mar 2017 09:41:54 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-import-of-Avro-file-from-Teradata-to-hdfs/m-p/171173#M57838</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2017-03-26T09:41:54Z</dc:date>
    </item>
  </channel>
</rss>

