<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Sqoop import to avro failing - which jars to be used ? in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124084#M86828</link>
    <description>&lt;P&gt;Thank you.&lt;/P&gt;</description>
    <pubDate>Sat, 17 Mar 2018 21:46:58 GMT</pubDate>
    <dc:creator>jun251111</dc:creator>
    <dc:date>2018-03-17T21:46:58Z</dc:date>
    <item>
      <title>Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124079#M86823</link>
      <description>&lt;P&gt;HDP-2.5.0.0 using Ambari 2.4.0.1&lt;/P&gt;&lt;P&gt;A Sqoop import to Avro fails with the following error:&lt;/P&gt;&lt;PRE&gt;16/10/11 08:26:32 INFO mapreduce.Job: Job job_1476162030393_0002 running in uber mode : false
16/10/11 08:26:32 INFO mapreduce.Job:  map 0% reduce 0%
16/10/11 08:26:40 INFO mapreduce.Job:  map 25% reduce 0%
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:41 INFO mapreduce.Job:  map 0% reduce 0%
16/10/11 08:26:42 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:46 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:47 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:47 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:48 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:52 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:57 INFO mapreduce.Job:  map 100% reduce 0%
16/10/11 08:26:57 INFO mapreduce.Job: Job job_1476162030393_0002 failed with state FAILED due to: Task failed task_1476162030393_0002_m_000002
Job failed as tasks failed. failedMaps:1 failedReduces:0&lt;/PRE&gt;&lt;P&gt;The YARN application log ends with :&lt;/P&gt;&lt;PRE&gt;FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
        at org.apache.sqoop.mapreduce.AvroOutputFormat.getRecordWriter(AvroOutputFormat.java:97)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.&amp;lt;init&amp;gt;(MapTask.java:647)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)&lt;/PRE&gt;&lt;P&gt;The original installation had the following libraries under /usr/hdp/2.5.0.0-1245/sqoop/lib:&lt;/P&gt;&lt;P&gt;avro-mapred-1.8.0-hadoop2.jar, parquet-avro-1.4.1.jar, avro-1.8.0.jar&lt;/P&gt;&lt;P&gt;I first tried replacing (&lt;STRONG&gt;only one jar at a time under the lib directory&lt;/STRONG&gt;) &lt;STRONG&gt;avro-mapred-1.8.0-hadoop2.jar&lt;/STRONG&gt; with &lt;STRONG&gt;avro-mapred-1.8.1-hadoop2.jar&lt;/STRONG&gt; and &lt;STRONG&gt;avro-mapred-1.7.7-hadoop2.jar&lt;/STRONG&gt;. When that didn't help, I tried the jars from the &lt;STRONG&gt;HDP 2.4&lt;/STRONG&gt; distribution, viz. &lt;STRONG&gt;avro-1.7.5.jar&lt;/STRONG&gt; and &lt;STRONG&gt;avro-mapred-1.7.5-hadoop2.jar&lt;/STRONG&gt;, yet the error persisted.&lt;/P&gt;&lt;P&gt;How can I fix this error?&lt;/P&gt;</description>
      <pubDate>Tue, 11 Oct 2016 16:36:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124079#M86823</guid>
      <dc:creator>kaliyugantagoni</dc:creator>
      <dc:date>2016-10-11T16:36:10Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124080#M86824</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/5134/kaliyugantagonist.html" nodeid="5134"&gt;@Kaliyug Antagonist&lt;/A&gt;&lt;P&gt;This is actually a known issue, and there is a Jira for a documentation bug to get this fixed in a later HDP release. Sqoop uses 1.8.0 of avro and there are other Hadoop components using 1.7.5 or 1.7.4 avro. &lt;/P&gt;&lt;P&gt;Please add the following property after 'import': -Dmapreduce.job.user.classpath.first=true&lt;/P&gt;&lt;P&gt;Example: &lt;/P&gt;&lt;P&gt;sqoop import -Dmapreduce.job.user.classpath.first=true -Dhadoop.security.credential.provider.path=jceks://x.jceks --connect jdbc:db2://xxx:60000/x2 --username xx -password-alias xx --as-avrodatafile --target-dir xx/data/test --fields-terminated-by '\001' --table xx -m 1&lt;/P&gt;</description>
      <pubDate>Tue, 11 Oct 2016 16:51:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124080#M86824</guid>
      <dc:creator>ssubhas</dc:creator>
      <dc:date>2016-10-11T16:51:20Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124081#M86825</link>
      <description>&lt;P&gt;It worked :)
Can you provide the JIRA bug link?&lt;/P&gt;</description>
      <pubDate>Tue, 11 Oct 2016 17:35:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124081#M86825</guid>
      <dc:creator>kaliyugantagoni</dc:creator>
      <dc:date>2016-10-11T17:35:49Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124082#M86826</link>
      <description>&lt;P&gt;Thank you Sindhu,&lt;/P&gt;&lt;P&gt;I was facing the same problem. Now it works.&lt;/P&gt;</description>
      <pubDate>Mon, 06 Feb 2017 07:50:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124082#M86826</guid>
      <dc:creator>ejunior76</dc:creator>
      <dc:date>2017-02-06T07:50:17Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124083#M86827</link>
      <description>&lt;P&gt;Thanks! I faced a similar issue as well.&lt;/P&gt;</description>
      <pubDate>Sun, 26 Nov 2017 09:56:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124083#M86827</guid>
      <dc:creator>sarvesh_sood</dc:creator>
      <dc:date>2017-11-26T09:56:00Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124084#M86828</link>
      <description>&lt;P&gt;Thank you.&lt;/P&gt;</description>
      <pubDate>Sat, 17 Mar 2018 21:46:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124084#M86828</guid>
      <dc:creator>jun251111</dc:creator>
      <dc:date>2018-03-17T21:46:58Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124085#M86829</link>
      <description>&lt;P&gt;Thanks, it worked!&lt;/P&gt;&lt;P&gt;Could you please explain this addition: how does "-Dmapreduce.job.user.classpath.first=true -Dhadoop.security.credential.provider.path=jceks://x.jceks" make it work?&lt;/P&gt;</description>
      <pubDate>Mon, 02 Apr 2018 03:08:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124085#M86829</guid>
      <dc:creator>muhammad_danish</dc:creator>
      <dc:date>2018-04-02T03:08:51Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124086#M86830</link>
      <description>&lt;P&gt;For HDP 3.1, setting the following property will resolve the issue:&lt;/P&gt;&lt;P&gt;-Dmapreduce.job.classloader=true&lt;/P&gt;</description>
      <pubDate>Thu, 09 May 2019 04:58:54 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/124086#M86830</guid>
      <dc:creator>manmeetkaur_ran</dc:creator>
      <dc:date>2019-05-09T04:58:54Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import to avro failing - which jars to be used ?</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/310003#M223990</link>
      <description>&lt;P&gt;Thank you so much, Subha. It worked like magic.&lt;/P&gt;</description>
      <pubDate>Tue, 19 Jan 2021 11:38:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-to-avro-failing-which-jars-to-be-used/m-p/310003#M223990</guid>
      <dc:creator>Praveenya</dc:creator>
      <dc:date>2021-01-19T11:38:30Z</dc:date>
    </item>
  </channel>
</rss>

