Created 10-11-2016 09:36 AM
HDP-2.5.0.0 using Ambari 2.4.0.1
A Sqoop import to avro fails with the following error :
16/10/11 08:26:32 INFO mapreduce.Job: Job job_1476162030393_0002 running in uber mode : false
16/10/11 08:26:32 INFO mapreduce.Job:  map 0% reduce 0%
16/10/11 08:26:40 INFO mapreduce.Job:  map 25% reduce 0%
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:40 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:41 INFO mapreduce.Job:  map 0% reduce 0%
16/10/11 08:26:42 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_0, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:46 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:47 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:47 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:48 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_1, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000001_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000002_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:51 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000003_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:52 INFO mapreduce.Job: Task Id : attempt_1476162030393_0002_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
16/10/11 08:26:57 INFO mapreduce.Job:  map 100% reduce 0%
16/10/11 08:26:57 INFO mapreduce.Job: Job job_1476162030393_0002 failed with state FAILED due to: Task failed task_1476162030393_0002_m_000002
Job failed as tasks failed. failedMaps:1 failedReduces:0
The YARN application log ends with :
FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
	at org.apache.sqoop.mapreduce.AvroOutputFormat.getRecordWriter(AvroOutputFormat.java:97)
	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:647)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:767)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
The original installation had the following libraries under /usr/hdp/2.5.0.0-1245/sqoop/lib:
avro-mapred-1.8.0-hadoop2.jar, parquet-avro-1.4.1.jar, avro-1.8.0.jar
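As a diagnostic sketch (paths taken from the listing above; adjust for your cluster), one can check which Avro jars are on the Sqoop lib path and whether a given jar actually contains the missing method. Note that `addLogicalTypeConversion` is declared on `org.apache.avro.generic.GenericData`, which `ReflectData` inherits from, and it exists in Avro 1.8.x but not in 1.7.x:

```shell
# List the Avro-related jars shipped with Sqoop on this cluster
ls /usr/hdp/2.5.0.0-1245/sqoop/lib/ | grep -i avro

# Check whether the jar's GenericData class declares addLogicalTypeConversion
# (present in avro-1.8.0.jar, absent in avro-1.7.x jars)
javap -classpath /usr/hdp/2.5.0.0-1245/sqoop/lib/avro-1.8.0.jar \
    org.apache.avro.generic.GenericData | grep addLogicalTypeConversion
```

If the grep finds nothing for the jar that actually wins on the task classpath, the tasks are running against an older Avro than Sqoop was compiled for, which matches the `NoSuchMethodError` above.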
I first tried replacing avro-mapred-1.8.0-hadoop2.jar (swapping only one jar at a time under lib) with avro-mapred-1.8.1-hadoop2.jar and then avro-mapred-1.7.7-hadoop2.jar. When that didn't help, I tried the jars from the HDP 2.4 distribution, viz. avro-1.7.5.jar and avro-mapred-1.7.5-hadoop2.jar, yet the error persisted.
How can I fix this error?
Created 10-11-2016 09:51 AM
This is actually a known issue, and there is a Jira for a documentation bug to get this fixed in a later HDP release. Sqoop uses Avro 1.8.0, while other Hadoop components use Avro 1.7.5 or 1.7.4.
Please add the following property after 'import': -Dmapreduce.job.user.classpath.first=true
Example:
sqoop import \
  -Dmapreduce.job.user.classpath.first=true \
  -Dhadoop.security.credential.provider.path=jceks://x.jceks \
  --connect jdbc:db2://xxx:60000/x2 \
  --username xx \
  --password-alias xx \
  --as-avrodatafile \
  --target-dir xx/data/test \
  --fields-terminated-by '\001' \
  --table xx \
  -m 1
Created 10-11-2016 10:35 AM
It worked :) Can you provide a link to the JIRA bug?
Created 02-05-2017 11:50 PM
Thank you Sindhu,
I was facing the same problem. Now it works.
Created 04-01-2018 08:08 PM
Thanks, it worked!
Could you please explain how this addition makes it work? "-Dmapreduce.job.user.classpath.first=true -Dhadoop.security.credential.provider.path=jceks://x.jceks"
Created 01-19-2021 03:38 AM
Thank you So much Subha, It worked like magic.
Created 11-26-2017 01:56 AM
Thanks! I faced a similar issue too.
Created 03-17-2018 02:46 PM
Thank you.
Created 05-08-2019 09:58 PM
For HDP 3.1, setting the following property will resolve the issue:
-Dmapreduce.job.classloader=true
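For completeness, the flag goes in the same position as in the earlier example, immediately after `import`. A sketch reusing the placeholder connection details from the accepted answer (they are not real values):

```shell
# HDP 3.1: isolate job classes via the MapReduce job classloader
# instead of prepending the user classpath
sqoop import \
  -Dmapreduce.job.classloader=true \
  --connect jdbc:db2://xxx:60000/x2 \
  --username xx \
  --password-alias xx \
  --as-avrodatafile \
  --target-dir xx/data/test \
  --table xx \
  -m 1
```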