Envelope error while writing into Hive table
Labels:
- Apache Hive
- Apache Kudu
- Apache Spark
Created 04-08-2020 08:05 AM
Hi Team,
While using the Envelope tool to write into a Hive table, I'm getting the below error:
20/04/08 02:54:32 ERROR run.Runner: Pipeline exception occurred: org.apache.spark.sql.AnalysisException: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;;
20/04/08 02:54:32 ERROR yarn.ApplicationMaster: User class threw exception: java.util.concurrent.ExecutionException: org.apache.spark.sql.AnalysisException: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;;
java.util.concurrent.ExecutionException: org.apache.spark.sql.AnalysisException: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;;
Can you please provide any suggestions to overcome this?
Regards,
Arvind
Created 04-08-2020 08:33 AM
Hi,
A few questions to see if we can narrow it down:
- Which version of Spark are you using? Is this on CDH?
- Are you modifying Envelope before compiling?
- Are you using any of your own Envelope plugins?
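In the meantime, it could also be worth checking which Jackson versions your build pulls in, since a NoSuchMethodError on ObjectMapper.readerFor usually means two incompatible Jackson versions ended up on the classpath. A quick way to check, assuming you build with the stock Envelope pom:

    mvn dependency:tree -Dincludes=com.fasterxml.jackson.core

That lists every com.fasterxml.jackson.core artifact in the dependency tree, so a version mismatch should stand out.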
Jeremy
Created 04-08-2020 09:45 AM
Hi Jeremy,
Thanks for your quick response. Kindly find the required details below:
- Which version of Spark are you using? Is this on CDH? --> We are using Spark 2.4, and yes, we are on CDH.
- Are you modifying Envelope before compiling? --> No, I didn't modify Envelope before compiling. It was working fine on CDH 5.16.1 with Spark 2.4, but stopped working after we upgraded to CDH 6.3.3 (still on Spark 2.4).
- Are you using any of your own Envelope plugins? --> Nope.
Regards,
Arvind
Created on 04-09-2020 12:54 AM - edited 04-09-2020 12:56 AM
Hi Jeremy,
I also tried to rebuild the jar after updating the Spark version to 2.4.0 in pom.xml, but I'm getting the below error (I also tried with Spark versions 2.2.0 and 2.3.0). Do you think I need to change some other parameter in pom.xml to address this issue?
20/04/09 02:46:35 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3151)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3196)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3235)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:123)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3286)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3254)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:478)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
    at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$$anonfun$2.apply(YarnSparkHadoopUtil.scala:197)
    at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$$anonfun$2.apply(YarnSparkHadoopUtil.scala:197)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.hadoopFSsToAccess(YarnSparkHadoopUtil.scala:197)
    at org.apache.spark.deploy.yarn.security.YARNHadoopDelegationTokenManager.fileSystemsToAccess(YARNHadoopDelegationTokenManager.scala:80)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$9.apply(HadoopDelegationTokenManager.scala:291)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$9.apply(HadoopDelegationTokenManager.scala:291)
    at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider.obtainDelegationTokens(HadoopFSDelegationTokenProvider.scala:47)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:166)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:164)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainDelegationTokens(HadoopDelegationTokenManager.scala:164)
    at org.apache.spark.deploy.yarn.security.YARNHadoopDelegationTokenManager.obtainDelegationTokens(YARNHadoopDelegationTokenManager.scala:59)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anon$4.run(HadoopDelegationTokenManager.scala:259)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anon$4.run(HadoopDelegationTokenManager.scala:257)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainTokensAndScheduleRenewal(HadoopDelegationTokenManager.scala:257)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager.org$apache$spark$deploy$security$HadoopDelegationTokenManager$$updateTokensTask(HadoopDelegationTokenManager.scala:231)
    at org.apache.spark.deploy.security.HadoopDelegationTokenManager.start(HadoopDelegationTokenManager.scala:125)
    at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$$anonfun$start$1.apply(CoarseGrainedSchedulerBackend.scala:404)
    at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$$anonfun$start$1.apply(CoarseGrainedSchedulerBackend.scala:401)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.start(CoarseGrainedSchedulerBackend.scala:401)
    at org.apache.spark.scheduler.cluster.YarnClusterSchedulerBackend.start(YarnClusterSchedulerBackend.scala:37)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:186)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:511)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2549)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:944)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:935)
    at com.cloudera.labs.envelope.spark.Contexts.startSparkSession(Contexts.java:158)
    at com.cloudera.labs.envelope.spark.Contexts.getSparkSession(Contexts.java:87)
    at com.cloudera.labs.envelope.spark.Contexts.initialize(Contexts.java:130)
    at com.cloudera.labs.envelope.run.Runner.run(Runner.java:110)
    at com.cloudera.labs.envelope.EnvelopeMain.main(EnvelopeMain.java:58)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:673)
Created 04-09-2020 08:11 AM
That it was working on CDH 5.16.1 but not on CDH 6.3.3 tells me that there's some kind of classpath conflict that didn't exist before. I'd suggest modifying Envelope's top-level pom.xml file to point to the Cloudera Maven repository and to use the Spark version "2.4.0-cdh6.3.3".
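For reference, the changes would look roughly like this; note that the <spark.version> property name is my assumption about how Envelope's top-level pom.xml is structured, so adapt it to what you find in your checkout:

    <!-- Add the Cloudera repository so CDH-flavoured artifacts resolve -->
    <repositories>
      <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
      </repository>
    </repositories>

    <!-- Build against the CDH build of Spark rather than upstream 2.4.0 -->
    <properties>
      <spark.version>2.4.0-cdh6.3.3</spark.version>
    </properties>

The HftpFileSystem IllegalAccessError from your rebuild attempt points the same way: building against upstream Spark 2.4.0 pulls in upstream Hadoop 2.x classes, which likely clash with the Hadoop 3 jars that CDH 6.3.3 ships on the cluster.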
Created on 04-10-2020 12:06 AM - edited 04-10-2020 04:53 AM
Hi Jeremy,
Thanks for your response. I rebuilt the jar as suggested and the issue got resolved.
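For anyone who hits the same errors, the steps were roughly as follows; the jar name and pipeline file below are illustrative placeholders, not the exact ones from our environment:

    # In Envelope's top-level pom.xml, add the Cloudera Maven repository
    # and set the Spark version to 2.4.0-cdh6.3.3, then rebuild:
    mvn clean package -DskipTests

    # Resubmit the rebuilt jar with your pipeline configuration:
    spark-submit envelope-<version>.jar my_pipeline.conf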
Regards,
Arvind
