<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Envelope error while writing into hive table in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293523#M216739</link>
    <description>&lt;P&gt;Hi Jeremy,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for your quick response. Kindly find the required details below:&lt;/P&gt;&lt;P&gt;- Which version of Spark are you using? Is this on CDH? --&amp;gt; &lt;STRONG&gt;We are using Spark 2.4, and yes, we are on CDH.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;- Are you modifying Envelope before compiling? --&amp;gt; &lt;STRONG&gt;No, I didn't modify Envelope before compiling. It was working fine before we upgraded from CDH 5.16.1 to CDH 6.3.3; in both cases we used Spark 2.4.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;- Are you using any of your own Envelope plugins? --&amp;gt; &lt;STRONG&gt;Nope.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Arvind&lt;/P&gt;</description>
    <pubDate>Wed, 08 Apr 2020 16:45:14 GMT</pubDate>
    <dc:creator>akv31</dc:creator>
    <dc:date>2020-04-08T16:45:14Z</dc:date>
    <item>
      <title>Envelope error while writing into hive table</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293517#M216733</link>
      <description>&lt;P&gt;Hi Team,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;While using the Envelope tool to write into a Hive table, I'm getting the error below:&lt;/P&gt;&lt;P&gt;20/04/08 02:54:32 ERROR run.Runner: Pipeline exception occurred: org.apache.spark.sql.AnalysisException: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;;&lt;BR /&gt;20/04/08 02:54:32 ERROR yarn.ApplicationMaster: User class threw exception: java.util.concurrent.ExecutionException: org.apache.spark.sql.AnalysisException: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;;&lt;BR /&gt;java.util.concurrent.ExecutionException: org.apache.spark.sql.AnalysisException: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you please suggest how to overcome this?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Arvind&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2020 15:05:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293517#M216733</guid>
      <dc:creator>akv31</dc:creator>
      <dc:date>2020-04-08T15:05:40Z</dc:date>
    </item>
    <item>
      <title>Re: Envelope error while writing into hive table</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293518#M216734</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;A few questions to see if we can narrow it down:&lt;/P&gt;&lt;P&gt;- Which version of Spark are you using? Is this on CDH?&lt;/P&gt;&lt;P&gt;- Are you modifying Envelope before compiling?&lt;/P&gt;&lt;P&gt;- Are you using any of your own Envelope plugins?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Jeremy&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2020 15:33:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293518#M216734</guid>
      <dc:creator>Jeremy Beard</dc:creator>
      <dc:date>2020-04-08T15:33:43Z</dc:date>
    </item>
    <item>
      <title>Re: Envelope error while writing into hive table</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293523#M216739</link>
      <description>&lt;P&gt;Hi Jeremy,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for your quick response. Kindly find the required details below:&lt;/P&gt;&lt;P&gt;- Which version of Spark are you using? Is this on CDH? --&amp;gt; &lt;STRONG&gt;We are using Spark 2.4, and yes, we are on CDH.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;- Are you modifying Envelope before compiling? --&amp;gt; &lt;STRONG&gt;No, I didn't modify Envelope before compiling. It was working fine before we upgraded from CDH 5.16.1 to CDH 6.3.3; in both cases we used Spark 2.4.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;- Are you using any of your own Envelope plugins? --&amp;gt; &lt;STRONG&gt;Nope.&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Arvind&lt;/P&gt;</description>
      <pubDate>Wed, 08 Apr 2020 16:45:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293523#M216739</guid>
      <dc:creator>akv31</dc:creator>
      <dc:date>2020-04-08T16:45:14Z</dc:date>
    </item>
    <item>
      <title>Re: Envelope error while writing into hive table</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293577#M216762</link>
      <description>&lt;P&gt;Hi Jeremy,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I also tried rebuilding the jar after updating the Spark version to 2.4.0 in pom.xml (I also tried Spark versions 2.2.0 and 2.3.0), but I'm getting the error below. Do you think I need to change some other parameter in pom.xml to address this issue?&lt;/P&gt;&lt;PRE&gt;20/04/09 02:46:35 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3151)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3196)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3235)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:123)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3286)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3254)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:478)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
	at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$$anonfun$2.apply(YarnSparkHadoopUtil.scala:197)
	at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$$anonfun$2.apply(YarnSparkHadoopUtil.scala:197)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.hadoopFSsToAccess(YarnSparkHadoopUtil.scala:197)
	at org.apache.spark.deploy.yarn.security.YARNHadoopDelegationTokenManager.fileSystemsToAccess(YARNHadoopDelegationTokenManager.scala:80)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$9.apply(HadoopDelegationTokenManager.scala:291)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$9.apply(HadoopDelegationTokenManager.scala:291)
	at org.apache.spark.deploy.security.HadoopFSDelegationTokenProvider.obtainDelegationTokens(HadoopFSDelegationTokenProvider.scala:47)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:166)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$6.apply(HadoopDelegationTokenManager.scala:164)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
	at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
	at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainDelegationTokens(HadoopDelegationTokenManager.scala:164)
	at org.apache.spark.deploy.yarn.security.YARNHadoopDelegationTokenManager.obtainDelegationTokens(YARNHadoopDelegationTokenManager.scala:59)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anon$4.run(HadoopDelegationTokenManager.scala:259)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anon$4.run(HadoopDelegationTokenManager.scala:257)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainTokensAndScheduleRenewal(HadoopDelegationTokenManager.scala:257)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager.org$apache$spark$deploy$security$HadoopDelegationTokenManager$$updateTokensTask(HadoopDelegationTokenManager.scala:231)
	at org.apache.spark.deploy.security.HadoopDelegationTokenManager.start(HadoopDelegationTokenManager.scala:125)
	at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$$anonfun$start$1.apply(CoarseGrainedSchedulerBackend.scala:404)
	at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$$anonfun$start$1.apply(CoarseGrainedSchedulerBackend.scala:401)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.start(CoarseGrainedSchedulerBackend.scala:401)
	at org.apache.spark.scheduler.cluster.YarnClusterSchedulerBackend.start(YarnClusterSchedulerBackend.scala:37)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:186)
	at org.apache.spark.SparkContext.&amp;lt;init&amp;gt;(SparkContext.scala:511)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2549)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:944)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:935)
	at com.cloudera.labs.envelope.spark.Contexts.startSparkSession(Contexts.java:158)
	at com.cloudera.labs.envelope.spark.Contexts.getSparkSession(Contexts.java:87)
	at com.cloudera.labs.envelope.spark.Contexts.initialize(Contexts.java:130)
	at com.cloudera.labs.envelope.run.Runner.run(Runner.java:110)
	at com.cloudera.labs.envelope.EnvelopeMain.main(EnvelopeMain.java:58)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:673) &lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 09 Apr 2020 07:56:40 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293577#M216762</guid>
      <dc:creator>akv31</dc:creator>
      <dc:date>2020-04-09T07:56:40Z</dc:date>
    </item>
    <item>
      <title>Re: Envelope error while writing into hive table</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293617#M216782</link>
      <description>&lt;P&gt;That it was working on CDH 5.16.1 but not on CDH 6.3.3 tells me that there's some kind of classpath conflict that didn't exist before. I'd suggest modifying Envelope's top-level pom.xml file to point to the &lt;A href="https://docs.cloudera.com/documentation/enterprise/6/release-notes/topics/rg_cdh_6_maven_repo.html" target="_self"&gt;Cloudera Maven repository&lt;/A&gt; and to use the Spark version "2.4.0-cdh6.3.3".&lt;/P&gt;</description>
      <pubDate>Thu, 09 Apr 2020 15:11:45 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293617#M216782</guid>
      <dc:creator>Jeremy Beard</dc:creator>
      <dc:date>2020-04-09T15:11:45Z</dc:date>
    </item>
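    <!-- The pom.xml change suggested in the reply above, sketched out. This is
         a minimal illustration, not Envelope's actual pom.xml: the repository
         URL is the commonly documented Cloudera repo location, and the
         spark.version property name is an assumption; check Envelope's own
         top-level pom.xml for the exact property it defines. -->
    <!--
    ```xml
    <repositories>
      <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
      </repository>
    </repositories>

    <properties>
      <!- Build against the CDH-aligned Spark artifacts instead of upstream
          2.4.0, so Envelope's compile-time classpath matches the cluster. ->
      <spark.version>2.4.0-cdh6.3.3</spark.version>
    </properties>
    ```
    After this change, rebuilding with `mvn clean package` should pull the
    CDH 6.3.3 Spark (and its matching Jackson and Hadoop versions) from the
    Cloudera repository, which avoids the classpath conflicts seen above.
    -->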
    <item>
      <title>Re: Envelope error while writing into hive table</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293660#M216806</link>
      <description>&lt;P&gt;Hi Jeremy,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for your response. As suggested, I rebuilt the jar and the issue got resolved.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;BR /&gt;Arvind&lt;/P&gt;</description>
      <pubDate>Fri, 10 Apr 2020 11:53:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Envelope-error-while-writing-into-hive-table/m-p/293660#M216806</guid>
      <dc:creator>akv31</dc:creator>
      <dc:date>2020-04-10T11:53:22Z</dc:date>
    </item>
  </channel>
</rss>

