<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Spark on Yarn - Unexpected end-of-input: was expecting closing quote for a string value - Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57653#M36815</link>
    <description>&lt;P&gt;Hi all, after recently upgrading to CDH 5.11 I get tons of the following "Unexpected end-of-input" log entries related to "SPARK" (running on YARN) and classified as "ERRORS".&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm experiencing malfunctions (failed Oozie jobs) and I believe they are related to these errors, so I'd really like to resolve the underlying issue and see if the situation improves.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In the logs, "source" is:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;FsHistoryProvider&lt;/PRE&gt;&lt;P&gt;And "message" is:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;Exception encountered when attempting to load application log hdfs://xxxxx.xxxxx.zz:8020/user/spark/applicationHistory/application_1494352758818_0117_1
com.fasterxml.jackson.core.JsonParseException: Unexpected end-of-input: was expecting closing quote for a string value
 at [Source: java.io.StringReader@1fec7fc4; line: 1, column: 3655]
	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1369)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:599)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportInvalidEOF(ParserMinimalBase.java:532)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._finishString2(ReaderBasedJsonParser.java:1517)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._finishString(ReaderBasedJsonParser.java:1505)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.getText(ReaderBasedJsonParser.java:205)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:20)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:42)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:35)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:28)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:42)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:35)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:42)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:35)
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:2888)
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2034)
	at org.json4s.jackson.JsonMethods$class.parse(JsonMethods.scala:19)
	at org.json4s.jackson.JsonMethods$.parse(JsonMethods.scala:44)
	at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:58)
	at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$replay(FsHistoryProvider.scala:583)
	at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$16.apply(FsHistoryProvider.scala:410)
	at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$16.apply(FsHistoryProvider.scala:407)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
	at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$mergeApplicationListing(FsHistoryProvider.scala:407)
	at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$checkForLogs$3$$anon$4.run(FsHistoryProvider.scala:309)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)&lt;/PRE&gt;&lt;P&gt;Any suggestions/ideas? Thanks!&lt;/P&gt;</description>
    <pubDate>Fri, 16 Sep 2022 11:57:18 GMT</pubDate>
    <dc:creator>FrozenWave</dc:creator>
    <dc:date>2022-09-16T11:57:18Z</dc:date>
    <item>
      <title>Spark on Yarn - Unexpected end-of-input: was expecting closing quote for a string value</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57653#M36815</link>
      <description>&lt;P&gt;Hi all, after recently upgrading to CDH 5.11 I get tons of the following "Unexpected end-of-input" log entries related to "SPARK" (running on YARN) and classified as "ERRORS".&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm experiencing malfunctions (failed Oozie jobs) and I believe they are related to these errors, so I'd really like to resolve the underlying issue and see if the situation improves.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In the logs, "source" is:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;FsHistoryProvider&lt;/PRE&gt;&lt;P&gt;And "message" is:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;Exception encountered when attempting to load application log hdfs://xxxxx.xxxxx.zz:8020/user/spark/applicationHistory/application_1494352758818_0117_1
com.fasterxml.jackson.core.JsonParseException: Unexpected end-of-input: was expecting closing quote for a string value
 at [Source: java.io.StringReader@1fec7fc4; line: 1, column: 3655]
	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1369)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:599)
	at com.fasterxml.jackson.core.base.ParserMinimalBase._reportInvalidEOF(ParserMinimalBase.java:532)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._finishString2(ReaderBasedJsonParser.java:1517)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._finishString(ReaderBasedJsonParser.java:1505)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.getText(ReaderBasedJsonParser.java:205)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:20)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:42)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:35)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:28)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:42)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:35)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:42)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:35)
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:2888)
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2034)
	at org.json4s.jackson.JsonMethods$class.parse(JsonMethods.scala:19)
	at org.json4s.jackson.JsonMethods$.parse(JsonMethods.scala:44)
	at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:58)
	at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$replay(FsHistoryProvider.scala:583)
	at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$16.apply(FsHistoryProvider.scala:410)
	at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$16.apply(FsHistoryProvider.scala:407)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
	at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$mergeApplicationListing(FsHistoryProvider.scala:407)
	at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$checkForLogs$3$$anon$4.run(FsHistoryProvider.scala:309)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)&lt;/PRE&gt;&lt;P&gt;Any suggestions/ideas? Thanks!&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 11:57:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57653#M36815</guid>
      <dc:creator>FrozenWave</dc:creator>
      <dc:date>2022-09-16T11:57:18Z</dc:date>
    </item>
    <item>
      <title>Re: Spark on Yarn - Unexpected end-of-input: was expecting closing quote for a string value</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57655#M36816</link>
      <description>&lt;P&gt;Additional info: if I run the Spark CLI (where, by the way, my Spark procedures do work, unlike when they are launched from Oozie), as soon as I try to define a DataFrame I receive the following warning, which I had never seen before the upgrade:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;In [17]: utenti_DF = sqlContext.table("xxxx.yyyy")
17/07/19 15:48:58 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0
17/07/19 15:48:58 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException&lt;/PRE&gt;&lt;P&gt;Anyway, as I said, things do work from the CLI. I just thought this might be relevant.&lt;/P&gt;</description>
      <pubDate>Wed, 19 Jul 2017 14:05:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57655#M36816</guid>
      <dc:creator>FrozenWave</dc:creator>
      <dc:date>2017-07-19T14:05:06Z</dc:date>
    </item>
    <item>
      <title>Re: Spark on Yarn - Unexpected end-of-input: was expecting closing quote for a string value</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57659#M36817</link>
      <description>I am not sure about the first portion, but the second bit seems to indicate that you did not upgrade your Hive schema during the upgrade process. You will need to stop Hive. In CM, go to the Hive service; the Actions menu should have an Upgrade Hive Metastore Database Schema option. Run that and see if it clears up the warning.</description>
      <pubDate>Wed, 19 Jul 2017 15:10:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57659#M36817</guid>
      <dc:creator>mbigelow</dc:creator>
      <dc:date>2017-07-19T15:10:51Z</dc:date>
    </item>
    <item>
      <title>Re: Spark on Yarn - Unexpected end-of-input: was expecting closing quote for a string value</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57664#M36818</link>
      <description>&lt;P&gt;Hi mbigelow, I've tried what you suggested (stopping Hive and running the action from the dropdown menu).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The process was successful, but the warning in the Spark CLI is still there...&lt;/P&gt;</description>
      <pubDate>Wed, 19 Jul 2017 15:58:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57664#M36818</guid>
      <dc:creator>FrozenWave</dc:creator>
      <dc:date>2017-07-19T15:58:02Z</dc:date>
    </item>
    <item>
      <title>Re: Spark on Yarn - Unexpected end-of-input: was expecting closing quote for a string value</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57837#M36819</link>
      <description>Based on some Stack Overflow posts, the exception is most likely related to invalid JSON somewhere. This is the Spark History Server, though, and I cannot think of any JSON files it would be using on a regular basis.&lt;BR /&gt;&lt;BR /&gt;On mine I see a redaction-rules.json. Are you using redaction?&lt;BR /&gt;&lt;BR /&gt;Oh wow, I think it was staring us in the face: it is trying to read a specific application log that contains invalid JSON. Read that file and put its output into a JSON validator to see what is invalid. I would save a copy somewhere so it can be reviewed again if needed. Then remove it and try to run the job again. If it fails again, then something is causing the invalid JSON to be written into the application log.</description>
      <pubDate>Fri, 21 Jul 2017 21:32:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57837#M36819</guid>
      <dc:creator>mbigelow</dc:creator>
      <dc:date>2017-07-21T21:32:22Z</dc:date>
    </item>
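A note on the validation advice above: Spark History Server event logs are written as one JSON object per line, so a validator fed the whole file at once can miss (or misreport) a single truncated line, which is exactly the kind of corruption the "expecting closing quote for a string value" error suggests. A minimal line-by-line check might look like the sketch below; the helper name and the sample records are illustrative, not taken from the thread.

```python
import json

def find_invalid_json_lines(lines):
    """Return the 1-based numbers of lines that fail to parse as JSON.

    Spark event logs contain one JSON object per line, so a truncated
    write (for example, a string missing its closing quote) corrupts a
    single line while leaving the rest of the file parseable.
    """
    bad = []
    for lineno, line in enumerate(lines, start=1):
        stripped = line.strip()
        if not stripped:
            continue  # blank lines are harmless
        try:
            json.loads(stripped)
        except json.JSONDecodeError:
            bad.append(lineno)
    return bad

# The second record mimics a truncated event, the failure mode in the
# stack trace above.
records = [
    '{"Event": "SparkListenerApplicationStart", "App Name": "demo"}',
    '{"Event": "SparkListenerJobEnd", "Job ID": 7, "Result": "JobSucce',
]
print(find_invalid_json_lines(records))  # [2]
```

On a real cluster you would first copy the log out of HDFS (e.g. with hdfs dfs -get) and run the check on the local copy.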
    <item>
      <title>Re: Spark on Yarn - Unexpected end-of-input: was expecting closing quote for a string value</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57858#M36820</link>
      <description>&lt;P&gt;Thanks mbigelow, following your suggestions I solved the massive error-logging issue.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I ran the specific log file referenced in the Java stack trace through a JSON validator:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;/user/spark/applicationHistory/application_1494352758818_0117_1&lt;/PRE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;According to the validator the format was correct, so I just moved the file to a temporary directory. As soon as I did, the error messages stopped clogging the system logs. It was probably corrupted in a very subtle way, but it was definitely corrupted.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;That JSON file was indeed generated by the Spark Action that is giving me problems, but it was an OLD file. New instances of that Spark Action are generating new JSON logs, and they are not giving the History Server any trouble (the flood of logged exceptions has stopped, as I said).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Unfortunately, the Spark job itself is still failing and needs further investigation on my side, so apparently it is not related to this specific error message.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Still, I've solved an annoying problem and at the same time ruled out the possibility that the Spark Action issue is related to that Java exception.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Sat, 22 Jul 2017 15:45:04 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-on-Yarn-Unexpected-end-of-input-was-expecting-closing/m-p/57858#M36820</guid>
      <dc:creator>FrozenWave</dc:creator>
      <dc:date>2017-07-22T15:45:04Z</dc:date>
    </item>
  </channel>
</rss>

