<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Install two versions of Spark on same cluster. in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125310#M43258</link>
    <description>&lt;P&gt;I have installed Spark 2.1 manually on HDP 2.3.4  while there is another version Spark 1.5 already installed via HDP. When i try to run jobs in yarn cluster mode spark 2.1 is resolving to HDP 2.3.4 spark libraries and resulting in bad substitution errors. Any ideas how you were able to resolve this when using two spark versions ? &lt;/P&gt;</description>
    <pubDate>Thu, 15 Jun 2017 23:52:52 GMT</pubDate>
    <dc:creator>prnuamat</dc:creator>
    <dc:date>2017-06-15T23:52:52Z</dc:date>
    <item>
      <title>Install two versions of Spark on same cluster.</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125306#M43254</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;Is it possible to install two version of Spark on the same cluster? I know that in HDP 2.5 it supports Spark 1.6 and Spark 2.0 but I want them on either HDP 2.3.6 to install Spark 1.5 and Spark 1.6. &lt;/P&gt;&lt;P&gt;The reason for above requirement is we are integrating the external tool with Hadoop and Spark, where one tool support only Spark 1.5 and other tool require Spark 1.6. &lt;/P&gt;&lt;P&gt;Any help is highly appricated. &lt;/P&gt;</description>
      <pubDate>Wed, 12 Oct 2016 04:12:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125306#M43254</guid>
      <dc:creator>bandarusridhar1</dc:creator>
      <dc:date>2016-10-12T04:12:38Z</dc:date>
    </item>
    <item>
      <title>Re: Install two versions of Spark on same cluster.</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125307#M43255</link>
      <description>&lt;P&gt; &lt;A rel="user" href="https://community.cloudera.com/users/5746/bandarusridhar1.html" nodeid="5746"&gt;@SBandaru&lt;/A&gt; from a technical sense you could do it but this would not be supported.  If spark 1.6.x is required I recommend upgrading to HDP 2.4.2.  Regarding your tool which support older version, I would reach out to vendor and ask them to come up to latest release.  Spark has moved on to 2.0.  Being on 1.5 well behind the ball in terms of spark.&lt;/P&gt;</description>
      <pubDate>Wed, 12 Oct 2016 04:15:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125307#M43255</guid>
      <dc:creator>sunile_manjee</dc:creator>
      <dc:date>2016-10-12T04:15:00Z</dc:date>
    </item>
    <item>
      <title>Re: Install two versions of Spark on same cluster.</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125308#M43256</link>
      <description>&lt;P&gt;&lt;A href="https://community.hortonworks.com/users/5746/bandarusridhar1.html"&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.hortonworks.com/users/5746/bandarusridhar1.html"&gt;@SBandaru&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Yes. Actually, it is technically possible and even done. HDP 2.5 includes two versions of Spark: 1.6.2 production level and 2.0 technical preview. They co-exist having different timeline server. You can add Spark 2.0 using Ambari UI and "Add Service". In this case, the reason is to provide a preview of Spark 2.0, however, it is a business decision whether it makes sense.&lt;/P&gt;&lt;P&gt;If any of the responses was helpful, don't forget to vote/accept best answer.&lt;/P&gt;</description>
      <pubDate>Sat, 15 Oct 2016 10:32:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125308#M43256</guid>
      <dc:creator>cstanca</dc:creator>
      <dc:date>2016-10-15T10:32:56Z</dc:date>
    </item>
    <item>
      <title>Re: Install two versions of Spark on same cluster.</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125309#M43257</link>
      <description>&lt;P&gt;I have cluster with two spark versions, one was installed with cluster, one manually for zeppelin and livy server, no issues.&lt;/P&gt;</description>
      <pubDate>Sat, 15 Oct 2016 20:02:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125309#M43257</guid>
      <dc:creator>xfox</dc:creator>
      <dc:date>2016-10-15T20:02:38Z</dc:date>
    </item>
    <item>
      <title>Re: Install two versions of Spark on same cluster.</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125310#M43258</link>
      <description>&lt;P&gt;I have installed Spark 2.1 manually on HDP 2.3.4  while there is another version Spark 1.5 already installed via HDP. When i try to run jobs in yarn cluster mode spark 2.1 is resolving to HDP 2.3.4 spark libraries and resulting in bad substitution errors. Any ideas how you were able to resolve this when using two spark versions ? &lt;/P&gt;</description>
      <pubDate>Thu, 15 Jun 2017 23:52:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Install-two-versions-of-Spark-on-same-cluster/m-p/125310#M43258</guid>
      <dc:creator>prnuamat</dc:creator>
      <dc:date>2017-06-15T23:52:52Z</dc:date>
    </item>
  </channel>
</rss>