<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Question: Issue installing Spark in Windows 10 (in Archives of Support Questions, Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-installing-Spark-in-windows-10/m-p/65932#M76711</link>
    <description>Archived support question: spark-shell fails on Windows 10 with Spark 2.3.0 under JDK 10; resolved by switching to JDK 8.</description>
    <pubDate>Fri, 16 Sep 2022 13:02:57 GMT</pubDate>
    <dc:creator>gskn</dc:creator>
    <dc:date>2022-09-16T13:02:57Z</dc:date>
    <item>
      <title>Issue installing Spark in Windows 10</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-installing-Spark-in-windows-10/m-p/65932#M76711</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I followed the steps below when installing Spark:&lt;/P&gt;&lt;P&gt;1. Downloaded JDK 10.&lt;/P&gt;&lt;P&gt;2. Set the environment variable "JAVA_HOME" under user variables and added %JAVA_HOME%\bin to the user PATH.&lt;/P&gt;&lt;P&gt;3. Downloaded Spark (spark-2.3.0-bin-hadoop2.7) and copied all files into a folder called "spark".&lt;/P&gt;&lt;P&gt;4. Downloaded winutils.exe from the internet (GitHub).&lt;/P&gt;&lt;P&gt;5. Set up the environment variables "HADOOP_HOME" and "SPARK_HOME".&lt;/P&gt;&lt;P&gt;6. Added %SPARK_HOME%\bin to PATH.&lt;/P&gt;&lt;P&gt;When I run spark-shell I get the error message below:&lt;/P&gt;&lt;PRE&gt;C:\spark&amp;gt;java -version
java version "10" 2018-03-20
Java(TM) SE Runtime Environment 18.3 (build 10+46)
Java HotSpot(TM) 64-Bit Server VM 18.3 (build 10+46, mixed mode)

C:\spark&amp;gt;spark-shell
Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.apache.hadoop.util.StringUtils.&amp;lt;clinit&amp;gt;(StringUtils.java:80)
        at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2464)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2464)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2464)
        at org.apache.spark.SecurityManager.&amp;lt;init&amp;gt;(SecurityManager.scala:222)
        at org.apache.spark.deploy.SparkSubmit$.secMgr$lzycompute$1(SparkSubmit.scala:393)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:393)
        at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
        at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
        at scala.Option.map(Option.scala:146)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:400)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
        at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3107)
        at java.base/java.lang.String.substring(String.java:1873)
        at org.apache.hadoop.util.Shell.&amp;lt;clinit&amp;gt;(Shell.java:52)
        ... 21 more

C:\spark&amp;gt;&lt;/PRE&gt;&lt;P&gt;Can anyone please help with the above error?&lt;/P&gt;</description>
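      <!--
        The six setup steps in the post can be sanity-checked before launching
        spark-shell. A minimal Java sketch (illustrative only; the class name and
        the choice of variables to verify are assumptions, not from the post):

        import java.io.File;

        public class SparkEnvCheck {
            public static void main(String[] args) {
                // The variables the post configures: JDK, Spark, and winutils locations.
                for (String name : new String[] {"JAVA_HOME", "SPARK_HOME", "HADOOP_HOME"}) {
                    String value = System.getenv(name);
                    if (value == null) {
                        System.err.println(name + " is not set");
                    } else if (!new File(value, "bin").isDirectory()) {
                        System.err.println(name + "=" + value + " lacks a bin directory");
                    } else {
                        System.out.println(name + "=" + value + " looks OK");
                    }
                }
            }
        }
      -->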
      <pubDate>Fri, 16 Sep 2022 13:02:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-installing-Spark-in-windows-10/m-p/65932#M76711</guid>
      <dc:creator>gskn</dc:creator>
      <dc:date>2022-09-16T13:02:57Z</dc:date>
    </item>
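    <!--
      Why the trace above ends in StringIndexOutOfBoundsException: Hadoop 2.7.x's
      org.apache.hadoop.util.Shell parses the java.version system property by
      taking its first three characters in a static initializer, roughly as in
      the sketch below (treat the exact field name as an assumption). JDK 8
      reports a legacy "1.8.0"-prefixed string, but JDK 10 reports just "10",
      which is only two characters long, hence "begin 0, end 3, length 2":

        public class VersionParseRepro {
            public static void main(String[] args) {
                String v = "10"; // what a JDK 10 runtime reports for java.version
                // Mirrors Shell.java's static initializer; throws
                // StringIndexOutOfBoundsException on a two-character version string.
                boolean java7OrAbove = v.substring(0, 3).compareTo("1.7") >= 0;
                System.out.println(java7OrAbove);
            }
        }
    -->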
    <item>
      <title>Re: Issue installing Spark in Windows 10</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-installing-Spark-in-windows-10/m-p/65938#M76712</link>
      <description>&lt;P&gt;Update: the issue was resolved by switching to JDK 8.&lt;/P&gt;</description>
      <pubDate>Sat, 31 Mar 2018 22:58:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Issue-installing-Spark-in-windows-10/m-p/65938#M76712</guid>
      <dc:creator>gskn</dc:creator>
      <dc:date>2018-03-31T22:58:58Z</dc:date>
    </item>
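    <!--
      The fix above works because JDK 8 reports a "1."-prefixed java.version
      that the three-character parse in Hadoop 2.7.x can handle. A minimal guard
      one could run before spark-shell to confirm the active JDK is compatible
      (class name and messages are illustrative assumptions):

        public class JavaVersionGuard {
            public static void main(String[] args) {
                String v = System.getProperty("java.version");
                // Hadoop 2.7.x effectively requires at least three characters
                // and expects the legacy "1.x" numbering used through JDK 8.
                if (v.length() >= 3 && v.startsWith("1.")) {
                    System.out.println("java.version " + v + " should work with Spark 2.3.0 / Hadoop 2.7");
                } else {
                    System.err.println("java.version " + v + " will break Hadoop 2.7's Shell; use JDK 8");
                }
            }
        }
    -->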
  </channel>
</rss>

