<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: cdsw spark context issue in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/79750#M83463</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This is a known issue in the CDSW 1.3 release; please see the documentation:&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.cloudera.com/documentation/data-science-workbench/1-3-x/topics/cdsw_known_issues.html#cds__py4j" target="_blank"&gt;https://www.cloudera.com/documentation/data-science-workbench/1-3-x/topics/cdsw_known_issues.html#cds__py4j&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I also see that you are creating a SparkContext object directly. That should still work, but you might be better off using the newer Spark 2.x interfaces. You can find a few examples here:&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.cloudera.com/documentation/data-science-workbench/1-3-x/topics/cdsw_pyspark.html" target="_blank"&gt;https://www.cloudera.com/documentation/data-science-workbench/1-3-x/topics/cdsw_pyspark.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Peter&lt;/P&gt;</description>
    <pubDate>Thu, 13 Sep 2018 09:27:01 GMT</pubDate>
    <dc:creator>peter_ableda</dc:creator>
    <dc:date>2018-09-13T09:27:01Z</dc:date>
    <item>
      <title>cdsw spark context issue</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/79749#M83462</link>
      <description>&lt;P&gt;Hi, I am trying to start a Spark session via CDSW and hit the error shown below:&lt;/P&gt;&lt;PRE&gt;TypeError: __init__() got an unexpected keyword argument 'auth_token'&lt;/PRE&gt;&lt;P&gt;Code I used:&lt;/P&gt;&lt;PRE&gt;from pyspark import SparkContext
from pyspark import SparkConf
from pyspark.sql import HiveContext
from pyspark.sql import SQLContext

conf = SparkConf().set("spark.executor.memory", "12g") \
    .set("spark.yarn.executor.memoryOverhead", "3g") \
    .set("spark.dynamicAllocation.initialExecutors", "2") \
    .set("spark.driver.memory", "16g") \
    .set("spark.kryoserializer.buffer.max", "1g") \
    .set("spark.driver.cores", "32") \
    .set("spark.executor.cores", "8") \
    .set("spark.yarn.queue", "us9") \
    .set("spark.dynamicAllocation.maxExecutors", "32")

sparkContext = SparkContext.getOrCreate(conf=conf)&lt;/PRE&gt;&lt;P&gt;Has anyone seen this error before, or does anyone know how to solve it? Thanks in advance.&lt;/P&gt;</description>
      <pubDate>Tue, 21 Apr 2026 13:27:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/79749#M83462</guid>
      <dc:creator>cici</dc:creator>
      <dc:date>2026-04-21T13:27:24Z</dc:date>
    </item>
    <item>
      <title>Re: cdsw spark context issue</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/79750#M83463</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This is a known issue in the CDSW 1.3 release; please see the documentation:&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.cloudera.com/documentation/data-science-workbench/1-3-x/topics/cdsw_known_issues.html#cds__py4j" target="_blank"&gt;https://www.cloudera.com/documentation/data-science-workbench/1-3-x/topics/cdsw_known_issues.html#cds__py4j&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I also see that you are creating a SparkContext object directly. That should still work, but you might be better off using the newer Spark 2.x interfaces. You can find a few examples here:&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.cloudera.com/documentation/data-science-workbench/1-3-x/topics/cdsw_pyspark.html" target="_blank"&gt;https://www.cloudera.com/documentation/data-science-workbench/1-3-x/topics/cdsw_pyspark.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Peter&lt;/P&gt;</description>
      <pubDate>Thu, 13 Sep 2018 09:27:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/79750#M83463</guid>
      <dc:creator>peter_ableda</dc:creator>
      <dc:date>2018-09-13T09:27:01Z</dc:date>
    </item>
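    <!--
    Editor's note: Peter's reply above recommends moving from a hand-built
    SparkContext to the Spark 2.x interfaces. The sketch below is a hypothetical
    illustration, not part of the thread: it collects the question's settings in a
    plain Python dict and shows, in comments, how they would feed
    SparkSession.builder. pyspark itself is deliberately not imported, since it is
    only available on a CDSW/Spark node.

    ```python
    # Hypothetical sketch (not from the thread): the configuration values from the
    # original question, expressed as a plain dict that a Spark 2.x
    # SparkSession.builder could consume via repeated .config(key, value) calls,
    # instead of constructing a SparkContext directly.
    spark_settings = {
        "spark.executor.memory": "12g",
        "spark.yarn.executor.memoryOverhead": "3g",
        "spark.dynamicAllocation.initialExecutors": "2",
        "spark.driver.memory": "16g",
        "spark.kryoserializer.buffer.max": "1g",
        "spark.driver.cores": "32",
        "spark.executor.cores": "8",
        "spark.yarn.queue": "us9",
        "spark.dynamicAllocation.maxExecutors": "32",
    }

    # On a node where pyspark is installed, the Spark 2.x style would be
    # (shown as comments because pyspark is not available here):
    #
    #   from pyspark.sql import SparkSession
    #   builder = SparkSession.builder.appName("cdsw-example")
    #   for key, value in spark_settings.items():
    #       builder = builder.config(key, value)
    #   spark = builder.getOrCreate()   # SparkSession, the Spark 2.x entry point
    #   sc = spark.sparkContext         # still reachable if older code needs it
    ```

    SparkSession.getOrCreate() also avoids the "multiple contexts" problems that
    direct SparkContext construction can run into in notebook-style environments
    such as CDSW.
    -->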
    <item>
      <title>Re: cdsw spark context issue</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/79816#M83464</link>
      <description>&lt;P&gt;Thank you so much! My problem has been solved.&lt;/P&gt;</description>
      <pubDate>Fri, 14 Sep 2018 09:33:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/79816#M83464</guid>
      <dc:creator>cici</dc:creator>
      <dc:date>2018-09-14T09:33:42Z</dc:date>
    </item>
    <item>
      <title>Re: cdsw spark context issue</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/82505#M83465</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/11332"&gt;@peter_ableda&lt;/a&gt;&amp;nbsp;Sorry to bother you. I have installed CDSW 1.4 on my machine, and when I try to start the SparkSession or run any HDFS commands I get an UnknownHostException for the clouderamaster hostname. I am very new to Cloudera, so I am not sure which part of the setup I am missing; I followed the PySpark setup (importing the template while creating the project and starting the Python 2 environment to run the PySpark job). It would be a great help if you could point out what is missing from my setup. Thanks in advance!&lt;/P&gt;</description>
      <pubDate>Sat, 17 Nov 2018 17:53:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/cdsw-spark-context-issue/m-p/82505#M83465</guid>
      <dc:creator>t5</dc:creator>
      <dc:date>2018-11-17T17:53:26Z</dc:date>
    </item>
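    <!--
    Editor's note: the UnknownHostException described in the last post is a Java
    hostname-resolution failure, which usually means the CDSW host cannot resolve
    the Cloudera Manager / master hostname via DNS or /etc/hosts. The snippet
    below is a hypothetical diagnostic sketch, not from the thread; the hostname
    "clouderamaster" is the value quoted in the post.

    ```python
    import socket

    def can_resolve(hostname):
        """Return True if `hostname` resolves to an IP address on this machine.

        A Java UnknownHostException like the one described above typically means
        this same lookup fails on the CDSW host, e.g. because /etc/hosts or DNS
        has no entry for the cluster's master hostname.
        """
        try:
            socket.gethostbyname(hostname)
            return True
        except socket.gaierror:
            return False

    # Example check against the hostname named in the error (hypothetical value):
    # can_resolve("clouderamaster")
    ```

    If the check returns False, adding the master's IP and hostname to /etc/hosts
    on the CDSW host, or fixing DNS, is the usual first step.
    -->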
  </channel>
</rss>

