Posted 07-12-2019 11:43 AM
> Key: SPARK-23476
> URL: https://issues.apache.org/jira/browse/SPARK-23476
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 2.3.0
> Reporter: Gabor Somogyi
> Priority: Minor
>
> If Spark is run with "spark.authenticate=true", it will fail to start in local mode.
> {noformat}
> 17/02/03 12:09:39 ERROR spark.SparkContext: Error initializing SparkContext.
> java.lang.IllegalArgumentException: Error: a secret key must be specified via the spark.authenticate.secret config
> at org.apache.spark.SecurityManager.generateSecretKey(SecurityManager.scala:401)
> at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:221)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:258)
> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:199)
> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:290)
> ...
> {noformat}
> This can be confusing when authentication is turned on by default in a cluster and one tries to start Spark in local mode for a simple test.
> *Workaround*: If {{spark.authenticate=true}} is specified as a cluster-wide config, then the following has to be added to the spark-submit command:
> {{--conf "spark.authenticate=false" --conf "spark.shuffle.service.enabled=false" --conf "spark.dynamicAllocation.enabled=false" --conf "spark.network.crypto.enabled=false" --conf "spark.authenticate.enableSaslEncryption=false"}}
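The error message above points at {{spark.authenticate.secret}} as the missing piece, so an alternative to disabling authentication (as in the quoted workaround) would be to supply a secret explicitly when building the local-mode context. The sketch below is only an illustration of that idea, not the fix from the issue: the app name and secret value are placeholders, and whether an explicit secret is acceptable depends on the cluster's security policy.

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

object LocalAuthSketch {
  def main(args: Array[String]): Unit = {
    // Keep authentication on, but provide the secret the SecurityManager asks for.
    // "local-test-secret" is a placeholder value for a local smoke test only.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("local-auth-sketch")
      .set("spark.authenticate", "true")
      .set("spark.authenticate.secret", "local-test-secret")

    val sc = new SparkContext(conf)
    try {
      // Trivial job just to confirm the context started and can run work.
      println(sc.parallelize(1 to 10).sum())
    } finally {
      sc.stop()
    }
  }
}
{code}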