
spark_shuffle fails nodemanager

SOLVED

New Contributor

 

Hi,

I have CDS 2.3.2 running on CDH 5.15.

  

I have configured the shuffle service as follows:

 

<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>whf00aql.in.oracle.com</value>
</property>
<property>
  <name>spark.yarn.shuffle.stopOnFailure</name>
  <value>false</value>
</property>
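For reference, the Spark on YARN documentation pairs the spark_shuffle aux-service name with a matching spark_shuffle.class key, whereas the snippet above uses mapreduce_shuffle.class. A minimal sketch of the documented pairing:

<!-- Sketch of the pairing from the Spark on YARN docs; the class key
     name matches the aux-service name (spark_shuffle). -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>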

 

I also distributed the shuffle jar to the NodeManagers, as expected for the external shuffle service configuration.

 

On restarting YARN, I see the following error:

Role failed to start due to error com.cloudera.cmf.service.config.ConfigGenException:
Conflicting yarn extensions provided by more than one service. Yarn extension key [spark_shuffle],
First value [YarnAuxServiceExtension{className='org.apache.spark.network.yarn.YarnShuffleService', auxServiceId='spark_shuffle', configs={spark.authenticate=true, spark.shuffle.service.port=7337}, service=Spark 2}],
Second value [YarnAuxServiceExtension{className='org.apache.spark.network.yarn.YarnShuffleService', auxServiceId='spark_shuffle', configs={spark.authenticate=false, spark.shuffle.service.port=7337}, service=Spark}].

 

Please help.

1 ACCEPTED SOLUTION


Re: spark_shuffle fails nodemanager

New Contributor
The issue was with the Spark authentication setting between the Spark and Spark 2 services in CDH: the default value of spark.authenticate in Spark 2 did not match the Spark service. Once both services were set to spark.authenticate=true, the conflict was resolved.
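A minimal sketch of the matching setting, assuming it is applied to the configuration of both services (in CDH this is typically set through Cloudera Manager rather than edited by hand):

# spark-defaults.conf -- same value for both the Spark and Spark 2 services
spark.authenticate=true
# external shuffle service settings already visible in the error above
spark.shuffle.service.enabled=true
spark.shuffle.service.port=7337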