Spark configuration is not correct when installing a custom service that declares a dependency on Spark



I installed a custom service that declares a dependency on the spark_on_yarn service in its service.sdl. The Service Descriptor Language reference (https://github.com/cloudera/cm_ext/wiki/Service-Descriptor-Language-Reference) states that "Declaring a dependency on a service, means that the client configs for all dependencies will also get deployed to the process directory."

However, when I checked the process directory, the deployed spark-defaults.conf does not match /etc/spark/conf/spark-defaults.conf on the other nodes. For example, the process directory copy has spark.eventLog.dir=/user/spark/applicationHistory, while the other nodes have spark.eventLog.dir=hdfs://namenode:8020/user/spark/applicationHistory. Because the HDFS scheme and authority are missing, a Spark job run on this node fails with an error that /user/spark/applicationHistory does not exist. The expected value is spark.eventLog.dir=hdfs://namenode:8020/user/spark/applicationHistory. Is this a Cloudera Manager issue, or is something wrong with my configuration?
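
For context, the dependency is declared in service.sdl roughly like this (a trimmed sketch; the service name and label are illustrative, only the serviceDependencies entry is the relevant part):

    {
      "name" : "MYCUSTOMSERVICE",
      "label" : "My Custom Service",
      "version" : "1.0",
      "serviceDependencies" : [
        { "name" : "SPARK_ON_YARN", "required" : "true" }
      ]
    }

With this in place, Cloudera Manager does deploy a spark-defaults.conf into the role's process directory, but its spark.eventLog.dir value lacks the hdfs://namenode:8020 prefix described above.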