
Spark configuration is not correct when installing a custom service that declares a dependency on Spark

I installed a custom service that declares a dependency on the spark_on_yarn service in its service.sdl (see the snippet below). The doc at https://github.com/cloudera/cm_ext/wiki/Service-Descriptor-Language-Reference states that "Declaring a dependency on a service, means that the client configs for all dependencies will also get deployed to the process directory."
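For reference, the dependency declaration in my service.sdl looks roughly like this (a minimal sketch based on the SDL reference linked above; MYSERVICE is a placeholder name and the real descriptor has more fields):

{
  "name" : "MYSERVICE",
  "serviceDependencies" : [
    { "name" : "SPARK_ON_YARN", "required" : "true" }
  ]
}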
But when I checked the process directory, the content of the deployed spark-defaults.conf was not the same as the config on the other nodes under /etc/spark/conf/spark-defaults.conf. For example, the process directory has spark.eventLog.dir=/user/spark/applicationHistory, while the other nodes have spark.eventLog.dir=hdfs://namenode:8020/user/spark/applicationHistory. Because of this, running a Spark job on this node throws an error that "/user/spark/applicationHistory" does not exist; the expected value is the fully qualified hdfs:// URI. So I am not sure whether this is a CM issue or some other configuration issue.
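Concretely, the two files differ like this:

# spark-defaults.conf in the custom service's process directory (wrong):
spark.eventLog.dir=/user/spark/applicationHistory

# /etc/spark/conf/spark-defaults.conf on the other nodes (expected):
spark.eventLog.dir=hdfs://namenode:8020/user/spark/applicationHistory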
