Spark configuration is not correct when installing a ...
The documentation states that "Declaring a dependency on a service means that the client configs for all dependencies will also get deployed to the process directory." However, when I checked the process directory, the content of the Spark config file "spark-defaults.conf" does not match the config on the other nodes under /etc/spark/conf/spark-defaults.conf. For example, on this node the value is "spark.eventLog.dir=/user/spark/applicationHistory", while on the other nodes it is "spark.eventLog.dir=hdfs://namenode:8020/user/spark/applicationHistory". As a result, if I run a Spark job on this node, it fails with an error that "/user/spark/applicationHistory" does not exist. The expected value should be "spark.eventLog.dir=hdfs://namenode:8020/user/spark/applicationHistory". So I am not sure whether this is a CM issue or some other configuration issue.
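For reference, this is a sketch of the event log setting I would expect in spark-defaults.conf, matching what the other nodes have (the NameNode host "namenode" and port 8020 are taken from the other nodes' config; substitute your own NameNode URI):

```properties
# spark-defaults.conf
# Value currently deployed on the broken node (resolves to a local path,
# so Spark jobs fail because the directory does not exist locally):
#   spark.eventLog.dir=/user/spark/applicationHistory
# Expected value, with the full HDFS URI as on the other nodes:
spark.eventLog.dir=hdfs://namenode:8020/user/spark/applicationHistory
```

Without the hdfs:// scheme, the path is interpreted against the default filesystem configured on that node, which is presumably why the job cannot find the directory.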