Member since: 05-05-2016 · Posts: 18 · Kudos Received: 16 · Solutions: 2

My Accepted Solutions

Title | Views | Posted
---|---|---
 | 5147 | 07-02-2016 03:31 PM
 | 25264 | 07-02-2016 01:07 PM
04-17-2020 09:49 PM
Have you been able to solve this issue? I'm getting the exact same issue and wanted to know how you resolved it. Thanks.
12-14-2016 02:48 PM
1 Kudo
The Spark logging code is Spark's Logging trait, which does lazy evaluation of expressions like logInfo(s"status $value"). Sadly, that's private to the Spark code, so you can't use it outside Spark. See [SPARK-13928](https://issues.apache.org/jira/browse/SPARK-13928) for the discussion, and know that I don't really agree with the decision. When I was moving some code from org.apache.spark to a different package, I ended up having to copy and paste the entire Spark logging class into my own code. Not ideal, but it works: CloudLogging.scala

Bear in mind that underneath, Spark uses SLF4J and whatever backs it, such as Log4j; you can use SLF4J directly for its lazy evaluation of log.info("status {}", value). However, Spark's lazy string evaluation is easier to use, and I believe it is even lazy about evaluating functions inside the strings (e.g. s"users = ${users.count()}"), so it can be more efficient.

The CloudLogging class I've linked to shows how Spark binds to SLF4J; feel free to grab and use it.
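To make the lazy-evaluation point concrete: Spark's trait uses Scala by-name parameters so the message string is never built when the log level is disabled. Here's a minimal sketch of the same idea in plain Java using a `Supplier` (a hypothetical `LazyLogger` class for illustration, not Spark's or SLF4J's actual API):

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

// Sketch of lazy log-message evaluation: the Supplier is only invoked
// when the level is enabled, so an expensive expression (think
// users.count() in Spark) is skipped entirely at a quieter level.
public class LazyLogger {
    private final boolean infoEnabled;

    public LazyLogger(boolean infoEnabled) {
        this.infoEnabled = infoEnabled;
    }

    public void logInfo(Supplier<String> msg) {
        if (infoEnabled) {
            // msg.get() is the only place the message is actually built
            System.out.println("INFO " + msg.get());
        }
    }

    public static void main(String[] args) {
        // Counter stands in for an expensive computation
        AtomicInteger evaluations = new AtomicInteger();

        LazyLogger quiet = new LazyLogger(false);
        quiet.logInfo(() -> "users = " + evaluations.incrementAndGet());
        // Supplier never invoked, so the counter is untouched
        System.out.println("evaluations after disabled log: " + evaluations.get());

        LazyLogger verbose = new LazyLogger(true);
        verbose.logInfo(() -> "users = " + evaluations.incrementAndGet());
        System.out.println("evaluations after enabled log: " + evaluations.get());
    }
}
```

This is the same trade-off the post describes: SLF4J's `log.info("status {}", value)` defers string formatting but still evaluates `value` eagerly, whereas a by-name/supplier style defers the whole expression.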