Created 07-26-2017 11:26 AM
I am a newbie to PySpark, and I could not figure out how to do exception handling in transformations.
For example, I am calling a function on each line inside a map transformation, and I would like to handle a few exceptions in that function and log them.
Example from my code: .map(lambda eachone: utility.function1(eachone, someargs)). Here, inside function1, I would like to handle exceptions.
Please provide an example for better understanding. Thank you.
Created 07-28-2017 11:03 AM
I couldn't find any special exception-handling behavior implemented for PySpark.
But you can use a native Python try/except inside utility.function1, returning None if there is an exception, and then filter the resulting RDD on that (or handle it however you want).
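Here is a minimal sketch of that pattern. The parsing logic, the delimiter argument, and the toy input are all illustrative stand-ins for the asker's utility.function1 and someargs, not code from the original post:

```python
from pyspark import SparkContext

def function1(line, delimiter):
    """Hypothetical per-line parser standing in for utility.function1.
    A plain Python try/except catches failures for this one record,
    logs them, and returns None as a sentinel value."""
    try:
        fields = line.split(delimiter)
        # Example processing: keep the first field, parse the second as int.
        return (fields[0], int(fields[1]))
    except (IndexError, ValueError) as e:
        # Note: this print goes to the executor's stdout log, not the driver
        # console; for production, use the logging module instead.
        print("Skipping bad record %r: %s" % (line, e))
        return None

sc = SparkContext.getOrCreate()
rdd = sc.parallelize(["a,1", "b,not_a_number", "c"])  # toy input data

parsed = (rdd
          .map(lambda eachone: function1(eachone, ","))
          .filter(lambda rec: rec is not None))  # drop failed records

print(parsed.collect())  # [('a', 1)]
```

If you also need to count or inspect the failures rather than just drop them, one common variant is to return a (success_flag, payload) tuple or to increment a Spark accumulator inside the except branch.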