
PySpark exception handling in transformations?

Rising Star

I am a newbie to PySpark, and I haven't been able to figure out how to handle exceptions in transformations.

For example, I am calling a function on each record of a map transformation, and I would like to handle a few exceptions inside that function and log them.

Example of my code: .map(lambda eachone: utility.function1(eachone, someargs)). Here, inside function1, I would like to handle exceptions.

Please provide an example for better understanding. Thank you.

1 REPLY

Super Collaborator

@srini

I couldn't find any special exception-handling mechanism implemented for PySpark transformations.

But you can use a native Python try/except inside utility.function1, return None when an exception occurs, and then filter the resulting RDD on that (or handle it however you like).
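
A minimal sketch of that approach; the names function1, safe_function1, and someargs are placeholders standing in for your own utility code:

import logging
from pyspark import SparkContext

# Hypothetical worker function that can fail on bad input.
def function1(line, args):
    return int(line) * args

# Wrapper that catches and logs expected exceptions, signalling
# failure with None instead of letting the whole task die.
def safe_function1(line, args):
    try:
        return function1(line, args)
    except ValueError as e:
        logging.warning("Skipping record %r: %s", line, e)
        return None

sc = SparkContext(appName="exception-handling-demo")
rdd = sc.parallelize(["1", "2", "oops", "4"])

result = (rdd
          .map(lambda eachone: safe_function1(eachone, 10))
          .filter(lambda x: x is not None))

print(result.collect())  # [10, 20, 40]

One caveat: the map runs on the executors, so anything logged inside safe_function1 ends up in the executor logs (e.g. the YARN container logs), not in the driver's console.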