PySpark exception handling in transformations?
- Labels: Apache Spark
Created ‎07-26-2017 11:26 AM
I am a newbie to PySpark and haven't been able to figure out exception handling inside transformations.
For example, I call a function on each line in a map transformation, and I would like to handle a few exceptions inside that function and log them.
Example from my code: .map(lambda eachone: utility.function1(eachone, someargs)). Here, inside function1, I would like to handle the exceptions.
Please provide an example for better understanding. Thank you.
Created ‎07-28-2017 11:03 AM
I couldn't find any special exception-handling behavior implemented for PySpark.
But you can use a native Python try/except inside utility.function1, return None when an exception occurs, and then filter the resulting RDD on that (or handle it however you want).
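A minimal sketch of that approach is below. The function body, argument names, and input path are just placeholders based on the snippet in the question, not code from this thread; adapt them to your own utility module.

import logging
from pyspark import SparkContext

def function1(eachone, someargs):
    # Wrap the per-record work in try/except so one bad record
    # does not fail the whole task.
    try:
        fields = eachone.split(",")              # placeholder parsing logic
        return (fields[0], int(fields[1]) + someargs)
    except (IndexError, ValueError) as e:
        # Log the failure together with the offending record,
        # then signal failure by returning None.
        logging.error("Skipping bad record %r: %s", eachone, e)
        return None

sc = SparkContext(appName="exception-handling-example")
rdd = sc.textFile("hdfs:///some/input/path")     # placeholder input path
someargs = 10

good = (rdd.map(lambda eachone: function1(eachone, someargs))
           .filter(lambda x: x is not None))     # drop records that raised

Note that because function1 runs on the executors, anything it logs (or prints) ends up in the executor logs rather than on the driver. If you need the failures back on the driver, return an error marker instead of None and collect or count those records separately.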
