Support Questions

how to ignore errors while running commands on spark-sql -f


I have multiple queries in a file (say 10, each ending with a semicolon) that I run with spark-sql -f <query-file>. When a query in the middle fails (say query #5), the queries after it do not execute and the spark-sql job finishes. How can I handle errors so that queries 6 through 10 still run even though query 5 fails?
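A minimal workaround sketch (not from the original post): instead of passing the whole file to -f, split it and run each statement in its own spark-sql -e invocation, so a failing statement only affects itself. This assumes the statements live in a file named queries.sql (a hypothetical name), are separated by semicolons, and contain no semicolons inside string literals.

#!/usr/bin/env bash
# Sketch: run each statement separately so one failure does not stop the rest.
# Assumes statements are in queries.sql, separated by ';', with no embedded
# semicolons inside string literals.

INPUT=queries.sql

# Flatten the file to one line, then split it into one statement per line.
mapfile -t statements < <(tr '\n' ' ' < "$INPUT" | tr ';' '\n')

for stmt in "${statements[@]}"; do
  # Skip empty fragments produced by the split.
  [[ -z "${stmt// }" ]] && continue
  if ! spark-sql -e "$stmt"; then
    echo "Statement failed, continuing with the next one: $stmt" >&2
  fi
done

The trade-off is that every statement starts its own Spark application, which adds noticeable startup overhead; if that is too slow, splitting the queries into a few smaller files and running each with -f limits how much a single failure can skip.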