I have also encountered this weird situation where the interpreter died when using Livy PySpark on Spark 3. I spent 2 days debugging these errors and found out that we need to:
1. In the "Spark 3" service on the Cloudera Manager portal, set the PySpark Python and Python driver executables in the section "Spark 3 Client Advanced Configuration Snippet (Safety Valve) for spark3-conf/spark-defaults.conf".
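The safety-valve snippet for step 1 might look like the following. `spark.pyspark.python` and `spark.pyspark.driver.python` are the standard Spark properties for this; the `/usr/bin/python3.7` path is only an assumption here, so point both at wherever a Python 3.7 (or lower) interpreter actually lives on all of your cluster nodes:

```
spark.pyspark.python=/usr/bin/python3.7
spark.pyspark.driver.python=/usr/bin/python3.7
```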
2. Restart the "Livy for Spark 3" service in Cloudera Manager.
3. Restart the Zeppelin Livy interpreter.
=> After the restart, Zeppelin's Livy interpreter on Spark 3 can execute %pyspark paragraphs.
=> Note that the Python version used with Livy Spark 3 on Zeppelin must be 3.7 or lower; otherwise it will raise the error "required field 'type_ignores' missing from Module" when executing a PySpark script in Zeppelin. (Python 3.8 added a required type_ignores field to ast.Module, which the older Spark/Livy code path does not supply.)
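As a quick sanity check, you could run something like the following in a %pyspark paragraph to confirm which Python the interpreter picked up. This is just a hypothetical helper I'm sketching, not part of Livy or Zeppelin:

```python
import sys

def livy_python_ok(version_info=sys.version_info):
    """Return True if this Python can be used with Livy on Spark 3 + Zeppelin.

    Python 3.8 changed ast.Module to require a type_ignores field, which
    surfaces as "required field 'type_ignores' missing from Module" in the
    %pyspark interpreter, so anything newer than 3.7 is rejected here.
    """
    return tuple(version_info[:2]) <= (3, 7)

# Inside a %pyspark paragraph this reports the interpreter's own version:
print(sys.version, "usable:", livy_python_ok())
```

If this prints `usable: False`, the safety-valve properties from step 1 are still pointing at a Python that is too new.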