I've successfully set up Spark 3.3.0 on CDH 6.2 (we used YARN). Here are the important steps:
1. Back up the current Spark that comes with the Cloudera parcel (v2.4.0, I think) at /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/spark
2. Download a Spark release from the Spark homepage, e.g. "spark-3.3.0-bin-hadoop3.tgz". Extract it, delete the old spark folder, and put the new one in its place (renamed to "spark") at /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/spark
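Steps 1–3 can be sketched as a shell function. The parcel path and tarball name are the ones from the steps above; the function name swap_spark and the backup folder name are my own choices:

```shell
# Sketch of steps 1-3: back up the bundled Spark 2, then drop in Spark 3.3.0.
swap_spark() {
  local parcel="$1"    # e.g. /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib
  local tarball="$2"   # e.g. /path/to/spark-3.3.0-bin-hadoop3.tgz

  # Step 1: keep a copy of the old Spark before touching anything
  cp -a "$parcel/spark" "$parcel/spark-2.4-backup"

  # Step 2: unpack the new release next to the tarball
  tar -xzf "$tarball" -C "$(dirname "$tarball")"

  # Step 3: replace the old folder and rename the new one to plain "spark"
  rm -rf "$parcel/spark"
  mv "${tarball%.tgz}" "$parcel/spark"
}

# On the cluster node this would be invoked roughly as:
# swap_spark /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib \
#            /root/spark-3.3.0-bin-hadoop3.tgz
```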
3. Copy all the config files from the old spark conf folder to the new spark conf folder

4. Copy the YARN-related config files into the spark conf folder as well

4.1. Copy spark-3.3.0-yarn-shuffle.jar from the spark/yarn folder to the spark/jars folder
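Steps 3–4.1 boil down to a couple of copies. A sketch, assuming the old conf survives via the step-1 backup; copy_configs is a made-up helper name:

```shell
# Sketch of steps 3-4.1: carry the old config over and stage the shuffle jar.
copy_configs() {
  local old="$1"   # backed-up Spark 2 directory
  local new="$2"   # new Spark 3 directory

  # Steps 3-4: copy everything from the old conf folder (spark-defaults.conf,
  # spark-env.sh, the YARN/Hadoop client configs, ...) into the new conf folder
  cp -a "$old/conf/." "$new/conf/"

  # Step 4.1: the YARN shuffle-service jar ships under yarn/ in the Spark
  # binary distribution; put it next to the other jars
  cp "$new"/yarn/spark-*-yarn-shuffle.jar "$new/jars/"
}

# e.g. copy_configs "$parcel/spark-2.4-backup" "$parcel/spark"
```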
5. Make some changes to the spark-defaults.conf file, mainly disabling event logging and pointing to the right jars folder
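For illustration, these are the kinds of entries step 5 means: spark.eventLog.enabled to disable event logging and spark.yarn.jars to point at the new jars folder. The exact values are assumptions for this cluster layout, not my literal file:

```properties
# assumed examples of the "disable log / point at the jars" edits in step 5
spark.eventLog.enabled   false
spark.yarn.jars          local:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/spark/jars/*
```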

6. Modify some YARN settings in yarn-site.xml, like below
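Roughly, the standard Spark-on-YARN shuffle-service settings are what's needed here, matching the jar staged in step 4.1. The values shown are the ones from the Spark docs; keep any aux-services that are already listed, such as mapreduce_shuffle:

```xml
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```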


7. Restart the cluster and run the spark-shell command, then run some queries for testing. You can also edit the yarn-site.xml file in the spark conf folder directly to make sure the right settings are picked up.
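A minimal smoke test for step 7, assuming spark-shell is on the PATH after the restart; smoke_test is just a convenience wrapper of my own:

```shell
# Sketch of step 7: confirm the version, then run a trivial query on YARN.
smoke_test() {
  spark-shell --version   # should report version 3.3.0

  # pipe a one-line query into the shell; it should print a sum of 4950
  echo 'spark.range(100).selectExpr("sum(id)").show()' \
    | spark-shell --master yarn
}

# On a cluster node:
# smoke_test
```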