Support Questions

Find answers, ask questions, and share your expertise

Is Spark 2 affected by CVE-2022-33891?

New Contributor

I'm checking whether Spark 2 is affected by CVE-2022-33891 or not.

Is there anyone who can explain it to me?

1 ACCEPTED SOLUTION

Expert Contributor

Hi @JiHoone 

 

Spark security vulnerability CVE-2022-33891 affects Spark 2 and Spark 3 releases, but not versions 3.1.3, 3.0.4, 3.3.0, and 3.2.2. The CVE applies only if you have enabled ACLs on the Spark History Server (SHS) UI. By default, ACLs are disabled. If ACLs are enabled, the specified users and groups are granted access, and group membership is checked using ShellBasedGroupsMappingProvider (the class containing the vulnerability).
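To make the risk concrete, here is a rough sketch (illustration only, not Spark's actual source) of what a shell-based group lookup does: the username is interpolated into a command line before execution, which is what makes unsanitized input dangerous.

```shell
# Sketch of a shell-based group lookup (illustration only, not Spark's code):
# the username is interpolated into a command string and run in a new shell.
lookup_groups() {
    user="$1"
    # Vulnerable pattern: "$user" is expanded inside the command string, so a
    # crafted "username" containing shell metacharacters could run arbitrary
    # commands with the privileges of the History Server process.
    bash -c "id -Gn $user"
}

lookup_groups "$(whoami)"   # prints the current user's group names
```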

 

The cluster is affected by the CVE only when the GroupMappingServiceProvider is invoked - that is, when spark.history.ui.acls.enable or spark.acls.enable is set to true.
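As a quick check, you can look for these two properties in spark-defaults.conf (the path below is an assumption - adjust it for your distribution). If neither property is present, the defaults (false) apply and the vulnerable code path is not reachable:

```shell
# Hypothetical conf path - adjust for your distribution
# (e.g. /etc/spark2/conf on HDP, /etc/spark/conf on CDH).
CONF=/etc/spark/conf/spark-defaults.conf

# Print any ACL-related settings; if nothing matches, the defaults (false) apply.
grep -E '^spark\.(history\.ui\.)?acls\.enable' "$CONF" 2>/dev/null \
  || echo "ACL properties not set: defaults (false) apply"
```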


7 REPLIES


Hello @rki_, how can we check or configure the cluster to make sure ACLs are disabled?
Thanks for your answer.

Expert Contributor

Hi, inside Spark you can check spark.history.ui.acls.enable and spark.acls.enable. Both should be false by default.

 

https://spark.apache.org/docs/2.4.3/security.html#authentication-and-authorization

Hi @rki_, unfortunately, on my Kerberized cluster (HDP 2.6.5), I can't find these properties in the Spark configs in Ambari.
Do I need to set them explicitly in the custom Spark configs even though they are disabled (false) by default?

Expert Contributor

Hi, those parameters won't be exposed by Ambari and are false by default. They would go into Custom spark-defaults. As they are disabled by default, I would suggest not enabling them.
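If you nevertheless want the setting to be explicit and auditable, a minimal Custom spark-defaults fragment would look like this (these values simply restate the defaults):

```
# Custom spark-defaults - restating the built-in defaults explicitly
spark.acls.enable             false
spark.history.ui.acls.enable  false
```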

Hello @rki_, as I can't find those parameters in Ambari, is it possible to enforce disabling them by setting spark.acls.enable = false in Ambari (Custom spark-defaults)?

Or maybe it's not possible to expose them in Ambari at all!

Thanks in advance.

New Contributor

Hello @rki_, we are using CDH 6.3.4 and we have Spark on YARN.
Below are the version details:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0-cdh6.3.4
      /_/

Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_342
Branch HEAD
Compiled by user jenkins on 2022-01-10T17:29:31Z
Revision HEAD
Url
Type --help for more information.

 

Are we affected? If yes, could you please tell us how to remediate it?

 

Thanks in Advance,

Sagar