Created 08-07-2022 11:10 PM
I'm checking whether Spark 2 is affected by CVE-2022-33891 or not.
Is there anyone who can explain it to me?
Created 08-07-2022 11:38 PM
Hi @JiHoone
Spark security vulnerability CVE-2022-33891 affects Spark 2 and Spark 3 versions, but not versions 3.1.3, 3.0.4, 3.3.0, and 3.2.2. The CVE only applies if you have enabled ACLs on the Spark History Server (SHS) UI. By default, ACLs are disabled. If ACLs are enabled, then only the specified users and groups have access, and group membership is checked using ShellBasedGroupsMappingProvider (which is the class with the vulnerability).
A cluster is affected by the CVE only when the GroupMappingServiceProvider is called, which happens when spark.history.ui.acls.enable or spark.acls.enable is enabled.
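For reference, this is a minimal sketch of what the vulnerable configuration would look like in spark-defaults.conf; the values shown are purely illustrative, not taken from any particular cluster:

    # Enabling either of these makes Spark call the group mapping provider,
    # which is the code path affected by CVE-2022-33891.
    spark.acls.enable               true
    spark.history.ui.acls.enable    true
    # Stock Spark defaults both to false, so an untouched configuration is not affected.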
Created 09-27-2022 07:52 AM
Hello @rki_, how can we check or configure these settings to make sure ACLs are disabled?
Thanks for your answer.
Created 09-28-2022 01:57 AM
Hi, Inside Spark, you can check for spark.history.ui.acls.enable and spark.acls.enable. These should be false by default.
https://spark.apache.org/docs/2.4.3/security.html#authentication-and-authorization
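If you want to verify this on a running cluster rather than relying on the defaults, a minimal sketch of a check from spark-shell is below (note that spark.history.ui.acls.enable is read by the History Server process, so it is best also confirmed in the History Server's own spark-defaults.conf):

    // Read the ACL flags from the current Spark configuration.
    // If a key is not set at all, getBoolean falls back to the default given here (false).
    val aclsEnabled  = sc.getConf.getBoolean("spark.acls.enable", defaultValue = false)
    val historyAcls  = sc.getConf.getBoolean("spark.history.ui.acls.enable", defaultValue = false)
    println(s"spark.acls.enable=$aclsEnabled, spark.history.ui.acls.enable=$historyAcls")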
Created 09-28-2022 07:03 AM
Hi @rki_, unfortunately, on my Kerberized cluster (HDP 2.6.5), I can't find these properties in the Spark configs in Ambari.
Do I need to add them explicitly to the custom Spark configs even though they are disabled (false) by default?
Created 10-05-2022 05:04 AM
Hi, those parameters won't be exposed by Ambari and are false by default. If you set them, they would go into Custom spark-defaults. As they are disabled by default, I would suggest not enabling them.
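If you still want the values pinned explicitly rather than relying on the defaults, a sketch of the key/value pairs you would add under Custom spark-defaults in Ambari (assuming the standard property names) is:

    spark.acls.enable               false
    spark.history.ui.acls.enable    false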
Created 10-11-2022 07:45 AM
Hello @rki_, as I can't find those parameters in Ambari, is it possible to enforce disabling them (spark.acls.enable = false) via Ambari (Custom spark-defaults)?
Or maybe it's not possible to expose them through Ambari at all!
Thanks in advance.
Created 03-29-2023 03:18 AM
Hello Jero, can you please let me know where you found the parameters and how you handled it?
Created 11-30-2022 06:18 AM
Hello @rki_, we are using CDH 6.3.4 and we have Spark on YARN.
Below are the version details:
    Welcome to Spark version 2.4.0-cdh6.3.4
    Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_342
    Compiled by user jenkins on 2022-01-10T17:29:31Z (branch HEAD, revision HEAD)
Are we affected? If yes, can you please tell us how to remediate it?
Thanks in Advance,
Sagar