We're currently on CDH 5.12.1, which ships with Spark 1.6. We have also deployed Spark 2.3 on the cluster; that's the version we actively use, and it's working fine.
However, this does mean we've got Spark 1.6 binaries on our servers. Our security scans have flagged these as a vulnerability, and we'd like to remove them.
Has anyone attempted something like this before? If so, any advice? My plan was simply to look at what Spark 1.6 files exist, then write a script that loops through the cluster and removes them.
If there's a more "official" way of doing this, that would be preferable. I'm well aware that my approach wouldn't exactly be supported.
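For what it's worth, this is the kind of dry-run loop I had in mind. The hosts file name and the Spark 1.6 paths here are my assumptions, not anything official from Cloudera; it only prints the commands it would run, so you can sanity-check the output before swapping the echo for a real ssh/rm.

```shell
#!/usr/bin/env bash
# Dry-run sketch: print the removal commands rather than executing them.
# Everything below is an assumption to verify on your own cluster:
#   - the hosts file has one hostname per line
#   - the Spark 1.6 files live under the paths listed in 'paths'
list_removal_commands() {
  local hosts_file="$1"
  # Candidate Spark 1.6 locations (hypothetical -- check with
  # 'ls /opt/cloudera/parcels/CDH/lib/' on a node first).
  local paths="/opt/cloudera/parcels/CDH/lib/spark /etc/spark"
  local host p
  while IFS= read -r host; do
    [ -n "$host" ] || continue
    for p in $paths; do
      # Once the printed commands look right, replace the echo with:
      #   ssh "$host" "sudo rm -rf $p"
      echo "ssh $host sudo rm -rf $p"
    done
  done < "$hosts_file"
}

# Usage: list_removal_commands cluster_hosts.txt
```

Running it against a hosts file prints one candidate command per host/path pair, which makes it easy to review before doing anything destructive.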
As a follow-up, have the Spark 1.6 binaries been removed from more recent CDH versions?
To be more accurate: technically, CVE-2018-1334 is fixed in CDH 5.14.4.
However, a new issue with a similar privilege-escalation vulnerability has since been found: CVE-2018-11760, which we fixed in CDH 5.15.1. So on CDH 5.15.1 you are not affected by either of these two similar privilege-escalation vulnerabilities.
Hi Sara, I run vulnerability scans, and our scanner is picking up the Spark 1.6 banner from the path below. Regarding CVE-2018-8024, you mentioned that this vulnerability doesn't affect Spark 1.6, but you didn't give detailed reasons. This is where Qualys picks up the banner:
/opt/cloudera/parcels/CDH-5.15.1-1.cdh5.15.1.p0.4/lib/spark/conf/spark-env.sh: line 75: /usr/appl/cloudera/java/jdk1.8.0_162: is a directory
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/
Also, we have two versions of Spark running. Do you really need Spark 1.6 in order to run 2.3.0?
Can you please help and advise?