
Spark executor default SSL truststore

Expert Contributor

Hi,

I'm trying to run a Spark job in which all executors have to call a secured (HTTPS) web service on a dedicated server. During the SSL handshake, this server returns a certificate that has been signed by a private (company-specific) CA.

The CA certificate has been added to a custom truststore (cacert) that I would like to reference in the Spark configuration, so that the executors can validate the server's certificate without any extra configuration.
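For reference, a truststore like this is typically built with keytool; the alias, file names, and path below are only illustrative:

keytool -importcert -alias company-ca -file company_ca.pem -keystore /path/to/custom-truststore.jks -storepass <MyPassword> -noprompt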

I know that I can pass the following option to my spark-submit command line:

"--conf "spark.executor.extraJavaOptions=-Djavax.net.ssl.trustStore=<MyCaCert> -Djavax.net.ssl.trustStorePassword=<MyPassword>"

...but I would like to avoid asking all our users to do this (they are not supposed to know where this truststore is located or what its password is).
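For reference, the same options can also be set once in spark-defaults.conf instead of on every command line (just a sketch, assuming an administrator manages the cluster-wide Spark client configuration; the path below is the usual HDP location but may differ):

# /etc/spark/conf/spark-defaults.conf (illustrative path)
spark.executor.extraJavaOptions  -Djavax.net.ssl.trustStore=<MyCaCert> -Djavax.net.ssl.trustStorePassword=<MyPassword>
spark.driver.extraJavaOptions    -Djavax.net.ssl.trustStore=<MyCaCert> -Djavax.net.ssl.trustStorePassword=<MyPassword>

This assumes the truststore file is present at the same path on every node (or is shipped to the executors with --files).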

I tried to use the "ssl.client.truststore.location" property as described in https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_security/content/ch_wire-webhdfs-mr-yarn..., but it didn't change anything.
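For reference, that property is set in Hadoop's ssl-client.xml and looks roughly like this (values are illustrative):

<property>
  <name>ssl.client.truststore.location</name>
  <value>/etc/security/clientKeys/all.jks</value>
</property>
<property>
  <name>ssl.client.truststore.password</name>
  <value><MyPassword></value>
</property>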

Apparently Spark does not use this configuration?

Does anyone know how the default truststore used by Spark executors is configured?

Any help will be highly appreciated 🙂

Thanks

3 REPLIES

Guru

Expert Contributor

If I understand properly, this configuration is used by Spark to secure data exchanges between the nodes, but my use case is slightly different: my executor runs custom Java code that performs a call to an HTTPS server, and in that context the SSL handshake relies on the default truststore of the JVM instead of the one I configured with my own CA certificate... Maybe that's not possible, and the only way to achieve this is to use the properties I mentioned previously...
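For reference, one possible workaround (just a sketch, not something confirmed in this thread; the truststore path and password are placeholders) is to have the custom code load the truststore explicitly instead of relying on the JVM default:

import java.io.FileInputStream;
import java.security.KeyStore;
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;

public class CustomTrustStoreSetup {
    public static void configure() throws Exception {
        // Load the custom truststore that contains the company CA certificate
        KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
        try (FileInputStream in = new FileInputStream("/path/to/custom-truststore.jks")) {
            trustStore.load(in, "<MyPassword>".toCharArray());
        }

        // Build trust managers backed by that truststore
        TrustManagerFactory tmf =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(trustStore);

        // Create an SSLContext and make it the default for HttpsURLConnection
        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(null, tmf.getTrustManagers(), null);
        HttpsURLConnection.setDefaultSSLSocketFactory(sslContext.getSocketFactory());
    }
}

Calling something like CustomTrustStoreSetup.configure() once in the executor code before the HTTPS call would then make HttpsURLConnection validate the server certificate against the company CA, without touching the JVM-wide javax.net.ssl options.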

Thanks for your help

New Contributor

Hello, I have the same problem. Any updates?

Thanks!