Member since: 01-01-2016
Posts: 19
Kudos Received: 3
Solutions: 0
04-02-2019
04:32 PM
@Geoffery Shelton Okot, apologies for the late reply. We are new to Spark and it took us some time to run a few test cases before posting any comment in this forum. We have tried copying the wallet file to the HDFS path and it did not work. Below is the exception received:

Caused by: java.io.FileNotFoundException: hdfs:/user/example/.sparkStaging/application_1553474902547_6762/cwallet.sso (No such file or directory)
at java.io.FileInputStream.open0(Native Method)

And here is the code sample we are using to create the DB connection using JDBC:

spark.read.format("jdbc").option("url", JDBCURL).option("user", "DB_USER").option("oracle.net.wallet_location", "(SOURCE=(METHOD=file)(METHOD_DATA=(DIRECTORY=hdfs://user/example/.sparkStaging/application_1553474902547_6762/)))")
...

The value inside the "DIRECTORY=hdfs://user/example/.sparkStaging/application_1553474902547_6762/" block is expected to be a local path; it cannot recognize the "hdfs://" protocol and throws the error even though the file is there.

Alternative approaches: as alternatives we did the following.

1) Run Spark in local mode: for this we set --master local[*], and below is how we specified the wallet directory location:

option("oracle.net.wallet_location", "(SOURCE=(METHOD=file)(METHOD_DATA=(DIRECTORY=/local/path/to/wallet_dir/)))")

Here "/local/path/to/wallet_dir/" is the directory containing the wallet file, and everything works fine.

2) Run Spark in yarn mode: this time we set --master yarn and use the same wallet directory path as above, but we got the following exception:

Caused by: oracle.net.ns.NetException: Unable to initialize ssl context.
at oracle.net.nt.CustomSSLSocketFactory.createSSLContext(CustomSSLSocketFactory.java:344)
at oracle.net.nt.CustomSSLSocketFactory.getSSLContext(CustomSSLSocketFactory.java:305)
... 30 more
Caused by: oracle.net.ns.NetException: Unable to initialize the key store.
at oracle.net.nt.CustomSSLSocketFactory.getKeyManagerArray(CustomSSLSocketFactory.java:617)
at oracle.net.nt.CustomSSLSocketFactory.createSSLContext(CustomSSLSocketFactory.java:322)
... 35 more
Caused by: java.io.FileNotFoundException: /local/path/to/cwallet.sso (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)

It looks like in yarn mode, whenever Spark tries to establish the connection from an executor node, it fails because the wallet directory is not available on those nodes. We tried copying the wallet directory to all the worker nodes and it works fine, but due to official/corporate policy we were told to find a different solution that does not copy the file to all nodes.

In order to figure out a solution without copying the wallet file, we did the following. In local mode, if we specify the wallet file under the --files parameter and use the path returned by the following command:

scala> SparkFiles.getRootDirectory()
res0: String = /tmp/spark-a396c1f4-ddad-4da7-a2f4-6f8c279b3a7b/userFiles-744cd2cb-23d4-410c-8085-dee3207749ce

then the file is available under that /tmp path and Spark is able to create the connection. But in yarn mode the same is not true, and it shows no files under that path.

So, is there anything we are missing here? Is it at all possible to get the files onto all worker nodes without copying them? Why can't we see the files under the "SparkFiles.getRootDirectory()" path in yarn mode? Does it only reflect the driver's location? Kindly advise.
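For reference, here is a minimal sketch of the pattern we are experimenting with, assuming the wallet is shipped via --files and resolved through SparkFiles. The spark-submit line, SOME_TABLE, and the local wallet path are placeholders; JDBCURL and DB_USER are the same placeholders as in the snippets above, so please treat this as a sketch of the approach rather than a working configuration:

// Submitted with something like (placeholder paths):
//   spark-submit --master yarn --files /local/path/to/wallet_dir/cwallet.sso ...
import org.apache.spark.SparkFiles

// Local directory on this JVM where files shipped via --files are placed.
val walletDir = SparkFiles.getRootDirectory()

val df = spark.read.format("jdbc")
  .option("url", JDBCURL)            // placeholder JDBC URL, as above
  .option("user", "DB_USER")         // placeholder user, as above
  .option("dbtable", "SOME_TABLE")   // placeholder table
  .option("oracle.net.wallet_location",
    s"(SOURCE=(METHOD=file)(METHOD_DATA=(DIRECTORY=$walletDir)))")
  .load()

In local mode this resolves and connects; in yarn mode the executors apparently do not see the same directory, which is exactly what we are asking about.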
12-21-2017
01:13 AM
But if I haven't completed the installation, how will it appear when I click the Cloudera logo?
01-13-2016
04:25 AM
2 Kudos
For me it was solved after logging in as the hdfs user. You can typically then run the spark-shell command without errors.
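A quick sketch of what I mean, assuming a typical cluster where the hdfs service user exists and you have sudo rights; the small job afterwards is just an illustrative sanity check:

// Start the shell as the hdfs user, e.g.:
//   sudo -u hdfs spark-shell
// Then run a small job to confirm the SparkContext is healthy:
val total = sc.parallelize(1 to 100).sum()
println(s"sum = $total")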