Hi,
I have an HDP 3.0 sandbox with Scala v2.11 and Spark v2.3.1, and a MongoDB server running version 4.2.1.
I'm trying to set up a simple connection between Spark and a MongoDB collection by doing a basic read from that collection (testMongo.py):
from pyspark.sql import SparkSession
from pyspark.sql import Row
from pyspark.sql import functions

if __name__ == "__main__":
    # Create a SparkSession
    spark = SparkSession.builder.appName("MongoDBIntegration").getOrCreate()

    # Read the collection from MongoDB into a DataFrame
    readUsers = spark.read.format("com.mongodb.spark.sql.DefaultSource").option("uri", "mongodb://sandbox-mongo.infodetics.com:27017/DBTest.col1").load()
    readUsers.createOrReplaceTempView("col1")
    sqlDF = spark.sql("SELECT * FROM col1")
    sqlDF.show()

    # Stop the session
    spark.stop()
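For reference, my understanding is that the uri option above encodes both the database and the collection in its path (DBTest.col1). A quick stdlib-only check of how that path splits (a hypothetical helper, not part of the connector):

```python
from urllib.parse import urlparse

# Split the connector URI path into database and collection names.
uri = "mongodb://sandbox-mongo.infodetics.com:27017/DBTest.col1"
path = urlparse(uri).path.lstrip("/")        # "DBTest.col1"
database, collection = path.split(".", 1)
print(database, collection)                  # DBTest col1
```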
Here is the spark-submit command I used:
spark-submit --packages org.mongodb.spark:mongo-spark-connector_2.11:2.3.2,org.mongodb:mongo-java-driver:3.11.1,org.mongodb:mongodb-driver-core:3.11.1,org.mongodb:bson:3.11.1 /hdp/spark/scripts/PySpark/testMongo.py
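I also considered a variant of this command (an assumption on my side, not something from the docs for my exact setup): since --packages resolves the connector's transitive dependencies, the driver jars might not need to be listed explicitly, and the URI can be passed as a Spark conf instead of a read option:

```shell
# Variant: let --packages pull the connector's own driver dependencies,
# and supply the MongoDB URI via Spark configuration.
spark-submit \
  --packages org.mongodb.spark:mongo-spark-connector_2.11:2.3.2 \
  --conf "spark.mongodb.input.uri=mongodb://sandbox-mongo.infodetics.com:27017/DBTest.col1" \
  /hdp/spark/scripts/PySpark/testMongo.py
```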
After a few seconds the job fails and shows me this error:

Any idea?