Member since: 11-22-2016
Posts: 14
Kudos Received: 23
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1496 | 03-03-2018 12:26 AM
 | 1392 | 03-02-2018 11:34 PM
03-12-2018
02:25 AM
1 Kudo
@Jane Becker Happy it worked out. Enjoy the rest of the weekend!
03-02-2018
05:39 PM
5 Kudos
@Jane Becker True. The connector is not currently bundled or supported. I installed it manually and my preliminary tests with Spark were successful, but I did not do anything complicated or at scale. I checked recently with Engineering and there is a good chance that it will be supported in the second half of 2018. As this connector gets more attention and importance from the user community, its priority will increase and there will be a better chance of it being supported sooner. As you may know, this connector does not seem to be supported even by Google.
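The post does not name the connector; assuming it is Google's Cloud Storage connector for Hadoop (purely an assumption here), a quick smoke test from Spark might look like the sketch below, with the bucket and file path as placeholders.

```scala
// Hypothetical smoke test: assumes the Google Cloud Storage connector JAR is
// already on the Spark classpath and that it registers the gs:// scheme.
// The bucket and object path are placeholders, not values from the post.
import org.apache.spark.sql.SparkSession

object GcsSmokeTest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("gcs-connector-smoke-test")
      .getOrCreate()

    // Read a CSV file through the gs:// filesystem scheme.
    val df = spark.read
      .option("header", "true")
      .csv("gs://my-bucket/path/to/data.csv")

    df.show(10)
    spark.stop()
  }
}
```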
03-08-2017
05:39 PM
@jwoodward Thank you.
05-16-2017
06:07 AM
Hi @Jane Becker, Apart from the above answer, on the Spark note: I believe you can use JDBC to extract the data into a DataFrame. Spark does support loading and saving data through a JDBC driver, and the documentation can be found here. PS: I have not tested this on MongoDB, but I hope it works, provided the MongoDB JDBC driver conforms to the generic JDBC driver standard.
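For illustration, a minimal sketch of the Spark JDBC read path is below. The JDBC URL, driver class, table name, and credentials are placeholders, and whether this actually works against MongoDB depends on the MongoDB JDBC driver used (untested, as noted above).

```scala
// Minimal sketch of loading a table over JDBC into a Spark DataFrame.
// All connection details below are placeholders.
import org.apache.spark.sql.SparkSession

object JdbcLoadExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("jdbc-load-example")
      .getOrCreate()

    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:mongodb://host:27017/mydb")       // placeholder URL; format depends on the driver
      .option("driver", "com.example.mongodb.jdbc.Driver")   // placeholder driver class
      .option("dbtable", "mycollection")
      .option("user", "username")
      .option("password", "password")
      .load()

    df.printSchema()
    df.show(10)
    spark.stop()
  }
}
```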
12-29-2016
07:07 PM
To summarize, G1GC provides predictable GC pause times, which is critical for real-time applications (like Kafka, Storm, Solr, etc.).
The reasoning is to avoid long stop-the-world garbage collection pauses, which would result in back pressure in heavy-ingest environments.
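As an illustration (these values are common starting points, e.g. for Kafka brokers, not settings from the original post), G1GC is typically enabled with JVM options along these lines:

```
-XX:+UseG1GC
-XX:MaxGCPauseMillis=20
-XX:InitiatingHeapOccupancyPercent=35
```

Lower pause-time targets trade throughput for latency, so the right values depend on heap size and ingest rate.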