Member since: 02-01-2019
Posts: 650
Kudos Received: 143
Solutions: 117
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2898 | 04-01-2019 09:53 AM
 | 1514 | 04-01-2019 09:34 AM
 | 7179 | 01-28-2019 03:50 PM
 | 1615 | 11-08-2018 09:26 AM
 | 3913 | 11-08-2018 08:55 AM
06-09-2018
06:31 PM
@Robert Cornell, you'd need to shade the protobuf dependency used in your Spark application to avoid conflicts between the application's dependencies and Spark's own. As a side note, you can check whether setting the properties below helps in the current state:
spark.driver.userClassPathFirst true
spark.executor.userClassPathFirst true
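For example, a minimal sketch of passing these options at submit time (the application name is a placeholder, and this assumes spark-submit is on the PATH):

```python
import subprocess

# Sketch only: pass the userClassPathFirst options at submit time so the
# application's (shaded) classes are preferred over Spark's bundled ones.
# "my_app.py" is a placeholder for the actual application.
subprocess.run([
    "spark-submit",
    "--conf", "spark.driver.userClassPathFirst=true",
    "--conf", "spark.executor.userClassPathFirst=true",
    "my_app.py",
], check=True)
```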
06-04-2018
08:51 AM
@irfan aziz Glad that it helped! Please click on "Accept" to mark this thread as closed.
06-04-2018
08:31 AM
1 Kudo
@irfan aziz ImportTsv is not an HBase shell command. Please exit the HBase shell and run the same command from the OS shell.
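For illustration, a minimal sketch of launching it through the `hbase` binary from the OS shell (here via Python's subprocess; the column mapping, table name, and HDFS path are placeholders):

```python
import subprocess

# Sketch only: ImportTsv runs through the `hbase` launcher from the OS shell,
# not from inside the hbase shell. Column mapping, table name, and input path
# below are placeholders.
subprocess.run([
    "hbase", "org.apache.hadoop.hbase.mapreduce.ImportTsv",
    "-Dimporttsv.columns=HBASE_ROW_KEY,cf:col1",
    "my_table",
    "/user/irfan/input.tsv",
], check=True)
```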
05-30-2018
06:22 PM
@Developer Developer As @Felix Albani suggested above, I'd go with spawning multiple threads to process the DataFrames in parallel. This article has a good example: https://hadoopist.wordpress.com/2017/02/03/how-to-use-threads-in-spark-job-to-achieve-parallel-read-and-writes/
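A minimal PySpark sketch of the same idea (the linked article uses Scala; the table names and output path here are placeholders):

```python
from concurrent.futures import ThreadPoolExecutor
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("parallel-writes").getOrCreate()

def write_table(table_name):
    # Each thread submits its own Spark job; the scheduler runs them concurrently.
    spark.table(table_name).write.mode("overwrite").parquet("/tmp/out/" + table_name)

tables = ["table_a", "table_b", "table_c"]  # placeholders
with ThreadPoolExecutor(max_workers=len(tables)) as pool:
    list(pool.map(write_table, tables))
```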
05-30-2018
11:44 AM
@Victor If this helped, please consider "Accepting" the answer to close this thread.
05-30-2018
10:52 AM
@Victor This article should help you: https://community.hortonworks.com/content/supportkb/146508/how-to-use-alternate-python-version-for-spark-in-z.html
05-30-2018
10:33 AM
1 Kudo
@Victor Spark 2.2.0 supports Python 2.7+ and 3.4+, but you'd need to install these Python versions separately and point Spark at them.
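For example, a sketch of pointing PySpark at a separately installed interpreter via the standard PYSPARK_PYTHON variables before the session starts (the interpreter path is a placeholder for wherever your install lives):

```python
import os

# Sketch only: the interpreter path is a placeholder for your separate install.
os.environ["PYSPARK_PYTHON"] = "/usr/local/bin/python3.4"
os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/local/bin/python3.4"

from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("custom-python").getOrCreate()
```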
05-29-2018
02:50 PM
@Rajesh Reddy If the above answer helped you, please consider clicking the "Accept" button to close this thread.
05-29-2018
02:43 PM
@RAUI If the above answer helped you, please consider clicking the "Accept" button to close this thread.
05-28-2018
09:04 AM
@Michael Bronson, it is highly unlikely that you will get the same version number when using an epoch timestamp; the value changes every microsecond. So each time you execute "version"+str(int(time.time() * 1000000)) you will get a new, unique number.
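For example, a minimal sketch:

```python
import time

def unique_version():
    # Microsecond-resolution epoch timestamp: a new value on every call.
    return "version" + str(int(time.time() * 1000000))

print(unique_version())
print(unique_version())  # differs from the previous call
```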