Member since: 08-11-2014
481 Posts · 92 Kudos Received · 72 Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3043 | 01-26-2018 04:02 AM |
| | 6424 | 12-22-2017 09:18 AM |
| | 3099 | 12-05-2017 06:13 AM |
| | 3351 | 10-16-2017 07:55 AM |
| | 9584 | 10-04-2017 08:08 PM |
03-29-2015
09:12 AM
Yes, that's right.
03-29-2015
02:02 AM
Yes, it becomes a new "row" in Y. The candidate filter is something else. This line of code is like a callback notifying the implementation that a new item exists. New users also cause a new row in X, but there is no equivalent 'candidate filter' for users because the same types of operations (recommend, etc.) are not supported for users.
03-26-2015
07:37 AM
Hm, how do you compile your app? Usually you create a Maven or SBT project to declare its dependencies, which should include a "provided" dependency on the same version of Spark as is on your cluster. How do you submit your application? With spark-submit? You are submitting a JAR to run your app, right?
03-25-2015
03:09 PM
I mean, do you build your app with a dependency on Spark, and if so what version? And have you marked it as 'provided' so that it is not included in the JAR you submit?
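For reference, a 'provided' Spark dependency in a Maven pom.xml looks like the following sketch. The version and artifact shown are illustrative only; match whatever Spark your cluster actually runs.

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.0</version>
  <!-- 'provided' keeps Spark out of your application JAR;
       the cluster supplies it at runtime. -->
  <scope>provided</scope>
</dependency>
```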
03-25-2015
02:55 PM
This generally means you're mixing two versions of Spark somehow. Are you sure your app isn't also trying to bundle Spark? Are you using the CDH Spark, and not your own compiled version?
03-25-2015
09:48 AM
1 Kudo
To add a little color: yes, you can do that, although the CLASSPATH intentionally does not include Hive, since, as I understand it, Spark doesn't work with the later versions of Hive that CDH 5.3 and beyond use. It still may work well enough to do what you need, so have at it. But you may hit some incompatibilities.
03-18-2015
03:51 AM
1 Kudo
Looks good, although I would recommend closing the statement and connection too. Also, you're executing an update for every datum. JDBC has an addBatch / executeBatch interface as well, I think, which might be faster.
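The reply is about JDBC from a JVM app, but the batching-and-cleanup idea is language-neutral. Here is a minimal Python sketch using the stdlib sqlite3 module in place of a JDBC driver; the `events` table and its columns are made up for illustration. `executemany` plays the role of JDBC's `addBatch()`/`executeBatch()`, sending the whole batch in one call instead of one update per datum, and the `finally` block corresponds to closing the statement and connection.

```python
import sqlite3

# Hypothetical schema for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, payload TEXT)")

records = [(1, "a"), (2, "b"), (3, "c")]

try:
    # One batched call instead of one execute per record,
    # analogous to JDBC addBatch()/executeBatch().
    conn.executemany("INSERT INTO events (id, payload) VALUES (?, ?)", records)
    conn.commit()
    n_rows = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
finally:
    # Always release resources, as with JDBC Statement/Connection.
    conn.close()
```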
03-16-2015
04:51 AM
Yes, perfectly possible. It's not specific to Spark Streaming or even Spark; you'd just use foreachPartition to create and execute a SQL statement via JDBC over a batch of records. The code is just normal JDBC code.
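A sketch of the per-partition writer described above. In a real Spark job you would hand this function to `rdd.foreachPartition`; here it is shown standalone and called directly, with Python's stdlib sqlite3 standing in for a JDBC driver. The database path and `events` table are assumptions for the example.

```python
import os
import sqlite3
import tempfile

# A file-backed DB so every "partition" connection sees the same data;
# in a real job this would be your database's JDBC URL.
DB_PATH = os.path.join(tempfile.mkdtemp(), "events.db")

with sqlite3.connect(DB_PATH) as conn:
    conn.execute("CREATE TABLE events (id INTEGER, payload TEXT)")

def write_partition(rows):
    """Called once per partition with an iterator of records."""
    # Open one connection per partition, not per record, to amortize
    # connection-setup cost over the whole batch.
    conn = sqlite3.connect(DB_PATH)
    try:
        conn.executemany("INSERT INTO events (id, payload) VALUES (?, ?)", rows)
        conn.commit()
    finally:
        conn.close()

# In Spark Streaming this would be wired up roughly as:
#   dstream.foreachRDD(lambda rdd: rdd.foreachPartition(write_partition))
write_partition(iter([(1, "a"), (2, "b")]))
```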
03-15-2015
08:43 AM
2 Kudos
Although that thread sounds similar, I don't think it's the same thing. Failing to bind is not a failure to connect to a remote host; it means the local host didn't allow the process to listen on a port. The two most likely explanations are:
- an old process is still listening on that port, or at least some other still-running process is
- you appear to be binding to a non-routable address (192.168.x.x), which might be OK but is worth double-checking
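Both explanations can be checked independently of Spark by trying to bind the same host/port yourself. A small Python sketch (the host and port are placeholders, not values from the thread):

```python
import socket

def can_bind(host, port):
    """Try to bind host:port; return (ok, error_message)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((host, port))
        return True, ""
    except OSError as e:
        # Typical messages: "Address already in use" when another process
        # holds the port; "Cannot assign requested address" when the IP
        # isn't configured on this host.
        return False, e.strerror or str(e)
    finally:
        s.close()

# Port 0 asks the OS for any free port, so this should succeed locally.
ok, err = can_bind("127.0.0.1", 0)
```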
03-15-2015
03:29 AM
As you can see, the problem is that the receiver can't bind to its assigned address. Is there any networking-related restriction in place that would prevent this? Is this the port you intended?
... View more