04-15-2018 10:45 PM
Thanks, I really appreciate your response. My advisor actually found out that this works if we use the following command: $ pyspark --master local[i], where i is a number. Using this command, multiple pyspark shells can run concurrently. Why the other solutions did not work, I have no clue!
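For anyone who hits the same problem, here is exactly what we run now (the number inside local[i] is just the core count you want for that shell; 2 below is an arbitrary choice):

# terminal 1 on the master node
$ pyspark --master local[2]

# terminal 2, at the same time
$ pyspark --master local[2]

Both shells start and stay usable concurrently.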
04-09-2018 01:30 PM
It looks like sessions cannot run in parallel; they queue up instead. Maybe I missed or misconfigured something during the installation process.
04-09-2018 12:22 PM
@saranvisa Just tried that. It's not working for different users either.
04-09-2018 11:44 AM
@saranvisa Sorry, I forgot to mention that... yes, I did. The port is open.
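In case it helps anyone else debugging this, here is roughly how I checked the port (assuming lsof and ss are available on the node):

$ sudo lsof -i :4040          # shows whatever process is bound to 4040, if any
$ sudo ss -tlnp | grep 4040   # same check via ss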
04-09-2018 11:27 AM
Thank you for your help. I tried different ports, but it still doesn't work unless I kill the running session and start another one. Could I have had a wrong configuration during the Cloudera installation? Or do changes need to be made in some configuration file or somewhere else?
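For reference, this is how I tried the different ports (4050 is just one example value):

$ pyspark --conf spark.ui.port=4050

spark.ui.port is the setting the SparkUI binds to, so in theory each shell pinned to its own port should not collide.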
04-07-2018 01:43 PM
I am new to Cloudera. I have installed Cloudera Express on a CentOS 7 VM and created a cluster with 4 nodes (another 4 VMs). I ssh to the master node and run:

pyspark

This works, but only for one session. If I open another console and run pyspark, I get the following error:

WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.

And it gets stuck there and does nothing until I close the other session running pyspark! Any idea why this is happening and how I can fix it so multiple sessions/users can run pyspark? Am I missing some configuration somewhere? Thanks in advance for your help.
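To make the reproduction concrete (master-node is a placeholder for my master node's hostname):

# terminal 1
$ ssh user@master-node
$ pyspark        # starts fine

# terminal 2, while the first shell is still open
$ ssh user@master-node
$ pyspark        # prints the SparkUI warning and hangs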
Labels:
- Apache Spark