Created 09-02-2018 09:33 AM
Hello,
Due to limitations of Spark ML 1.6, I had to upgrade Spark to Spark 2; every configuration is fine.
I have a 4-host cluster. If I launch pyspark from the master, it either gets stuck at launch or shows a warning that it couldn't bind the UI port and is trying port 4041, and so on.
The strange thing is that all of those ports are unoccupied. Can somebody help?
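For reference, this is roughly how the ports can be checked (a sketch, assuming a Linux host with net-tools and lsof installed; the grep pattern is just an example):

netstat -tlnp | grep ':40'    # list listening TCP ports in the 40xx range, with owning process
lsof -i :4040                 # show any process currently holding port 4040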
Created 09-03-2018 03:10 AM
Yes, it may be due to the port. Please try the below:
export SPARK_MAJOR_VERSION=2
pyspark --master yarn --conf spark.ui.port=12888
pyspark --master yarn --conf spark.ui.port=4041
pyspark --master yarn --conf spark.ui.port=4042
etc.
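Alternatively, rather than probing ports one by one, you can let Spark retry more ports automatically, or disable the UI entirely to rule it out (a sketch; spark.port.maxRetries defaults to 16):

export SPARK_MAJOR_VERSION=2
pyspark --master yarn --conf spark.ui.port=12888 --conf spark.port.maxRetries=32   # retry up to 32 consecutive ports
pyspark --master yarn --conf spark.ui.enabled=false                               # skip binding the UI altogether

If it still hangs with spark.ui.enabled=false, the problem is not the UI port.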
Created 09-03-2018 03:31 AM
I had already tried your suggestion, but I did it again and now it gets stuck here. On pressing Ctrl+C it skips to the executor:
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
The strange thing is that it works fine on the other nodes. Should I just use those instead?
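While the shell sits there, a useful check (assuming the shell is submitted in yarn client mode) is whether YARN has even accepted the application:

yarn application -list -appStates ACCEPTED,RUNNING   # the pyspark shell should appear here

If nothing is listed, the shell is stuck before it ever reaches YARN; if an application sits in ACCEPTED, YARN has no resources to start it.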
Created 09-04-2018 01:43 AM
If the command is working on the other nodes, then run the netstat command again on both nodes (for the ports starting at 4040) and compare the output.
It is clear that this is not a Spark issue, since it works from the other nodes, so you have to identify the open/availability status of those ports on the master.
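For example, something like this on the master and on a working node (a sketch, assuming net-tools and iproute2 are installed; ss is the modern replacement if netstat is missing):

netstat -tlnp | grep ':404'   # listening sockets on ports 404x, with owning process
ss -tlnp | grep ':404'        # same information via ss
iptables -L -n | grep 404     # rule out a local firewall rule covering that range

If the master shows a process already bound to 4040-4042, or a firewall rule the other nodes lack, that is your culprit.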