I have an HDP cluster of 5 nodes set up. How can I access PySpark from the client server?
Make sure you have the Spark client installed on that machine. Then just type "pyspark" and hit enter; you should be able to use PySpark in local mode. If you want to run it in YARN mode, use "pyspark --master yarn-client".
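For reference, a typical session from the client server might look like the sketch below. The "yarn-client" master string matches the Spark 1.x syntax shipped with older HDP releases; on Spark 2.x the equivalent flags are an assumption you should verify against your Spark version.

```shell
# Local mode: driver and executors run only on this machine
pyspark

# YARN client mode (Spark 1.x syntax, as used on older HDP releases)
pyspark --master yarn-client

# Spark 2.x equivalent (assumed; check your installed Spark version first)
pyspark --master yarn --deploy-mode client
```

If `pyspark` is not on your PATH, it usually lives under the Spark install directory (for example /usr/hdp/current/spark-client/bin on HDP, though the exact path depends on your release).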
Hope this helps.
Just to add: make sure all the configuration files for Spark, HDFS, YARN, and MapReduce are also in sync. This is needed if the client server is not managed by Ambari.
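To illustrate, one simple way to sync those configs is to copy them from an Ambari-managed node. The paths below follow the usual HDP layout (/etc/hadoop/conf and /etc/spark/conf) and the hostname managed-node is a placeholder; both are assumptions to adjust for your cluster.

```shell
# Copy Hadoop client configs (core, HDFS, YARN, MapReduce) from a managed node.
# "managed-node" is a placeholder hostname; paths assume the standard HDP layout.
scp managed-node:/etc/hadoop/conf/core-site.xml \
    managed-node:/etc/hadoop/conf/hdfs-site.xml \
    managed-node:/etc/hadoop/conf/yarn-site.xml \
    managed-node:/etc/hadoop/conf/mapred-site.xml \
    /etc/hadoop/conf/

# Copy the Spark client config as well
scp managed-node:/etc/spark/conf/spark-defaults.conf /etc/spark/conf/
```

After copying, re-run `pyspark --master yarn-client` to confirm the client can reach the ResourceManager and NameNode.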