@Jitendra Yadav, I fixed the jar issue and now it works, but I got a permission error:
org.apache.hadoop.security.AccessControlException: Permission denied: user=A62, access=WRITE, inode="/user/A62/.sparkStaging/application_1464688052729_0003":hdfs:hdfs:drwxr-xr-x
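The error says /user/A62/.sparkStaging is under a home directory owned by hdfs:hdfs with mode drwxr-xr-x, so user A62 has no write access there. A common fix (a sketch, assuming you can sudo as the hdfs superuser on the sandbox) is to hand ownership of the home directory back to the user:

```shell
# The home directory /user/A62 is owned by hdfs:hdfs, so user A62 cannot
# create .sparkStaging inside it. Re-own it as the HDFS superuser:
sudo -u hdfs hdfs dfs -chown -R A62:hdfs /user/A62

# Verify the new owner before retrying the Spark job
sudo -u hdfs hdfs dfs -ls /user
```

After this, rerunning the job should let YARN create the .sparkStaging directory under /user/A62.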
I found it, but it's huge (yarn-yarn-nodemanager-sandbox.hortonworks.com.log.txt.zip). Should I look for something specific in it? For example, I can see this info in it:
STARTUP_MSG: Starting NodeManager
STARTUP_MSG:   host = sandbox.hortonworks.com/10.0.2.15
STARTUP_MSG:   args =
STARTUP_MSG:   version = 188.8.131.52.3.2.0-2950
As you can see, the host IP is 10.0.2.15, which is not correct.
Unable to send metrics to collector by address:http://sandbox.hortonworks.com:6188/ws/v1/timeline/metrics
Hello @Jitendra Yadav,
I have a similar case to Emad's. I followed your recommendations but still have issues with the remote VM.
Could you please share the exact steps to set up Eclipse and the Maven dependencies so a Spark job works from Eclipse?
I would appreciate your help a lot,
Please use Ambari to get this URL -
1. You can log in to the Ambari host at http://127.0.0.1:8080/ [username/password - admin/admin].
Click on the respective service [say, HDFS].
Click on "Quick Links" at the top center of the service page. Please find the attached screenshot for the same -
2. You can also get it from the CLI. Log in to the sandbox and execute the commands below -
# cat /etc/hadoop/conf/core-site.xml | grep 8020
# cat /etc/hadoop/conf/yarn-site.xml | grep 8030
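As an alternative to grepping the XML files, you can ask Hadoop for the resolved values directly (a sketch, assuming the hadoop client is on the sandbox PATH):

```shell
# NameNode address (the fs.defaultFS value, normally the 8020 port)
hdfs getconf -confKey fs.defaultFS

# ResourceManager scheduler address (normally the 8030 port)
hdfs getconf -confKey yarn.resourcemanager.scheduler.address
```

This has the advantage of showing the value Hadoop actually resolves, including any defaults not written in the XML.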
It seems you are running your Spark main class in Eclipse as a plain Java class (similar to 'java -cp'). I suggest you build a fat jar and run the application using spark-submit.
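A minimal sketch of that workflow, assuming your pom.xml has a shade or assembly plugin configured; the class name and jar path below are placeholders for your own application:

```shell
# Build the fat jar with Maven (requires maven-shade-plugin or
# maven-assembly-plugin configured in pom.xml)
mvn clean package

# Submit the jar to YARN from the sandbox; replace the --class value
# and jar path with your own
spark-submit \
  --class com.example.MySparkApp \
  --master yarn \
  --deploy-mode client \
  target/my-spark-app-1.0-jar-with-dependencies.jar
```

Running via spark-submit ensures the Spark and Hadoop client configuration on the cluster is picked up, which the plain 'java -cp' launch from Eclipse does not do.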
You can also follow this link to build a Spark application using Maven and Eclipse.