Created 08-03-2016 06:46 AM
I am writing to learn about some pointers for the HDPCD:Spark exam that aren't covered in the HDPCD FAQs. Below are the queries I need assistance with:
1.) As your website says, Hortonworks does not currently provide a practice test for Spark. Where can I find details on the exam's content, the number of questions asked, and the difficulty level?
2.) My laptop may not have the required resolution. In that case, can I connect it to an external LCD/LED screen?
3.) In case of a power outage or internet connectivity issues, how can I reconnect to the exam?
@allaboutbdata
Created 08-03-2016 01:35 PM
1) The current Spark exam consists of 7 hands-on tasks, and you need to get 4 correct to pass. A task is either right or wrong - there is no partial credit. If you need experience with Spark on HDP, I recommend candidates work through the Spark tutorials on the Hortonworks Sandbox.
2) Any resolution should work fine for the Spark exam. It does not have a specific resolution requirement.
3) If you get disconnected at any time, log back in to examslocal.com and attempt to re-launch the exam. The exam proctor will take over from there and assist in getting you back up and running.
Created 08-03-2016 07:47 PM
Thanks for the response @Rich Raposa
I checked one of your responses here.
You suggested the Hortonworks Spark tutorials. However, when I navigated to the training link, I found that the Spark tutorials are yet to be launched by Hortonworks ("Coming Soon").
Can you please share any other reference link from which I can get an idea of the level and type of questions on the exam?
Created 08-04-2016 05:54 PM
So, a quick question about the HDPCD:Spark exam: out of 7 tasks we need to get 4 right, as opposed to 5 on the HDPCD exam?
Created 08-04-2016 05:56 PM
Well, anything can change at any time, but yes: you currently need to get 4 correct out of 7 to pass the Spark exam.
Created 02-08-2017 10:23 AM
Hi,
Could you describe the environment for the test? Do we have to write Scala programs to be submitted, or do we work in the spark-shell itself? If so, how would we submit the code from the spark-shell?
Created 08-03-2016 11:22 PM
That is not the link to the Sandbox tutorials. Try this one:
Created 02-08-2017 01:48 PM
Hello Abhishek, To answer your question regarding the environment of the test, here are the versions:
You will submit the code in the Spark (Scala) or Python shell. You can either write a script to a file and then execute that file, or type directly into the shell. I hope that this helps.
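As a hedged sketch of the two submission styles described above (the file names `solution.scala` and `solution.py` are hypothetical, and the exact workflow on the exam host may differ), the standard Spark command-line tools support both:

```shell
# Option 1: type your code directly into the interactive shell
spark-shell    # Scala REPL
pyspark        # Python REPL

# Option 2: write the solution to a file first, then execute it
# Scala: load a script into the spark-shell on startup
spark-shell -i solution.scala

# Python: run a saved script non-interactively
spark-submit solution.py
```

Inside a running spark-shell session, the `:load solution.scala` REPL command achieves the same thing as the `-i` flag.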
Created 03-10-2017 12:07 AM
@wgonzalez, since Spark 2.0 has been released, is it possible to take the exam on this version? If not, any idea when the exam will support Spark 2.0?
Created 03-20-2017 09:13 PM
I understand that we can use the Python/Spark shell to write Spark code. But one of the topics in the syllabus says "Run a Spark job on YARN". How is that possible with the shell? I know how to run a jar in local/yarn mode, but this is confusing... any clarification would be appreciated.
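For what it's worth, the interactive shells themselves can run against YARN: passing `--master yarn` makes the shell's driver submit its jobs to the cluster, so no jar is required. A minimal sketch, assuming a node with `HADOOP_CONF_DIR` already configured (the application file `myapp.py` is a hypothetical name):

```shell
# Launch the interactive shells with YARN as the cluster manager;
# every action you run in the REPL then executes as a Spark job on YARN
spark-shell --master yarn
pyspark --master yarn

# A packaged application can likewise be submitted to YARN;
# the shells only support client deploy mode, spark-submit supports both
spark-submit --master yarn --deploy-mode client myapp.py
```

You can confirm the job actually ran on YARN by checking the YARN ResourceManager UI, where the shell session appears as a running application.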