I have registered for CCA 175 and am planning to take the exam in a week. I am a bit confused about the Spark questions. From reading the forum/FAQ, I understand that I will be provided a partially written program in Scala/Python which I need to complete and execute. My main doubt is how I need to execute it: should I just copy-paste it into spark-shell, or do I need to build the project and use sbt (spark-submit) to execute it? Please let me know.
Thanks for the confirmation. I was under the assumption that we would create JAR files and execute them using spark-submit. So won't knowledge of sbt and spark-submit be tested as part of the certification?
SBT and spark-submit will be installed on the cluster. You are free to use them as tools.
In the CCA-175 exam, to speed up development, you should expect most coding questions to come with a template containing some of the code, so that you do not have to write everything from scratch. If Cloudera provides a template, an executable script that can run that template will also be provided.
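For illustration, the two execution approaches discussed above might look like this on the exam cluster. This is a hedged sketch: the file name, project layout, class name, and Scala version in these commands are assumptions for illustration, not details from the exam itself.

```shell
# Option 1: run a Scala script directly in spark-shell, no JAR needed
# (solution.scala is a hypothetical file name)
spark-shell -i solution.scala

# Option 2: package the project with sbt and submit it to the cluster
# (the class name, Scala version, and JAR path below are assumptions)
sbt package
spark-submit \
  --class com.example.Solution \
  --master yarn \
  target/scala-2.11/solution_2.11-1.0.jar
```

Either route is acceptable when both tools are available; copy-pasting into spark-shell avoids the build step, while sbt plus spark-submit matches how a packaged application would normally be run.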
I'm writing to clarify a doubt about the CCA175 exam, which is scheduled for this weekend. I've prepared with Scala as my primary programming language. However, if I'm provided with a Python template as part of the exam, can I execute my commands in spark-shell and produce the output? Or do I need to create a new file with Scala code and execute it as part of the shell script provided?
Any input is greatly appreciated.
The current version of the exam is not using templates. We have removed a question to make the exam shorter, but currently you must write everything from scratch.