
In the HDPCD - Spark exam, do we need to write an entire application or can we use spark-shell?



New Contributor

For every problem given, do I need to write an entire application in Scala/Python, or is it sufficient to write the statements in spark-shell? If a full application is required, how do I build it? If Maven is available, do I need to write the "pom.xml" from scratch, or will it be readily available? If SBT is available, do I need to write the "build.sbt" file from scratch?

1 ACCEPTED SOLUTION


Re: In the HDPCD - Spark exam, do we need to write an entire application or can we use spark-shell?

There is no need to build. The entire exam can be done via Scala/Python shells.
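
As a rough illustration (not taken from the actual exam), a typical task can be completed entirely with interactive statements in spark-shell. The input/output paths and the word-count task below are hypothetical examples, not exam content:

// Started with:  spark-shell   (the shell already provides the SparkContext as sc)

// Hypothetical task: count word frequencies in an HDFS text file and save the result.
val lines = sc.textFile("hdfs:///user/exam/input/data.txt")

val counts = lines
  .flatMap(_.split("\\s+"))   // split each line into words
  .filter(_.nonEmpty)         // drop empty tokens
  .map(word => (word, 1))     // pair each word with a count of 1
  .reduceByKey(_ + _)         // sum the counts per word

counts.saveAsTextFile("hdfs:///user/exam/output/wordcounts")

Because spark-shell creates the SparkContext for you, there is no pom.xml, build.sbt, or spark-submit packaging step involved.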

Thanks
