Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

In the HDPCD - Spark exam, do we need to write an entire application, or can we use spark-shell instead?

New Member

For each problem given, do I need to write an entire application in Scala/Python, or is it sufficient to use spark-shell to write statements? If the answer is yes, how do I build that application? If Maven is available, do I need to write the "pom.xml" from scratch, or will it be readily available? If SBT is available, do I need to write the "build.sbt" file from scratch?

1 ACCEPTED SOLUTION


There is no need to build anything. The entire exam can be completed using the Scala or Python shell.
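For illustration, a minimal sketch of this workflow (assuming Spark's `bin` directory is on the PATH; these are the standard Spark launcher commands, not something specified by the exam itself):

```shell
# Start the interactive Scala shell -- no pom.xml or build.sbt is needed
spark-shell

# Alternatively, start the interactive Python shell
pyspark
```

Inside either shell a SparkContext is already created as `sc`, so statements can be entered and evaluated line by line without any build step.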

Thanks

