Support Questions

In the HDPCD Spark exam, do we need to write an entire application, or can we use spark-shell instead?


For every problem given, do I need to write an entire application in Scala/Python, or is it sufficient to write statements in spark-shell? If a full application is required, how do I build it? If Maven is available, do I need to write the "pom.xml" from scratch, or will it be readily available? Likewise, if SBT is available, do I need to write the "build.sbt" file from scratch?
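For context, if you did choose to build a standalone Spark application with SBT, a minimal `build.sbt` typically looks like the sketch below. The project name and version numbers here are assumptions, not exam requirements; `spark-core` is marked `"provided"` because the cluster supplies Spark at runtime.

```scala
// Hypothetical minimal build.sbt for a standalone Spark application.
// Name and versions are illustrative; match them to the cluster you target.
name := "exam-app"
version := "0.1"
scalaVersion := "2.11.12"

// "provided" keeps Spark out of the assembled jar, since spark-submit supplies it.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
```

You would then run `sbt package` and submit the resulting jar with `spark-submit`. As the accepted answer notes, however, none of this is needed for the exam.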

1 ACCEPTED SOLUTION

There is no need to build anything. The entire exam can be done via the Scala/Python shells (spark-shell / pyspark).
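To illustrate, a typical exam-style task (e.g. a word count) can be solved interactively in spark-shell with a few statements; no build file or jar is involved. This is a sketch with hypothetical input/output paths, and it relies on the `sc` SparkContext that spark-shell creates for you:

```scala
// Inside spark-shell, sc is already available -- no setup or build step needed.
// Paths below are hypothetical placeholders for whatever the exam task specifies.
val lines  = sc.textFile("/user/exam/input.txt")
val counts = lines.flatMap(_.split("\\s+")) // split each line into words
                  .map(word => (word, 1))   // pair each word with a count of 1
                  .reduceByKey(_ + _)       // sum counts per word
counts.saveAsTextFile("/user/exam/output")  // write results back to HDFS
```

The same workflow applies in the Python shell (`pyspark`) with the equivalent RDD API calls.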

Thanks


