06-04-2017 03:11 PM
I'm preparing for the exam, but I'm trying to find out which tools and subjects are the main ones to study for it.
Should we focus on Sqoop (import & export), Flume, Kafka, Hive, Spark, and Oozie?
Am I missing anything? Or am I studying more than I should, since my background so far is only Sqoop and Hive?
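For the Sqoop import/export piece, the commands typically look like the sketch below. All of the hostnames, credentials, tables, and paths here are placeholders I made up for illustration, not anything from the exam:

```shell
# Import a relational table into HDFS.
# db.example.com, analyst, orders, and the paths are hypothetical.
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username analyst \
  --password-file /user/analyst/.db_password \
  --table orders \
  --target-dir /user/analyst/orders \
  --num-mappers 4

# Export HDFS data back into a (pre-created) database table.
sqoop export \
  --connect jdbc:mysql://db.example.com/sales \
  --username analyst \
  --password-file /user/analyst/.db_password \
  --table orders_summary \
  --export-dir /user/analyst/orders_summary
```

Being comfortable adjusting options like the delimiter, file format, and mapper count under time pressure is the kind of hands-on fluency the exam seems to expect.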
06-05-2017 05:23 AM
The CCP Data Engineer page explains the skills required and the test environment which you will be using to complete tasks. It should have the information you are looking for.
06-05-2017 06:16 AM
Ok, first of all, thanks for your reply @cjervis.
I've checked that page, but it isn't specific; it lists all the Cloudera tools, and it's quite impossible to use them all, right?
It's important to us to have guidelines like those on the Spark certification exam page, which the Data Engineer page doesn't have.
06-05-2017 06:21 AM
Ah, I see what you mean now. The certification team has been updating the pages to add a sample question and additional information, but they don't seem to have changed this one yet. I'll reach out and see what I can find out.
06-12-2017 05:14 AM
Sorry about the delay. I heard back from the certification team the other day and didn't get around to posting a reply.
Basically, they advised that the CCP: Data Engineer certification is meant to show your "mastery" of the subjects covered. As a result, they do not provide further details on the exam or sample questions.
06-12-2017 05:31 AM
I think the below wording from the CCP Data Engineer page should cover that question.
You are given five to eight customer problems each with a unique, large data set, a CDH cluster, and four hours. For each problem, you must implement a technical solution with a high degree of precision that meets all the requirements. You may use any tool or combination of tools on the cluster (see list below) -- you get to pick the tool(s) that are right for the job. You must possess enough industry knowledge to analyze the problem and arrive at an optimal approach given the time allowed. You need to know what you should do and then do it on a live cluster under rigorous conditions, including a time limit and while being watched by a proctor.