Member since: 01-13-2016
Posts: 23
Kudos received: 18
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3268 | 01-13-2016 08:56 PM
02-05-2016 04:52 PM · 1 Kudo
@rich Rich and team, is there a better way to contact you regarding errors discovered in the Practice Exam? I'm not sure this is the correct forum for raising such issues, as it seems intended more for Q&A. However, the issue noted here also affects the answer for Task 03 item #1. The correct answer should be 30,000, but the answer shown in the solutions folder is 30,267, which includes records from the 2 weather-related files that shouldn't be used until Task 07.
Labels: Hortonworks Data Platform (HDP)
02-05-2016 03:06 PM · 1 Kudo
@rich Thanks Rich. This clarifies a lot and removes some of the stress from the exam preparation. 🙂
02-05-2016 03:00 PM · 1 Kudo
@rich Thanks Rich. Based on your response, I believe that for a task to be marked as correct, you would only consider a combination of the output location and the framework used, and that there could be multiple ways of writing the code or script. For example, for the task questioned above, someone could write a single put command to copy all .csv files in the folder, OR execute separate put commands for every file, OR use copyFromLocal. Would that be correct?
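For instance, the three styles mentioned above might look like this (file names and target paths here are hypothetical placeholders, not the exam's actual ones):

```bash
# Option 1: a single put with a glob, copying every .csv in the folder
hdfs dfs -put /home/horton/datasets/flightdelays/*.csv /user/horton/flightdelays/

# Option 2: separate put commands, one per file (file names are hypothetical)
hdfs dfs -put /home/horton/datasets/flightdelays/flights_1.csv /user/horton/flightdelays/
hdfs dfs -put /home/horton/datasets/flightdelays/flights_2.csv /user/horton/flightdelays/
hdfs dfs -put /home/horton/datasets/flightdelays/flights_3.csv /user/horton/flightdelays/

# Option 3: copyFromLocal, which behaves like put for local source files
hdfs dfs -copyFromLocal /home/horton/datasets/flightdelays/*.csv /user/horton/flightdelays/
```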
02-05-2016 02:40 PM · 1 Kudo
#2 of TASK 01 states that you should put the 3 files from the /home/horton/datasets/flightdelays directory into HDFS. However, the flightdelays directory contains 5 files. This is a bit ambiguous, but I assume it refers to the 3 with similar names. Furthermore, when checking the solutions.txt file, I can see that the correct command is set to copy all .csv files in the folder to HDFS. How will tasks such as this one be graded (by the command/s used, the presence of the correct files in HDFS, etc.)? Would I not receive a PASS for this task if I moved all 5 files instead of just the 3 with the similar naming convention?
Labels: HDFS, Hortonworks Data Platform (HDP)
02-05-2016 01:41 PM · 1 Kudo
@Neeraj Sabharwal Thanks! I hadn't run across this one yet.
02-05-2016 01:27 PM · 3 Kudos
Are there any good walkthrough tutorials for Flume? I've seen the two listed here. However, after skimming the second one, "Analyzing Social Media and Customer Sentiment," I fail to see any use of or reference to Flume within it. I would specifically like something that walks through the two Flume objectives documented in the HDP Certified Developer Exam Objectives sheet:
- Given a Flume configuration file, start a Flume agent
- Given a configured sink and source, configure a Flume memory channel with a specified capacity

https
The first tutorial at the link above starts a Flume agent via Ambari, but I assume the exam will require this to be done via the terminal.
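For reference, a minimal sketch covering both objectives, assuming a trivial netcat-to-logger agent named a1 (agent name, port, and file name are illustrative):

```bash
# Hypothetical config: a netcat source, a logger sink, and a memory
# channel with an explicit capacity (the property the objective asks for)
cat > example.conf <<'EOF'
a1.sources = r1
a1.sinks = k1
a1.channels = c1

a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

a1.sinks.k1.type = logger

a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
EOF

# Start the agent from the terminal rather than Ambari
flume-ng agent --conf conf --conf-file example.conf --name a1 \
  -Dflume.root.logger=INFO,console
```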
Labels: Apache Flume
01-13-2016 08:56 PM · 2 Kudos
I believe it is a matter of case sensitivity. Please try PigStorage instead of pigstorage.
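A quick sketch of the difference (the script and field names are hypothetical):

```bash
# Built-in loader names like PigStorage are case sensitive, so
# "pigstorage" fails to resolve while "PigStorage" loads correctly
cat > load.pig <<'EOF'
A = LOAD 'data.csv' USING PigStorage(',') AS (name:chararray, value:int);
DUMP A;
EOF
pig -x local load.pig
```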
01-13-2016 03:33 PM
Thanks Rich. Your response led me to find a discrepancy between what is written at the first link you provided and the pdf file available for download on the right side of that same page under Resources as 'HDP Certified Developer (HDPCD) Exam', which is what I've been referencing during my preparation.
01-13-2016 03:17 PM · 1 Kudo
I am preparing for the HDP Developer Certification and have been unable to find relevant tutorials for each of the exam objectives. I'm using the labs that are part of Hortonworks University as well as the official tutorials. I'm aware that the exam objectives list has links to documentation, but I'm specifically looking for more hands-on, step-by-step tutorials for things such as 'Use WebHDFS to create and write a file in HDFS'.
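As a concrete example of that WebHDFS objective, a minimal sketch using curl (host names and paths are placeholders; WebHDFS file creation is a two-step redirect):

```bash
# Step 1: ask the NameNode where to write; the response is an
# HTTP 307 redirect whose Location header names a DataNode URL
curl -i -X PUT \
  "http://<namenode-host>:50070/webhdfs/v1/user/horton/test.txt?op=CREATE&user.name=horton"

# Step 2: PUT the actual file content to the Location URL from step 1
curl -i -X PUT -T test.txt "<Location-URL-from-step-1>"
```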