Member since: 09-25-2015
Posts: 101
Kudos Received: 51
Solutions: 25

My Accepted Solutions
Views | Posted |
---|---|
427 | 08-25-2017 01:06 PM |
874 | 05-16-2017 12:53 PM |
474 | 05-15-2017 05:07 PM |
581 | 05-11-2017 04:19 PM |
405 | 05-05-2017 12:29 AM |
02-07-2017
01:39 PM
2 Kudos
Hello, Dinesh, all good questions.
You are correct that to "pass" the HDPCA exam you need to be at least familiar with the tasks in the Exam Objectives (http://hortonworks.com/training/certification/exam-objectives/#hdpca).
The best way to prepare for the exam is to run a Practice Exam on AWS (http://2xbbhjxc6wk3v21p62t8n4d4.wpengine.netdna-cdn.com/wp-content/uploads/2015/04/HDPCA-PracticeExamGuide.pdf). This practice environment runs the same HDP version as the actual exam, and its tasks are similar in nature to the exam tasks. In the actual exam you will have a minimally configured cluster, so you will not need to install from scratch on a base OS, but you will be required to configure different services as per the Objectives.
If you still want to create a cluster from scratch, which is always a great thing to learn, you would probably want to install an HDP version that is close to the exam's. The exam is based on Hortonworks Data Platform 2.3 installed and managed with Ambari 2.1. Hope that this helps.
02-06-2017
05:29 PM
1 Kudo
Hello, Robert. Normally it takes a couple of days, up to 5, to receive grades from the proctoring company. Let me check on this particular one and see if there was an issue in the grade transmission. Also, just to double-check: sometimes these emails are caught in your spam folder. I will email you in a few minutes.
02-03-2017
08:59 PM
2 Kudos
No worries, sorry if I misunderstood. The level is the same. Of course the questions and datasets are different, but the tasks are similar in nature. If you can comfortably finish all tasks in the practice, I have a high level of confidence you can pass the actual exam. Try finishing under 2 hours to best simulate the actual exam. Hope that this helps.
02-03-2017
04:54 PM
1 Kudo
The actual exam is hosted by a proctoring company, so it adds a wrapper for chat (with the proctor). The screen may look a bit different because of the wrapper, but the underlying cloud VM is the same in format and HDP version. More info can be found at http://hortonworks.com/training/certification/hdpcd-certification/. Hope that this helps.
02-03-2017
04:44 PM
1 Kudo
You have to register the DataFrame as a temporary table, then query it and save the result, like:
myDF.registerTempTable("myTempTable")
val myDFTable = sqlContext.sql("SELECT col1, col2, col3 FROM myTempTable WHERE col2 > 1000")
myDFTable.map(x => x(0) + "," + x(1) + "," + x(2)).saveAsTextFile("output.csv")
Note that saveAsTextFile creates a directory named output.csv containing the part files, not a single file.
02-03-2017
12:58 PM
Yes, that's what I figured. I will fix it. Thanks for the feedback.
02-03-2017
12:34 PM
OK, I understand. Thank you for bringing that to our attention. I think that the documentation probably moved. I will review and update. Thanks again.
02-02-2017
12:31 PM
Thanks for the feedback. I will check the links. Meanwhile, I attached the actual datasheet PDFs for your reference. Cheers
02-01-2017
02:55 PM
1 Kudo
Hello, Abhijit. The exam does not specify which method; that is up to you, as long as the output is in the correct format and has the correct number of records. Hope that this helps.
02-01-2017
02:51 PM
1 Kudo
Hello, Abhijit. Yes, the exam instructions ask you to save the scripts in a folder under the current user's directory on the exam Desktop. The exam will tell you the exact location to save them. Normally you would write your script using vi or gedit (available in the exam environment) and then save it in the correct location. Hope that this helps.
02-01-2017
12:10 PM
1 Kudo
Not sure why you are getting that link. What was the page that you were accessing the exam datasheets from? I do not think that they are under docs.hortonworks.com. Can you try accessing them from here: http://hortonworks.com/training/certification/ --> http://hortonworks.com/training/certification/hdpcd-certification/ In any event, I am enclosing the sheets: datasheet-hdpcd-java-22.pdf, certdatasheet-hdpcd-24.pdf, datasheet-hdpca-23.pdf
01-30-2017
07:32 PM
2 Kudos
There is no need to build anything. The entire exam can be done via the Scala/Python shells. Thanks
01-27-2017
06:33 PM
Sachin, another alternative to using the latest Ambari is to use an AWS Practice exam for HDPCA (click here for info: http://hortonworks.com/training/certification/hdpca-certification/). This AWS environment is based on the same Ambari version as the actual exam. For exam questions please write to certification@hortonworks.com
01-25-2017
11:06 PM
1 Kudo
Hello, Mohamed. What AWS instance type did you use for the practice? The actual exam uses a slightly larger instance size than the recommended size on the practice. However, latency can also be caused by other factors like bandwidth. We recommend using a fast network in your area while taking the exam. If you have any more questions regarding your particular case, please write to certification@hortonworks.com.
01-23-2017
02:25 PM
1 Kudo
Neeraj, please send practical exam queries to certification@hortonworks.com. Thanks
01-23-2017
02:22 PM
You can also EXPLICITLY CAST after the load, as:
a = load 'a.txt' using PigStorage('\t');
b = foreach a generate (int)$6 as art:int, (chararray)$17 as dest:chararray;
01-23-2017
02:08 PM
1 Kudo
There is a typo in the solution. The correct line in the solution should be:
b = filter a by (chararray) $4 != 'NA';
Thanks
12-13-2016
03:37 PM
2 Kudos
At the moment the exam image does not accept keyboard changes, but you may be able to change the keyboard layout on your own machine, and the VM will receive the appropriate character set.
11-30-2016
06:28 PM
Thanks. Yes, I meant the contents (text files), not the directory. This makes sense. Thanks, I will try that.
11-30-2016
05:27 PM
1 Kudo
I replied to Yogesh on a separate thread, but basically this particular issue was due to a cloud-hosting environment issue.
11-30-2016
05:04 PM
In Ambari 2.2.1 how can I delete old Ambari Server Ops History? I think that the history of tasks is stored in /var/lib/ambari-agent/data. Can I simply delete the folder contents?
Labels: Apache Ambari
06-25-2016
12:42 AM
1 Kudo
Password in #3 above is stored in /etc/ambari-server/conf/password.dat
04-28-2016
01:17 PM
Thanks, guys. Both answers helped. I had the wrong DNS name in the Database URL config value of the Hive service. Hive is NOT being used in this cluster, so I did not think to check that value. Thanks to both. WG
04-27-2016
11:22 PM
I have an issue where the screen for the YARN Configs tab in Ambari is not refreshing (I get an eternal spinning wheel, but it never refreshes). All other Config tabs work. All other tabs for YARN (Summary, Heatmaps) work. Upon closer inspection with the Developer Tools in Chrome, I see an "Uncaught Error: Invalid Path" in the JS Console (see screenshot: hcc.png). Ideas?
Labels: Apache Ambari, Apache YARN
12-15-2015
11:05 PM
HCatalog does not seem to allow constants as columns. Via Sqoop import:
sqoop import \
--query "select col1 as col1, col2 as col2, '' as col3, col4 as col4 from input_table" \
--connect jdbc:db2://server.com:60000/DB1 \
--username ******* \
--password ******* \
--hcatalog-database my_db \
--hcatalog-table output_table \
--create-hcatalog-table \
--hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="SNAPPY")' \
--verbose \
--fetch-size 20000
Log:
15/12/14 03:28:57 INFO hcat.SqoopHCatUtilities: Creating HCatalog table my_db.output_table for import
15/12/14 03:28:57 INFO hcat.SqoopHCatUtilities: HCatalog Create table statement:
create table `my_db`.`output_table` (
`col1` varchar(387),
`col2` varchar(17),
`col3` varchar,
`col4` varchar(17))
stored as orc tblproperties ("orc.compress"="SNAPPY")
...
15/12/14 03:29:04 INFO hcat.SqoopHCatUtilities: FAILED: ParseException line 4:14 mismatched input ',' expecting ( near 'varchar' in primitive type specification
15/12/14 03:29:05 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@4aa0560e
15/12/14 03:29:05 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: HCat exited with status 64
Is this supported? Looks like https://issues.apache.org/jira/browse/SQOOP-2596
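No resolution was posted in this thread, but one possible workaround (my assumption, not confirmed here) is to give the constant column an explicit length in the source query, so DB2 reports a sized type and the generated HCatalog DDL becomes `col3` varchar(1) instead of the bare `varchar` that triggers the ParseException. A sketch, reusing the connection details from the command above:

```shell
# Hypothetical variant of the failing command: cast the empty-string
# constant to a sized VARCHAR so the auto-generated CREATE TABLE parses.
# Note: Sqoop's free-form --query also requires the $CONDITIONS token.
sqoop import \
  --query "select col1, col2, cast('' as varchar(1)) as col3, col4 from input_table where \$CONDITIONS" \
  --connect jdbc:db2://server.com:60000/DB1 \
  --username ******* \
  --password ******* \
  --hcatalog-database my_db \
  --hcatalog-table output_table \
  --create-hcatalog-table \
  --hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="SNAPPY")' \
  --verbose \
  --fetch-size 20000
```

Overriding the column type on the Sqoop side (e.g. with --map-column-hive) may also be worth trying, per the discussion in SQOOP-2596.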
Labels: Apache HCatalog, Apache Sqoop
12-11-2015
01:37 PM
That error only says that the container crashed, but does not say why. You may be right about the YARN classpath. What is the output of: ls -al /usr/hdp/current/hadoop-* ?