Member since: 07-18-2016
Posts: 262
Kudos Received: 12
Solutions: 21

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 6540 | 09-21-2018 03:16 AM |
| | 3104 | 07-25-2018 05:03 AM |
| | 4050 | 02-13-2018 02:00 AM |
| | 1890 | 01-21-2018 02:47 AM |
| | 37651 | 08-08-2017 10:32 AM |
09-21-2018
03:16 AM
Updating late; after further checking, the information is below.
1) hadoop fs -copyFromLocal file1.dat /home/hadoop/file1.dat :- this is a local Linux client command. You can check the local server process with #ps -ef | grep file1.dat | grep -i copyFromLocal; you will find the process ID, which again confirms it is a local process.
2) How to find the YARN application ID for this copyFromLocal command :- since it is a local client command that uses local server resources, you will not find any MR/YARN job for it. While the data is being copied, resources are consumed on the local Linux server and on the Hadoop cluster, but only for the copy itself. Because the process is local, it does not create an MR/YARN job.
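A minimal sketch of how to verify this behaviour (the file name and paths are just the examples from this thread): the copy shows up as an ordinary local process, while the YARN application list is unaffected.

```bash
# Hypothetical sketch: confirm that copyFromLocal runs as a local client
# process and never shows up as a YARN/MR application.

# Start the copy in the background (example file name and paths only)
hadoop fs -copyFromLocal file1.dat /home/hadoop/file1.dat &

# 1) The copy is visible as an ordinary local process on this server
ps -ef | grep file1.dat | grep -i copyFromLocal | grep -v grep

# 2) The ResourceManager knows nothing about it: no RUNNING application
#    corresponds to the copy
yarn application -list -appStates RUNNING
```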
08-30-2018
01:42 AM
Entries will be updated in the logs; however, is there any command to check the application ID for a Hadoop command? That is what I am looking for. Example :- for YARN we can check the list of running jobs using the YARN command #yarn application -list
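For reference, a small example of the YARN CLI mentioned above; the application ID shown is a placeholder, not a real job.

```bash
# List currently running YARN applications (this prints the application IDs)
yarn application -list -appStates RUNNING

# Inspect a single application once you know its ID
# (the ID below is a placeholder)
yarn application -status application_1234567890123_0001
```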
08-23-2018
09:52 PM
I have a job which copies data from the local file system to HDFS. 1) hadoop fs -copyFromLocal file1.dat /home/hadoop/file1.dat 2) How do I find the YARN application ID for this copyFromLocal command? Thanks,
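A quick sketch of the copy in question, assuming the same example paths, with an extra hadoop fs -ls to confirm the file landed in HDFS.

```bash
# Copy from the local file system into HDFS (paths as given in the question)
hadoop fs -copyFromLocal file1.dat /home/hadoop/file1.dat

# Verify the file now exists in HDFS
hadoop fs -ls /home/hadoop/file1.dat
```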
Labels:
- Apache Hadoop
- Apache YARN
07-25-2018
05:03 AM
Finally I did the following, and the Certification team has refunded the amount.
1) Reached Hortonworks customer care on the contact number.
2) Shared the certification registration number and the name of the person.
3) Raised a complaint about the certification issue; a ticket was created for our request.
4) After 2 weeks the certification team refunded the amount.
They confirmed they are upgrading the certification platform from Aug 2018. I will check the reviews, and if there are no complaints I will try to take the certification again. Hope it helps. Thank you.
06-13-2018
08:20 AM
Could you help? There has been no response for 15 days.
06-03-2018
02:02 PM
Can someone from Hortonworks please respond? It has been 3 days with no information.
06-02-2018
01:12 AM
I sent mail to the Certification team, and this is the request number: "Your request (14199) has been received and is being reviewed by our support staff."
05-31-2018
01:51 PM
Certification team: please respond with your inputs.
05-31-2018
02:41 AM
@William Gonzalez Please advise with your comments on what we need to do next.
05-30-2018
06:09 PM
Hortonworks is a great company with a great product, but I think it should recognize that it has a big problem with its exam provider. The handling of complaints about the exam environment and the subsequent delivery is very poor and appalling. Personally, my friend sat for the exam, and I encountered the same problem on my first attempt: network issues. This needs attention. Hortonworks developers/engineers are getting frustrated with the PSI exam environment, but Hortonworks is doing nothing to resolve the problem, and delays in delivering exam results are unacceptable when there is an SLA. If Hortonworks wants consultants to deliver its products and compete with other vendors, I think it should rethink the exam delivery process, for God's sake.
1) Simple errors/output cannot be checked in the given window.
2) Scrolling up and down is very slow and not user friendly.
3) I am not the only person complaining about this; there are many others, as you can check in the community.
4) If you are not able to fix it, kindly close the certification program; at least then we won't try to get certified by Hortonworks.
Dear Hortonworks, don't screw up such a nice product! Please report this to the responsible managers 🙂
Exam Sponsor: Hortonworks
Exam: HDP Certified Developer: Spark
Exam Code: HDPCD:Spark
Scheduled Date: May 30, 2018
Scheduled Time: 11:00 PM Malay Peninsula Standard Time
Confirmation Code: 351-669
Candidate Id: 3994184016
Labels:
- Apache Hadoop