Member since: 06-12-2016
Posts: 22
Kudos Received: 3
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2138 | 01-15-2017 03:49 PM
02-13-2017 08:10 PM
Thank you, you are right. Once I created the kadmin user on each Linux machine, I was able to submit the task successfully!
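A minimal sketch of that fix, assuming "kadmin user" means a local Linux account matching the submitting Kerberos principal, created on every node (the hostnames and username below are illustrative):

# Create the same local account on each node so YARN can launch
# containers for the submitting principal (all names hypothetical).
for host in node1 node2 node3; do
  ssh "$host" 'sudo useradd -m testuser'
done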
01-15-2017 03:49 PM
This worked:
[hdfs@test232 ~]$ hadoop distcp -Dhadoop.security.credential.provider.path=jceks://hdfs/aws/aws.jceks /test s3a://kartik-test/
Thanks for all your help!!
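For reference, a credential store like the one used above is typically created beforehand with the hadoop credential CLI; the sketch below uses the standard s3a credential aliases and the same provider path as the command above, and prompts interactively for the secret values:

# Store the AWS keys in a JCEKS credential provider on HDFS.
hadoop credential create fs.s3a.access.key -provider jceks://hdfs/aws/aws.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs/aws/aws.jceks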
07-16-2019 03:24 PM
@Arpit is right that you should do an actual calculation for the namenode heap and keep it up to date as your data grows. I know this thread is about datanodes, but since the namenode was brought up several times, I want to point out that Cloudera recommends 1 GB of heap per million files plus blocks as a good starting point. Once you reach many millions of files and blocks you can tune that ratio down, but start there.
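As a hypothetical worked example of that guideline: a cluster holding roughly 50 million files and blocks would start at about a 50 GB namenode heap. One way to set that is in hadoop-env.sh (the figures are illustrative, not a recommendation for any particular cluster):

# 50,000,000 files+blocks / 1,000,000 per GB ≈ 50 GB starting heap
export HADOOP_NAMENODE_OPTS="-Xms50g -Xmx50g ${HADOOP_NAMENODE_OPTS}"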
06-15-2016 03:03 AM
4 Kudos
@Kartik Vashishta Downloading the Sandbox and performing the tasks listed at the URL below will be sufficient preparation for the HDPCA exam:
http://hortonworks.com/training/class/hdp-certified-administrator-hdpca-exam/
There is also an option to take a practice test on AWS; please refer to the guide below:
http://hortonworks.com/wp-content/uploads/2015/04/HDPCA-PracticeExamGuide-1.pdf
Hope this information helps! All the best! 🙂