Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 616 | 06-04-2025 11:36 PM |
|  | 1182 | 03-23-2025 05:23 AM |
|  | 585 | 03-17-2025 10:18 AM |
|  | 2192 | 03-05-2025 01:34 PM |
|  | 1378 | 03-03-2025 01:09 PM |
01-02-2019
07:19 AM
Hi Geoffrey, I posted a new thread - https://community.hortonworks.com/questions/231177/metrics-failed-on-orgapachehadoophbasezookeeperzoo.html - but I don't see it appearing under the Hortonworks questions. Could you help me understand why?
04-17-2019
06:31 AM
@Alexander Lebedev Are you still facing login issues with the Sandbox? This looks like a redirection issue with your localhost and is most probably linked to your /etc/hosts configuration (see the example entry below). Let me know if you are still stuck with this; I'd be happy to help.
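If it helps, a typical mapping on the machine running the Sandbox looks something like the entry below. The sandbox-hdp.hortonworks.com hostname is an assumption for illustration; use whatever hostname your Sandbox version actually announces.

```
# /etc/hosts on the host machine (illustrative entry; adjust the hostname)
127.0.0.1   localhost sandbox-hdp.hortonworks.com
```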
12-18-2018
05:42 AM
Thanks @Geoffrey Shelton Okot for researching this. I resolved the issue by following the instructions in this link: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_command-line-installation/content/configuring-atlas-sqoop-hook.html
12-17-2018
05:53 AM
thanks bro
01-01-2019
05:13 PM
@max mouse There is no single tool that does everything equally well and addresses all of your requirements. Combining tools that do different things well builds up functionality and gives you more flexibility to handle a larger set of scenarios. Depending on your needs, both NiFi and Flume can act as Kafka producers and/or consumers, as sketched below. HTH
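To make the producer/consumer point concrete, here is a minimal sketch using the kafka-python client. The client choice, broker address, and topic name are assumptions for illustration; NiFi would use its PublishKafka/ConsumeKafka processors and Flume its Kafka sink/source rather than this code.

```python
# Minimal illustration of the two roles NiFi or Flume can play against Kafka.
from kafka import KafkaProducer, KafkaConsumer

# Producer role (what NiFi's PublishKafka or a Flume Kafka sink does):
producer = KafkaProducer(bootstrap_servers="localhost:9092")  # assumed broker
producer.send("events", b"sample record")                     # assumed topic
producer.flush()

# Consumer role (what NiFi's ConsumeKafka or a Flume Kafka source does):
consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",   # read the topic from the beginning
    consumer_timeout_ms=5000,       # stop iterating when no new messages arrive
)
for message in consumer:
    print(message.value)
```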
12-21-2018
02:30 PM
All, thanks for your responses. I found the root cause of the issue: Ambari was using its master key as the KDC admin credential, which is why it was giving "Missing KDC administrator credentials. Please enter admin principal and password". I removed that credential file (PFA: kerberos-admin-creds-issue-solved.png) and the issue was solved. For others: you may need to keep the Ambari master key and the KDC admin credentials the same, because that file is required at the time of ambari-server restart (if you have configured jceks).
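If anyone needs to re-register the KDC admin credential after removing the stale file, one way is Ambari's credentials REST endpoint. A hedged sketch follows: the host, cluster name, login, principal, and password are all placeholders, and it assumes an Ambari version that exposes the kdc.admin.credential resource.

```python
# Sketch: storing the KDC admin credential via the Ambari REST API.
import requests

AMBARI = "http://ambari-host:8080"   # placeholder Ambari server
CLUSTER = "mycluster"                # placeholder cluster name

resp = requests.post(
    f"{AMBARI}/api/v1/clusters/{CLUSTER}/credentials/kdc.admin.credential",
    auth=("admin", "admin"),              # Ambari admin login (placeholder)
    headers={"X-Requested-By": "ambari"}, # header required by the Ambari API
    json={"Credential": {
        "principal": "admin/admin@EXAMPLE.COM",  # KDC admin principal
        "key": "kdc-admin-password",             # its password
        "type": "persisted",  # "persisted" keeps it in the credential store (jceks)
    }},
)
resp.raise_for_status()
```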
10-29-2018
02:29 PM
Hi Geoffrey - I reinstalled Ambari and HDFS and that fixed the issue - thank you
10-19-2018
05:54 PM
Your "Database type" property is set to "Generic", try setting it to Oracle (for Oracle < 12) or Oracle 12+.
10-03-2018
09:01 PM
2 Kudos
@Lenu K Your question is rather broad; for a small cluster it all depends on the manpower at hand. For HDF, remember to back up the flow files. Below is what immediately comes to mind.

Fresh install pros and cons:
- Better planned: you get a clean installation, properly configured, with mistakes learned from the current cluster setup.
- Straightforward, with no upgrade surprises.
- You lose your customization.

Upgrade pros and cons:
- Must be planned properly, with documented steps.
- Expect technical surprises and challenges; plan for support on D-day if you don't already have it.
- Challenges mold you into a better Hadoopist! See Mandatory Post-Upgrade Tasks.

Best practices:
- Verify that the file system you selected is supported by HWX.
- Pre-create all the databases.
- Back up your cluster before either of the above.
- Plan for at least NN/RM HA (the NameNode is the brain, so allocate good memory).
- You MUST have 3 ZooKeepers.
- HDD planning is important: prefer SSD over SCSI.
- Restrict access to the cluster to the edge node ONLY.
- Kerberize the cluster.
- Configure SSL.
- Consider SSD for ZooKeeper and HBase; the OS can also use SSD acceleration for temp tables in Hive by exposing the SSD via HDFS.
- Plan the data center network well (backup lines).
- Size your nodes' memory and storage properly; beware if performance is a must, as Kafka and Storm in particular are memory intensive.
- Delegate authorization to Ranger.

A test environment also lets you:
- Test upgrade procedures for new versions of existing components.
- Execute performance tests of custom-built applications.
- Allow end users to perform user acceptance testing.
- Execute integration tests where custom-built applications communicate with third-party software.
- Experiment with new software that is beta quality and may not be ready for use at all.
- Execute security penetration tests (typically done by an external company).
- Let application developers modify configuration parameters and restart services on short notice.
- Maintain a mirror image of the production environment, to be activated in case of natural disaster or unforeseen events.
- Execute regression tests that compare the outputs of new application code with existing code running in production.

HTH