Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 923 | 06-04-2025 11:36 PM |
|  | 1525 | 03-23-2025 05:23 AM |
|  | 756 | 03-17-2025 10:18 AM |
|  | 2710 | 03-05-2025 01:34 PM |
|  | 1801 | 03-03-2025 01:09 PM |
10-05-2018
10:52 PM
@Rodrigo Hjort Any updates on this thread? It should have been closed long ago. Please take a moment to accept a provided solution.
10-04-2018
10:08 AM
@Kunal Agarwal The proper way is to locate the application ID in the RM UI and run the command below:

$ yarn application -kill <application_id>

HTH
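The application ID can also be pulled straight from the CLI instead of the RM UI. A minimal sketch (the helper name is hypothetical; the `application_<cluster-ts>_<seq>` ID format is YARN's):

```shell
# Hypothetical helper: extract the application ID from a line of
# `yarn application -list` output so it can be passed to `yarn application -kill`.
extract_app_id() {
  grep -oE 'application_[0-9]+_[0-9]+' <<< "$1"
}

# Abbreviated sample line from `yarn application -list`:
extract_app_id "application_1538500000000_0042    wordcount    MAPREDUCE"
# -> application_1538500000000_0042
# Then, on a cluster node:
#   yarn application -kill application_1538500000000_0042
```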
10-03-2018
09:01 PM
2 Kudos
@Lenu K Your question is rather broad; for a small cluster everything depends on the manpower at hand. For HDF, remember to back up the flow files. Below is what immediately comes to mind.

Fresh install, pros and cons:
- Better planned: you get a clean, properly configured installation that applies the mistakes learned from the current cluster setup.
- Straightforward, with no upgrade surprises.
- You lose your customizations.

Upgrade, pros and cons:
- Must be planned properly, with documented steps.
- Expect technical surprises and challenges; arrange support for D-day if you don't already have it.
- Challenges mold you into a better Hadoopist!
- See "Mandatory Post-Upgrade Tasks".

Best practices:
- Verify that the file system you selected is supported by HWX.
- Pre-create all the databases.
- Back up your cluster before either of the above.
- Plan for at least NN/RM HA (the NameNodes are the brain, so allocate good memory).
- You MUST have 3 ZooKeepers.
- HDD planning is important: prefer SSD over SCSI. Consider SSD for ZooKeeper, HBase, and the OS; Hive can also use SSD acceleration for temp tables by exposing the SSD via HDFS.
- Restrict access to the cluster to ONLY the edge node.
- Kerberize the cluster and configure SSL.
- Plan the data center network well (backup lines).
- Size your nodes' memory and storage properly; beware if performance is a must, as Kafka and Storm are memory intensive.
- Delegate authorization to Ranger.
A test cluster lets you:
- Test upgrade procedures for new versions of existing components.
- Execute performance tests of custom-built applications.
- Allow end-users to perform user acceptance testing.
- Execute integration tests where custom-built applications communicate with third-party software.
- Experiment with new software that is beta quality and may not be ready for usage at all.
- Execute security penetration tests (typically done by an external company).
- Let application developers modify configuration parameters and restart services on short notice.
- Maintain a mirror image of the production environment to be activated in case of natural disaster or unforeseen events.
- Execute regression tests that compare the outputs of new application code with existing code running in production.

HTH
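The "back up your cluster before either of the above" advice can start as simply as archiving service config directories. A minimal sketch (the function name and paths are hypothetical, not part of any HDP tooling):

```shell
# Hypothetical helper: archive a config directory (e.g. /etc/hadoop/conf)
# into a date-stamped tarball before attempting an upgrade.
backup_conf() {
  local src="$1" dest="$2"
  tar -czf "${dest}/$(basename "$src")-$(date +%F).tar.gz" \
      -C "$(dirname "$src")" "$(basename "$src")"
}

# Usage (paths are examples):
#   backup_conf /etc/hadoop/conf /backups
```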
10-03-2018
08:01 PM
@Krystle Salazar There is no better resource than this HWX KB document on port forwarding: HDP 2.5 VirtualBox Sandbox. If that helped, please accept the answer to close the thread. Happy HDP'ing! HTH
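That KB procedure boils down to adding NAT port-forwarding rules on the sandbox VM. A hedged sketch of the rule format (the helper name and VM name are assumptions; the `--natpf1` rule syntax `name,protocol,hostip,hostport,guestip,guestport` is VirtualBox's):

```shell
# Hypothetical helper: build a VirtualBox NAT port-forwarding rule string
# in the "name,protocol,hostip,hostport,guestip,guestport" format
# expected by `VBoxManage modifyvm --natpf1`.
natpf_rule() {
  printf '%s,tcp,,%s,,%s' "$1" "$2" "$3"   # rule name, host port, guest port
}

natpf_rule ambari 8080 8080   # -> ambari,tcp,,8080,,8080
# Apply it on the host (VM name is an assumption):
#   VBoxManage modifyvm "HDP 2.5 Sandbox" --natpf1 "$(natpf_rule ambari 8080 8080)"
```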
10-03-2018
05:08 PM
@Krystle Salazar Set that in the Hive CLI:

set hive.execution.engine=tez;

That should do it.
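A `set` on the CLI only lasts for the current session. To persist it, the setting can go into the per-user init file the Hive CLI sources on startup (a sketch; the helper name is hypothetical, `~/.hiverc` is the Hive CLI's default init file):

```shell
# Hypothetical helper: persist a Hive setting in the CLI's per-user init
# file (~/.hiverc by default), which the Hive CLI sources on every new session.
append_hive_setting() {
  local key="$1" value="$2" rcfile="${3:-$HOME/.hiverc}"
  echo "set ${key}=${value};" >> "$rcfile"
}

# Usage: append_hive_setting hive.execution.engine tez
```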
10-03-2018
07:05 AM
@Anurag Mishra If the response answered your question, can you take the time to log in and "Accept" the answer to close the thread, so other members can use it as a solution?
10-02-2018
10:06 AM
@Anurag Mishra When you run kinit, the password is transformed with the supported encryption algorithms into a secret key, which is checked against the KDC database for validity. HTH
10-01-2018
01:39 PM
@Anurag Mishra When you kerberize the cluster through Ambari, you MUST first provide an admin principal and password, which you created after creating your KDC database:

# kdb5_util create -s

You are then required to create an admin principal and password:

# kadmin.local -q "addprinc admin/admin"

Note the warning: you will be prompted for the database master password, and it is important that you do NOT FORGET it.

Enter KDC database master key:
Re-enter KDC database master key to verify:

This is the input requested when kerberizing through Ambari: it will ask for admin/admin@{REALM} and its password. Only after passing the correct values in the Enable Kerberos UI can you proceed to generate the keytabs successfully. So, to answer your question: Ambari validates your admin credentials against the KDC and then uses them to generate the keytabs. See the Hortonworks Kerberos document and the attached screenshot illustration. HTH
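The value the Enable Kerberos wizard asks for is just the fully-qualified principal created above. A tiny sketch of that `primary/instance@REALM` format (the helper name and realm are hypothetical):

```shell
# Hypothetical helper: build the fully-qualified admin principal string
# (primary/instance@REALM) that the Enable Kerberos wizard expects.
make_admin_principal() {
  local realm="$1"
  echo "admin/admin@${realm}"
}

make_admin_principal "EXAMPLE.COM"   # -> admin/admin@EXAMPLE.COM
```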
09-28-2018
06:22 PM
1 Kudo
@Saravana V Your problem is a broken symlink. If you did an upgrade, please see how to resolve the issue in the official Hortonworks documentation. HTH
09-26-2018
05:28 AM
1 Kudo
@Faisal Durrani SMM is an app running on DataPlane Service (DPS) and operates on top of the platform. DataPlane serves as a management layer across clusters, on-premises or in the cloud. Its apps include:
- Data Lifecycle Manager
- Data Steward Studio
- Streams Messaging Manager
- Data Analytics Studio

Here is a link_to_SMM to the procedure to install SMM and the other components. Remember, you MUST have an HDP or HDF cluster to deploy DPS components like SMM. HTH