Member since 01-19-2017
3679 Posts
632 Kudos Received
372 Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 774 | 06-04-2025 11:36 PM |
| | 1354 | 03-23-2025 05:23 AM |
| | 669 | 03-17-2025 10:18 AM |
| | 2423 | 03-05-2025 01:34 PM |
| | 1583 | 03-03-2025 01:09 PM |
02-05-2016
10:09 PM
@Prakash Punj Try rebooting that server and restarting the Ambari server. The log you uploaded says "Port in use: pp-hdp-m:50070 at org.apache.hadoop.http.HttpServer2.openListeners". If you want help, please follow the tips from this forum; by elimination we can come to your rescue!
02-05-2016
10:01 PM
Kill the process that is using the port 50070
15:12:21,968 ERROR namenode.NameNode (NameNode.java:main(1712)) - Failed to start namenode.
java.net.BindException: Port in use: pp-hdp-m:50070
    at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:919)
    at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:856)
    at
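As a rough sketch, finding and freeing the port from the shell might look like this (it assumes `lsof` is installed; the port number 50070 comes from the log above, and should be adjusted if you changed `dfs.namenode.http-address`):

```shell
# Free up the NameNode HTTP port (50070) before restarting.
PORT=50070
if command -v lsof >/dev/null 2>&1; then
  PID=$(lsof -t -i :"$PORT" 2>/dev/null)   # -t prints only the PID
else
  PID=""                                    # lsof not available on this box
fi
if [ -n "$PID" ]; then
  echo "Killing process $PID holding port $PORT"
  kill "$PID"        # escalate to kill -9 only if it refuses to exit
else
  echo "Port $PORT is free"
fi
```

After the port is free, restart the NameNode from Ambari as usual.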
02-05-2016
04:13 PM
@Robin Dong Any updates?
02-05-2016
04:33 AM
1 Kudo
[root@hostx .ssh]# ls
authorized_keys  id_rsa  id_rsa.pub  known_hosts
The id_rsa file contains the private key and id_rsa.pub contains the public key for user root. Now run the command cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys (note the >> so the key is appended rather than overwriting existing entries); it adds the contents of root's id_rsa.pub file to authorized_keys. Then, when a user connects to this machine over SSH and presents the matching private key, the machine looks it up in authorized_keys and knows it is root. Remember to copy authorized_keys to all the servers in the cluster. Also check the permissions on the .ssh directory and on the id_rsa, authorized_keys, and id_rsa.pub files.
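A minimal sketch of the append-and-permissions step, wrapped in a helper function (`fix_ssh_perms` is a hypothetical name for illustration, not a standard tool):

```shell
# fix_ssh_perms: append the public key to authorized_keys and set the
# modes sshd insists on -- wrong permissions make sshd silently ignore keys.
fix_ssh_perms() {
  dir="$1"                                          # e.g. /root/.ssh
  cat "$dir/id_rsa.pub" >> "$dir/authorized_keys"   # >> appends; > would overwrite
  chmod 700 "$dir"
  chmod 600 "$dir/authorized_keys" "$dir/id_rsa"
  chmod 644 "$dir/id_rsa.pub"
}
# fix_ssh_perms /root/.ssh    # repeat on every node in the cluster
```

Run it as root on each node so that passwordless SSH works in both directions.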
02-03-2016
08:51 PM
@rbalam So do you have mysql-connector-java on the Ambari server? If not, please copy it there.
02-03-2016
08:26 PM
@rbalam Your Ambari installation looks good (exit code 0), but I am puzzled. Can you copy and paste your MySQL-for-Ambari setup steps here?
02-03-2016
08:21 PM
@rbalam Try these steps to clean up your failed installation. Just to make sure: in the earlier steps, did you copy mysql-connector-java.jar to the /usr/share/java/ directory on both servers?
02-03-2016
07:48 PM
1 Kudo
@rbalam I have attached a document on how to install your Ambari server successfully; you will not find a better document than this. Please enjoy and revert: set-up-mysql-for-ambari.pdf. I forgot to include the syntax to connect remotely to the MySQL server, but it should be fine. The Ambari Server host needs the mysql-connector-java driver installed to be able to communicate with the database, so you will need to install it on both the local and the remote server.
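For the driver step itself, a hedged sketch (the `ambari-server setup --jdbc-db`/`--jdbc-driver` flags are the documented way to register a JDBC driver, but verify them against your Ambari version; the guard function `setup_ambari_jdbc` is my own illustrative wrapper):

```shell
# Register the MySQL JDBC driver with Ambari, refusing to run setup
# when the jar is missing from the expected location.
setup_ambari_jdbc() {
  jar="$1"
  if [ ! -f "$jar" ]; then
    echo "driver not found: $jar" >&2
    return 1
  fi
  ambari-server setup --jdbc-db=mysql --jdbc-driver="$jar"
}
# setup_ambari_jdbc /usr/share/java/mysql-connector-java.jar
```

Copy the jar to /usr/share/java/ on both hosts first, then run the setup on the Ambari Server host.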
02-03-2016
10:49 AM
@Rainer Geissendoerfer To fix this, SSH into your HDP VM, edit /etc/hadoop/conf/core-site.xml, and change the following config to add "localhost". Save and restart the relevant services, or just reboot your HDP VM instance.
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>sandbox.hortonworks.com,127.0.0.1,localhost</value>
</property>
core-site.xml
02-03-2016
07:43 AM
3 Kudos
A single-server production setup is not recommended at all. The minimum configuration for a production environment is 3 or 5 nodes (odd numbers), because of the default replication factor of 3. Dev and test don't need to be as big as prod. DR is required too, and the DR cluster can double for reporting. Please check the minimum requirements for a production setup.
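The arithmetic behind that minimum: with the default replication factor of 3, every HDFS block is stored three times, so raw disk capacity must be roughly three times the data you plan to keep, spread over at least three nodes. A toy sketch (the numbers are illustrative, not a sizing recommendation):

```shell
# Rough capacity planning with the default dfs.replication of 3.
REPLICATION=3
DATA_TB=10                             # illustrative: data you plan to store
RAW_TB=$((DATA_TB * REPLICATION))      # every block exists 3 times
echo "${DATA_TB} TB of data needs at least ${RAW_TB} TB raw capacity"
```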