
Is this the correct hosts configuration for nodes with multiple IPs running HBase?

New Contributor

Hello,

I have a question about the hosts configuration on our nodes.

The customer's cluster is not used only for Hadoop; it is already being used for other purposes, including module development, and that module-based cluster has limited data-processing capacity. I installed HDP to address that limitation, but I am now seeing errors I cannot explain (e.g. connection refused on ZooKeeper, connection lost).

I suspect the host configuration. The hosts file maps the same hostname to multiple IPs:

192.168.10.1  Node1  (private IP)
1.1.5.4       Node1  (public IP)

Is this configuration correct? I have not changed the server environment.

Thank you!

1 ACCEPTED SOLUTION


@seungho han

The nodes can have multiple IP addresses; that in itself is fine. However, the configuration must be set up correctly, and it is better to keep the cluster on the same series of IP addresses. If the cluster is set up with Ambari, the FQDNs are specified correctly, and each FQDN resolves to the expected IP address, then you should not run into configuration problems.
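As a rough sketch of what "resolves to the expected IP address" means here (the hostnames below are only illustrative, and this assumes the cluster services talk over the private network), the FQDN used by the cluster should resolve to exactly one address, the private one, while the public address is reachable under a separate name:

192.168.10.1   node1.example.internal   node1     # private IP, name used by the cluster
1.1.5.4        node1-public.example.com           # public IP, kept under a different name

You can check what a node reports for itself with hostname -f and confirm that name resolves to the private address, rather than having one hostname ambiguously mapped to two IPs as in your current hosts file.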

If this is a manual installation, please ensure that all of the associated configuration files (HBase / ZooKeeper / HDFS) point to the correct IP addresses and hostnames.
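For example (again with illustrative hostnames), the ZooKeeper quorum in hbase-site.xml and the server list in zoo.cfg should both use the same resolvable FQDNs:

<!-- hbase-site.xml: quorum entries use the cluster-facing FQDNs -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>node1.example.internal,node2.example.internal,node3.example.internal</value>
</property>

# zoo.cfg: the same FQDNs, resolving to the private addresses
server.1=node1.example.internal:2888:3888
server.2=node2.example.internal:2888:3888
server.3=node3.example.internal:2888:3888

If any of these entries carry the public address while the services actually bind to the private one, you can see exactly the kind of ZooKeeper connection-refused / connection-lost errors you described.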



New Contributor

Thank you for your kind reply!