Member since: 10-23-2019 · 11 Posts · 2 Kudos Received · 0 Solutions
01-22-2020
07:37 AM
1 Kudo
@lyubomirangelo Thank you! Going through the wizard (ambari-server setup-security) fixed my issue. I just needed to point it to the new key and certificate chain file, then restart.
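For anyone hitting the same issue, the flow looked roughly like this. The paths below are placeholders, and the wizard's exact menu wording varies slightly by Ambari version, so treat this as a sketch rather than an exact transcript:

```shell
# Re-run the Ambari HTTPS setup wizard so it picks up the renewed files.
# The /etc/ssl/ambari/ paths are placeholders -- use wherever your new
# certificate chain and private key actually live.
sudo ambari-server setup-security
# At the prompts, choose the "Enable HTTPS for Ambari server" option,
# keep the existing HTTPS port, and supply:
#   Certificate: /etc/ssl/ambari/ambari-chain.crt  (server cert + intermediates)
#   Private key: /etc/ssl/ambari/ambari.key
sudo ambari-server restart
```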
01-21-2020
08:39 AM
The node certificates on my cluster are expiring soon, so I have installed new ones, including on the node that runs ambari-server. However, after restarting ambari-server, ambari-agent, and even the node itself, the old certificate still shows. I've also tried clearing cache and cookies for all time in my browser, but that doesn't work and the old cert even shows up in IE. I've used the same methodology for other nodes in the cluster and it worked, so why isn't it working for the ambari node? (ambari-server is set up on an https port.)
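One way to rule out browser caching entirely is to ask the HTTPS port directly which certificate the server process is handing out. The hostname and port below are placeholders for your Ambari server:

```shell
# Ask the Ambari HTTPS port which certificate it is actually serving.
# Replace ambari-node.example.com:8443 with your own host and port.
echo | openssl s_client -connect ambari-node.example.com:8443 \
    -servername ambari-node.example.com 2>/dev/null \
  | openssl x509 -noout -subject -enddate -fingerprint
# If the notAfter date / fingerprint shown here still belongs to the old
# certificate, the server process is the culprit, not the browser cache.
```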
Labels:
- Apache Ambari
- Apache Hadoop
01-17-2020
03:39 PM
1 Kudo
I just recently imported a certificate chain into the keystore that NiFi points to, on 3 NiFi nodes; call them node1, node2 and node3. The truststore.jks file is so far unedited. Testing the SSL handshakes between nodes, I get:

SSL handshake has read 4537 bytes and written 495 bytes
...
return code: 0 (ok)

This was executed from node2 requesting node1 (using the same port configured in the NiFi SSL settings in Ambari). Other combinations (node1 -> node2, node3 -> node1, etc.) were similarly successful. However, after the certificate import and a NiFi restart, the NiFi UI shows that the cluster has been disconnected, and that the SSL handshakes are failing:

Attempt to contact NiFi Node https://node2:port/nifi did not complete due to exception: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Attempt to contact NiFi Node https://node3:port/nifi did not complete due to exception: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Attempt to contact NiFi Node https://node1:port/nifi did not complete due to exception: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

What is going on here? Why isn't the SSL handshake working through NiFi?
Labels:
- Apache Ambari
- Apache NiFi
01-15-2020
08:14 AM
@EricL After obtaining the root and intermediate CA certificates, I used the following command (where cacerts is the truststore file):

sudo keytool -importcert -alias rootca -keystore cacerts -file /tmp/rootca.crt

I get the following message:

Certificate already exists in keystore under alias <...>

So the rootca and intermediateca certs are already in my cacerts truststore. Why, then, is keytool not allowing me to import the new certificate into the keystore? (Note: I'm trying to import the server certificate into a different file than cacerts.)
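If I'm reading the question right, the "already exists" message only means the certificate is already present in the file that `-keystore` pointed at. A quick way to see what is in each file, and to target the other keystore, might look like this (file paths and the storepass are placeholders for your own values):

```shell
# List what is already in the truststore vs. the target keystore.
# Paths and -storepass are placeholders for your own values.
keytool -list -v -keystore cacerts -storepass changeit \
  | grep -E "Alias name|Owner:"
keytool -list -v -keystore /path/to/keystore.jks -storepass changeit \
  | grep -E "Alias name|Owner:"

# To import the server certificate into a different file than cacerts,
# point -keystore at that file and use an alias not already taken there:
keytool -importcert -alias node1 -file /tmp/node1.crt \
  -keystore /path/to/keystore.jks -storepass changeit
```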
01-14-2020
01:50 PM
I'm in the process of renewing the certificates for each node in my Hadoop cluster. I obtained a certificate file for each of my nodes, but when running the following command:

sudo keytool -importcert -alias node1 -file node1.cer -keystore keystore.jks

I get the error:

keytool error: java.lang.Exception: Failed to establish chain from reply

From what I've gathered, this happens when the root and intermediate CA certificates haven't been loaded into the truststore yet. But looking into the truststore.jks file itself, I can see that I already have root and intermediate CA certificates that won't expire for a long while, so they have already been loaded. Is it possible to use these existing root and intermediate CA certificates while importing my new Hadoop node certificate into the keystore? (I've also tried this variation of the command, but got the same error:)

sudo keytool -import -alias node1 -trustcacerts -storetype jceks -file node1.cer -keystore keystore.jks
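In case it helps others hitting the same error: in my experience "Failed to establish chain from reply" usually means keytool is looking for the CA certificates inside the same keystore file that holds the private key, and having them only in truststore.jks is not enough. A hedged sketch of the fix, with aliases, file names, and the password as placeholders:

```shell
# keytool builds the chain from certificates inside the SAME keystore file
# that holds the private key -- truststore.jks alone is not consulted.
KS=keystore.jks   # keystore containing the node's private key entry
PASS=changeit     # placeholder password

# 1. Import the CA chain into the keystore itself.
keytool -importcert -alias rootca -file rootca.crt \
  -keystore "$KS" -storepass "$PASS" -noprompt
keytool -importcert -alias intermediateca -file intermediateca.crt \
  -keystore "$KS" -storepass "$PASS" -noprompt

# 2. Now import the signed node certificate ONTO the private key entry:
#    -alias must match the alias the key pair was generated under.
keytool -importcert -alias node1 -file node1.cer \
  -keystore "$KS" -storepass "$PASS"
```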
Labels:
- Apache Hadoop
10-24-2019
07:17 AM
[postgres@testvm1 ~]$ psql
psql (9.2.24)
Type "help" for help.
postgres=#

@Scharan Confirmed postgres is running. Again, I don't seem to have a /var/lib/pgsql folder at all, despite postgres being installed. It might help to mention that I've designated Hive Metastore and HiveServer2 to be on a node that is not the postgres/ambari-server node (from what I've heard this should not matter). UPDATE: I did find the pg_hba.conf file, just in a different location. It shows the following:

# "local" is for Unix domain socket connections only
local all all trust
# IPv4 local connections:
host all all 127.0.0.1/32 trust
# IPv6 local connections:
host all all ::1/128 trust
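With `trust` entries only for local and loopback connections, a Hive Metastore running on a different node will be refused. The usual fix is an extra `host` line covering the cluster subnet (the 10.0.0.0/24 range below is a placeholder), then a reload. A tiny self-contained illustration of the edit, done on a scratch copy so it is safe to run anywhere:

```shell
set -e
# Scratch copy of a pg_hba.conf-shaped file; on a real system you would
# edit the actual pg_hba.conf in place instead.
HBA=$(mktemp)
cat > "$HBA" <<'EOF'
local all all trust
host  all all 127.0.0.1/32 trust
host  all all ::1/128      trust
EOF

# Allow the other cluster nodes (placeholder subnet) to reach postgres.
# md5 is a safer choice than trust for non-local connections.
echo 'host  all all 10.0.0.0/24  md5' >> "$HBA"

grep -c '^host' "$HBA"
# On the real file, you would then reload: sudo systemctl reload postgresql
```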
10-23-2019
02:04 PM
2019-10-23 20:42:17,691 - Check db_connection_check was unsuccessful. Exit code: 1. Message: ERROR: Unable to connect to the DB. Please check DB connection properties.
org.postgresql.util.PSQLException: Connection refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/custom_actions/scripts/check_host.py", line 546, in <module>
CheckHost().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/custom_actions/scripts/check_host.py", line 207, in actionexecute
raise Fail(error_message)
resource_management.core.exceptions.Fail: Check db_connection_check was unsuccessful. Exit code: 1. Message: ERROR: Unable to connect to the DB. Please check DB connection properties.
org.postgresql.util.PSQLException: Connection refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.

I followed the steps outlined when you select "Existing PostgreSQL database": I used sudo yum install postgresql-jdbc to install the driver package, then ran ambari-server setup --jdbc-db=postgres --jdbc-driver=/usr/share/java/postgresql-jdbc.jar and setup completed successfully. I'm building a test cluster with 3 machines on Azure and cannot set up Hive. Also, I don't seem to have a /var/lib/pgsql folder on my Ambari server like people reference in similar problems on the web, if that makes a difference.
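"Connection refused" from the agent node usually means postgres is only listening on localhost. These are the checks I would run; service names and file locations vary by distro and postgres version, so treat the commands as a sketch, and the angle-bracket values are placeholders:

```shell
# 1. Is postgres listening on a non-loopback address?
sudo ss -tlnp | grep 5432   # 127.0.0.1:5432 only => remote nodes get refused

# 2. What does the server think it should bind to, and where is its config?
sudo -u postgres psql -c 'SHOW listen_addresses;'
sudo -u postgres psql -c 'SHOW config_file;'  # finds postgresql.conf even
                                              # when /var/lib/pgsql is absent

# 3. If needed: set listen_addresses = '*' in postgresql.conf, add a matching
#    host line to pg_hba.conf, restart, then retest from the agent node:
psql -h <ambari-server-host> -U <db-user> -d hive -c 'SELECT 1;'
```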
Labels:
- Apache Ambari
- Apache Hadoop
- Apache Hive