Member since: 07-30-2019
Posts: 453
Kudos Received: 112
Solutions: 80
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2398 | 04-12-2023 08:58 PM |
| | 4975 | 04-04-2023 11:48 PM |
| | 1592 | 04-02-2023 10:24 PM |
| | 3487 | 07-05-2019 08:38 AM |
| | 3402 | 05-13-2019 06:21 AM |
12-17-2018
03:35 PM
Hi @Qureshi F, It looks like your public_host_name is longer than 255 characters, which is the maximum Ambari allows. Can you run `hostname -f` on the host where the NameNode is installed and check the response? If it is not the hostname it is supposed to be, please correct it and restart ambari-agent. Hope this helps!
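To check the length quickly, a minimal sketch (assumes a POSIX shell on the NameNode host; 255 is the Ambari limit mentioned above):

```shell
# Print the FQDN the agent will report and its length;
# Ambari caps public_host_name at 255 characters.
fqdn=$(hostname -f)
echo "FQDN: ${fqdn} (${#fqdn} characters)"
if [ "${#fqdn}" -gt 255 ]; then
  echo "Hostname too long: correct it, then restart ambari-agent"
fi
```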
12-14-2018
02:43 PM
Hi @Kishore Jannu, It's better to create a new thread for this one; the original question was about a very old Ambari version. When you raise the new thread, please post the exception in code format.
12-13-2018
06:41 PM
1 Kudo
Hi @Michael Bronson, We don't have a blog or any other place that tracks beta Hadoop releases, but we do have a blog at https://hortonworks.com/blog/ where you can learn about the features of existing products and details about new releases. IMHO, I don't see any of our versions supporting RHEL 7.6 right now. It's always better to stay on a supported operating system.
12-13-2018
05:42 PM
Hi @Bill Ferris, If you cannot install the MySQL connector JAR via the yum command, please download it from https://dev.mysql.com/downloads/connector/j/. Please log in and accept the answer as helpful if it worked for you.
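A hypothetical fetch-and-install sketch for when yum is unavailable; the version number, download URL path, and archive layout below are assumptions, so check the download page for the actual filename:

```shell
# Download and unpack Connector/J, then copy the JAR to the standard
# location Ambari looks in. VER is an assumed example version.
VER=5.1.47
curl -LO "https://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-${VER}.tar.gz"
tar -xzf "mysql-connector-java-${VER}.tar.gz"
cp "mysql-connector-java-${VER}/mysql-connector-java-${VER}.jar" /usr/share/java/
```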
12-13-2018
05:40 PM
Hi @Michael Bronson, Yeah, that's right. The Hortonworks QE team has certified the above versions up to RHEL 7.5 at most. I wouldn't suggest using Red Hat 7.6, as you may run into unforeseen problems 🙂 Please log in and accept the original answer if it's helpful.
12-13-2018
04:23 PM
1 Kudo
Hi @Michael Bronson, The Hortonworks supportability matrix, including supported operating systems, is listed at https://supportmatrix.hortonworks.com. You can use this site to clarify your doubts about supportability. Please accept this answer if it resolves your query.
12-13-2018
04:18 PM
Hi @Bill Ferris, From the logs I see you are hitting this exception:

```
resource_management.core.exceptions.Fail: Failed to download file from http://ambari.hadoop.uom.gr:8080/resources/mysql-connector-java.jar due to HTTP error: HTTP Error 404: Not Found
```

Can you please try the following on the ambari-server host and see if it helps?

```
yum install mysql-connector-java -y
```

Or, if you are downloading the mysql-connector-java JAR from a tar.gz archive, please make sure to place it under /usr/share/java and create a symlink pointing to your JAR. You should then see something like the following:

```
# ls -l /usr/share/java/mysql-connector-java.jar
lrwxrwxrwx 1 root root 31 Apr 19 2017 /usr/share/java/mysql-connector-java.jar -> mysql-connector-java-5.1.17.jar
```

(See https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.1.0/bk_ambari-administration/content/using_hive_with_mysql.html.) Then register the driver so Ambari knows how to find the JAR; afterwards it will appear under the server's resources directory:

```
# ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
# ls -l /var/lib/ambari-server/resources/mysql-connector-java.jar
-rw-r--r-- 1 root root 819803 Sep 28 19:52 /var/lib/ambari-server/resources/mysql-connector-java.jar
```

Please accept this answer if you found it helpful.
12-11-2018
04:08 PM
Hi @YASHPAL SINGH, What exact issue are you facing? Can you please post the whole stack trace in code format? Also mention your Ambari version (from the command `ambari-server --version`), and whether you are running ambari-server/ambari-agent as the root user.
12-11-2018
01:54 PM
Hi @haco fayik, I couldn't tell whether you were able to resolve the issue with your previous answer (please accept the answer if it did). If not, can you check whether your HCP mpack is listed under /var/lib/ambari-server/resources/mpacks (`ls -l`), and confirm that the HCP mpack is still there and that you have restarted ambari-server? For your second question about a single node, you can refer to the memory and OS requirements here: https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.7.1/prepare-to-install/content/infrastructure_requirements.html and https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.7.1/prepare-to-install/content/preparing_to_install.html
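The check described above as a short sketch (the resources path is from the reply; the exact mpack directory name is an assumption and will vary by version):

```shell
# List the management packs Ambari currently knows about;
# look for an HCP entry in the output.
ls -l /var/lib/ambari-server/resources/mpacks
# If the HCP mpack directory is present, restart the server so it is picked up.
ambari-server restart
```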
12-11-2018
01:21 PM
1 Kudo
Hi @haco fayik, Have you installed the HCP 1.7.1 mpack in Ambari following the steps at https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.7.1/ambari-install/content/installing_hcp_ambari_management_pack.html? If you are installing HCP via Ambari, please carefully follow each step in https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.7.1/ambari-install/content/installing_hcp_on_an_ambari-managed_cluster_using_ambari.html