Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2121 | 11-01-2016 05:43 PM |
| | 6427 | 11-01-2016 05:36 PM |
| | 4110 | 07-01-2016 03:20 PM |
| | 7046 | 05-25-2016 11:36 AM |
| | 3422 | 05-24-2016 05:27 PM |
09-30-2024
07:40 AM
RPC communication from the non-secure cluster is blocked, so use the webhdfs protocol instead: hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true webhdfs://nn1:50070/foo/bar hdfs://nn2:8020/bar/foo
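A minimal sketch of the same invocation, formatted for readability; nn1 (webhdfs, port 50070) stands in for the non-secure source NameNode and nn2 (RPC, port 8020) for the destination, so adjust hosts, ports, and paths to match your clusters.

```bash
# Run distcp with fallback to simple authentication allowed,
# reading the source over webhdfs and writing to the destination over HDFS RPC.
hadoop distcp \
  -D ipc.client.fallback-to-simple-auth-allowed=true \
  webhdfs://nn1:50070/foo/bar \
  hdfs://nn2:8020/bar/foo
```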
09-25-2024
02:08 PM
@kiki123 As this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post. Thanks.
12-02-2023
06:53 PM
Thank you very much! This solution helped me a lot; I was now able to complete my task. Blessings.
08-18-2023
02:29 AM
Hi, I agree with the solution; however, use tail -f /var/log/ambari-server/ambari-server.log to watch the logs while you try to start the service from the UI. This will give you real-time feedback.
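As a small illustration, assuming the default Ambari Server log location mentioned above, the grep filter here is optional and simply narrows the stream to lines that usually signal a failure:

```bash
# Follow the Ambari Server log in real time while the service is started from the UI;
# the grep keeps only lines that typically indicate a problem.
tail -f /var/log/ambari-server/ambari-server.log | grep -iE 'error|exception'
```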
08-10-2023
12:34 AM
Hi Team, can anyone help resolve this issue? The ResourceManager went down, and because of this I am not able to start it. Error: Service did not start successfully; not all of the required roles started: only 23/25 roles started. Reasons: Service has only 0 ResourceManager roles running instead of minimum required 1.
12-21-2022
05:15 AM
@aravinth53, as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.
02-16-2022
06:47 PM
1 Kudo
This answer may not help the person who asked, but I still wanted to post it, as somebody else could benefit from it. There are multiple ways to solve this, which are explained in this article: https://sparkbyexamples.com/hadoop/hadoop-unable-to-load-native-hadoop-library-for-your-platform-warning/ For me, setting the environment variable below solved the problem: export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native (or, if you have the Hadoop library installed at /usr/lib/: export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native). Thanks.
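As a follow-up, a small sketch of how one might apply the setting and then confirm the native library is picked up, assuming HADOOP_HOME is set and the hadoop CLI is on the PATH; `hadoop checknative -a` reports which native components were loaded.

```bash
# Point the dynamic loader at Hadoop's native libraries
# (use whichever path matches your installation).
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
# export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native

# Verify: prints whether native hadoop, zlib, snappy, etc. were found.
hadoop checknative -a
```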
11-16-2021
06:43 PM
The default username/password for Ambari is maria_dev / maria_dev.
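A quick way to check these credentials from the command line, assuming Ambari Server is reachable on its default port 8080 (adjust the host and port for your environment):

```bash
# Call the Ambari REST API with the default credentials;
# a JSON listing of clusters indicates the login works.
curl -u maria_dev:maria_dev http://localhost:8080/api/v1/clusters
```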
11-11-2021
07:40 AM
I have downloaded the Nagios .ova file, but it is still not working; it keeps displaying the same output.
10-19-2021
07:22 PM
Thanks for the Solution