Created 12-02-2016 07:45 PM
Trying to install Ambari on my local CentOS 7 machine.
I have followed the Hortonworks documentation step by step.
When I run the command
ambari-server start
it gives me the below error.
Starting ambari-server
Ambari Server running with administrator privileges.
Organizing resource files at /var/lib/ambari-server/resources...
Server PID at: /var/run/ambari-server/ambari-server.pid
Server out at: /var/log/ambari-server/ambari-server.out
Server log at: /var/log/ambari-server/ambari-server.log
Waiting for server start.........
ERROR: Exiting with exit code -1.
REASON: Ambari Server java process died with exitcode 255. Check /var/log/ambari-server/ambari-server.out for more information.
I have checked the /var/log/ambari-server/ambari-server.out file; it contains:
[EL Warning]: metadata: 2016-12-02 12:53:02.301--ServerSession(799570413)--The reference column name [resource_type_id] mapped on the element [field permissions] does not correspond to a valid id or basic field/column on the mapping reference. Will use referenced column name as provided.
I have also checked the logs in the /var/log/ambari-server/ambari-server.log file.
It contains:
02 Dec 2016 12:53:00,195 INFO [main] ControllerModule:185 - Detected POSTGRES as the database type from the JDBC URL
02 Dec 2016 12:53:00,643 INFO [main] ControllerModule:558 - Binding and registering notification dispatcher class org.apache.ambari.server.notifications.dispatchers.AlertScriptDispatcher
02 Dec 2016 12:53:00,647 INFO [main] ControllerModule:558 - Binding and registering notification dispatcher class org.apache.ambari.server.notifications.dispatchers.EmailDispatcher
02 Dec 2016 12:53:00,684 INFO [main] ControllerModule:558 - Binding and registering notification dispatcher class org.apache.ambari.server.notifications.dispatchers.SNMPDispatcher
02 Dec 2016 12:53:01,911 INFO [main] AmbariServer:705 - Getting the controller
02 Dec 2016 12:53:02,614 INFO [main] StackManager:107 - Initializing the stack manager...
02 Dec 2016 12:53:02,614 INFO [main] StackManager:267 - Validating stack directory /var/lib/ambari-server/resources/stacks ...
02 Dec 2016 12:53:02,614 INFO [main] StackManager:243 - Validating common services directory /var/lib/ambari-server/resources/common-services ...
02 Dec 2016 12:53:02,888 ERROR [main] AmbariServer:717 - Failed to run the Ambari Server
com.google.inject.ProvisionException: Guice provision errors:
1) Error injecting constructor, org.apache.ambari.server.AmbariException: Stack Definition Service at '/var/lib/ambari-server/resources/common-services/HAWQ/2.0.0/metainfo.xml' doesn't contain a metainfo.xml file
  at org.apache.ambari.server.stack.StackManager.<init>(StackManager.java:105)
  while locating org.apache.ambari.server.stack.StackManager annotated with interface com.google.inject.assistedinject.Assisted
  at org.apache.ambari.server.api.services.AmbariMetaInfo.init(AmbariMetaInfo.java:242)
  at org.apache.ambari.server.api.services.AmbariMetaInfo.class(AmbariMetaInfo.java:124)
  while locating org.apache.ambari.server.api.services.AmbariMetaInfo
    for field at org.apache.ambari.server.controller.AmbariServer.ambariMetaInfo(AmbariServer.java:138)
  at org.apache.ambari.server.controller.AmbariServer.class(AmbariServer.java:138)
  while locating org.apache.ambari.server.controller.AmbariServer
1 error
  at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:987)
  at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1013)
  at org.apache.ambari.server.controller.AmbariServer.main(AmbariServer.java:710)
Caused by: org.apache.ambari.server.AmbariException: Stack Definition Service at '/var/lib/ambari-server/resources/common-services/HAWQ/2.0.0/metainfo.xml' doesn't contain a metainfo.xml file
  at org.apache.ambari.server.stack.ServiceDirectory.parseMetaInfoFile(ServiceDirectory.java:209)
  at org.apache.ambari.server.stack.CommonServiceDirectory.parsePath(CommonServiceDirectory.java:71)
  at org.apache.ambari.server.stack.ServiceDirectory.<init>(ServiceDirectory.java:106)
  at org.apache.ambari.server.stack.CommonServiceDirectory.<init>(CommonServiceDirectory.java:43)
  at org.apache.ambari.server.stack.StackManager.parseCommonServicesDirectory(StackManager.java:301)
  at org.apache.ambari.server.stack.StackManager.<init>(StackManager.java:115)
  at org.apache.ambari.server.stack.StackManager$$FastClassByGuice$$33e4ffe0.newInstance(<generated>)
  at com.google.inject.internal.cglib.reflect.$FastConstructor.newInstance(FastConstructor.java:40)
  at com.google.inject.internal.DefaultConstructionProxyFactory$1.newInstance(DefaultConstructionProxyFactory.java:60)
  at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
  at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
  at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978)
  at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
  at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974)
  at com.google.inject.assistedinject.FactoryProvider2.invoke(FactoryProvider2.java:632)
  at com.sun.proxy.$Proxy25.create(Unknown Source)
  at org.apache.ambari.server.api.services.AmbariMetaInfo.init(AmbariMetaInfo.java:246)
  at org.apache.ambari.server.api.services.AmbariMetaInfo$$FastClassByGuice$$202844bc.invoke(<generated>)
  at com.google.inject.internal.cglib.reflect.$FastMethod.invoke(FastMethod.java:53)
  at com.google.inject.internal.SingleMethodInjector$1.invoke(SingleMethodInjector.java:56)
  at com.google.inject.internal.SingleMethodInjector.inject(SingleMethodInjector.java:90)
  at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:110)
  at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:94)
  at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
  at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
  at com.google.inject.Scopes$1$1.get(Scopes.java:65)
  at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
  at com.google.inject.internal.SingleFieldInjector.inject(SingleFieldInjector.java:53)
  at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:110)
  at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:94)
  at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:254)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
  at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
  at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
  at com.google.inject.Scopes$1$1.get(Scopes.java:65)
  at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
  at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978)
  at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1024)
  at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974)
  ... 2 more
Please suggest a solution.
Mohan.V
Created 12-03-2016 08:51 AM
Thanks for the suggestion, jss.
But it didn't solve the issue completely.
I moved those files into a temp directory and tried to start the server again, but now it gave another error:
ERROR: Exiting with exit code -1. REASON: Ambari Server java process died with exitcode 255. Check /var/log/ambari-server/ambari-server.out for more information.
When I checked the logs, I found that the current database version is not compatible with the server.
Then I tried these steps:
wget -O /etc/yum.repos.d/ambari.repo http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.2.1.0/ambari.repo
yum install ambari-server -y
ambari-server setup -y
wget -O /etc/yum.repos.d/ambari.repo http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.2.1.1/ambari.repo
yum upgrade ambari-server -y
ambari-server upgrade
ambari-server start
After I ran these commands the Ambari server did start, but then something surprising happened.
Actually, I had removed Ambari completely and was trying to reinstall it.
When I completed all the above steps and opened the Ambari UI, it was again pointing to the same host which I had removed previously, shown with a lost heartbeat. I was shocked to see that.
Then I realised that the ambari-agent was not installed,
so I installed the ambari-agent and started it:
yum -y install ambari-agent
ambari-agent start
Then, when I tried to start the services, it didn't work.
I checked at the command prompt whether those services still exist, by entering zookeeper, but that command was not found, because the service is not installed on my host.
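The presence check described above can be scripted; the binary names below are only illustrations (the thread tried plain zookeeper, and actual client names vary by stack version), and the rpm query applies to RPM-based systems such as CentOS:

```shell
# Probe the PATH for likely ZooKeeper commands; names are guesses, not a
# definitive list for any particular HDP version.
for cmd in zookeeper zookeeper-server zookeeper-client; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd found on PATH"
  else
    echo "$cmd not found on PATH"
  fi
done

# On an RPM-based system such as CentOS, the package database is another check.
if command -v rpm >/dev/null 2>&1; then
  rpm -qa | grep -i zookeeper || echo "no zookeeper packages installed"
fi
```

If none of the probes find anything, the service binaries really are gone from the host, which matches what Ambari was reporting.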
So I started to remove the services, which were present on the host in a dead state, using this command:
curl -u admin:admin -H "X-Requested-By: Ambari" -X DELETE http://localhost:8080/api/v1/clusters/hostname/services/servicename
But it didn't work; I got an error message:
"message" : "CSRF protection is turned on. X-Requested-By HTTP header is required."
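One common cause of this CSRF error, even when the header looks present, is that the header was pasted with curly quotes (“ ”) instead of plain ASCII quotes, so the shell never sends a well-formed X-Requested-By header. With the header quoted correctly, the DELETE can work without disabling CSRF protection. A sketch with placeholder cluster/service names (the fallback echo keeps it runnable even when no Ambari server is listening):

```shell
# Placeholders: mycluster and ZOOKEEPER are examples, not values from a real
# setup. Note the plain single quotes around the header.
curl -s -u admin:admin -H 'X-Requested-By: ambari' -X DELETE \
  "http://localhost:8080/api/v1/clusters/mycluster/services/ZOOKEEPER" \
  || echo "request failed: is ambari-server running on localhost:8080?"
```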
Then I edited the ambari.properties file, added this property, and restarted the server:
vi /etc/ambari-server/conf/ambari.properties
api.csrfPrevention.enabled=false
ambari-server restart
Then I retried it, and this time it worked.
But when I tried to remove Hive it didn't work, because MySQL was running on my machine.
When I tried this command, it worked:
curl -u admin:admin -X DELETE -H 'X-Requested-By:admin' http://localhost:8080/api/v1/clusters/mycluster/hosts/host/host_components/MYSQL_SERVER
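For reference, the reason the Hive removal stalled is likely that a running component cannot be deleted; in Ambari's REST API a host component generally has to be moved to the INSTALLED (stopped) state first. A hedged sketch of that two-step sequence; all names and credentials are placeholders, and the exact JSON body may differ between Ambari versions:

```shell
# Placeholders: mycluster and host1.example.com are examples only.
BASE=http://localhost:8080/api/v1/clusters/mycluster
HDR='X-Requested-By: ambari'

# 1) Stop the component by putting it into state INSTALLED.
curl -s -u admin:admin -H "$HDR" -X PUT \
  -d '{"HostRoles":{"state":"INSTALLED"}}' \
  "$BASE/hosts/host1.example.com/host_components/MYSQL_SERVER" \
  || echo "stop request failed: is ambari-server running?"

# 2) Then the DELETE should be accepted.
curl -s -u admin:admin -H "$HDR" -X DELETE \
  "$BASE/hosts/host1.example.com/host_components/MYSQL_SERVER" \
  || echo "delete request failed: is ambari-server running?"
```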
Then, when I tried to add the services back, starting with ZooKeeper, it gave me an error like:
"resource_management.core.exceptions.Fail: Applying Directory['/usr/hdp/current/zookeeper-client/conf'] failed, looped symbolic links found while resolving /usr/hdp/current/zookeeper-client/con
Then I checked the directories and got to know that these links were pointing back to the same directories (a symlink loop).
So I tried these commands to solve the issue:
rm /usr/hdp/current/zookeeper-client/conf
ln -s /etc/zookeeper/2.3.2.0-2950/0 /usr/hdp/current/zookeeper-client/conf
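The diagnosis and fix above can be reproduced safely with a throwaway self-referential link; /tmp/symloop-demo here is a scratch path, not the real /usr/hdp location:

```shell
# Reproduce the symptom: a symlink that points at itself, like the looped
# /usr/hdp/current/zookeeper-client/conf in the error message.
mkdir -p /tmp/symloop-demo
cd /tmp/symloop-demo
ln -sfn loop loop    # 'loop' now points at itself

# readlink -f fails on a looped link, a quick way to confirm the diagnosis.
if ! readlink -f loop >/dev/null 2>&1; then
  echo "looped symlink detected"
fi

# The fix mirrors the thread: remove the bad link and recreate it against a
# real directory.
mkdir -p /tmp/symloop-demo/real-conf
rm loop
ln -s /tmp/symloop-demo/real-conf loop
readlink -f loop
```

On a real node the target of the recreated link must be the versioned config directory for the installed stack, which is why the thread points it at /etc/zookeeper/2.3.2.0-2950/0.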
And it worked.
At last, I successfully reinstalled Ambari as well as Hadoop on my machine.
Thank you.
Created 12-02-2016 08:03 PM
Regarding the error:
Caused by: org.apache.ambari.server.AmbariException: Stack Definition Service at '/var/lib/ambari-server/resources/common-services/HAWQ/2.0.0/metainfo.xml' doesn't contain a metainfo.xml
Do you see "HAWQ" and "PXF" directories present inside "/var/lib/ambari-server/resources/common-services"?
If yes, then you should move them to some "/tmp" directory, restart the Ambari server, and try again.
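A sketch of that workaround; the commands below use a throwaway copy of the directory layout so they are safe to try anywhere, and the version subdirectories (2.0.0, 3.0.0) are only examples. On a real server the base path is /var/lib/ambari-server/resources/common-services:

```shell
# Throwaway stand-in for the real common-services tree.
COMMON_SERVICES=/tmp/demo/var/lib/ambari-server/resources/common-services
mkdir -p "$COMMON_SERVICES/HAWQ/2.0.0" "$COMMON_SERVICES/PXF/3.0.0"
mkdir -p /tmp/demo/backup

# Move the problem service definitions aside instead of deleting them,
# so they can be restored later if needed.
for svc in HAWQ PXF; do
  if [ -d "$COMMON_SERVICES/$svc" ]; then
    mv "$COMMON_SERVICES/$svc" /tmp/demo/backup/
  fi
done

ls /tmp/demo/backup
# On the real server, follow up with: ambari-server restart
```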
I have seen this kind of error on old Ambari servers. Can you please let us know your Ambari version?