Member since: 02-08-2016
Posts: 33
Kudos Received: 19
Solutions: 3

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1742 | 07-12-2016 08:35 PM |
 | 1186 | 06-27-2016 02:35 PM |
 | 2358 | 06-01-2016 08:06 PM |
11-14-2017
09:44 PM
Below are the custom properties that go hand in hand with H2O Sparkling Water. Use these properties to modify the H2O cluster's nodes, memory, cores, etc.
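As an illustration only, here is a minimal sketch of the kind of Spark-level settings that control the H2O cluster's node count, memory, and cores when Sparkling Water runs in its internal backend (one H2O node per Spark executor). The spark.ext.h2o.nthreads property and the jar name are assumptions; verify them against your Sparkling Water version.

    # Minimal sketch: size the H2O cluster through the Spark executor settings.
    # "your-sparkling-water-app.jar" is a placeholder for your application jar.
    spark-submit \
      --conf spark.executor.instances=4 \
      --conf spark.executor.memory=8g \
      --conf spark.executor.cores=4 \
      --conf spark.ext.h2o.nthreads=4 \
      your-sparkling-water-app.jar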
09-25-2017
05:31 PM
Short Description: Configure Knox to access the Atlas UI

Article: Here are the steps to access the Atlas UI through Knox.

1. Make sure Knox is configured properly and works fine.
2. ssh to the Knox gateway host and go to /var/lib/knox/data-2.6.****/services
3. Create the service-definition directories:
   mkdir -p atlas/0.8.0/
   mkdir -p atlas-api/0.8.0/
4. Download the configurations from the https://github.com/apache/knox/tree/v0.13.0/gateway-service-definitions/src/main/resources/services/atlas-api/0.8.0 URL to /var/lib/knox/data-2.6.***/services/atlas-api/0.8.0/
5. Download the configurations from the https://github.com/apache/knox/tree/v0.13.0/gateway-service-definitions/src/main/resources/services/atlas/0.8.0 URL to /var/lib/knox/data-2.6.***/services/atlas/0.8.0/ (a command sketch for steps 4-6 follows at the end of this post)
6. Change the owner/group to knox for /var/lib/knox/data-2.6.**/services/atlas*/ and its subdirectories.
7. Go to the Knox configurations and modify "Advanced topology" with the service tag below:
   <service>
     <role>ATLAS</role>
     <url>sandbox.hortonworks.com:21000</url>
   </service>
8. Restart the Knox service.
9. You should be able to access the Atlas UI from the URL below:
   https://sandbox.hortonworks.com:8443/gateway/default/atlas/

Please note: at this point in time this is a workaround; Hortonworks doesn't support it yet.
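As a companion to steps 4-6 above, here is a minimal shell sketch, assuming each linked GitHub directory contains a service.xml and a rewrite.xml (check the directory listings before running) and that the Knox data directory name matches your version:

    # Run on the Knox gateway host; "data-2.6.x.y" is a placeholder for your actual data directory.
    cd /var/lib/knox/data-2.6.x.y/services
    mkdir -p atlas/0.8.0 atlas-api/0.8.0
    BASE=https://raw.githubusercontent.com/apache/knox/v0.13.0/gateway-service-definitions/src/main/resources/services
    for svc in atlas atlas-api; do
        for f in service.xml rewrite.xml; do
            wget -O ${svc}/0.8.0/${f} ${BASE}/${svc}/0.8.0/${f}
        done
    done
    # Step 6: make sure Knox owns the new service definitions
    chown -R knox:knox atlas atlas-api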
05-30-2017
08:48 PM
1 Kudo
Issue: When adding the Oozie service to a cluster, the user may sometimes see an error with the simple message that Oozie failed to create a new WAR file, as below, and on top of that the Oozie server won't start.

INFO: Adding extension: /usr/hdp/current/oozie-server/libext/mysql-jdbc-driver.jar
Failed: creating new Oozie WAR

Solution: The Oozie deployment script unzips the existing Oozie WAR file, zips it back up, and places it in a subdirectory of /tmp/... . This unzip-and-zip process doubles the storage space required for the Oozie WAR file.

Make sure there is enough /tmp storage for this process to take place; the logs won't provide this information. A quick check is sketched below.
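A minimal sketch of the check, assuming a standard HDP layout (the WAR location varies, so the sketch just searches for it):

    # Locate the Oozie WAR and note its size (path varies by HDP version/layout)
    find /usr/hdp -name "oozie*.war" -exec ls -lh {} \; 2>/dev/null
    # Check free space in /tmp -- the prepare-war step needs roughly 2x the WAR size there
    df -h /tmp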
05-09-2017
09:17 PM
Resolution/Workaround:
- Clear any value assigned to the Hive Configuration Resources property in the PutHiveStreaming processor. (With no site.xml files provided, NiFi will use the site.xml files that are loaded in the classpath).
- To load the site.xml files (core-site.xml, hdfs-site.xml, and hive-site.xml) on NiFi's classpath, place them in NiFi's conf directory (for Ambari-based installs that would be /etc/nifi/conf); see the sketch after these steps.
- Restart NiFi.
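A minimal command sketch of the steps above, assuming a typical Ambari-managed layout; the source paths for the Hadoop and Hive client configs and the NiFi install path are assumptions, so adjust them to your cluster:

    # Copy the client site.xml files onto NiFi's classpath (assumed source paths)
    cp /etc/hadoop/conf/core-site.xml /etc/hadoop/conf/hdfs-site.xml /etc/nifi/conf/
    cp /etc/hive/conf/hive-site.xml /etc/nifi/conf/
    # Restart NiFi -- through Ambari, or directly with the NiFi control script (assumed path)
    /usr/hdf/current/nifi/bin/nifi.sh restart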
07-12-2016
08:35 PM
Thanks all for your comments. It looks like the earlier upgrade missed the step below:

ambari-server stop
ambari-server upgradestack HDP-2.3
ambari-server start

This was evident from the Ambari database tables; while finalizing the earlier upgrade, it had errored out.
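As a rough illustration of that kind of database check, here is a minimal sketch, assuming the default embedded PostgreSQL "ambari" database and the repo_version / cluster_version tables of Ambari 2.2.x; the table and column names are assumptions, so adjust them to your Ambari version and database:

    # Inspect registered stack versions and their state in the Ambari DB (assumed schema)
    psql -U ambari ambari -c "SELECT version, display_name FROM repo_version;"
    psql -U ambari ambari -c "SELECT repo_version_id, state FROM cluster_version;"
    # A version left in an UPGRADING/UPGRADE_FAILED state rather than CURRENT would
    # indicate the earlier upgrade was never finalized.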
07-10-2016
05:01 PM
Yes, we upgraded the current prod cluster from Ambari 2.2.1.1 to 2.2.2, and we are trying to upgrade HDP from 2.3.4.7 to 2.4.2. We were able to perform the same upgrade in the pre-prod environment without issues.
07-10-2016
09:04 AM
Registering the new HDP version during the HDP upgrade from 2.3.4.7 to 2.4.2.0 is failing with the error message below: "An internal system exception occurred: Stack HDP-2.4 doesn't have upgrade packages".

[qtp-ambari-client-33] BaseManagementHandler:57 - Caught a system exception while attempting to create a resource: An internal system exception occurred: Stack HDP-2.4 doesn't have upgrade packages
org.apache.ambari.server.controller.spi.SystemException: An internal system exception occurred: Stack HDP-2.4 doesn't have upgrade packages
at org.apache.ambari.server.controller.internal.AbstractResourceProvider.createResources(AbstractResourceProvider.java:282)
at org.apache.ambari.server.controller.internal.RepositoryVersionResourceProvider.createResources(RepositoryVersionResourceProvider.java:153)
at org.apache.ambari.server.controller.internal.ClusterControllerImpl.createResources(ClusterControllerImpl.java:289)
at org.apache.ambari.server.api.services.persistence.PersistenceManagerImpl.create(PersistenceManagerImpl.java:76)
at org.apache.ambari.server.api.handlers.CreateHandler.persist(CreateHandler.java:36)
at org.apache.ambari.server.api.handlers.BaseManagementHandler.handleRequest(BaseManagementHandler.java:72)
at org.apache.ambari.server.api.services.BaseRequest.process(BaseRequest.java:135)
at org.apache.ambari.server.api.services.BaseService.handleRequest(BaseService.java:106)
at org.apache.ambari.server.api.services.BaseService.handleRequest(BaseService.java:75)
at org.apache.ambari.server.api.services.RepositoryVersionService.createRepositoryVersion(RepositoryVersionService.java:98)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
Labels: Apache Ambari
06-27-2016
02:35 PM
Since Storm was never used in the cluster, the customer didn't want to spend time researching it, so the service was removed.
06-06-2016
06:40 PM
Please refer to http://docs.hortonworks.com/HDPDocuments/Ambari-2.1.2.0/bk_ambari_views_guide/content/_reverse_proxy_views.html
06-01-2016
08:06 PM
3 Kudos
Solved the issue by updating mapred.admin.user.env, since the cluster had been upgraded from HDP 2.1 to HDP 2.3 (a sketch of the typical value is below).
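For illustration, a minimal sketch of the kind of value involved, assuming the property is mapreduce.admin.user.env in mapred-site.xml and the stock HDP ${hdp.version}-based native-library paths; verify against the HDP 2.3 stack defaults:

    # Assumed mapred-site.xml property and stock HDP value:
    #   mapreduce.admin.user.env =
    #     LD_LIBRARY_PATH=/usr/hdp/${hdp.version}/hadoop/lib/native:/usr/hdp/${hdp.version}/hadoop/lib/native/Linux-amd64-64
    # Quick check that the native libraries actually exist on a worker node:
    ls /usr/hdp/current/hadoop-client/lib/native/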