Member since
06-24-2019
34
Posts
0
Kudos Received
0
Solutions
11-08-2021
11:10 PM
Hi, is there a way to configure multiple LDAP servers (LDAP HA) in hive-site.xml? As of now we have configured only one LDAP server using the hive.server2.authentication.ldap.url property:
<property>
  <name>hive.server2.authentication.ldap.url</name>
  <value>LDAP_URL</value>
</property>
Our requirement is to configure two LDAP servers to provide HA. Thanks
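As a side note, recent Hive releases allow hive.server2.authentication.ldap.url to carry a space-separated list of LDAP URLs that HiveServer2 tries in order, which is the usual way to get LDAP failover without an external load balancer. The snippet below is only a sketch: the ldap1/ldap2 hostnames are placeholders, and multi-URL support should be confirmed against your Hive version.
<!-- Sketch: two LDAP servers, space-separated; hostnames are placeholders -->
<property>
  <name>hive.server2.authentication.ldap.url</name>
  <value>ldap://ldap1.example.com:389 ldap://ldap2.example.com:389</value>
</property>
If your Hive build supports this, the second URL acts as a fallback when the first server is unreachable.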
Labels:
- Apache Ambari
- Apache Hive
08-09-2020
10:38 PM
1) From Ambari-server logs i can see following error/exception/warning: I have also attached the full log 2020-08-10 06:15:48,174 INFO [Stack Version Loading Thread] RepoUrlInfoCallable:94 - Loading latest URL info from http://s3.amazonaws.com/dev.hortonworks.com/HDP/hdp_urlinfo.json for stacks HDP-2.3.GlusterFS,HDP-2.3.ECS,HDP-2.6 2020-08-10 06:15:48,174 INFO [Stack Version Loading Thread] RepoUrlInfoCallable:94 - Loading latest URL info from http://public-repo-1.hortonworks.com/HDP/hdp_urlinfo.json for stacks HDP-2.5,HDP-2.3,HDP-2.1.GlusterFS,HDP-2.4,HDP-2.1,HDP-2.2,HDP-2.0 2020-08-10 06:15:48,339 INFO [Stack Version Loading Thread] RepoUrlInfoCallable:110 - Loaded URI http://public-repo-1.hortonworks.com/HDP/hdp_urlinfo.json for stacks HDP-2.5,HDP-2.3,HDP-2.1.GlusterFS,HDP-2.4,HDP-2.1,HDP-2.2,HDP-2.0 in 167ms 2020-08-10 06:15:48,943 ERROR [Stack Version Loading Thread] URLStreamProvider:245 - Received HTTP 403 response from URL: http://s3.amazonaws.com/dev.hortonworks.com/HDP/hdp_urlinfo.json 2020-08-10 06:15:48,943 INFO [Stack Version Loading Thread] RepoUrlInfoCallable:107 - Could not load the URI from http://s3.amazonaws.com/dev.hortonworks.com/HDP/hdp_urlinfo.json, stack defaults will be used 2020-08-10 06:15:48,943 INFO [Stack Version Loading Thread] RepoUrlInfoCallable:110 - Loaded URI http://s3.amazonaws.com/dev.hortonworks.com/HDP/hdp_urlinfo.json for stacks HDP-2.3.GlusterFS,HDP-2.3.ECS,HDP-2.6 in 772ms 2020-08-10 06:15:48,944 INFO [main] StackContext:180 - Loaded urlinfo in 777ms 2020-08-10 06:15:48,945 ERROR [main] StackContext:189 - Could not load repo results java.io.IOException: Server returned HTTP response code: 403 for URL: http://s3.amazonaws.com/dev.hortonworks.com/HDP/hdp_urlinfo.json at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1950) at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1945) at java.security.AccessController.doPrivileged(Native Method) at sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1944) at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1514) at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498) at org.apache.ambari.server.controller.internal.URLStreamProvider.readFrom(URLStreamProvider.java:116) at org.apache.ambari.server.controller.internal.URLStreamProvider.readFrom(URLStreamProvider.java:121) at org.apache.ambari.server.state.stack.RepoUrlInfoCallable.call(RepoUrlInfoCallable.java:97) at org.apache.ambari.server.state.stack.RepoUrlInfoCallable.call(RepoUrlInfoCallable.java:50) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.io.IOException: Server returned HTTP response code: 403 for URL: http://s3.amazonaws.com/dev.hortonworks.com/HDP/hdp_urlinfo.json at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1900) at 
sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1498) at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480) at org.apache.ambari.server.controller.internal.URLStreamProvider.processURL(URLStreamProvider.java:218) at org.apache.ambari.server.controller.internal.URLStreamProvider.processURL(URLStreamProvider.java:142) ... 8 more 2020-08-10 06:15:50,050 INFO [main] PasswordUtils:175 - Credential provider creation failed org.apache.ambari.server.AmbariException: Master key initialization failed. at org.apache.ambari.server.security.encryption.CredentialProvider.<init>(CredentialProvider.java:63) at org.apache.ambari.server.utils.PasswordUtils.loadCredentialProvider(PasswordUtils.java:173) at org.apache.ambari.server.utils.PasswordUtils.readPasswordFromStore(PasswordUtils.java:148) at org.apache.ambari.server.configuration.ComponentSSLConfiguration.getPassword(ComponentSSLConfiguration.java:121) at org.apache.ambari.server.configuration.ComponentSSLConfiguration.init(ComponentSSLConfiguration.java:63) at org.apache.ambari.server.controller.AmbariServer.main(AmbariServer.java:1107) 2020-08-10 06:15:50,425 INFO [main] AmbariManagementControllerImpl:427 - Initializing the AmbariManagementControllerImpl 2020-08-10 06:15:50,508 INFO [main] SingleFileWatch:78 - Starting SingleFileWatcher:ambari.properties 2020-08-10 06:15:50,512 INFO [main] AmbariServer:786 - Jetty is configuring ambari-client-thread with 2 reserved acceptors/selectors and a total pool size of 25 for 8 processors. 2020-08-10 06:15:50,539 INFO [main] AmbariServer:786 - Jetty is configuring qtp-ambari-agent with 4 reserved acceptors/selectors and a total pool size of 25 for 8 processors. 2020-08-10 06:15:50,614 INFO [main] ClustersImpl:281 - Initializing cluster and host data. 2020-08-10 06:15:50,653 INFO [main] ClassPathXmlApplicationContext:583 - Refreshing org.springframework.context.support.ClassPathXmlApplicationContext@51df2a41: startup date [Mon Aug 10 06:15:50 CEST 2020]; root of context hierarchy 2020-08-10 06:15:50,908 INFO [main] CertificateManager:75 - Initialization of root certificate 2020-08-10 06:15:50,908 INFO [main] CertificateManager:77 - Certificate exists:true 2020-08-10 06:15:50,971 INFO [main] ViewRegistry:534 - Triggering loading of [ALL] views 2020-08-10 06:15:50,984 INFO [main] ViewRegistry:1814 - Reading view archive /var/lib/ambari-server/resources/views/ambari-admin-2.7.3.0.0.jar. 2020-08-10 06:15:51,092 INFO [main] ViewRegistry:1850 - View deployed: ADMIN_VIEW{2.7.3.0}. 
2020-08-10 06:15:51,128 INFO [main] HeartbeatProcessor:160 - **** Starting heartbeats processing threads **** 2020-08-10 06:15:51,130 INFO [main] AmbariServer:496 - ********** Started Heartbeat handler ********** 2020-08-10 06:15:51,131 INFO [main] AmbariServer:535 - ********* Initializing Clusters ********** 2020-08-10 06:15:51,131 INFO [main] AmbariServer:541 - ********* Current Clusters State ********* 2020-08-10 06:15:51,132 INFO [main] AmbariServer:542 - 2020-08-10 06:15:51,132 INFO [main] AmbariServer:544 - ********* Reconciling Alert Definitions ********** 2020-08-10 06:15:51,132 INFO [main] AmbariServer:547 - ********* Initializing ActionManager ********** 2020-08-10 06:15:51,132 INFO [main] AmbariServer:550 - ********* Initializing Controller ********** 2020-08-10 06:15:51,132 INFO [main] AmbariServer:554 - ********* Initializing Scheduled Request Manager ********** 2020-08-10 06:15:51,397 INFO [main] ContextLoader:304 - Root WebApplicationContext: initialization started 2020-08-10 06:15:51,398 INFO [main] AnnotationConfigWebApplicationContext:583 - Refreshing Root WebApplicationContext: startup date [Mon Aug 10 06:15:51 CEST 2020]; parent: org.springframework.context.support.ClassPathXmlApplicationContext@51df2a41 2020-08-10 06:15:51,425 INFO [main] AnnotationConfigWebApplicationContext:208 - Registering annotated classes: [class org.apache.ambari.server.configuration.spring.ApiSecurityConfig] 2020-08-10 06:15:51,961 INFO [main] AutowiredAnnotationBeanPostProcessor:155 - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring 2020-08-10 06:15:52,776 INFO [main] DefaultSecurityFilterChain:43 - Creating filter chain: org.springframework.security.web.util.matcher.AnyRequestMatcher@1, [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@6292c63e, org.springframework.security.web.context.SecurityContextPersistenceFilter@232b3b4c, org.springframework.security.web.header.HeaderWriterFilter@6fdd3382, org.springframework.security.web.authentication.logout.LogoutFilter@13dc383b, org.apache.ambari.server.security.authorization.AmbariUserAuthorizationFilter@65bb4cb9, org.apache.ambari.server.security.authentication.AmbariDelegatingAuthenticationFilter@7b33deed, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@39296cef, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@64e3bc2, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@7fd32c56, org.springframework.security.web.session.SessionManagementFilter@4fea095b, org.springframework.security.web.access.ExceptionTranslationFilter@58e7f930, org.apache.ambari.server.security.authorization.AmbariAuthorizationFilter@b04a6a4, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@472dbaf5] 2020-08-10 06:15:52,846 INFO [main] ContextLoader:344 - Root WebApplicationContext: initialization completed in 1449 ms 2020-08-10 06:15:52,847 WARN [main] DeprecationWarning:43 - Using @Deprecated Class org.eclipse.jetty.servlets.GzipFilter 2020-08-10 06:15:52,847 WARN [main] DeprecationWarning:43 - Using @Deprecated Class org.eclipse.jetty.servlets.GzipFilter 2020-08-10 06:15:52,847 WARN [main] GzipFilter:45 - GzipFilter is deprecated. Use GzipHandler 2020-08-10 06:15:52,847 WARN [main] GzipFilter:45 - GzipFilter is deprecated. 
Use GzipHandler 2020-08-10 06:15:52,847 INFO [main] ViewThrottleFilter:123 - Ambari Views will be able to utilize 12 concurrent REST API threads 2020-08-10 06:15:52,865 INFO [main] PackagesResourceConfig:101 - Scanning for root resource and provider classes in the packages: org.apache.ambari.server.api.rest org.apache.ambari.server.api.services org.apache.ambari.eventdb.webservice org.apache.ambari.server.api 2020-08-10 06:15:55,283 INFO [main] ScanningResourceConfig:153 - Root resource classes found: class org.apache.ambari.server.api.services.views.ViewService class org.apache.ambari.server.api.services.users.ActiveWidgetLayoutService class org.apache.ambari.server.api.services.RootServiceService class org.apache.ambari.server.api.services.HostService class org.apache.ambari.server.api.services.RemoteClustersService class org.apache.ambari.server.api.services.ValidationService class org.apache.ambari.server.api.services.LogoutService class org.apache.ambari.server.api.services.VersionDefinitionService class org.apache.ambari.server.api.services.groups.GroupService class org.apache.ambari.server.api.services.FeedService class org.apache.ambari.server.api.services.views.ViewPrivilegeService class org.apache.ambari.server.api.services.users.UserAuthenticationSourceService class org.apache.ambari.server.api.services.InstanceService class org.apache.ambari.server.api.services.ClusterService class org.apache.ambari.server.api.services.AmbariPrivilegeService class org.apache.ambari.server.api.services.views.ViewUrlsService class org.apache.ambari.server.api.services.users.UserAuthorizationService class org.apache.ambari.server.api.rest.KdcServerReachabilityCheck class org.apache.ambari.server.api.services.ExtensionsService class org.apache.ambari.server.api.services.SettingService class org.apache.ambari.server.api.rest.HealthCheck class org.apache.ambari.server.api.services.AlertTargetService class org.apache.ambari.server.api.services.RecommendationService class org.apache.ambari.server.api.services.TargetClusterService class org.apache.ambari.server.api.services.KerberosDescriptorService class org.apache.ambari.server.api.services.StacksService class org.apache.ambari.server.api.services.users.UserService class org.apache.ambari.server.api.services.KeyService class org.apache.ambari.server.api.services.BlueprintService class org.apache.ambari.server.api.services.LdapSyncEventService class org.apache.ambari.server.api.services.views.ViewPermissionService class org.apache.ambari.server.api.services.views.ViewDataMigrationService class org.apache.ambari.server.api.services.RoleAuthorizationService class org.apache.ambari.server.api.services.groups.MemberService class org.apache.ambari.server.api.services.groups.GroupPrivilegeService class org.apache.ambari.server.api.rest.BootStrapResource class org.apache.ambari.server.api.services.ActionService class org.apache.ambari.server.api.services.views.ViewInstanceService class org.apache.ambari.server.api.services.users.UserPrivilegeService class org.apache.ambari.server.api.services.views.ViewVersionService class org.apache.ambari.server.api.services.ExtensionLinksService class org.apache.ambari.server.api.services.PersistKeyValueService class org.apache.ambari.server.api.services.RequestService class org.apache.ambari.server.api.services.PermissionService 2020-08-10 06:15:55,284 INFO [main] ScanningResourceConfig:153 - Provider classes found: class org.apache.ambari.server.api.GsonJsonProvider 2020-08-10 06:15:55,329 INFO [main] 
WebApplicationImpl:815 - Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM' 2020-08-10 06:15:55,983 WARN [main] Errors:173 - The following warnings have been detected with resource and/or provider classes: WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.InstanceService.getInstances(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.InstanceService.getInstance(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewService.getViews(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewService.getView(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewVersionService.getVersions(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.views.ViewVersionService.getVersion(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String,java.lang.String), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RequestService.getRequests(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.RequestService.getRequest(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.BlueprintService.getBlueprints(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.BlueprintService.getBlueprint(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.users.UserService.getUsers(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.users.UserService.getUser(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.HostService.getHosts(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity. 
WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.HostService.getHost(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo,java.lang.String), should not consume any entity. WARNING: A HTTP GET method, public javax.ws.rs.core.Response org.apache.ambari.server.api.services.KerberosDescriptorService.getKerberosDescriptors(java.lang.String,javax.ws.rs.core.HttpHeaders,javax.ws.rs.core.UriInfo), should not consume any entity. 2) Postgress DB is fine Postgress DB and Ambari server running on same host # grep -i jdbc /etc/ambari-server/conf/ambari.properties server.jdbc.connection-pool=internal server.jdbc.database=postgres server.jdbc.database_name=ambari server.jdbc.postgres.schema=ambari server.jdbc.user.name=ambari server.jdbc.user.passwd=/etc/ambari-server/conf/password.dat /etc/ambari-server/conf # service postgresql status ● postgresql.service - PostgreSQL database server Loaded: loaded (/usr/lib/systemd/system/postgresql.service; enabled; vendor preset: disabled) Active: active (running) since Mon 2020-08-10 06:13:32 CEST; 20min ago Process: 26813 ExecStop=/usr/lib/postgresql-init stop (code=exited, status=0/SUCCESS) Process: 26783 ExecReload=/usr/lib/postgresql-init reload (code=exited, status=0/SUCCESS) Process: 26826 ExecStart=/usr/lib/postgresql-init start (code=exited, status=0/SUCCESS) Main PID: 26838 (postgres) Tasks: 15 (limit: 512) CGroup: /system.slice/postgresql.service ├─26838 /usr/lib/postgresql94/bin/postgres -D /var/lib/pgsql/data ├─26839 postgres: logger process ├─26841 postgres: checkpointer process ├─26842 postgres: writer process ├─26843 postgres: wal writer process ├─26844 postgres: autovacuum launcher process ├─26845 postgres: stats collector process ├─26966 postgres: root citrix-confdb [local] idle ├─27110 postgres: ambari ambari 127.0.0.1(44408) idle ├─27111 postgres: ambari ambari 127.0.0.1(44410) idle ├─27130 postgres: ambari ambari 127.0.0.1(44436) idle ├─27204 postgres: ctxvda citrix-confdb 127.0.0.1(44456) idle ├─27253 postgres: ambari ambari 127.0.0.1(44476) idle ├─27254 postgres: ambari ambari 127.0.0.1(44478) idle └─27255 postgres: ambari ambari 127.0.0.1(44480) idle Aug 10 06:13:31 systemd[1]: Starting PostgreSQL database server... Aug 10 06:13:31 postgresql-init[26826]: 2020-08-10 06:13:31 CEST LOG: redirecting log output to logging collector process Aug 10 06:13:31 postgresql-init[26826]: 2020-08-10 06:13:31 CEST HINT: Future log output will appear in directory "pg_log". Aug 10 06:13:32 systemd[1]: Started PostgreSQL database server. 3) Please note postgress and ambari server are running on same host Please let me know your findings Thanks
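To rule out the database side independently of Ambari, the same credentials Ambari uses (ambari.properties plus password.dat) can be tested directly with psql. This is only a sketch; it assumes password.dat holds the plain-text password and that the Ambari tables live in the 'ambari' schema named in ambari.properties.
# Sketch: connect as the ambari user and list the tables in the ambari schema
PGPASSWORD="$(cat /etc/ambari-server/conf/password.dat)" \
  psql -h 127.0.0.1 -U ambari -d ambari -c '\dt ambari.*'
If this connects and lists tables, the database itself is healthy; the 'Master key initialization failed' INFO entry above usually just means no encrypted credential store is configured, so Ambari falls back to the plain-text password file.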
07-29-2020
07:35 AM
Hello Experts, I'm trying to install open-source Ambari 2.7.3 in SLES 12 SP2 and SP3 environments. I was able to build the Ambari 2.7.4 source successfully following the steps in the Installation Guide for Ambari 2.7.4 (reference: https://cwiki.apache.org/confluence/display/AMBARI/Installation+Guide+for+Ambari+2.7.4). After the build, I was able to install and start the Ambari server. After logging in to the Ambari UI from a browser, I wanted to deploy a cluster from the UI. The issue I'm facing is that when I click Next after entering the cluster name (as shown in the screenshot), the Ambari 2.7.3 UI does not progress to the next step of the cluster deploy. It keeps spinning forever and stays on the Get Started page without any progress. Please suggest a fix. I already tried the 'Reset UI' button on Ambari's experimental page and retried, but no luck. I also tried multiple browsers, but I see the same issue. Also, there are no broken HDP stacks; all are fine. Please help to fix this issue. Screenshot attached for reference. Thanks & Regards, Chethan
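While the wizard is spinning, one quick check is whether the Ambari REST API itself responds and what the server log shows at the moment you click Next; the host, port, and admin credentials below are placeholders for your environment.
# Sketch: confirm the REST API answers, then watch the server log while clicking Next in the wizard
curl -s -u admin:admin http://ambari-host:8080/api/v1/stacks
tail -f /var/log/ambari-server/ambari-server.log
If the stacks call hangs or errors, the wizard is most likely blocked on the server side (for example while loading stack or repository metadata) rather than in the browser.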
Tags:
- Ambari
- Installation
Labels:
- Apache Ambari
06-23-2020
02:04 AM
Great! Thank you @Bender. It would be good if you could share the JIRA for reference, and then we can close this thread.
06-22-2020
07:38 PM
Thanks @Bender, I will wait for your response.
06-22-2020
04:25 AM
Hi @Bender, I am still not convinced. If that is the case, then why are the HDF 3.3.0 binaries for SLES 12 SP1 still publicly available and not moved behind the paywall? And why does the HDF 3.3.0 release notes page not provide a public link for SLES 12 SP3? It is the same HDF version; the only difference is the OS service pack. Refer here: https://docs.cloudera.com/HDPDocuments/HDF3/HDF-3.3.0/release-notes/content/hdf_repository_locations.html (Table 3, SLES 12 HDF repository & additional download locations). I can only see binaries for SUSE Linux Enterprise Server (SLES) v12 SP1, but not SLES v12 SP3. Why?
06-22-2020
03:41 AM
Hello @Bender, I do not agree with your solution. The reason is that the Cloudera documentation says: "Starting with the Ambari 2.7.5 release, access to Ambari repositories requires authentication. To access the binaries, you must first have the required authentication credentials (username and password)." https://docs.cloudera.com/HDPDocuments/Ambari-2.7.5.0/bk_ambari-installation/content/access_ambari_paywall.html So it is very clear that paid access to binaries applies only from Ambari 2.7.5 and later releases. What I'm asking for in this post is the link to download the HDF 3.3.0 binaries for SLES 12 SP3. Note: the support matrix says SLES 12 SP3 is supported with HDF 3.3.0, but if you open the HDF 3.3.0 release notes, I don't see any option to download HDF or the HDF management pack for SLES 12 SP3; the release notes contain download links for SLES 12 SP1 only. The support matrix also says Ambari 2.7.3 is the last version that is supported/compatible with HDF 3.3.0, which means there should be no need to pay to download the HDF 3.3.0 binaries or the Ambari 2.7.3 binaries. Hope my understanding is clear! Thanks
06-21-2020
08:00 AM
Hi Experts,
Can you please share the link to download the binaries for HDF 3.3.0 and the HDF Management Pack 3.3.0 for SLES 12 SP3?
The support matrix says SLES 12 SP3 is supported with HDF 3.3.0, but the HDF 3.3.0 release notes offer no download option for HDF or the HDF management pack for SLES 12 SP3; the release notes contain download links for SLES 12 SP1 only.
Thanks
Labels:
- Apache Hadoop
- Cloudera DataFlow (CDF)
06-21-2020
07:31 AM
Thanks a lot @stevenmatison. Based on your responses, can you please confirm the following: 1) The sequence and upgrade path I mentioned are correct and will work, right? 2) Finally, can you please share the link to download the binaries for HDF 3.1.0 and HDF 3.3.0 for SLES 12 SP3? The support matrix says SLES 12 SP3 is supported with HDF 3.3.0, but if you open the HDF 3.3.0 release notes, I don't see any option to download the HDF management pack for SLES 12 SP3; the release notes contain a download link for SLES 12 SP1 only. Thanks
06-21-2020
07:25 AM
@stevenmatison, really appreciate your response! What I forgot to mention is that my environment has two separate clusters - HDP and HDF.
About the HDP cluster: the HDP version is 2.6.2 and the Ambari version installed on HDP is 2.6.1. We don't have Kafka in HDP.
About the HDF cluster: the Ambari version installed on HDF is 2.6.1, running on SLES SP3. HDF version = 3.0.2.4, running on SLES SP3. Apache Kafka = 0.10.2, running on SLES SP3.
Each of the HDP and HDF clusters has its own Ambari installed. Currently we do not have a support contract with Hortonworks/Cloudera; it was discontinued for this year, and we may renew it next year. The existing HDF cluster has 5 Kafka brokers, ZooKeeper services, and Ambari Metrics installed - that's it - and Ambari 2.6.1 is installed over this HDF cluster.
My management is asking to install the latest stable open-source Kafka 2.5.0 and scrap the existing HDF cluster, but we are also exploring the other option of installing a new Ambari 2.7.4 and getting the Kafka version to at least 2.0.0 or later. That is the intention.
With this information, here is my plan - can you please help me get an answer:
Step 1) Scrap the existing Ambari 2.6.1 on the HDF cluster.
Step 2) Remove all existing services - Kafka brokers (0.11 version) and ZooKeepers.
Step 3) Install the new Ambari 2.7.4.
Step 4) Finally, add the new Kafka service using the 'Add Services' option.
What will the resulting Kafka version be after Step 4? Thanks
Tags:
- Apache Kafka
06-21-2020
01:31 AM
Hi, what Kafka version do we get when we newly install Ambari 2.7.4 and add Kafka using the 'Add Service' option? Thanks
Labels:
- Apache Kafka
06-21-2020
01:22 AM
Hi @Shelton and experts, my current environment details are:
Ambari version = 2.6.1, running on SLES SP3
HDF version = 3.0.2.4, running on SLES SP3
Apache Kafka = 0.10.2, running on SLES SP3
My goal is to get Apache Kafka to target version 2.0.0. As per the release notes, HDF 3.3.0 comes with Apache Kafka 2.0.0. To achieve this:
1) What are my upgrade paths for both Ambari (which version?) and HDF 3.3.0?
2) What is the correct upgrade sequence to follow, considering the HDF management pack version, the Ambari version, and the HDF version?
As per the support matrix, HDF 3.2.0 and later versions only support SLES 12 SP3. But in the past (two years back) we had SLES SP1, so at that time we installed HDF 3.0.2 and Ambari 2.6.1.
Based on my research I arrived at the following sequence and upgrade path. Could you please validate whether this is the right path? If not, please suggest the correct path and sequence.
Sequence and upgrade path:
===========================
Step 1) Upgrade the HDF Management Pack from HDF 3.0.2 to HDF 3.1.0.
Step 2) Upgrade HDF from 3.0.2 (already running on SLES 12 SP3) to 3.1.0 (on SLES 12 SP3, even though the support matrix says SP3 is not supported).
Note: The reason for HDF 3.1.0 is that the upgrade-path document (https://docs.cloudera.com/HDPDocuments/HDF3/HDF-3.4.0/ambari-managed-hdf-upgrade/content/hdf-upgrade-paths.html) says: "If you are running an earlier HDF version, upgrade to at least HDF 3.1.0, and then proceed to the HDF 3.3.0 upgrade."
Step 3) Upgrade Ambari from 2.6.1 (already running on SLES 12 SP3) to 2.7.3 (on SLES 12 SP3).
Step 4) Upgrade the HDF Management Pack from HDF 3.1.0 to HDF 3.3.0.
Step 5) Upgrade HDF from 3.1.0 to 3.3.0 (on SLES 12 SP3).
Additionally, is there any other option to get Apache Kafka 2.0.0 or higher, such as installing a new Ambari 2.7.4 and adding Kafka using the 'Add Service' option from Ambari? If so, what Kafka version comes along with a new Ambari 2.7.4 installation? Thanks
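Before starting, it can help to record what Ambari currently has registered so each step can be verified afterwards. The calls below are a sketch: the host, admin credentials, and cluster name are placeholders, and the stack_versions endpoint and its response shape should be confirmed for your Ambari release.
# Sketch: capture the current Ambari version and the stack/HDF versions registered for the cluster
ambari-server --version
curl -s -u admin:admin http://ambari-host:8080/api/v1/clusters/MyCluster/stack_versions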
Labels:
- Apache Ambari
- Apache Kafka
06-08-2020
05:48 AM
Hi, can you please share the link to download the binaries for HDF 3.2.0 for SLES 12 SP3? That is all I need from this thread; once I get an answer we can close it. Thanks
05-30-2020
11:06 PM
Thanks for the reply. However, this doesn't answer my question completely. As I mentioned, the support matrix says SLES 12 SP3 is supported with HDF 3.2.0 and 3.5.0, but if you open the release notes of HDF 3.2.0 and 3.5.0, I don't see any option to download the HDF management pack for SLES 12 SP3; the release notes contain download links for SLES 12 SP1 only. Could you please confirm this again?
05-30-2020
04:53 AM
Hello Experts, does any version of HDF (3.1.0, 3.2.0, or 3.5.0) support SLES 12 SP3? We would like to upgrade HDF from 3.0.2 to 3.1.0. We use Ambari 2.6.1.
Environment details:
Ambari version = 2.6.1
HDF version = 3.0.2.4
OS = SLES 12 SP3
# cat /etc/os-release
NAME="SLES"
VERSION="12-SP3"
VERSION_ID="12.3"
PRETTY_NAME="SUSE Linux Enterprise Server 12 SP3"
ID="sles"
ANSI_COLOR="0;32"
CPE_NAME="cpe:/o:suse:sles:12:sp3"
A year back, when we upgraded to HDF 3.0.2, our OS version was SLES 12 SP1; after that the OS was upgraded to SLES 12 SP3. Now I have started upgrading the HDF management pack (command below), but I cannot install HDF 3.1.0 because it is not visible under the Versions tab even after attempting to install it.
ambari-server upgrade-mpack --mpack=/root/hdf-ambari-mpack-3.1.0.0-564.tar.gz --verbose
Does any version of HDF (3.1.0, 3.2.0, or 3.5.0) support SLES 12 SP3? When I open the release notes of any HDF version I don't see SLES v12 SP3, only v12 SP1. For example: https://docs.cloudera.com/HDPDocuments/HDF3/HDF-3.2.0/release-notes/content/hdf_repository_locations.html (Table 3, SLES 12 HDF repository & additional download locations, lists only SUSE Linux Enterprise Server (SLES) v12 SP1). Thanks
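One thing worth confirming after upgrade-mpack finishes is that the mpack actually landed on the server and that ambari-server has been restarted since, because a new HDF version normally only shows up under Register Version after a restart with the new mpack in place. The directory below is the usual default mpack location, but treat the exact path as an assumption for your installation.
# Sketch: check that the HDF 3.1.0 mpack is present, then restart so the Versions tab can pick it up
ls /var/lib/ambari-server/resources/mpacks/
ambari-server restart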
Labels:
- HDFS
07-05-2019
05:32 AM
Thanks @Geoffrey Shelton Okot! This helped.
07-01-2019
01:13 PM
Hi @Jay Kumar SenSharma, thanks for your reply and for helping me! Can you please take a look at the upgrade sequence below and give me your feedback? I'm planning to implement it in the following order:
1) Upgrade the HDF mpack using the command:
ambari-server upgrade-mpack --mpack=/root/hdf-ambari-mpack-3.1.0.0-564.tar.gz --verbose
then restart ambari-server.
2) Upgrade HDF from 3.0 to 3.1.0. Note: I will not be upgrading Ambari for now, since the currently installed Ambari version is 2.6.1 and it is compatible with HDF 3.1.0 as per the matrix.
3) Finally, I will upgrade Ambari from 2.6.1 to 2.7.3 in the future if required.
Also, how can I verify the installed HDF mpack version? Please help! Regards, Chethan
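One way to verify the installed mpack version is to inspect the mpack directory on the Ambari server; the path below is the usual default, and the assumption that each mpack ships an mpack.json with a version field should be double-checked on your system (the directory name itself, e.g. hdf-ambari-mpack-3.1.0.0-564, also encodes the version).
# Sketch: list installed mpacks and read their declared versions
ls /var/lib/ambari-server/resources/mpacks/
grep -H '"version"' /var/lib/ambari-server/resources/mpacks/*/mpack.json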
06-29-2019
02:16 PM
Hello, where can I get or download the Version Definition File (VDF) of HDF 3.1.0 for the SLES 12 OS? Background: my goal is to upgrade HDF from 3.0.2 to 3.1.0. Basically I have two separate clusters - HDP and HDF - and each cluster is managed by its own Ambari (one Ambari manages HDF and another manages HDP). Steps to reproduce the issue: in the Versions tab of the HDF cluster's Ambari, I clicked "Register Version", and in the drop-down selected "Add Version". I then get a popup to upload a VDF file, as shown in the screenshot. A VDF file for HDF 3.1.0 for SLES 12 is not available in the HDF 3.1 release notes: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.0/bk_release-notes/content/ch_hdf_relnotes.html I'd appreciate help getting the VDF file for HDF 3.1.0 for SLES 12. Thanks, Chethan
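As a related check, Ambari keeps the version definitions it already knows about behind a REST resource, which can show whether anything for HDF 3.1.0 is registered; the endpoint, host, and credentials below are assumptions to verify against the documentation for your Ambari version.
# Sketch: list the version definitions Ambari currently knows about
curl -s -u admin:admin -H 'X-Requested-By: ambari' http://ambari-host:8080/api/v1/version_definitions
Some Ambari releases also let you register a VDF by POSTing its URL to this same resource instead of uploading the file in the UI, but the exact payload should be confirmed in the docs for your release.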
06-29-2019
02:16 PM
Hi @Michael Dennis "MD" Uanang, I have a similar issue where I can't find the VDF file for HDF 3.1.0 for the SLES 12 operating system. Can you please help me find the VDF file? My goal is to upgrade HDF from 3.0.2 to 3.1.0. Basically I have two separate clusters - HDP and HDF - and each cluster is managed by its own Ambari (one Ambari manages HDF and another manages HDP). Steps to reproduce the issue: in the Versions tab of the HDF cluster's Ambari, I clicked "Register Version", and in the drop-down selected "Add Version". I then get a popup to upload a VDF file, as shown in the screenshot. A VDF file for HDF 3.1.0 for SLES 12 is not available in the HDF 3.1 release notes: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.1.0/bk_release-notes/content/ch_hdf_relnotes.html I'd appreciate help getting the VDF file for HDF 3.1.0 for SLES 12. Thanks, Chethan
06-27-2019
05:10 AM
Thanks for your reply. To get to HDF 3.4.0, we also have to upgrade our Ambari. Basically we have two separate HDF and HDP clusters; currently we are planning to upgrade only the HDF cluster and will not touch the HDP cluster.
Current HDF version = 3.0.2
Current Ambari version = 2.6.1
So my question is: what should my upgrade sequence be, since I need to upgrade Ambari as well as HDF? Note that HDF 3.1.0 is NOT compatible with Ambari 2.7.3. Can you recommend which of the sequences below is the better one to follow?
Sequence #1:
Step 1) First upgrade HDF from 3.0.2 to 3.1.0, because HDF 3.1.0 is compatible with Ambari 2.6.1. At the end of this upgrade you have HDF 3.1.0 and Ambari 2.6.1. (The first step in upgrading to HDF 3.1 is to be on Ambari 2.6.1.)
Step 2) Upgrade Ambari alone from 2.6.1 to 2.7.3. At the end of this upgrade you have HDF 3.1.0 and Ambari 2.7.3. Ref - https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.3.0/ambari-managed-hdf-upgrade/content/hdf-upgrade-paths.html Note: HDF 3.1.0 is NOT compatible with Ambari 2.7.3.
Step 3) Upgrade HDF from 3.1.0 to 3.3.0. At the end of this upgrade you have HDF 3.3.0 and Ambari 2.7.3; HDF 3.3.0 is compatible with Ambari 2.7.3.
Step 4) Upgrade HDF from 3.3.0 to 3.4.0. At the end of this upgrade you have HDF 3.4.0 and Ambari 2.7.3; HDF 3.4.0 is compatible with Ambari 2.7.3.
OR
Sequence #2:
Step 1) Upgrade Ambari alone from 2.6.1 to 2.7.3. At the end of this upgrade you have HDF 3.0.2 and Ambari 2.7.3. Note: HDF 3.0.2 is NOT compatible with Ambari 2.7.3.
Step 2) Upgrade HDF from 3.0.2 to 3.1.0. At the end of this upgrade you have HDF 3.1.0 and Ambari 2.7.3. Note: HDF 3.1.0 is NOT compatible with Ambari 2.7.3.
Step 3) Upgrade HDF from 3.1.0 to 3.3.0. At the end of this upgrade you have HDF 3.3.0 and Ambari 2.7.3; HDF 3.3.0 is compatible with Ambari 2.7.3.
Step 4) Upgrade HDF from 3.3.0 to 3.4.0. At the end of this upgrade you have HDF 3.4.0 and Ambari 2.7.3; HDF 3.4.0 is compatible with Ambari 2.7.3.
Thanks, Chethan
06-24-2019
05:25 PM
My requirement is to upgrade HDF (not HDP). Basically my environment has two separate clusters - HDP and HDF. I want to upgrade only HDF, from version 3.0.2 to 3.4.0. The Ambari version currently installed is 2.6.1 and the HDF version is 3.0.2. Can you please help me understand whether I can upgrade directly from 3.0.2 to 3.4.0? I tried looking into this but had no luck: https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.4.0/ambari-managed-hdf-upgrade/content/hdf-upgrade-paths.html Can I perform the HDF upgrade to 3.4.0 directly from 3.0.2.5, or does it first require upgrading to some intermediate version and then to HDF 3.4.0?