Member since: 07-25-2019
Posts: 29
Kudos Received: 0
Solutions: 0
02-14-2019
01:40 PM
Hi All, my cluster details: Ambari 2.5, HDP 2.6.3. I have installed Knox on one of my nodes and it works fine with the demo LDAP service, but when I change the configuration to point at our real LDAP I get the error below:
2019-02-14 14:22:26,842 INFO hadoop.gateway (KnoxLdapRealm.java:getUserDn(691)) - Computed userDn: cn=username,ou=hadoop_usr_dev,ou=groups,ou=accounts,dc=domainname,dc=com using dnTemplate for principal: username
2019-02-14 14:22:26,847 INFO hadoop.gateway (KnoxLdapRealm.java:doGetAuthenticationInfo(203)) - Could not login: org.apache.shiro.authc.UsernamePasswordToken - username, rememberMe=false (1.1.1.1)
2019-02-14 14:22:26,847 ERROR hadoop.gateway (KnoxLdapRealm.java:doGetAuthenticationInfo(205)) - Shiro unable to login: javax.naming.AuthenticationException: [LDAP: error code 32 - No Such Object]
Kindly suggest what went wrong, if anybody has faced this issue. Thanks, kant
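For reference, here is the shape of the ShiroProvider section of my topology (a sketch with the LDAP host masked; the realm class matches the log above, and the userDnTemplate is what produced the computed userDn):

<provider>
    <role>authentication</role>
    <name>ShiroProvider</name>
    <enabled>true</enabled>
    <param>
        <name>main.ldapRealm</name>
        <value>org.apache.hadoop.gateway.shirorealm.KnoxLdapRealm</value>
    </param>
    <param>
        <name>main.ldapRealm.contextFactory.url</name>
        <value>ldap://<ldap-host>:389</value>
    </param>
    <param>
        <name>main.ldapRealm.userDnTemplate</name>
        <value>cn={0},ou=hadoop_usr_dev,ou=groups,ou=accounts,dc=domainname,dc=com</value>
    </param>
    <param>
        <name>main.ldapRealm.contextFactory.authenticationMechanism</name>
        <value>simple</value>
    </param>
</provider>

My understanding is that error code 32 means the computed DN does not exist in the directory, so either the template above is wrong for our tree or the user entry really lives elsewhere; an ldapsearch for the user against the same base DN should confirm which.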
- Tags:
- hadoop-ldap
- Security
01-15-2019
07:24 AM
Thanks for your suggestion scharn. Two follow-ups:
1. Do I have to stop the local demo LDAP service via Ambari?
2. What about the users.ldif conf file?
Thanks,
01-14-2019
09:49 AM
Hi All, I have installed Atlas and it works fine; I can see Hive table lineage successfully. Now we want to see lineage for HBase tables and columns. My HDP version is 2.6.3 and my Ambari version is 2.5.3. Does Atlas support lineage for HBase tables? If yes, kindly suggest the steps to configure it. In the Atlas web UI search dropdown I can see HBASE_TABLE, but when I search on it the result is empty, whereas a HIVE_TABLE search shows all Hive tables. Kindly suggest.
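One check I can run is a query against the Atlas v2 REST API to see whether any hbase_table entities exist at all (a sketch; host, port, and credentials are placeholders, 21000 being the usual Atlas port):

curl -u admin:admin "http://<atlas-host>:21000/api/atlas/v2/search/basic?typeName=hbase_table&limit=10"

If this returns an empty entity list, the type is registered but nothing has ever populated it. If I remember right, Atlas 0.8 on HDP 2.6 does not ship a dedicated HBase hook the way it does for Hive, which would explain why the type shows in the dropdown but the search comes back empty.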
01-11-2019
06:36 AM
Hi All, I have set up Knox locally and it works fine. Now I want to integrate it with LDAP. Kindly suggest which configuration file needs to be updated and what information is required from the AD/LDAP team. My cluster is a non-secure (non-Kerberos) cluster. Thanks, kant
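From what I have read so far, the LDAP settings live in the Knox topology file (e.g. /etc/knox/conf/topologies/default.xml) under the ShiroProvider. A rough checklist of what to request from the AD/LDAP team, mapped to the standard KnoxLdapRealm params it feeds (a sketch; exact values come from the directory team):

- LDAP server host and port -> main.ldapRealm.contextFactory.url
- user DN pattern or user search base -> main.ldapRealm.userDnTemplate (or main.ldapRealm.userSearchBase)
- a bind/service account and its password, if search-based auth is used -> main.ldapRealm.contextFactory.systemUsername / systemPassword
- group search base and group object class, if group lookup is needed -> main.ldapRealm.searchBase / main.ldapRealm.groupObjectClass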
01-07-2019
12:30 PM
My cluster details: Ambari version 2.5.2, HDP version 2.6.3, HDP installed and managed by Ambari. Now I want to upgrade Ambari from 2.5.2 to 2.7.x. Can I go directly to 2.7.x, or do I first have to upgrade to 2.6.x and then from 2.6.x to 2.7.x? Kindly suggest. Thanks, Kant
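For reference, the per-hop procedure itself is short; a sketch of the 2.5.2 to 2.6.x hop on the Ambari server host, assuming ambari.repo has already been pointed at the 2.6.x repository (the same sequence would repeat for the next hop):

yum clean all
yum upgrade ambari-server
ambari-server upgrade    # migrates the Ambari database schema
ambari-server start

The ambari-agent package on every other host needs the matching yum upgrade as well.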
10-17-2018
12:49 PM
Hi All, I have to build a development cluster for one of my customers. For the gateway, I have to use the customer's (third-party) gateway for security instead of Knox. Any useful material or suggestions would be very helpful: how to configure a third-party gateway instead of Knox, what the configuration involves, etc. Thanks, Surya
- Tags:
- Security
09-17-2018
01:37 PM
We are planning a new 10-node dev cluster and I need user management in HDP with Ambari. We want to go for Kerberos integrated with AD. 1. Do we need any service accounts to be created? 2. Assuming the secure (Kerberos) cluster is up and running, how do I set up new users going forward? Is adding the user to an AD group the only way, plus an HDFS home directory on the edge node? My real question here is how to sync AD users with the HDP cluster. 3. Do we need separate IDs in AD for YARN jobs, or how are jobs managed? Please point me to some good links or reading. Thanks, kant
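On point 2, my understanding is that once the OS on each node resolves AD users (typically via SSSD, so that "id <username>" works), onboarding a user is mostly just the HDFS home directory. A sketch, with jdoe as a hypothetical AD user:

# run as the hdfs superuser
hdfs dfs -mkdir /user/jdoe
hdfs dfs -chown jdoe:hadoop /user/jdoe

YARN jobs then run as the submitting AD user, so separate per-job IDs should generally not be needed.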
- Tags:
- Security
09-06-2018
12:27 PM
Hi All, I am trying to run Hive on the Spark engine. Cluster details: HDP 2.6 / Ambari 2.5 / Hive 1.2.1 / Spark 1.6. I have copied the Spark assembly jar into Hive's lib/ directory and set hive.execution.engine=spark, but running a query throws the error below.
hive> set hive.execution.engine=spark;
hive> select count(*) from sparksql_query.final_result;
Query ID = user_20180906105558_d4566904-e2a4-4d94-b91b-404bdd6c1b12
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
Starting Spark Job = 61362f50-0bd4-4676-bfef-87d433386a94
Status: SENT
Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
hive>
Any suggestion would help me a lot. kant
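For completeness, the session settings Hive on Spark is supposed to need at minimum, as far as I can tell (a sketch; the memory/core values are placeholders to size for the cluster):

set hive.execution.engine=spark;
set spark.master=yarn-client;
set spark.executor.memory=2g;
set spark.driver.memory=1g;
set spark.executor.cores=2;

"RPC channel is closed" at Status: SENT seems to mean the remote Spark driver never came up, so hive.log and the YARN application logs for the failed driver are the first places to look. I also understand that HDP only supports Hive on MapReduce/Tez, not Hive on Spark, so this combination stays experimental either way.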
- Tags:
- Hadoop Core
- Hive
07-25-2018
01:08 PM
Hi All, my cluster: HDP 2.6, Ambari 2.5. I want to set up Apache Tika on top of HDP, or to work with the existing HDP cluster. I am not familiar with Apache Tika; I am new to it. Firstly, can we set up Apache Tika on HDP 2.6? If yes, kindly suggest a link or the process of setting up Tika on HDP. Thanks, Kant
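From what I have read, Tika is a plain Java library/CLI rather than an HDP service, so there is nothing to install through Ambari; it runs on an edge node (or inside Spark/MapReduce jobs) against the files. A minimal sketch using the standalone tika-app jar from tika.apache.org, with hypothetical paths:

# extract text from a PDF pulled out of HDFS, then push the result back
hdfs dfs -get /data/docs/sample.pdf /tmp/sample.pdf
java -jar tika-app-1.18.jar --text /tmp/sample.pdf > /tmp/sample.txt
hdfs dfs -put /tmp/sample.txt /data/extracted/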
07-12-2018
11:13 AM
Hi Friends, I have installed Ranger on an HDP 2.6.3 non-Kerberos cluster. I navigated to the location below, but the log file is empty even though I can see access entries in the Ranger web UI. I just wanted to confirm the log location of the access audit files. The relevant property is xasecure.audit.destination.hdfs.dir:
hdfs://<Hostanme>:8020/ranger/audit
/ranger/audit/hiveServer2/20180711/
/ranger/audit/hiveServer2/20180711/hiveServer2_ranger_audit_hostname.lfnet.se.log
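For reference, a sketch of checking the audit files directly (the date directory follows the yyyyMMdd pattern shown above):

hdfs dfs -ls /ranger/audit/hiveServer2/20180711/
hdfs dfs -cat /ranger/audit/hiveServer2/20180711/*.log | head

My understanding is that the HDFS audit destination buffers events and flushes them periodically, so a freshly created file can show as empty for a while even though the same events are already visible in the Ranger web UI, which is typically fed from Solr rather than from these files.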
- Tags:
- ranger-audit
- Security
07-11-2018
12:10 PM
Hi Friends, I am looking for a way to see the history of Hive queries or HDFS commands, like the history command in Linux. Are Hive queries saved in any log file, for example a Ranger or other audit log file?
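A few places I know of, as a sketch (paths are the usual defaults and may differ per node):

# the Hive CLI keeps a per-user history file, much like bash history:
cat ~/.hivehistory
# beeline keeps its own:
cat ~/.beeline/history

Beyond that, HiveServer2 queries show up in the HiveServer2 logs under /var/log/hive, and if the Ranger Hive plugin is enabled every statement lands in the Ranger access audit as well. HDFS operations are recorded in the HDFS audit log (hdfs-audit.log on the NameNode) when auditing is enabled.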
- Tags:
- Hadoop Core
- log4
07-04-2018
11:09 AM
My cluster details: HDP 2.6.3, Ambari 2.5. Files View in Ambari (view version 1.0.0) is created and works fine for uploading and other tasks. My question: can we open PDF and .jpg files from HDFS? I have some PDF files from which we created Hive tables, and that works fine, so far so good. I want to open those PDF files via Files View. I can open them, but the data is not in a proper format; it looks like some binary format. Is it possible to open a PDF with proper formatting? If yes, kindly suggest any configs or settings. Thanks, kant
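As far as I can tell, the Files View preview simply renders the raw bytes as text, so binary formats like PDF and JPG will always look garbled there; I have not found a config that makes it render them. A workaround sketch is to pull the file down over WebHDFS and open it locally (host, path, and user are placeholders; add authentication as appropriate on a secured cluster):

curl -L -o report.pdf "http://<namenode-host>:50070/webhdfs/v1/data/report.pdf?op=OPEN&user.name=kant"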
06-07-2018
12:07 PM
Thanks, but no luck with setting the attribute to immutable. After changing the attribute, the log files show a warning that the temp webapps folder cannot be deleted. My actual issue is that the Zeppelin web page does not open at all.
06-06-2018
11:57 AM
No, not at all. Has anyone faced the same issue? Kindly let me know the solution. Thanks,
06-06-2018
11:55 AM
The issue is the same, and I am stuck at the same point. Has anyone faced this issue, or is there any solution? Thanks, Kant
06-06-2018
11:51 AM
WARN [2018-06-06 13:31:14,247] ({main} ContextHandler.java[log]:2062) - unavailable
javax.servlet.ServletException: Resource class org.apache.zeppelin.server.ZeppelinServer can not be instantiated due to InvocationTargetException
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createSingletonInstance(CXFNonSpringJaxrsServlet.java:396)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createApplicationInstance(CXFNonSpringJaxrsServlet.java:454)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createServerFromApplication(CXFNonSpringJaxrsServlet.java:432)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.init(CXFNonSpringJaxrsServlet.java:93)
at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:616)
at org.eclipse.jetty.servlet.ServletHolder.initialize(ServletHolder.java:396)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:871)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:298)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1349)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1342)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:741)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:505)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:163)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.server.Server.start(Server.java:387)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:354)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.zeppelin.server.ZeppelinServer.main(ZeppelinServer.java:235)
WARN [2018-06-06 13:31:14,952] ({main} WebAppContext.java[doStart]:514) - Failed startup of context o.e.j.w.WebAppContext@62e7f11d{/,file:/usr/hdp/2.6.3.0-235/zeppelin/webapps/webapp/,STARTING}{/usr/hdp/current/zeppelin-server/lib/zeppelin-web-0.7.3.2.6.3.0-235.war}
javax.servlet.ServletException: Resource class org.apache.zeppelin.server.ZeppelinServer can not be instantiated due to InvocationTargetException
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createSingletonInstance(CXFNonSpringJaxrsServlet.java:396)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createApplicationInstance(CXFNonSpringJaxrsServlet.java:454)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createServerFromApplication(CXFNonSpringJaxrsServlet.java:432)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.init(CXFNonSpringJaxrsServlet.java:93)
at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:616)
at org.eclipse.jetty.servlet.ServletHolder.initialize(ServletHolder.java:396)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:871)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:298)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1349)
I followed the steps in the link below for the hotfix, but I am still not able to open the GUI (503 error). https://community.hortonworks.com/content/supportkb/177625/index.html
04-25-2018
04:56 AM
Thanks for your comments. I am not using remote storage for the notebook and interpreter, only local storage. My storage location is /usr/hdp/2.6.xxxxxxx/zeppelin/notebook. I manually changed the permissions and ownership of the webapps directory to 755 and zeppelin:hadoop. My issue here is that after restarting the Zeppelin service via Ambari, the permissions and ownership of the webapps folder change back: permissions become 750 (maybe this is the reason I am not able to access the Zeppelin web UI) and ownership becomes zeppelin:zeppelin.
04-24-2018
04:37 PM
Failed startup of context o.e.j.w.WebAppContext@10e92f8f{/,file:/usr/hdp/2.6.3.0-235/zeppelin/webapps/webapp/,STARTING}{/usr/hdp/current/zeppelin-server/lib/zeppelin-web-0.7.3.2.6.3.0-235.war} javax.servlet.ServletException: Resource class org.apache.zeppelin.server.ZeppelinServer can not be instantiated due to InvocationTargetException at
04-24-2018
04:25 PM
Thanks Jay Kumar. I performed the steps below: I changed the permissions and owner on the webapps folder and restarted Zeppelin. After the restart via Ambari, the ownership and permissions revert to the old values (owner zeppelin:zeppelin, permissions 750). Is there any conf or setting where I can make the ownership and permissions on the webapps directory persistent? Please suggest.
04-24-2018
12:42 PM
Cluster details: HDP 2.6.3, upgraded from 2.5 to 2.6; all services are working fine without any issues except Zeppelin, whose web page returns an HTTP 503 error. I logged into the Zeppelin host as root and manually changed the group and permissions:
drwxrwxrwx 3 zeppelin zeppelin 4096 Apr 24 14:19 webapps
I tried to access the web page and got the same error. I know a restart of the Zeppelin service may be required, but once I restart Zeppelin via Ambari, the owner and permissions of the webapps folder change back to:
drwxr-x--- 3 zeppelin zeppelin 4096 Apr 24 14:40 webapps
Please, can anyone suggest how to solve this issue?
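For reference, this is the manual fix I keep applying (a sketch; the versioned path matches my install). It only holds until the next restart, because Ambari re-applies its own ownership and mode when it starts the service:

chown -R zeppelin:hadoop /usr/hdp/2.6.3.0-235/zeppelin/webapps
chmod -R 755 /usr/hdp/2.6.3.0-235/zeppelin/webapps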
- Tags:
- Hadoop Core
- zeppelin
04-24-2018
10:07 AM
When I restart the Zeppelin notebook service via Ambari, it overwrites the permissions on the webapps directory with 750. Maybe this is the reason I am not able to access the web UI. Kindly suggest how I can stop Zeppelin from overwriting the permissions and ownership of the webapps directory under the Zeppelin installation directory.
04-24-2018
04:50 AM
Can anyone suggest a solution?
04-23-2018
12:43 PM
I have upgraded the HDP cluster from 2.5.3 to 2.6.3 (not finalized yet) in the testing phase. Zeppelin is the only issue; all other services are working fine. The Zeppelin web UI is not accessible after the upgrade. I checked /usr/hdp/2.6.3.0-235/zeppelin/webapps:
drwxr-x--- 3 zeppelin zeppelin 4096 Apr 23 14:13 webapps
I changed the ownership to zeppelin:hadoop and restarted, but the webapps folder ownership changes back to zeppelin:zeppelin on service restart. I tried enabling HDFS storage for Zeppelin and also tried local storage by setting the conf location and storage type, but I am still not able to access the web UI. Kindly suggest. Thanks in advance.
=====================LOG FILE=================
Here is the log file: tail -n 100 zeppelin-zeppelin-.log
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.server.Server.start(Server.java:387)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:354)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.zeppelin.server.ZeppelinServer.main(ZeppelinServer.java:213)
INFO [2018-04-23 13:58:23,303] ({main} AbstractConnector.java[doStart]:266) - Started ServerConnector@52b6319f{HTTP/1.1}{0.0.0.0:9995}
INFO [2018-04-23 13:58:23,303] ({main} Server.java[doStart]:379) - Started @4279ms
INFO [2018-04-23 13:58:23,303] ({main} ZeppelinServer.java[main]:221) - Done, zeppelin server started
INFO [2018-04-23 14:12:55,315] ({Thread-14} ZeppelinServer.java[run]:225) - Shutting down Zeppelin Server ...
INFO [2018-04-23 14:12:55,318] ({Thread-14} AbstractConnector.java[doStop]:306) - Stopped ServerConnector@52b6319f{HTTP/1.1}{0.0.0.0:9995}
INFO [2018-04-23 14:12:55,322] ({Thread-14} ContextHandler.java[log]:2052) - Cleaning up Shiro Environment
INFO [2018-04-23 14:12:55,322] ({Thread-14} ContextHandler.java[doStop]:865) - Stopped o.e.j.w.WebAppContext@10e92f8f{/,file:/usr/hdp/2.6.3.0-235/zeppelin/webapps/webapp/,UNAVAILABLE}{/usr/hdp/current/zeppelin-server/lib/zeppelin-web-0.7.3.2.6.3.0-235.war}
ERROR [2018-04-23 14:12:55,973] ({Thread-14} ZeppelinServer.java[run]:232) - Error while stopping servlet container
java.lang.NullPointerException
at org.apache.zeppelin.server.ZeppelinServer$2.run(ZeppelinServer.java:228)
INFO [2018-04-23 14:12:55,973] ({Thread-14} ZeppelinServer.java[run]:234) - Bye
INFO [2018-04-23 14:13:06,502] ({main} ZeppelinConfiguration.java[create]:102) - Load configuration from file:/etc/zeppelin/2.6.3.0-235/0/zeppelin-site.xml
INFO [2018-04-23 14:13:06,565] ({main} ZeppelinConfiguration.java[create]:110) - Server Host: 0.0.0.0
INFO [2018-04-23 14:13:06,565] ({main} ZeppelinConfiguration.java[create]:112) - Server Port: 9995
INFO [2018-04-23 14:13:06,565] ({main} ZeppelinConfiguration.java[create]:116) - Context Path: /
INFO [2018-04-23 14:13:06,568] ({main} ZeppelinConfiguration.java[create]:117) - Zeppelin Version: 0.7.3
INFO [2018-04-23 14:13:06,592] ({main} Log.java[initialized]:186) - Logging initialized @400ms
INFO [2018-04-23 14:13:06,667] ({main} ZeppelinServer.java[setupWebAppContext]:370) - ZeppelinServer Webapp path: /usr/hdp/current/zeppelin-server/webapps
INFO [2018-04-23 14:13:06,971] ({main} ZeppelinServer.java[main]:211) - Starting zeppelin server
INFO [2018-04-23 14:13:06,973] ({main} Server.java[doStart]:327) - jetty-9.2.15.v20160210
INFO [2018-04-23 14:13:09,107] ({main} StandardDescriptorProcessor.java[visitServlet]:297) - NO JSP Support for /, did not find org.eclipse.jetty.jsp.JettyJspServlet
INFO [2018-04-23 14:13:09,119] ({main} ContextHandler.java[log]:2052) - Initializing Shiro environment
INFO [2018-04-23 14:13:09,119] ({main} EnvironmentLoader.java[initEnvironment]:128) - Starting Shiro environment initialization.
INFO [2018-04-23 14:13:09,196] ({main} EnvironmentLoader.java[initEnvironment]:141) - Shiro environment initialized in 76 ms.
WARN [2018-04-23 14:13:09,202] ({main} ServletHolder.java[getNameOfJspClass]:923) - Unable to make identifier for jsp rest trying rest instead
WARN [2018-04-23 14:13:09,653] ({main} NativeCodeLoader.java[<clinit>]:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
WARN [2018-04-23 14:13:11,472] ({main} DomainSocketFactory.java[<init>]:117) - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
INFO [2018-04-23 14:13:11,485] ({main} FileSystemConfigStorage.java[<init>]:69) - Creating filesystem: org.apache.hadoop.hdfs.DistributedFileSystem
INFO [2018-04-23 14:13:11,619] ({main} FileSystemConfigStorage.java[<init>]:74) - Store zeppelin configuration files under hdfs://<NN>:8020/user/zeppelin/conf
INFO [2018-04-23 14:13:11,791] ({main} InterpreterSettingManager.java[init]:328) - InterpreterSettingRef name jdbc
INFO [2018-04-23 14:13:11,792] ({main} InterpreterSettingManager.java[init]:328) - InterpreterSettingRef name angular
INFO [2018-04-23 14:13:11,792] ({main} InterpreterSettingManager.java[init]:328) - InterpreterSettingRef name livy
INFO [2018-04-23 14:13:11,792] ({main} InterpreterSettingManager.java[init]:328) - InterpreterSettingRef name spark
INFO [2018-04-23 14:13:11,792] ({main} InterpreterSettingManager.java[init]:328) - InterpreterSettingRef name sh
INFO [2018-04-23 14:13:11,792] ({main} InterpreterSettingManager.java[init]:328) - InterpreterSettingRef name md
INFO [2018-04-23 14:13:11,794] ({main} FileSystemConfigStorage.java[loadInterpreterSettings]:97) - Load Interpreter Setting from file: hdfs://<NN>:8020/user/zeppelin/conf/interpreter.json
WARN [2018-04-23 14:13:11,930] ({main} ContextHandler.java[log]:2062) - unavailable
javax.servlet.ServletException: Resource class org.apache.zeppelin.server.ZeppelinServer can not be instantiated due to InvocationTargetException
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createSingletonInstance(CXFNonSpringJaxrsServlet.java:396)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createApplicationInstance(CXFNonSpringJaxrsServlet.java:454)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createServerFromApplication(CXFNonSpringJaxrsServlet.java:432)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.init(CXFNonSpringJaxrsServlet.java:93)
at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:616)
at org.eclipse.jetty.servlet.ServletHolder.initialize(ServletHolder.java:396)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:871)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:298)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1349)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1342)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:741)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:505)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:163)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.server.Server.start(Server.java:387)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:354)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.zeppelin.server.ZeppelinServer.main(ZeppelinServer.java:213)
WARN [2018-04-23 14:13:12,603] ({main} WebAppContext.java[doStart]:514) - Failed startup of context o.e.j.w.WebAppContext@10e92f8f{/,file:/usr/hdp/2.6.3.0-235/zeppelin/webapps/webapp/,STARTING}{/usr/hdp/current/zeppelin-server/lib/zeppelin-web-0.7.3.2.6.3.0-235.war}
javax.servlet.ServletException: Resource class org.apache.zeppelin.server.ZeppelinServer can not be instantiated due to InvocationTargetException
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createSingletonInstance(CXFNonSpringJaxrsServlet.java:396)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createApplicationInstance(CXFNonSpringJaxrsServlet.java:454)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.createServerFromApplication(CXFNonSpringJaxrsServlet.java:432)
at org.apache.cxf.jaxrs.servlet.CXFNonSpringJaxrsServlet.init(CXFNonSpringJaxrsServlet.java:93)
at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:616)
at org.eclipse.jetty.servlet.ServletHolder.initialize(ServletHolder.java:396)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:871)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:298)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1349)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1342)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:741)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:505)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:163)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.server.Server.start(Server.java:387)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:354)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.zeppelin.server.ZeppelinServer.main(ZeppelinServer.java:213)
INFO [2018-04-23 14:13:12,609] ({main} AbstractConnector.java[doStart]:266) - Started ServerConnector@7342e05d{HTTP/1.1}{0.0.0.0:9995}
INFO [2018-04-23 14:13:12,609] ({main} Server.java[doStart]:379) - Started @6419ms
INFO [2018-04-23 14:13:12,609] ({main} ZeppelinServer.java[main]:221) - Done, zeppelin server started
- Tags:
- Hadoop Core
- zeppelin
04-16-2018
01:51 PM
Thank you very much Sandeep. I checked as you said:
rpm -qa | grep tez_2_6_4_0_91
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch
So a different rpm version is indeed installed. Some use cases are currently running on my cluster; I will follow your suggestion and update afterwards. Thanks
04-16-2018
12:56 PM
Thanks Sandeep.

DN1 dependency check error:
sudo yum check dependencies
Loaded plugins: langpacks, product-id, search-disabled-repos
shc_2_6_4_0_91-1.1.0.2.6.4.0-91.noarch has missing requires of hdp-select >= ('0', '2.6.4.0', '91')
Error: check ['dependencies']

DN2 dependency check error:
sudo yum check dependencies
Loaded plugins: langpacks, product-id, search-disabled-repos
pig_2_6_4_0_91-0.16.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91-client
pig_2_6_4_0_91-0.16.0.2.6.4.0-91.noarch has missing requires of hdp-select >= ('0', '2.6.4.0', '91')
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91-hdfs
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91-mapreduce
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91-yarn
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hdp-select >= ('0', '2.6.4.0', '91')
Error: check ['dependencies']

hdp-select output for the DNs:
accumulo-client - None
accumulo-gc - None
accumulo-master - None
accumulo-monitor - None
accumulo-tablet - None
accumulo-tracer - None
atlas-client - 2.5.3.0-37
atlas-server - 2.5.3.0-37
falcon-client - None
falcon-server - None
flume-server - None
hadoop-client - 2.5.3.0-37
hadoop-hdfs-datanode - 2.5.3.0-37
hadoop-hdfs-journalnode - 2.5.3.0-37
hadoop-hdfs-namenode - 2.5.3.0-37
hadoop-hdfs-nfs3 - 2.5.3.0-37
hadoop-hdfs-portmap - 2.5.3.0-37
hadoop-hdfs-secondarynamenode - 2.5.3.0-37
hadoop-hdfs-zkfc - 2.5.3.0-37
hadoop-httpfs - None
hadoop-mapreduce-historyserver - 2.5.3.0-37
hadoop-yarn-nodemanager - 2.5.3.0-37
hadoop-yarn-resourcemanager - 2.5.3.0-37
hadoop-yarn-timelineserver - 2.5.3.0-37
hbase-client - 2.5.3.0-37
hbase-master - 2.5.3.0-37
hbase-regionserver - 2.5.3.0-37
hive-metastore - 2.5.3.0-37
hive-server2 - 2.5.3.0-37
hive-server2-hive2 - 2.5.3.0-37
hive-webhcat - 2.5.3.0-37
kafka-broker - 2.5.3.0-37
knox-server - None
livy-server - 2.5.3.0-37
mahout-client - None
oozie-client - None
oozie-server - None
phoenix-client - 2.5.3.0-37
phoenix-server - 2.5.3.0-37
ranger-admin - None
ranger-kms - None
ranger-tagsync - None
ranger-usersync - None
slider-client - None
spark-client - 2.5.3.0-37
spark-historyserver - 2.5.3.0-37
spark-thriftserver - 2.5.3.0-37
spark2-client - 2.5.3.0-37
spark2-historyserver - 2.5.3.0-37
spark2-thriftserver - 2.5.3.0-37
sqoop-client - 2.5.3.0-37
sqoop-server - 2.5.3.0-37
storm-client - None
storm-nimbus - None
storm-slider-client - None
storm-supervisor - None
zeppelin-server - 2.5.3.0-37
zookeeper-client - 2.5.3.0-37
zookeeper-server - 2.5.3.0-37

The hdp-select output on the NN is slightly different; I will paste only the differing lines here. Only these 3 entries show a higher version:
shc - 2.6.3.0-235
livy2-client - 2.6.3.0-235
livy2-server - 2.6.3.0-235

Kindly suggest how to proceed now. Should I downgrade hdp-select on the NN, or upgrade hdp-select on the DNs manually? I don't have internet access on my nodes. Can you suggest a link or steps to downgrade/upgrade hdp-select? Thanks, kant
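For reference, the cleanup I am considering (a sketch; I would snapshot the rpm list first and confirm no running service actually uses these packages) is to remove the stray 2.6.4.0-91 packages from the DNs, since nothing else on those nodes is at that version, and then re-run the check:

yum remove 'shc_2_6_4_0_91*' 'pig_2_6_4_0_91*' 'tez_2_6_4_0_91*'
yum check dependencies

With those gone, the Ambari-driven install of the 2.6.3.0-235 packages should no longer trip over the "hdp-select >= 2.6.4.0-91" requirement.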
04-16-2018
08:39 AM
Thanks in advance. I am trying to upgrade HDP 2.5.3 to 2.6.3 via Ambari (1 master and 2 DNs). I created a local repo, took the version file (HDP-2.6.3.0-235.xml) from the repo list, and registered it in the Ambari Versions tab. I updated the local repo path for my OS version and hit the Install button. Packages are pushed to the master node, but I am getting the error mentioned below on both data nodes. Kindly, anyone, help with this issue. Thanks, kant
ERROR with transaction check vs depsolve:
hdp-select >= 2.6.4.0-91 is needed by (installed) shc_2_6_4_0_91-1.1.0.2.6.4.0-91.noarch
** Found 2 pre-existing rpmdb problem(s), 'yum check' output follows:
libdb-5.3.21-21.el7_4.x86_64 is a duplicate with libdb-5.3.21-20.el7.x86_64
shc_2_6_4_0_91-1.1.0.2.6.4.0-91.noarch has missing requires of hdp-select >= ('0', '2.6.4.0', '91')
Your transaction was saved, rerun it with:
yum load-transaction /tmp/yum_save_tx.2018-04-16.10-01.4oTlyk.yumtx
2018-04-16 10:01:40,333 - checked_call[['/usr/bin/yum', '-d', '0', '-e', '0', 'check', 'dependencies']] {'sudo': True}
2018-04-16 10:01:42,425 - Could not install packages. Error: Execution of '/usr/bin/yum -d 0 -e 0 check dependencies' returned 1. Error: check ['dependencies']
shc_2_6_4_0_91-1.1.0.2.6.4.0-91.noarch has missing requires of hdp-select >= ('0', '2.6.4.0', '91')
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/custom_actions/scripts/install_packages.py", line 168, in actionexecute
ret_code = self.install_packages(package_list)
File "/var/lib/ambari-agent/cache/custom_actions/scripts/install_packages.py", line 409, in install_packages
elif not verifyDependencies():
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/packages_analyzer.py", line 311, in verifyDependencies
code, out = rmf_shell.checked_call(cmd, sudo=True)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 check dependencies' returned 1. Error: check ['dependencies']
shc_2_6_4_0_91-1.1.0.2.6.4.0-91.noarch has missing requires of hdp-select >= ('0', '2.6.4.0', '91')
DN2:
ERROR with transaction check vs depsolve:
hdp-select >= 2.6.4.0-91 is needed by (installed) pig_2_6_4_0_91-0.16.0.2.6.4.0-91.noarch
hdp-select >= 2.6.4.0-91 is needed by (installed) tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch
** Found 10 pre-existing rpmdb problem(s), 'yum check' output follows:
glibc-2.17-196.el7_4.2.x86_64 is a duplicate with glibc-2.17-196.el7.x86_64
glibc-common-2.17-196.el7_4.2.x86_64 is a duplicate with glibc-common-2.17-196.el7.x86_64
libdb-5.3.21-21.el7_4.x86_64 is a duplicate with libdb-5.3.21-20.el7.x86_64
pig_2_6_4_0_91-0.16.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91-client
pig_2_6_4_0_91-0.16.0.2.6.4.0-91.noarch has missing requires of hdp-select >= ('0', '2.6.4.0', '91')
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91-hdfs
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91-mapreduce
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hadoop_2_6_4_0_91-yarn
tez_2_6_4_0_91-0.7.0.2.6.4.0-91.noarch has missing requires of hdp-select >= ('0', '2.6.4.0', '91')
Your transaction was saved, rerun it with:
yum load-transaction /tmp/yum_save_tx.2018-04-12.08-26.99xfvr.yumtx
- Tags:
- Hadoop Core
- upgrade