Member since: 07-07-2017
Posts: 15
Kudos Received: 0
Solutions: 0
04-03-2018
10:17 AM
Issue description – When starting Ambari, the Database Explorer is unusable for about 2 minutes until the error message "Query timed out to fetch table description for user: XXXX" appears. The user then has to refresh the Database Explorer to get the database and table listing. This is an intermittent issue which gets resolved after refreshing or logging in a second time.
Action steps – To resolve the issue we changed the Ambari timeout properties below, but it did not resolve the issue:
agent.package.install.task.timeout=1800
agent.task.timeout=900
server.http.session.inactive_timeout=1800
server.task.timeout=1200
user.inactivity.timeout.default=900
user.inactivity.timeout.role.readonly.default=900
views.ambari.hive.Hive_view.result.fetch.timeout=6000000
views.ambari.hive.hive_instance_1_50.result.fetch.timeout=6000000
views.ambari.request.connect.timeout.millis=6000000
views.ambari.request.read.timeout.millis=6000000
PFA application logs and system configuration settings.
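For reference, a minimal sketch of how such properties are applied, assuming a default Ambari Server layout (the file path and restart command may differ per installation):

# Edit the Ambari Server configuration (assumed default location)
sudo vi /etc/ambari-server/conf/ambari.properties

# Add or adjust the view timeout entries, for example:
views.ambari.request.connect.timeout.millis=6000000
views.ambari.request.read.timeout.millis=6000000

# Restart Ambari Server so the changes take effect
sudo ambari-server restart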
Labels:
- Apache Ambari
- Apache Hive
12-01-2017
06:09 AM
Hey Lucy, I am also facing the same issue. All the views work fine with other browsers, but with IE I am not able to use the Hive 1.5 view, although the Hive 2.0 view works fine. You can try using the Hive 2.0 view with IE. If you have implemented any resolution for the Hive 1.5 view, please let me know.
11-22-2017
07:40 AM
Yes, we have made those changes. Whenever I enable user impersonation in the Hive interpreter and submit a query, the query gets submitted as the anonymous user. For the sh and Spark interpreters it works.
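One direction to check, offered only as a sketch: for Hive queries to run as the logged-in user, the cluster usually has to allow the Zeppelin service account to proxy end users, and HiveServer2 has to run with doAs enabled. The property names below are standard Hadoop/Hive ones, but the service user name (zeppelin) and the wildcard values are assumptions to be narrowed for your environment:

# Custom core-site (e.g. via Ambari), assuming the service account is 'zeppelin'
hadoop.proxyuser.zeppelin.groups=*
hadoop.proxyuser.zeppelin.hosts=*

# hive-site: execute queries as the connecting user
hive.server2.enable.doAs=true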
11-22-2017
07:38 AM
Please check the image.
... View more
11-22-2017
07:37 AM
sparkui.png Where do I customize this property so that it can point to a custom URL?
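One possibility, depending on the Zeppelin version in use: newer Zeppelin releases expose a zeppelin.spark.uiWebUrl property on the Spark interpreter that overrides the URL the Spark UI link points to. A sketch only; the URL below is a placeholder:

# Zeppelin UI > Interpreter > spark > edit, add the property:
zeppelin.spark.uiWebUrl = http://custom-proxy.example.com:4040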
Labels:
- Apache Spark
- Apache Zeppelin
09-16-2017
08:21 AM
Labels:
- Apache Hive
- Apache Zeppelin
08-15-2017
12:18 PM
Is it necessary to set up passwordless SSH? The documentation says to either set up passwordless SSH or set this property. Do we require both?
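For reference, a sketch of the passwordless SSH alternative the documentation describes, assuming Zeppelin runs as the zeppelin user and targetuser is a placeholder for the impersonated account:

# Run as the zeppelin user on the Zeppelin host
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa   # generate a key pair if none exists
ssh-copy-id targetuser@localhost           # allow key-based login as the impersonated user
ssh targetuser@localhost whoami            # should print targetuser without a password prompt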
08-15-2017
05:53 AM
To enable user impersonation I performed the steps below:
1. Added the following property in zeppelin_env_content through Ambari: export ZEPPELIN_IMPERSONATE_CMD='sudo -H -u ${ZEPPELIN_IMPERSONATE_USER} bash -c '
2. Enabled user impersonation in the sh interpreter (the interpreter is set per user in isolated mode).
Below is the log for the same (a configuration sketch follows the trace):
java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.thrift.transport.TSocket.open(TSocket.java:182)
at org.apache.zeppelin.interpreter.remote.ClientFactory.create(ClientFactory.java:51)
at org.apache.zeppelin.interpreter.remote.ClientFactory.create(ClientFactory.java:37)
at org.apache.commons.pool2.BasePooledObjectFactory.makeObject(BasePooledObjectFactory.java:60)
at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:861)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterProcess.getClient(RemoteInterpreterProcess.java:90)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.init(RemoteInterpreter.java:211)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:377)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getFormType(LazyOpenInterpreter.java:105)
at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:387)
at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:329)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
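One possible cause to rule out, as a sketch only: with the sudo-based ZEPPELIN_IMPERSONATE_CMD, the account running Zeppelin (assumed here to be zeppelin) needs passwordless sudo to launch the interpreter process as the target user; if sudo prompts for a password or requires a tty, the remote interpreter never starts and the Thrift client sees Connection refused. The sudoers entry below is an assumption to be tightened per your security policy:

# /etc/sudoers.d/zeppelin (edit with visudo; the user name 'zeppelin' is assumed)
zeppelin ALL=(ALL) NOPASSWD: ALL
Defaults:zeppelin !requiretty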
Labels:
- Apache Zeppelin
07-13-2017
08:26 AM
The configuration is the same for both Hive views on both instances.
07-13-2017
08:13 AM
Logs: [ambari-client-thread-16338] ContainerResponse:537 - Mapped exception to response: 500 (Internal Server Error)
javax.ws.rs.WebApplicationException: java.net.SocketTimeoutException: connect timed out
at org.apache.ambari.view.hive2.HelpService.atsStatus(HelpService.java:110)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java
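The timeout is thrown from HelpService.atsStatus, which appears to check the YARN Application Timeline Server from the Ambari Server host. A quick connectivity check, as a sketch (ats-host:8188 is a placeholder; use the value of yarn.timeline-service.webapp.address from the YARN configs):

# Run from the Ambari Server host
curl -s -o /dev/null -w "%{http_code}\n" http://ats-host:8188/ws/v1/timeline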
07-07-2017
10:49 AM
But Apache's documentation for the R interpreter says: "To run R code and visualize plots in Apache Zeppelin, you will need R on your master node (or your dev laptop)." Also, does the sparkR utility distribute the load on the cluster automatically, or do we need to add any properties for that? In the Spark interpreter I have set the property master=yarn-client, and I have set SPARK_HOME. Is this enough?
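For reference, a sketch of the two settings mentioned, with assumed locations and an example path (adjust for your distribution):

# zeppelin-env.sh (e.g. via the zeppelin_env_content field in Ambari); path is an example
export SPARK_HOME=/usr/hdp/current/spark-client

# Spark interpreter property (Zeppelin UI > Interpreter > spark)
master = yarn-client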
07-07-2017
10:15 AM
I am trying to run a query using the sparkR interpreter in Zeppelin. We have R installed on the NameNode machine and Zeppelin installed on another machine. When we fire a query using the sparkR interpreter it doesn't display any error on the Zeppelin screen, but the interpreter logs show the error: Caused by: java.io.IOException: Cannot run program "R" (in directory "."): error=2, No such file or directory. Do we need to install R on every node?
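The error indicates that the host running the interpreter cannot find an R executable on its PATH; with master=yarn-client, R worker processes can also run on the executor nodes, so R generally has to be present wherever SparkR code executes. A sketch of a check and install, assuming RHEL/CentOS nodes with the EPEL repository available:

# On the Zeppelin host (and on each NodeManager node that runs executors)
which R || echo "R not found on PATH"
sudo yum install -y R   # package manager and repository are assumptions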
Labels:
- Apache Hadoop
- Apache Spark
- Apache Zeppelin