Member since: 03-25-2016
Posts: 142
Kudos Received: 48
Solutions: 7
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 5593 | 06-13-2017 05:15 AM
 | 1833 | 05-16-2017 05:20 AM
 | 1307 | 03-06-2017 11:20 AM
 | 7542 | 02-23-2017 06:59 AM
 | 2195 | 02-20-2017 02:19 PM
02-14-2017
02:57 PM
@Bala Vignesh N V Please have a look at the links below. Hopefully, they will give you some more details:
http://hortonworks.com/blog/evaluating-hive-with-tez-as-a-fast-query-engine/
http://hortonworks.com/apache/mapreduce/
https://community.hortonworks.com/questions/65772/when-tezmr-is-better-for-query-execution.html
02-13-2017
11:17 AM
Problem
I log in to Ambari with a numeric (or partially numeric) username, e.g. 123user or 12345678. When accessing the Hive View, an error comes up. Here is the full stack of the error:
Service 'userhome' check failed:
java.lang.IllegalArgumentException: Invalid value: "12345678" does not belong to the domain ^[A-Za-z_][A-Za-z0-9._-]*[$]?$
at org.apache.hadoop.hdfs.web.resources.StringParam$Domain.parse(StringParam.java:53)
at org.apache.hadoop.hdfs.web.resources.StringParam.<init>(StringParam.java:25)
at org.apache.hadoop.hdfs.web.resources.UserParam.<init>(UserParam.java:68)
at org.apache.hadoop.hdfs.web.resources.UserParam.<init>(UserParam.java:75)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.getHomeDirectory(WebHdfsFileSystem.java:387)
at org.apache.ambari.view.utils.hdfs.HdfsApi$7.run(HdfsApi.java:187)
at org.apache.ambari.view.utils.hdfs.HdfsApi$7.run(HdfsApi.java:185)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.ambari.view.utils.hdfs.HdfsApi.execute(HdfsApi.java:397)
at org.apache.ambari.view.utils.hdfs.HdfsApi.getHomeDir(HdfsApi.java:185)
at org.apache.ambari.view.commons.hdfs.UserService.homeDir(UserService.java:57)
at org.apache.ambari.view.hive2.resources.files.FileService.userhomeSmokeTest(FileService.java:245)
at org.apache.ambari.view.hive2.HelpService.userhomeStatus(HelpService.java:89)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1507)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariAuthorizationFilter.doFilter(AmbariAuthorizationFilter.java:257)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.jwt.JwtAuthenticationFilter.doFilter(JwtAuthenticationFilter.java:96)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at org.apache.ambari.server.security.authentication.AmbariAuthenticationFilter.doFilter(AmbariAuthenticationFilter.java:88)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariUserAuthorizationFilter.doFilter(AmbariUserAuthorizationFilter.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:237)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.api.MethodOverrideFilter.doFilter(MethodOverrideFilter.java:72)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.api.AmbariPersistFilter.doFilter(AmbariPersistFilter.java:47)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.view.AmbariViewsMDCLoggingFilter.doFilter(AmbariViewsMDCLoggingFilter.java:54)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.view.ViewThrottleFilter.doFilter(ViewThrottleFilter.java:161)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.security.AbstractSecurityHeaderFilter.doFilter(AbstractSecurityHeaderFilter.java:109)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.security.AbstractSecurityHeaderFilter.doFilter(AbstractSecurityHeaderFilter.java:109)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.eclipse.jetty.servlets.UserAgentFilter.doFilter(UserAgentFilter.java:82)
at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:294)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:499)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:427)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:212)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:201)
at org.apache.ambari.server.controller.AmbariHandlerList.handle(AmbariHandlerList.java:150)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:370)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:973)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1035)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:641)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:231)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:745)
Solution
This is a known issue, and the permanent fix has been scheduled for delivery in Ambari 2.5.
Workaround
Please follow the steps below. NOTE: these are the exact steps I followed on my local environment (HDP 2.5.0.0-1245, Ambari 2.4.0.1).
a) Copy /var/lib/ambari-server/resources/views/work/HIVE\{1.5.0\}/WEB-INF/lib/hadoop-common-2.7.3.2.5.0.0-1245.jar into a temp dir: /tmp
b) Unzip /tmp/hadoop-common-2.7.3.2.5.0.0-1245.jar
c) Extract the core-default.xml file from the above jar and add the following entry to it (a combined sketch of steps a-c follows the property block):
<property>
<name>dfs.webhdfs.user.provider.user.pattern</name>
<value>^[A-Za-z0-9_][A-Za-z0-9._-]*[$]?$</value>
<description>
Valid pattern for user and group names for webhdfs, it must be a valid java regex.
</description>
</property>
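Putting steps a) to c) together, the extraction might look like the sketch below; the JDK's jar tool is assumed to be on the PATH:
$ cp /var/lib/ambari-server/resources/views/work/HIVE\{1.5.0\}/WEB-INF/lib/hadoop-common-2.7.3.2.5.0.0-1245.jar /tmp
$ cd /tmp
$ jar xf hadoop-common-2.7.3.2.5.0.0-1245.jar core-default.xml   # extracts only core-default.xml
$ vi core-default.xml   # add the property shown above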
d) Now create a "classes" directory in /var/lib/ambariserver/resources/views/work/HIVE\{1.5.0\}/WEB-INF
and move the edited core-default.xml inside it.
Example:
$ cd /var/lib/ambari-server/resources/views/work/HIVE\{1.5.0\}/WEB-INF
$ mkdir classes
$ cp -f /tmp/core-default.xml /var/lib/ambari-server/resources/views/work/HIVE\{1.5.0\}/WEB-INF/classes/
e) Then restart the ambari-server.
$ ambari-server restart
Now, I can see it working fine for user 12345678.
02-10-2017
11:50 AM
2 Kudos
@Narasimha K From the existing notebook, you can unbind it. If you would like to completely remove it, try doing this:
- log in to the Zeppelin UI with admin privileges
- go to <username> (top-right corner) -> Interpreters
- search for the one you are interested in and remove it
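Alternatively, Zeppelin exposes interpreter settings over its REST API, so the removal can be scripted. A sketch (host, port, and <settingId> are placeholders):
$ curl -s http://<ZEPPELIN_HOST>:9995/api/interpreter/setting                         # list settings and note the id
$ curl -s -X DELETE http://<ZEPPELIN_HOST>:9995/api/interpreter/setting/<settingId>   # remove that interpreter setting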
02-09-2017
04:34 PM
Environment
HDP 2.5.x, Ambari 2.4.x
Problem
I need to use anaconda for %livy.pyspark. Now, it is using the default python2.6:
%livy.pyspark
import sys
print(sys.path)
-----------------------------------------
['/var/hadoop/b/yarn/local/usercache/<user>/appcache/application_1483612761447_100542/container_e11_1483612761447_100542_01_000001/tmp', u'/var/hadoop/b/yarn/local/usercache/<user>/appcache/application_1483612761447_100542/spark-0e25c417-e8c0-4b60-b167-789dc3293bd7/userFiles-a1d6eced-0de1-4e3d-9bc1-f5aea925915d/py4j-0.9-src.zip', u'/var/hadoop/b/yarn/local/usercache/<user>/appcache/application_1483612761447_100542/spark-0e25c417-e8c0-4b60-b167-789dc3293bd7/userFiles-a1d6eced-0de1-4e3d-9bc1-f5aea925915d/pyspark.zip', u'/var/hadoop/b/yarn/local/usercache/<user>/appcache/application_1483612761447_100542/spark-0e25c417-e8c0-4b60-b167-789dc3293bd7/userFiles-a1d6eced-0de1-4e3d-9bc1-f5aea925915d', '/var/hadoop/b/yarn/local/usercache/<user>/appcache/application_1483612761447_100542/container_e11_1483612761447_100542_01_000001/pyspark.zip', '/var/hadoop/b/yarn/local/usercache/<user>/appcache/application_1483612761447_100542/container_e11_1483612761447_100542_01_000001/py4j-0.9-src.zip', '/usr/lib64/python26.zip', '/usr/lib64/python2.6', '/usr/lib64/python2.6/plat-linux2', '/usr/lib64/python2.6/lib-tk', '/usr/lib64/python2.6/lib-old', '/usr/lib64/python2.6/lib-dynload', '/usr/lib64/python2.6/site-packages', '/usr/lib64/python2.6/site-packages/gtk-2.0', '/usr/lib/python2.6/site-packages']
How can I get this working with python-3.5?
Solution
Based on https://issues.apache.org/jira/browse/ZEPPELIN-1609, there is a new pyspark interpreter implemented within livy - %livy.pyspark3. It is delivered in Zeppelin 0.7, which comes with HDP 2.6. For now, do the following:
> go to Ambari UI -> Spark -> Config -> Advanced livy-env -> content
-> set: export PYSPARK_PYTHON=/opt/anaconda/bin/python (path to the new python version)
-> set: export PYSPARK_DRIVER_PYTHON=/opt/anaconda/bin/python (path to the new python version)
> save the changes
> Restart all required services
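After the restart, a quick sanity check that the configured python actually exists on every YARN worker node might look like this (the worker hostnames below are placeholders):
$ for host in worker1 worker2 worker3; do ssh "$host" '/opt/anaconda/bin/python --version'; done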
02-09-2017
12:07 PM
4 Kudos
Overview
R is currently supported through the livy interpreter by running %livy.sparkr.
Environment
I did test the below solution with:
- HDP 2.5.0.0-1245 and Ambari 2.4.0.1
- HDP 2.5.3.0-37 and Ambari 2.4.2.0
Steps To Follow
1. Install R on all YARN worker nodes (a loop covering all nodes is sketched after step 2):
$ yum install R-devel libcurl-devel openssl-devel
2. Confirm R works fine:
[root@dkhdp251 zeppelin]# R -e "print(1+1)"
R version 3.3.2 (2016-10-31) -- "Sincere Pumpkin Patch"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-redhat-linux-gnu (64-bit)
R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.
Natural language support but running in an English locale
R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.
Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.
> print(1+1)
[1] 2
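Since steps 1 and 2 have to be repeated on every YARN worker node, a loop like the following can save time (hostnames are placeholders):
$ for host in worker1 worker2 worker3; do ssh "$host" 'yum install -y R-devel libcurl-devel openssl-devel && R -e "print(1+1)"'; done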
3. Confirm R works fine from Zeppelin.
4. Install the following packages from Zeppelin: data.table, base64enc, knitr and ggplot2:
%livy.sparkr
install.packages('data.table', repos = 'http://cran.us.r-project.org')
install.packages('base64enc', repos = 'http://cran.us.r-project.org')
install.packages('knitr', repos = 'http://cran.us.r-project.org')
install.packages('ggplot2', repos = 'http://cran.us.r-project.org')
5. Confirm the packages have been installed:
%livy.sparkr
library(data.table)
library(base64enc)
library(knitr)
library(ggplot2)
The above should run and finish returning nothing in the output.
6. Run the following code to build the graphics:
%livy.sparkr
library(data.table)
library(ggplot2)
library(knitr)
set.seed(42)
# generate sample data
dat <- rbind(data.table(gender="female",value=rnorm(1e4)),
data.table(gender="male",value=rnorm(1e4,2,1))
)
# plot
p1 <- ggplot(dat,aes(x=value,color=gender)) + geom_density()
# save to tmp file
ggsave(filename="/tmp/myplot.png", plot=p1)
# get base64 of the image for display in html
printImageURI<-function(file){
uri=image_uri(file)
file.remove(file)
cat(sprintf("%%html <img width='700' src=\"%s\" />\n", uri))
}
printImageURI("/tmp/myplot.png")
This will generate the density plot as an inline image in the notebook output.
02-07-2017
04:31 PM
5 Kudos
ENVIRONMENT and SETUP
I did test the below solution with:
- HDP 2.5.0.0-1245 and Ambari 2.4.0.1
- HDP 2.5.3.0-37 and Ambari 2.4.2.0
Shiro.ini
[users]
# List of users with their password allowed to access Zeppelin.
# To use a different strategy (LDAP / Database / ...) check the shiro doc at http://shiro.apache.org/configuration.html#Configuration-INISections
admin = password, admin
maria_dev = password, admin
user1 = password, role1
user2 = password, role2
user3 = password, admin
[main]
#activeDirectoryRealm = org.apache.zeppelin.server.ActiveDirectoryGroupRealm
#activeDirectoryRealm.systemUsername = CN=Administrator,CN=Users,DC=HW,DC=EXAMPLE,DC=COM
#activeDirectoryRealm.systemPassword = Password1!
#activeDirectoryRealm.hadoopSecurityCredentialPath = jceks://user/zeppelin/zeppelin.jceks
#activeDirectoryRealm.searchBase = CN=Users,DC=HW,DC=TEST,DC=COM
#activeDirectoryRealm.url = ldap://ad-nano.test.example.com:389
#activeDirectoryRealm.groupRolesMap = ""
#activeDirectoryRealm.authorizationCachingEnabled = true
#ldapRealm = org.apache.shiro.realm.ldap.JndiLdapRealm
#ldapRealm.userDnTemplate = uid={0},cn=users,cn=accounts,dc=example,dc=com
#ldapRealm.contextFactory.url = ldap://ldaphost:389
#ldapRealm.contextFactory.authenticationMechanism = SIMPLE
sessionManager = org.apache.shiro.web.session.mgt.DefaultWebSessionManager
securityManager.sessionManager = $sessionManager
# 86,400,000 milliseconds = 24 hour
#securityManager.sessionManager.globalSessionTimeout = 86400000
shiro.loginUrl = /api/login
[urls]
# anon means the access is anonymous.
# authcBasic means Basic Auth Security
# To enforce security, comment the line below and uncomment the next one
#/api/version = anon
#/** = anon
/api/interpreter/** = authc, roles[admin]
/api/configurations/** = authc, roles[admin]
/api/credential/** = authc, roles[admin]
/** = authc
Configure the JDBC interpreter for Hive as follows:
- Zeppelin UI -> Interpreter -> JDBC -> hive.url: use the URL from Ambari -> Hive -> HiveServer2 JDBC URL, like jdbc:hive2://dkhdp253.dk:2181,dkhdp252.dk:2181,dkhdp251.dk:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
- "User Impersonate" under the JDBC interpreter is to be unchecked
- In the Hive config, ensure hive.server2.enable.doAs is set to TRUE
Dependencies in the JDBC interpreter:
- org.apache.hive:hive-jdbc:2.0.1
- org.apache.hadoop:hadoop-common:2.7.2
- org.apache.hive.shims:hive-shims-0.23:2.1.0
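Before testing from Zeppelin, it can help to verify the same JDBC URL from the command line with beeline (which ships with HDP); a sketch using the URL above:
$ beeline -u "jdbc:hive2://dkhdp253.dk:2181,dkhdp252.dk:2181,dkhdp251.dk:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" -n user3 -e "select 1;"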
PROBLEM
When initially running a query through %jdbc(hive) I am getting:
org.apache.hive.service.cli.HiveSQLException: Failed to validate proxy privilege of zeppelin for user3
at org.apache.hive.service.auth.HiveAuthFactory.verifyProxyAccess(HiveAuthFactory.java:396)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getProxyUser(ThriftCLIService.java:751)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getUserName(ThriftCLIService.java:386)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:413)
at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:316)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1257)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1242)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:562)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.authorize.AuthorizationException: User: zeppelin is not allowed to impersonate user3
at org.apache.hadoop.security.authorize.DefaultImpersonationProvider.authorize(DefaultImpersonationProvider.java:119)
at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:102)
at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:116)
at org.apache.hive.service.auth.HiveAuthFactory.verifyProxyAccess(HiveAuthFactory.java:392)
... 13 more
The fix is to add the following lines in HDFS Service -> Configs -> "Custom core-site":
hadoop.proxyuser.zeppelin.hosts=*
hadoop.proxyuser.zeppelin.groups=*
Next, running a query in the JDBC interpreter (e.g. Hive) as "user3" returns the following in hiveserver2.log:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=user3, access=WRITE, inode="/user/user3":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1811)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1794)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4011)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1102)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:630)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
SOLUTION
As the first step to sort out the problem, I create a dedicated user folder in HDFS:
[root@dkhdp252 hive]# hdfs dfs -mkdir /user/user3
[root@dkhdp252 hive]# hdfs dfs -chown user3:hdfs /user/user3
[root@dkhdp252 hive]# hdfs dfs -chmod 755 /user/user3
[root@dkhdp252 hive]# hdfs dfs -ls /user
Found 12 items
drwxr-xr-x - admin hdfs 0 2016-12-10 07:49 /user/admin
drwxrwx--- - ambari-qa hdfs 0 2017-01-30 15:32 /user/ambari-qa
drwxr-xr-x - hcat hdfs 0 2016-11-29 09:25 /user/hcat
drwxr-xr-x - hdfs hdfs 0 2016-12-06 08:04 /user/hdfs
drwxr-xr-x - hive hdfs 0 2017-02-06 14:23 /user/hive
drwxrwxr-x - livy hdfs 0 2016-11-29 09:52 /user/livy
drwxr-xr-x - maria_dev hdfs 0 2017-02-07 15:53 /user/maria_dev
drwxrwxr-x - oozie hdfs 0 2016-12-09 16:05 /user/oozie
drwxrwxr-x - spark hdfs 0 2016-11-29 16:30 /user/spark
drwxr-xr-x - user3 hdfs 0 2017-02-07 16:01 /user/user3
drwxr-xr-x - zeppelin hdfs 0 2016-11-29 16:17 /user/zeppelin
Next, restart the JDBC interpreter. Now, when running the same query again, I can see the job start up in the RM UI; however, checking the application log I can see:
Application application_1486481563532_0002 failed 2 times due to AM Container for appattempt_1486481563532_0002_000002 exited with exitCode: -1000
For more detailed output, check the application tracking page: http://dkhdp253.dk:8088/cluster/app/application_1486481563532_0002 Then click on links to logs of each attempt.
Diagnostics: Application application_1486481563532_0002 initialization failed (exitCode=255) with output: main : command provided 0
main : run as user is user3
main : requested yarn user is user3
User user3 not found
Failing this attempt. Failing the application.
The next step is to create "user3" on all the worker nodes, like:
$ adduser user3
(a loop sketch covering all nodes follows below). Then restart the JDBC interpreter. Now, re-running the query, I can see the process run and complete successfully.
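As with the HDFS step, creating the OS account on every worker node can be scripted; a sketch with placeholder hostnames:
$ for host in worker1 worker2 worker3; do ssh "$host" 'adduser user3'; done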
02-04-2017
08:20 AM
5 Kudos
PROBLEM
After restarting the Zeppelin service from the UI I have the following situation: the Zeppelin service runs fine ("green" in Ambari), but going to http://<ZEPPELIN_HOST>:9995 I am getting an error page. Checking the zeppelin-zeppelin-<ZEPPELIN_HOST>.log file I can see:
INFO [2017-02-04 07:47:09,081] ({main} ZeppelinServer.java[main]:114) - Starting zeppelin server
INFO [2017-02-04 07:47:09,084] ({main} Server.java[doStart]:327) - jetty-9.2.15.v20160210
WARN [2017-02-04 07:47:09,129] ({main} WebAppContext.java[doStart]:514) - Failed startup of context o.e.j.w.WebAppContext@3b938003{/,null,null}{/usr/hdp/current/zeppelin-server/lib/zeppelin-web-0.6.0.2.5.0.0-1245.war}
java.lang.IllegalStateException: Failed to delete temp dir /usr/hdp/2.5.0.0-1245/zeppelin/webapps
at org.eclipse.jetty.webapp.WebInfConfiguration.configureTempDirectory(WebInfConfiguration.java:372)
at org.eclipse.jetty.webapp.WebInfConfiguration.resolveTempDirectory(WebInfConfiguration.java:260)
at org.eclipse.jetty.webapp.WebInfConfiguration.preConfigure(WebInfConfiguration.java:69)
at org.eclipse.jetty.webapp.WebAppContext.preConfigure(WebAppContext.java:468)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:504)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:163)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:132)
at org.eclipse.jetty.server.Server.start(Server.java:387)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:114)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:354)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.zeppelin.server.ZeppelinServer.main(ZeppelinServer.java:116)
SOLUTION
- Log into the Zeppelin host as root.
- Check the ownership and permissions of /usr/hdp/2.5.0.0-1245/zeppelin (this is the path from the log above):
[root@dkhdp251 zeppelin]# ls -lrt /usr/hdp/2.5.0.0-1245/zeppelin
total 60
-rwxr-xr-x. 1 zeppelin zeppelin 8547 Aug 26 03:50 README.md
-rwxr-xr-x. 1 zeppelin zeppelin 15308 Aug 26 03:50 LICENSE
drwxr-xr-x. 2 zeppelin zeppelin 4096 Nov 29 15:46 bin
drwxr-xr-x. 8 zeppelin zeppelin 4096 Nov 29 15:46 interpreter
drwxr-xr-x. 2 zeppelin zeppelin 12288 Nov 29 15:47 lib
lrwxrwxrwx. 1 root root 28 Nov 29 15:48 conf -> /etc/zeppelin/2.5.0.0-1245/0
drwxr-xr-x. 41 zeppelin hadoop 4096 Nov 29 16:12 local-repo
drwxr-xr-x. 32 zeppelin zeppelin 4096 Feb 2 11:43 notebook
drwxr-xr-x. 3 root root 4096 Feb 4 07:43 webapps
- Make sure the permissions on the "webapps" folder are 755.
- The ownership of "webapps" and its sub-folders needs to be zeppelin:hadoop (or the hadoop group you have defined in your cluster). So, do the following:
$ chown -R zeppelin:hadoop /usr/hdp/2.5.0.0-1245/zeppelin/webapps
- Now, restart the Zeppelin service from Ambari.
Now, opening http://<ZEPPELIN_HOST>:9995 again, I can see the Zeppelin UI load fine.
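If the permission bits (not just the ownership) were wrong, resetting them is one more command; a sketch assuming the same install path as above:
$ chmod 755 /usr/hdp/2.5.0.0-1245/zeppelin/webapps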
02-02-2017
07:37 AM
5 Kudos
OVERVIEW
This facility is available in Zeppelin 0.7.0 based on the JIRA: https://issues.apache.org/jira/browse/ZEPPELIN-1320. The below steps allow you to get this working in an HDP 2.5 environment with Zeppelin 0.6.
ENVIRONMENT and SETUP
I did test the below solution with:
- HDP 2.5.0.0-1245 and Ambari 2.4.0.1
Shiro.ini
[users]
# List of users with their password allowed to access Zeppelin.
# To use a different strategy (LDAP / Database / ...) check the shiro doc at http://shiro.apache.org/configuration.html#Configuration-INISections
admin = password
user1 = password, role1, role2
user2 = password, role3
#user3 = password4, role2
# Sample LDAP configuration, for user Authentication, currently tested for single Realm
[main]
#activeDirectoryRealm = org.apache.zeppelin.server.ActiveDirectoryGroupRealm
#activeDirectoryRealm.systemUsername = CN=Administrator,CN=Users,DC=HW,DC=EXAMPLE,DC=COM
#activeDirectoryRealm.systemPassword = Password1!
#activeDirectoryRealm.hadoopSecurityCredentialPath = jceks://user/zeppelin/zeppelin.jceks
#activeDirectoryRealm.searchBase = CN=Users,DC=HW,DC=TEST,DC=COM
#activeDirectoryRealm.url = ldap://ad-nano.test.example.com:389
#activeDirectoryRealm.groupRolesMap = ""
#activeDirectoryRealm.authorizationCachingEnabled = true
#ldapRealm = org.apache.shiro.realm.ldap.JndiLdapRealm
#ldapRealm.userDnTemplate = uid={0},cn=users,cn=accounts,dc=example,dc=com
#ldapRealm.contextFactory.url = ldap://ldaphost:389
#ldapRealm.contextFactory.authenticationMechanism = SIMPLE
#sessionManager = org.apache.shiro.web.session.mgt.DefaultWebSessionManager
#securityManager.sessionManager = $sessionManager
# 86,400,000 milliseconds = 24 hour
#securityManager.sessionManager.globalSessionTimeout = 86400000
shiro.loginUrl = /api/login
[urls]
# anon means the access is anonymous.
# authcBasic means Basic Auth Security
# To enforce security, comment the line below and uncomment the next one
#/api/version = anon
#/** = anon
#/** = authc
/** = authcBasic
PROBLEM
In the current version of Zeppelin, enabling "User Impersonate" in the SH interpreter causes a connection refused error.
SOLUTION
To get "user2" working with the SH interpreter, follow the steps below:
1. Log into the Zeppelin node as ROOT and create a "user2" user:
[root@dkhdp251 ~]# adduser user2
[root@dkhdp251 ~]# passwd user2
Changing password for user user2.
New password:
BAD PASSWORD: it is based on a dictionary word
Retype new password:
passwd: all authentication tokens updated successfully.
[root@dkhdp251 ~]#
2. Log in to the Zeppelin node as the zeppelin user to set up passwordless authentication for "user2":
[root@dkhdp251 ~]# su - zeppelin
[zeppelin@dkhdp251 ~]$ ssh-keygen (this is to be run only once for all the users)
Generating public/private rsa key pair.
Enter file in which to save the key (/home/zeppelin/.ssh/id_rsa):
/home/zeppelin/.ssh/id_rsa already exists.
Overwrite (y/n)? n (As I already have keygen generated, I skipped this step for presentation)
[zeppelin@dkhdp251 ~]$ ssh user2@dkhdp251.dk mkdir -p .ssh
user2@dkhdp251.dk's password:
[zeppelin@dkhdp251 ~]$
[zeppelin@dkhdp251 ~]$
[zeppelin@dkhdp251 ~]$ cat ~/.ssh/id_rsa.pub | ssh user2@dkhdp251.dk 'cat >> .ssh/authorized_keys'
user2@dkhdp251.dk's password:
[zeppelin@dkhdp251 ~]$
[zeppelin@dkhdp251 ~]$
[zeppelin@dkhdp251 ~]$ ssh user2@dkhdp251.dk "chmod 700 .ssh; chmod 640 .ssh/authorized_keys"
user2@dkhdp251.dk's password:
[zeppelin@dkhdp251 ~]$
[zeppelin@dkhdp251 ~]$
[zeppelin@dkhdp251 ~]$ ssh user2@dkhdp251.dk
[user2@dkhdp251 ~]$
[user2@dkhdp251 ~]$
[user2@dkhdp251 ~]$ exit
logout
Connection to dkhdp251.dk closed.
[zeppelin@dkhdp251 ~]$
[zeppelin@dkhdp251 ~]$
[zeppelin@dkhdp251 ~]$ ssh user2@localhost
Last login: Thu Feb 2 06:37:26 2017 from 172.26.81.190
[user2@dkhdp251 ~]$
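On systems where ssh-copy-id is available, the manual mkdir/cat/chmod sequence above can be condensed into a single command; a sketch:
[zeppelin@dkhdp251 ~]$ ssh-copy-id user2@dkhdp251.dk   # copies ~/.ssh/id_rsa.pub and sets the permissions in one step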
3. Log in to the Zeppelin UI.
a) First, log in as the user with access to interpreters:
- go to Interpreters
- edit the SH interpreter
- enable "User Impersonate"
- save the changes
- restart the SH interpreter
b) Log in as "user2" and run a script from a notebook. This fails because "user2" does not have the right permission to access the zeppelin keytab. For now, do the following:
- go to Interpreters
- edit the SH interpreter
- remove the following properties:
zeppelin.shell.auth.type
zeppelin.shell.keytab.location
zeppelin.shell.principal
- save the changes
- restart the interpreter
Now the sh code runs fine as "user2".
NOTE: Restarting Zeppelin will add the removed properties back to the SH interpreter. To get around it:
- log in as ROOT to the Zeppelin node
$ ls -lrt /etc/security/keytabs/zeppelin.server.kerberos.keytab (this will tell you the <GROUP>)
$ usermod -a -G <GROUP> user2 (add the user to zeppelin's keytab group)
$ chmod 440 /etc/security/keytabs/zeppelin.server.kerberos.keytab (make sure the above keytab has read permission for the group (440))
- Restart the SH interpreter
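A quick way to confirm the group change took effect (a sketch; <GROUP> is whatever the ls above reported):
$ id user2                                                     # should now list <GROUP>
$ ls -l /etc/security/keytabs/zeppelin.server.kerberos.keytab  # should show read permission for the group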
02-01-2017
11:38 AM
2 Kudos
Permissions are available under an existing notebook -> padlock button. There are three different permissions:
Owners - the owner(s) of the notebook. This permission allows owners to give further read/write permissions to that notebook.
Readers - these users can only read the scripts created in that notebook. They can neither execute nor edit the scripts.
Writers - these users can edit the scripts in the notebook. With that permission the users can also read the notebook, hence there is no need to add them as Readers as well.
For Reader users: when they try to execute or edit a script, the editor screen comes up after a few seconds, but the content they create is not saved.
01-30-2017
12:52 PM
STATEMENT: The Ambari audit log file is generated from Ambari ver. 2.4 onwards - https://issues.apache.org/jira/browse/AMBARI-15241.
PROBLEM: How can I enable the Ambari audit in Ambari ver. 2.2.2?
SOLUTION: To get auditing in Ambari 2.2.2, follow the instructions below. The idea is to change the log level to DEBUG for the PersistKeyValueService class.
- log into the ambari node
- edit /etc/ambari-server/conf/log4j.properties
- set (or add) the following property:
log4j.logger.org.apache.ambari.server.api.services.PersistKeyValueService=DEBUG
- save the changes
- run: $ ambari-server restart
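Scripted, the whole change might look like this (a sketch; it backs up the file before appending):
$ cp /etc/ambari-server/conf/log4j.properties /etc/ambari-server/conf/log4j.properties.bak
$ echo 'log4j.logger.org.apache.ambari.server.api.services.PersistKeyValueService=DEBUG' >> /etc/ambari-server/conf/log4j.properties
$ ambari-server restart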