Member since
06-13-2016
76
Posts
13
Kudos Received
6
Solutions
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
| | 2048 | 08-09-2017 06:54 PM |
| | 2938 | 05-03-2017 02:25 PM |
| | 4093 | 03-28-2017 01:56 PM |
| | 4250 | 09-26-2016 09:05 PM |
| | 2872 | 09-22-2016 03:49 AM |
10-12-2016
02:43 PM
@santoshsb They are already configured to point to the nameservice URI. See my first screenshot:

hive --service metatool -listFSRoot
Listing FS Roots..
hdfs://cluster1/apps/hive/warehouse/test2.db
hdfs://cluster1/apps/hive/warehouse/raw.db
hdfs://cluster1/apps/hive/warehouse/test.db
hdfs://cluster1/apps/hive/warehouse
hdfs://cluster1/apps/hive/warehouse/lookup.db
10-11-2016
10:36 PM
Hello, after enabling HA on the NameNode, Hive is unable to access the databases whose metastore FS roots were updated to the new nameservice path:

Error: Error while compiling statement: FAILED: SemanticException java.lang.IllegalArgumentException: java.net.UnknownHostException: cluster1 (state=42000,code=40000)

The Hive metastore tool shows:

hive --service metatool -listFSRoot
Listing FS Roots..
hdfs://cluster1/apps/hive/warehouse/test2.db
hdfs://cluster1/apps/hive/warehouse/raw.db
hdfs://cluster1/apps/hive/warehouse/test.db
hdfs://cluster1/apps/hive/warehouse
hdfs://cluster1/apps/hive/warehouse/lookup.db

cluster1 is the correct fs.defaultFS that was set up during HA:

<property>
<name>fs.defaultFS</name>
<value>hdfs://cluster1</value>
<final>true</final>
</property>

If I create a new database in Hive, it gets created using the actual NameNode host name:

0: jdbc:hive2://ip-10-555-2-555.ec2.internal:> create database test3;
No rows affected (0.148 seconds)

Listing FS Roots..
hdfs://ip-10-123-5-42.ec2.internal:8020/apps/hive/warehouse/test3.db

Any ideas on what could be missing? The HDFS client works fine with the HA name.
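One remedy often suggested for this situation (not confirmed in this thread, so treat it as a sketch) is to rewrite the stored FS roots with Hive's metatool. The host names below are the ones from the listing above; -dryRun previews the change without applying it, and the leading echo makes the sketch runnable without a Hive installation - drop it to execute for real:

```shell
# Sketch only: rewrite metastore FS roots from the pre-HA NameNode URI
# to the HA nameservice URI. Host names are taken from the post above.
OLD_ROOT="hdfs://ip-10-123-5-42.ec2.internal:8020"   # pre-HA NameNode URI
NEW_ROOT="hdfs://cluster1"                           # HA nameservice URI

# -dryRun previews the rewrite without modifying the metastore.
echo hive --service metatool -updateLocation "$NEW_ROOT" "$OLD_ROOT" -dryRun
```

Separately, the UnknownHostException on cluster1 usually means the client resolving hdfs://cluster1 is missing the dfs.nameservices/dfs.ha.* settings in its hdfs-site.xml, so it is worth confirming HiveServer2 and the metastore are reading the HA-aware client configuration.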
Labels:
- Apache Hadoop
- Apache Hive
10-04-2016
10:47 PM
Hello, I am trying to access Hive JDBC through Knox on a secured (Kerberos) cluster. When accessing the services directly, everything works fine. I am able to connect to HiveServer2 directly in HTTP mode, passing the Kerberos principal and creating a Kerberos ticket:

beeline -u 'jdbc:hive2://<hive_server>:10001/;transportMode=http;httpPath=cliservice;principal=hive/_HOST@DEV.COM'

and I can access WebHDFS fine by connecting directly to the NameNode:

curl -i --negotiate -u : 'http://<namenode>:50070/webhdfs/v1/?op=LISTSTATUS'

Going through the Knox gateway (using the sample LDAP for simplicity), I get:

curl -iku guest:guest-password -X GET 'https://<knox_gateway>:8443/gateway/default/webhdfs/v1/?op=LISTSTATUS'

<html><head><meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
<title>Error 401 Authentication required</title>
</head><body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /webhdfs/v1/.
Reason:<pre> Authentication required</pre></p><hr/><i><small>Powered by Jetty://</small></i><br/>

In gateway-audit I do see the request getting translated to the actual internal request, but it returns 401:

audit|WEBHDFS||||access|uri|/gateway/default/webhdfs/v1/?op=LISTSTATUS|unavailable|Request method: GET
audit|WEBHDFS|guest|||authentication|uri|/gateway/default/webhdfs/v1/?op=LISTSTATUS|success|
audit|WEBHDFS|guest|||authentication|uri|/gateway/default/webhdfs/v1/?op=LISTSTATUS|success|Groups: []
audit|WEBHDFS|guest|||authorization|uri|/gateway/default/webhdfs/v1/?op=LISTSTATUS|success|
audit|WEBHDFS|guest|||dispatch|uri|http://<name_node>:50070/webhdfs/v1/?op=LISTSTATUS&doAs=guest|unavailable|Request method: GET
audit|WEBHDFS|guest|||dispatch|uri|http://<name_node>:50070/webhdfs/v1/?op=LISTSTATUS&doAs=guest|success|Response status: 401
audit|WEBHDFS|guest|||access|uri|/gateway/default/webhdfs/v1/?op=LISTSTATUS|success|Response status: 401

Similarly with Hive, I can connect to HiveServer2 directly, but when I attempt it through Knox I get:

16/10/04 22:31:34 [main]: ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.transport.TTransportException: HTTP Response code: 401

In the HiveServer2 logs:

2016-10-04 22:31:34,063 INFO [HiveServer2-HttpHandler-Pool: Thread-299]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doKerberosAuth(398)) - Failed to authenticate with http/_HOST kerberos principal, trying with hive/_HOST kerberos principal
2016-10-04 22:31:34,063 ERROR [HiveServer2-HttpHandler-Pool: Thread-299]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doKerberosAuth(406)) - Failed to authenticate with hive/_HOST kerberos principal
2016-10-04 22:31:34,064 ERROR [HiveServer2-HttpHandler-Pool: Thread-299]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doPost(209)) - Error: org.apache.hive.service.auth.HttpAuthenticationException: java.lang.reflect.UndeclaredThrowableException at 
org.apache.hive.service.cli.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:407) at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:159) at javax.servlet.http.HttpServlet.service(HttpServlet.java:727) at javax.servlet.http.HttpServlet.service(HttpServlet.java:820) at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:565) at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:479) at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:225) at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1031) at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:406) at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:186) at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:965) at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117) at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:111) at org.eclipse.jetty.server.Server.handle(Server.java:349) at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:449) at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:925) at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:952) at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235) at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:76) at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:609) at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:45) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:745) Caused by: java.lang.reflect.UndeclaredThrowableException at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742) at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:404) ... 23 more
Caused by: org.apache.hive.service.auth.HttpAuthenticationException: Authorization header received from the client is empty. at org.apache.hive.service.cli.thrift.ThriftHttpServlet.getAuthHeader(ThriftHttpServlet.java:548) at org.apache.hive.service.cli.thrift.ThriftHttpServlet.access$100(ThriftHttpServlet.java:74) at org.apache.hive.service.cli.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:449) at org.apache.hive.service.cli.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:412) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:415) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724) ... 24 more

FYI, I have:

hadoop.proxyuser.knox.hosts=<knox_gateway>
hadoop.proxyuser.knox.groups=*

Thanks for any help!
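For what it's worth, when Knox fronts a Kerberized cluster, the gateway itself must hold Kerberos credentials so the requests it dispatches carry a SPNEGO token; "Authorization header received from the client is empty" and the 401s on dispatch above are consistent with that being missing. A sketch of the relevant gateway-site.xml settings follows - the file paths are assumptions based on a typical HDP layout, and the JAAS file must reference the Knox service keytab:

```xml
<!-- Sketch only: enable secure-cluster dispatch in gateway-site.xml.
     Paths are typical HDP locations; adjust to your installation. -->
<property>
  <name>gateway.hadoop.kerberos.secured</name>
  <value>true</value>
</property>
<property>
  <name>java.security.krb5.conf</name>
  <value>/etc/krb5.conf</value>
</property>
<property>
  <name>java.security.auth.login.config</name>
  <value>/usr/hdp/current/knox-server/conf/krb5JAASLogin.conf</value>
</property>
```

After changing these, Knox needs a restart so the JAAS login takes effect.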
Labels:
- Apache Hive
- Apache Knox
09-28-2016
07:20 PM
They are - I use the same credentials for the UIs that go through the Knox gateway. Also, in the Knox gateway log I see:
2016-09-28 19:19:06,039 INFO hadoop.gateway (AclsAuthorizationFilter.java:doFilter(85)) - Access Granted: true
09-28-2016
06:24 PM
@skothari Getting the below:

16/09/28 18:23:43 [main]: ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.transport.TTransportException: HTTP Response code: 401 at org.apache.thrift.transport.THttpClient.flushUsingHttpClient(THttpClient.java:262) at org.apache.thrift.transport.THttpClient.flush(THttpClient.java:313) at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73) at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62) at org.apache.hive.service.cli.thrift.TCLIService$Client.send_OpenSession(TCLIService.java:154) at org.apache.hive.service.cli.thrift.TCLIService$Client.OpenSession(TCLIService.java:146) at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:552) at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:170) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:571) at java.sql.DriverManager.getConnection(DriverManager.java:187) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:146) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:211) at org.apache.hive.beeline.Commands.connect(Commands.java:1190) at org.apache.hive.beeline.Commands.connect(Commands.java:1086) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52) at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:989) at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:832) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:790) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:490) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:473) at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.hadoop.util.RunJar.run(RunJar.java:233) at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
HTTP Response code: 401 (state=08S01,code=0)
09-28-2016
04:26 PM
Hello, I am trying to connect to the Hive server JDBC through Knox with Kerberos authentication enabled. I was able to connect through Knox before, but after enabling Kerberos I'm having issues. Prior to Kerberos this worked:

jdbc:hive2://<knox_host>:8443/;ssl=true;sslTrustStore=/knox/gateway.jks;trustStorePassword=knox?hive.server2.transport.mode=http;hive.server2.thrift.http.path=gateway/default/hive

Connecting directly without Knox:

!connect jdbc:hive2://<hiveserver_host>:10001/default;principal=hive/_HOST@REALM.COM;transportMode=http;httpPath=cliservice

I've tried many different JDBC connection string combinations with no success. Is principal=hive/_HOST@REALM.COM required? The last one I tried was:

jdbc:hive2://<knox_host>:8443/;ssl=false;httpPath=gateway/default/hive;transportMode=http;sslTrustStore=/knox/gateway.jks;trustStorePassword=knox

which gave me:

org.apache.thrift.transport.TTransportException: org.apache.http.NoHttpResponseException: ec2-54-85-108-57.compute-1.amazonaws.com:8443 failed to respond
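As a general note (a sketch under my assumptions, not a confirmed fix for this cluster): when the connection goes through Knox, authentication is handled by Knox's LDAP provider as HTTP Basic auth, so the principal= parameter is not part of the URL; instead the LDAP username and password are passed to beeline, and ssl=true is needed because Knox listens on HTTPS. Parameter names vary by Hive client version - newer clients use transportMode/httpPath, older ones the ?hive.server2.* form shown above. The echo keeps this runnable without a cluster; drop it to connect:

```shell
# Sketch: beeline through Knox with LDAP Basic auth. <knox_host>,
# ldapuser, and ldappassword are placeholders, not values from the post.
URL='jdbc:hive2://<knox_host>:8443/;ssl=true;sslTrustStore=/knox/gateway.jks;trustStorePassword=knox;transportMode=http;httpPath=gateway/default/hive'
echo beeline -u "$URL" -n ldapuser -p ldappassword
```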
Labels:
- Apache Hive
- Apache Knox
09-27-2016
06:18 PM
1 Kudo
Hello, I've created an HDF instance that authenticates with LDAP. The initial admin was set up using an SSL certificate, so I can get into the NiFi console as the admin user. I am trying to grant access to another non-admin user and getting the below error when logging in from another host that does not have a certificate:

2016-09-27 17:56:55,897 INFO [NiFi Web Server-28] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for cn=test user,ou=users,dc=hadoop,dc=com
2016-09-27 17:56:55,898 INFO [NiFi Web Server-28] o.a.n.w.a.c.AccessDeniedExceptionMapper cn=test user,ou=users,dc=hadoop,dc=com does not have permission to access the requested resource. Returning Forbidden response.

It looks like it's authenticating fine with my LDAP server but running into issues with authorization. In the NiFi console I've created that user, "cn=test user,ou=users,dc=hadoop,dc=com", and granted the "view the component" access policy. Here is the login-identity provider:

<property name="User Search Base">OU=users,DC=hadoop,DC=com</property>
<property name="User Search Filter">uid={0}</property>

and the results of ldapsearch:

# test user, users, instream.com
dn: cn=test user,ou=users,dc=hadoop,dc=com
uid: tuser

Am I creating the user incorrectly in NiFi, or are additional settings required in nifi.properties? Thanks
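One thing worth noting (an assumption about the cause, not confirmed in this thread): NiFi compares the authenticated identity string to its configured users exactly, character for character, so the user defined in NiFi must match the full DN unless an identity mapping normalizes it. A sketch of the nifi.properties identity-mapping settings, with the pattern written against the DN from the logs above:

```properties
# Sketch only: map "cn=test user,ou=users,dc=hadoop,dc=com" down to
# "test user" before authorization. The pattern is an assumption
# derived from the DN shown in the logs above.
nifi.security.identity.mapping.pattern.dn=^cn=(.*?),ou=users,dc=hadoop,dc=com$
nifi.security.identity.mapping.value.dn=$1
```

With a mapping like this in place, the user created in the NiFi UI would be named "test user" rather than the full DN; without it, the UI user entry must be the full DN string exactly as logged.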
Labels:
- Apache NiFi
09-26-2016
09:05 PM
The NiFi team has identified an issue with Hive scripts causing this processor to hang. These Hive commands run MapReduce or Tez jobs that produce a lot of standard output, which is returned to the NiFi processor. If the amount of stdout or stderr returned gets large, the processor can hang. To prevent this from happening, we recommend adding the -S option to hive commands, or --silent=true to beeline commands, that are executed using the NiFi script processors.
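Concretely, the commands configured in the processor would look like the following sketch (the script path and host name are hypothetical placeholders, not values from this thread):

```shell
# Sketch: the same invocations with output suppressed, so large
# MapReduce/Tez job logs don't flood the NiFi processor.
HIVE_CMD='hive -S -f /scripts/load.hql'
BEELINE_CMD='beeline -u jdbc:hive2://<hs2_host>:10001/ --silent=true -f /scripts/load.hql'
echo "$HIVE_CMD"
echo "$BEELINE_CMD"
```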
09-25-2016
01:00 AM
Hello, I have set up Knox to authenticate with our LDAP server and everything is working except access to the Hadoop UIs: users that are not part of the group I've defined in AclsAuthz are still able to log in. This works as expected when accessing the Knox API.

See below. In the Knox topology, I expect only users in the "knox" group to have access:

<provider>
<role>authorization</role>
<name>AclsAuthz</name>
<enabled>true</enabled>
<param name="knox.acl" value="*;knox;*"/>
</provider>

/bin/knoxcli.sh user-auth-test --cluster default --u mliem --p '*******' --g
LDAP authentication successful!
mliem is a member of: admin
mliem is a member of: knox
mliem is a member of: developers

/bin/knoxcli.sh user-auth-test --cluster default --u jdoe --p '*******' --g
LDAP authentication successful!
jdoe is a member of: developers

curl -u mliem:'*****' -ik 'https://<knox_ip>:8443/gateway/default/api/v1/version'
HTTP/1.1 200 OK

curl -u jdoe:'*****' -ik 'https://<knox_ip>:8443/gateway/default/api/v1/version'
HTTP/1.1 403 Forbidden

Now when I access the UIs as defined in my topology:

<service>
<role>YARNUI</role>
<url>http://{{rm_host}}:{{rm_port}}</url>
</service>

Both mliem (expected) and jdoe can access. Is there anything additional I need to add to my topology in order to leverage the groups I've defined in my LDAP server? Thanks
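A possible explanation (an assumption, not confirmed in this thread): AclsAuthz params are keyed per service role, so "knox.acl" would only cover the KNOX service - which matches the observed behavior of the API being enforced while the UIs are not. A sketch of a topology provider block that also declares an acl for the YARN UI role (the exact casing of the param key should match the service role name in the topology):

```xml
<!-- Sketch only: one acl param per service role that should be
     restricted. "knox.acl" covers only the KNOX service. -->
<provider>
  <role>authorization</role>
  <name>AclsAuthz</name>
  <enabled>true</enabled>
  <param name="knox.acl" value="*;knox;*"/>
  <param name="YARNUI.acl" value="*;knox;*"/>
</provider>
```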
Labels:
- Apache Knox