Member since: 08-15-2016
Posts: 33
Kudos Received: 6
Solutions: 4
My Accepted Solutions
Title | Views | Posted
---|---|---
| 9195 | 02-21-2017 10:56 AM
| 2417 | 01-18-2017 07:06 AM
| 3281 | 04-07-2016 04:15 PM
| 4020 | 04-04-2016 05:03 PM
10-24-2016
05:49 AM
JDBC Kerberos ticket. Let me know if you have any insight: http://community.cloudera.com/t5/Interactive-Short-cycle-SQL/impala-kerberosed-jdbc-connection-from-SQL-Workbench-on-Windows/m-p/46415#M2142
10-21-2016
07:33 AM
It would be great if the JDBC connection to a Kerberized cluster worked. I opened a separate ticket on that; let's see where that goes.
10-18-2016
07:35 AM
Yes, I have run kinit. In the MIT Kerberos desktop client I can see both the Kerberos ticket and the Windows domain account ticket. I also went to a command prompt and verified that C:\Program Files\MIT\Kerberos\bin\klist shows the ticket.
10-14-2016
03:21 PM
Hey guys - we have data where the timestamp field is of datatype string, and the values are non-zero-padded timestamps. For example:

1/1/2015 1:34:45 PM
1/10/2014 1:02:45 AM
11/1/2014 11:04:45 AM

When we do a cast as timestamp, we get a NULL result - the reason being that Impala expects the day and month values to be two digits, that is, zero-padded. Is there a regex function that will allow me to convert these non-zero-padded timestamps to zero-padded ones and then cast to the timestamp datatype?
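For what it's worth, the padding step the question asks about is a plain regular-expression substitution. Below is a minimal Python sketch of the transformation (the function name is made up for illustration):

```python
import re
from datetime import datetime

def zero_pad_timestamp(ts):
    """Zero-pad single-digit month/day/hour fields, e.g.
    '1/1/2015 1:34:45 PM' -> '01/01/2015 01:34:45 PM'."""
    # A lone digit immediately followed by '/' or ':' gets a leading zero.
    return re.sub(r'\b(\d)(?=[/:])', r'0\1', ts)

# Once padded, the value parses cleanly as a timestamp.
padded = zero_pad_timestamp('1/1/2015 1:34:45 PM')
parsed = datetime.strptime(padded, '%m/%d/%Y %I:%M:%S %p')
```

In Impala itself, the same idea could be applied with the built-in regexp_replace() before casting; treat the exact pattern as something to adapt, since Impala's regex engine (RE2) does not support lookahead, so the padding match would need to capture the following delimiter instead.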
Labels:
- Apache Impala
10-13-2016
08:12 AM
We are running a big query on Impala tables. Once the results appear and we try to export them to CSV, we get the following error page. Any ideas on what could be wrong here? We are using CDH 5.7.2 with Impala, Spark on YARN, Hue, and HDFS.

Traceback (most recent call last):
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/desktop/core/src/desktop/lib/wsgiserver.py", line 1215, in communicate
req.respond()
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/desktop/core/src/desktop/lib/wsgiserver.py", line 576, in respond
self._respond()
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/desktop/core/src/desktop/lib/wsgiserver.py", line 590, in _respond
for chunk in response:
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/build/env/lib/python2.7/site-packages/Django-1.6.10-py2.7.egg/django/utils/six.py", line 535, in next
return type(self).__next__(self)
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/build/env/lib/python2.7/site-packages/Django-1.6.10-py2.7.egg/django/http/response.py", line 292, in __next__
return self.make_bytes(next(self._iterator))
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/desktop/core/src/desktop/lib/export_csvxls.py", line 86, in create_generator
for headers, data in content_generator:
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/apps/beeswax/src/beeswax/data_export.py", line 71, in HS2DataAdapter
results = db.fetch(handle, start_over=start_over, rows=FETCH_SIZE)
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/apps/beeswax/src/beeswax/server/dbms.py", line 282, in fetch
return self.client.fetch(query_handle, start_over, rows)
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 1035, in fetch
data_table = self._client.fetch_data(operationHandle, orientation=orientation, max_rows=max_rows)
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 786, in fetch_data
results, schema = self.fetch_result(operation_handle, orientation, max_rows)
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 819, in fetch_result
schema = self.call(self._client.GetResultSetMetadata, meta_req)
File "/opt/cloudera/parcels/CDH-5.7.2-1.cdh5.7.2.p0.18/lib/hue/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 642, in call
raise QueryServerException(Exception('Bad status for request %s:\n%s' % (req, res)), message=message)
QueryServerException: Bad status for request TGetResultSetMetadataReq(operationHandle=TOperationHandle(hasResultSet=True, modifiedRowCount=None, operationType=0, operationId=THandleIdentifier(secret='\n%?\xfe\x0f\x0eFR\x88T\x16iM\xe2(\x03', guid='\n%?\xfe\x0f\x0eFR\x88T\x16iM\xe2(\x03'))):
TGetResultSetMetadataResp(status=TStatus(errorCode=None, errorMessage='Invalid query handle', sqlState='HY000', infoMessages=None, statusCode=3), schema=None)
10-12-2016
02:55 PM
This is what I see in the JDBC trace log files:

Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.DSIConnection(com.cloudera.impala.hivecommon.core.HiveJDBCEnvironment@1bbd56aa): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(101, Variant[type: TYPE_WSTRING, value: ImpalaJDBC]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(139, Variant[type: TYPE_WSTRING, value: User]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(22, Variant[type: TYPE_WSTRING, value: Impala]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(58, Variant[type: TYPE_WSTRING, value: `]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(66, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(68, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(76, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(81, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(83, Variant[type: TYPE_UINT16, value: -1]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.setProperty(80, Variant[type: TYPE_WSTRING, value: N]): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.dsi.core.impl.DSIConnection.registerWarningListener(com.cloudera.impala.jdbc.common.SWarningListener@157abbad): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.hivecommon.core.HiveJDBCConnection.updateConnectionSettings(): +++++ enter +++++
Oct 12 17:50:52.989 TRACE 27 com.cloudera.impala.hivecommon.core.HiveJDBCConnection.connect({AuthMech=Variant[type: TYPE_WSTRING, value: 1], ConnSchema=Variant[type: TYPE_WSTRING, value: NULL], DatabaseType=Variant[type: TYPE_WSTRING, value: Impala], HiveServerType=Variant[type: TYPE_WSTRING, value: 2], Host=Variant[type: TYPE_WSTRING, value: gateway.mygws.com], KrbHostFQDN=Variant[type: TYPE_WSTRING, value: ip-10-0-0-186.ec2.internal], KrbRealm=Variant[type: TYPE_WSTRING, value: MOBISTAT], KrbServiceName=Variant[type: TYPE_WSTRING, value: impala], LogLevel=Variant[type: TYPE_WSTRING, value: 6], LogPath=Variant[type: TYPE_WSTRING, value: D:\Mobistat\Log\], Port=Variant[type: TYPE_WSTRING, value: 21051], Principal=Variant[type: TYPE_WSTRING, value: krishnat/ip-10-0-0-186@MOBISTAT]}): +++++ enter +++++
Oct 12 17:50:52.989 ERROR 27 com.cloudera.impala.exceptions.ExceptionConverter.toSQLException: [Simba][ImpalaJDBCDriver](500310) Invalid operation: Unable to obtain Principal Name for authentication ;
java.sql.SQLException: [Simba][ImpalaJDBCDriver](500310) Invalid operation: Unable to obtain Principal Name for authentication ;
    at com.cloudera.impala.hivecommon.api.HiveServer2ClientFactory.createTransport(HiveServer2ClientFactory.java:224)
    at com.cloudera.impala.hivecommon.api.HiveServer2ClientFactory.createClient(HiveServer2ClientFactory.java:52)
    at com.cloudera.impala.hivecommon.core.HiveJDBCConnection.connect(HiveJDBCConnection.java:597)
    at com.cloudera.impala.jdbc.common.BaseConnectionFactory.doConnect(BaseConnectionFactory.java:219)
    at com.cloudera.impala.jdbc.common.AbstractDriver.connect(AbstractDriver.java:216)
    at workbench.db.DbDriver.connect(DbDriver.java:513)
    at workbench.db.ConnectionMgr.connect(ConnectionMgr.java:244)
    at workbench.db.ConnectionMgr.getConnection(ConnectionMgr.java:172)
Caused by: com.cloudera.impala.support.exceptions.GeneralException: [Simba][ImpalaJDBCDriver](500310) Invalid operation: Unable to obtain Principal Name for authentication ;
    ... 8 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain Principal Name for authentication
    at com.sun.security.auth.module.Krb5LoginModule.promptForName(Unknown Source)
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Unknown Source)
    at com.sun.security.auth.module.Krb5LoginModule.login(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at javax.security.auth.login.LoginContext.invoke(Unknown Source)
    at javax.security.auth.login.LoginContext.access$000(Unknown Source)
    at javax.security.auth.login.LoginContext$4.run(Unknown Source)
    at javax.security.auth.login.LoginContext$4.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.login.LoginContext.invokePriv(Unknown Source)
    at javax.security.auth.login.LoginContext.login(Unknown Source)
    at com.cloudera.impala.hivecommon.api.HiveServer2ClientFactory.createTransport(HiveServer2ClientFactory.java:113)
    at com.cloudera.impala.hivecommon.api.HiveServer2ClientFactory.createClient(HiveServer2ClientFactory.java:52)
    at com.cloudera.impala.hivecommon.core.HiveJDBCConnection.connect(HiveJDBCConnection.java:597)
    at com.cloudera.impala.jdbc.common.BaseConnectionFactory.doConnect(BaseConnectionFactory.java:219)
    at com.cloudera.impala.jdbc.common.AbstractDriver.connect(AbstractDriver.java:216)
    at workbench.db.DbDriver.connect(DbDriver.java:513)
    at workbench.db.ConnectionMgr.connect(ConnectionMgr.java:244)
    at workbench.db.ConnectionMgr.getConnection(ConnectionMgr.java:172)
    at workbench.gui.profiles.ConnectionGuiHelper$1.run(ConnectionGuiHelper.java:104)

I checked every worker node and did not see any errors in the Impala daemon logs.
09-30-2016
11:41 AM
We have a 15-node Kerberized Impala cluster behind HAProxy. We have no issue using HUE to run queries. We are also able to use the ODBC driver on a Windows machine, authenticate with Kerberos, and connect to Impala via the HAProxy. However, when we try to connect to the Impala HAProxy from SQL Workbench via the JDBC driver, we get the following error message:

[Simba][ImpalaJDBCDriver](500310) Invalid operation: Unable to obtain Principal Name for authentication ;

The connection string is:

jdbc:impala://<PUBLIC IP ADDRESS>:21051;AuthMech=1;KrbRealm=<REALM>;KrbHostFQDN=<fqdn>;KrbServiceName=impala;

We tried adding the Principal parameter, but it doesn't help. Any ideas on how to get Impala JDBC to work from a Windows machine using Kerberos?
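A note on the error itself: "Unable to obtain Principal Name for authentication" is raised by the JVM's Krb5LoginModule when it cannot obtain credentials, which on Windows often means it cannot read a ticket cache. A commonly cited workaround is to point the JVM at a JAAS configuration that enables the ticket cache; this is only a sketch, and the entry name and flags may need adjusting for your driver version:

```
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true
  doNotPrompt=true;
};
```

The file is passed to the JVM with -Djava.security.auth.login.config=<path to file>. Note also that for Java to read the session TGT on Windows, the registry value HKLM\SYSTEM\CurrentControlSet\Control\Lsa\Kerberos\Parameters\allowtgtsessionkey must be set to 1.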
Labels:
- Apache Impala
- Cloudera Hue
- Kerberos
09-14-2016
01:50 PM
We currently have a CDH 5.7.2 cluster and want to upgrade to 5.8. Is there documentation that we can use to do this transition?
08-15-2016
01:29 PM
Is it recommended to install HAProxy on the gateway node, or should it be a dedicated server by itself? If so, is there a specific memory requirement? I would like to run this on AWS.
Labels:
- Gateway