Member since
09-17-2014
93
Posts
5
Kudos Received
6
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 32531 | 03-02-2015 12:47 PM |
| | 2613 | 02-03-2015 01:24 PM |
| | 4397 | 12-12-2014 08:19 AM |
| | 4403 | 11-07-2014 01:55 PM |
| | 2825 | 10-13-2014 06:47 PM |
07-08-2024
12:05 AM
1 Kudo
I ran into the same issue after deploying CDP 7.1.7. Is there a workaround for this issue?
10-19-2022
05:36 AM
How do I put the SQL connector JAR in /var/lib/sqoop?
Thanks
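For reference, the usual approach is to copy the connector JAR into /var/lib/sqoop and make it readable by all users. A minimal sketch of the steps, using temp directories as stand-ins for /var/lib/sqoop and the download location so the commands run anywhere, and a placeholder JAR name; on a real node point SQOOP_LIB at /var/lib/sqoop and run the copy with sudo:

```shell
# Sketch: place a JDBC connector JAR where Sqoop can load it.
# Temp directories stand in for the real paths so this runs anywhere.
SQOOP_LIB="$(mktemp -d)"                      # stand-in for /var/lib/sqoop
DOWNLOAD_DIR="$(mktemp -d)"                   # stand-in for the download location
JAR="$DOWNLOAD_DIR/mysql-connector-java.jar"  # placeholder JAR name
touch "$JAR"                                  # stand-in for the downloaded JAR

# install copies the file and sets world-readable permissions in one step
install -m 644 "$JAR" "$SQOOP_LIB/"
ls -l "$SQOOP_LIB"
```

After the copy, Sqoop picks the driver up automatically because everything under its lib directory is added to the job classpath.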
10-24-2020
05:49 AM
I can connect with "beeline -u jdbc:hive2://":

[20:28 hadoop@Cavin-Y7000 hive]$ beeline -u jdbc:hive2://
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://
20/10/24 20:30:27 [main]: WARN conf.HiveConf: HiveConf of name hive.server2.connection.host does not exist
Hive Session ID = 4977083b-4d07-4ff0-930f-7afb9e214933
20/10/24 20:30:28 [main]: WARN session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
20/10/24 20:30:29 [main]: WARN metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
20/10/24 20:30:30 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:30 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:30 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:30 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:30 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:30 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:31 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:31 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:31 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:31 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:31 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
20/10/24 20:30:31 [main]: WARN DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
Connected to: Apache Hive (version 3.1.2)
Driver: Hive JDBC (version 3.1.2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 3.1.2 by Apache Hive
0: jdbc:hive2://>

But when I use "beeline -u jdbc:hive2://localhost:10000/default" I get this error:

[20:26 hadoop@Cavin-Y7000 hive]$ ./bin/beeline -u jdbc:hive2://localhost:10000
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000
20/10/24 20:28:38 [main]: WARN jdbc.HiveConnection: Failed to connect to localhost:10000
Unknown HS2 problem when communicating with Thrift server.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000: Invalid status 16 (state=08S01,code=0)
Beeline version 3.1.2 by Apache Hive

This has been very confusing for me for several days. I also cannot simply use "jdbc:hive2://" from Java code; it gives me the same error as running "beeline -u jdbc:hive2://localhost:10000/default" on the command line.
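One common cause of "Invalid status" errors is a transport or SASL mismatch between beeline and HiveServer2. A sketch of the hive-site.xml properties to compare on the server side (the values shown are the binary-transport defaults, not necessarily what this cluster needs):

```xml
<!-- hive-site.xml: client and server must agree on these.
     If hive.server2.authentication is set to NOSASL, the plain JDBC URL
     fails with "Invalid status" unless the client appends ;auth=noSasl,
     e.g. jdbc:hive2://localhost:10000/default;auth=noSasl -->
<property>
  <name>hive.server2.transport.mode</name>
  <value>binary</value>
</property>
<property>
  <name>hive.server2.authentication</name>
  <value>NONE</value>
</property>
```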
02-05-2019
08:05 AM
1 Kudo
Hi, I am targeting ojdbc7.jar. I have both ojdbc6 and ojdbc7 in my /opt/cloudera/parcels/CDH/lib/sqoop/lib/. I ran:

sqoop import --libjars ...../sqoop/lib/ojdbc7.jar --driver "oracle.jdbc.driver.OracleDriver" --connect "jdbc:oracle:thin:@(description=(ADDRESS=(PROTOCOL=TCP)(HOST=XXXXX.XXXX.com)(PORT=XXXX))(CONNECT_DATA=(SERVER=DEDICATED)(service_name=XXXXX.XXXXX.com)))" --username XXXXXX --password XXXXXX --query 'select * from XXXXXXX where rownum < 100 and $CONDITIONS' --fields-terminated-by "\001" --target-dir /XXX/XXXX/XXXXX/ --num-mappers 1 --verbose --delete-target-dir

The job worked, but I can't determine whether it used the ojdbc7 or ojdbc6 driver, as I see both in the log:

DEBUG mapreduce.JobBase: Adding to job classpath: file:..... ..../sqoop/lib/ojdbc6.jar
DEBUG mapreduce.JobBase: Adding to job classpath: file:..... ..../sqoop/lib/ojdbc7.jar

Is there a way to make it use only ojdbc7.jar and not any other driver? I am testing the new ojdbc7 and am not sure whether it is still using the old ojdbc6. I am also reading about compatibility issues between 9 and ojdbc7. Should I remove ojdbc6 altogether and replace it with ojdbc7, or keep both? If I keep both, how do I target one driver versus the other?
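For context: every JAR under Sqoop's lib directory lands on the job classpath, and the JVM loads oracle.jdbc.driver.OracleDriver from whichever JAR the classpath lists first, so --libjars alone cannot pin one version while both are present. The reliable fix is to move the unwanted JAR out of lib/. A sketch of that, using temp directories as stand-ins for /opt/cloudera/parcels/CDH/lib/sqoop/lib/ and a backup location so it runs anywhere:

```shell
# Sketch: pin ojdbc7 by removing ojdbc6 from Sqoop's lib directory.
# Temp directories stand in for the real paths so this runs anywhere.
SQOOP_LIB="$(mktemp -d)"     # stand-in for .../CDH/lib/sqoop/lib/
BACKUP_DIR="$(mktemp -d)"    # keep the old driver in case of rollback
touch "$SQOOP_LIB/ojdbc6.jar" "$SQOOP_LIB/ojdbc7.jar"   # dummy stand-ins

# Move (not delete) ojdbc6 so only ojdbc7 remains on the classpath
mv "$SQOOP_LIB/ojdbc6.jar" "$BACKUP_DIR/"
ls "$SQOOP_LIB"
```

With only one Oracle driver JAR left in lib/, the "Adding to job classpath" log line for ojdbc6 disappears and there is no ambiguity about which driver the job loads.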
04-20-2017
10:53 AM
If you fixed the issue, please tell me how to create roles for the Hive database and grant permissions.
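For anyone landing here with the same question: with Sentry-style authorization enabled, roles and grants are managed in HiveQL along these lines (a sketch run from beeline as a Sentry admin; the database, role, and group names below are placeholders):

```sql
-- Sketch of Sentry-style role management in HiveQL (names are placeholders)
CREATE ROLE analyst_role;
GRANT ROLE analyst_role TO GROUP analysts;
GRANT SELECT ON DATABASE sales_db TO ROLE analyst_role;
SHOW GRANT ROLE analyst_role;
```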
06-08-2016
08:35 PM
Hi, I encountered the same error too, but in my case I'm using MS Access or Excel to connect to Hadoop via the 32-bit Impala ODBC driver. Any pointers will be appreciated.
02-04-2016
02:34 PM
https://issues.apache.org/jira/browse/SENTRY-1001
02-05-2015
03:22 PM
Thank you for letting us know the solution. 🙂
12-16-2014
01:17 PM
Hi Alessio, I was able to use -ldap and connect to my Kerberized Impala cluster:

impala-shell -i SERVERNAME.DOMAIN.COM:21000 -ldap

You may not have LDAP integration set up, or Impala may not be configured for LDAP integration. I hope this helps.