Member since: 05-10-2016
Posts: 184
Kudos Received: 60
Solutions: 6
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4099 | 05-06-2017 10:21 PM
 | 4106 | 05-04-2017 08:02 PM
 | 5018 | 12-28-2016 04:49 PM
 | 1243 | 11-11-2016 08:09 PM
 | 3333 | 10-22-2016 03:03 AM
12-28-2016 04:51 PM
@Barret Miller Did you restart the Hive services after making the changes?
12-28-2016 04:49 PM
@kotesh banoth please complete your sentence 🙂
12-01-2016 06:33 PM
Can you share a screenshot of the page where you have the policies? If the profile is public, I believe it overrides any other permissions. How about introducing your own user and denying that user any privileges over HDFS?
11-16-2016 07:23 AM
1 Kudo
This should be easy enough for you to test:
1. Insert values 1 to 40 for column user_id into table user_info_bucketed
2. Insert another 400 rows, with values 41 to 440
3. Ideally, each bucket should then have about 19 rows, or around that
4. You can then check something like: SELECT user_id,INPUT__FILE__NAME FROM user_info_bucketed WHERE user_id = 5;
SELECT user_id,INPUT__FILE__NAME FROM user_info_bucketed WHERE user_id = 50;
SELECT user_id,INPUT__FILE__NAME FROM user_info_bucketed WHERE user_id = 101;
SELECT user_id,INPUT__FILE__NAME FROM user_info_bucketed WHERE user_id = 160;
Alternatively, you can check the physical location of the bucket files on HDFS and determine the line counts there.
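For reference, here is a minimal sketch of such a test. The table definition, the 24-bucket count, and the ORC format are my assumptions for illustration, not details from your setup:
-- hypothetical layout; 24 buckets and ORC storage are illustrative assumptions
CREATE TABLE user_info_bucketed (user_id BIGINT, name STRING)
CLUSTERED BY (user_id) INTO 24 BUCKETS
STORED AS ORC;
-- ensure inserts honor the bucket definition on Hive 1.x
SET hive.enforce.bucketing = true;
-- steps 1 and 2, abbreviated here: load values 1..440 via VALUES or INSERT ... SELECT
INSERT INTO TABLE user_info_bucketed VALUES (1, 'user1'), (2, 'user2');
-- step 4: INPUT__FILE__NAME shows which bucket file each row came from
SELECT user_id, INPUT__FILE__NAME FROM user_info_bucketed WHERE user_id = 5;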
11-15-2016 03:43 PM
1 Kudo
You can try what @Rajkumar Singh has recommended. At times, the error is also temporary in nature, so trying to log in to Ambari again might help, unless you have already tried that.
11-11-2016 08:09 PM
@Karan Alang The Hive shell is not secured through Ranger; only HiveServer2 is. Try what you were doing using Beeline or a JDBC app instead. It should not allow you to get in.
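As a quick check (the hostname and user below are placeholders, and a Ranger policy denying this user is assumed):
# the Hive CLI runs queries in-process and bypasses Ranger;
# HiveServer2 via Beeline enforces the policies:
beeline -u "jdbc:hive2://hs2-host.example.com:10000/default" -n testuser
# a denied query should now fail with an authorization error from the Ranger plugin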
11-11-2016 05:36 PM
@Timothy Spann Can you help with the explain plan, along with the output of hive -hiveconf hive.root.logger=DEBUG,console -e 'query'?
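For instance, along these lines (the query itself is a placeholder):
# capture the plan
hive -e "EXPLAIN EXTENDED SELECT ..." > explain_plan.txt
# capture the debug-level run; the console logger writes to stderr
hive -hiveconf hive.root.logger=DEBUG,console -e "SELECT ..." 2> debug_run.log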
11-11-2016 12:05 AM
1 Kudo
@Dhiwa TdG If you verified your output for Hive via the Hive/Beeline shell, then it's a different story: you are actually seeing the output on STDOUT. With Pig, you can try using the Pig view instead: Ambari -> admin (drop-down) -> Manage Ambari -> Views -> PIG -> Create Instance (if you don't have a Pig view already).
11-10-2016 02:35 AM
1 Kudo
Goal
Access Hive through the R shell with RHive. Authentication mechanism: Kerberos.

Assumptions
- An HDP cluster is up and running
- R is installed and configured
- The cluster is configured on CentOS 7
- openssl-devel is installed on the server where R is installed
- If using OpenJDK, ensure that JAVA_HOME is set to the path where javac is available

Steps
1. Download Rserve: wget https://rforge.net/Rserve/snapshot/Rserve_1.8-5.tar.gz
2. Use R to install Rserve: R CMD INSTALL Rserve_1.8-5.tar.gz
3. Download the RHive library: git clone https://github.com/nexr/RHive.git
4. Set the appropriate paths for HADOOP_HOME and HIVE_HOME: export HADOOP_HOME=/usr/hdp/2.5.0.0-1245/hadoop; export HIVE_HOME=/usr/hdp/2.5.0.0-1245/hive
5. Change directory to RHive: cd RHive
6. Use ant to build: ant build
7. Use R to build RHive (this creates a package file within the same directory): R CMD build RHive
8. Use the newly created file to install the RHive library: R CMD INSTALL RHive_2.0-0.10.tar.gz

Demo
Launch R using "R" at the unix prompt, then load the RHive library:
> library(RHive)
Loading required package: rJava
Loading required package: Rserve
Use the connection string in this format:
> rhive.connect(host="node1.hortonworks.com:10000/default;principal=hive/node1.hortonworks.com@HDP.COM;AuthMech=1;KrbHostFQDN=service.hortonworks.com;KrbServiceName=hive;KrbRealm=HDP.COM", defaultFS="hdfs://node1.hortonworks.com/rhive", hiveServer2=TRUE, updateJar=FALSE)
2016-11-09 22:30:00,425 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-11-09 22:30:01,361 WARN [main] shortcircuit.DomainSocketFactory (DomainSocketFactory.java:<init>(117)) - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2016-11-09 22:30:01,853 INFO [Thread-7] jdbc.Utils (Utils.java:parseURL(316)) - Supplied authorities: node1.hortonworks.com:10000
2016-11-09 22:30:01,853 INFO [Thread-7] jdbc.Utils (Utils.java:parseURL(432)) - Resolved authority: node1.hortonworks.com:10000
Querying a table:
> rhive.query("SELECT * FROM default.test")
test.col1 test.col2
1 1 1
2 2 2
3 3 3
4 4 4
>
Enjoy !!
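As a small addendum: when you are finished, the connection can be closed from the same session (this assumes the stock RHive API):
> rhive.close()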
11-09-2016 09:58 PM
There is no need to pass the principal name when the ZooKeeper quorum is being used for JDBC. As long as a valid Kerberos ticket is available and the impersonation settings are appropriate, it will work:
[root@services RHive]# kinit -kt myuser.service.keytab myuser/services.hortonworks.com@HDP.COM
[root@services RHive]# beeline -u "jdbc:hive2://node1.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/node1.hortonworks.com@HDP.COM"
Connecting to jdbc:hive2://node1.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/node1.hortonworks.com@HDP.COM
Connected to: Apache Hive (version 1.2.1000.2.5.0.0-1245)
Driver: Hive JDBC (version 1.2.1000.2.5.0.0-1245)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1000.2.5.0.0-1245 by Apache Hive
0: jdbc:hive2://node1.hortonworks.com:2181/> !q
Closing: 0: jdbc:hive2://node1.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/node1.hortonworks.com@HDP.COM
[root@services RHive]# beeline -u "jdbc:hive2://node1.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2"
Connecting to jdbc:hive2://node1.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
Connected to: Apache Hive (version 1.2.1000.2.5.0.0-1245)
Driver: Hive JDBC (version 1.2.1000.2.5.0.0-1245)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1000.2.5.0.0-1245 by Apache Hive
0: jdbc:hive2://node1.hortonworks.com:2181/> !q
Closing: 0: jdbc:hive2://node1.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
[root@services RHive]# kdestroy
[root@services RHive]# beeline -u "jdbc:hive2://node1.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2"
Connecting to jdbc:hive2://node1.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
16/11/09 21:57:15 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]