Member since: 07-18-2016
Posts: 262
Kudos Received: 12
Solutions: 21
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7538 | 09-21-2018 03:16 AM |
| | 4105 | 07-25-2018 05:03 AM |
| | 4972 | 02-13-2018 02:00 AM |
| | 2292 | 01-21-2018 02:47 AM |
| | 39581 | 08-08-2017 10:32 AM |
08-21-2017 03:25 PM · 1 Kudo
Hi @zkfs, I haven't seen a query that can do that in Hive yet. Instead, you can query the Hive metastore for the information, though be mindful that queries run directly against the metastore can impact Hive performance and are not recommended on production systems. Look at the TBL_PRIVS and TBLS tables within the hive database in the metastore; joining them on TBL_ID may give you the table-level view you are looking for. You can probably construct a similar metastore query to look at it from a PRINCIPAL_TYPE (role) angle as well.
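As a rough sketch of such a metastore query (assuming a MySQL-backed metastore; 'some_user' is a placeholder, and exact table and column names can vary by Hive version):

```sql
-- Sketch only: run against the metastore database directly, not through Hive.
-- TBLS holds table metadata; TBL_PRIVS holds per-table grants.
SELECT t.TBL_NAME,
       p.PRINCIPAL_NAME,
       p.PRINCIPAL_TYPE,
       p.TBL_PRIV
FROM   TBLS t
JOIN   TBL_PRIVS p ON p.TBL_ID = t.TBL_ID
WHERE  p.PRINCIPAL_NAME = 'some_user';  -- or filter on p.PRINCIPAL_TYPE = 'ROLE'
```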
08-08-2017 10:32 AM
Finally this worked for me, with a workaround. Steps as below:
1) Create a temp table with the same columns.
2) Overwrite the temp table with the required row data.
3) Drop the Hive partition and the HDFS directory.
4) Insert records back for the respective partition and rows.
5) Verify the counts.
1) hive> select count(*) from emptable where od='17_06_30' and ccodee!='123';
OK
27
hive> select count(*) from emptable where od='17_06_30' and ccodee='123';
OK
7
hive> show create table emptable_tmp;   -- note the HDFS location
2) Create the temp table and overwrite it with the required partitioned data:
hive> CREATE TABLE `emptable_tmp`(
        `rowid` string)
      PARTITIONED BY (`od` string)
      ROW FORMAT SERDE
        'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
      STORED AS INPUTFORMAT
        'org.apache.hadoop.mapred.SequenceFileInputFormat'
      OUTPUTFORMAT
        'org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat';
hive> insert into emptable_tmp partition(od) select * from emptable where od='17_06_30' and ccodee!='123';
Time taken for adding to write entity : 1
Partition database.emptable_tmp{od=17_06_30} stats: [numFiles=20, numRows=27,totalSize=6216,rawDataSize=5502]
OK
3) Drop the partition from Hive and the HDFS directory as well, as this is an external table.
hive> alter table emptable drop partition(od='17_06_30');
Dropped the partition od=17_06_30
OK
Time taken: 0.291 seconds
HDFS partition deletion:
#hdfs dfs -rm -r /hdfs/location/emptable/od='17_06_30'
4) Insert data for that partition only:
hive> insert into emptable partition(od) select * from emptable_tmp;
Partition database.emptable{ds=17_06_30} stats: [numFiles=66, numRows=20, totalSize=5441469982, rawDataSize=]
OK
Time taken: 27.282 seconds
5) Verify the counts on the partition and the respective row data:
hive> select count(*) from emptable where od='17_06_30' and ccodee!='123';
OK
27
hive> select count(*) from emptable where od='17_06_30' and ccodee='123';
OK
0
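As a side note, depending on the Hive version, the temp-table round trip above can sometimes be avoided by overwriting the partition in place; a minimal sketch, assuming emptable's non-partition columns are known (the column list below is a hypothetical placeholder):

```sql
-- Sketch: rewrite the partition keeping only the rows to retain
-- (everything except ccodee = '123'); the SELECT list is a placeholder.
INSERT OVERWRITE TABLE emptable PARTITION (od='17_06_30')
SELECT rowid, ccodee   -- replace with emptable's actual non-partition columns
FROM   emptable
WHERE  od = '17_06_30' AND ccodee != '123';
```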
06-18-2017 04:22 PM
It went fine after restarting the httpd service:
[root@repository ~]# service httpd status
httpd (pid 2030) is running...
[root@repository ~]#
[root@repository ~]# yum repolist
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
* base: centos.exabytes.com.my
* extras: centos.exabytes.com.my
* updates: centos.exabytes.com.my
repo id repo name status
HDP-2.1 HDP-2.1 98
HDP-UTILS-1.1.0.19 HDP-UTILS-1.1.0.19 48
Updates-ambari-2.2.0.0 ambari-2.2.0.0 - Updates 8
base CentOS-6 - Base 6,706
extras CentOS-6 - Extras 45
mysql-connectors-community MySQL Connectors Community 36
mysql-tools-community MySQL Tools Community 47
mysql56-community MySQL 5.6 Community Server 358
updates CentOS-6 - Updates 358
repolist: 7,704
[root@repository ~]#
06-07-2017 04:22 AM
This issue occurs once every 3-4 days; the GC logs below were created in the HBase log directory.
Jun 3 00:16 gc.log-201705120309
Jun 3 00:19 gc.log-201706030016
Jun 3 00:45 gc.log-201706030019
Jun 4 13:43 gc.log-201706030045
Jun 4 17:10 gc.log-201706041343
Jun 7 12:18 gc.log-201706041710
05-23-2017 07:39 PM
Server 1 :- 192.168.154.111 (centos)
Server 2 :- 192.168.154.113 (centos2)
1) First install the MySQL client on the server from which you want to connect. Configure the repository and install the client:
[root@centos2 ~]# yum install mysql-client
2) Install the MySQL server software on centos2:
[root@centos2 ~]# yum install mysql-server
3) Connect to MySQL on centos2. The following users exist before creating the hive user and database:
#mysql -u root -p password
mysql> Select host,user from mysql.user;
+-----------------+-------+
| host | user |
+-----------------+-------+
| localhost | root |
+-----------------+-------+
8 rows in set (0.00 sec)
Create the hive user and database:
mysql> create user 'hive'@'192.168.154.111' identified by 'hive';
Query OK, 0 rows affected (0.05 sec)
mysql> create user 'hive'@'localhost' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)
mysql> create database hive;
Query OK, 1 row affected (0.00 sec)
mysql> grant ALL ON hive.* TO 'hive'@'192.168.154.111';
Query OK, 0 rows affected (0.00 sec)
mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)
mysql> flush hosts;
Query OK, 0 rows affected (0.00 sec)
mysql> grant ALL ON hive.* TO 'hive'@'localhost';
Query OK, 0 rows affected (0.00 sec)
mysql> Select host,user from mysql.user;
+-----------------+-------+
| host | user |
+-----------------+-------+
| % | oozie |
| 127.0.0.1 | oozie |
| 127.0.0.1 | root |
| 192.168.154.111 | hive |
| 192.168.154.111 | oozie |
| 192.168.154.113 | oozie |
| ::1 | root |
| localhost | hive |
| localhost | oozie |
| localhost | root |
+-----------------+-------+
10 rows in set (0.00 sec)
4) Verify you are able to connect to the MySQL server using the client (the client must be installed on the server from which you are connecting). Verify from the ambari-server: 192.168.154.111 is the client and 192.168.154.113 is the MySQL server IP address.
[root@centos ~]# mysql -u hive -h 192.168.154.113 -p
Enter password:
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> select user();
+----------------------+
| user() |
+----------------------+
| hive@centos.test.com |
+----------------------+
1 row in set (0.01 sec)
mysql> exit;
[root@centos ~]#
5) If the above client is not able to connect to the MySQL server, grant permissions as below:
mysql> grant ALL ON hive.* TO 'hive'@'%' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)
mysql> show grants for hive;
+------------------------------------------------------------------------------------------------------+
| Grants for hive@%                                                                                    |
+------------------------------------------------------------------------------------------------------+
| GRANT USAGE ON *.* TO 'hive'@'%' IDENTIFIED BY PASSWORD '*4DF1D66463C18D44E3B001A8FB1BBFBEA13E27FC'  |
| GRANT ALL PRIVILEGES ON `hive`.* TO 'hive'@'%'                                                       |
+------------------------------------------------------------------------------------------------------+
2 rows in set (0.00 sec)
One more verification from the Ambari server, to confirm the hive username and password are working fine:
[root@centos]#/usr/lib/hive/bin/schematool -initSchema -dbType mysql -userName hive -passWord hive
Metastore connection URL: jdbc:mysql://centos2.test.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
Starting metastore schema initialization to 0.13.0
Initialization script hive-schema-0.13.0.mysql.sql
Initialization script completed
schemaTool completed
[root@centos ~]#
05-01-2017 08:44 AM
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hostname/default;: Peer indicated failure: Error validating the login (state=08S01,code=0)
This issue was due to a wrong password given during the beeline connection; after changing the password at the OS level it went fine.
04-24-2017 07:15 PM
We have the same issue with one particular user; the query works fine for other users. I have checked permissions under .hiveJars and the like.
04-02-2017 08:17 AM
Log in to the Ambari web UI with the admin/admin console and run the service checks.
1) Hive: click on the Hive service and do Run Service Check. You will get the exact error and can troubleshoot the issue from there.
2) SmartSense: you can ignore.
3) Ambari Metrics: click on Ambari Metrics and do Run Service Check.
04-01-2017 01:03 PM
Solutions for the given errors; for the 3rd I still could not get an answer.
1st Error:
Error: E0501 : E0501: Could not perform authorization operation, User: oozie is not allowed to impersonate ambari-qa Invalid sub-command: Missing argument for option: info
Solution:
<property>
<name>hadoop.proxyuser.oozie.groups</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.oozie.hosts</name>
<value></value>
</property>
<property>
<name>hadoop.proxyuser.oozie.users</name>
<value>*</value>
</property>
2nd Error:
Error: Python script has been killed due to timeout after waiting 300 secs
Solution: on the ambari-server, increased the timeout from 300 to 900 seconds. My Oozie version is 4.0.0.2.0:
[ambari-qa@centos root]$ oozie version
Oozie client build version: 4.0.0.2.1.10.0-881
[ambari-qa@centos root]$
[root@centos ~]# grep 300 /var/lib/ambari-server/resources/common-services/OOZIE/4.0.0.2.0/metainfo.xml
<timeout>300</timeout>
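The change described above amounts to editing that value in the Oozie metainfo.xml shown by the grep; an ambari-server restart is presumably needed for it to take effect:

```xml
<!-- /var/lib/ambari-server/resources/common-services/OOZIE/4.0.0.2.0/metainfo.xml -->
<!-- Increase the service-check timeout from 300 to 900 seconds -->
<timeout>900</timeout>
```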
[root@centos ~]#
3rd Error:
resource_management.core.exceptions.Fail: Execution of '/var/lib/ambari-agent/tmp/oozieSmoke2.sh redhat /var/lib/oozie /etc/oozie/conf /usr/bin http://centos2.test.com:11000/oozie /usr/share/doc/oozie-4.0.0.2.1.10.0 /etc/hadoop/conf /usr/bin ambari-qa False' returned 1. source /etc/oozie/conf/oozie-env.sh ; /usr/bin/oozie -Doozie.auth.token.cache=false job -oozie http://centos2.test.com:11000/oozie -config /usr/share/doc/oozie-4.0.0.2.1.10.0/examples/apps/map-reduce/job.properties -run
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/oozie/lib/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/oozie/lib/slf4j-simple-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.security.authentication.client.KerberosAuthenticator).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Job ID : 0000026-170331225332898-oozie-oozi-W
------------------------------------------------------------------------------------------------------------------------------------
Workflow Name : map-reduce-wf
App Path : hdfs://centos4.test.com:8020/user/ambari-qa/examples/apps/map-reduce
Status : RUNNING
Run : 0
User : ambari-qa
Group : -
Created : 2017-03-31 21:50 GMT
Started : 2017-03-31 21:50 GMT
Last Modified : 2017-03-31 21:50 GMT
Ended : -
CoordAction ID: -
Actions
------------------------------------------------------------------------------------------------------------------------------------
ID Status Ext ID Ext Status Err Code
------------------------------------------------------------------------------------------------------------------------------------
0000026-170331225332898-oozie-oozi-W@mr-node PREP - - -
------------------------------------------------------------------------------------------------------------------------------------
0000026-170331225332898-oozie-oozi-W@:start: OK - OK -
------------------------------------------------------------------------------------------------------------------------------------
03-26-2017 04:36 AM
I did the following and my issue has been resolved.
1) Download the "Requires: perl(DBI)" package. We are using CentOS 6.8; download the RPM perl-DBI-1.609-4.el6.x86_64.rpm from http://rpmfind.net/linux/rpm2html/search.php?query=perl-DBI
2) Install the perl-DBI RPM on the Linux server:
[root@centos2 ~]# rpm -ivh perl-DBI-1.609-4.el6.x86_64.rpm
Preparing... ########################################### [100%]
1:perl-DBI ########################################### [100%]
[root@centos2 ~]#
3) Install the MySQL community server:
[root@hostname ~]# sudo yum install mysql-community-server
Loaded plugins: fastestmirror
Setting up Install Process
Repository HDP-UTILS-1.1.0.19 is listed more than once in the configuration
Loading mirror speeds from cached hostfile
Resolving Dependencies
--> Running transaction check
---> Package mysql-community-server.x86_64 0:5.6.35-2.el6 will be installed
--> Processing Dependency: mysql-community-client(x86-64) >= 5.6.10 for package: mysql-community-server-5.6.35-2.el6.x86_64
--> Running transaction check
---> Package mysql-community-client.x86_64 0:5.6.35-2.el6 will be installed
--> Finished Dependency Resolution
Dependencies Resolved
=========================================================================================================================================
Package Arch Version Repository Size
=========================================================================================================================================
Installing:
mysql-community-server x86_64 5.6.35-2.el6 mysql56-community 54 M
Installing for dependencies:
mysql-community-client x86_64 5.6.35-2.el6 mysql56-community 18 M
Transaction Summary
=========================================================================================================================================
Install 2 Package(s)
Total download size: 73 M
Installed size: 324 M
Is this ok [y/N]: y
Downloading Packages:
(1/2): mysql-community-client-5.6.35-2.el6.x86_64.rpm | 18 MB 00:01
(2/2): mysql-community-server-5.6.35-2.el6.x86_64.rpm | 54 MB 00:04
-----------------------------------------------------------------------------------------------------------------------------------------
Total 11 MB/s | 73 MB 00:06
Running rpm_check_debug
Running Transaction Test
Transaction Test Succeeded
Running Transaction
Warning: RPMDB altered outside of yum.
Installing : mysql-community-client-5.6.35-2.el6.x86_64 1/2
Installing : mysql-community-server-5.6.35-2.el6.x86_64 2/2
Verifying : mysql-community-client-5.6.35-2.el6.x86_64 1/2
Verifying : mysql-community-server-5.6.35-2.el6.x86_64 2/2
Installed:
mysql-community-server.x86_64 0:5.6.35-2.el6
Dependency Installed:
mysql-community-client.x86_64 0:5.6.35-2.el6
Complete!
[root@hostname ~]#
4) Verify whether the packages are installed:
[root@hostname ~]# rpm -qa |grep mysql
mysql-community-common-5.6.35-2.el6.x86_64
mysql57-community-release-el6-9.noarch
mysql-community-libs-compat-5.6.35-2.el6.x86_64
mysql-community-libs-5.6.35-2.el6.x86_64
mysql-community-client-5.6.35-2.el6.x86_64
mysql-community-server-5.6.35-2.el6.x86_64
[root@hostname ~]#
[root@hostname ~]# service mysqld status
mysqld is stopped
[root@hostname ~]#