Member since
08-22-2016
9
Posts
2
Kudos Received
1
Solution
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 2873 | 03-01-2017 04:41 PM |
10-14-2017
08:18 PM
I have fields orig_date, Field1, and Field2, and I want to calculate a rank as follows using Hive. Any suggestions?
| Orig_date | Field1 | Field2 | Rank |
|---|---|---|---|
| 9/1/2017 | 1 | 1 | 1 |
| 9/2/2017 | 0 | 1 | 2 |
| 9/3/2017 | 1 | 0 | 3 |
| 9/4/2017 | 0 | 0 | 4 |
| 9/5/2017 | 0 | 0 | 4 |
| 9/6/2017 | 0 | 0 | 4 |
| 9/7/2017 | 0 | 0 | 4 |
| 9/8/2017 | 1 | 1 | 5 |
| 9/9/2017 | 1 | 0 | 6 |
| 9/10/2017 | 1 | 1 | 7 |
| 9/11/2017 | 1 | 1 | 8 |
| 9/12/2017 | 0 | 0 | 9 |
| 9/13/2017 | 0 | 0 | 9 |
| 9/14/2017 | 1 | 1 | 10 |
| 9/15/2017 | 1 | 1 | 11 |
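From the sample, the rank appears to grow by 1 on every row except when the current and previous rows both have Field1 = 0 and Field2 = 0. A sketch under that inferred rule (the table name `my_table` is hypothetical, and `orig_date` is assumed to be stored so that it sorts chronologically), using LAG to flag each row and a running SUM to accumulate the rank:

```sql
-- Sketch; `my_table` and the increment rule are assumptions from the sample.
SELECT orig_date, field1, field2,
       SUM(step) OVER (ORDER BY orig_date
                       ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS rank_col
FROM (
  SELECT orig_date, field1, field2,
         -- 0 = repeat of a (0,0) run, so the rank stays flat; 1 = rank advances
         CASE WHEN field1 = 0 AND field2 = 0
                   AND LAG(field1) OVER (ORDER BY orig_date) = 0
                   AND LAG(field2) OVER (ORDER BY orig_date) = 0
              THEN 0 ELSE 1
         END AS step
  FROM my_table
) flagged;
```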
03-15-2017
07:38 PM
1 Kudo
I have 100 tables (Test_1, Test_2, ... Test80, test90), of which about 70 start with the string Test_ and need to be removed. I can only use Hive queries, nothing from Unix, just pure Hive commands. I am looking for something like: drop table where tablename like Test_*
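For context, Hive has no wildcard DROP TABLE, so the statement above does not exist; a hedged sketch of the closest pure-HiveQL building block, which lists the matching tables so the individual DROP statements can be generated from the result:

```sql
-- Hive has no `DROP TABLE ... LIKE 'pattern'`; list the candidates first:
SHOW TABLES LIKE 'Test_*';
-- Each name returned then needs its own statement, e.g.:
DROP TABLE IF EXISTS Test_1;
```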
03-01-2017
04:41 PM
You can use a left join to get this. Step 5 can be replaced with:

create table z_test2 as
select a.LastName,
       case when b.FirstName is null then a.FirstName else b.FirstName end as firstname,
       a.address,
       a.city
from z_test1 a
left outer join z_test b on a.LastName = b.LastName
02-03-2017
07:39 PM
Hi @Eugene Koifman, I tried to understand the documentation you provided but couldn't figure out what you are saying. I want working Hive code that can replace this:

update z_test1 set FirstName = z.FirstName from z_test z where z_test1.LastName = z.LastName
02-03-2017
05:23 PM
I need to update a column with values from another table.

Step 1:

execute(set hive.enforce.bucketing=true) by hadoop;
execute(set hive.exec.dynamic.partition.mode=nonstrict) by hadoop;
CREATE TABLE z_test1
(
  LastName varchar(255),
  FirstName varchar(255),
  Address varchar(255),
  City varchar(255)
)
clustered by (City) into 3 buckets
stored as orc
TBLPROPERTIES ('transactional'='true');

Step 2:

Insert into z_test1 VALUES ("david","baker","houston","tx"), ("rose","crab","houston","tx");

lastname firstname address city
david baker houston tx
rose crab houston tx

Step 3:

execute(set hive.enforce.bucketing=true) by hadoop;
execute(set hive.exec.dynamic.partition.mode=nonstrict) by hadoop;
CREATE TABLE z_test
(
  LastName varchar(255),
  FirstName varchar(255),
  Address varchar(255),
  City varchar(255),
  country varchar(255)
)
clustered by (City) into 3 buckets
stored as orc
TBLPROPERTIES ('transactional'='true');

Step 4:

Insert into z_test VALUES ("david","burrows","houston","tx","usa");

lastname firstname address city country
david burrows houston tx usa

Step 5:

update z_test1 set FirstName = z.FirstName from z_test z where z_test1.LastName = z.LastName

I get an error in Step 5.

Desired result: David's first name in table z_test1 should be updated to "burrows" from the second table:

lastname firstname address city
david burrows houston tx
rose crab houston tx
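For reference, Hive's ACID UPDATE statement cannot reference a second table in a FROM clause, which is why Step 5 fails. A hedged sketch of one alternative, assuming Hive 2.2+ with transactions enabled, where MERGE can join against a source table:

```sql
-- Sketch, assuming Hive 2.2+ with ACID transactions enabled:
MERGE INTO z_test1 AS t
USING z_test AS z
ON t.LastName = z.LastName
WHEN MATCHED THEN UPDATE SET FirstName = z.FirstName;
```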
02-03-2017
04:26 PM
We purchased Presto for this, and it works much faster: https://prestodb.io/docs/current/
11-30-2016
01:51 AM
Hi, I have a table in which I need to calculate the max among the previous 30 days for every record. Is there a way to do this in Hive with subqueries? See below for sample data.
FIELD1 FIELD2 FIELD3 DATE VAR MAX_VAR
USA TX RTB 20160930 1
USA TX RTB 20161001 1
USA TX RTB 20161002 1
USA TX RTB 20161003 1
USA TX RTB 20161004 1
USA TX RTB 20161005 2
USA TX RTB 20161006 2
USA TX RTB 20161007 2
USA TX RTB 20161008 2
USA TX RTB 20161009 1
USA TX RTB 20161010 2
USA TX RTB 20161011 2
USA TX RTB 20161012 2
USA TX RTB 20161013 2
USA TX RTB 20161014 2
USA TX RTB 20161015 2
USA TX RTB 20161016 2
USA TX RTB 20161017 2
USA TX RTB 20161018 2
USA TX RTB 20161019 2
USA TX RTB 20161020 2
USA TX RTB 20161021 2
USA TX RTB 20161022 2
USA TX RTB 20161023 2
USA TX RTB 20161024 2
USA TX RTB 20161025 3
USA TX RTB 20161026 2
USA TX RTB 20161027 2
USA TX RTB 20161028 2
USA TX RTB 20161029 2
USA TX RTB 20161030 2 3
USA TX RTB 20161031 2 3
USA TX RTB 20161101 2 3
USA TX RTB 20161102 2 3
USA TX RTB 20161103 2 3
USA TX RTB 20161104 2 3
USA TX RTB 20161105 2 3
USA TX RTB 20161106 2 3
USA TX RTB 20161107 2 3
USA TX RTB 20161108 2 3
USA TX RTB 20161109 2 3
USA TX RTB 20161110 2 3
USA TX RTB 20161111 2 3
USA TX RTB 20161112 2 3
USA TX RTB 20161113 2 3
USA TX RTB 20161114 2 3
USA TX RTB 20161115 2 3
USA TX RTB 20161116 2 3
USA TX RTB 20161117 2 3
USA TX RTB 20161118 2 3
USA TX RTB 20161119 2 3
USA TX RTB 20161120 2 3
USA TX RTB 20161121 2 3
USA TX RTB 20161122 2 3
USA TX RTB 20161123 2 3
USA TX RTB 20161124 2 3
USA TX RTB 20161125 2 2
USA TX RTB 20161126 2 2
USA TX RTB 20161127 2 2
USA TX RTB 20161128 2 2
USA TX RTB 20161129 2 2
UK LONDON RTB 20160930 1
UK LONDON RTB 20161001 1
UK LONDON RTB 20161002 1
UK LONDON RTB 20161003 1
UK LONDON RTB 20161004 1
UK LONDON RTB 20161005 1
UK LONDON RTB 20161006 1
UK LONDON RTB 20161007 1
UK LONDON RTB 20161008 1
UK LONDON RTB 20161009 1
UK LONDON RTB 20161010 1
UK LONDON RTB 20161011 1
UK LONDON RTB 20161012 1
UK LONDON RTB 20161013 1
UK LONDON RTB 20161014 1
UK LONDON RTB 20161015 1
UK LONDON RTB 20161016 1
UK LONDON RTB 20161017 1
UK LONDON RTB 20161018 1
UK LONDON RTB 20161019 1
UK LONDON RTB 20161020 1
UK LONDON RTB 20161021 1
UK LONDON RTB 20161022 1
UK LONDON RTB 20161023 1
UK LONDON RTB 20161024 1
UK LONDON RTB 20161025 4
UK LONDON RTB 20161026 2
UK LONDON RTB 20161027 2
UK LONDON RTB 20161028 2
UK LONDON RTB 20161029 2
UK LONDON RTB 20161030 2 4
UK LONDON RTB 20161031 2 4
UK LONDON RTB 20161101 2 4
UK LONDON RTB 20161102 2 4
UK LONDON RTB 20161103 2 4
UK LONDON RTB 20161104 2 4
UK LONDON RTB 20161105 2 4
UK LONDON RTB 20161106 2 4
UK LONDON RTB 20161107 2 4
UK LONDON RTB 20161108 2 4
UK LONDON RTB 20161109 2 4
UK LONDON RTB 20161110 2 4
UK LONDON RTB 20161111 2 4
UK LONDON RTB 20161112 2 4
UK LONDON RTB 20161113 2 4
UK LONDON RTB 20161114 2 4
UK LONDON RTB 20161115 2 4
UK LONDON RTB 20161116 2 4
UK LONDON RTB 20161117 2 4
UK LONDON RTB 20161118 2 4
UK LONDON RTB 20161119 2 4
UK LONDON RTB 20161120 2 4
UK LONDON RTB 20161121 2 4
UK LONDON RTB 20161122 2 4
UK LONDON RTB 20161123 2 4
UK LONDON RTB 20161124 2 4
UK LONDON RTB 20161125 2 2
UK LONDON RTB 20161126 2 2
UK LONDON RTB 20161127 2 2
UK LONDON RTB 20161128 2 2
UK LONDON RTB 20161129 2 2
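A hedged sketch of how this could be done with a windowing function instead of a subquery, assuming a hypothetical table name `t` and exactly one row per day per (FIELD1, FIELD2, FIELD3) group, so the previous 30 days are simply the previous 30 rows (matching the sample, where MAX_VAR excludes the current day):

```sql
-- Sketch; `t` is an assumed table name, and the one-row-per-day
-- assumption is what makes the ROWS frame equal "previous 30 days".
SELECT field1, field2, field3, `date`, var,
       MAX(var) OVER (
         PARTITION BY field1, field2, field3
         ORDER BY `date`
         ROWS BETWEEN 30 PRECEDING AND 1 PRECEDING
       ) AS max_var
FROM t;
```

If the data can have gaps (missing days), a date-based frame or a self-join on a date range would be needed instead of a fixed row count.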
- Tags:
- Data Processing
- Hive
08-23-2016
08:55 PM
@Sunile Manjee, the BETWEEN operator in non-equi joins is not working. Can you write out a simple example along those lines so that I can understand better? Thanks for your time and help!
08-22-2016
04:41 PM
1 Kudo
SELECT dim.Date1, *
from fact join Dim on Dim.Date1 between fact.Begin_date and fact.End_date

I have this SQL query and I need to convert it to a Hive query. I know that non-equi joins do not work in Hive, and I tried this, but it is not working:

SELECT dim.Date1, *
from Dim, fact
where dim.Date1 between fact.Begin_date and fact.End_date

Any thoughts?
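One hedged sketch of a rewrite: keep the join itself unconditional (CROSS JOIN) and move the non-equi BETWEEN predicate into the WHERE clause, which Hive can evaluate as a filter. Note the intermediate cross product can be large, and `fact.*` is used here because Hive does not accept a bare `*` next to a qualified column:

```sql
-- Sketch: non-equi predicate moved from ON to WHERE.
SELECT dim.Date1, fact.*
FROM fact
CROSS JOIN dim
WHERE dim.Date1 BETWEEN fact.Begin_date AND fact.End_date;
```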
- Tags:
- Data Processing
- join