Member since: 12-29-2015
Posts: 26
Kudos Received: 3
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
| 457 | 11-21-2017 07:21 AM
| 4857 | 09-07-2016 03:07 PM
07-24-2018
01:15 PM
Got the above error while selecting records from the Hive table.
07-24-2018
12:47 PM
Casting problem in Parquet file format in Hive. Earlier I had the datatype of one column as decimal, stored as Parquet. Then I changed the datatype to bigint. After this change I can no longer select data from the table; it shows this error:

Caused by:
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while
processing row [Error getting row data with exception
java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast
to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable

Please help me with this. Thanks!
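For anyone hitting the same ClassCastException: Parquet files keep the schema they were written with, so after an ALTER the reader's column type and the on-disk (or partition-level) type can disagree. A hedged sketch of one common workaround, not from this thread, with placeholder table/column names: make the column type match the data again, then rewrite the data under the desired type.

```sql
-- 1) Make the table readable again by reverting the column to the type
--    the Parquet files were written with (placeholder precision/scale).
ALTER TABLE sales CHANGE amount amount DECIMAL(10,0);

-- 2) Rewrite the data into a table that is bigint from the start.
CREATE TABLE sales_fixed STORED AS PARQUET AS
SELECT CAST(amount AS BIGINT) AS amount FROM sales;
```

For partitioned tables the type change may also need to be applied per partition, since each partition stores its own schema.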
Labels:
- Apache Hive
07-15-2018
01:56 PM
Thanks @Sandeep Nemuri @rguruvannagari, works fine!
07-15-2018
07:01 AM
@Sandeep Nemuri Thanks for your reply. The sort above gives the correct output only if the directory holds files from a single day. If the directory also has 20180704 files, the sorted output looks like:

abcd_1_20180703
abcd_1_20180704
abcd_2_20180703
abcd_2_20180704
abcd_3_20180703
abcd_3_20180704
abcd_4_20180703
abcd_4_20180704
abcd_5_20180703
abcd_5_20180704
abcd_6_20180703
abcd_6_20180704

But I expect:

abcd_1_20180703
abcd_2_20180703
abcd_3_20180703
abcd_4_20180703
abcd_5_20180703
abcd_6_20180703
abcd_1_20180704
abcd_2_20180704
abcd_3_20180704
abcd_4_20180704
abcd_5_20180704
abcd_6_20180704

Any idea?
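One way to get that order (a sketch, assuming the names always follow the `abcd_<seq>_<yyyymmdd>` pattern from the question) is a two-key sort: first on the date field, then numerically on the sequence field.

```shell
# Sort by the 3rd '_'-separated field (date) first, then numerically by the
# 2nd field (sequence number). A subset of the filenames from the question:
printf '%s\n' abcd_1_20180704 abcd_3_20180703 abcd_2_20180704 \
              abcd_1_20180703 abcd_3_20180704 abcd_2_20180703 \
  | sort -t_ -k3,3 -k2,2n
```

With real files, the same keys work on `ls -1 | sort -t_ -k3,3 -k2,2n`.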
07-13-2018
08:06 AM
How to sort filenames in the shell? My files look like:

abcd_2_20180703 abcd_4_20180703 abcd_5_20180703 abcd_1_20180703 abcd_3_20180703 abcd_6_20180703

and after sorting I expect:

abcd_1_20180703 abcd_2_20180703 abcd_3_20180703 abcd_5_20180703 ..

Please help me sort the files. TIA
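For the single-day case, a numeric sort on the sequence field is enough (a sketch, assuming the `abcd_<seq>_<yyyymmdd>` naming from the question):

```shell
# Numeric sort on the 2nd '_'-separated field (the sequence number).
printf '%s\n' abcd_2_20180703 abcd_4_20180703 abcd_5_20180703 \
              abcd_1_20180703 abcd_3_20180703 abcd_6_20180703 \
  | sort -t_ -k2,2n
```

The `-n` matters once sequence numbers reach two digits, since a plain lexical sort would put `abcd_10_...` before `abcd_2_...`.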
Tags:
- UNIX
11-21-2017
07:21 AM
Here is the solution: WHEN a1.Col1 = '1' THEN b1.col2 ELSE CAST('' AS timestamp) END AS Col3. It will work.
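Spelled out a bit: both CASE branches must yield the same type, so the empty-string default is cast to timestamp. Since `''` is not a valid timestamp, that branch evaluates to NULL rather than an empty string. A sketch of the expression using the aliases from the thread:

```sql
CASE WHEN a1.Col1 = '1' THEN b1.col2   -- timestamp branch
     ELSE CAST('' AS TIMESTAMP)        -- '' is not a valid timestamp => NULL
END AS Col3
```

If a genuinely empty string is required instead of NULL, the column would have to be converted the other way, e.g. with CAST(b1.col2 AS STRING) in the first branch.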
11-20-2017
03:02 PM
How to convert the timestamp type to string when my query has a CASE expression like: WHEN a1.Col1 = '1' THEN b1.col2 ELSE "" END AS Col3. Here Col1 is timestamp and Col3 is string, so when I run the query I get this error: "Argument type mismatch '""': The expressions after THEN should have the same type: "timestamp" is expected but "string" is found". What I expect: if the condition fails, an empty ("") value should be stored. Please help with this. Thanks in advance.
Labels:
- Apache Hadoop
- Apache Hive
07-18-2017
01:02 PM
Got "Invalid table alias or column reference 'NEW_AGE'" while using a CASE statement in Hive. Please find the query below. I want to get a result (NEW_AGE) from the first CASE expression and then feed that result (NEW_AGE) into another CASE expression.

SELECT CONSUMER_R,
       CNTRY_ISO_C,
       NEW_AGE = (CASE WHEN (DTBIRTH_Y = '0001-01-01')
                       THEN 0
                       ELSE cast((DATEDIFF(current_date, '0001-01-01') / 365) as smallint)
                  END),
       CASE WHEN NEW_AGE = 0 THEN CAST('99) UNKNOWN' as char(11))
            WHEN NEW_AGE BETWEEN 1 AND 18 THEN CAST('01) < 18' as char(11))
            WHEN NEW_AGE BETWEEN 18 AND 24 THEN CAST('02) 18 ~ 24' as char(11))
            WHEN NEW_AGE BETWEEN 25 AND 29 THEN CAST('03) 25 ~ 29' as char(11))
            WHEN NEW_AGE BETWEEN 30 AND 34 THEN CAST('04) 30 ~ 34' as char(11))
            WHEN NEW_AGE BETWEEN 35 AND 39 THEN CAST('05) 35 ~ 39' as char(11))
            WHEN NEW_AGE BETWEEN 40 AND 44 THEN CAST('06) 40 ~ 44' as char(11))
            WHEN NEW_AGE BETWEEN 45 AND 49 THEN CAST('07) 45 ~ 49' as char(11))
            WHEN NEW_AGE BETWEEN 50 AND 54 THEN CAST('08) 50 ~ 54' as char(11))
            WHEN NEW_AGE BETWEEN 55 AND 59 THEN CAST('09) 55 ~ 59' as char(11))
            WHEN NEW_AGE BETWEEN 60 AND 64 THEN CAST('10) 60 ~ 64' as char(11))
            WHEN NEW_AGE BETWEEN 65 AND 69 THEN CAST('11) 65 ~ 69' as char(11))
            WHEN NEW_AGE BETWEEN 70 AND 74 THEN CAST('12) 70 ~ 74' as char(11))
            WHEN NEW_AGE >= 75 THEN cast('13) 75 +' as char(11))
            ELSE cast('99) UNKNOWN' as char(11))
       END NEW_AGE_RANGE,
       UPDATE_Y,
       UPDATE_M
FROM datamart_db.M_C_CONSMR_TBL
WHERE (coalesce(AGE_R, 0) <> coalesce(NEW_AGE, 0));
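Two things go wrong here: Hive cannot reference a SELECT-list alias (NEW_AGE) elsewhere in the same SELECT, and the `NEW_AGE = (CASE ...)` assignment form is T-SQL style, not valid HiveQL. A hedged sketch of one way around it, computing the alias in a subquery so the outer query can use it (only a few of the range branches are kept; the sketch also assumes the DATEDIFF was meant to use DTBIRTH_Y rather than the literal '0001-01-01'):

```sql
SELECT CONSUMER_R, CNTRY_ISO_C, NEW_AGE,
       CASE WHEN NEW_AGE = 0              THEN CAST('99) UNKNOWN' AS char(11))
            WHEN NEW_AGE BETWEEN 1 AND 18 THEN CAST('01) < 18'    AS char(11))
            WHEN NEW_AGE >= 75            THEN CAST('13) 75 +'    AS char(11))
            ELSE CAST('99) UNKNOWN' AS char(11))
       END AS NEW_AGE_RANGE,
       UPDATE_Y, UPDATE_M
FROM (SELECT t.*,
             CASE WHEN DTBIRTH_Y = '0001-01-01' THEN 0
                  ELSE CAST(DATEDIFF(current_date, DTBIRTH_Y) / 365 AS smallint)
             END AS NEW_AGE
      FROM datamart_db.M_C_CONSMR_TBL t) sub
WHERE coalesce(AGE_R, 0) <> coalesce(NEW_AGE, 0);
```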
Labels:
- Apache Hive
05-18-2017
10:23 AM
How to import more than one table from DB2 to Hive using Sqoop? Here is my use case: I have around 100 tables in my DB2 database and need to import only 5 of them into Hive. How can I do that?
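A hedged sketch of one approach: `sqoop import-all-tables` only supports excluding tables (via `--exclude-tables`), so for 5 tables out of 100 it is usually simpler to loop over plain `sqoop import` calls. All connection details, credentials, and table names below are placeholders.

```shell
# Import five specific DB2 tables into Hive, one sqoop job per table.
for tbl in TABLE1 TABLE2 TABLE3 TABLE4 TABLE5; do
  sqoop import \
    --connect jdbc:db2://db2host:50000/MYDB \
    --username db2user -P \
    --table "$tbl" \
    --hive-import --hive-database mydb \
    -m 1
done
```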
Labels:
- Apache Sqoop
11-03-2016
06:46 AM
Thanks @Greg Keys. This is what I want 🙂
11-02-2016
12:53 PM
1 Kudo
I want to run my SH script only from Monday to Friday. How do I create an Oozie job for this case?
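A sketch of a coordinator using Oozie's cron-like frequency syntax (available in Oozie 4.1+); the string "0 6 * * MON-FRI" fires at 06:00 on weekdays only. App name, paths, and dates here are placeholders, and the workflow at `app-path` would run the SH script via a shell action.

```xml
<coordinator-app name="weekday-script" frequency="0 6 * * MON-FRI"
                 start="2016-11-07T06:00Z" end="2017-11-07T06:00Z"
                 timezone="UTC" xmlns="uri:oozie:coordinator:0.4">
  <action>
    <workflow>
      <app-path>${nameNode}/apps/weekday-script</app-path>
    </workflow>
  </action>
</coordinator-app>
```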
Labels:
- Apache Oozie
10-21-2016
01:58 PM
Thanks for your input @Greg Keys!!! Yes, the values have whitespace, so I trim them before the CAST. It works as expected: SELECT CAST(regexp_replace(regexp_replace(TRIM(column1),'\\.',''),',','.') as decimal(12,2)) FROM table_name;
10-21-2016
01:20 PM
Hive CAST returns NULL values. For example, one staging-area table column has data like -21.475,00, -26.609,00, -21.932,47, -17.300,00 (string). My expected output in the landing area is -21475,00, -26609,00, -21932,47, -17300.00 (decimal(12,2)). Staging-area column datatype: string. Landing-area column datatype: decimal(12,2). For the data movement from staging to landing I used an insert query with a select statement like: SELECT CAST(regexp_replace(regexp_replace(column1,'\\.',''),',','.') as decimal(12,2)) FROM table_name; but this query returns NULL values. Kindly help. Thanks in advance!!
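To sanity-check the replacement logic outside Hive: the query performs a two-step rewrite on each European-format value, first dropping the '.' thousands separators, then turning the decimal ',' into '.'. The same transformation shown with sed, on a sample value from the question:

```shell
# Drop '.' thousands separators, then convert the decimal comma to a dot.
echo '-21.475,00' | sed -e 's/\.//g' -e 's/,/./'
# → -21475.00
```

If the regexes are right but the CAST still yields NULL, the usual suspects are stray whitespace or other invisible characters in the source strings.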
Labels:
- Apache Hive
10-03-2016
10:10 AM
Yes! Thanks Ayub.
09-26-2016
01:45 PM
How to get the row number for a particular value from a Hive table: for example, I have a column named VAX_NUM, and one of its values is 0006756423. I want to know the row number for this value. Please help. Thanks.
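Hive tables have no inherent row order, so a "row number" is only meaningful relative to some ORDER BY. A hedged sketch using the column from the question (the table name and the ordering column are placeholders):

```sql
-- Assign positions with the ROW_NUMBER window function, then pick the row
-- holding the value of interest.
SELECT rn
FROM (SELECT VAX_NUM,
             ROW_NUMBER() OVER (ORDER BY VAX_NUM) AS rn  -- choose an ordering that fits
      FROM my_table) t
WHERE VAX_NUM = '0006756423';
```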
Labels:
- Apache Hive
09-07-2016
03:07 PM
1 Kudo
Thanks bpreachuk. 'field.delim' = '|' didn't help either, but we fixed the issue with the CSV SerDe properties below:

WITH SERDEPROPERTIES (
  "separatorChar" = ",",
  "quoteChar" = "\"",
  "escapeChar" = "\\",
  "serialization.encoding" = 'ISO-8859-1')
LOCATION '/path/'
TBLPROPERTIES (
  'store.charset' = 'ISO-8859-1',
  'retrieve.charset' = 'ISO-8859-1',
  'skip.header.line.count' = '1');
09-07-2016
01:33 PM
Thanks Jk for your immediate response. Using ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' WITH SERDEPROPERTIES("serialization.encoding"='UTF-8'); solved the Spanish-character issue. But we have one more column with values like -10,476.53, and because of this column we get column jumping: the value is stored in Hive as -10 in one column and 476.53 in another. Please help.