Member since: 08-18-2017
Posts: 145
Kudos Received: 19
Solutions: 17
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 1583 | 05-09-2024 02:50 PM |
|  | 5090 | 09-13-2022 10:50 AM |
|  | 2410 | 07-25-2022 12:18 AM |
|  | 4535 | 06-24-2019 01:56 PM |
|  | 2105 | 10-13-2018 04:40 AM |
10-04-2018
09:06 AM
HIVE-6348 should resolve this issue; I don't see any workaround for it. The fix is available in HDP 3.
10-03-2018
08:01 AM
1 Kudo
You might be hitting TEZ-2741. Can you try running the query with the following setting? set hive.compute.splits.in.am=false;
09-28-2018
07:21 AM
If the first day of the week should be Monday, change the anchor date used in the subtraction/addition to 1900-01-08 (which was a Monday).

-- First day of the week (Monday)
select date_sub('2018-09-12',pmod(datediff('2018-09-12','1900-01-08'),7));
+-------------+--+
| _c0 |
+-------------+--+
| 2018-09-10 |
+-------------+--+
--Last day of the week as Sunday
select date_add('2018-09-12',6 - pmod(datediff('2018-09-12','1900-01-08'),7));
+-------------+--+
| _c0 |
+-------------+--+
| 2018-09-16 |
+-------------+--+
09-25-2018
03:25 PM
1 Kudo
Using SQL:

-- First day of the week (Sunday)
select date_sub('2018-09-25',pmod(datediff('2018-09-25','1900-01-07'),7));
+-------------+--+
| _c0 |
+-------------+--+
| 2018-09-23 |
+-------------+--+
-- Last Day of the week
select date_add('2018-09-25',6 - pmod(datediff('2018-09-25','1900-01-07'),7));
+-------------+--+
| _c0 |
+-------------+--+
| 2018-09-29 |
+-------------+--+

If my answer helped to solve your problem, please accept it. It might help others in the community.
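The anchor-date arithmetic in the two answers above can be cross-checked in Python (a hypothetical sketch, not part of the original posts): 1900-01-08 was a Monday and 1900-01-07 a Sunday, so pmod(datediff(d, anchor), 7) counts days elapsed since the chosen week start.

```python
from datetime import date, timedelta

# Mirrors Hive's date_sub(d, pmod(datediff(d, anchor), 7)) and
# date_add(d, 6 - pmod(datediff(d, anchor), 7)).  Python's % with a
# positive modulus behaves like Hive's pmod, so dates before the
# anchor also work.
MONDAY_ANCHOR = date(1900, 1, 8)   # 1900-01-08 was a Monday
SUNDAY_ANCHOR = date(1900, 1, 7)   # 1900-01-07 was a Sunday

def week_start(d: date, anchor: date = MONDAY_ANCHOR) -> date:
    """First day of d's week, relative to the weekday of `anchor`."""
    return d - timedelta(days=(d - anchor).days % 7)

def week_end(d: date, anchor: date = MONDAY_ANCHOR) -> date:
    """Last day of d's week, relative to the weekday of `anchor`."""
    return d + timedelta(days=6 - (d - anchor).days % 7)
```

week_start(date(2018, 9, 12)) gives 2018-09-10 and week_start(date(2018, 9, 25), SUNDAY_ANCHOR) gives 2018-09-23, matching the query outputs in the posts above.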
09-25-2018
11:05 AM
1 Kudo
You can write a custom UDF in Hive to pick any day of the week; the built-in lastDay UDF code is a good reference. The core of a custom FirstDayOfWeek UDF would look like:

import java.text.SimpleDateFormat;
import java.util.Calendar;

SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd");
Calendar calendar = Calendar.getInstance();
calendar.setTime(formatter.parse("2018-09-16")); // actual date string column here
calendar.set(Calendar.DAY_OF_WEEK, Calendar.SUNDAY); // snap to this week's Sunday
System.out.println(calendar.getTime());
09-21-2018
04:29 PM
1 Kudo
Create a text table with comma (,) as the field delimiter:

create table textcomma(age int, name string) row format delimited fields terminated by ',' stored as textfile;
insert into textcomma values(1,'a'),(2,'b'),(3,'c');

Option 1: CTAS a text table with pipe (|) as the field delimiter:

create table textpipe row format delimited fields terminated by '|' stored as textfile as select * from textcomma;

# hadoop fs -cat /apps/hive/warehouse/textcomma/000000_0
1,a
2,b
3,c
# hadoop fs -cat /apps/hive/warehouse/textpipe/000000_0
1|a
2|b
3|c

Option 2: INSERT OVERWRITE DIRECTORY to write a textfile with pipe (|) as the delimiter:

INSERT OVERWRITE DIRECTORY '/tmp/text-pipe'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' STORED AS TEXTFILE
SELECT * FROM textcomma;

# hadoop fs -cat /tmp/text-pipe/000000_0
1|a
2|b
3|c

If this solves your query, please accept the answer. It might help others.
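The same re-delimiting can of course be done outside Hive. A minimal Python sketch (convert_delimiter is a hypothetical helper, assuming the values contain no embedded commas, pipes, or quoting, as in the Hive text files above):

```python
import csv
import io

def convert_delimiter(text: str, src: str = ",", dst: str = "|") -> str:
    """Rewrite delimited rows from `src`- to `dst`-separated,
    mirroring what the CTAS into textpipe produces."""
    out = io.StringIO()
    writer = csv.writer(out, delimiter=dst, lineterminator="\n")
    for row in csv.reader(io.StringIO(text), delimiter=src):
        writer.writerow(row)
    return out.getvalue()
```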
09-16-2018
02:36 PM
Is it an MR job? What is the value of hive.execution.engine? Are you using a specific queue to launch this job? It looks like a resource-availability issue. If the ApplicationMaster has launched, can you collect the application_1536988895253_0006 logs and check why the task containers are not being launched?
09-13-2018
03:00 PM
There are multiple ways to populate Avro tables:

1) insert into avro_table values(<col1>,<col2>,...,<colN>) -- this way Hive writes the Avro files.
2) Generate Avro files and copy them directly to '/tmp/staging'. You can read the Avro documentation on writing Avro files directly to an HDFS path.

The Avro reader/writer APIs take care of storing and retrieving the records, so we don't need to explicitly specify delimiters for Avro files.
09-12-2018
04:06 PM
You can create an external table with a location and write the text files directly to that path, e.g.:

create external table staging1(id struct<tid:string,action:string,createdts:timestamp>, cid string, anumber string) row format delimited fields terminated by ',' collection items terminated by '|' stored as textfile LOCATION '/tmp/staging/';

All the text files can then be written directly to /tmp/staging/ by Kafka or Flume. If Kafka or Flume can generate Avro files, you can skip the staging table, create an external Avro table, and write the Avro files directly to that table's location.
09-12-2018
03:55 PM
Yes. The file content will be:

# hadoop fs -cat /tmp/data1.txt
1|success|2018-09-12 17:45:39.69,3,12345

Then load the content into the staging table:

load data inpath '/tmp/data1.txt' into table staging;

Then load it from staging into the actual Avro table:

insert into testtbl select * from staging;

If my answer helped you resolve your issue, you can accept it. It will be helpful for others.
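As a sketch of how Hive splits that staging row (hypothetical Python, not part of the answer, assuming the delimiters from the staging1 DDL in the earlier reply: fields terminated by ',' and collection items terminated by '|'):

```python
def parse_row(line: str) -> dict:
    """Split one staging line the way Hive's delimited SerDe would:
    top-level fields on ',', then the struct's items on '|'."""
    id_field, cid, anumber = line.split(",")
    tid, action, createdts = id_field.split("|")
    return {"id": {"tid": tid, "action": action, "createdts": createdts},
            "cid": cid, "anumber": anumber}
```

For the line above, parse_row yields the struct (tid=1, action=success, createdts=2018-09-12 17:45:39.69) plus cid=3 and anumber=12345.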