Member since: 01-12-2016
Posts: 123
Kudos Received: 12
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1501 | 12-12-2016 08:59 AM |
12-31-2016
06:51 AM
@vamsi valiveti
a) This slide is from a Hortonworks training course; the course and slides are available to paid customers only.
b) i) There is no separate "grouping" phase. ii) Shuffle happens when data moves from the map side to the reduce side (please see the diagram), and merge happens during the sort phase on the reducer side.
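A toy sketch of that data flow in plain Python (not Hadoop code; the keys and values are made up) may help place shuffle and merge:

```python
from itertools import groupby
from operator import itemgetter

# Toy model of the MapReduce flow described above, with invented data.
# Map emits (key, value) pairs; shuffle moves map output toward reducers;
# sorted runs are merged on the reducer side so each key's values arrive together.
map_output = [("b", 1), ("a", 1), ("b", 1), ("a", 1)]

# Shuffle + sort/merge: bring identical keys next to each other.
shuffled = sorted(map_output, key=itemgetter(0))

# Reduce: sum the values for each key.
reduced = {k: sum(v for _, v in grp)
           for k, grp in groupby(shuffled, key=itemgetter(0))}
print(reduced)  # {'a': 2, 'b': 2}
```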
12-29-2016
03:51 PM
@vamsi valiveti it could be an option, right. But for production use I would also think about how to stop the agents and how to monitor them. In my experience, an init.d service script plus Ganglia monitoring is the best option. It lets you start and stop agents easily with commands like `/etc/init.d/flume agent stop/start`, and Ganglia provides a nice web interface for monitoring.
12-16-2016
03:33 PM
You are not doing anything wrong, and neither is the book. The limitation is with the formula itself: it does not account for scenarios where [ (caNT - dsII) / dsF ] produces a fraction. In such situations the caNT will not match current(0) through calculation alone, without eyeballing it. If you take a look at the textbook, it says: "Notably, the nominal time 2014-10-19T06:00Z and current(0) do not exactly match in this example."
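A quick arithmetic check of that limitation in Python, using made-up dataset values (the post does not give the initial-instance or frequency; only the nominal time comes from the book's quote, and current(0) = dsII + floor((caNT - dsII) / dsF) * dsF is assumed as the resolution rule):

```python
from datetime import datetime, timedelta
from math import floor

# Hypothetical dataset definition; only caNT matches the book's quote.
caNT = datetime(2014, 10, 19, 6, 0)   # coordinator action nominal time (UTC)
dsII = datetime(2014, 10, 18, 0, 0)   # dataset initial-instance (invented)
dsF  = timedelta(hours=8)             # dataset frequency (invented)

ratio = (caNT - dsII) / dsF           # 3.75 -- a fraction
current0 = dsII + floor(ratio) * dsF  # current(0) truncates the fraction

print(ratio)     # 3.75
print(current0)  # 2014-10-19 00:00:00, which does not equal caNT (06:00)
```

Because the division is fractional, the floor throws away 0.75 of a frequency interval, so current(0) lands six hours before the nominal time.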
12-12-2016
08:59 AM
I got the answer as below:

isempty_data_1 = FILTER t_1 BY SIZE(failTime) > 0;

(23,{(6,Archana,Mishra,23,9848022335,Chennai)})
(24,{(8,Bharathi,Nambiayar,24,9848022333,Chennai)})
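The same filter idea in plain Python, with hypothetical data modeled on the output above (the empty bag for key 25 is invented to show what SIZE(...) > 0 removes):

```python
# Hypothetical bags keyed by id; 25's empty bag is made up for illustration.
t_1 = {
    23: [(6, "Archana", "Mishra", 23, "9848022335", "Chennai")],
    24: [(8, "Bharathi", "Nambiayar", 24, "9848022333", "Chennai")],
    25: [],  # an empty bag, which the size filter should drop
}

# Analogue of FILTER t_1 BY SIZE(failTime) > 0: keep non-empty bags only.
isempty_data_1 = {k: bag for k, bag in t_1.items() if len(bag) > 0}
print(sorted(isempty_data_1))  # [23, 24]
```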
12-01-2016
04:00 PM
Same answer: since z2 is a bag, you need to flatten it to a tuple before you can do a DISTINCT on it. For the data you are showing:

z3 = FOREACH z2 GENERATE FLATTEN(BagToTuple($0));
z4 = DISTINCT z3;

The link gives a detailed explanation of why this is required.
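A loose plain-Python analogue of the intent (flatten, then distinct), with made-up data since z2's real schema is not shown; this mirrors the effect, not Pig's exact BagToTuple semantics:

```python
# Hypothetical records, each carrying a "bag" of tuples (invented data).
z2 = [[("A", 1), ("A", 1)], [("A", 1)], [("B", 2)]]

# Flatten: unnest every bag into individual tuples (rows).
z3 = [t for bag in z2 for t in bag]

# Distinct: drop duplicate tuples (duplicates were hidden inside bags before).
z4 = sorted(set(z3))
print(z4)  # [('A', 1), ('B', 2)]
```

Without the flatten, a distinct would compare whole bags rather than the tuples inside them, which is why the flattening step is required.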
11-16-2016
09:51 AM
I tried the below command, but it is not changing the permissions to 777; it is changing them to rw-rw-rw-:

hadoop fs -chmod 777 -R /vamsi/part-m-00003
07-01-2017
03:17 PM
The right way to think about LATERAL VIEW is that it allows a table-generating function (UDTF) to be treated as a table source, so that it can be used like any other table, including selects, joins and more.
LATERAL VIEW is often used with explode, but explode is just one UDTF among many; a full list is available in the documentation.
To take an example:
select tf1.*, tf2.*
from (select 0) t
lateral view explode(map('A',10,'B',20,'C',30)) tf1
lateral view explode(map('A',10,'B',20,'C',30)) tf2;
This results in:
tf1.key | tf1.value | tf2.key | tf2.value |
---|---|---|---|
A | 10 | A | 10 |
A | 10 | B | 20 |
A | 10 | C | 30 |
B | 20 | A | 10 |

(5 rows were truncated)
The thing to see here is that this query is a cross product join between the tables tf1 and tf2. The LATERAL VIEW syntax allowed me to treat them as tables. The original question used "AS" syntax, which automatically maps the generated table's columns to column aliases. In my view it is much more powerful to leave them as tables and use their fully qualified table correlation identifiers.
These tables can be used in joins as well:
select tf1.*, tf2.*
from (select 0) t
lateral view explode(map('A',10,'B',20,'C',30)) tf1
lateral view explode(map('A',10,'B',20,'C',30)) tf2 where tf1.key = tf2.key;
Now we get:
tf1.key | tf1.value | tf2.key | tf2.value |
---|---|---|---|
A | 10 | A | 10 |
B | 20 | B | 20 |
C | 30 | C | 30 |
11-04-2016
02:07 PM
1) 'show tables;' is standard SQL to retrieve table names. '!tables' is specific to Beeline; use 'show tables;' to ensure your SQL is portable across other SQL clients.
2) Use '!sh <script>' to run shell commands, for example:

0: jdbc:hive2://hdp224.local:10000/default> !sh hdfs dfs -ls /
Found 9 items
drwxrwxrwx   - yarn    hadoop   0 2016-11-01 14:07 /app-logs
drwxr-xr-x   - hdfs    hdfs     0 2016-11-01 12:41 /apps
drwxr-xr-x   - yarn    hadoop   0 2016-11-01 15:55 /ats
drwxr-xr-x   - UserA   users    0 2016-11-01 14:29 /data
drwxr-xr-x   - hdfs    hdfs     0 2016-11-01 12:38 /hdp
drwxr-xr-x   - hdfs    mapred   0 2016-11-01 12:38 /mapred
drwxrwxrwx   - mapred  hadoop   0 2016-11-01 12:38 /mr-history
drwxrwxrwx   - hdfs    hdfs     0 2016-11-01 15:56 /tmp
drwxr-xr-x   - hdfs    hdfs     0 2016-11-01 14:06 /user

Thanks, hopefully that is what you wanted to know.