Member since: 01-07-2020
Posts: 64
Kudos Received: 1
Solutions: 0
06-17-2021
07:34 AM
I have some Impala queries in a file, and each time I run them I want to set mem_limit = 3gb from the command itself, not inside the file, i.e. in impala-shell -f /path/ (and add the mem_limit there). Is this possible?
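Something along these lines is what I have in mind, assuming a reasonably recent impala-shell that accepts per-invocation query options (the path is a placeholder and I am not sure which versions support the flag):

# Set MEM_LIMIT only for this run of the query file (newer impala-shell builds)
impala-shell --query_option=MEM_LIMIT=3gb -f /path/to/queries.sql

# On older shells, prepending the SET statement on the fly could be an alternative
impala-shell -q "$(printf 'SET MEM_LIMIT=3gb;\n'; cat /path/to/queries.sql)"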
Labels:
- Apache Impala
06-17-2021
03:46 AM
Hi, I have a table in Impala and I want its data to be sorted. I thought of moving the data into a temp table and then doing INSERT INTO, but the amount of data is huge and I was wondering whether there is another option to sort it in the background. Thanks in advance.
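Something like the sketch below is what I was imagining, if Impala's SORT BY clause on table creation can be combined with CTAS (table and column names are made up, and SORT BY only affects how new data files are written, not query-time ordering):

-- Write the data files of the new table sorted on the chosen columns
CREATE TABLE my_table_sorted
SORT BY (event_date, customer_id)
STORED AS PARQUET
AS SELECT * FROM my_table;

For a huge table it could presumably also be done partition by partition with INSERT ... SELECT instead of a single CTAS.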
Labels:
- Apache Impala
06-11-2021
01:40 AM
Hi, I am trying to run some Impala queries through impala-shell and every time it throws an error: ERROR: Failed to create thread scanner-thread (finst:1d4df837xxx:f84fad02xxx, plan-node-id:1, thread-idx:28) in category fragment-execution: boost::thread_resource_error: Resource temporarily unavailable. In Cloudera Manager the Impala service looks OK and stable. Why does this happen? Thanks in advance.
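In case it matters, the error seems to say the OS refused to create another thread for the impalad process, so I was planning to check the limits on the Impala host roughly like this (not sure this is the right place to look):

# Max processes/threads allowed for the user running impalad
ulimit -u
# Limits of the running impalad process itself
cat /proc/$(pgrep -o impalad)/limits
# System-wide thread ceiling
cat /proc/sys/kernel/threads-max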
Labels:
- Apache Impala
05-25-2021
05:25 AM
I am trying to learn Flume, and for this I am creating some rows to ingest. I have a row like "ted";"27";"hi, i am good; you?";"ok". With ; as the delimiter, it also splits the "hi, i am good; you?" field. I only want a field break when the ; has a " next to it. Is this possible?
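One workaround I was considering is to split on the full quoted delimiter ";" instead of the bare semicolon, so a ; inside a quoted field is left alone; a plain awk sketch just to illustrate the idea:

# Split on the literal three-character sequence ";" and strip the outer quotes
echo '"ted";"27";"hi, i am good; you?";"ok"' |
  awk -F'";"' '{ gsub(/^"|"$/, ""); print $1, "|", $2, "|", $3, "|", $4 }'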
Labels:
- Apache Flume
04-15-2021
08:18 AM
Hi, I am trying to run this command in HDFS to find files older than 2 hours: hdfs dfs -find /path/* -name "*.log.*" -mmin +120, but it doesn't recognize -mmin. How can I achieve that?
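As far as I can tell, hdfs dfs -find only understands -name/-iname and -print, so what I was hoping to approximate, assuming GNU date and the usual hdfs dfs -ls timestamp columns, is roughly:

# List *.log.* files whose modification time is more than 2 hours ago
cutoff=$(date -d '-2 hours' '+%Y-%m-%d %H:%M')
hdfs dfs -ls -R /path | grep '\.log\.' | while read -r perms repl owner group size d t file; do
  [ "$d $t" \< "$cutoff" ] && echo "$file"
done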
Labels:
- Apache Hadoop
03-22-2021
12:01 AM
Thanks for your quick response. As I wrote, though, I have already read Cloudera's documentation about UDFs and it didn't help me a lot. Is there any source with examples?
03-21-2021
11:56 PM
Hi, I run many queries with the statement below, and I was wondering whether, instead of writing the same CASE expressions over and over, it is possible to create a UDF in Impala to handle this, like in SQL Server:

(select people, sum(new_money) as new_money from
(
select people,
case
when pocket_01 = 'leather' then sum(nvl(money,0.0))/100 else 0.0
end +
case
when pocket_02 = 'leather' then sum(nvl(money,0.0))/100 else 0.0
end +
case
when pocket_03 = 'leather' then sum(nvl(money,0.0))/100 else 0.0
end +
case
when pocket_04 = 'leather' then sum(nvl(money,0.0))/100 else 0.0
end +
case
when pocket_05 = 'leather' then sum(nvl(money,0.0))/100 else 0.0
end +
case
when pocket_06 = 'leather' then sum(nvl(money,0.0))/100 else 0.0
end +
case
when pocket_07 = 'leather' then sum(nvl(money,0.0))/100 else 0.0
end ) as new_money from bank where date = '20201010'
) target
group by people ) as target

Is this possible? I have searched everywhere for a simple example of how to create a UDF, but I only found this in the Cloudera documentation, which didn't help me a lot.
Labels:
- Apache Impala
11-03-2020
02:34 AM
Hi, I am trying to run various Impala queries in HUE, but every time I get this error message:

SQL Error [500312] [HY000]: [Cloudera][ImpalaJDBCDriver](500312)
Error in fetching data rows: Disk I/O error on wn01:
Failed to open HDFS file <file's name>
Error(255): Unknown error 255
Root cause: Asn1Exception: Identifier doesn't match expected value (906)

I ran INVALIDATE METADATA on the table but the error persists. The queries themselves are fine, because last week they ran well. Thanks in advance.
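In case it helps, the Asn1Exception looks more like a Kerberos/ticket problem between Impala and HDFS than a problem with the queries; the checks I am planning to run from the Impala daemon host are roughly these (the path is a placeholder):

# Is there a valid Kerberos ticket in the cache the client is using?
klist
# Is the file actually present and healthy in HDFS?
hdfs dfs -ls /path/to/the/failing/file
hdfs fsck /path/to/the/failing/file -files -blocks -locations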
Labels:
- Apache Impala
04-08-2020
07:45 AM
Hello,
I want to transfer files from HDFS to Kudu. I tried this through Talend Fabric and its components, but I get an error: Cannot run anywhere due to node and executor blacklist.
Can you help me please? Thanks a lot.
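If I understand correctly, this message comes from Spark's task blacklisting rather than from Kudu or HDFS; as an experiment I was thinking of resubmitting the underlying Spark job with blacklisting relaxed while the executor failures are investigated (jar name and values are placeholders):

# Relax blacklisting for one run; the real jar/class come from the Talend job
spark-submit \
  --conf spark.blacklist.enabled=false \
  --conf spark.task.maxFailures=8 \
  your-talend-job.jar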
Labels:
- Apache Kudu
- HDFS
02-06-2020
01:15 AM
@robbiez Thanks for your answer, but I also want to ask something. If I have a big amount of data to be parsed and something goes wrong during parsing, does the parsing start again from zero, or can I do something so that it resumes from the part where the process crashed?