Member since: 04-03-2019
Posts: 92
Kudos Received: 6
Solutions: 5
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3443 | 01-21-2022 04:31 PM |
| | 5882 | 02-25-2020 10:02 AM |
| | 3553 | 02-19-2020 01:29 PM |
| | 2568 | 09-17-2019 06:33 AM |
| | 5603 | 08-26-2019 01:35 PM |
08-26-2019
09:26 AM
I am using HDP-3.1 with YARN 3.1.1, and the YARN application type is TEZ. Hive MapReduce jobs used to work fine, but after a complete restart of HDP no MapReduce job can run: the jobs just hang and I have to kill them. The job diagnostics show:
Application is added to the scheduler and is not yet activated. Skipping AM assignment as cluster resource is empty. Details : AM Partition = <DEFAULT_PARTITION>; AM Resource Request = <memory:4096, vCores:1>; Queue Resource Limit for AM = <memory:0, vCores:0>; User AM Resource Limit of the queue = <memory:0, vCores:0>; Queue AM Resource Usage = <memory:0, vCores:0>;
I changed the configuration yarn.scheduler.capacity.maximum-am-resource-percent from 0.2 to 0.5 but still got the same result.
Another issue is that I do not know how to restart YARN. From the Ambari UI, I used YARN > Actions > Restart All and then clicked the "CONFIRM RESTART ALL" button to confirm, but nothing happened.
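For reference, the property changed above lives in capacity-scheduler.xml (in Ambari: YARN > Configs > Scheduler). A minimal sketch of the relevant entry, assuming the default Capacity Scheduler is in use:

```xml
<!-- capacity-scheduler.xml: maximum fraction of cluster resources
     that may be used to run Application Masters -->
<property>
  <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
  <value>0.5</value>
</property>
```

Note that the diagnostic reports a Queue Resource Limit of <memory:0, vCores:0>, i.e. the queue sees no cluster capacity at all, so raising this percentage alone cannot help: 0.5 of zero is still zero.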
08-01-2019
06:44 PM
The problem went away after I changed my script to add "order by Field1": select row_number() over (order by Field1) as Key1 from mytable;
08-01-2019
12:21 AM
I have the same problem. I have a table with over 3k records. I ran the following query in Hive and saw the Key1 column recycle numbers from 1 to 1024: select row_number() over () as Key1 from mytable;
07-31-2019
04:17 PM
@Geoffrey Shelton Okot That is exactly what I missed. Thank you very much for the prompt and right-to-the-point response.
07-30-2019
09:40 PM
I am following the instructions at the links below to add users to the Hive admin role: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_dataintegration/content/hive-013-feature-sql-standard-based-grant-revoke.html and https://community.hortonworks.com/articles/4568/sql-based-authorization-in-hive.html However, I could not find the configuration "hive.users.in.admin.role" in the Ambari UI (Hive > Configs > Advanced). I downloaded hive-site.xml (via Ambari's Download Client Configs button) and found no such configuration in the XML file either. What did I miss? Below are my current Hive security settings. Authorization: SQL-Standard Based (SQLStdAuth). Authentication: None. Run as end user instead of Hive user: False.
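If hive.users.in.admin.role does not appear under Advanced, it can usually be added as a custom property (in Ambari: Hive > Configs > Advanced > Custom hive-site). A sketch of the resulting hive-site.xml entry, with a placeholder user name:

```xml
<!-- hive-site.xml: comma-separated list of users granted the Hive admin role -->
<property>
  <name>hive.users.in.admin.role</name>
  <!-- "alice" is a placeholder; substitute your own user name -->
  <value>hive,alice</value>
</property>
```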
Labels:
- Apache Hive
06-14-2019
10:04 PM
@Geoffrey Shelton Okot My intention is to use "su hive", not "su - hive". Below is the resource showing the difference. https://www.tecmint.com/difference-between-su-and-su-commands-in-linux/ Thanks.
06-14-2019
09:57 PM
@Gulshad Ansari Thanks. You are right. The execute permission is required for entering the directory.
06-12-2019
09:56 PM
My issue was actually caused by the fact that port 10000 is not working. After installing the Hive client on another data node, where port 10000 is working, I was able to follow the steps above and create the DSN successfully.
06-03-2019
04:14 AM
@Shu Your code put me on the right track. Thanks again. However, I got some strange records back. Here are my guesses at the possible causes. 1. The row delimiter is CR+LF, which is not the default row delimiter of the regex SerDe. How can I specify the row delimiter? 2. There are two strange characters in the first column of the first row; this might be related to the row delimiter too. Below is the create-table script. Create External Table slog(LogTime string, LogSource string, LogMessage string) ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe' WITH SERDEPROPERTIES ("input.regex" = "(.{46})(.{24})(.*)") LOCATION '/path/to/slog/'; I attached a screenshot of the log file (in Notepad++) and the Hive query result.
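The fixed-width regex in the script can be checked outside Hive. A small sketch in Python (the sample line and field contents are illustrative, not taken from the actual log):

```python
import re

# Same pattern as in the SERDEPROPERTIES: a 46-char field, a 24-char field, the rest
pattern = re.compile(r"(.{46})(.{24})(.*)")

# Illustrative line padded to the fixed widths the pattern expects
line = "2019-06-01 12:00:00".ljust(46) + "app-server-01".ljust(24) + "Started OK"
m = pattern.match(line)
print(m.group(1).strip())  # 2019-06-01 12:00:00
print(m.group(2).strip())  # app-server-01
print(m.group(3))          # Started OK

# Any stray leading bytes, e.g. a UTF-8 BOM at the start of the file, shift the
# fixed-width windows and show up as odd characters in the first column.
m2 = pattern.match("\ufeff" + line)
print(repr(m2.group(1)[0]))  # '\ufeff'
```

A leading byte-order mark is one plausible explanation (an assumption, not confirmed by the attached screenshot) for the strange characters in the first column of the first row.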
06-02-2019
03:22 AM
@Shu Thanks for your help. I am currently on the road and will test your solution once I am back in the office.