Member since: 05-02-2019
Posts: 319
Kudos Received: 145
Solutions: 59
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7107 | 06-03-2019 09:31 PM |
| | 1721 | 05-22-2019 02:38 AM |
| | 2174 | 05-22-2019 02:21 AM |
| | 1357 | 05-04-2019 08:17 PM |
| | 1667 | 04-14-2019 12:06 AM |
04-14-2020 09:44 AM
Is it possible to do something similar, but receiving the Server, Port, and Path where the files are? Because GetSFTP and ListSFTP can't accept any incoming input.
12-09-2019 11:07 PM
Hi, I hope this document clarifies your doubts; it is a Spark tuning guide: https://blog.cloudera.com/how-to-tune-your-apache-spark-jobs-part-2/ Thanks, AK
05-04-2019 08:17 PM
Probably because of HCatalog, which can be extremely useful for Pig programmers even if they don't want to use Hive itself; they can leverage it for schema management instead of defining AS clauses in their LOAD commands. Just as likely, this is something hard-coded into Ambari. If you really don't want Hive, I bet you can just delete it after installation. For giggles, I stood up an HDFS-only HDP 3.1.0 cluster for https://community.hortonworks.com/questions/245432/is-it-possible-to-install-only-hdfs-on-linux-machi.html?childToView=245544#answer-245544 and just added Pig (it required YARN, MR, Tez & ZK, but that makes sense!), and it did NOT require Hive to be added, as seen below. Good luck and happy Hadooping!
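To make the HCatalog angle concrete, here is a minimal sketch, assuming a Hive/HCatalog table named default.salaries already exists in the metastore (the table name and script file are hypothetical):

```bash
# Hypothetical sketch: let HCatalog supply the schema so the Pig LOAD
# statement needs no AS clause.
cat > hcat_demo.pig <<'PIG'
-- schema comes from the HCatalog metastore, not an inline AS clause
salaries = LOAD 'default.salaries' USING org.apache.hive.hcatalog.pig.HCatLoader();
DESCRIBE salaries;
PIG

# -useHCatalog puts the HCatalog jars on Pig's classpath
pig -useHCatalog hcat_demo.pig
```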
04-15-2019 08:32 AM
Thanks @Lester Martin, I'll keep the balancer admin command in mind. I solved the issue simply by removing a huge file created by a data scientist executing a very large query on Hive. The temporary files located at /tmp/hive/[user] seem not to be replicated (I'm not sure of that).
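One way to check that replication hunch rather than guess, as a small sketch using standard HDFS commands (the path pattern is the one from above; adjust for your cluster):

```bash
# Print the replication factor (%r) and name (%n) of each file under /tmp/hive
hdfs dfs -stat '%r %n' '/tmp/hive/*/*'

# Or have fsck report per-file block and replication details for the whole tree
hdfs fsck /tmp/hive -files -blocks
```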
03-06-2019 11:15 PM
Welcome to Phoenix... where the cardinal rule is: if you are going to use Phoenix, then, for that table, don't look at it or use it directly from the HBase API. What you are seeing is pretty normal. I don't see your DDL, but I'll give you an example to compare against. Check out the DDL at https://github.com/apache/phoenix/blob/master/examples/WEB_STAT.sql and focus on the CORE column, which is a BIGINT, and the ACTIVE_VISITOR column, which is an INTEGER. Here's the data that gets loaded into it: https://github.com/apache/phoenix/blob/master/examples/WEB_STAT.csv. Here's what it looks like via Phoenix... Here's what it looks like through the HBase shell (using the API)... Notice the CORE and ACTIVE_VISITOR values looking a lot like your example? Yep, welcome to Phoenix. Remember, use Phoenix only for Phoenix tables and you'll be all right. 🙂 Good luck and happy Hadooping/HBasing!
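For reference, a hedged sketch of how one might pull up those two views from the command line; the sqlline.py path and ZooKeeper quorum below are environment-specific assumptions, not part of the original answer:

```bash
# Through Phoenix, CORE and ACTIVE_VISITOR come back as readable numbers
echo "SELECT CORE, ACTIVE_VISITOR FROM WEB_STAT LIMIT 3;" | \
  /usr/hdp/current/phoenix-client/bin/sqlline.py localhost:2181:/hbase-unsecure

# Through the raw HBase shell, the same cells are Phoenix's serialized bytes
# (BIGINT/INTEGER values appear as \x80...-prefixed byte arrays, not numbers)
echo "scan 'WEB_STAT', {LIMIT => 3}" | hbase shell
```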
04-27-2018 11:12 AM
Surely NOT the same issue, but along this line of buggy behavior in the HDP Sandbox (2.6.0.3): while using Hive and getting messages mentioning the hostnames sandbox and sandbox.hortonworks.com, I got this message a few times.

FAILED: SemanticException Unable to determine if hdfs://sandbox.hortonworks.com:8020/user/root/salarydata is encrypted: java.lang.IllegalArgumentException: Wrong FS: hdfs://sandbox.hortonworks.com:8020/user/root/salarydata, expected: hdfs://sandbox:8020

It seems to go away if I just exit the SSH connection and establish it again.
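If it recurs, a small sketch for checking which default filesystem the client session is resolving (the mismatch that "expected: hdfs://sandbox:8020" is complaining about):

```bash
# Show the fs.defaultFS the current client configuration resolves to; if it
# disagrees with the scheme/authority in the error, you have the same mismatch
hdfs getconf -confKey fs.defaultFS
```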
07-31-2017 12:34 PM
Yep, this could work, but for a big cluster I could imagine it being time-consuming. The initial recursive listing (especially since it will go all the way down to the file level) could be quite large for a file system of any size. The more time-consuming effort would be running the "hdfs dfs -count" command over and over and over. But... like you said, this should work. Preferably, I'd want the NN to just offer a "show me all quota details" option, or at least a "show me directories w/quotas" one. Since this function is not present, maybe there is a performance hit for the NN to quickly determine this that I'm not considering, as it seems lightweight to me. Thanks for your suggestion.
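For reference, a sketch of the approach described above; the root path and the batching are illustrative assumptions:

```bash
# -q adds the quota columns: QUOTA, REM_QUOTA, SPACE_QUOTA, REM_SPACE_QUOTA,
# DIR_COUNT, FILE_COUNT, CONTENT_SIZE, PATHNAME
hdfs dfs -count -q -h /user/someuser

# Scripting the "over and over" part: take the recursive listing, keep only
# directories (permission string starts with 'd'), and feed them to -count
hdfs dfs -ls -R / | awk '$1 ~ /^d/ {print $NF}' | xargs hdfs dfs -count -q -h
```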
06-23-2017 07:18 PM
Thanks so much @Lester Martin, I appreciate your help. I replaced my statement with yours and it worked:

salaries_cl = FOREACH salaries_fl GENERATE (int)year AS year:int, $1, $2, $3, (long)salary AS salary:long;

It's weird that the other one didn't work, but thanks so much.
06-07-2017 08:06 PM
Excellent. Truthfully, case sensitivity is a bit weird in Pig (keywords are case-insensitive, but relation and field aliases are case-sensitive) -- kind of like the rules of the English language. Hehe!
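A tiny sketch of that quirk, run in Pig's local mode (the file name is hypothetical):

```bash
# 'load'/'LOAD' are interchangeable keywords, but the aliases 'a' and 'B'
# must be referenced with exactly the case they were defined with
pig -x local -e "a = load 'data.csv' using PigStorage(',') as (f1:int); B = FOREACH a GENERATE f1; DUMP B;"
```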