Member since: 10-01-2015
Posts: 3933
Kudos Received: 1150
Solutions: 374
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 3649 | 05-03-2017 05:13 PM |
|  | 3006 | 05-02-2017 08:38 AM |
|  | 3262 | 05-02-2017 08:13 AM |
|  | 3216 | 04-10-2017 10:51 PM |
|  | 1680 | 03-28-2017 02:27 AM |
03-13-2016
12:22 AM
1 Kudo
Go to Ambari > Hosts > (select the host) > Install Pig Client. You don't need to copy anything manually.
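If you'd rather script it, here's a rough sketch of the same step done through the Ambari REST API; the server URL, cluster name, host name, and credentials below are placeholders for illustration, not values from this thread.

```python
# Sketch only: add and install the PIG client component on one host via the Ambari REST API.
# The server URL, cluster name, host name, and credentials are assumed placeholders.
import requests

base = "http://ambari.example.com:8080/api/v1/clusters/mycluster"
auth = ("admin", "admin")
headers = {"X-Requested-By": "ambari"}
host = "worker1.example.com"

# Register the PIG client component on the host.
requests.post(f"{base}/hosts/{host}/host_components/PIG",
              auth=auth, headers=headers).raise_for_status()

# Ask Ambari to install it (same as clicking the install action in the UI).
requests.put(f"{base}/hosts/{host}/host_components/PIG",
             auth=auth, headers=headers,
             json={"HostRoles": {"state": "INSTALLED"}}).raise_for_status()
```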
03-12-2016
11:41 PM
1 Kudo
- https://community.hortonworks.com/content/kbentry/9782/nifihdf-dataflow-optimization-part-1-of-2.html
- https://community.hortonworks.com/content/kbentry/9785/nifihdf-dataflow-optimization-part-2-of-2.html
- https://community.hortonworks.com/content/kbentry/8631/how-to-create-nifi-fault-tolerance-using-multiple-1.html
- https://community.hortonworks.com/questions/22226/nifi-local-drives-raid-1-vs-raid-5.html#comment-22246
03-12-2016
11:38 PM
1 Kudo
Also, from the NiFi team: https://community.hortonworks.com/content/kbentry/7882/hdfnifi-best-practices-for-setting-up-a-high-perfo.html
03-12-2016
11:35 PM
1 Kudo
Then this https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html#configuration-best-practices
03-12-2016
11:32 PM
1 Kudo
Let's start with this https://community.hortonworks.com/questions/4098/nifi-sizing.html
03-12-2016
08:57 PM
On the same download page there is an archive section where you can find older versions. Here's the link for your convenience: http://hortonworks.com/products/hortonworks-sandbox/#archive
03-12-2016
05:20 PM
Please post your workflow and directory tree. I have a sample app using HCatalog with Pig; the difference for you would be to replace Pig with Hive. Please make sure you have hive-site.xml and proper permissions on the files. https://github.com/dbist/oozie/tree/master/apps/hcatalog
03-12-2016
03:05 PM
2 Kudos
Please see https://pig.apache.org/docs/r0.15.0/test.html#explain Use the EXPLAIN operator to review the logical, physical, and map reduce execution plans that are used to compute the specified relationship. If no script is given:
- The logical plan shows a pipeline of operators to be executed to build the relation. Type checking and backend-independent optimizations (such as applying filters early on) also apply.
- The physical plan shows how the logical operators are translated to backend-specific physical operators. Some backend optimizations also apply.
- The mapreduce plan shows how the physical operators are grouped into map reduce jobs.
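As a quick illustration (not from the original thread), here is a minimal sketch that runs EXPLAIN on a small relation in local mode; it assumes the Pig client is on the PATH and that a comma-separated input.txt exists.

```python
# Sketch only: run a tiny Pig script in local mode and print its execution plans with EXPLAIN.
# Assumes the `pig` client is installed and an `input.txt` with comma-separated rows exists.
import subprocess

pig_script = """
A = LOAD 'input.txt' USING PigStorage(',') AS (id:int, name:chararray);
B = FILTER A BY id > 10;
EXPLAIN B;
"""

# Prints the logical, physical, and MapReduce plans for relation B.
subprocess.run(["pig", "-x", "local", "-e", pig_script], check=True)
```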
03-12-2016
09:45 AM
Please refer to the Hive wiki: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+JoinOptimization
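As one concrete example of the techniques on that page, here's a hedged sketch that enables automatic map-join conversion and hints a map-side join from the Hive CLI; the table names and size threshold are placeholders, not from this thread.

```python
# Sketch only: turn on automatic map-join conversion and hint a map join explicitly.
# Assumes the `hive` CLI is available and that tables `orders` and `small_dim` exist.
import subprocess

query = """
SET hive.auto.convert.join=true;
SET hive.auto.convert.join.noconditionaltask.size=10000000;
SELECT /*+ MAPJOIN(d) */ o.id, d.name
FROM orders o JOIN small_dim d ON (o.dim_id = d.id);
"""

subprocess.run(["hive", "-e", query], check=True)
```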
03-12-2016
09:05 AM
Sure, please accept the answer if satisfied.