Member since: 11-03-2016
Posts: 4
Kudos Received: 0
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1076 | 03-28-2017 04:34 PM |
03-28-2017 04:34 PM
Answer is: do not install the workers in a dedicated VPC (even if that option is offered), or set up a bridge between the VPCs.
02-23-2017 08:19 PM
Attempting an HDP 2.5 Data Cloud setup in AWS. The cluster fails with the error: "Cannot use the specified Ambari stack: HDPRepo{stack='null'; utils='null'}. Error: java.lang.IllegalArgumentException: Could not access base url http://private-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.5.0.1-139. connect timed out". The URL does not seem to be the right one for getting the necessary packages: curl returns error 404. How can this be fixed?
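A quick way to reproduce the check is to rebuild the base URL from the components in the error message and probe it with curl (the host, OS, and build-number values below are copied from the error; everything else is a plain shell sketch, not part of the original post):

```shell
#!/bin/sh
# Rebuild the HDP repo base URL from the pieces seen in the error message.
REPO_HOST="http://private-repo-1.hortonworks.com"
OS="centos6"
STACK_BUILD="2.5.0.1-139"
BASE_URL="${REPO_HOST}/HDP/${OS}/2.x/updates/${STACK_BUILD}"
echo "Checking ${BASE_URL}"
# -I fetches headers only, -s silences progress, -m 5 caps the wait at 5s.
# An "HTTP/1.1 404" status line here means that build number does not
# exist on the repo server, matching the symptom described above.
curl -s -I -m 5 "${BASE_URL}/" | head -n 1
```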
12-29-2016 05:08 PM
Hi, if you ssh xxx.xxx.xxx.xxx -p 22 you'll be in CentOS and you'll be able to use root.
For this, use ALT+F5 or PuTTY. If you ssh xxx.xxx.xxx.xxx -p 2222 you'll be connected to the sandbox itself (i.e. the Docker container).
In Docker, you'll be able to use the predefined accounts (maria_dev, raj_ops, ...) but not root... guess why ;-).
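The two entry points above can be sketched as follows (the xxx.xxx.xxx.xxx placeholder and the tiny helper function are illustrative, not from the original post; the port numbers are the ones quoted above):

```shell
#!/bin/sh
# Port 22  -> the CentOS host OS of the sandbox VM (root login works):
#   ssh root@xxx.xxx.xxx.xxx -p 22
# Port 2222 -> the HDP sandbox running inside a Docker container
# (only the predefined accounts such as maria_dev or raj_ops work):
#   ssh maria_dev@xxx.xxx.xxx.xxx -p 2222

# Hypothetical helper: map a target name to the right SSH port.
sandbox_port() {
  case "$1" in
    host)    echo 22 ;;    # CentOS host OS
    sandbox) echo 2222 ;;  # Dockerized HDP sandbox
    *)       echo "unknown target: $1" >&2; return 1 ;;
  esac
}

sandbox_port host
sandbox_port sandbox
```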
11-03-2016 11:45 PM
Actually I refer to a complete schema as soon as possible: first when I describe it while loading the Hive table, then when I transform the data. This works fine for me when I need to clean fields with complex criteria within the same table.
A = LOAD 'hive_table' USING org.apache.hive.hcatalog.pig.HCatLoader() AS (f0:chararray, f1:chararray, f2:chararray);
B = FOREACH A GENERATE $0 AS (f0:chararray), $1 AS (f1:chararray), REPLACE($2, 'John Doe', 'Mr Bean') AS (f2:chararray);
STORE B INTO 'hive_table' USING org.apache.hive.hcatalog.pig.HCatStorer();