Member since: 09-24-2015
Posts: 527
Kudos Received: 136
Solutions: 19
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2130 | 06-30-2017 03:15 PM
 | 3053 | 10-14-2016 10:08 AM
 | 8318 | 09-07-2016 06:04 AM
 | 10063 | 08-26-2016 11:27 AM
 | 1449 | 08-23-2016 02:09 PM
03-31-2016
07:13 PM
Hi: after upgrading to Ambari 2.2.1, I can upgrade to HDP 2.4. Many thanks.
03-31-2016
06:15 PM
OK, thanks. Actually we are going to use SparkR, and I would like to use the latest version of Spark. So first I need to upgrade to Ambari 2.2.1, right? And then upgrade HDP? Thanks.
03-31-2016
04:50 PM
I can't register target version 2.4 on HDP 2.3.4. My Ambari version is: ADMIN_VIEW{2.2.0.0
Labels:
- Hortonworks Data Platform (HDP)
03-30-2016
04:39 PM
Hi: if the slaves have these directories: /data/1,/data/2,/data/3,/data/4,/data/5, can the master have just these? /data/1/namenode,/data/2/namenode
03-30-2016
01:56 PM
1 Kudo
How much disk do you recommend having on the manager server for: /tmp, /usr/hdp, /var/log, /home? Does the manager need to have HDFS disk? Thanks.
Labels:
- Apache Hadoop
- Apache YARN
03-29-2016
01:20 PM
Hi: I can't access the file browser in Hue with any user to see the files. Why? Cannot access: /user/hdfs.
SecurityException: Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: User: hue is not allowed to impersonate hdfs (error 403)
The hdfs user is in the hadoop group. On the Hue host: [root@a01hop01 hue]# id hue
uid=1015(hue) gid=1015(hue) groups=1015(hue)
Thanks
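The 403 above is Hadoop refusing to let the hue service user impersonate hdfs. The usual fix is the standard Hadoop proxyuser properties for the hue user in core-site.xml (a sketch; the "*" values are deliberately permissive examples, and HDFS must be restarted afterwards):

```xml
<!-- core-site.xml: allow the hue user to impersonate other users.
     "*" is permissive; restrict hosts/groups in production. -->
<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <value>*</value>
</property>
```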
Labels:
- Apache Hadoop
- Cloudera Hue
03-26-2016
08:13 AM
Finally, I insert with buckets like this:
CREATE EXTERNAL TABLE IF NOT EXISTS journey_importe_v2(
FECHAOPRCNF date,
codnrbeenf string,
codnrbeenf2 string,
CODTXF string,
FREQ BIGINT,
IMPORTE DECIMAL(9, 2)
)
CLUSTERED BY (codnrbeenf) INTO 25 BUCKETS
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
stored as ORC
LOCATION '/RSI/tables/logs/importe_v2'
TBLPROPERTIES ("immutable"="false","transactional"="true");
create table IF NOT EXISTS temp_journey_importe_v2 (importe STRING);
LOAD DATA INPATH '/RSI/staging/output/journey_importe/${date}' OVERWRITE INTO TABLE temp_journey_importe_v2;
set hive.enforce.bucketing = true;
INSERT INTO TABLE journey_importe_v2
SELECT
regexp_extract(importe, '^(?:([^,]*)\,?){1}', 1) FECHAOPRCNF,
regexp_extract(importe, '^(?:([^,]*)\,?){2}', 1) codnrbeenf,
regexp_extract(importe, '^(?:([^,]*)\,?){3}', 1) codnrbeenf2,
regexp_extract(importe, '^(?:([^,]*)\,?){4}', 1) CODTXF,
regexp_extract(importe, '^(?:([^,]*)\,?){5}', 1) FREQ,
regexp_extract(importe, '^(?:([^,]*)\,?){6}', 1) IMPORTE
from temp_journey_importe_v2;
Is there a better way? How many buckets do you recommend I use?
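One simpler route (a sketch, assuming the staging files really are plain comma-delimited text with these six fields in order) is to declare the staging table with the real columns and a comma delimiter, so the delimited SerDe splits the fields and no regexp_extract is needed:

```sql
-- Hypothetical alternative staging table: let the delimited SerDe
-- split the columns instead of regexp_extract on one string column.
CREATE TABLE IF NOT EXISTS temp_journey_importe_v2 (
  FECHAOPRCNF string,
  codnrbeenf  string,
  codnrbeenf2 string,
  CODTXF      string,
  FREQ        bigint,
  IMPORTE     decimal(9, 2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

LOAD DATA INPATH '/RSI/staging/output/journey_importe/${date}'
  OVERWRITE INTO TABLE temp_journey_importe_v2;

SET hive.enforce.bucketing = true;
INSERT INTO TABLE journey_importe_v2
SELECT CAST(FECHAOPRCNF AS date), codnrbeenf, codnrbeenf2,
       CODTXF, FREQ, IMPORTE
FROM temp_journey_importe_v2;
```

This keeps the bucketed ORC target table exactly as defined above; only the staging/parse step changes.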
03-19-2016
07:25 PM
1 Kudo
Hi: finally it works like this, with double quotes:
RUTAHIVE=$ANYODIR"/"${BATCH:2:1}"/"${BATCH:3}""
beeline -u jdbc:hive2://hostname:10000 -n hive -p temporal01 -f /home/hdfs/scripts/hive/store_wordcount.hql --hivevar date=$RUTAHIVE
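For reference, the ${VAR:offset:length} pieces above are plain bash substring expansion; a minimal sketch of how the path gets built (ANYODIR and BATCH here are placeholder values, not the real ones):

```shell
#!/usr/bin/env bash
# Bash substring expansion: ${VAR:offset:length} and ${VAR:offset}.
ANYODIR=/RSI/staging/output   # placeholder base dir
BATCH=AB20160319              # placeholder batch id

# ${BATCH:2:1} -> one character starting at index 2 ("2")
# ${BATCH:3}   -> everything from index 3 onward ("0160319")
RUTAHIVE="$ANYODIR/${BATCH:2:1}/${BATCH:3}"
echo "$RUTAHIVE"
```

Quoting the whole right-hand side avoids word splitting, which is why the double quotes mattered.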