Member since: 09-24-2015
Posts: 527
Kudos Received: 136
Solutions: 19

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2863 | 06-30-2017 03:15 PM |
| | 4299 | 10-14-2016 10:08 AM |
| | 9523 | 09-07-2016 06:04 AM |
| | 11557 | 08-26-2016 11:27 AM |
| | 1892 | 08-23-2016 02:09 PM |
06-10-2016
06:12 AM
Thanks. Is that all? Don't I need anything else from HDP? And what about the CentOS OS?
06-10-2016
05:50 AM
1 Kudo
Hi: Which directories on my machines do I need to back up daily or monthly? I mean /usr/hdp, /var/log, etc.? Many thanks.
Labels:
- Apache Hadoop
06-09-2016
06:41 PM
Hi: Now I get this error when I run truncate table: Error: Error while compiling statement: FAILED: SemanticException [Error 10146]: Cannot truncate non-managed table mi_cliente_fmes. (state=42000,code=10146)
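A hedged note on the Error 10146 above: TRUNCATE TABLE only works on managed tables, so one option (an assumption about the intent, not the only fix) is to convert the external table to managed first. Bear in mind that dropping a managed table later also deletes its data.

```sql
-- Sketch only: convert the external table to managed so TRUNCATE is allowed.
-- (Caveat: a managed table's data is deleted if the table is dropped.)
ALTER TABLE mi_cliente_fmes SET TBLPROPERTIES ('EXTERNAL'='FALSE');
TRUNCATE TABLE mi_cliente_fmes;
```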
06-09-2016
02:46 PM
Hi: I am receiving this error with this script:

```sql
CREATE EXTERNAL TABLE IF NOT EXISTS mi_cliente_fmes(
  id_interno_pe bigint,
  cod_nrbe_en int,
  mi_nom_cliente string,
  fec_ncto_const_pe string,
  fecha_prim_rl_cl string,
  sexo_in string,
  cod_est_civil_indv string,
  cod_est_lab_indv string,
  num_hijos_in int,
  ind_autnmo_in string,
  cod_ofcna_corr string,
  cod_cpcdad_lgl_in int
)
CLUSTERED BY (cod_nrbe_en) INTO 60 BUCKETS
STORED AS ORC
LOCATION '/RSI/tables/desercion/mi_cliente_fmes';

SET hive.enforce.bucketing = true;
SET map.reduce.tasks = 25;
SET hive.exec.parallel = true;
SET hive.vectorized.execution.enabled = true;

INSERT OVERWRITE TABLE mi_cliente_fmes
SELECT id_interno_pe,
       cod_nrbe_en,
       mi_nom_cliente,
       fec_ncto_const_pe,
       fecha_prim_rl_cl,
       sexo_in,
       cod_est_civil_indv,
       cod_est_lab_indv,
       num_hijos_in,
       ind_autnmo_in,
       cod_ofcna_corr,
       cod_cpcdad_lgl_in
FROM mi_cliente_fmes_temp;
```
And the error: Error: Error while compiling statement: FAILED: SemanticException [Error 10295]: INSERT OVERWRITE not allowed on table with OutputFormat that implements AcidOutputFormat while transaction manager that supports ACID is in use (state=42000,code=10295)
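One hedged way around the Error 10295 above, assuming the table does not actually need ACID semantics, is to declare it non-transactional when it is created so that INSERT OVERWRITE is permitted again. Whether that assumption fits this use case is for the author to confirm.

```sql
-- Sketch: mark the table as non-transactional at creation time so that
-- INSERT OVERWRITE is not rejected by the ACID transaction manager.
CREATE EXTERNAL TABLE IF NOT EXISTS mi_cliente_fmes (
  id_interno_pe bigint,
  cod_nrbe_en int
  -- ... remaining columns as in the script above ...
)
CLUSTERED BY (cod_nrbe_en) INTO 60 BUCKETS
STORED AS ORC
LOCATION '/RSI/tables/desercion/mi_cliente_fmes'
TBLPROPERTIES ('transactional'='false');
```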
Labels:
- Apache Hive
06-08-2016
07:27 PM
Hi: Can I import with Sqoop into Hive in append mode? I am receiving the error below. Also, importing 2.8 GB took 2 hours and 35 minutes; is that normal?

```
16/06/08 21:23:40 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/06/08 21:23:40 DEBUG sqoop.Sqoop: Append mode for hive imports is not yet supported. Please remove the parameter --append-mode
Append mode for hive imports is not yet supported. Please remove the parameter --append-mode
	at org.apache.sqoop.tool.BaseSqoopTool.validateHiveOptions(BaseSqoopTool.java:1410)
	at org.apache.sqoop.tool.ImportTool.validateOptions(ImportTool.java:1130)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:138)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
```
And my script is like this:

```shell
sqoop import -D oraoop.disabled=true --verbose \
  --connect jdbc:oracle:thin:@hostname:2521/CIP_BATCH \
  --username=U029550 \
  --password=Mayo2016 \
  --hive-import \
  --hive-table desercion_clientes_2 \
  --hive-overwrite \
  --query "select ID_INTERNO_PE,MI_FECHA_FIN_MES,COD_NRBE_EN,COD_LINEA,ID_GRP_PD,MI_NUM_TOT_AC_ACT,MI_NUM_AC_SUS,MI_SDO_AC_P,MI_NUM_AC_P,MI_DIA_AC_P,MI_INT_DEV_ACR_D,MI_INT_DEV_DEU_D,MI_COMIS_APL_D,MI_TOT_MOV_D,MI_TOT_MOV_H,MI_TOT_IMP_MOV_D,MI_TOT_IMP_MOV_H from RDWC01.MI_CLTE_ECO_GEN where \$CONDITIONS AND COD_RL_PERS_AC = 01 AND COD_LINEA in ('01','03','04','05') AND COD_NRBE_EN = '3159' AND TRUNC(MI_FECHA_FIN_MES) >=TO_DATE('2010-01-01', 'YYYY-MM-DD')" \
  --boundary-query "select min(ID_INTERNO_PE), max(ID_INTERNO_PE) from RDWC01.MI_CLTE_ECO_GEN" \
  --incremental append \
  --check-column MI_FECHA_FIN_MES \
  --last-value $LAST_ROW \
  --num-mappers 5 \
  --split-by ID_INTERNO_PE \
  --direct \
  --fetch-size 10000 \
  --target-dir /RSI/staging/tmp/desercion_clientes_2
```
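Since --hive-import and --incremental append cannot be combined, one hedged workaround is to run the incremental import as a plain HDFS import into the table's directory. This is a sketch under the assumption that the Hive table is external over the target directory, so Hive picks up the newly appended files automatically; column list and paths are abbreviated for illustration.

```shell
# Sketch: incremental append into the table's HDFS location, without --hive-import.
# (-P prompts for the password instead of putting it on the command line.)
sqoop import -D oraoop.disabled=true \
  --connect jdbc:oracle:thin:@hostname:2521/CIP_BATCH \
  --username U029550 -P \
  --query "select ID_INTERNO_PE, MI_FECHA_FIN_MES from RDWC01.MI_CLTE_ECO_GEN where \$CONDITIONS" \
  --incremental append \
  --check-column MI_FECHA_FIN_MES \
  --last-value "$LAST_ROW" \
  --split-by ID_INTERNO_PE \
  --num-mappers 5 \
  --target-dir /RSI/staging/tmp/desercion_clientes_2
```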
Labels:
- Apache Hive
- Apache Sqoop
06-08-2016
08:00 AM
Hi: It's OK now; it works with this:

```shell
--query "select ID_INTERNO_PE,MI_FECHA_FIN_MES,COD_NRBE_EN,COD_LINEA,ID_GRP_PD,MI_NUM_TOT_AC_ACT,MI_NUM_AC_SUS,MI_SDO_AC_P,MI_NUM_AC_P,MI_DIA_AC_P,MI_INT_DEV_ACR_D,MI_INT_DEV_DEU_D,MI_COMIS_APL_D,MI_TOT_MOV_D,MI_TOT_MOV_H,MI_TOT_IMP_MOV_D,MI_TOT_IMP_MOV_H from RDWC01.MI_CLTE_ECO_GEN where \$CONDITIONS AND COD_RL_PERS_AC = 01 AND COD_LINEA in ('01','03','04','05') AND COD_NRBE_EN = '3159' AND TRUNC(MI_FECHA_FIN_MES) >=TO_DATE('2010-01-01', 'YYYY-MM-DD')" \
```

Note: If you are issuing the query wrapped in double quotes ("), you will have to use \$CONDITIONS instead of just $CONDITIONS to prevent your shell from treating it as a shell variable. For example, a double-quoted query may look like: "SELECT * FROM x WHERE a='foo' AND \$CONDITIONS"

Many thanks to all of you.
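The escaping rule in the note above can be demonstrated with a plain shell snippet, no Sqoop needed (the variable name CONDITIONS here is just the literal token Sqoop scans for):

```shell
# Inside double quotes the shell expands an unescaped $CONDITIONS (usually
# unset, so it becomes empty); a backslash keeps the literal text that Sqoop
# needs to find in the query.
unset CONDITIONS
expanded="SELECT * FROM x WHERE $CONDITIONS"
escaped="SELECT * FROM x WHERE \$CONDITIONS"
echo "expanded: $expanded"   # prints: expanded: SELECT * FROM x WHERE
echo "escaped:  $escaped"    # prints: escaped:  SELECT * FROM x WHERE $CONDITIONS
```

This is exactly why the earlier attempts failed: the shell silently replaced the placeholder before Sqoop ever saw the query.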
06-08-2016
07:53 AM
Hi: Why doesn't my $CONDITIONS work?

```
16/06/08 09:52:53 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Query [select ID_INTERNO_PE,MI_FECHA_FIN_MES,COD_NRBE_EN,COD_LINEA,ID_GRP_PD,MI_NUM_TOT_AC_ACT,MI_NUM_AC_SUS,MI_SDO_AC_P,MI_NUM_AC_P,MI_DIA_AC_P,MI_INT_DEV_ACR_D,MI_INT_DEV_DEU_D,MI_COMIS_APL_D,MI_TOT_MOV_D,MI_TOT_MOV_H,MI_TOT_IMP_MOV_D,MI_TOT_IMP_MOV_H from RDWC01.MI_CLTE_ECO_GEN where COD_RL_PERS_AC = 01 AND COD_LINEA in ('01','03','04','05') AND COD_NRBE_EN = '3159' AND TRUNC(MI_FECHA_FIN_MES) >=TO_DATE('2010-01-01', 'YYYY-MM-DD') ] must contain '$CONDITIONS' in WHERE clause.
	at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:300)
	at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1845)
	at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
	at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
```
```shell
sqoop import -D oraoop.disabled=true --verbose \
  --connect jdbc:oracle:thin:@hostname:2521/CIP_BATCH \
  --username=U029550 \
  --password=Mayo2016 \
  --query "select ID_INTERNO_PE,MI_FECHA_FIN_MES,COD_NRBE_EN,COD_LINEA,ID_GRP_PD,MI_NUM_TOT_AC_ACT,MI_NUM_AC_SUS,MI_SDO_AC_P,MI_NUM_AC_P,MI_DIA_AC_P,MI_INT_DEV_ACR_D,MI_INT_DEV_DEU_D,MI_COMIS_APL_D,MI_TOT_MOV_D,MI_TOT_MOV_H,MI_TOT_IMP_MOV_D,MI_TOT_IMP_MOV_H from RDWC01.MI_CLTE_ECO_GEN where COD_RL_PERS_AC = 01 AND COD_LINEA in ('01','03','04','05') AND COD_NRBE_EN = '3159' AND TRUNC(MI_FECHA_FIN_MES) >=TO_DATE('2010-01-01', 'YYYY-MM-DD') $CONDITIONS" \
  --boundary-query "select min(MI_NUM_TOT_AC_ACT), max(MI_NUM_TOT_AC_ACT) from RDWC01.MI_CLTE_ECO_GEN" \
  --split-by MI_NUM_TOT_AC_ACT \
  --direct \
  --target-dir=/RSI/datalake/desercion/2016/1 \
```
06-08-2016
06:54 AM
Hi: I am receiving this error with $CONDITIONS:

```shell
sqoop import -D oraoop.disabled=true \
  --connect jdbc:oracle:thin:@HOSTNAME:2521/CIP_BATCH \
  --username=U029550 \
  --password=Mayo2016 \
  --query "SELECT ID_INTERNO_PE,MI_FECHA_FIN_MES,COD_NRBE_EN,COD_LINEA,ID_GRP_PD,MI_NUM_TOT_AC_ACT,MI_NUM_AC_SUS,MI_SDO_AC_P,MI_NUM_AC_P,MI_DIA_AC_P,MI_INT_DEV_ACR_D,MI_INT_DEV_DEU_D,MI_COMIS_APL_D,MI_TOT_MOV_D,MI_TOT_MOV_H,MI_TOT_IMP_MOV_D,MI_TOT_IMP_MOV_ FROM RDWC01.MI_CLTE_ECO_GEN WHERE $CONDITIONS AND COD_RL_PERS_AC = 01 AND COD_LINEA in ('01','03','04','05') AND COD_NRBE_EN = '3159' AND TRUNC(MI_FECHA_FIN_MES) >=TO_DATE('2010-01-01', 'YYYY-MM-DD')" \
  --split-by MI_NUM_TOT_AC_ACT \
  --fetch-size=50000 \
  --direct \
  --target-dir=/RSI/datalake/desercion/2016/1 --verbose \
```
```
16/06/08 08:56:18 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Query [SELECT ID_INTERNO_PE,MI_FECHA_FIN_MES,COD_NRBE_EN,COD_LINEA,ID_GRP_PD,MI_NUM_TOT_AC_ACT,MI_NUM_AC_SUS,MI_SDO_AC_P,MI_NUM_AC_P,MI_DIA_AC_P,MI_INT_DEV_ACR_D,MI_INT_DEV_DEU_D,MI_COMIS_APL_D,MI_TOT_MOV_D,MI_TOT_MOV_H,MI_TOT_IMP_MOV_D,MI_TOT_IMP_MOV_ FROM RDWC01.MI_CLTE_ECO_GEN WHERE '' AND COD_RL_PERS_AC = 01 AND COD_LINEA in ('01','03','04','05') AND COD_NRBE_EN = '3159' AND TRUNC(MI_FECHA_FIN_MES) >=TO_DATE('2010-01-01', 'YYYY-MM-DD')] must contain '$CONDITIONS' in WHERE clause.
	at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:300)
	at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1845)
	at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
	at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
```
Please, why doesn't $CONDITIONS work?
06-07-2016
08:02 PM
Thanks. It still doesn't split and I don't know why. Is it necessary to use $CONDITIONS?
06-07-2016
06:45 PM
Yes, id_person is part of the primary key... so I need to find another column that is not part of the primary key and is also an integer? Thanks.