Created 06-22-2016 07:14 PM
Hi:
From Pig I can't insert into a Phoenix HBase table:
diagnostics=[Task failed, taskId=task_1464163049638_1419_1_00_000053, diagnostics=[TaskAttempt 0 failed, info=[Error: Failure while running task:
java.lang.RuntimeException: Unable to process column CHAR:"CODNRBEENF", innerMessage=Unknown type java.util.HashMap passed to PhoenixHBaseStorage
    at org.apache.phoenix.pig.writable.PhoenixPigDBWritable.write(PhoenixPigDBWritable.java:66)
    at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:78)
    at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:39)
    at org.apache.phoenix.pig.PhoenixHBaseStorage.putNext(PhoenixHBaseStorage.java:184)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:136)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:95)
    at org.apache.tez.mapreduce.output.MROutput$1.write(MROutput.java:503)
    at org.apache.pig.backend.hadoop.executionengine.tez.plan.operator.POStoreTez.getNextTuple(POStoreTez.java:125)
    at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.runPipeline(PigProcessor.java:332)
    at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.run(PigProcessor.java:197)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:344)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:181)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:172)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:172)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:168)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Unknown type java.util.HashMap passed to PhoenixHBaseStorage
    at org.apache.phoenix.pig.util.TypeUtil.getType(TypeUtil.java:158)
    at org.apache.phoenix.pig.util.TypeUtil.castPigTypeToPhoenix(TypeUtil.java:177)
    at org.apache.phoenix.pig.writable.PhoenixPigDBWritable.convertTypeSpecificValue(PhoenixPigDBWritable.java:79)
    at org.apache.phoenix.pig.writable.PhoenixPigDBWritable.write(PhoenixPigDBWritable.java:59)
    ... 22 more
Created 06-22-2016 07:19 PM
It looks like you're trying to write a Map type, which Phoenix does not support.
Can you share the DDL for your Phoenix table and the schema of the relation you're trying to write into Phoenix?
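In the meantime, one quick way to spot a stray map field is to DESCRIBE the relation immediately before the STORE; if any field prints as map[] (for example after a schema-less LOAD), Phoenix's TypeUtil rejects it with exactly this HashMap error. A minimal sketch — the file path and field names here are illustrative, not from your job:

```pig
-- Illustrative only: DESCRIBE prints the schema Pig will hand to the storer.
raw = LOAD '/tmp/sample.csv' USING PigStorage(',')
      AS (entidad:chararray, fecha:chararray, freq:long);
DESCRIBE raw;
-- raw: {entidad: chararray, fecha: chararray, freq: long}
-- If any field shows up as map[] or bytearray instead of a simple type,
-- PhoenixHBaseStorage cannot convert it to a Phoenix column.
```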
Created 06-22-2016 07:20 PM
Hi:
Phoenix schema:
CREATE TABLE IF NOT EXISTS journey_oficina_hbase (
    CODNRBEENF CHAR(4) NOT NULL,
    FECHAOPRCNF CHAR(21) NOT NULL,
    CODINTERNO CHAR(4),
    CODTXF CHAR(8),
    FREQ BIGINT,
    CONSTRAINT pk PRIMARY KEY (CODNRBEENF, FECHAOPRCNF)
);
Pig script:
D = FOREACH B GENERATE (chararray) SUBSTRING($0, 0, 13) AS fecha, (chararray) $1 AS CODNRBE, (chararray) $2 AS CODINTERNO, (chararray) $3 AS CODTX;
D = FILTER D BY ($1 != '');
F = GROUP D BY (CONCAT(fecha, ':00:00'), CODNRBE, CODINTERNO, CODTX);
G = FOREACH F GENERATE (chararray) group.$1 AS entidad, (chararray) group.$0 AS fecha, (chararray) group.$2 AS oficina, (chararray) group.$3 AS operacion, (long) COUNT(D) AS freq;
thanks
Created 06-22-2016 07:26 PM
I changed it to this and get the same error:
CREATE TABLE IF NOT EXISTS journey_oficina_hbase (
    CODNRBEENF VARCHAR NOT NULL,
    FECHAOPRCNF VARCHAR NOT NULL,
    CODINTERNO VARCHAR,
    CODTXF VARCHAR,
    FREQ BIGINT,
    CONSTRAINT pk PRIMARY KEY (CODNRBEENF, FECHAOPRCNF)
);
Created 06-22-2016 07:37 PM
And my output data is this:
(3008,2016-06-01 11:00:00,0161,GCA10CON,1)
(3008,2016-06-01 11:00:00,0161,GIN02OOU,14)
(3008,2016-06-01 11:00:00,0161,IBC06MOU,3)
(3008,2016-06-01 11:00:00,0161,RGE62COU,1)
(3008,2016-06-01 11:00:00,0161,STS06CON,6)
(3008,2016-06-01 11:00:00,0161,VPR28COU,2)
(3008,2016-06-01 11:00:00,0162,GAE05COU,1)
(3008,2016-06-01 11:00:00,0162,PGEA8COU,3)
(3008,2016-06-01 11:00:00,0163,DVI41OOU,5)
(3008,2016-06-01 11:00:00,0163,GAC11COU,10)
(3008,2016-06-01 11:00:00,0163,GAC67COU,22)
Created 06-22-2016 07:39 PM
Looks like you need to put this in your Pig script:
raw_data = LOAD 'hdfs:/user/xx/journey_oficina_hbase' USING PigStorage(',') AS ( CODNRBEENF CHAR(4) not null, FECHAOPRCNF CHAR(21) not null , CODINTERNO CHAR(4), CODTXF CHAR(8), FREQ BIGINT, );
Created 06-22-2016 07:54 PM
The syntax of that is wrong; I changed it like this and still get the same error:
raw_data = LOAD '/tmp/JOURNEY_OFICINA_HBASE.csv' USING PigStorage(',') AS (CODNRBEENF:chararray, FECHAOPRCNF:chararray, CODINTERNO:chararray, CODTXF:chararray, FREQ:long);
TaskAttempt 3 failed, info=[Error: Failure while running task:
java.lang.RuntimeException: Unable to process column CHAR:"CODNRBEENF", innerMessage=Unknown type java.util.HashMap passed to PhoenixHBaseStorage
    at org.apache.phoenix.pig.writable.PhoenixPigDBWritable.write(PhoenixPigDBWritable.java:66)
    at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:78)
    at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:39)
    at org.apache.phoenix.pig.PhoenixHBaseStorage.putNext(PhoenixHBaseStorage.java:184)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:136)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:95)
    at org.apache.tez.mapreduce.output.MROutput$1.write(MROutput.java:503)
    at org.apache.pig.backend.hadoop.executionengine.tez.plan.operator.POStoreTez.getNextTuple(POStoreTez.java:125)
    at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.runPipeline(PigProcessor.java:332)
    at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.run(PigProcessor.java:197)
    at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:344)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:181)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:172)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:172)
    at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:168)
    at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Unknown type java.util.HashMap passed to PhoenixHBaseStorage
    at org.apache.phoenix.pig.util.TypeUtil.getType(TypeUtil.java:158)
    at org.apache.phoenix.pig.util.TypeUtil.castPigTypeToPhoenix(TypeUtil.java:177)
    at org.apache.phoenix.pig.writable.PhoenixPigDBWritable.convertTypeSpecificValue(PhoenixPigDBWritable.java:79)
    at org.apache.phoenix.pig.writable.PhoenixPigDBWritable.write(PhoenixPigDBWritable.java:59)
    ... 22 more
Created 06-22-2016 08:58 PM
raw_data = LOAD '/tmp/JOURNEY_OFICINA_HBASE.csv' USING PigStorage(',') AS (entidad:chararray, fecha:chararray, oficina:chararray, operacion:chararray, freq:long);
STORE raw_data INTO 'hbase://JOURNEY_OFICINA_HBASE' USING org.apache.phoenix.pig.PhoenixHBaseStorage('lnxbig05', '-batchSize 5000');
MANY THANKS!!
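For anyone landing here later: the script above works because every Pig field lines up positionally, with a compatible simple type, against the Phoenix columns. The mapping to the VARCHAR DDL posted earlier in the thread is spelled out below as an inference from the thread, not something stated explicitly:

```pig
-- Positional mapping between the Pig schema and the Phoenix DDL (inferred):
--   entidad:chararray   -> CODNRBEENF  VARCHAR
--   fecha:chararray     -> FECHAOPRCNF VARCHAR
--   oficina:chararray   -> CODINTERNO  VARCHAR
--   operacion:chararray -> CODTXF      VARCHAR
--   freq:long           -> FREQ        BIGINT
raw_data = LOAD '/tmp/JOURNEY_OFICINA_HBASE.csv' USING PigStorage(',')
    AS (entidad:chararray, fecha:chararray, oficina:chararray,
        operacion:chararray, freq:long);
STORE raw_data INTO 'hbase://JOURNEY_OFICINA_HBASE'
    USING org.apache.phoenix.pig.PhoenixHBaseStorage('lnxbig05', '-batchSize 5000');
```

The earlier failure came from storing a relation whose fields were not all simple Pig types; loading the CSV with an explicit AS schema guarantees chararray/long fields that Phoenix can convert.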
Created 06-22-2016 11:08 PM
Can you try putting the below first?
A = load 'hbase://table/JOURNEY_OFICINA_HBASE' using org.apache.phoenix.pig.PhoenixHBaseLoader('zkQuorum');
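Loading the table first also reveals the schema Phoenix will expect on write, which makes type mismatches easier to spot. A sketch of that check, using the 'lnxbig05' quorum host from earlier in the thread (the relation name A is arbitrary):

```pig
-- Load the Phoenix table purely to inspect the expected column types.
A = LOAD 'hbase://table/JOURNEY_OFICINA_HBASE'
    USING org.apache.phoenix.pig.PhoenixHBaseLoader('lnxbig05');
DESCRIBE A;
-- Every column should map to a simple Pig type (chararray, long, ...);
-- the relation you STORE back must use the same simple types, in the
-- same order as the table's columns.
```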