Member since: 04-05-2018
Posts: 14
Kudos Received: 1
Solutions: 0
07-02-2018
10:15 AM
I really cannot understand the behavior of this forum. Most of the problems I post can be easily reproduced and are straightforward, but I usually don't get any help with them.
06-30-2018
06:34 PM
I have tried all compression types. Same error. The current configuration can be seen in the screenshot. Which property in the *-site.xml files could be causing this, any idea? screen-shot-2018-06-30-at-212647.png
06-29-2018
04:59 PM
Hi,
I have created an HDF 3.1 cluster in Azure via Cloudbreak 2.7.0 (I also tried NiFi from the HDF management pack on an HDP cluster in Azure; same problem). I am getting the following error in the ConvertAvroToORC processor. I have been using the same flow on another on-premise cluster without any problem. What could be causing this?
2018-06-29 16:53:50,449 WARN [Timer-Driven Process Thread-6] o.a.n.c.t.ContinuallyRunProcessorTask Administratively Yielding ConvertAvroToORC[id=4c5f4ab4-0164-1000-0000-0000698d849d] due to uncaught Exception: java.lang.NoClassDefFoundError: org/tukaani/xz/XZInputStream
2018-06-29 16:53:50,449 WARN [Timer-Driven Process Thread-6] o.a.n.c.t.ContinuallyRunProcessorTask
java.lang.NoClassDefFoundError: org/tukaani/xz/XZInputStream
at org.apache.avro.file.XZCodec.decompress(XZCodec.java:74)
at org.apache.avro.file.DataFileStream$DataBlock.decompressUsing(DataFileStream.java:352)
at org.apache.avro.file.DataFileStream.hasNext(DataFileStream.java:199)
at org.apache.nifi.processors.hive.ConvertAvroToORC.lambda$onTrigger$0(ConvertAvroToORC.java:234)
at org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2827)
at org.apache.nifi.processors.hive.ConvertAvroToORC.onTrigger(ConvertAvroToORC.java:208)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.tukaani.xz.XZInputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
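The NoClassDefFoundError suggests the incoming Avro files use the xz codec, but the org.tukaani xz library is not on the classpath NiFi's Hive bundle sees. A minimal sketch of checking for and staging the jar — all paths here are assumptions for a typical HDF layout, adjust to yours:

```shell
# Sketch: make the org.tukaani xz library visible to NiFi.
# NIFI_LIB is an assumption for an HDF install; adjust to your layout.
NIFI_LIB=${NIFI_LIB:-/usr/hdf/current/nifi/lib}
# Look for an xz jar already shipped elsewhere on the host
XZ_JAR=$(find /usr/hdp /usr/hdf -name 'xz-*.jar' 2>/dev/null | head -n 1)
if [ -n "$XZ_JAR" ]; then
  cp "$XZ_JAR" "$NIFI_LIB/" && echo "copied $XZ_JAR; restart NiFi to pick it up"
else
  echo "no xz jar found; fetch org.tukaani:xz and place it in $NIFI_LIB"
fi
```

Alternatively, re-writing the source Avro with a codec the processor already handles (e.g. snappy or deflate) avoids shipping the extra jar at all.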
Labels: Apache NiFi
05-18-2018
03:51 PM
I have created an HDInsight HBase cluster, but I was not able to find a way to connect to HBase with Phoenix securely. Some other cluster types support the Enterprise Security Package, which seems to support Ranger authorization. How can I create a simple username/password-authenticated connection to HBase via Phoenix on an HDInsight HBase cluster? Thanks
Labels: Apache HBase, Apache Phoenix
04-16-2018
10:39 AM
In the test cluster I tried to execute the same script, but it failed again. I managed to find the problem by checking both the attempt and application logs from the YARN Resource Manager UI. It was failing because of insufficient memory allocation for the MapReduce job.
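For reference, the container sizes behind that failure are controlled by the mapred-site.xml properties below; the values shown are illustrative examples, not recommendations for any particular cluster:

```xml
<!-- mapred-site.xml: per-container memory for MapReduce tasks
     (example values; tune to your cluster) -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>2048</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>4096</value>
</property>
<!-- The JVM heap must stay below the container size; ~80% is a common rule -->
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx1638m</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx3276m</value>
</property>
```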
04-16-2018
10:36 AM
Hi, I managed to find it inside /usr/hdp/current/oozie-client/doc/ in the test cluster, but it doesn't exist in the sandbox environment.
04-12-2018
07:30 AM
1 Kudo
Hi, I am working on Oozie in the sandbox environment. I cannot find oozie.examples.tar.gz in the /usr/hdp/current/oozie* folders. Where can I find it?
Labels: Apache Oozie
04-11-2018
02:49 PM
Hi,
I am trying to use Sqoop with Workflow Manager, without success. When I look at the attempt logs in the YARN UI, I see the excerpt below from stdout.
36727 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Transferred 1.9004 KB in 27.0408 seconds (71.9652 bytes/sec)
36730 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Retrieved 100 records.
36730 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Publishing Hive/Hcat import job data to Listeners
38953 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM customerdata AS t WHERE 1=0
39653 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM customerdata AS t WHERE 1=0
40009 [main] INFO org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
<<< Invocation of Sqoop command completed <<<
Hadoop Job IDs executed by Sqoop: job_1523018090709_0047
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Oozie Launcher failed, finishing Hadoop job gracefully
Oozie Launcher, uploading action data to HDFS sequence file: hdfs://sandbox-hdp.hortonworks.com:8020/user/admin/oozie-oozi/0000023-180407052519473-oozie-oozi-W/sqoop-extract--sqoop/action-data.seq
Successfully reset security manager from org.apache.oozie.action.hadoop.LauncherSecurityManager@64665781 to null
Oozie Launcher ends
In stderr:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/11/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/195/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Note: /tmp/sqoop-yarn/compile/dc3a5c785ecbf3fc6b982843ce9201a2/customerdata.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Logging initialized using configuration in jar:file:/hadoop/yarn/local/filecache/193/hive-common-1.2.1000.2.6.4.0-91.jar!/hive-log4j.properties
Intercepting System.exit(1)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.ipc.Client).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
The same sqoop command is successful when I execute it from the command line (as the root user). It seems that the extraction succeeds but the Hive table creation does not. I have attached the full stdout, which includes the sqoop command: stdout.txt
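A common cause when sqoop works on the CLI but the Hive load step fails under Oozie is that the action cannot see hive-site.xml or the Hive sharelib. A hedged workflow fragment — the sharelib property and `<file>` element are standard Oozie mechanisms, but the action name is taken from the log above and your paths will differ:

```xml
<action name="sqoop-extract">
  <sqoop xmlns="uri:oozie:sqoop-action:0.4">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <configuration>
      <!-- make the Hive sharelib available to the Sqoop action -->
      <property>
        <name>oozie.action.sharelib.for.sqoop</name>
        <value>sqoop,hive</value>
      </property>
    </configuration>
    <command>...</command>
    <!-- ship the cluster's hive-site.xml alongside the action -->
    <file>hive-site.xml#hive-site.xml</file>
  </sqoop>
  <ok to="end"/>
  <error to="fail"/>
</action>
```

The stderr hint about multiple SLF4J bindings is usually harmless noise; the exit code [1] from SqoopMain with a Hive import step is the part worth chasing in the launcher's stdout.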
Labels: Apache Oozie, Apache Sqoop
04-10-2018
12:19 PM
Any ideas?
04-10-2018
12:19 PM
Hi, Even after following the steps stated in the doc, the problem continued. I managed to find a workaround by commenting out the hive-env template section about HIVE_AUX_JARS_PATH and adding the line below:
export HIVE_AUX_JARS_PATH=/usr/hdp/current/phoenix-client/phoenix-hive.jar
I don't know why the script didn't work; I will try to find out when I have time.
04-09-2018
10:38 AM
Are there any suggestions?
04-05-2018
04:59 PM
Hi, On HDP 2.6.4, when I try to create a Hive external table with the following command, I get:
org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Cannot find class 'org.apache.phoenix.hive.PhoenixStorageHandler'
CREATE EXTERNAL TABLE loadprofile(
key STRING,
k_0_0_0 INT,
k_0_9_1_2 STRING,
k_0_9_5 INT,
k_0_8_4 INT,
r_date STRING,
a DOUBLE,
b DOUBLE,
c DOUBLE,
d DOUBLE,
e DOUBLE,
f DOUBLE,
g_1 DOUBLE,
g_2 DOUBLE,
g_3 DOUBLE,
h_1 DOUBLE,
h_2 DOUBLE,
h_3 DOUBLE)
STORED BY 'org.apache.phoenix.hive.PhoenixStorageHandler'
TBLPROPERTIES (
"phoenix.table.name" = "loadprofile",
"phoenix.zookeeper.quorum" = "iqb.hdp1.com,iqb.hdp2.com,iqb.hdp3.com",
"phoenix.zookeeper.znode.parent" = "/hbase-unsecure",
"phoenix.zookeeper.client.port" = "2181",
"phoenix.rowkeys" = "key",
"phoenix.column.mapping" = "key:key,k_0_0_0:k_0_0_0,k_0_9_1_2:k_0_9_1_2,k_0_9_5:k_0_9_5,k_0_8_4:k_0_8_4,r_date:date,a:a,b:b,c:c,d:d,e:e,f:f,g_1:g_1,g_2:g_2,g_3:g_3,h_1:h_1,h_2:h_2,h_3:h_3"
);
Do I need to install the PhoenixStorageHandler manually? If so, how can I do that?
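The class lives in the Phoenix client's Hive integration jar, which Hive does not load by default. A minimal sketch of the hive-env addition, assuming a standard HDP 2.6 layout for the jar path:

```shell
# hive-env.sh: point Hive at the Phoenix storage handler jar
# (path assumes a standard HDP 2.6 layout; verify it exists on your hosts)
export HIVE_AUX_JARS_PATH=/usr/hdp/current/phoenix-client/phoenix-hive.jar
```

After adding this, restart HiveServer2 so the aux jar is picked up.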
Labels: Apache HBase, Apache Hive, Apache Phoenix
04-05-2018
08:47 AM
Hi, We are ingesting real-time data into a Druid datasource via SAM. The problem is that all dimensions have String types, as seen in Superset. In the Schema Registry schema we have various column types, including long, integer, double, and string. What can be done to prevent this, or is there a workaround?
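Druid stores every dimension as a string unless the ingestion spec declares otherwise; numeric types normally come in as metrics/aggregators. If the generated ingestion spec can be edited, a typed dimensionsSpec would look roughly like the sketch below — hedged: the field names are illustrative, and typed dimensions require a sufficiently recent Druid version:

```json
{
  "dimensionsSpec": {
    "dimensions": [
      "deviceId",
      { "type": "long",   "name": "eventCount" },
      { "type": "double", "name": "temperature" }
    ]
  }
}
```

Alternatively, model the numeric fields as metrics (e.g. longSum/doubleSum aggregators) rather than dimensions, which keeps their numeric types in Superset.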
Labels: Druid, Schema Registry