06-22-2017
07:38 AM
Hi Slim, thank you for your reply. The file permissions were already correct, since I used the druid user to put the file into HDFS. It turned out to be a different problem: apart from putting the file into HDFS, I also had trouble with the HTTPS setup for the YARN Queue Manager.
Since this was a test environment, I rolled back to the snapshot taken before the HTTPS problem appeared. I will try Druid again someday.
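For anyone hitting the same permission problem: uploading as the druid user (as described above) can be sketched roughly like this. The path comes from the task's error message in the original post; adjust it for your cluster.

```shell
# Upload as the druid user so the file is owned by druid in HDFS
# (destination path taken from the task's error message).
sudo -u druid hdfs dfs -put quickstart/wikiticker-2015-09-12-sampled.json \
    /user/druid/quickstart/wikiticker-2015-09-12-sampled.json

# Confirm that owner, group, and mode look right:
hdfs dfs -ls /user/druid/quickstart
```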
Thank you Andres and Slim!
06-16-2017
09:26 AM
Hi Andres,
I just put the file into HDFS, but the same error still occurs. Maybe I did something wrong in one of the steps; I'm still working on it. Thank you very much for your kind reply.
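A quick way to rule out a path or filesystem mismatch is to list the exact URI from the task's error message (namenode host and port as reported in the log), and to re-create the path if it is missing:

```shell
# List the exact URI the indexing task tried to read; if this fails,
# the file is missing or sits under a different path/filesystem.
hdfs dfs -ls hdfs://dev-server1.c.sertis-data-center.internal:8020/user/druid/quickstart/wikiticker-2015-09-12-sampled.json

# If the file is missing, create the directory and upload the sample:
hdfs dfs -mkdir -p /user/druid/quickstart
hdfs dfs -put quickstart/wikiticker-2015-09-12-sampled.json /user/druid/quickstart/
```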
06-15-2017
08:15 PM
Hi all,
Now I'm facing a problem submitting the wikiticker job with the Druid 0.9.2 bundled in HDP 2.6.0.3. (There is no such problem with the standalone 0.10.0 version on my local PC.)
However, I must use the HDP version for compatibility with Apache Ambari and the rest of my existing cluster.
This is how I submit the job:
[centos@dev-server1 druid]$ curl -X 'POST' -H 'Content-Type:application/json' -d @quickstart/wikiticker-index.json localhost:8090/druid/indexer/v1/task
{"task":"index_hadoop_wikiticker_2017-06-15T11:04:18.145Z"}
However, the task appeared as FAILED in the Coordinator Console with the following error. (I tried several times with a few different settings I thought might help, e.g. changing the UNIX timezone to my local time, or adding mapred.job.classloader to jobProperties as described in this link, but to no avail.)
2017-06-15T11:04:31,361 ERROR [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[HadoopIndexTask{id=index_hadoop_wikiticker_2017-06-15T11:04:18.145Z, type=index_hadoop, dataSource=wikiticker}]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
	at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:204) ~[druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:175) ~[druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	... 7 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://dev-server1.c.sertis-data-center.internal:8020/user/druid/quickstart/wikiticker-2015-09-12-sampled.json
	at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
	at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:208) ~[druid-indexing-hadoop-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) ~[druid-indexing-hadoop-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:91) ~[druid-indexing-hadoop-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:291) ~[druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	... 7 more
Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://dev-server1.c.sertis-data-center.internal:8020/user/druid/quickstart/wikiticker-2015-09-12-sampled.json
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:323) ~[?:?]
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:265) ~[?:?]
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:387) ~[?:?]
	at org.apache.hadoop.mapreduce.lib.input.DelegatingInputFormat.getSplits(DelegatingInputFormat.java:115) ~[?:?]
	at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301) ~[?:?]
	at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318) ~[?:?]
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196) ~[?:?]
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) ~[?:?]
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) ~[?:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_131]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_131]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866) ~[?:?]
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) ~[?:?]
	at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:116) ~[druid-indexing-hadoop-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) ~[druid-indexing-hadoop-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:91) ~[druid-indexing-hadoop-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:291) ~[druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_131]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_131]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_131]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_131]
	at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.2.2.6.0.3-8.jar:0.9.2.2.6.0.3-8]
	... 7 more
2017-06-15T11:04:31,375 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_wikiticker_2017-06-15T11:04:18.145Z] status changed to [FAILED].
2017-06-15T11:04:31,378 INFO [task-runner-0-priority-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_hadoop_wikiticker_2017-06-15T11:04:18.145Z",
  "status" : "FAILED",
  "duration" : 6650
}
Please see the attached log.txt for more information.
Thank you very much.
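For reference, the jobProperties workaround mentioned above normally goes in the tuningConfig section of the Hadoop index spec. A sketch is below; the property name here is the one used in Druid's Hadoop indexing docs (mapreduce.job.classloader), so treat it as an assumption for this HDP build:

```json
"tuningConfig" : {
  "type" : "hadoop",
  "jobProperties" : {
    "mapreduce.job.classloader" : "true"
  }
}
```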