
java.io.FileNotFoundException: (Is a directory)

Master Mentor

HDP 2.4.2

Ambari 2.2.2

druid-0.9.0

I am following the Druid quickstart at http://druid.io/docs/latest/tutorials/quickstart.html and running:

[root@nss03 druid-0.9.0]# curl -X 'POST' -H 'Content-Type:application/json' -d @quickstart/wikiticker-index.json http://overlordnode:8090/druid/indexer/v1/task

{"task":"index_hadoop_wikiticker_2016-05-24T11:38:51.681Z"}

[root@nss03 druid-0.9.0]#
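While the task runs, its status can be polled from the Overlord's indexer API (a minimal sketch, using the same overlordnode host and the task ID returned above):

# Poll the status of the indexing task submitted above
curl http://overlordnode:8090/druid/indexer/v1/task/index_hadoop_wikiticker_2016-05-24T11:38:51.681Z/status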

I can see that the job is submitted to the YARN queue. Error details from the ResourceManager UI:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/10/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/130/log4j-slf4j-impl-2.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /hadoop/yarn/log/application_1464036814491_0009/container_e04_1464036814491_0009_01_000001 (Is a directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
	at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
	at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
	at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
	at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
	at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
May 24, 2016 4:39:11 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class
May 24, 2016 4:39:11 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
May 24, 2016 4:39:11 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class
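The trace shows two things at once: the container classpath carries two competing SLF4J bindings (Hadoop's slf4j-log4j12 and Druid's log4j-slf4j-impl), and log4j's ContainerLogAppender then tries to open the container log directory itself as a file, hence the "(Is a directory)" failure. A quick way to confirm which binding jars the application pulled in (a sketch; the filecache paths come from the SLF4J lines above and will differ per node and application):

# List the competing SLF4J binding jars cached for YARN containers
find /hadoop/yarn/local/filecache -name 'slf4j-log4j12-*.jar' -o -name 'log4j-slf4j-impl-*.jar'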
1 ACCEPTED SOLUTION

Master Mentor

I was able to fix the issue by adding the Hadoop jars to the classpath when starting the Druid components:

Start the Coordinator and Overlord on ns03:
java `cat conf/druid/coordinator/jvm.config | xargs` -cp conf/druid/_common:conf/druid/coordinator:lib/*:/usr/hdp/2.4.2.0-258/hadoop/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/*:/usr/hdp/2.4.2.0-258/hadoop/client/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/* io.druid.cli.Main server coordinator &
java `cat conf/druid/overlord/jvm.config | xargs` -cp conf/druid/_common:conf/druid/overlord:lib/*:/usr/hdp/2.4.2.0-258/hadoop/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/*:/usr/hdp/2.4.2.0-258/hadoop/client/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/*  io.druid.cli.Main server overlord &
Start the Historical and MiddleManager on ns02:
java `cat conf/druid/historical/jvm.config | xargs` -cp conf/druid/_common:conf/druid/historical:lib/*:/usr/hdp/2.4.2.0-258/hadoop/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/*:/usr/hdp/2.4.2.0-258/hadoop/client/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/*  io.druid.cli.Main server historical &
java `cat conf/druid/middleManager/jvm.config | xargs` -cp conf/druid/_common:conf/druid/middleManager:lib/*:/usr/hdp/2.4.2.0-258/hadoop/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/*:/usr/hdp/2.4.2.0-258/hadoop/client/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/*  io.druid.cli.Main server middleManager &
Start the Druid Broker:
java `cat conf/druid/broker/jvm.config | xargs` -cp conf/druid/_common:conf/druid/broker:lib/*:/usr/hdp/2.4.2.0-258/hadoop/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/lib/*:/usr/hdp/2.4.2.0-258/hadoop-yarn/*:/usr/hdp/2.4.2.0-258/hadoop/client/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/*:/usr/hdp/2.4.2.0-258/hadoop-mapreduce/lib/*  io.druid.cli.Main server broker &
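The /usr/hdp/2.4.2.0-258 portion of the classpath is identical in all four commands, so it can be factored into a variable (a sketch only; HADOOP_CP is a hypothetical helper, not part of the original commands, and the * wildcards must stay quoted so that java, not the shell, expands them as classpath entries):

# Shared HDP classpath for all Druid components (hypothetical refactor)
HDP=/usr/hdp/2.4.2.0-258
HADOOP_CP="$HDP/hadoop/lib/*:$HDP/hadoop-yarn/lib/*:$HDP/hadoop-yarn/*:$HDP/hadoop/client/*:$HDP/hadoop-mapreduce/*:$HDP/hadoop-mapreduce/lib/*"
java `cat conf/druid/coordinator/jvm.config | xargs` -cp "conf/druid/_common:conf/druid/coordinator:lib/*:$HADOOP_CP" io.druid.cli.Main server coordinator &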


3 REPLIES

Super Guru

@Neeraj Sabharwal This is possibly a YARN client version mismatch between Druid and the cluster.
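One quick way to test that theory (a sketch; the hadoop-dependencies path assumes a stock druid-0.9.0 tarball, which ships its own hadoop-client jars for indexing tasks):

# Compare the cluster's Hadoop version...
hadoop version
# ...with the hadoop-client version(s) bundled with Druid (run from the druid-0.9.0 directory)
ls hadoop-dependencies/hadoop-client/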

Expert Contributor

HDP 2.6 has native support for Druid, so you no longer have to do any of this classpath manipulation.