I am trying to run a Hadoop-based ingestion task in Druid. The Overlord accepts the task when I POST it, but the task then fails. Here is the task spec:
{ "type": "index_hadoop", "spec": { "ioConfig": { "type": "hadoop", "inputSpec": { "type": "static", "paths": "test/Nacho_test/test.csv" } }, "dataSchema": { "dataSource": "test", "granularitySpec": { "type": "uniform", "segmentGranularity": "YEAR", "queryGranularity": "NONE", "intervals": [ "2010-12-31/2017-12-31" ] }, "parser": { "type": "hadoopyString", "parseSpec": { "format": "csv", "timestampSpec": { "column": "FECHA", "format": "yyyy-mm-dd" }, "columns": [ "CURSO", "FECHA", "BECAS", "TIPO", "INDICADOR" ], "dimensionsSpec": { "dimensions": [ "CURSO", "FECHA", "BECAS", "TIPO", "INDICADOR" ] } } }, "metricsSpec": [ { "name": "count", "type": "count" } ] }, "tuningConfig": { "type": "hadoop" } } }
These are all the warnings and errors from the task's full log:
2019-10-07T10:41:12,591 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-10-07T10:41:13,661 WARN [main] org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2019-10-07T10:41:14,664 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2019-10-07T10:41:16,559 WARN [main-SendThread(impulsa-int-slave.europe-west1-b.c.gijon-impulsa.internal:2181)] org.apache.zookeeper.ClientCnxn - SASL configuration failed: javax.security.auth.login.LoginException: Zookeeper client cannot authenticate using the 'Client' section of the supplied JAAS configuration: '/usr/hdp/current/druid-middlemanager/conf/druid_jaas.conf' because of a RuntimeException: java.lang.SecurityException: java.io.IOException: /usr/hdp/current/druid-middlemanager/conf/druid_jaas.conf (No such file or directory) Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it.
2019-10-07T10:41:16,563 ERROR [main-EventThread] org.apache.curator.ConnectionState - Authentication failed
2019-10-07T10:41:17,966 WARN [main] com.sun.jersey.spi.inject.Errors - The following warnings have been detected with resource and/or provider classes: WARNING: A HTTP GET method, public void io.druid.server.http.SegmentListerResource.getSegments(long,long,long,javax.servlet.http.HttpServletRequest) throws java.io.IOException, MUST return a non-void type.
2019-10-07T10:41:18,093 WARN [main] io.druid.query.lookup.LookupReferencesManager - No lookups found for tier [__default], response [io.druid.java.util.http.client.response.FullResponseHolder@1de85972]
2019-10-07 10:41:18,436 task-runner-0-priority-0 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration.
java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup at java.lang.Class.cast(Class.java:3369) at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167) at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301) at org.apache.logging.log4j.core.lookup.Interpolator.(Interpolator.java:94) at org.apache.logging.log4j.core.config.AbstractConfiguration.(AbstractConfiguration.java:111) at org.apache.logging.log4j.core.config.DefaultConfiguration.(DefaultConfiguration.java:48) at org.apache.logging.log4j.core.LoggerContext.(LoggerContext.java:75) at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.createContext(ClassLoaderContextSelector.java:171) at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:145) at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:70) at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:57) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:140) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41) at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103) at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42) at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89) at java.util.logging.LogManager.demandLogger(LogManager.java:551) at java.util.logging.Logger.demandLogger(Logger.java:455) at java.util.logging.Logger.getLogger(Logger.java:502) at com.google.inject.internal.util.Stopwatch.(Stopwatch.java:27) at com.google.inject.internal.InternalInjectorCreator.(InternalInjectorCreator.java:61) at com.google.inject.Guice.createInjector(Guice.java:96) at com.google.inject.Guice.createInjector(Guice.java:73) at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:61) at io.druid.indexer.HadoopDruidIndexerConfig.(HadoopDruidIndexerConfig.java:106) at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:311) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) 2019-10-07 10:41:18,439 task-runner-0-priority-0 WARN JMX runtime input lookup class is not 
available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup at java.lang.Class.cast(Class.java:3369) at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167) at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301) at org.apache.logging.log4j.core.lookup.Interpolator.(Interpolator.java:105) at org.apache.logging.log4j.core.config.AbstractConfiguration.(AbstractConfiguration.java:111) at org.apache.logging.log4j.core.config.DefaultConfiguration.(DefaultConfiguration.java:48) at org.apache.logging.log4j.core.LoggerContext.(LoggerContext.java:75) at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.createContext(ClassLoaderContextSelector.java:171) at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:145) at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:70) at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:57) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:140) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41) at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103) at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42) at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89) at java.util.logging.LogManager.demandLogger(LogManager.java:551) at java.util.logging.Logger.demandLogger(Logger.java:455) at java.util.logging.Logger.getLogger(Logger.java:502) at com.google.inject.internal.util.Stopwatch.(Stopwatch.java:27) at com.google.inject.internal.InternalInjectorCreator.(InternalInjectorCreator.java:61) at com.google.inject.Guice.createInjector(Guice.java:96) at com.google.inject.Guice.createInjector(Guice.java:73) at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:61) at io.druid.indexer.HadoopDruidIndexerConfig.(HadoopDruidIndexerConfig.java:106) at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:311) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at 
java.lang.Thread.run(Thread.java:745) 2019-10-07 10:41:18,493 task-runner-0-priority-0 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup at java.lang.Class.cast(Class.java:3369) at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167) at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301) at org.apache.logging.log4j.core.lookup.Interpolator.(Interpolator.java:94) at org.apache.logging.log4j.core.config.AbstractConfiguration.(AbstractConfiguration.java:111) at org.apache.logging.log4j.core.config.xml.XmlConfiguration.(XmlConfiguration.java:81) at org.apache.logging.log4j.core.config.xml.XmlConfigurationFactory.getConfiguration(XmlConfigurationFactory.java:44) at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:490) at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:460) at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:256) at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:561) at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578) at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41) at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103) at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42) at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89) at java.util.logging.LogManager.demandLogger(LogManager.java:551) at java.util.logging.Logger.demandLogger(Logger.java:455) at java.util.logging.Logger.getLogger(Logger.java:502) at com.google.inject.internal.util.Stopwatch.(Stopwatch.java:27) at com.google.inject.internal.InternalInjectorCreator.(InternalInjectorCreator.java:61) at com.google.inject.Guice.createInjector(Guice.java:96) at com.google.inject.Guice.createInjector(Guice.java:73) at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:61) at io.druid.indexer.HadoopDruidIndexerConfig.(HadoopDruidIndexerConfig.java:106) at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:311) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) at 
io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) 2019-10-07 10:41:18,495 task-runner-0-priority-0 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup at java.lang.Class.cast(Class.java:3369) at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167) at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301) at org.apache.logging.log4j.core.lookup.Interpolator.(Interpolator.java:105) at org.apache.logging.log4j.core.config.AbstractConfiguration.(AbstractConfiguration.java:111) at org.apache.logging.log4j.core.config.xml.XmlConfiguration.(XmlConfiguration.java:81) at org.apache.logging.log4j.core.config.xml.XmlConfigurationFactory.getConfiguration(XmlConfigurationFactory.java:44) at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:490) at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:460) at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:256) at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:561) at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578) at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41) at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103) at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42) at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89) at java.util.logging.LogManager.demandLogger(LogManager.java:551) at java.util.logging.Logger.demandLogger(Logger.java:455) at java.util.logging.Logger.getLogger(Logger.java:502) at com.google.inject.internal.util.Stopwatch.(Stopwatch.java:27) at com.google.inject.internal.InternalInjectorCreator.(InternalInjectorCreator.java:61) at com.google.inject.Guice.createInjector(Guice.java:96) at com.google.inject.Guice.createInjector(Guice.java:73) at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:61) at io.druid.indexer.HadoopDruidIndexerConfig.(HadoopDruidIndexerConfig.java:106) at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:311) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) 
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) 2019-10-07 10:41:18,503 task-runner-0-priority-0 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup at java.lang.Class.cast(Class.java:3369) at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167) at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301) at org.apache.logging.log4j.core.lookup.Interpolator.(Interpolator.java:94) at org.apache.logging.log4j.core.config.AbstractConfiguration.(AbstractConfiguration.java:111) at org.apache.logging.log4j.core.config.DefaultConfiguration.(DefaultConfiguration.java:48) at org.apache.logging.log4j.core.layout.PatternLayout$Builder.build(PatternLayout.java:444) at org.apache.logging.log4j.core.layout.PatternLayout.createDefaultLayout(PatternLayout.java:327) at org.apache.logging.log4j.core.appender.ConsoleAppender$Builder.(ConsoleAppender.java:157) at org.apache.logging.log4j.core.appender.ConsoleAppender.newBuilder(ConsoleAppender.java:149) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.createBuilder(PluginBuilder.java:154) at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:119) at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:888) at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:828) at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:820) at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:449) at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:197) at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:209) at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:492) at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:562) at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578) at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41) at 
org.apache.logging.log4j.LogManager.getContext(LogManager.java:182) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103) at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42) at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89) at java.util.logging.LogManager.demandLogger(LogManager.java:551) at java.util.logging.Logger.demandLogger(Logger.java:455) at java.util.logging.Logger.getLogger(Logger.java:502) at com.google.inject.internal.util.Stopwatch.(Stopwatch.java:27) at com.google.inject.internal.InternalInjectorCreator.(InternalInjectorCreator.java:61) at com.google.inject.Guice.createInjector(Guice.java:96) at com.google.inject.Guice.createInjector(Guice.java:73) at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:61) at io.druid.indexer.HadoopDruidIndexerConfig.(HadoopDruidIndexerConfig.java:106) at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:311) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745) 2019-10-07 10:41:18,505 task-runner-0-priority-0 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. 
java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup at java.lang.Class.cast(Class.java:3369) at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167) at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301) at org.apache.logging.log4j.core.lookup.Interpolator.(Interpolator.java:105) at org.apache.logging.log4j.core.config.AbstractConfiguration.(AbstractConfiguration.java:111) at org.apache.logging.log4j.core.config.DefaultConfiguration.(DefaultConfiguration.java:48) at org.apache.logging.log4j.core.layout.PatternLayout$Builder.build(PatternLayout.java:444) at org.apache.logging.log4j.core.layout.PatternLayout.createDefaultLayout(PatternLayout.java:327) at org.apache.logging.log4j.core.appender.ConsoleAppender$Builder.(ConsoleAppender.java:157) at org.apache.logging.log4j.core.appender.ConsoleAppender.newBuilder(ConsoleAppender.java:149) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.createBuilder(PluginBuilder.java:154) at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:119) at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:888) at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:828) at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:820) at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:449) at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:197) at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:209) at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:492) at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:562) at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578) at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145) at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41) at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103) at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34) at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42) at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89) at java.util.logging.LogManager.demandLogger(LogManager.java:551) at java.util.logging.Logger.demandLogger(Logger.java:455) at java.util.logging.Logger.getLogger(Logger.java:502) at com.google.inject.internal.util.Stopwatch.(Stopwatch.java:27) at com.google.inject.internal.InternalInjectorCreator.(InternalInjectorCreator.java:61) at com.google.inject.Guice.createInjector(Guice.java:96) at 
com.google.inject.Guice.createInjector(Guice.java:73) at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:61) at io.druid.indexer.HadoopDruidIndexerConfig.(HadoopDruidIndexerConfig.java:106) at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:311) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at java.lang.Thread.run(Thread.java:745)
2019-10-07T10:41:20,145 WARN [task-runner-0-priority-0] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-10-07T10:41:20,872 WARN [task-runner-0-priority-0] org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2019-10-07T10:41:21,525 WARN [task-runner-0-priority-0] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2019-10-07T10:41:22,973 WARN [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobResourceUploader - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2019-10-07T10:41:22,993 WARN [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobResourceUploader - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
2019-10-07T10:41:23,250 ERROR [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[HadoopIndexTask{id=index_hadoop_test_2019-10-07T10:41:06.614Z, type=index_hadoop, dataSource=test}]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?] at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:222) ~[druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) ~[druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) [druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) [druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) ~[druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] ... 7 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://impulsa-int-master.europe-west1-b.c.gijon-impulsa.internal:8020/user/druid/test/Nacho_test/test.csv at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?] at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:209) ~[druid-indexing-hadoop-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexer.JobHelper.runJobs(JobHelper.java:368) ~[druid-indexing-hadoop-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:91) ~[druid-indexing-hadoop-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:325) ~[druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) ~[druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] ... 7 more
Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://impulsa-int-master.europe-west1-b.c.gijon-impulsa.internal:8020/user/druid/test/Nacho_test/test.csv at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:332) ~[?:?] at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:274) ~[?:?] at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:396) ~[?:?] at org.apache.hadoop.mapreduce.lib.input.DelegatingInputFormat.getSplits(DelegatingInputFormat.java:115) ~[?:?] at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:310) ~[?:?] at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:327) ~[?:?] at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:200) ~[?:?] at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570) ~[?:?] at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567) ~[?:?] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112] at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) ~[?:?] at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567) ~[?:?] at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:119) ~[druid-indexing-hadoop-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexer.JobHelper.runJobs(JobHelper.java:368) ~[druid-indexing-hadoop-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:91) ~[druid-indexing-hadoop-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:325) ~[druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) ~[druid-indexing-service-0.12.1.3.0.1.0-187.jar:0.12.1.3.0.1.0-187] ... 7 more
2019-10-07T10:41:23,268 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_test_2019-10-07T10:41:06.614Z] status changed to [FAILED].
2019-10-07T10:41:23,272 INFO [task-runner-0-priority-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: { "id" : "index_hadoop_test_2019-10-07T10:41:06.614Z", "status" : "FAILED", "duration" : 6279 }
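As far as I can tell, the last "Caused by" says the input path does not exist at hdfs://impulsa-int-master.europe-west1-b.c.gijon-impulsa.internal:8020/user/druid/test/Nacho_test/test.csv, so the relative "paths" value in the spec is being resolved under /user/druid on HDFS. As a sanity check I plan to run something like the following (the first path comes from the log; the second is only a guess at where the file might actually have been uploaded):

hdfs dfs -ls /user/druid/test/Nacho_test/test.csv
hdfs dfs -ls /test/Nacho_test/test.csv   # guessed alternative location, in case the file is under the HDFS root

But I am not sure whether I should be giving an absolute HDFS path (or a full hdfs:// URI) in "paths", or whether something else in my setup is wrong.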
If you need more info, please ask. Thanks in advance.