Malformed ORC file: Invalid postscript

Super Collaborator

Stack: HDP-2.3.2.0-2950, installed using Ambari 2.1.

The Sqoop import command:

sqoop import --connect 'jdbc:sqlserver://dbserver;database=dbname' --username username --password password --as-textfile --fields-terminated-by '|'  --table DimECU  --warehouse-dir /dataload/tohdfs/reio/odpdw/may2016 --verbose
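(The output path below follows from --warehouse-dir plus the table name, and matches the part file named in the error further down. Listing the directory and peeking at the first rows shows plain '|'-delimited text, which is what --as-textfile produces.)

hdfs dfs -ls /dataload/tohdfs/reio/odpdw/may2016/DimECU
hdfs dfs -cat /dataload/tohdfs/reio/odpdw/may2016/DimECU/part-m-00000 | head -n 2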

The CREATE EXTERNAL TABLE statement:

CREATE EXTERNAL TABLE IF NOT EXISTS DimECU (
  `ECU_ID` int,
  `ECU_Name` varchar(15),
  `ECU_FAMILY_NAME` varchar(15),
  `INSERTED_BY` varchar(64),
  `INSERTION_DATE` timestamp)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS ORC
LOCATION '/dataload/tohdfs/reio/odpdw/may2016/DimECU';
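(For reference, DESCRIBE FORMATTED shows which input format and SerDe Hive will use when reading the table's location; for a table declared STORED AS ORC it should report OrcInputFormat/OrcSerde, regardless of what the files underneath actually contain. The database name is taken from the Hive prompt below.)

DESCRIBE FORMATTED odp_dw_may2016_orc.DimECU;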

Selecting from the table fails:

hive (odp_dw_may2016_orc)>
                         >
                         > select * from DimECU limit 5;
OK
dimecu.ecu_id   dimecu.ecu_name dimecu.ecu_family_name  dimecu.inserted_by      dimecu.insertion_date
Failed with exception java.io.IOException:org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://l1031lab.sss.se.scania.com:8020/dataload/tohdfs/reio/odpdw/may2016/DimECU/part-m-00000. Invalid postscript.
Time taken: 0.074 seconds
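(A file that really is ORC starts with the magic bytes ORC and can be inspected with Hive's ORC dump utility. Run against this part file, the dump fails with the same Malformed ORC / Invalid postscript error, which confirms the file on disk is not ORC.)

hive --orcfiledump /dataload/tohdfs/reio/odpdw/may2016/DimECU/part-m-00000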

The full exception from the Hive log:

2016-05-12 13:17:26,334 ERROR [main]: CliDriver (SessionState.java:printError(960)) - Failed with exception java.io.IOException:org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://l1031lab.sss.se.scania.com:8020/dataload/tohdfs/reio/odpdw/may2016/DimECU/part-m-00000. Invalid postscript.
java.io.IOException: org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://l1031lab.sss.se.scania.com:8020/dataload/tohdfs/reio/odpdw/may2016/DimECU/part-m-00000. Invalid postscript.
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:508)
at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:415)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:140)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1672)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://l1031lab.sss.se.scania.com:8020/dataload/tohdfs/reio/odpdw/may2016/DimECU/part-m-00000. Invalid postscript.
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.ensureOrcFooter(ReaderImpl.java:251)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.extractMetaInfoFromFooter(ReaderImpl.java:376)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.<init>(ReaderImpl.java:317)
at org.apache.hadoop.hive.ql.io.orc.OrcFile.createReader(OrcFile.java:237)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getReader(OrcInputFormat.java:1208)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getRecordReader(OrcInputFormat.java:1117)
at org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit.getRecordReader(FetchOperator.java:674)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:324)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:446)
... 15 more
2016-05-12 13:17:26,334 INFO  [main]: exec.TableScanOperator (Operator.java:close(613)) - 0 finished. closing... 
2016-05-12 13:17:26,334 INFO  [main]: exec.SelectOperator (Operator.java:close(613)) - 1 finished. closing... 
2016-05-12 13:17:26,334 INFO  [main]: exec.LimitOperator (Operator.java:close(613)) - 2 finished. closing... 
2016-05-12 13:17:26,334 INFO  [main]: exec.ListSinkOperator (Operator.java:close(613)) - 4 finished. closing... 
2016-05-12 13:17:26,334 INFO  [main]: exec.ListSinkOperator (Operator.java:close(635)) - 4 Close done
2016-05-12 13:17:26,334 INFO  [main]: exec.LimitOperator (Operator.java:close(635)) - 2 Close done
2016-05-12 13:17:26,335 INFO  [main]: exec.SelectOperator (Operator.java:close(635)) - 1 Close done
2016-05-12 13:17:26,335 INFO  [main]: exec.TableScanOperator (Operator.java:close(635)) - 0 Close done
2016-05-12 13:17:26,352 INFO  [Atlas Logger 0]: security.SecureClientUtils (SecureClientUtils.java:getClientConnectionHandler(91)) - Real User: hive (auth:SIMPLE), is from ticket cache? false
2016-05-12 13:17:26,353 INFO  [Atlas Logger 0]: security.SecureClientUtils (SecureClientUtils.java:getClientConnectionHandler(94)) - doAsUser: hive
2016-05-12 13:17:26,356 INFO  [main]: CliDriver (SessionState.java:printInfo(951)) - Time taken: 0.065 seconds
2016-05-12 13:17:26,356 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2016-05-12 13:17:26,356 INFO  [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks start=1463051846356 end=1463051846356 duration=0 from=org.apache.hadoop.hive.ql.Driver>
2016-05-12 13:17:26,989 INFO  [Atlas Logger 0]: hook.HiveHook (HiveHook.java:run(168)) - Atlas hook failed
org.apache.atlas.AtlasServiceException: Metadata service API SEARCH_GREMLIN failed with status 400(Bad Request) Response Body ({"error":"javax.script.ScriptException: javax.script.ScriptException: com.thinkaurelius.titan.core.TitanException: Could not start new transaction","stackTrace":"org.apache.atlas.discovery.DiscoveryException: javax.script.ScriptException: javax.script.ScriptException: com.thinkaurelius.titan.core.TitanException: Could not start new transaction\n\tat org.apache.atlas.discovery.graph.GraphBackedDiscoveryService.searchByGremlin(GraphBackedDiscoveryService.java:175)\n\tat org.apache.atlas.GraphTransactionInterceptor.invoke(GraphTransactionInterceptor.java:41)\n\tat org.apache.atlas.web.resources.MetadataDiscoveryResource.searchUsingGremlinQuery(MetadataDiscoveryResource.java:155)\n\tat sun.reflect.GeneratedMethodAccessor74.invoke(Unknown Source)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:606)\n\tat com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)\n\tat com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)\n\tat com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)\n\tat com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)\n\tat com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)\n\tat com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)\n\tat com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)\n\tat com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)\n\tat com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)\n\tat com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)\n\tat com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:820)\n\tat com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:287)\n\tat com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:277)\n\tat com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:182)\n\tat com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91)\n\tat com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85)\n\tat org.apache.atlas.web.filters.AuditFilter.doFilter(AuditFilter.java:67)\n\tat com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:82)\n\tat com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:119)\n\tat com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:133)\n\tat 
com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:130)\n\tat com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:203)\n\tat com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:130)\n\tat org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)\n\tat org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399)\n\tat org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)\n\tat org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)\n\tat org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)\n\tat org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)\n\tat org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)\n\tat org.mortbay.jetty.Server.handle(Server.java:326)\n\tat org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)\n\tat org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:928)\n\tat org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)\n\tat org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)\n\tat org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)\n\tat org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)\n\tat org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)\nCaused by: javax.script.ScriptException: javax.script.ScriptException: com.thinkaurelius.titan.core.TitanException: Could not start new transaction\n\tat com.tinkerpop.gremlin.groovy.jsr223.GremlinGroovyScriptEngine.eval(GremlinGroovyScriptEngine.java:94)\n\tat javax.script.AbstractScriptEngine.eval(AbstractScriptEngine.java:233)\n\tat org.apache.atlas.discovery.graph.GraphBackedDiscoveryService.searchByGremlin(GraphBackedDiscoveryService.java:172)\n\t... 48 more\nCaused by: javax.script.ScriptException: com.thinkaurelius.titan.core.TitanException: Could not start new transaction\n\tat com.tinkerpop.gremlin.groovy.jsr223.GremlinGroovyScriptEngine.eval(GremlinGroovyScriptEngine.java:221)\n\tat com.tinkerpop.gremlin.groovy.jsr223.GremlinGroovyScriptEngine.eval(GremlinGroovyScriptEngine.java:90)\n\t... 
50 more\nCaused by: com.thinkaurelius.titan.core.TitanException: Could not start new transaction\n\tat com.thinkaurelius.titan.graphdb.database.StandardTitanGraph.newTransaction(StandardTitanGraph.java:276)\n\tat com.thinkaurelius.titan.graphdb.transaction.StandardTransactionBuilder.start(StandardTransactionBuilder.java:220)\n\tat com.thinkaurelius.titan.graphdb.database.StandardTitanGraph.newThreadBoundTransaction(StandardTitanGraph.java:265)\n\tat com.thinkaurelius.titan.graphdb.blueprints.TitanBlueprintsGraph.getAutoStartTx(TitanBlueprintsGraph.java:104)\n\tat com.thinkaurelius.titan.graphdb.blueprints.TitanBlueprintsGraph.query(TitanBlueprintsGraph.java:225)\n\tat com.thinkaurelius.titan.graphdb.blueprints.TitanBlueprintsGraph.query(TitanBlueprintsGraph.java:27)\n\tat com.tinkerpop.pipes.transform.GraphQueryPipe.processNextStart(GraphQueryPipe.java:34)\n\tat com.tinkerpop.pipes.transform.GraphQueryPipe.processNextStart(GraphQueryPipe.java:17)\n\tat com.tinkerpop.pipes.AbstractPipe.next(AbstractPipe.java:89)\n\tat com.tinkerpop.pipes.IdentityPipe.processNextStart(IdentityPipe.java:19)\n\tat com.tinkerpop.pipes.AbstractPipe.next(AbstractPipe.java:89)\n\tat com.tinkerpop.pipes.IdentityPipe.processNextStart(IdentityPipe.java:19)\n\tat com.tinkerpop.pipes.AbstractPipe.hasNext(AbstractPipe.java:98)\n\tat com.tinkerpop.pipes.util.Pipeline.hasNext(Pipeline.java:105)\n\tat org.codehaus.groovy.runtime.DefaultGroovyMethods.toList(DefaultGroovyMethods.java:1946)\n\tat org.codehaus.groovy.runtime.dgm$836.invoke(Unknown Source)\n\tat org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:271)\n\tat org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:53)\n\tat org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)\n\tat org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)\n\tat org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:112)\n\tat Script180.run(Script180.groovy:1)\n\tat com.tinkerpop.gremlin.groovy.jsr223.GremlinGroovyScriptEngine.eval(GremlinGroovyScriptEngine.java:219)\n\t... 51 more\nCaused by: com.thinkaurelius.titan.diskstorage.PermanentBackendException: Could not start BerkeleyJE transaction\n\tat com.thinkaurelius.titan.diskstorage.berkeleyje.BerkeleyJEStoreManager.beginTransaction(BerkeleyJEStoreManager.java:144)\n\tat com.thinkaurelius.titan.diskstorage.berkeleyje.BerkeleyJEStoreManager.beginTransaction(BerkeleyJEStoreManager.java:34)\n\tat com.thinkaurelius.titan.diskstorage.keycolumnvalue.keyvalue.OrderedKeyValueStoreManagerAdapter.beginTransaction(OrderedKeyValueStoreManagerAdapter.java:52)\n\tat com.thinkaurelius.titan.diskstorage.Backend.beginTransaction(Backend.java:465)\n\tat com.thinkaurelius.titan.graphdb.database.StandardTitanGraph.openBackendTransaction(StandardTitanGraph.java:282)\n\tat com.thinkaurelius.titan.graphdb.database.StandardTitanGraph.newTransaction(StandardTitanGraph.java:272)\n\t... 73 more\nCaused by: com.sleepycat.je.LogWriteException: (JE 5.0.73) Environment must be closed, caused by: com.sleepycat.je.LogWriteException: Environment invalid because of previous exception: (JE 5.0.73) \/var\/lib\/atlas\/data\/berkeley java.io.IOException: No space left on device LOG_WRITE: IOException on write, log is likely incomplete. 
Environment is invalid and must be closed.\n\tat com.sleepycat.je.LogWriteException.wrapSelf(LogWriteException.java:72)\n\tat com.sleepycat.je.dbi.EnvironmentImpl.checkIfInvalid(EnvironmentImpl.java:1512)\n\tat com.sleepycat.je.Environment.checkEnv(Environment.java:2185)\n\tat com.sleepycat.je.Environment.beginTransactionInternal(Environment.java:1313)\n\tat com.sleepycat.je.Environment.beginTransaction(Environment.java:1284)\n\tat com.thinkaurelius.titan.diskstorage.berkeleyje.BerkeleyJEStoreManager.beginTransaction(BerkeleyJEStoreManager.java:134)\n\t... 78 more\nCaused by: com.sleepycat.je.LogWriteException: Environment invalid because of previous exception: (JE 5.0.73) \/var\/lib\/atlas\/data\/berkeley java.io.IOException: No space left on device LOG_WRITE: IOException on write, log is likely incomplete. Environment is invalid and must be closed.\n\tat com.sleepycat.je.log.FileManager.writeLogBuffer(FileManager.java:1652)\n\tat com.sleepycat.je.log.LogBufferPool.writeBufferToFile(LogBufferPool.java:260)\n\tat com.sleepycat.je.log.LogBufferPool.writeCompleted(LogBufferPool.java:345)\n\tat com.sleepycat.je.log.LogManager.serialLogWork(LogManager.java:716)\n\tat com.sleepycat.je.log.LogManager.serialLogInternal(LogManager.java:493)\n\tat com.sleepycat.je.log.SyncedLogManager.serialLog(SyncedLogManager.java:42)\n\tat com.sleepycat.je.log.LogManager.multiLog(LogManager.java:395)\n\tat com.sleepycat.je.log.LogManager.log(LogManager.java:335)\n\tat com.sleepycat.je.txn.Txn.logCommitEntry(Txn.java:957)\n\tat com.sleepycat.je.txn.Txn.commit(Txn.java:719)\n\tat com.sleepycat.je.txn.Txn.commit(Txn.java:584)\n\tat com.sleepycat.je.Transaction.commit(Transaction.java:317)\n\tat com.thinkaurelius.titan.diskstorage.berkeleyje.BerkeleyJETx.commit(BerkeleyJETx.java:81)\n\tat com.thinkaurelius.titan.diskstorage.keycolumnvalue.cache.CacheTransaction.commit(CacheTransaction.java:198)\n\tat com.thinkaurelius.titan.diskstorage.BackendTransaction.commitStorage(BackendTransaction.java:117)\n\tat com.thinkaurelius.titan.graphdb.database.StandardTitanGraph.commit(StandardTitanGraph.java:670)\n\tat com.thinkaurelius.titan.graphdb.transaction.StandardTitanTx.commit(StandardTitanTx.java:1337)\n\tat com.thinkaurelius.titan.graphdb.blueprints.TitanBlueprintsGraph.commit(TitanBlueprintsGraph.java:60)\n\tat org.apache.atlas.GraphTransactionInterceptor.invoke(GraphTransactionInterceptor.java:42)\n\tat org.apache.atlas.services.DefaultMetadataService.createEntity(DefaultMetadataService.java:231)\n\tat org.apache.atlas.web.resources.EntityResource.submit(EntityResource.java:96)\n\tat sun.reflect.GeneratedMethodAccessor41.invoke(Unknown Source)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:606)\n\tat com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)\n\tat com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)\n\tat com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)\n\tat com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)\n\tat com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)\n\tat com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)\n\tat 
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)\n\tat com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)\n\tat com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)\n\tat com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:820)\n\tat com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:287)\n\tat com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:277)\n\tat com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:182)\n\tat com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91)\n\tat com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85)\n\tat org.apache.atlas.web.filters.AuditFilter.doFilter(AuditFilter.java:67)\n\tat com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:82)\n\tat com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:119)\n\tat com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:133)\n\tat com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:130)\n\tat com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:203)\n\tat com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:130)\n\tat org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1212)\n\tat org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:399)\n\tat org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)\n\tat org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)\n\tat org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)\n\tat org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:450)\n\tat org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)\n\tat org.mortbay.jetty.Server.handle(Server.java:326)\n\tat org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)\n\tat org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:945)\n\tat org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:756)\n\tat org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)\n\t... 3 more\nCaused by: java.io.IOException: No space left on device\n\tat java.io.RandomAccessFile.writeBytes0(Native Method)\n\tat java.io.RandomAccessFile.writeBytes(RandomAccessFile.java:520)\n\tat java.io.RandomAccessFile.write(RandomAccessFile.java:550)\n\tat com.sleepycat.je.log.FileManager.writeToFile(FileManager.java:1757)\n\tat com.sleepycat.je.log.FileManager.writeLogBuffer(FileManager.java:1637)\n\t... 65 more\n"})
at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:365)
at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:346)
at org.apache.atlas.AtlasClient.searchByGremlin(AtlasClient.java:294)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.getEntityReferenceFromGremlin(HiveMetaStoreBridge.java:227)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.getProcessReference(HiveMetaStoreBridge.java:183)
at org.apache.atlas.hive.hook.HiveHook.registerProcess(HiveHook.java:297)
at org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:202)
at org.apache.atlas.hive.hook.HiveHook.access$200(HiveHook.java:54)
at org.apache.atlas.hive.hook.HiveHook$2.run(HiveHook.java:166)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
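(Side note: the Atlas hook failure at the end of the log is a separate problem. The root cause buried in that trace is java.io.IOException: No space left on device while BerkeleyJE writes to /var/lib/atlas/data/berkeley, so the disk on the Atlas host is worth checking as well, e.g.:)

df -h /var/lib/atlas/data
du -sh /var/lib/atlas/data/berkeley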
1 ACCEPTED SOLUTION

Super Guru

@Kaliyug Antagonist

The issue is that you imported the data with Sqoop using --as-textfile, but you created the table with STORED AS ORC. Hive throws this error because the files under the table's location are plain text, not ORC. Either change the table's storage format to text, or import the data in ORC format through Sqoop.
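For example (a sketch using the table, paths, and connection details from the question; adjust to your environment, and note that the target table name DimECU_orc in option 2 is only illustrative):

-- Option 1: keep the text files Sqoop already wrote and declare the table as TEXTFILE
DROP TABLE IF EXISTS DimECU;
CREATE EXTERNAL TABLE IF NOT EXISTS DimECU (
  `ECU_ID` int,
  `ECU_Name` varchar(15),
  `ECU_FAMILY_NAME` varchar(15),
  `INSERTED_BY` varchar(64),
  `INSERTION_DATE` timestamp)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/dataload/tohdfs/reio/odpdw/may2016/DimECU';

# Option 2: re-import through Sqoop's HCatalog integration so the data lands as ORC.
# Sqoop's HCatalog options (--hcatalog-table, --create-hcatalog-table,
# --hcatalog-storage-stanza) create the ORC-backed table and convert in one step;
# --fields-terminated-by and --warehouse-dir are not needed for an HCatalog import.
sqoop import --connect 'jdbc:sqlserver://dbserver;database=dbname' \
  --username username --password password \
  --table DimECU \
  --hcatalog-database odp_dw_may2016_orc \
  --hcatalog-table DimECU_orc \
  --create-hcatalog-table \
  --hcatalog-storage-stanza 'stored as orcfile'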

