Member since: 04-29-2016
Posts: 192
Kudos Received: 20
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
  | 1601 | 07-14-2017 05:01 PM
  | 2707 | 06-28-2017 05:20 PM
05-10-2017
07:21 PM
@Bryan Bende, I placed the phoenix jar file in NiFi's work directory path and restarted the NiFi instance; I'm now seeing a different error in the log when I enable the HBase client service: "org.apache.nifi.StdErr java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab(Ljavax/security/auth/Subject;)Z" Failed to invoke @OnEnabled method due to java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab(Ljavax/security/auth/Subject;)Z
2017-05-10 14:13:50,016 ERROR [NiFi logging handler] org.apache.nifi.StdErr [StandardProcessScheduler Thread-6] ERROR org.apache.nifi.controller.service.StandardControllerServiceNode -
2017-05-10 14:13:50,016 ERROR [NiFi logging handler] org.apache.nifi.StdErr java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab(Ljavax/security/auth/Subject;)Z
2017-05-10 14:13:50,016 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.security.UserGroupInformation.<init>(UserGroupInformation.java:623)
2017-05-10 14:13:50,017 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytabAndReturnUGI(UserGroupInformation.java:1200)
2017-05-10 14:13:50,017 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.hadoop.SecurityUtil.loginKerberos(SecurityUtil.java:52)
2017-05-10 14:13:50,017 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.hbase.HBase_1_1_2_ClientService.createConnection(HBase_1_1_2_ClientService.java:226)
2017-05-10 14:13:50,017 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.hbase.HBase_1_1_2_ClientService.onEnabled(HBase_1_1_2_ClientService.java:178)
2017-05-10 14:13:50,017 ERROR [NiFi logging handler] org.apache.nifi.StdErr at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2017-05-10 14:13:50,017 ERROR [NiFi logging handler] org.apache.nifi.StdErr at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2017-05-10 14:13:50,017 ERROR [NiFi logging handler] org.apache.nifi.StdErr at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-05-10 14:13:50,017 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.reflect.Method.invoke(Method.java:498)
2017-05-10 14:13:50,018 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137)
2017-05-10 14:13:50,018 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125)
2017-05-10 14:13:50,018 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70)
2017-05-10 14:13:50,018 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47)
2017-05-10 14:13:50,018 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.controller.service.StandardControllerServiceNode$2.run(StandardControllerServiceNode.java:348)
2017-05-10 14:13:50,018 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2017-05-10 14:13:50,018 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2017-05-10 14:13:50,018 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
2017-05-10 14:13:50,019 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
2017-05-10 14:13:50,020 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2017-05-10 14:13:50,020 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2017-05-10 14:13:50,020 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.Thread.run(Thread.java:745)
2017-05-10 14:13:50,020 ERROR [NiFi logging handler] org.apache.nifi.StdErr [StandardProcessScheduler Thread-6] ERROR org.apache.nifi.controller.service.StandardControllerServiceNode - Failed to invoke @OnEnabled method of HBase_1_1_2_ClientService[id=102e119a-19a2-1409-671f-dddd93a063de] due to java.lang.NoSuchMethodError: org.apache.hadoop.security.authentication.util.KerberosUtil.hasKerberosKeyTab(Ljavax/security/auth/Subject;)Z
05-10-2017
03:19 PM
@Bryan Bende thank you.
05-10-2017
01:39 PM
@Bryan Bende thank you. Deleting NiFi's work directory is a manual step done deliberately to address a specific issue, right? It's not something that would occur during routine NiFi maintenance, like restarting the NiFi instance after OS patches. Also, I'm guessing the phoenix-client.jar can be found somewhere in the HDP installation; would you happen to know the path, so I can let our admins know?
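For finding the Phoenix client jar, a quick filesystem search is usually enough. The path below is an assumption about a typical HDP layout (Phoenix normally installs under `/usr/hdp/<version>/phoenix/`), not something confirmed in this thread:

```shell
# Locate the Phoenix client jar anywhere under the HDP install root;
# it may be named phoenix-client.jar or a versioned phoenix-*-client.jar.
find /usr/hdp -maxdepth 4 -name 'phoenix*client*.jar' 2>/dev/null
```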
05-10-2017
03:51 AM
@Timothy Spann I haven't restarted NiFi yet; that's what I was leaning towards, but I wanted to wait before trying it, in case there are other things that need to be done before the restart. The account running NiFi doesn't have any issues connecting and writing to HDP; we have another dataflow on the same NiFi instance that writes to HDFS.
05-10-2017
03:39 AM
@Bryan Bende is this the interim solution until we upgrade to 1.1.x - https://community.hortonworks.com/questions/67074/nifi-hbase-service-controller-failing-in-nifi-10.html ?
05-09-2017
06:51 PM
@Ward Bekker sorry for the delay in replying. It turned out that I was missing hbase-site.xml in the Hadoop configuration files, which I have now added. The log now shows a different error: "Failed to invoke @OnEnabled method due to java.io.IOException: java.lang.reflect.InvocationTargetException". It does not seem to be a login issue, because the log says the login succeeded. By the way, we're on NiFi 1.0.1.
2017-05-09 13:34:47,645 ERROR [NiFi logging handler] org.apache.nifi.StdErr [StandardProcessScheduler Thread-2] INFO org.apache.nifi.hbase.HBase_1_1_2_ClientService - HBase_1_1_2_ClientService[id=102e119a-19a2-1409-671f-dddd93a063de] HBase Security Enabled, logging in as principal principal@principal.com with keytab /home/nifitest/kdc/nifitest.service.keytab
2017-05-09 13:34:47,663 ERROR [NiFi logging handler] org.apache.nifi.StdErr [StandardProcessScheduler Thread-2] INFO org.apache.nifi.hbase.HBase_1_1_2_ClientService - HBase_1_1_2_ClientService[id=102e119a-19a2-1409-671f-dddd93a063de] Successfully logged in as principal principal@principal.com with keytab /home/nifitest/kdc/nifitest.service.keytab
2017-05-09 13:34:47,664 ERROR [NiFi logging handler] org.apache.nifi.StdErr [StandardProcessScheduler Thread-2] ERROR org.apache.nifi.controller.service.StandardControllerServiceNode - HBase_1_1_2_ClientService[id=102e119a-19a2-1409-671f-dddd93a063de] Failed to invoke @OnEnabled method due to java.io.IOException: java.lang.reflect.InvocationTargetException
2017-05-09 13:34:47,664 ERROR [NiFi logging handler] org.apache.nifi.StdErr [StandardProcessScheduler Thread-2] ERROR org.apache.nifi.controller.service.StandardControllerServiceNode -
2017-05-09 13:34:47,664 ERROR [NiFi logging handler] org.apache.nifi.StdErr java.io.IOException: java.lang.reflect.InvocationTargetException
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.hbase.HBase_1_1_2_ClientService$1.run(HBase_1_1_2_ClientService.java:232)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.hbase.HBase_1_1_2_ClientService$1.run(HBase_1_1_2_ClientService.java:229)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.security.AccessController.doPrivileged(Native Method)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at javax.security.auth.Subject.doAs(Subject.java:422)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.hbase.HBase_1_1_2_ClientService.createConnection(HBase_1_1_2_ClientService.java:229)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.hbase.HBase_1_1_2_ClientService.onEnabled(HBase_1_1_2_ClientService.java:178)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at sun.reflect.GeneratedMethodAccessor706.invoke(Unknown Source)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.reflect.Method.invoke(Method.java:498)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137)
2017-05-09 13:34:47,665 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.nifi.controller.service.StandardControllerServiceNode$2.run(StandardControllerServiceNode.java:348)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.Thread.run(Thread.java:745)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr Caused by: java.lang.reflect.InvocationTargetException
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at sun.reflect.GeneratedConstructorAccessor457.newInstance(Unknown Source)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2017-05-09 13:34:47,666 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr ... 24 more
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr Caused by: java.lang.UnsupportedOperationException: Unable to find org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.ipc.RpcControllerFactory.instantiate(RpcControllerFactory.java:58)
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2242)
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:690)
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:630)
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr ... 28 more
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
2017-05-09 13:34:47,667 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
2017-05-09 13:34:47,668 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.Class.forName0(Native Method)
2017-05-09 13:34:47,668 ERROR [NiFi logging handler] org.apache.nifi.StdErr at java.lang.Class.forName(Class.java:264)
2017-05-09 13:34:47,668 ERROR [NiFi logging handler] org.apache.nifi.StdErr at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
2017-05-09 13:34:47,668 ERROR [NiFi logging handler] org.apache.nifi.StdErr ... 32 more
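The `ClassNotFoundException` at the bottom of that trace names a Phoenix class. A common cause (an assumption on my part, not confirmed in this thread) is that the hbase-site.xml copied from a Phoenix-enabled cluster carries a Phoenix-specific RPC controller setting that NiFi's HBase client cannot load. The property looks like this:

```xml
<!-- Fragment of hbase-site.xml; if present in the copy NiFi reads, the HBase
     client tries to load a Phoenix class that is not on NiFi's classpath. -->
<property>
  <name>hbase.rpc.controllerfactory.class</name>
  <value>org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory</value>
</property>
```

Removing that property from the copy of hbase-site.xml that NiFi uses (the HBase master and region servers keep their own copies) is one way past this error; the alternative is putting the Phoenix server jars on the client service's classpath.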
05-06-2017
03:47 AM
@ksuresh, my question is not about how to do the load again, but rather how I can get back the data from the table I have now; since this is a large table, I would rather not load it again. I did try the HCatalog route, but it was taking an extremely long time. Regarding the second option you mentioned, in the two-step process, the second step (writing data from the staging table to the ORC table) takes just as long as loading the staging table from Sqoop, correct?
05-05-2017
10:27 PM
Hello, I created a Hive table (text format, no partitions) using a Sqoop import. After creating the table, wanting to change it to ORC format, I ran "ALTER TABLE TESTDB.DST_TABLE SET FILEFORMAT ORC"; I later learned that this does NOT convert the existing data to ORC format. Now, when I try to read from the table with "select * from ...", it throws the error below:
"java.io.IOException: org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://haserver/apps/hive/warehouse/testdb.db/dst_table/part-m-00000.deflate. Invalid postscript. (state=,code=0)"
I can't do a CTAS either; that throws another error:
ERROR : Vertex failed, vertexName=Map 1, vertexId=vertex_1493740850622_0077_1_00, diagnostics=[Vertex vertex_1493740850622_0077_1_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: dst_table initializer failed, vertex=vertex_1493740850622_0077_1_00 [Map 1], java.lang.RuntimeException: serious problem
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1273)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getSplits(OrcInputFormat.java:1300)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:307)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:409)
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:155)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:273)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:266)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:266)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://haserver/apps/hive/warehouse/testdb.db/dst_table/part-m-00000.deflate. Invalid postscript.
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:188)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1268)
... 15 more
Caused by: org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://haserver/apps/hive/warehouse/testdb.db/dst_table/part-m-00000.deflate. Invalid postscript.
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.ensureOrcFooter(ReaderImpl.java:257)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.extractFileTail(ReaderImpl.java:384)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.<init>(ReaderImpl.java:321)
at org.apache.hadoop.hive.ql.io.orc.OrcFile.createReader(OrcFile.java:241)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.populateAndCacheStripeDetails(OrcInputFormat.java:1099)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.callInternal(OrcInputFormat.java:1001)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.access$2000(OrcInputFormat.java:838)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator$1.run(OrcInputFormat.java:992)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator$1.run(OrcInputFormat.java:989)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.call(OrcInputFormat.java:989)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.call(OrcInputFormat.java:838)
... 4 more
]
ERROR : DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0
Error: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1493740850622_0077_1_00, diagnostics=[Vertex vertex_1493740850622_0077_1_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: dst_switch_dates initializer failed, vertex=vertex_1493740850622_0077_1_00 [Map 1], java.lang.RuntimeException: serious problem
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1273)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getSplits(OrcInputFormat.java:1300)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:307)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:409)
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:155)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:273)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:266)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:266)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://haserver/apps/hive/warehouse/testdb.db/dst_table/part-m-00000.deflate. Invalid postscript.
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:188)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:1268)
... 15 more
Caused by: org.apache.hadoop.hive.ql.io.FileFormatException: Malformed ORC file hdfs://haserver/apps/hive/warehouse/testdb.db/dst_table/part-m-00000.deflate. Invalid postscript.
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.ensureOrcFooter(ReaderImpl.java:257)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.extractFileTail(ReaderImpl.java:384)
at org.apache.hadoop.hive.ql.io.orc.ReaderImpl.<init>(ReaderImpl.java:321)
at org.apache.hadoop.hive.ql.io.orc.OrcFile.createReader(OrcFile.java:241)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.populateAndCacheStripeDetails(OrcInputFormat.java:1099)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.callInternal(OrcInputFormat.java:1001)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.access$2000(OrcInputFormat.java:838)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator$1.run(OrcInputFormat.java:992)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator$1.run(OrcInputFormat.java:989)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.call(OrcInputFormat.java:989)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.call(OrcInputFormat.java:838)
... 4 more
]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0 (state=08S01,code=2)
It would not let me alter the table format back to TEXTFILE either: "Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Changing file format (from ORC) is not supported for table testdb.dst_table (state=08S01,code=1)"
What can I do to recover/repair the table, so I can get to the data in it? Thanks in advance.
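Since "ALTER TABLE ... SET FILEFORMAT" only changes metadata, the files under the table directory should still be the original text-format output from Sqoop (the .deflate files named in the trace). One possible recovery path, sketched below with illustrative table and column names (the column list and field delimiter are assumptions and must match what the Sqoop import actually wrote), is to point a new text-format table at the same data and convert from there:

```sql
-- 1. Create a text-format table over the existing data files
--    (schema and delimiter are placeholders; use your real ones).
CREATE EXTERNAL TABLE testdb.dst_table_text (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'hdfs://haserver/apps/hive/warehouse/testdb.db/dst_table';

-- 2. Verify the data is readable again.
SELECT * FROM testdb.dst_table_text LIMIT 10;

-- 3. Convert to ORC the supported way: rewrite the data, not just the metadata.
CREATE TABLE testdb.dst_table_orc STORED AS ORC AS
SELECT * FROM testdb.dst_table_text;
```

The EXTERNAL keyword matters here: it lets the new table read the existing files in place without taking ownership of them, so dropping it later will not delete the data.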
Labels:
- Apache Hive
05-05-2017
06:16 PM
1 Kudo
Hello, while I was trying to set up the HBase Client Service (disabling and enabling it a few times to edit the properties), the service got stuck in the "Disabling" state. When I select the only option now available, "Remove", it will not let me delete the service; it says it "cannot be deleted because it is not disabled". I tried creating a second client service, but that too ended up in the same situation. What I noticed is that even when enabling it, it never reached the "Enabled" state; it stayed at "Enabling". Our HDP environment is Kerberized, so I provided the necessary information in the HBase Client Service properties. Initially I did not supply the ZooKeeper info, because the documentation says it's required only if the Hadoop configuration files are not supplied. But when it showed an error ("The node /hbase is not in ZooKeeper. It should have been written by the master. Check the value configured in 'zookeeper.znode.parent'. There could be a mismatch with the one configured in the master."), I added the ZooKeeper info to the properties. As a side note, PutHDFS works perfectly fine with the same Hadoop configuration files, principal, and keytab. Please suggest how to make this work, thanks.
Labels:
- Apache NiFi
05-01-2017
08:21 PM
@Wynner Thank you for the MergeContent suggestion. We do use that right now, but since our requirement is to store data in HDFS as close to real time as possible, we wait no more than a minute in MergeContent; whatever is accumulated in that time frame, we write to HDFS. But would you please answer this question about the PutHDFS append option: once the desired HDFS file size is reached and I switch to a second PutHDFS processor that inserts data into a new file (as in the diagram), how does the file "close" happen on the previous file where the appends were happening? Does the PutHDFS processor (the "append" one) go back and close the file after some idle time, or does PutHDFS close the file after each append? If the latter (opening and closing for each append), that could degrade performance.