Member since: 02-03-2016
Posts: 22
Kudos Received: 3
Solutions: 0
07-26-2017
06:26 AM
Yes, I have HDP running on 7.3 with no problems as well. I do not expect any problems with 7.4, but I would like to know the official position.
07-26-2017
06:25 AM
That does not make sense. If it is not listed in the docs, that does not mean it will not work. 7.3 was not listed either, but HDP runs perfectly on it...
05-19-2017
12:05 AM
Is there a way to disable the scheduler and not show the button on the page?
02-09-2017
01:49 AM
Thejas, here is the stack trace:

> SELECT http_vendor,status FROM ex_httpvendorstats where cntry='us' and dt='2017-01-15'
> and hour='7' and prod_type='exchange' and http_vendor = 'neptunenormandy' limit 10;
Query ID = lfedotov_20170207223755_3becb17f-1bef-46c1-b9b9-9773e88566f9
Total jobs = 1
Launching Job 1 out of 1
--------------------------------------------------------------------------------
        VERTICES      STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
--------------------------------------------------------------------------------
Map 1                 FAILED     -1          0        0       -1       0       0
--------------------------------------------------------------------------------
VERTICES: 00/01  [>>--------------------------] 0%  ELAPSED TIME: 1486525056.00 s
--------------------------------------------------------------------------------
Status: Failed
Vertex failed, vertexName=Map 1, vertexId=vertex_1486178409721_43966_2_00, diagnostics=[Vertex vertex_1486178409721_43966_2_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: ex_httpvendorstats initializer failed, vertex=vertex_1486178409721_43966_2_00 [Map 1], java.lang.NoClassDefFoundError: org/jets3t/service/S3ServiceException
    at org.apache.hadoop.fs.s3native.NativeS3FileSystem.createDefaultStore(NativeS3FileSystem.java:342)
    at org.apache.hadoop.fs.s3native.NativeS3FileSystem.initialize(NativeS3FileSystem.java:332)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:258)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229)
    at org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat.listStatus(AvroContainerInputFormat.java:42)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:307)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:409)
    at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:155)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:273)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:266)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:266)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.jets3t.service.S3ServiceException
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 26 more
]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1486178409721_43966_2_00, diagnostics=[Vertex vertex_1486178409721_43966_2_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: ex_httpvendorstats initializer failed, vertex=vertex_1486178409721_43966_2_00 [Map 1], java.lang.NoClassDefFoundError: org/jets3t/service/S3ServiceException
    at org.apache.hadoop.fs.s3native.NativeS3FileSystem.createDefaultStore(NativeS3FileSystem.java:342)
    at org.apache.hadoop.fs.s3native.NativeS3FileSystem.initialize(NativeS3FileSystem.java:332)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:258)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229)
    at org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat.listStatus(AvroContainerInputFormat.java:42)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:307)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:409)
    at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:155)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:273)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:266)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:266)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.jets3t.service.S3ServiceException
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 26 more
]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0
hive>
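The root cause shown in the trace is java.lang.ClassNotFoundException: org.jets3t.service.S3ServiceException, i.e. the jets3t library that backs the legacy s3n:// filesystem is not on the classpath used by Hive and the Tez application master. A minimal sketch of one way to expose it, assuming the jar ships under /usr/hdp/current/hadoop-client/lib (path and jar version are assumptions to verify on the cluster):

# Make the jar visible to the Hive client / HiveServer2 (jar name is an assumption):
export HIVE_AUX_JARS_PATH=/usr/hdp/current/hadoop-client/lib/jets3t-0.9.0.jar:${HIVE_AUX_JARS_PATH}

# For the Tez AM and its containers, the same jar can be appended to the Tez classpath
# via tez.cluster.additional.classpath.prefix in tez-site.xml, e.g.:
#   <property>
#     <name>tez.cluster.additional.classpath.prefix</name>
#     <value>/usr/hdp/current/hadoop-client/lib/jets3t-0.9.0.jar</value>
#   </property>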
02-08-2017
03:47 AM
I have HDP 2.5.0.0 with Ambari 2.4.1
02-08-2017
03:46 AM
I implemented the settings from this post: https://community.hortonworks.com/questions/74503/unable-to-write-hive-query-output-to-s3.html but the problem is still there. I cannot find any other solution for it. Any help would be very much appreciated!
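A possible alternative, as a hedged sketch only (not necessarily the fix from the linked post): move the table location from the legacy s3n:// connector to s3a://, which does not depend on jets3t. It assumes the hadoop-aws and AWS SDK jars ship under /usr/hdp/current/hadoop-client/lib; exact file names and versions need checking on the cluster.

# Credentials for s3a, e.g. in core-site.xml (standard Hadoop property names):
#   fs.s3a.access.key = <access key>
#   fs.s3a.secret.key = <secret key>

# Make the connector jars available to the Hive session:
hive --hiveconf hive.aux.jars.path=file:///usr/hdp/current/hadoop-client/lib/hadoop-aws.jar,file:///usr/hdp/current/hadoop-client/lib/aws-java-sdk.jar

# Then point the table (and its partitions, if any) at s3a:// instead of s3n://, e.g.:
#   ALTER TABLE ex_httpvendorstats SET LOCATION 's3a://<bucket>/<path>';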
Labels:
- Apache Hive
02-03-2017
10:25 PM
What does this error mean, and how can I fix it?
Labels:
- Apache YARN
11-20-2016
07:30 PM
So, the answer is "no" for Ambari support. Expected... What about a non-Ambari configuration? Is it possible? Are there any docs on this?
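For reference, the manual (non-Ambari) route in the HDP 2.5 tech preview is Slider-based. A hedged sketch; the exact flags and sizing below are assumptions, so check hive --service llap --help and the Hive 2.1 / LLAP docs:

# Generate an LLAP application package (name and sizes are illustrative):
hive --service llap --name llap0 --instances 2 --size 6g --xmx 4g --cache 2g --executors 4

# This writes a llap-slider-<date> directory containing run.sh,
# which launches the LLAP daemons through Apache Slider:
cd llap-slider-*/ && ./run.sh

# HiveServer2 Interactive is then pointed at the daemons, e.g. in hive-site.xml:
#   hive.execution.mode            = llap
#   hive.llap.execution.mode       = all
#   hive.llap.daemon.service.hosts = @llap0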
11-20-2016
06:02 PM
@jss, have you read the question before answering? I am clearly asking about INTERACTIVE HiveServer2, which is part of Hive 2.1 and is in tech preview on HDP 2.5. Your answer is about the good old boring Hive 1 HiveServer2.