Member since 02-03-2016 · 22 Posts · 3 Kudos Received · 0 Solutions
07-26-2017
06:26 AM
Yes, I have HDP running on 7.3 with no problems as well. I do not expect any problems with 7.4, but I would like to know the official position.
07-26-2017
06:25 AM
That does not make sense. If it is not listed in the docs, that does not mean it will not work. 7.3 is not listed either, but HDP runs perfectly on it...
05-19-2017
12:16 AM
I need to disable the scheduler completely for safety reasons. Is it possible?
05-19-2017
12:05 AM
Is there a way to disable the scheduler and hide the button on the page?
05-06-2017
12:39 AM
I get the same error when attempting to upgrade from 2.5.3 to 2.6.0.3 using Ambari 2.5.0.3. Ambari is not using SSL, so it does not have its own truststore. How can I fix this pre-upgrade check problem?
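In case it helps anyone hitting the same pre-upgrade check: one workaround that has reportedly worked is to give Ambari a truststore even though it does not use SSL, since the check only looks for one to exist. A hedged sketch, assuming a JKS store is acceptable; the CA cert path, alias, and password below are illustrative, not from the original post:

```shell
# Illustrative paths and password -- adjust for your environment.
# Create a JKS truststore by importing any trusted CA certificate:
keytool -import -alias ambari-ca \
  -file /etc/pki/ca-trust/source/anchors/ca.crt \
  -keystore /etc/ambari-server/truststore.jks \
  -storepass changeit -noprompt

# Register the truststore with Ambari. This command is interactive:
# choose the "Setup truststore" option, then supply the path above,
# the type (jks), and the password.
ambari-server setup-security
ambari-server restart
```

After the restart, re-run the upgrade pre-checks to see whether the truststore check passes.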
02-09-2017
01:49 AM
Thejas, here is the stack trace:

> SELECT http_vendor,status FROM ex_httpvendorstats where cntry='us' and dt='2017-01-15'
> and hour='7' and prod_type='exchange' and http_vendor = 'neptunenormandy' limit 10;
Query ID = lfedotov_20170207223755_3becb17f-1bef-46c1-b9b9-9773e88566f9
Total jobs = 1
Launching Job 1 out of 1
--------------------------------------------------------------------------------
        VERTICES      STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
--------------------------------------------------------------------------------
Map 1                 FAILED      -1         0        0       -1       0       0
--------------------------------------------------------------------------------
VERTICES: 00/01  [>>--------------------------] 0%  ELAPSED TIME: 1486525056.00 s
--------------------------------------------------------------------------------
Status: Failed
Vertex failed, vertexName=Map 1, vertexId=vertex_1486178409721_43966_2_00, diagnostics=[Vertex vertex_1486178409721_43966_2_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: ex_httpvendorstats initializer failed, vertex=vertex_1486178409721_43966_2_00 [Map 1],
java.lang.NoClassDefFoundError: org/jets3t/service/S3ServiceException
    at org.apache.hadoop.fs.s3native.NativeS3FileSystem.createDefaultStore(NativeS3FileSystem.java:342)
    at org.apache.hadoop.fs.s3native.NativeS3FileSystem.initialize(NativeS3FileSystem.java:332)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:258)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229)
    at org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat.listStatus(AvroContainerInputFormat.java:42)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:307)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:409)
    at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:155)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:273)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:266)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:266)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.jets3t.service.S3ServiceException
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 26 more
]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1486178409721_43966_2_00 (same trace as above repeated)
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0
hive>
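For anyone else landing here: NoClassDefFoundError: org/jets3t/service/S3ServiceException usually means the s3n:// connector's jets3t library is not on the Hive/Tez task classpath, since NativeS3FileSystem depends on it. A hedged sketch of two common workarounds; the jar name and location are illustrative and vary by HDP version:

```shell
# Option 1: make the jets3t jar visible to Hive as an aux jar.
# Locate the jar first -- the name/path below is only an example:
find /usr/hdp -name 'jets3t*.jar'

# Then export it before starting the Hive session (or set
# hive.aux.jars.path in hive-site to the same path):
export HIVE_AUX_JARS_PATH=/usr/hdp/current/hadoop-client/lib/jets3t-0.9.0.jar

# Option 2: point the table location at s3a:// instead of s3n://.
# The s3a connector uses the AWS SDK and does not need jets3t at all.
```

Option 2 is generally the more durable fix, since s3n was deprecated in favor of s3a.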
02-08-2017
03:47 AM
I have HDP 2.5.0.0 with Ambari 2.4.1
02-08-2017
03:46 AM
I implemented the settings from this post: https://community.hortonworks.com/questions/74503/unable-to-write-hive-query-output-to-s3.html but the problem is still there. I cannot find any other solution for it. Any help would be very much appreciated!
02-03-2017
10:25 PM
What does this error mean, and how can I fix it?
01-24-2017
03:51 PM
Yes, but how can I start it if I do not have an ID? It is a mandatory parameter...
01-24-2017
07:58 AM
So, it is not possible to test drive it? Does this mean that SmartSense verifies the ID with the Hortonworks back end during startup? What if my cluster does not have access to the internet and cannot communicate with Hortonworks? How can I use it in that case?
01-24-2017
01:26 AM
I do not have a support contract right now and would like to try SmartSense. How can I do this? The SmartSense server does not start without a Customer account name and SmartSense ID.
11-29-2016
05:49 AM
Could you provide details? I have doAs set to True in Ambari; HiveServer2 (the regular one) uses it, but HiveServer2 Interactive (the new one, in Tech Preview) does not.
All queries submitted through the new one are executed as user hive. What else needs to be set in the configuration in order to have it working?
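One possible explanation, for anyone else seeing this: HiveServer2 Interactive keeps its own configuration (hive-interactive-site), separate from hive-site, so the regular hive.server2.enable.doAs value is not inherited by it. A hedged config sketch, assuming your tech preview build honors the flag at all; note that the LLAP-based HS2 Interactive was generally designed to run with doAs=false and rely on Ranger/SQL-standard authorization instead of end-user impersonation:

```xml
<!-- hive-interactive-site (Ambari: the hive-interactive-site config group) -->
<!-- Assumption: this property is respected by your build; many LLAP
     deployments require it to stay false. -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
```

If the flag is ignored, running queries as the hive service user with Ranger policies is the supported model for LLAP.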
11-20-2016
07:30 PM
So, the answer is "no" for Ambari support. Expected... What about a non-Ambari configuration? Is it possible? Is there any documentation on this?
11-20-2016
06:02 PM
@jss have you read the question before answering? I am clearly asking about the INTERACTIVE HiveServer2, which is part of Hive 2.1 and in tech preview on HDP 2.5. Your answer is about the good old Hive 1 HiveServer2.
02-18-2016
12:01 AM
1 Kudo
/etc/krb5.conf on which host?
02-16-2016
04:11 PM
1 Kudo
@Geoffrey Shelton Okot This is not the case. Kerberos is configured correctly and works perfectly from the other machines. There is something very specific to Richmond's laptop. Is there a way to get a debug log from the ODBC driver? We tried to turn logging on, but it does not produce any logs... This looks to me like some kind of communication problem between the laptop and the KDC... @Neeraj Sabharwal This is ODBC, not JDBC...
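On the "no logs" point: the Hortonworks Hive ODBC driver is Simba-based, and Simba drivers typically take LogLevel and LogPath keys in the driver configuration (on Windows, via the driver's "Logging Options" dialog in the DSN setup). A sketch for the driver config file on macOS/Linux; the filename, location, and keys are assumptions based on the Simba convention, so check your driver's install guide:

```
; hortonworks.hiveodbc.ini -- location varies by install
; (sometimes shipped under the driver's lib directory, sometimes
; referenced via the HORTONWORKSHIVEINI environment variable).
[Driver]
LogLevel=6                    ; assumed scale: 6 = trace, 0 = off
LogPath=/tmp/hiveodbc-logs    ; directory must already exist and be writable
```

If the log directory does not exist or is not writable, these drivers tend to fail silently, which would match the "no logs produced" symptom.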
02-03-2016
12:28 AM
I have MySQL, which only accepts SSL connections for user hive.
I have HiveServer2, which connects to MySQL on start, so I have to pass these parameters along with the jdbc:mysql:// connect string:
-Djavax.net.ssl.keyStore=/etc/hive/conf/keystore
-Djavax.net.ssl.keyStorePassword=P@ssw0rd
-Djavax.net.ssl.trustStore=/etc/hive/conf/truststore
-Djavax.net.ssl.trustStorePassword=P@ssw0rd
I cannot find where to put them in the Hive configs. javax.jdo.option.ConnectionURL does not accept appending them to the string. The same parameters work perfectly fine when passed to a Java application on the command line or set via System.setProperty. Is this even possible?
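In case it is useful to others: two approaches that commonly work, sketched under assumptions (host, paths, and passwords are illustrative). MySQL Connector/J can take the key/trust stores as URL properties, which avoids JVM-wide flags; alternatively, the -D flags can be injected into HiveServer2's JVM through hive-env instead of the connect string:

```shell
# Approach 1: Connector/J 5.1 SSL URL properties, set directly in
# javax.jdo.option.ConnectionURL (no -D flags needed):
#   jdbc:mysql://db-host:3306/hive?useSSL=true&requireSSL=true
#     &trustCertificateKeyStoreUrl=file:///etc/hive/conf/truststore
#     &trustCertificateKeyStorePassword=P@ssw0rd
#     &clientCertificateKeyStoreUrl=file:///etc/hive/conf/keystore
#     &clientCertificateKeyStorePassword=P@ssw0rd

# Approach 2: pass the -D flags to the HiveServer2 JVM via the
# hive-env template (Ambari: "Advanced hive-env"):
export HADOOP_OPTS="$HADOOP_OPTS \
  -Djavax.net.ssl.keyStore=/etc/hive/conf/keystore \
  -Djavax.net.ssl.keyStorePassword=P@ssw0rd \
  -Djavax.net.ssl.trustStore=/etc/hive/conf/truststore \
  -Djavax.net.ssl.trustStorePassword=P@ssw0rd"
```

Approach 1 is usually preferable because the javax.net.ssl system properties are process-wide and can interfere with other SSL connections the JVM makes.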