Member since
05-20-2016
155
Posts
220
Kudos Received
30
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 7062 | 03-23-2018 04:54 AM |
|  | 2592 | 10-05-2017 02:34 PM |
|  | 1427 | 10-03-2017 02:02 PM |
|  | 8318 | 08-23-2017 06:33 AM |
|  | 3128 | 07-27-2017 10:20 AM |
12-27-2016
11:09 AM
2 Kudos
@rdoktorics Thanks, this works!
12-27-2016
07:00 AM
1 Kudo
@Constantin Stanca I see this issue on both fresh and upgraded systems; moving from s3n to s3a let me upload to S3.
12-27-2016
06:58 AM
1 Kudo
@Rajkumar Singh Thanks. I could see that jets3t-0.9.0.jar is loaded. Also, following @Rajesh Balamohan's suggestion of moving from s3n to s3a, I got it working.
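For reference, the fix described here is only a scheme change in the output URI; nothing else in the query changes. A minimal sketch, rewriting the statement from the original question in this thread:

```shell
# The working change was s3n -> s3a in the target directory URI.
# Bucket path is the one from the question below; the query is unchanged.
query="INSERT OVERWRITE DIRECTORY 's3n://santhosh.aws.com/tmp' SELECT * FROM REGION"
printf '%s\n' "$query" | sed "s|s3n://|s3a://|"
```

The s3a connector ships its dependencies in hadoop-aws rather than relying on the older jets3t library, which is why the ClassNotFoundException goes away.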
12-26-2016
03:19 PM
1 Kudo
@Rajkumar Singh I can see the jars in the specified location; how do we check whether Hive is loading these jars?
ls -lrt /usr/hdp/2.5.3.0-14/hadoop/lib/jets3t-0.9.0.jar
-rw-r--r--. 1 root root 539735 Nov 10 18:00 /usr/hdp/2.5.3.0-14/hadoop/lib/jets3t-0.9.0.jar
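A file existing on disk does not prove it is on the runtime classpath. One quick way to check is to filter the classpath for the jar name — a sketch, where on a live node you would pass `"$(hadoop classpath)"` as the first argument instead of the sample string used here:

```shell
# classpath_has: print every entry of a ':'-separated classpath that
# matches a pattern (case-insensitive). Empty output means "not loaded".
classpath_has() {
  printf '%s' "$1" | tr ':' '\n' | grep -i -- "$2"
}

# Sample string mirroring the HDP 2.5 layout from this thread; replace
# with "$(hadoop classpath)" on a real node.
sample="/usr/hdp/2.5.3.0-14/hadoop/conf:/usr/hdp/2.5.3.0-14/hadoop/lib/jets3t-0.9.0.jar"
classpath_has "$sample" jets3t
```

Note that Hive and Tez containers can have a narrower classpath than the `hadoop` CLI, so a hit here still may not guarantee the jar reaches the query's JVM.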
12-26-2016
01:00 PM
1 Kudo
@Rajkumar Singh Thank you for your reply. Shouldn't HDP take care of packaging this correctly? I see this issue in HDP 2.5.
12-26-2016
12:25 PM
2 Kudos
I am on HDP 2.5, and when trying to write Hive query output to S3 I get the exception below.
Caused by: java.lang.NoClassDefFoundError: org/jets3t/service/ServiceException
at org.apache.hadoop.fs.s3native.NativeS3FileSystem.createDefaultStore(NativeS3FileSystem.java:342)
at org.apache.hadoop.fs.s3native.NativeS3FileSystem.initialize(NativeS3FileSystem.java:332)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:348)
at org.apache.hadoop.hive.ql.exec.vector.VectorFileSinkOperator.initializeOp(VectorFileSinkOperator.java:70)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:363)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:482)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:439)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:482)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:439)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
at org.apache.hadoop.hive.ql.exec.MapOperator.initializeMapOperator(MapOperator.java:489)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.init(MapRecordProcessor.java:231)
... 15 more
Caused by: java.lang.ClassNotFoundException: org.jets3t.service.ServiceException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 34 more
Below is what I ran from the Hive shell:
INSERT OVERWRITE DIRECTORY 's3n://santhosh.aws.com/tmp'
SELECT * FROM REGION
Is the jets3t library part of the Hive classpath?
Labels:
- Apache Hive
12-26-2016
05:29 AM
4 Kudos
The shell command below works for me with both the Ambari and HDP repos: cluster create --version 2.X --stackRepoId HDP-2.X --stackBaseURL http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos7/2.x/BUILDS/2.X.X.0-154 --utilsRepoId HDP-UTILS-1.1.0.21 --utilsBaseURL http://s3.amazonaws.com/dev.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7 --stack HDP --verify true --os redhat7 --ambariRepoGpgKey http://s3.amazonaws.com/dev.hortonworks.com/ambari/centos6/2.x/BUILDS/2.X.X.0-524/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins --ambariRepoBaseURL http://s3.amazonaws.com/dev.hortonworks.com/ambari/centos6/2.x/BUILDS/2.X.X.0-524 --ambariVersion 2.X.X.0-524 --enableSecurity true --kerberosMasterKey master --kerberosAdmin admin --kerberosPassword admin --wait true
12-25-2016
02:45 PM
3 Kudos
You could also do the same via the Cloudbreak shell using the command below: cluster create --version 2.X --stackRepoId HDP-2.X --stackBaseURL http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos7/2.x/BUILDS/2.X.X.0-154 --utilsRepoId HDP-UTILS-1.1.0.21 --utilsBaseURL http://s3.amazonaws.com/dev.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7 --stack HDP --verify true --os redhat7 --ambariRepoGpgKey http://s3.amazonaws.com/dev.hortonworks.com/ambari/centos6/2.x/BUILDS/2.X.X.0-524/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins --ambariRepoBaseURL http://s3.amazonaws.com/dev.hortonworks.com/ambari/centos6/2.x/BUILDS/2.X.X.0-524 --ambariVersion 2.X.X.0-524 --enableSecurity true --kerberosMasterKey master --kerberosAdmin admin --kerberosPassword admin --wait true
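To run the same command non-interactively, one sketch is to put it in a file and pass it via `--cmdfile`; the file path matches a later post on this profile, the `cluster create` line here is abbreviated for illustration, and the `java -jar` invocation is shown only as a comment since it needs a live Cloudbreak host:

```shell
# Write a Cloudbreak shell command to a file for --cmdfile use.
# On a real host, paste the full 'cluster create ...' line above into the
# heredoc, then run:
#   java -jar /tmp/cloudbreak-shell.jar --cmdfile /tmp/cmdfile
cat > /tmp/cmdfile <<'EOF'
cluster create --version 2.X --stack HDP --wait true
EOF
cat /tmp/cmdfile
```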
12-24-2016
07:30 AM
2 Kudos
Does the Cloudbreak shell provide a mechanism for automation? That is, when executing the command below, is it possible to "echo" the response to stdout?
java -jar /tmp/cloudbreak-shell.jar --sequenceiq.user=admin@example.com --sequenceiq.password=passowrd --identity.address=https://XX.XX.XX.XX/identity --cloudbreak.address=https://XX.XX.XX.XX --cert.validation=false --cmdfile /tmp/cmdfile
This is what I am trying to achieve via automation:
1. Send a cmdfile with "credential list" -- this should return the list of credentials.
2. If my credential is not present, send another cmdfile with "credential create".
The problem is that when I run cloudbreak-shell via --cmdfile, the response of each command is not redirected to stdout, so I am unable to go this route.
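The conditional logic in steps 1 and 2 can be sketched as a plain shell helper, assuming the "credential list" output can be captured to a file at all (which, per this post, is exactly the open problem). The file name and credential name below are made up for illustration:

```shell
# Hypothetical sketch of steps 1-2: emit a 'credential create' command
# only when the credential name is absent from a captured listing.
run_if_missing() {
  listing=$1; name=$2; create_cmd=$3
  if ! grep -q -- "$name" "$listing"; then
    printf '%s\n' "$create_cmd"
  fi
}

# Simulated 'credential list' output that does NOT contain our credential.
printf 'NAME\nsome-other-cred\n' > /tmp/cred_list.txt
run_if_missing /tmp/cred_list.txt my-credential "credential create --name my-credential"
```

The emitted line would then be written into a second cmdfile for another cloudbreak-shell run.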
Labels:
- Hortonworks Cloudbreak