Member since: 01-20-2017
Posts: 39
Kudos Received: 6
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3014 | 03-09-2017 07:42 PM
06-06-2017 07:28 PM
@slachterman The main reason I want to add the credential provider to the Hadoop configs is that we plan to create Hive tables on top of S3 data so that authorized users (via Ranger policies) can access those tables. I tried passing hadoop.security.credential.provider.path as a parameter in the HiveServer2 connection, but that does not help get access to S3. I am getting the error below:

Error: Failed to open new session: org.apache.hive.service.cli.HiveSQLException: java.lang.IllegalArgumentException: Cannot modify hadoop.security.credential.provider.path at runtime. It is not in list of params that are allowed to be modified at runtime (state=,code=0)

To address this error, I added hadoop.security.credential.provider.path to hive.security.authorization.sqlstd.confwhitelist.append, but I still get the same error.
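For reference, a sketch of the whitelist entry as I understand it should look (assuming it goes into the custom hive-site section in Ambari; the value is treated as a Java regex, so the dots are escaped, and HiveServer2 needs a restart for it to take effect):

    hive.security.authorization.sqlstd.confwhitelist.append=hadoop\.security\.credential\.provider\.path

With that in place the parameter would go on the JDBC URL as a hiveconf entry, e.g. jdbc:hive2://hs2-host:10000/default?hadoop.security.credential.provider.path=jceks://hdfs/app/awss3/aws.jceks (hs2-host is a placeholder).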
06-05-2017 09:07 PM
Thanks @slachterman. What is the best way to pass credentials using the Hadoop credentials API? Per the documentation I have seen, I can pass them using fs.s3a.security.credential.provider.path, but this parameter is not working.
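In case it helps anyone reading along, a sketch of the generic alternative that does work from the CLI (same jceks path as in my other posts; my assumption is that the S3A-specific fs.s3a.security.credential.provider.path key is only honored by newer Hadoop releases, so on an older cluster it may simply be ignored):

    hadoop fs -Dhadoop.security.credential.provider.path=jceks://hdfs/app/awss3/aws.jceks -ls s3a://aws-test/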
06-05-2017 07:30 PM
Thanks @slachterman. I have followed the same article mentioned above. From the CLI I can successfully access S3 buckets by passing -Dhadoop.security.credential.provider.path=jceks://hdfs/app/awss3/aws.jceks, but if I configure the same in core-site.xml, the Hadoop services fail to start with the error below. So I configured fs.s3a.secret.key and fs.s3a.access.key instead, but then I get the error mentioned above.

Exception in thread "main" java.lang.StackOverflowError
at java.io.UnixFileSystem.getBooleanAttributes0(Native Method)
at java.io.UnixFileSystem.getBooleanAttributes(UnixFileSystem.java:242)
at java.io.File.exists(File.java:819)
at sun.misc.URLClassPath$FileLoader.getResource(URLClassPath.java:1245)
at sun.misc.URLClassPath$FileLoader.findResource(URLClassPath.java:1212)
at sun.misc.URLClassPath$1.next(URLClassPath.java:240)
at sun.misc.URLClassPath$1.hasMoreElements(URLClassPath.java:250)
at java.net.URLClassLoader$3$1.run(URLClassLoader.java:601)
at java.net.URLClassLoader$3$1.run(URLClassLoader.java:599)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader$3.next(URLClassLoader.java:598)
at java.net.URLClassLoader$3.hasMoreElements(URLClassLoader.java:623)
at sun.misc.CompoundEnumeration.next(CompoundEnumeration.java:45)
at sun.misc.CompoundEnumeration.hasMoreElements(CompoundEnumeration.java:54)
at java.util.ServiceLoader$LazyIterator.hasNextService(ServiceLoader.java:354)
at java.util.ServiceLoader$LazyIterator.hasNext(ServiceLoader.java:393)
at java.util.ServiceLoader$1.hasNext(ServiceLoader.java:474)
at javax.xml.parsers.FactoryFinder$1.run(FactoryFinder.java:293)
at java.security.AccessController.doPrivileged(Native Method)
at javax.xml.parsers.FactoryFinder.findServiceProvider(FactoryFinder.java:289)
at javax.xml.parsers.FactoryFinder.find(FactoryFinder.java:267)
at javax.xml.parsers.DocumentBuilderFactory.newInstance(DocumentBuilderFactory.java:120)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2549)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2526)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2418)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1232)
at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:675)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:286)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:274)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:804)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2920)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2910)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2776)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:377)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
at org.apache.hadoop.security.alias.JavaKeyStoreProvider.initFileSystem(JavaKeyStoreProvider.java:89)
at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.<init>(AbstractJavaKeyStoreProvider.java:82)
at org.apache.hadoop.security.alias.JavaKeyStoreProvider.<init>(JavaKeyStoreProvider.java:49)
at org.apache.hadoop.security.alias.JavaKeyStoreProvider.<init>(JavaKeyStoreProvider.java:41)
at org.apache.hadoop.security.alias.JavaKeyStoreProvider$Factory.createProvider(JavaKeyStoreProvider.java:100)
at org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:58)
at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:1959)
at org.apache.hadoop.conf.Configuration.getPassword(Configuration.java:1939)
at org.apache.hadoop.security.LdapGroupsMapping.getPassword(LdapGroupsMapping.java:621)
at org.apache.hadoop.security.LdapGroupsMapping.setConf(LdapGroupsMapping.java:564)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.security.Groups.<init>(Groups.java:99)
at org.apache.hadoop.security.Groups.<init>(Groups.java:95)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:420)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:297)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:274)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:804)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
at or
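Reading the trace bottom-up, the loop is visible: resolving the jceks://hdfs provider needs a FileSystem, FileSystem initialization triggers UserGroupInformation and LdapGroupsMapping setup, LdapGroupsMapping calls Configuration.getPassword, and that resolves the credential provider again. A sketch of one workaround (an assumption, not something verified here): keep the keystore on the local filesystem of every node and reference it with the local provider scheme in core-site.xml, so no HDFS access is needed during service startup (the /etc/hadoop/conf path is a placeholder):

    <property>
      <name>hadoop.security.credential.provider.path</name>
      <value>localjceks://file/etc/hadoop/conf/aws.jceks</value>
    </property>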
06-04-2017 11:01 PM
2 Kudos
Hi, we have a requirement to fetch data from Amazon S3 and are trying to configure HDFS to support that. So far we have configured the fs.s3a.secret.key and fs.s3a.access.key properties, passing the credentials as plain text, and that works fine. To meet our security requirements we are now trying to encrypt the credentials using the Hadoop credentials API, but the keys are failing to be read. We get the error below:

$ hadoop fs -ls s3a://aws-test/testfile.csv
ls: doesBucketExist on wework-mz: com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain: Unable to load AWS credentials from any provider in the chain

Steps done:

hadoop credential create fs.s3a.access.key -provider jceks://hdfs/app/awss3/aws.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs/app/awss3/aws.jceks

$ hadoop credential list -provider jceks://hdfs/app/awss3/aws.jceks
Listing aliases for CredentialProvider: jceks://hdfs/app/awss3/aws.jceks
fs.s3a.secret.key
fs.s3a.access.key

Note: NameNode is in HA
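For completeness, a sketch of how I expect the provider to be wired up once the aliases exist (the mycluster nameservice ID is a placeholder; my assumption is that with NameNode HA the jceks URI should reference the nameservice rather than a single NameNode host):

    <property>
      <name>hadoop.security.credential.provider.path</name>
      <value>jceks://hdfs@mycluster/app/awss3/aws.jceks</value>
    </property>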
04-26-2017 08:41 PM
Thanks @gnovak. I was able to extract the container-log4j.properties file as suggested and updated it to send container logs to a SocketAppender, but for some reason the containers are not writing logs to it. If I add the same SocketAppender parameters to the root logger from Ambari, the NodeManager service does send its logs to the socket appender. Any thoughts on how to configure containers to write logs to the socket appender using log4j? Below is the updated container-log4j:

hadoop.root.logger=DEBUG,CLA,SA
log4j.appender.SA=org.apache.log4j.net.SocketAppender
log4j.appender.SA.Port=4560
log4j.appender.SA.RemoteHost=localhost
log4j.appender.SA.ReconnectionDelay=10000
log4j.appender.SA.Application=NM-${yarn.app.container.log.dir}

Looking at the container launch script, I see the following being passed, which overrides the root logger set in the file:

-Dlog4j.configuration=container-log4j.properties -Dhadoop.root.logger=INFO,CLA
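One idea I am considering (a sketch only, not verified on HDP 2.5): since the launch command re-sets hadoop.root.logger to INFO,CLA, attach the appender outside that variable so it survives the override (CLA stays as already defined in the file):

    log4j.rootLogger=${hadoop.root.logger},SA
    log4j.appender.SA=org.apache.log4j.net.SocketAppender
    log4j.appender.SA.RemoteHost=localhost
    log4j.appender.SA.Port=4560
    log4j.appender.SA.ReconnectionDelay=10000

That way the SocketAppender stays on the root logger regardless of what -Dhadoop.root.logger passes in.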
04-25-2017 12:59 AM
In HDP 2.5 (managed using Ambari), how do I override the default container-log4j file? I am trying to add a SocketAppender for all container logs so that they can be sent to Logstash. In HDP 2.4.0 I can find the container-log4j file at /usr/hdp/2.4.0.0-169/hadoop/src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/resources/container-log4j.properties, but not in HDP 2.5. Any thoughts on how to configure container logs to include a socket appender would be great.
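In case the file is simply no longer unpacked on disk, a sketch of how the bundled default could be located and extracted from the NodeManager jar (the /usr/hdp/current path is an assumption based on the usual HDP layout):

    # find the jar that ships the default config
    for j in /usr/hdp/current/hadoop-yarn-nodemanager/*.jar; do
      unzip -l "$j" 2>/dev/null | grep -q container-log4j.properties && echo "$j"
    done

    # pull the file out for editing
    unzip -o <jar-from-above> container-log4j.properties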
03-13-2017 04:55 PM
@Sumesh @Josh Elser Thanks for your responses. Distcp from dev to prod worked without any issues, but when I tried an HBase CopyTable from prod to dev I found that dev was not correctly configured for cross-realm trust. After fixing dev to accept prod tickets, I am able to copy the data successfully.
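For anyone repeating this, a sketch of the CopyTable invocation I mean (the table name and ZooKeeper quorum below are placeholders, not our actual values):

    hbase org.apache.hadoop.hbase.mapreduce.CopyTable \
      --peer.adr=dev-zk1,dev-zk2,dev-zk3:2181:/hbase-secure \
      mytable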
03-09-2017 07:42 PM
Hi @Artem Ervits. I deployed Ambari server v2.4.2 on a brand-new machine and it worked. I am not sure what the issue was on the server where ambari-agent v2.2.2 was already installed; removing the agent and cleaning up the directories didn't completely resolve the issue there. Thanks a lot for your help!!
03-09-2017 05:58 PM
Hi @Sumesh, I have followed the same documentation for setting up the cross-realm trust. I think the issue is with configuring domain_realm. Can you please let me know how that needs to be configured?
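To make the question concrete, here is the shape of the mapping I believe is needed in krb5.conf on both clusters (the domains and realm names are placeholders):

    [domain_realm]
        .dev.example.com  = DEV.EXAMPLE.COM
        dev.example.com   = DEV.EXAMPLE.COM
        .prod.example.com = PROD.EXAMPLE.COM
        prod.example.com  = PROD.EXAMPLE.COM

i.e. each side maps the other cluster's host domain to its realm so tickets are requested from the right KDC.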
03-09-2017 12:38 AM
@Artem Ervits I have followed the steps listed above and the initial error is resolved, but now I see a new error message. (We are using RedHat 6.6 and Python 2.6.6.)

[root@pxnhd539 views]# ambari-server setup
Using python /usr/bin/python
Setup ambari-server
Traceback (most recent call last):
  File "/usr/sbin/ambari-server.py", line 37, in <module>
    from ambari_server.serverUpgrade import upgrade, upgrade_stack, set_current
  File "/usr/lib/python2.6/site-packages/ambari_server/serverUpgrade.py", line 50, in <module>
    from ambari_server.setupMpacks import replay_mpack_logs
  File "/usr/lib/python2.6/site-packages/ambari_server/setupMpacks.py", line 41, in <module>
    from resource_management.libraries.functions.tar_archive import extract_archive, get_archive_root_dir
ImportError: cannot import name extract_archive
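A sketch of the checks that might narrow this down (my assumption: extract_archive comes from the resource_management package installed by the Ambari agent/common RPMs, so an older ambari-agent left on this host could shadow the 2.4.2 server's copy):

    rpm -qa | grep -i ambari
    python -c 'from resource_management.libraries.functions.tar_archive import extract_archive'

If the versions differ, upgrading the agent to match the server before re-running ambari-server setup would be the thing to try.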