Member since: 01-13-2016
Posts: 6
Kudos Received: 2
Solutions: 1

My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 28660 | 07-16-2016 01:20 AM |
09-30-2024 07:40 AM
RPC communication with the non-secure cluster is blocked, so use the webhdfs protocol instead: hadoop distcp -D ipc.client.fallback-to-simple-auth-allowed=true webhdfs://nn1:50070/foo/bar hdfs://nn2:8020/bar/foo
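A minimal sketch of that copy, run from a client on the secure cluster; nn1, nn2 and the paths are the placeholders from the command above:

```bash
# Run from the secure (Kerberized) cluster. The -D flag lets the secure
# client fall back to simple auth when talking to the insecure cluster;
# the source is read over WebHDFS (NameNode HTTP port 50070), the target
# is written over HDFS RPC (port 8020).
hadoop distcp \
  -D ipc.client.fallback-to-simple-auth-allowed=true \
  webhdfs://nn1:50070/foo/bar \
  hdfs://nn2:8020/bar/foo
```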
04-05-2018 05:54 AM
Step 1: Add these two properties to core-site.xml:
<property>
  <name>fs.s3a.access.key</name>
  <value>your AWS IAM user access key</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>your AWS IAM user secret key</value>
</property>

Step 2: Add the S3 bucket endpoint property to core-site.xml. Before you add it, check your S3 bucket's region (one way to check it is sketched below). For example, my bucket is in the Mumbai region: https://s3.ap-south-1.amazonaws.com/bucketname/foldername/filename.csv
<property>
  <name>fs.s3a.endpoint</name>
  <value>s3.ap-south-1.amazonaws.com</value>
</property>
Note: otherwise you get a 400 Bad Request error:
WARN s3a.S3AFileSystem: Client: Amazon S3 error 400: 400 Bad Request; Bad Request
com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request)
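If you are not sure which region the bucket is in, a quick way to check before setting fs.s3a.endpoint is the AWS CLI (assuming it is installed and configured; the bucket name is a placeholder):

```bash
# Prints the bucket's region (LocationConstraint); a null value means us-east-1.
aws s3api get-bucket-location --bucket yourbucketname
# Map the region to the endpoint, e.g. ap-south-1 -> s3.ap-south-1.amazonaws.com
```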
Step 3: Add the hadoop.security.credential.provider.path property to core-site.xml. For this you can store the access key and secret key in a credential file on an HDFS path (using the Hadoop credential API to store AWS secrets). For example, run these commands as the hdfs user:
I. hdfs dfs -chown s3_access:hdfs /user/s3_access
II. hadoop credential create fs.s3a.access.key -value aws-IAM-user_accesskey -provider jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks
III. hadoop credential create fs.s3a.secret.key -value aws-IAM-user_secretkey -provider jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks
IV. hadoop credential list -provider jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks
You will get output like this:
Listing aliases for CredentialProvider: jceks://hdfs@13.229.32.224:8020/user/s3_access/s3.jceks
fs.s3a.secret.key
fs.s3a.access.key
You have now stored the AWS secrets in a Hadoop credential store. Then set the ownership and permissions of the credential file:
hdfs dfs -chown s3_access:hdfs /user/s3_access/s3.jceks
hdfs dfs -chmod 666 /user/s3_access/s3.jceks
Finally, add the property to core-site.xml:
<property>
  <name>hadoop.security.credential.provider.path</name>
  <value>jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks</value>
</property>

Step 4: Restart the Ambari server: ambari-server restart
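If you prefer not to set the credential provider cluster-wide in core-site.xml, the provider path can also be passed per job on the command line; a sketch, reusing the jceks path and placeholder bucket/paths from above:

```bash
# Supply the credential provider only for this job instead of cluster-wide.
hadoop distcp \
  -D hadoop.security.credential.provider.path=jceks://hdfs@10.22.121.0:8020/user/s3_access/s3.jceks \
  s3a://yourbucketname/foldername/filename.csv \
  hdfs://10.22.121.0:8020/your_hdfs_folder
```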
Then verify access and copy the file:
hadoop fs -ls s3a://yourbucketname/folder/file.csv
hadoop distcp s3a://yourbucketname/foldername/filename.csv hdfs://10.22.121.0:8020/your_hdfs_folder
Follow this link: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP2.6.2/bk_cloud-data-access/content/s3-config-props.html
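A small smoke-test sketch that runs both checks in order and stops on the first failure (bucket name and paths are the placeholders used above):

```bash
#!/usr/bin/env bash
set -e  # stop at the first failing command

# 1) Can we list the object through the s3a connector?
hadoop fs -ls s3a://yourbucketname/folder/file.csv

# 2) Can we copy it into HDFS?
hadoop distcp s3a://yourbucketname/foldername/filename.csv \
  hdfs://10.22.121.0:8020/your_hdfs_folder

echo "S3A listing and distcp both succeeded"
```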