Hi there, I'm having the same issue, connecting to an S3-compatible (non-AWS) endpoint. I'm using stock Hadoop 2.6.3 with Java 1.8 (update 65). I had to add the AWS JAR to HADOOP_CLASSPATH. I have confirmed that the s3a:// connector works fine against AWS. However, when I set the fs.s3a.endpoint parameter to a different endpoint, it still queries AWS.

$ hadoop-2.6.3/bin/hadoop fs -ls s3a://foo:bar@some-bucket/

works for an AWS bucket, but fails with an auth error for a bucket on my S3-compatible storage. Here's my block in core-site.xml:

<property>
  <name>fs.s3a.endpoint</name>
  <description>AWS S3 endpoint to connect to. An up-to-date list is provided in the AWS Documentation: regions and endpoints. Without this property, the standard region (s3.amazonaws.com) is assumed.</description>
  <value>my.local.endpoint.tld</value>
</property>

Halp!?
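For reference, here is a minimal sketch of what I understand the relevant core-site.xml block to look like when the credentials are given as properties instead of being embedded in the URI (the endpoint, key, and secret values below are placeholders, not my real values, and the SSL setting is only an assumption for a local endpoint that speaks plain HTTP):

<!-- sketch only: placeholder values, assuming a non-AWS endpoint without SSL -->
<property>
  <name>fs.s3a.endpoint</name>
  <value>my.local.endpoint.tld</value>  <!-- placeholder: the non-AWS endpoint host -->
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>foo</value>  <!-- placeholder access key -->
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>bar</value>  <!-- placeholder secret key -->
</property>
<property>
  <name>fs.s3a.connection.ssl.enabled</name>
  <value>false</value>  <!-- assumption: local endpoint is plain HTTP -->
</property>

With the credentials in the configuration, the listing command would then be just hadoop fs -ls s3a://some-bucket/, with nothing in the URI.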