New Contributor
Posts: 2
Registered: 01-07-2018

Integration with Hitachi Object Storage (HCP)

Hi guys,

I need to integrate Hitachi Object Storage (HCP) with Cloudera 5.12.

I read a whitepaper from Hitachi which says that HCP can be treated as AWS S3-compatible object storage:

https://community.hds.com/docs/DOC-1000184

I already followed the instructions from that whitepaper, but still no luck:

[root@quickstart hadoop]# hadoop fs -Dfs.s3a.access.key="dGVYW50MQ==" -Dfs.s3a.secret.key="161ebd7d45089b3446ee4e06dbcf92" -ls s3a://cloudera
ls: doesBucketExist on cloudera: com.amazonaws.SdkClientException: Unable to execute HTTP request: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target: Unable to execute HTTP request: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
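(The PKIX error above indicates that the JVM does not trust the SSL certificate presented by the HTTPS endpoint. For reference, a common workaround for a self-signed HCP certificate is to import it into the Java truststore; this is only a sketch, and the endpoint hostname, file name, and alias below are assumptions:)

# Fetch the certificate presented by the endpoint (hostname is an assumption)
openssl s_client -connect tenant1.hcp.example.com:443 </dev/null 2>/dev/null | openssl x509 -outform PEM > hcp.pem
# Import it into the default JVM truststore (default keystore password: changeit)
keytool -importcert -trustcacerts -alias hcp -file hcp.pem -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit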

 

Has anybody already succeeded with this object storage integration? Please help.

thanks,

jimmy

Posts: 1,568
Kudos: 293
Solutions: 240
Registered: 07-31-2013

Re: Integration with Hitachi Object Storage (HCP)

You will need to specify your custom endpoint URL too, besides the credentials, just as is done on page 12 of the document you've referenced, but with the property 'fs.s3a.endpoint' instead (for s3a). See
http://archive.cloudera.com/cdh5/cdh/5/hadoop/hadoop-project-dist/hadoop-common/core-default.xml#fs....

Without specifying your custom endpoint URL, the requests will go to Amazon's S3 servers instead (the default).
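For example, on the command line (the HCP tenant hostname below is an assumption; substitute your own endpoint):

hadoop fs -Dfs.s3a.access.key="dGVYW50MQ==" -Dfs.s3a.secret.key="161ebd7d45089b3446ee4e06dbcf92" -Dfs.s3a.endpoint="tenant1.hcp.example.com" -ls s3a://cloudera
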
Backline Customer Operations Engineer
New Contributor
Posts: 2
Registered: 01-07-2018

Re: Integration with Hitachi Object Storage (HCP)

Actually, I already added the endpoint in the S3 Connector service through Cloudera Manager:

[Screenshot: S3 Connector Endpoint.png]
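(For reference, the equivalent client-side property in core-site.xml, or via a Cloudera Manager safety valve, would look along these lines; the endpoint value here is an assumption:)

<property>
  <name>fs.s3a.endpoint</name>
  <value>tenant1.hcp.example.com</value>
</property>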

 

 
