
Integration with Hitachi Object Storage (HCP)

New Contributor

Hi guys,

I need to integrate Hitachi Object Storage (HCP) with Cloudera 5.12.

I read a whitepaper from Hitachi which says HCP can be treated as AWS S3-compatible object storage:

https://community.hds.com/docs/DOC-1000184

I already followed the instructions from that whitepaper, but still no luck:

[root@quickstart hadoop]# hadoop fs -Dfs.s3a.access.key="dGVYW50MQ==" -Dfs.s3a.secret.key="161ebd7d45089b3446ee4e06dbcf92" -ls s3a://cloudera
ls: doesBucketExist on cloudera: com.amazonaws.SdkClientException: Unable to execute HTTP request: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target: Unable to execute HTTP request: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
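
From searching on that PKIX error, my guess is that the JVM simply does not trust the TLS certificate presented by the HCP endpoint (ours is self-signed, I think). If that is the cause, something like the two commands below should fetch the certificate and import it into the JVM truststore (the hostname, file name, and alias are just placeholders for my environment, and I am assuming the default cacerts password "changeit"):

[root@quickstart hadoop]# openssl s_client -connect tenant1.hcp.example.com:443 -showcerts </dev/null 2>/dev/null | openssl x509 -outform PEM > /tmp/hcp.pem
[root@quickstart hadoop]# keytool -importcert -trustcacerts -alias hcp -file /tmp/hcp.pem -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit

Is that the right direction?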

 

Has anybody already succeeded with this Object Storage integration? Please help.

Thanks,
Jimmy

3 REPLIES

Mentor
You will need to specify your custom endpoint URL too, besides the credentials, just as is done on page 12 of the document you've referenced, but with the property 'fs.s3a.endpoint' instead (for s3a). See
http://archive.cloudera.com/cdh5/cdh/5/hadoop/hadoop-project-dist/hadoop-common/core-default.xml#fs....

Without specifying your custom endpoint URL, the requests will go to Amazon's S3 servers instead (the default).
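
For a quick test from the shell, you can pass the endpoint the same way as the credentials; the hostname below is only a placeholder for your actual HCP tenant endpoint:

hadoop fs -Dfs.s3a.endpoint="tenant1.hcp.example.com" -Dfs.s3a.access.key="dGVYW50MQ==" -Dfs.s3a.secret.key="161ebd7d45089b3446ee4e06dbcf92" -ls s3a://cloudera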

New Contributor

 

Actually, I already added the endpoint in the S3 Connector service through Cloudera Manager.

 

[Screenshot: S3 Connector Endpoint.png]
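
For reference, the value I configured should end up in core-site.xml as the property below (the hostname is a placeholder for my actual tenant endpoint):

<property>
  <name>fs.s3a.endpoint</name>
  <value>tenant1.hcp.example.com</value>
</property>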

 

 

New Contributor

Hi Jimmy,

Were you able to integrate Cloudera with HCP, or are you still struggling?