Integration with Hitachi Object Storage (HCP)

New Contributor

Hi guys,

I need to integrate Hitachi Object Storage (HCP) with Cloudera 5.12.

I read a whitepaper from Hitachi which says HCP can be treated as AWS S3-compatible object storage:

https://community.hds.com/docs/DOC-1000184

I already followed the instructions from that whitepaper, but still no luck:

[root@quickstart hadoop]# hadoop fs -Dfs.s3a.access.key="dGVYW50MQ==" -Dfs.s3a.secret.key="161ebd7d45089b3446ee4e06dbcf92" -ls s3a://cloudera
ls: doesBucketExist on cloudera: com.amazonaws.SdkClientException: Unable to execute HTTP request: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target: Unable to execute HTTP request: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

Has anybody already succeeded with this object storage integration? Please help.

Thanks,

Jimmy

4 REPLIES

Mentor
You will need to specify your custom endpoint URL too, besides the credentials,
just as is done on page 12 of the document you've referenced, but
with the property 'fs.s3a.endpoint' instead (for s3a). See
http://archive.cloudera.com/cdh5/cdh/5/hadoop/hadoop-project-dist/hadoop-common/core-default.xml#fs....

Without specifying your custom endpoint URL, the requests will go to
Amazon's own S3 servers instead (the default).
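
For example (a sketch, not verified against HCP), the same test could pass the endpoint on the command line; the hostname below is a placeholder for your actual HCP tenant endpoint, and depending on your Hadoop version a non-AWS store may also need fs.s3a.path.style.access=true:

hadoop fs -Dfs.s3a.access.key="dGVYW50MQ==" -Dfs.s3a.secret.key="161ebd7d45089b3446ee4e06dbcf92" -Dfs.s3a.endpoint="tenant.hcp.example.com" -ls s3a://cloudera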

New Contributor

Actually, I already added the endpoint in the S3 Connector service through Cloudera Manager:

[Attachment: S3 Connector Endpoint.png]

New Contributor

Hi Jimmy,

Were you able to integrate Cloudera with HCP, or are you still struggling?

New Contributor

I actually ran into this same issue when integrating Hitachi HCP with Cloudera 5.12. It was super frustrating at first, but I finally got it working. Here's what worked for me:

The problem you're seeing (PKIX path building failed) is related to SSL certificate validation. HCP’s certificate might not be recognized by your Java truststore, so Java blocks the connection.

Here’s what I did to fix it:

1. Download the SSL certificate from your HCP endpoint (see the command after this list).
2. Import that certificate into your Java truststore using the keytool command:
   sudo keytool -import -alias hcp-cert -keystore $JAVA_HOME/jre/lib/security/cacerts -file /path/to/hcp-cert.pem
3. Restart the Cloudera services after the import.
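
For step 1, one way to fetch the certificate in PEM format (assuming the HCP endpoint serves HTTPS on port 443; the hostname is a placeholder for yours) is:

openssl s_client -connect tenant.hcp.example.com:443 -showcerts </dev/null 2>/dev/null | openssl x509 -outform PEM > /path/to/hcp-cert.pem

Also note that keytool will prompt for the truststore password, which is "changeit" by default for the JDK cacerts file.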

Also, in case you're just trying things out, you could temporarily switch the connection to plain HTTP (not for production though, and only if your HCP endpoint accepts HTTP), which sidesteps certificate validation entirely, by adding this parameter:

-Dfs.s3a.connection.ssl.enabled=false
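
Put together, a quick plain-HTTP test would look like this (endpoint again a placeholder):

hadoop fs -Dfs.s3a.access.key="dGVYW50MQ==" -Dfs.s3a.secret.key="161ebd7d45089b3446ee4e06dbcf92" -Dfs.s3a.endpoint="tenant.hcp.example.com" -Dfs.s3a.connection.ssl.enabled=false -ls s3a://cloudera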

Hope that helps!