SAM Error Deploying due to Dependency Resolution Error

Contributor

I get the following error with HDF 3.0.1 when deploying an application.

I'm on a single server that uses a proxy.

I can see it is a known error, but there's no real workaround:

https://community.hortonworks.com/questions/118255/streaming-analytics-manager-fail-to-deploy-applic...

The issue is STORM-2598. But Storm 1.2.0 is not out yet. Help!

ERROR:

Topology submission failed due to: java.lang.Exception: Topology could not be deployed successfully: storm deploy command failed with Exception in thread "main" java.lang.RuntimeException: org.eclipse.aether.resolution.DependencyResolutionException: Failed to read artifact descriptor for org.apache.kafka:kafka-clients:jar:0.10.2.1

7 REPLIES

Cloudera Employee

I believe the fix will be provided in HDF 3.0.2. Until that version comes out, you can try the workaround below:

  1. Please check whether there's a .m2 directory in the 'storm' user's home directory.
    1.a. If it exists, place the artifacts in that directory.
    1.b. If it doesn't exist, go to step 2.
  2. Please check whether there's a 'local-repo' directory under either the SAM (streamline) or the Storm installation directory.
    2.a. If it exists, place the artifacts in that directory.
    2.b. If it doesn't exist, create a .m2 directory in the 'storm' user's home directory and apply 1.a.

Please note that the artifacts (jar/pom) should be placed along with their metafile (_remote.repositories), and the repository name in that metafile should be either 'hwx-public' or 'hwx-private' unless the artifact came from the Maven public repository.
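For illustration only, here is a minimal shell sketch of step 1.a for the kafka-clients artifact from the error above. The staging directory /tmp/artifacts is a placeholder, and the sketch assumes the jar, pom, and _remote.repositories metafile were already obtained from an installation that could download them:

    # Resolve the 'storm' user's home directory and the standard Maven layout
    # path for org.apache.kafka:kafka-clients:0.10.2.1.
    STORM_HOME=$(getent passwd storm | cut -d: -f6)
    DEST="$STORM_HOME/.m2/repository/org/apache/kafka/kafka-clients/0.10.2.1"

    # /tmp/artifacts is a placeholder for wherever you staged the files.
    sudo mkdir -p "$DEST"
    sudo cp /tmp/artifacts/kafka-clients-0.10.2.1.jar \
            /tmp/artifacts/kafka-clients-0.10.2.1.pom \
            /tmp/artifacts/_remote.repositories "$DEST/"
    sudo chown -R storm:storm "$STORM_HOME/.m2"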

The best bet is to pull the artifacts via SAM on a machine with internet access, so that they are downloaded to the Maven local repository and the repository name in _remote.repositories is set properly (important!). Then copy the downloaded artifacts (the Maven local repository) to the machines that don't have internet access.
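A sketch of that copy step, assuming the artifacts were resolved into ~/.m2/repository on the host with internet access and that the offline host runs topologies as the 'storm' user (the hostname and paths are placeholders):

    # On the machine with internet access: bundle the local Maven repository,
    # which already contains the correct _remote.repositories metafiles.
    tar czf m2-repo.tar.gz -C ~/.m2 repository

    # Copy it to the offline host (storm-node1 is a placeholder).
    scp m2-repo.tar.gz storm-node1:/tmp/

    # On the offline host: unpack it into the 'storm' user's home directory.
    sudo mkdir -p /home/storm/.m2
    sudo tar xzf /tmp/m2-repo.tar.gz -C /home/storm/.m2
    sudo chown -R storm:storm /home/storm/.m2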

I am getting the same issue even in HDF 3.2.0.

I am running HDF on a CentOS cluster with limited internet access. I activated the proxy for SAM, but it doesn't change anything.
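Not an official fix, but one workaround that may help in a proxy-only setup (my own assumption, not something from the HDF docs) is to pre-populate the 'storm' user's local Maven repository with plain Maven, which, unlike the Storm resolver affected by STORM-2598, does honor the <proxies> section of ~/.m2/settings.xml. This assumes mvn is installed on the SAM/Storm host:

    # dependency:get is a standard maven-dependency-plugin goal; kafka-clients
    # is available on Maven Central, so no Hortonworks repository name is
    # needed in _remote.repositories for this artifact.
    sudo -u storm mvn dependency:get \
        -Dartifact=org.apache.kafka:kafka-clients:0.10.2.1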

Explorer

It is pathetic that this issue is still going on.

 

Could not transfer artifact hortonworks.storm.aws:storm-s3:pom:0.0.1-SNAPSHOT from/to hortonworks.repo (http://nexus-private.hortonworks.com/nexus/content/groups/public/): Transfer failed for http://nexus-private.hortonworks.com/nexus/content/groups/public/hortonworks/storm/aws/storm-s3/0.0....
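As a quick diagnostic (my own sketch, not something from this thread), it's worth checking from the SAM/Storm host whether that repository URL is reachable at all, with and without the proxy; note that nexus-private.hortonworks.com looks like a Hortonworks-internal host and may simply not be resolvable from outside their network. The proxy address below is a placeholder:

    # Without the proxy: a timeout or 000 means there is no direct route.
    curl -s -o /dev/null -w '%{http_code}\n' \
        http://nexus-private.hortonworks.com/nexus/content/groups/public/

    # Through the proxy (replace proxy.example.com:8080 with your proxy).
    curl -s -o /dev/null -w '%{http_code}\n' \
        -x http://proxy.example.com:8080 \
        http://nexus-private.hortonworks.com/nexus/content/groups/public/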

Cloudera Employee

Hello @SurajP, may I know which version of HDF you are using?

Explorer

The latest available: HDF 3.1.1 for Docker, along with HDP 2.6.5 with CDA enabled.

Contributor

Thanks for the response. Any indication of when 3.0.2 is to be released?

We don't have internet access except via a proxy, so we can't get a list of the required jars and poms. Any chance of doing an `ls` and sending those, or is that pushing things?
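If someone with a working, internet-connected install wants to produce that list, something like the following would capture the jars, poms, and metafiles with their relative paths (a sketch assuming the default Maven local repository location):

    # List every jar, pom, and _remote.repositories metafile that SAM/Storm
    # pulled into the local Maven repository, keeping the relative paths.
    find ~/.m2/repository \
        \( -name '*.jar' -o -name '*.pom' -o -name '_remote.repositories' \) \
        -printf '%P\n' | sort > sam-artifact-list.txt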

Cloudera Employee

Sorry I missed this; we announced HDF 3.0.2 four days ago.

- Announce:

https://community.hortonworks.com/articles/147515/hortonworks-data-flow-hdf-version-302-release-anno...

- Document:

https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.2/index.html

Hope this helps!

P.S. Please upvote my answer and mark it as accepted so the question shows as answered. Thanks!
