Upgrade from HDP 2.6.4 to 3.0.0 fails at "Update Target Repositories"

Hi all,

I have successfully upgraded Ambari to 2.7.0.0

During the HDP upgrade, the process stopped at "Update Target Repositories":

Updating the desired repository version to 3.0.0.0-1634 for all cluster services.

with the full error:

org.apache.ambari.server.ServiceNotFoundException: Service not found, clusterName=HDP26, serviceName=SPARK
	at org.apache.ambari.server.state.cluster.ClusterImpl.getService(ClusterImpl.java:883)
	at org.apache.ambari.server.state.UpgradeHelper.setDesiredRepositories(UpgradeHelper.java:922)
	at org.apache.ambari.server.state.UpgradeHelper.updateDesiredRepositoriesAndConfigs(UpgradeHelper.java:894)
	at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:118)
	at org.apache.ambari.server.serveraction.upgrades.UpdateDesiredRepositoryAction.updateDesiredRepositoryVersion(UpdateDesiredRepositoryAction.java:160)
	at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:128)
	at org.apache.ambari.server.serveraction.upgrades.UpdateDesiredRepositoryAction.execute(UpdateDesiredRepositoryAction.java:96)
	at org.apache.ambari.server.serveraction.ServerActionExecutor$Worker.execute(ServerActionExecutor.java:550)
	at org.apache.ambari.server.serveraction.ServerActionExecutor$Worker.run(ServerActionExecutor.java:466)
	at java.lang.Thread.run(Thread.java:745)

Any idea how to get around it? Many thanks!

Super Collaborator

The Spark 1.x service (SPARK) has been removed in HDP 3.0. It looks like the Ambari DB still has a reference to the SPARK service after the upgrade.

https://docs.hortonworks.com/HDPDocuments/Ambari-2.7.0.0/bk_ambari-upgrade/content/bhvr_changes_upgr...

Remove SPARK service from the cluster and try upgrade again.

Hi @rguruvannagari

Thank you for your feedback!

I did remove the SPARK services using Ambari.

Carrying on with the upgrade process, it still fails at the same stage. I also checked some Postgres DB tables: hostcomponentdesiredstate, hostcomponentstate, servicecomponentdesiredstate, clusterservices. The service is not there.

Any other pointers please?

Or maybe there is a way to start the process from the beginning?

Thanks

Super Collaborator

We usually have the service info in 5 tables. Please verify whether any of these tables has an entry for SPARK.

You can cancel the upgrade, make sure the SPARK service is removed completely from the DB, and re-do the steps.

   SELECT * FROM servicecomponentdesiredstate WHERE service_name = 'SPARK';
   SELECT * FROM hostcomponentdesiredstate WHERE service_name = 'SPARK';
   SELECT * FROM hostcomponentstate WHERE service_name = 'SPARK';
   SELECT * FROM servicedesiredstate WHERE service_name = 'SPARK';
   SELECT * FROM clusterservices WHERE service_name = 'SPARK';
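
If any of these still returns a row after the service was deleted in the UI, a hedged clean-up sketch would be the matching deletes on the same tables, child tables first because of the likely foreign keys (stop ambari-server and back up the database before running anything like this):

   DELETE FROM hostcomponentstate WHERE service_name = 'SPARK';
   DELETE FROM hostcomponentdesiredstate WHERE service_name = 'SPARK';
   DELETE FROM servicecomponentdesiredstate WHERE service_name = 'SPARK';
   DELETE FROM servicedesiredstate WHERE service_name = 'SPARK';
   DELETE FROM clusterservices WHERE service_name = 'SPARK';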

I just did an upgrade from HDP 2.6.4 to HDP 3.0.0 with the SPARK service installed; Ambari deletes the service itself during the upgrade process, so no manual delete step should be needed.

@rguruvannagari

I checked all the tables; there are no entries.

Is there any way I can start the process from scratch?

I am currently stuck at the "Update Target Repositories" step with the error described initially.

[Screenshot attached: 80516-screen-shot-2018-07-15-at-95631-pm.png]

Mentor

@rguruvannagari

What is the output of running:

SELECT upgrade_id, cluster_id, from_version, to_version, direction, upgrade_package, upgrade_type FROM upgrade; 

or

SELECT from_version, to_version, direction, skip_failures, skip_sc_failures FROM upgrade;

Basically, you have to populate the upgrade_type and upgrade_package columns based on the versions used.
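
For example, a hedged sketch only (the upgrade_id, type, and package name below are placeholders to be taken from your own SELECT output, not values from this cluster):

-- hypothetical values; copy upgrade_id, type and package from your own query output
UPDATE upgrade
   SET upgrade_type = 'NON_ROLLING',
       upgrade_package = 'nonrolling-upgrade-3.0'
 WHERE upgrade_id = 1;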

This happens because a record may be missing in the repo_version table. If you insert the record before performing the Ambari Upgrade, it should work.

You can make sure you have a record for each one of the versions.
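
To see which versions already have a record (same columns as in the INSERT below), you can first run:

SELECT repo_version_id, version, display_name, stack_id FROM repo_version;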

-- Assuming 2.6.5.0-292 is missing 
INSERT INTO repo_version (repo_version_id, version, display_name, upgrade_package, repositories, stack_id) VALUES (1, '2.6.5.0-292', 'HDP-2.6.5.0-292', 'upgrade-3.0', '', (SELECT stack_id FROM stack WHERE stack_version = '3.0')); 

Now check the new output

SELECT from_version, to_version FROM upgrade;

HTH

@Geoffrey Shelton Okot

I did check the upgrade table. Here is what I have:


[Screenshot attached: screen-shot-2018-07-15-at-95852-pm.png]

Mentor

@rguruvannagari

That was a simple select statement.
Can you run exactly the SQL I shared above?

I have fixed my problem by restoring my cluster from the backup.

Mentor

@Daniel K

So you have reverted to HDP 2.6.4? Or you just restored the Ambari database?

I reverted back to HDP 2.6.4, removed Spark, and did the upgrade.

Explorer

We have the same issue with upgrading from HDP 2.6.0.3 to HDP 3.1.0.0.

I had installed NIFI (from HDF) and MongoDB some time before the upgrade through Ambari management packs.

The "Remove unsupported services" step failed for these components, so we paused the upgrade and removed MongoDB and NiFi through Ambari.

After continuing the upgrade it fails again.

Failed on: Update Target Repositories

org.apache.ambari.server.ServiceNotFoundException: Service not found, clusterName=itpp, serviceName=MONGODB
    at org.apache.ambari.server.state.cluster.ClusterImpl.getService(ClusterImpl.java:888)
    at org.apache.ambari.server.state.UpgradeHelper.setDesiredRepositories(UpgradeHelper.java:922)
    at org.apache.ambari.server.state.UpgradeHelper.updateDesiredRepositoriesAndConfigs(UpgradeHelper.java:894)
    at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:118)
    at org.apache.ambari.server.serveraction.upgrades.UpdateDesiredRepositoryAction.updateDesiredRepositoryVersion(UpdateDesiredRepositoryAction.java:160)
    at org.apache.ambari.server.orm.AmbariJpaLocalTxnInterceptor.invoke(AmbariJpaLocalTxnInterceptor.java:128)
    at org.apache.ambari.server.serveraction.upgrades.UpdateDesiredRepositoryAction.execute(UpdateDesiredRepositoryAction.java:96)
    at org.apache.ambari.server.serveraction.ServerActionExecutor$Worker.execute(ServerActionExecutor.java:550)
    at org.apache.ambari.server.serveraction.ServerActionExecutor$Worker.run(ServerActionExecutor.java:466)
    at java.lang.Thread.run(Thread.java:745)

We checked in the Ambari database as @rguruvannagari suggested, but there are no entries for NIFI or MONGODB. It seems that the upgrade process still remembers the components.

Is there a way to restart the upgrade process (reverting to 2.6 first only if needed), or to remove the stale references to the services from the Ambari database or somewhere else?

New Contributor

@Paul Heinzlreiter

Bit late, but in case anyone else has this problem: I had this issue with Solr during my upgrade from 2.6.5 to 3.1.

It first got stuck when trying to remove Solr during the 'Remove unsupported services and components' stage. I wasn't able to back out of the upgrade so I went against the warning by pausing the upgrade and removing Solr. That allowed it to move to the next stage, where it got stuck on 'Update Target Repositories' because it couldn't find the Solr service.

After some digging, it looks like it gets the component list from the 'upgrade_history' table.

So I found the SOLR entry for my current upgrade_id and removed it.

SELECT * FROM upgrade_history WHERE service_name = 'SOLR' AND upgrade_id = 51;
  id | upgrade_id | service_name | component_name | from_repo_version_id | target_repo_version_id
-----+------------+--------------+----------------+----------------------+------------------------
 103 |         51 | SOLR         | SOLR_SERVER    |                   51 |                    101
(1 row)

DELETE FROM upgrade_history WHERE id = 103;

After that, I restarted Ambari and resumed the upgrade and it worked.
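
For the MONGODB/NIFI case described earlier in the thread, the clean-up would presumably follow the same pattern (untested sketch; check the returned rows and back up the Ambari database first):

-- locate any stale rows for the stuck upgrade, then delete them by id as above
SELECT id, upgrade_id, service_name, component_name FROM upgrade_history WHERE service_name IN ('MONGODB', 'NIFI');
-- DELETE FROM upgrade_history WHERE id = <id from the SELECT above>;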
