
Tutorial 830 foreign key constraint fails

Contributor

Hi,

I am following tutorial 830, which implements the trucking example app on the HDF 3.0 sandbox: https://hortonworks.com/tutorial/real-time-event-processing-in-nifi-sam-schema-registry-and-superset...

Problem One

The tutorial itself does not have a commenting system; it suggests using this platform instead. However, the "Find Answers" link it gives is incorrect, and it recommends using the tags "tutorial-830" and "hdf-3.0.0", which I cannot create.

Problem Two

I am getting a foreign key constraint problem when importing the example SAM application from the supplied JSON.

Can anyone confirm if this is a problem with the tutorial or whether I have done something wrong?

Thanks. The exception shown on screen is as follows:

   An exception with message [com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Cannot add or update a child row: a foreign key constraint fails (`streamline`.`topology`, CONSTRAINT `topology_ibfk_1` FOREIGN KEY (`versionId`) REFERENCES `topology_version` (`id`))] was thrown while processing request.   
4 Replies

Rising Star

Problem one appears to be a documentation bug that needs to be corrected. As to problem two, I ran into the same issue while setting it up, and I ended up having to temporarily disable the foreign key checks on the MySQL instance that SAM connects to:

1. Inside the MySQL command prompt, run 'SET FOREIGN_KEY_CHECKS=0;'

2. Import the app .json template

3. Inside the MySQL command prompt, run 'SET FOREIGN_KEY_CHECKS=1;'
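The three steps above might look like this as shell one-liners (a sketch, assuming the sandbox's `streamline` database and MySQL root credentials from the tutorial; note that a plain `SET FOREIGN_KEY_CHECKS` is session-scoped, so the `@@global` form is used here so the setting also applies to new connections such as SAM's):

```shell
# Step 1: disable foreign key checks globally (MySQL root password: hadoop)
mysql streamline --user=root -p -e "SET @@global.foreign_key_checks=0;"

# Step 2: import the app .json template through the SAM UI

# Step 3: re-enable the checks
mysql streamline --user=root -p -e "SET @@global.foreign_key_checks=1;"
```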

Hope that helps.

Contributor

It is really helpful to know it wasn't me messing up. I can switch off the foreign key constraints temporarily as you suggest. Thanks!

Rising Star

You're welcome! Please don't forget to 'accept' the answer if it works.

New Contributor

Further to dsun's reply, here are the implementation steps:

1) Log in to the HDF sandbox (default password is hadoop):

ssh -p 12222 root@localhost

2) Log in to the mysql console (default password is also hadoop):

[root@sandbox-hdf streamline]# mysql streamline --user=root -p
Enter password:
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 488
Server version: 5.6.36 MySQL Community Server (GPL)
Copyright (c) 2000, 2017, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

3) Set the global foreign_key_checks variable to off:

mysql> set @@global.foreign_key_checks=off;
Query OK, 0 rows affected (0.00 sec)

4) Restart SAM from the Ambari console.

5) Add the application as described in the original tutorial.
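Before retrying the import, it may be worth confirming that the global setting actually took effect, and turning the checks back on once the app imports cleanly (a sketch; the re-enable step is a suggested extra, not part of the original instructions):

```shell
# Confirm the global setting took effect (MySQL root password: hadoop)
mysql streamline --user=root -p -e "SHOW GLOBAL VARIABLES LIKE 'foreign_key_checks';"
# The Value column should read OFF

# After the app imports successfully, consider re-enabling the checks:
mysql streamline --user=root -p -e "SET @@global.foreign_key_checks=ON;"
```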
