Support Questions

Find answers, ask questions, and share your expertise

Sqoop Import failing

Contributor

I have to import a table from MySQL to Hive using Sqoop.

My cluster is composed of 2 master nodes and 2 worker nodes, running Ambari 2.7 and HDP 3.

I tried this command as the hdfs user:

[hdfs@master1 root]$ sqoop import --connect jdbc:mysql://192.168.1.83:3306/testdb --username root -P --table customers

the Job failed with :

Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'root'@'worker1.hadoop.com' (using password: YES)

Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'root'@'worker2.hadoop.com' (using password: YES)
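As a side note (not part of the original thread): instead of typing the password interactively with -P, Sqoop also accepts --password-file, which is handy for scheduled jobs. A minimal sketch of preparing such a file, assuming a hypothetical filename mysql.pwd and a placeholder password:

```shell
# Create a local password file; printf '%s' avoids a trailing newline,
# which would otherwise become part of the password and break logins.
printf '%s' 'MySecretPw' > mysql.pwd

# Restrict the file to its owner.
chmod 400 mysql.pwd

# Verify the size: exactly the 10 bytes of the password, no newline.
wc -c < mysql.pwd
```

The file can then be copied to HDFS (hdfs dfs -put mysql.pwd /user/hdfs/) and referenced with --password-file /user/hdfs/mysql.pwd in place of -P; the HDFS path here is an assumption for illustration.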

1 ACCEPTED SOLUTION

Master Mentor

@HENI MAHER

Please make sure that you are entering the correct password for the "root" user:

# mysql -u root -p
Enter Password: <Enter_root_password>


After logging in, run the following:

mysql> use mysql;
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'%'  IDENTIFIED BY '<Enter_root_password>';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'worker1.hadoop.com'  IDENTIFIED BY '<Enter_root_password>';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'worker2.hadoop.com'  IDENTIFIED BY '<Enter_root_password>';
mysql> FLUSH PRIVILEGES;
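One caveat (an assumption about the environment, since the MySQL version is not stated in the thread): on MySQL 8.0 and later, GRANT ... IDENTIFIED BY is no longer valid syntax, and the account must be created first. A sketch of the equivalent for one worker host:

```sql
-- MySQL 8.0+: create the account first; IDENTIFIED BY inside GRANT
-- was removed in 8.0.
CREATE USER IF NOT EXISTS 'root'@'worker1.hadoop.com' IDENTIFIED BY '<Enter_root_password>';
GRANT ALL PRIVILEGES ON *.* TO 'root'@'worker1.hadoop.com';
FLUSH PRIVILEGES;
```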




You can check the "user" table in the MySQL database to verify whether the "root" user is allowed to connect from host 'worker1.hadoop.com' by running the following query:

mysql> SELECT User, Host FROM user;
+------+--------------------+
| User | Host               |
+------+--------------------+
| hive | %                  |
| root | 127.0.0.1          |
| root | ::1                |
| root | worker1.hadoop.com |
| root | worker2.hadoop.com |
| root | localhost          |
+------+--------------------+



2 REPLIES 2


Contributor

Thank you, @Jay Kumar SenSharma.

I have a question, please.
I executed this command:
sqoop-import --connect jdbc:mysql://192.168.1.83/testdb --username root --password ******* --table customers --hive-import --create-hive-table --hive-table testdb.customers --verbose
How do I find this table in Hive? When I connect with Beeline and run show tables;, I don't see the table testdb.customers.
Any suggestions, please!
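Not an authoritative answer, but one common explanation: --hive-table testdb.customers imports the table into the Hive database testdb, while a fresh Beeline session starts in the default database, so a bare show tables; only lists tables in default. A sketch of what to check from Beeline:

```sql
-- List all Hive databases; testdb should appear if the import created it.
SHOW DATABASES;
-- Switch to the database the table was imported into.
USE testdb;
-- The imported table should now be listed.
SHOW TABLES;
-- Or qualify the table name directly.
SELECT * FROM testdb.customers LIMIT 10;
```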