Created 07-04-2021 08:39 AM
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/LogicalType
at org.apache.sqoop.manager.DefaultManagerFactory.accept(DefaultManagerFactory.java:67)
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:184)
at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:272)
at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:96)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:616)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.LogicalType
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 11 more
Could you please help me with this?
Thanks in advance.
Regards,
Venkataramana
Created 07-04-2021 10:54 AM
Can you share the whole Sqoop command plus its output?
Created 07-05-2021 04:49 AM
sqoop import \
> --connect jdbc:mysql://localhost:3306/sample \
> --username root \
> --table test \
> --hive-import \
> --fields-terminated-by "," --hive-import --create-hive-table --hive-table sample.tesr --m 4
Warning: /home/hadoop/sqoop-1.4.7/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop-1.4.7/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/sqoop-1.4.7/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
21/07/05 11:39:26 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/LogicalType
at org.apache.sqoop.manager.DefaultManagerFactory.accept(DefaultManagerFactory.java:67)
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:184)
at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:272)
at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:96)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:616)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.LogicalType
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 11 more
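Side note: the three warnings at the top are unrelated to the crash; they only matter if HCatalog or Accumulo jobs are actually needed. If you want to silence them, something like the below should work (the paths are placeholders, not taken from the thread):
export HCAT_HOME=/path/to/hcatalog
export ACCUMULO_HOME=/path/to/accumulo
export ZOOKEEPER_HOME=/path/to/zookeeper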
Created 07-05-2021 06:14 AM
How does the root user authenticate against the database? If it's username/password, I don't see a password prompt in your command. Can you run the command below and re-share the output?
I've added -P so Sqoop prompts for the password:
sqoop import \
--connect jdbc:mysql://localhost:3306/sample \
--username root -P \
--table test \
--hive-import \
--fields-terminated-by "," --create-hive-table --hive-table sample.tesr --m 4
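As an aside, if an interactive prompt is inconvenient, Sqoop also accepts --password-file. A minimal sketch, assuming a local file at /home/hadoop/.mysql.pwd (the path is an example, not from the thread):
# Write the password without a trailing newline and restrict access to it.
echo -n 'root' > /home/hadoop/.mysql.pwd
chmod 400 /home/hadoop/.mysql.pwd
# Sqoop resolves the path through the Hadoop FileSystem API, so the file://
# scheme keeps it on the local filesystem instead of HDFS.
sqoop import \
--connect jdbc:mysql://localhost:3306/sample \
--username root \
--password-file file:///home/hadoop/.mysql.pwd \
--table test \
--hive-import \
--fields-terminated-by "," --create-hive-table --hive-table sample.tesr --m 4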
Created on 07-05-2021 07:22 AM - edited 07-05-2021 07:25 AM
After adding that, I'm still getting the same error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/LogicalType
at org.apache.sqoop.manager.DefaultManagerFactory.accept(DefaultManagerFactory.java:67)
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:184)
at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:272)
at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:96)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:616)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.LogicalType
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 11 more
I am also trying to execute the command below:
sqoop list-databases \
--connect jdbc:mysql://localhost:3306/ \
--username root \
--password root
This also fails with the same error.
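Since list-databases dies with the same NoClassDefFoundError before any SQL is even issued, this points at the Sqoop client classpath rather than the database connection. A quick check worth running, assuming the /home/hadoop/sqoop-1.4.7 install path shown in the warnings above:
# Is an Avro jar on Sqoop's classpath at all, and which version is it?
ls /home/hadoop/sqoop-1.4.7/lib | grep -i avro
# org.apache.avro.LogicalType first shipped in Avro 1.8.0, so an older
# avro-1.7.x jar (or none at all) in lib would produce exactly this error.
unzip -l /home/hadoop/sqoop-1.4.7/lib/avro-*.jar | grep LogicalType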
Created 07-05-2021 08:10 AM
@t1
Is there a way I can re-create your tables? I could try the same commands, since I also have MySQL/MariaDB, and keep you posted!
Created 07-05-2021 10:51 AM
I tried out sqoop list-databases and my output looks correct:
[root@bern ~]# sqoop list-databases \
> --connect jdbc:mysql://localhost:3306/ \
> --username root \
> --password welcome
.........
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/07/05 17:28:04 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7.3.1.4.0-315
21/07/05 17:28:05 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
21/07/05 17:28:05 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
ambari
druid
harsh8
hive
mysql
oozie
performance_schema
ranger
rangerkms
superset
Then I ran exactly the same sqoop import and it succeeded, but I suspect the difference is the underlying table format:
[hdfs@bern ~]$ sqoop import --connect jdbc:mysql://localhost:3306/harsh8 --username root --table staff2 --hive-import --fields-terminated-by "," --hive-import --create-hive-table --hive-table staff2_backup --m 1
SLF4J: Class path contains multiple SLF4J bindings.
.....
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
21/07/05 18:36:40 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7.3.1.4.0-315
21/07/05 18:36:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/07/05 18:36:41 INFO tool.CodeGenTool: Beginning code generation
21/07/05 18:36:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `staff2` AS t LIMIT 1
21/07/05 18:36:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `staff2` AS t LIMIT 1
21/07/05 18:36:43 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.1.4.0-315/hadoop-mapreduce
21/07/05 18:36:48 WARN orm.CompilationManager: Could not rename /tmp/sqoop-hdfs/compile/358a7be0c1aae1ac531284e68ae3679e/staff2.java to /home/hdfs/./staff2.java. Error: Destination '/home/hdfs/./staff2.java' already exists
21/07/05 18:36:48 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/358a7be0c1aae1ac531284e68ae3679e/staff2.jar
21/07/05 18:36:49 WARN manager.MySQLManager: It looks like you are importing from mysql.
21/07/05 18:36:49 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
21/07/05 18:36:49 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
21/07/05 18:36:49 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
21/07/05 18:36:49 INFO mapreduce.ImportJobBase: Beginning import of staff2
21/07/05 18:36:59 INFO client.RMProxy: Connecting to ResourceManager at bern.swiss.ch/192.168.0.139:8050
21/07/05 18:37:09 INFO client.AHSProxy: Connecting to Application History server at bern.swiss.ch/192.168.0.139:10200
21/07/05 18:38:28 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/hdfs/.staging/job_1625500722080_0001
21/07/05 18:40:05 INFO db.DBInputFormat: Using read commited transaction isolation
21/07/05 18:40:23 INFO mapreduce.JobSubmitter: number of splits:1
21/07/05 18:40:32 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1625500722080_0001
21/07/05 18:40:32 INFO mapreduce.JobSubmitter: Executing with tokens: []
21/07/05 18:40:34 INFO conf.Configuration: found resource resource-types.xml at file:/etc/hadoop/3.1.4.0-315/0/resource-types.xml
21/07/05 18:40:37 INFO impl.YarnClientImpl: Submitted application application_1625500722080_0001
21/07/05 18:40:37 INFO mapreduce.Job: The url to track the job: http://bern.swiss.ch:8088/proxy/application_1625500722080_0001/
21/07/05 18:40:37 INFO mapreduce.Job: Running job: job_1625500722080_0001
21/07/05 18:46:55 INFO mapreduce.Job: Job job_1625500722080_0001 running in uber mode : false
21/07/05 18:46:55 INFO mapreduce.Job: map 0% reduce 0%
21/07/05 18:50:56 INFO mapreduce.Job: map 100% reduce 0%
21/07/05 18:51:09 INFO mapreduce.Job: Job job_1625500722080_0001 completed successfully
21/07/05 18:51:10 INFO mapreduce.Job: Counters: 32
File System Counters
FILE: Number of bytes read=0
.............
HDFS: Number of large read operations=0
HDFS: Number of write operations=2
Job Counters
Launched map tasks=1
Other local map tasks=1
Total time spent by all maps in occupied slots (ms)=385416
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=192708
Total vcore-milliseconds taken by all map tasks=192708
Total megabyte-milliseconds taken by all map tasks=394665984
Map-Reduce Framework
Map input records=6
Map output records=6
............
Physical memory (bytes) snapshot=152813568
Virtual memory (bytes) snapshot=3237081088
Total committed heap usage (bytes)=81788928
Peak Map Physical memory (bytes)=152813568
Peak Map Virtual memory (bytes)=3237081088
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=223
21/07/05 18:51:10 INFO mapreduce.ImportJobBase: Transferred 223 bytes in 852.4312 seconds (0.2616 bytes/sec)
21/07/05 18:51:10 INFO mapreduce.ImportJobBase: Retrieved 6 records.
21/07/05 18:51:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `staff2` AS t LIMIT 1
21/07/05 18:51:10 INFO hive.HiveImport: Loading uploaded data into Hive
My table structure:
MariaDB [harsh8]> describe staff2;
+------------+-------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+------------+-------------+------+-----+---------+-------+
| id | int(11) | NO | PRI | NULL | |
| Name | varchar(20) | YES | | NULL | |
| Position | varchar(20) | YES | | NULL | |
| Salary | int(11) | YES | | NULL | |
| Department | varchar(10) | YES | | NULL | |
+------------+-------------+------+-----+---------+-------+
My test table contents:
MariaDB [harsh8]> select * from staff2;
+-----+------------+-------------------+--------+------------+
| id | Name | Position | Salary | Department |
+-----+------------+-------------------+--------+------------+
| 100 | Geoffrey | manager | 50000 | Admin |
| 101 | Thomas | Oracle Consultant | 15000 | IT |
| 102 | Biden | Project Manager | 28000 | PM |
| 103 | Carmicheal | Bigdata developer | 30000 | BDS |
| 104 | Johnson | Treasurer | 21000 | Accounts |
| 105 | Gerald | Director | 30000 | Management |
+-----+------------+-------------------+--------+------------+
6 rows in set (0.09 sec)
This is what my comma-delimited output looks like:
[hdfs@bern ~]$ hdfs dfs -cat /tmp/sqoop/hr.txt
100,Geoffrey,manager,50000,Admin
101,Thomas,Oracle Consultant,15000,IT
102,Biden,Project Manager,28000,PM
103,Carmicheal,Bigdata developer,30000,BDS
104,Johnson,Treasurer,21000,Accounts
105,Gerald,Director,30000,Management
106,Paul,Director,30000,Management
105,Mark,CEO,90000,Management
105,Edward,Janitor,30000,Housing
105,Richard,Farm Manager,31000,Agriculture
105,Albert,System Engineer,21000,IT
The dataset looks like the above. Is your format Avro? (One way to check the source table's definition is sketched below.)
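To check the source table's engine and definition, using the database and table names from earlier in the thread:
# MariaDB/MySQL: print the full DDL, including the storage engine
mysql -u root -p -e 'SHOW CREATE TABLE sample.test\G'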
Happy hadooping
Created 07-07-2021 05:12 AM
I have copied both jars, avro-1.8.1.jar and commons-lang3-3.1.jar, and it's working now.
Thanks for your response.
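For anyone who lands on this thread later: org.apache.avro.LogicalType only exists in Avro 1.8.0 and newer, which is why adding a newer Avro jar resolves the NoClassDefFoundError. A minimal sketch of the fix, assuming the jars sit in the current directory and the sqoop-1.4.7 install path from the warnings earlier in the thread:
# Drop both jars into Sqoop's lib directory, which is on its runtime classpath.
cp avro-1.8.1.jar /home/hadoop/sqoop-1.4.7/lib/
cp commons-lang3-3.1.jar /home/hadoop/sqoop-1.4.7/lib/
# If an older avro-1.7.x jar already sits in lib, remove it so the two
# versions don't shadow each other on the classpath.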
Created 07-08-2021 06:59 AM
Thank you for your participation in the Cloudera Community. I'm happy to see you resolved your issue. Please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
Regards,
Vidya Sargur
Created 07-05-2021 02:17 AM
It seems that the Avro logical types are not matching. Please double-check the data types.
If you find the answer helpful, please accept this as a solution.