
I'm trying to import data from MySQL to Hive through Sqoop but I'm getting an error

Solved


Explorer

The command is:

sqoop import \
  --connect "jdbc:mysql://sandbox.hortonworks.com:3306/retail_db" \
  --username=root \
  --password=hadoop \
  --table departments \
  --hive-home /apps/hive/warehouse \
  --hive-import \
  --hive-overwrite \
  --hive-table sqoop_import.departments \
  --outdir java_files

The output is:

Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/02/11 02:15:06 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
17/02/11 02:15:06 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/02/11 02:15:06 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/02/11 02:15:06 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/02/11 02:15:06 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/02/11 02:15:06 INFO tool.CodeGenTool: Beginning code generation
17/02/11 02:15:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
17/02/11 02:15:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
17/02/11 02:15:06 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/eef451e1cc95fb2071ebe74f5d9371e9/departments.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/02/11 02:15:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/eef451e1cc95fb2071ebe74f5d9371e9/departments.jar
17/02/11 02:15:08 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/02/11 02:15:08 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/02/11 02:15:08 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/02/11 02:15:08 INFO mapreduce.ImportJobBase: Beginning import of departments
17/02/11 02:15:09 INFO impl.TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
17/02/11 02:15:09 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
17/02/11 02:15:09 INFO client.AHSProxy: Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
17/02/11 02:15:14 INFO db.DBInputFormat: Using read commited transaction isolation
17/02/11 02:15:14 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`department_id`), MAX(`department_id`) FROM `departments`
17/02/11 02:15:14 INFO db.IntegerSplitter: Split size: 1; Num splits: 4 from: 2 to: 7
17/02/11 02:15:14 INFO mapreduce.JobSubmitter: number of splits:4
17/02/11 02:15:14 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1486771879029_0004
17/02/11 02:15:15 INFO impl.YarnClientImpl: Submitted application application_1486771879029_0004
17/02/11 02:15:15 INFO mapreduce.Job: The url to track the job: http://sandbox.hortonworks.com:8088/proxy/application_1486771879029_0004/
17/02/11 02:15:15 INFO mapreduce.Job: Running job: job_1486771879029_0004
17/02/11 02:15:21 INFO mapreduce.Job: Job job_1486771879029_0004 running in uber mode : false
17/02/11 02:15:21 INFO mapreduce.Job:  map 0% reduce 0%
17/02/11 02:15:29 INFO mapreduce.Job:  map 50% reduce 0%
17/02/11 02:15:30 INFO mapreduce.Job:  map 75% reduce 0%
17/02/11 02:15:31 INFO mapreduce.Job:  map 100% reduce 0%
17/02/11 02:15:31 INFO mapreduce.Job: Job job_1486771879029_0004 completed successfully
17/02/11 02:15:31 INFO mapreduce.Job: Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=652000
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=481
		HDFS: Number of bytes written=60
		HDFS: Number of read operations=16
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=8
	Job Counters 
		Launched map tasks=4
		Other local map tasks=4
		Total time spent by all maps in occupied slots (ms)=19510
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=19510
		Total vcore-milliseconds taken by all map tasks=19510
		Total megabyte-milliseconds taken by all map tasks=4877500
	Map-Reduce Framework
		Map input records=6
		Map output records=6
		Input split bytes=481
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=930
		CPU time spent (ms)=3690
		Physical memory (bytes) snapshot=548311040
		Virtual memory (bytes) snapshot=7742504960
		Total committed heap usage (bytes)=176160768
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=60
17/02/11 02:15:31 INFO mapreduce.ImportJobBase: Transferred 60 bytes in 22.7172 seconds (2.6412 bytes/sec)
17/02/11 02:15:31 INFO mapreduce.ImportJobBase: Retrieved 6 records.
17/02/11 02:15:31 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners
17/02/11 02:15:31 INFO atlas.ApplicationProperties: Looking for atlas-application.properties in classpath
17/02/11 02:15:31 INFO atlas.ApplicationProperties: Loading atlas-application.properties from file:/etc/sqoop/2.5.0.0-1245/0/atlas-application.properties
17/02/11 02:15:32 ERROR security.InMemoryJAASConfiguration: Unable to add JAAS configuration for client [KafkaClient] as it is missing param [atlas.jaas.KafkaClient.loginModuleName]. Skipping JAAS config for [KafkaClient]
17/02/11 02:15:32 INFO hook.AtlasHook: Created Atlas Hook
1 ACCEPTED SOLUTION

Re: I'm trying to import data from MySQL to Hive through Sqoop but I'm getting an error

Super Guru

On your sandbox, please confirm that the Atlas service is running. If it is not, enable it and rerun the Sqoop command above.


3 REPLIES


Re: I'm trying to import data from MySQL to Hive through Sqoop but I'm getting an error

Explorer

Thank you @Sunile Manjee,

It worked after starting the Atlas dependencies (Ambari Infra, HBase, and Kafka).

Junior.

Re: I'm trying to import data from MySQL to Hive through Sqoop but I'm getting an error

Explorer

Hi, does this mean Sqoop will never work if I do not have the Atlas service installed?

I used sqoop list-databases and that works fine, but when I try to import, I get the "Using Hive-specific delimiters for output. You can override" error.