
Problem with sqoop importing from MySQL database


Hi all,

I'm trying to import data from a test MySQL server that I set up.

I am running this command to import the 'actor' table:

 

sqoop import --connect jdbc:mysql://52.88.77.77/sakila --username root --password mypassword --verbose --table actor --as-textfile --target-dir /projects/sqoop

Things seem to be going well: the import of the 'actor' table gets split into multiple map tasks, but right at the end I get an error.

This is a snippet of the log messages:

15/09/11 13:05:14 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-13-35.us-west-2.compute.internal/172.31.13.35:8032
15/09/11 13:05:15 DEBUG db.DBConfiguration: Fetching password from job credentials store
15/09/11 13:05:15 INFO db.DBInputFormat: Using read commited transaction isolation
15/09/11 13:05:15 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`actor_id`), MAX(`actor_id`) FROM `actor`
15/09/11 13:05:15 DEBUG db.IntegerSplitter: Splits: [                           1 to                          200] into 4 parts
15/09/11 13:05:15 DEBUG db.IntegerSplitter:                            1
15/09/11 13:05:15 DEBUG db.IntegerSplitter:                           51
15/09/11 13:05:15 DEBUG db.IntegerSplitter:                          101
15/09/11 13:05:15 DEBUG db.IntegerSplitter:                          151
15/09/11 13:05:15 DEBUG db.IntegerSplitter:                          200
15/09/11 13:05:15 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`actor_id` >= 1' and upper bound '`actor_id` < 51'
15/09/11 13:05:15 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`actor_id` >= 51' and upper bound '`actor_id` < 101'
15/09/11 13:05:15 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`actor_id` >= 101' and upper bound '`actor_id` < 151'
15/09/11 13:05:15 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`actor_id` >= 151' and upper bound '`actor_id` <= 200'
15/09/11 13:05:16 INFO mapreduce.JobSubmitter: number of splits:4
15/09/11 13:05:16 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1441976602451_0001
15/09/11 13:05:16 INFO impl.YarnClientImpl: Submitted application application_1441976602451_0001
15/09/11 13:05:16 INFO mapreduce.Job: The url to track the job: http://ip-172-31-13-35.us-west-2.compute.internal:8088/proxy/application_1441976602451_0001/
15/09/11 13:05:16 INFO mapreduce.Job: Running job: job_1441976602451_0001
15/09/11 13:05:22 INFO mapred.ClientServiceDelegate: Application state is completed. FinalApplicationStatus=FAILED. Redirecting to job history server
15/09/11 13:05:23 INFO mapreduce.Job: Job job_1441976602451_0001 running in uber mode : false
15/09/11 13:05:23 INFO mapreduce.Job:  map 0% reduce NaN%
15/09/11 13:05:23 INFO mapreduce.Job: Job job_1441976602451_0001 failed with state FAILED due to: 
15/09/11 13:05:23 INFO mapreduce.ImportJobBase: The MapReduce job has already been retired. Performance
15/09/11 13:05:23 INFO mapreduce.ImportJobBase: counters are unavailable. To get this information, 
15/09/11 13:05:23 INFO mapreduce.ImportJobBase: you will need to enable the completed job store on 
15/09/11 13:05:23 INFO mapreduce.ImportJobBase: the jobtracker with:
15/09/11 13:05:23 INFO mapreduce.ImportJobBase: mapreduce.jobtracker.persist.jobstatus.active = true
15/09/11 13:05:23 INFO mapreduce.ImportJobBase: mapreduce.jobtracker.persist.jobstatus.hours = 1
15/09/11 13:05:23 INFO mapreduce.ImportJobBase: A jobtracker restart is required for these settings
15/09/11 13:05:23 INFO mapreduce.ImportJobBase: to take effect.
15/09/11 13:05:23 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@63e68a2b
15/09/11 13:05:23 ERROR tool.ImportTool: Error during import: Import job failed!
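The status line above gives no diagnostic ("failed with state FAILED due to:" is empty), so the real error only shows up in the task logs. A sketch of how to pull them, assuming YARN log aggregation is enabled on the cluster:

```shell
# Fetch the aggregated container logs for the failed application;
# the stack trace from the failed map attempts should appear here.
yarn logs -applicationId application_1441976602451_0001

# The tracking URL printed earlier also leads to each attempt's
# stderr/syslog via the job history server web UI.
```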

Thanks for your help.


Re: Problem with sqoop importing from MySQL database


Hi,

 

I changed the command slightly by changing the target directory to:

 

--target-dir /user/datalake/dev/sakila/actor

and it works. I suspect the failure has something to do with the target directory I was using previously.
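If the target directory really is the cause, the usual suspects are write permissions on the parent directory (the import runs as the submitting user) or a target directory left over from an earlier run, since sqoop import fails when --target-dir already exists. A quick way to check, using the paths from the commands above:

```shell
# Check ownership and permissions of the original target's parent;
# the submitting user needs write access here.
hdfs dfs -ls /projects

# A target dir left behind by a failed run makes the next import
# fail too; remove it before retrying (or use --delete-target-dir).
hdfs dfs -rm -r /projects/sqoop
```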

 

I think there may be more information in the logs. Which log file is best to look at for this?

 

Thanks,

Gaj