Support Questions

Cannot initialize cluster

Explorer

Hello-

 

I'm trying to Sqoop data from Oracle to HDFS, but I'm getting the following error:

 

$ sqoop import --connect jdbc:oracle:thin:@localhost:1521/DB11G --username sqoop --password xx --table sqoop.test

 

14/01/30 10:58:10 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-oracle/compile/fa0ce9acd6ac6d0c349389a6dbfee62b/sqoop.test.jar
14/01/30 10:58:10 INFO mapreduce.ImportJobBase: Beginning import of sqoop.test
14/01/30 10:58:10 WARN conf.Configuration: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
14/01/30 10:58:10 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/01/30 10:58:10 INFO manager.SqlManager: Executing SQL statement: SELECT FIRST,LAST,EMAIL FROM sqoop.test WHERE 1=0
14/01/30 10:58:11 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/01/30 10:58:11 ERROR security.UserGroupInformation: PriviledgedActionException as:oracle (auth:SIMPLE) cause:java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
14/01/30 10:58:11 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:122)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:84)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:77)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1239)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1235)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.mapreduce.Job.connect(Job.java:1234)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1263)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
	at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
	at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:606)
	at com.quest.oraoop.OraOopConnManager.importTable(OraOopConnManager.java:260)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:240)

Checking just the database side works OK:

$ sqoop list-tables --connect jdbc:oracle:thin:@localhost:1521:DB11G --username sqoop --password xx
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
14/01/30 12:12:20 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.5.0
14/01/30 12:12:20 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/01/30 12:12:20 INFO manager.SqlManager: Using default fetchSize of 1000
14/01/30 12:12:21 INFO manager.OracleManager: Time zone has been set to GMT
TEST

Any thoughts?

 

Thanks,

BC

2 REPLIES

Re: Cannot initialize cluster

Master Collaborator

I'm not positive here, but it looks like you might need to deploy the MapReduce client configs to the machine where you're running this import command, because, if I read the exception correctly, it's having trouble connecting to your JobTracker/MR service. The reason your "list-tables" works is most likely that it does not need to leverage MR. Try putting your mapred-site.xml file in /etc/hadoop/conf if you're not using Cloudera Manager.
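
For reference, a minimal sketch of the kind of mapred-site.xml you'd drop into /etc/hadoop/conf on the client. This assumes an MRv1 (JobTracker) setup as shipped with CDH4; the hostname and port are placeholders for your own JobTracker, and on a YARN cluster you'd set mapreduce.framework.name to yarn instead:

<?xml version="1.0"?>
<configuration>
  <!-- Which MapReduce framework clients submit to: "classic" = MRv1 JobTracker -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>classic</value>
  </property>
  <!-- Placeholder JobTracker address; replace with your own host:port -->
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtracker.example.com:8021</value>
  </property>
</configuration>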


Re: Cannot initialize cluster

Explorer
Thanks for your input, Clint.

I ended up resolving the problem by (resulting setup sketched below):
Pointing sqoop-env at hadoop-0.20-mapreduce
Capitalizing both the owner and table name: SQOOP.TEST
Adding -m 1 at the end to make up for the lack of a primary key
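
Putting those changes together, a rough sketch of the fix (this assumes the sqoop-env change meant setting HADOOP_MAPRED_HOME to the MRv1 client install at an assumed path; the connection details are reused from the original post):

# In sqoop-env: point Sqoop at the MRv1 client libraries (assumed path)
export HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce

# Re-run the import with the uppercase owner/table name and a single mapper
$ sqoop import --connect jdbc:oracle:thin:@localhost:1521/DB11G --username sqoop --password xx --table SQOOP.TEST -m 1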

Thanks,
BC