Oozie Sqoop action throwing java.io.IOException: No columns to generate for ClassWriter

Expert Contributor

I'm trying to test Oozie's Sqoop action in the following environment:

  • HDP 2.3.2
  • Sqoop 1.4.6
  • Oozie 4.2.0

Via the command line, the following Sqoop command works:

sqoop import \
    -D mapred.task.timeout=0 \
    --connect "jdbc:sqlserver://x.x.x.x:1433;database=CEMHistorical" \
    --table MsgCallArrival \
    --username hadoop \
    --password-file hdfs:///user/sqoop/.adg.password \
    --hive-import \
    --create-hive-table \
    --hive-table develop.oozie \
    --split-by TimeStamp \
    --map-column-hive Call_ID=STRING,Stream_ID=STRING

But when I try to execute the same command via Oozie, I'm running into `java.io.IOException: No columns to generate for ClassWriter`.

Below are my `job.properties` and `workflow.xml`:

nameNode=hdfs://host.vitro.com:8020
jobTracker=host.vitro.com:8050
projectRoot=${nameNode}/user/${user.name}/tmp/sqoop-test/
oozie.use.system.libpath=true
oozie.wf.application.path=${projectRoot}



<workflow-app name="sqoop-test-wf" xmlns="uri:oozie:workflow:0.4">
    <start to="sqoop-import"/>

    <action name="sqoop-import" retry-max="10" retry-interval="1">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import -D mapred.task.timeout=0 --connect jdbc:sqlserver://x.x.x.x:1433;database=CEMHistorical --table MsgCallArrival --username hadoop --password-file hdfs:///user/sqoop/.adg.password --hive-import --create-hive-table --hive-table develop.oozie --split-by TimeStamp --map-column-hive Call_ID=STRING,Stream_ID=STRING</command>
        </sqoop>
        <ok to="end"/>
        <error to="errorcleanup"/>
    </action>
    <kill name="errorcleanup">
      <message>Sqoop Test WF failed. [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

I've attached the full log, but here's an excerpt:

2016-01-05 11:29:21,415 ERROR [main] tool.ImportTool (ImportTool.java:run(613)) - Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
    at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
    at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
    at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

I've been struggling with this problem for quite some time now; any help would be greatly appreciated!

1 ACCEPTED SOLUTION

Expert Contributor

At the same time that I was getting this issue, I was also dealing with a network problem when issuing Sqoop commands via the CLI. Once the network problem was resolved, this IOException went away, but I kept running into new errors that I never managed to resolve.

In the end, I decided to work around it by breaking the Hive import into a two-step workflow (sketched below):

  1. a Sqoop action to import into HDFS
  2. a Hive action to load the data from HDFS into Hive
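
For reference, here is roughly what that two-step workflow looked like. Treat it as a sketch only: the staging directory, the load.hql script name and contents, and the hive-site.xml job-xml are assumptions that would have to match your own environment.

<workflow-app name="sqoop-then-hive-wf" xmlns="uri:oozie:workflow:0.4">
    <start to="sqoop-import"/>

    <!-- Step 1: plain HDFS import into a staging directory (no --hive-import) -->
    <action name="sqoop-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import -D mapred.task.timeout=0 --connect jdbc:sqlserver://x.x.x.x:1433;database=CEMHistorical --table MsgCallArrival --username hadoop --password-file hdfs:///user/sqoop/.adg.password --target-dir /user/sqoop/staging/MsgCallArrival --split-by TimeStamp</command>
        </sqoop>
        <ok to="hive-load"/>
        <error to="errorcleanup"/>
    </action>

    <!-- Step 2: load the staged files into the Hive table -->
    <action name="hive-load">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <job-xml>hive-site.xml</job-xml>
            <!-- load.hql would contain something like:
                 LOAD DATA INPATH '/user/sqoop/staging/MsgCallArrival' INTO TABLE develop.oozie; -->
            <script>load.hql</script>
        </hive>
        <ok to="end"/>
        <error to="errorcleanup"/>
    </action>

    <kill name="errorcleanup">
        <message>Sqoop/Hive WF failed. [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>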

UPDATE:

It turns out that the "new errors" were because the "yarn" user didn't belong to the "hdfs" group and so couldn't perform the hive-import part. Adding that user to the group now lets me use hive-import in my workflows instead of the 2-step workflow I used before.
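
For anyone hitting the same permission issue, the fix on our nodes amounted to something like the commands below, assuming HDFS group mapping is backed by the local OS groups (the default shell-based mapping):

# Run as root on the relevant cluster nodes.
# Assumes HDFS resolves group membership from the local OS (default shell-based group mapping).
usermod -a -G hdfs yarn

# Verify that the yarn user is now in the hdfs group.
id yarn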


11 REPLIES

Master Mentor

@Luis Antonio Torres I have a couple of examples with Sqoop and Hive. Here they are. Here's an HCatalog example, so you can use either a Hive or Pig script to execute HCat commands, and here's a Sqoop command in a shell action, which can also be mixed with Hive actions (a rough sketch of the shell-action variant is shown below).
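
As an illustration only, a minimal shell action wrapping a Sqoop import might look like the snippet below; the sqoop-import.sh script name is an assumption, and the script would simply contain the same sqoop import command that already works from the CLI:

    <action name="sqoop-via-shell">
        <shell xmlns="uri:oozie:shell-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- sqoop-import.sh holds the same "sqoop import ..." command that works from the CLI -->
            <exec>sqoop-import.sh</exec>
            <file>${projectRoot}/sqoop-import.sh#sqoop-import.sh</file>
        </shell>
        <ok to="end"/>
        <error to="errorcleanup"/>
    </action>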

New Contributor

The Sqoop error "No columns to generate" can also occur if your Sqoop job is unable to determine which JDBC driver to use. Try adding --driver com.microsoft.sqlserver.jdbc.SQLServerDriver (the SQL Server JDBC driver class) along with the rest of your Sqoop import parameters. Hope this helps.
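
Applied to the workflow in the question, the <command> would then look something like this; it assumes the SQL Server JDBC driver jar (e.g. sqljdbc4.jar) has been made available to the action, for example in the workflow's lib/ directory or the Oozie Sqoop sharelib:

<command>import -D mapred.task.timeout=0 --connect jdbc:sqlserver://x.x.x.x:1433;database=CEMHistorical --driver com.microsoft.sqlserver.jdbc.SQLServerDriver --table MsgCallArrival --username hadoop --password-file hdfs:///user/sqoop/.adg.password --hive-import --create-hive-table --hive-table develop.oozie --split-by TimeStamp --map-column-hive Call_ID=STRING,Stream_ID=STRING</command>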