<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: sqoop import issue in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91757#M45823</link>
    <description>Thank you Harsh</description>
    <pubDate>Wed, 19 Jun 2019 08:00:47 GMT</pubDate>
    <dc:creator>andreas</dc:creator>
    <dc:date>2019-06-19T08:00:47Z</dc:date>
    <item>
      <title>sqoop import issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91719#M45819</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I am trying to import a single table with Sqoop and I get this error:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Warning: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.&lt;BR /&gt;Please set $ACCUMULO_HOME to the root of your Accumulo installation.&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank" rel="noopener"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]&lt;BR /&gt;19/06/18 16:25:25 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7-cdh6.2.0&lt;BR /&gt;19/06/18 16:25:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.&lt;BR /&gt;19/06/18 16:25:26 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. 
You can override&lt;BR /&gt;19/06/18 16:25:26 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.&lt;BR /&gt;19/06/18 16:25:26 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop will not process this sqoop connection, as an insufficient number of mappers are being used.&lt;BR /&gt;19/06/18 16:25:26 INFO manager.SqlManager: Using default fetchSize of 1000&lt;BR /&gt;19/06/18 16:25:26 INFO tool.CodeGenTool: Beginning code generation&lt;BR /&gt;19/06/18 16:25:26 INFO tool.CodeGenTool: Will generate java class as codegen_WORKFLOW&lt;BR /&gt;19/06/18 16:25:27 INFO manager.OracleManager: Time zone has been set to GMT&lt;BR /&gt;19/06/18 16:25:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM WORKFLOW t WHERE 1=0&lt;BR /&gt;19/06/18 16:25:27 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce&lt;BR /&gt;19/06/18 16:25:29 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-cloudera/compile/e8c2761367830b3f0e903699f598700b/codegen_WORKFLOW.java to /home/cloudera/./codegen_WORKFLOW.java. Error: Destination '/home/cloudera/./codegen_WORKFLOW.java' already exists&lt;BR /&gt;19/06/18 16:25:29 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/e8c2761367830b3f0e903699f598700b/codegen_WORKFLOW.jar&lt;BR /&gt;19/06/18 16:25:29 INFO manager.OracleManager: Time zone has been set to GMT&lt;BR /&gt;19/06/18 16:25:29 WARN manager.OracleManager: The table WORKFLOW contains a multi-column primary key. Sqoop will default to the column IDWORKFLOW only for this job.&lt;BR /&gt;19/06/18 16:25:29 INFO manager.OracleManager: Time zone has been set to GMT&lt;BR /&gt;19/06/18 16:25:29 WARN manager.OracleManager: The table WORKFLOW contains a multi-column primary key. 
Sqoop will default to the column IDWORKFLOW only for this job.&lt;BR /&gt;19/06/18 16:25:29 INFO mapreduce.ImportJobBase: Beginning import of WORKFLOW&lt;BR /&gt;19/06/18 16:25:29 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar&lt;BR /&gt;19/06/18 16:25:29 INFO manager.OracleManager: Time zone has been set to GMT&lt;BR /&gt;19/06/18 16:25:30 INFO manager.OracleManager: Time zone has been set to GMT&lt;BR /&gt;19/06/18 16:25:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM WORKFLOW t WHERE 1=0&lt;BR /&gt;19/06/18 16:25:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM WORKFLOW t WHERE 1=0&lt;BR /&gt;19/06/18 16:25:30 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps&lt;BR /&gt;19/06/18 16:25:30 INFO client.RMProxy: Connecting to ResourceManager at clouderasrv/172.23.16.226:8032&lt;BR /&gt;19/06/18 16:25:31 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/cloudera/.staging/job_1560863992639_0002&lt;BR /&gt;19/06/18 16:26:23 INFO db.DBInputFormat: Using read commited transaction isolation&lt;BR /&gt;19/06/18 16:26:24 INFO mapreduce.JobSubmitter: number of splits:1&lt;BR /&gt;19/06/18 16:26:24 INFO Configuration.deprecation: yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. 
Instead, use yarn.system-metrics-publisher.enabled&lt;BR /&gt;19/06/18 16:26:25 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1560863992639_0002&lt;BR /&gt;19/06/18 16:26:25 INFO mapreduce.JobSubmitter: Executing with tokens: []&lt;BR /&gt;19/06/18 16:26:25 INFO conf.Configuration: resource-types.xml not found&lt;BR /&gt;19/06/18 16:26:25 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.&lt;BR /&gt;19/06/18 16:26:26 INFO impl.YarnClientImpl: Submitted application application_1560863992639_0002&lt;BR /&gt;19/06/18 16:26:26 INFO mapreduce.Job: The url to track the job: &lt;A href="http://clouderasrv:8088/proxy/application_1560863992639_0002/" target="_blank"&gt;http://clouderasrv:8088/proxy/application_1560863992639_0002/&lt;/A&gt;&lt;BR /&gt;19/06/18 16:26:26 INFO mapreduce.Job: Running job: job_1560863992639_0002&lt;BR /&gt;19/06/18 16:26:36 INFO mapreduce.Job: Job job_1560863992639_0002 running in uber mode : false&lt;BR /&gt;19/06/18 16:26:36 INFO mapreduce.Job: map 100% reduce 0%&lt;BR /&gt;19/06/18 16:26:37 INFO mapreduce.Job: Job job_1560863992639_0002 failed with state KILLED due to: The required MAP capability is more than the supported max container capability in the cluster. Killing the Job. mapResourceRequest: &amp;lt;memory:2560, vCores:1&amp;gt; maxContainerCapability:&amp;lt;memory:2048, vCores:4&amp;gt;&lt;BR /&gt;Job received Kill while in RUNNING state.&lt;/P&gt;
&lt;P&gt;19/06/18 16:26:37 INFO mapreduce.Job: Counters: 3&lt;BR /&gt;Job Counters&lt;BR /&gt;Killed map tasks=1&lt;BR /&gt;Total time spent by all maps in occupied slots (ms)=0&lt;BR /&gt;Total time spent by all reduces in occupied slots (ms)=0&lt;BR /&gt;19/06/18 16:26:37 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead&lt;BR /&gt;19/06/18 16:26:37 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 66.9005 seconds (0 bytes/sec)&lt;BR /&gt;19/06/18 16:26:37 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead&lt;BR /&gt;19/06/18 16:26:37 INFO mapreduce.ImportJobBase: Retrieved 0 records.&lt;BR /&gt;19/06/18 16:26:37 ERROR tool.ImportTool: Import failed: Import job failed!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Any idea why this error is coming up?&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 14:27:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91719#M45819</guid>
      <dc:creator>andreas</dc:creator>
      <dc:date>2022-09-16T14:27:27Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91720#M45820</link>
      <description>Please share your full Sqoop CLI.&lt;BR /&gt;&lt;BR /&gt;The error you are receiving suggests that the configuration passed to this specific Sqoop job carried a parameter asking for more Map memory than the administrator has configured as the limit a Map task may request. As a result, the container request is rejected. Lowering the requested memory size of the map tasks will let the job pass this check.&lt;BR /&gt;</description>
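The check described in this reply can be sketched as a small shell snippet; the memory figures are taken directly from the failure message in the question (mapResourceRequest memory:2560 vs. maxContainerCapability memory:2048), and the variable names are illustrative, not actual YARN configuration keys.

```shell
# Re-create the scheduler check YARN applied to this job, using the
# numbers reported in the "failed with state KILLED" log line.
map_request_mb=2560      # memory requested per map container (from the log)
max_container_mb=2048    # cluster's max container capability (from the log)

if [ "$map_request_mb" -gt "$max_container_mb" ]; then
  verdict="KILLED"       # request exceeds the largest container YARN may grant
else
  verdict="SCHEDULED"
fi
echo "map request ${map_request_mb}MB vs max ${max_container_mb}MB -> $verdict"
```

Because 2560 is greater than 2048, the job is killed before any map task runs, which matches the "Transferred 0 bytes / Retrieved 0 records" counters in the log.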
      <pubDate>Tue, 18 Jun 2019 14:36:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91720#M45820</guid>
      <dc:creator>Harsh J</dc:creator>
      <dc:date>2019-06-18T14:36:59Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91721#M45821</link>
      <description>Can you please advise how to do this? Is this a parameter in Cloudera Manager or a parameter in the sqoop import command?&lt;BR /&gt;Thanks&lt;BR /&gt;</description>
      <pubDate>Tue, 18 Jun 2019 15:06:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91721#M45821</guid>
      <dc:creator>andreas</dc:creator>
      <dc:date>2019-06-18T15:06:59Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91755#M45822</link>
      <description>It could be passed in either mode, hence the request for the CLI used.&lt;BR /&gt;&lt;BR /&gt;The property to modify in the client configuration (via CM properties or via -D early CLI arguments) is 'mapreduce.map.memory.mb', and the administrative limit is defined in the ResourceManager daemon configuration via 'yarn.scheduler.maximum-allocation-mb'.&lt;BR /&gt;</description>
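A hypothetical sketch of passing the property this reply names as an early -D argument on the Sqoop CLI. The connection string, credentials, and table name are placeholders, not the poster's actual command; only 'mapreduce.map.memory.mb' comes from the reply, and 1024 MB is an example value chosen to sit below the cluster's 2048 MB container limit.

```shell
# Illustrative only: -D generic arguments must come immediately after the
# tool name ("import"), before any tool-specific options.
sqoop import \
  -D mapreduce.map.memory.mb=1024 \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table WORKFLOW \
  -m 1
```

Alternatively, lower the client-side default for all jobs via the Cloudera Manager gateway configuration, or raise 'yarn.scheduler.maximum-allocation-mb' on the ResourceManager if the cluster has the headroom.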
      <pubDate>Wed, 19 Jun 2019 06:28:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91755#M45822</guid>
      <dc:creator>Harsh J</dc:creator>
      <dc:date>2019-06-19T06:28:59Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import issue</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91757#M45823</link>
      <description>Thank you Harsh</description>
      <pubDate>Wed, 19 Jun 2019 08:00:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-issue/m-p/91757#M45823</guid>
      <dc:creator>andreas</dc:creator>
      <dc:date>2019-06-19T08:00:47Z</dc:date>
    </item>
  </channel>
</rss>

