<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155235#M20680</link>
    <description>&lt;P&gt;@Geoffrey, here is the code. I'm using an options file to run Sqoop.&lt;/P&gt;&lt;P&gt;import
-fs
file:////home/kbollam/temp/
--connect
jdbc:teradata://172.19.7.22/database=BIGDATA_POC 
--connection-manager
org.apache.sqoop.teradata.TeradataConnManager
--username
xxxxx
--password
xxxxx
--fields-terminated-by
'\0x021'
--lines-terminated-by
'\n'
--table
tableName&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
    <pubDate>Thu, 25 Feb 2016 06:34:50 GMT</pubDate>
    <dc:creator>KBOLLAM</dc:creator>
    <dc:date>2016-02-25T06:34:50Z</dc:date>
    <item>
      <title>Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155225#M20670</link>
      <description>&lt;P&gt;I'm trying to do a Sqoop import into the local FS from Teradata and I get the following error.&lt;/P&gt;&lt;P&gt;I tried using the following import statement:&lt;/P&gt;&lt;P&gt;import
-fs
file:////
--connect&lt;/P&gt;&lt;P&gt;import -fs local --connect&lt;/P&gt;&lt;P&gt;Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/02/23 14:24:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950
16/02/23 14:24:19 WARN fs.FileSystem: "local" is a deprecated filesystem name. Use "file:///" instead.
16/02/23 14:24:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/02/23 14:24:19 INFO manager.SqlManager: Using default fetchSize of 1000
16/02/23 14:24:19 INFO tool.CodeGenTool: The connection manager declares that it self manages mapping between records &amp;amp; fields and rows &amp;amp; columns.  No class will will be generated.
16/02/23 14:24:19 INFO teradata.TeradataConnManager: Importing from Teradata Table:PDCR_INFO_LOG
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting input file format in TeradataConfiguration to textfile
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Table name to import PDCR_INFO_LOG
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting job type in TeradataConfiguration to hdfs
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting input file format in TeradataConfiguration to textfile
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting number of mappers in TeradataConfiguration to 4
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting input batch size in TeradataConfiguration to 1000
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting input separator in TeradataConfiguration to \u0021
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting source table  to : PDCR_INFO_LOG
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/02/23 14:24:19 INFO common.ConnectorPlugin: load plugins in jar:file:/usr/hdp/2.3.2.0-2950/sqoop/lib/teradata-connector-1.4.1-hadoop2.jar!/teradata.connector.plugins.xml
16/02/23 14:24:19 INFO processor.TeradataInputProcessor: input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at:  1456259059883
16/02/23 14:24:20 INFO utils.TeradataUtils: the input database product is Teradata
16/02/23 14:24:20 INFO utils.TeradataUtils: the input database version is 14.10
16/02/23 14:24:20 INFO utils.TeradataUtils: the jdbc driver version is 15.0
16/02/23 14:24:22 INFO processor.TeradataInputProcessor: the teradata connector for hadoop version is: 1.4.1
16/02/23 14:24:22 INFO processor.TeradataInputProcessor: input jdbc properties are jdbc:teradata://172.19.7.22/database=BIGDATA_POC_WORK_TABLES
16/02/23 14:24:23 INFO processor.TeradataInputProcessor: the number of mappers are 4
16/02/23 14:24:23 INFO processor.TeradataInputProcessor: input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at:  1456259063699
16/02/23 14:24:23 INFO processor.TeradataInputProcessor: the total elapsed time of input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 3s
16/02/23 14:24:24 INFO impl.TimelineClientImpl: Timeline service address: &lt;A href="http://xxxx.xxxx.com:8188/ws/v1/timeline/" target="_blank"&gt;http://xxxx.xxxx.com:8188/ws/v1/timeline/&lt;/A&gt;
16/02/23 14:24:24 INFO client.RMProxy: Connecting to ResourceManager at xxx.xxxx.com/172.19.26.26:8050
16/02/23 14:24:24 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at:  1456259064288
16/02/23 14:24:24 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at:  1456259064288
16/02/23 14:24:24 INFO processor.TeradataInputProcessor: the total elapsed time of input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 0s
16/02/23 14:24:24 ERROR teradata.TeradataSqoopImportHelper: Exception running Teradata import job
com.teradata.connector.common.exception.ConnectorException: java.io.FileNotFoundException: File file:/hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:609)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:822)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:599)
        at org.apache.hadoop.fs.DelegateToFileSystem.getFileStatus(DelegateToFileSystem.java:125)
        at org.apache.hadoop.fs.AbstractFileSystem.resolvePath(AbstractFileSystem.java:467)
        at org.apache.hadoop.fs.FilterFs.resolvePath(FilterFs.java:157)
        at org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2193)
        at org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2189)
        at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
        at org.apache.hadoop.fs.FileContext.resolve(FileContext.java:2189)
        at org.apache.hadoop.fs.FileContext.resolvePath(FileContext.java:601)
        at org.apache.hadoop.mapreduce.JobSubmitter.addMRFrameworkToDistributedCache(JobSubmitter.java:457)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:142)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:134)
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:56)
        at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
        at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:140)
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:56)
        at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
        at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
16/02/23 14:24:24 INFO teradata.TeradataSqoopImportHelper: Teradata import job completed with exit code 1
16/02/23 14:24:24 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Exception running Teradata import job
        at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:373)
        at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
Caused by: com.teradata.connector.common.exception.ConnectorException: java.io.FileNotFoundException: File file:/hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz does not exist
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:609)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:822)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:599)
        at org.apache.hadoop.fs.DelegateToFileSystem.getFileStatus(DelegateToFileSystem.java:125)
        at org.apache.hadoop.fs.AbstractFileSystem.resolvePath(AbstractFileSystem.java:467)
        at org.apache.hadoop.fs.FilterFs.resolvePath(FilterFs.java:157)
        at org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2193)
        at org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2189)
        at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
        at org.apache.hadoop.fs.FileContext.resolve(FileContext.java:2189)
        at org.apache.hadoop.fs.FileContext.resolvePath(FileContext.java:601)
        at org.apache.hadoop.mapreduce.JobSubmitter.addMRFrameworkToDistributedCache(JobSubmitter.java:457)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:142)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:134)
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:56)
        at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
        at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:140)
        at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:56)
        at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
        ... 9 more&lt;/P&gt;</description>
      <pubDate>Wed, 24 Feb 2016 04:41:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155225#M20670</guid>
      <dc:creator>KBOLLAM</dc:creator>
      <dc:date>2016-02-24T04:41:23Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155226#M20671</link>
      <description>&lt;P&gt;Does this file exist in HDFS? Is it possible that no MapReduce application is running?&lt;/P&gt;&lt;P&gt;hdfs://hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz&lt;/P&gt;&lt;P&gt;If it doesn't, your cluster might have a problem. You can normally find the same file in the local installation as well:&lt;/P&gt;&lt;P&gt;/usr/hdp/2.3.2.0-2950/hadoop/mapreduce.tar.gz&lt;/P&gt;&lt;P&gt;Putting it into HDFS at the required location might help; however, it's likely that other files are also not located correctly (Tez libs, etc.).&lt;/P&gt;</description>
      <pubDate>Wed, 24 Feb 2016 05:29:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155226#M20671</guid>
      <dc:creator>bleonhardi</dc:creator>
      <dc:date>2016-02-24T05:29:19Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155227#M20672</link>
      <description>&lt;P&gt;Yes Ben, the file exists in HDFS.&lt;/P&gt;&lt;P&gt;Do you have a working example of an options file for importing a table into the local file system?&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;P&gt;Kiran&lt;/P&gt;</description>
      <pubDate>Wed, 24 Feb 2016 05:37:58 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155227#M20672</guid>
      <dc:creator>KBOLLAM</dc:creator>
      <dc:date>2016-02-24T05:37:58Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155228#M20673</link>
      <description>&lt;P&gt;@&lt;A href="https://community.hortonworks.com/users/693/kbollam.html"&gt;Kiran Bollam&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Here is an example using the Teradata connector: &lt;A target="_blank" href="https://www-01.ibm.com/support/knowledgecenter/SSPT3X_4.0.0/com.ibm.swg.im.infosphere.biginsights.import.doc/doc/data_warehouse_teradata-import.html"&gt;link&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 24 Feb 2016 06:38:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155228#M20673</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2016-02-24T06:38:21Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155229#M20674</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/693/kbollam.html" nodeid="693"&gt;@Kiran Bollam&lt;/A&gt;&lt;P&gt; See this &lt;A href="https://community.hortonworks.com/articles/6161/hdfs-to-teradata-example.html" target="_blank"&gt;https://community.hortonworks.com/articles/6161/hdfs-to-teradata-example.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;In this case &lt;/P&gt;&lt;P&gt;"com.teradata.connector.common.exception.ConnectorException: java.io.FileNotFoundException: File file:/hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz does not exist at"&lt;/P&gt;&lt;P&gt;You can locate mapreduce.tar.gz in your server and copy to /hdp/apps/2.3.2.0-2950/mapreduce/&lt;/P&gt;&lt;P&gt;find / -name mapreduce.tar.gz&lt;/P&gt;&lt;P&gt;hdfs dfs -put mapreduce.tar.gz /hdp/apps/2.3.2.0-2950/mapreduce/&lt;/P&gt;&lt;P&gt;For example:&lt;/P&gt;&lt;P&gt;In my case its under &lt;/P&gt;&lt;P&gt;/usr/hdp/2.3.4.0-3485/hadoop/mapreduce.tar.gz&lt;/P&gt;&lt;P&gt;[hdfs@phdns01 ~]$ hdfs dfs -ls /hdp/apps/2*/mapreduce&lt;/P&gt;&lt;P&gt;Found 2 items&lt;/P&gt;&lt;P&gt;-r--r--r--   1 hdfs hadoop     105896 2016-02-14 19:41 /hdp/apps/2.3.4.0-3485/mapreduce/hadoop-streaming.jar&lt;/P&gt;&lt;P&gt;-r--r--r--   1 hdfs hadoop  210214446 2016-02-14 19:40 /hdp/apps/2.3.4.0-3485/mapreduce/mapreduce.tar.gz&lt;/P&gt;</description>
      <pubDate>Wed, 24 Feb 2016 07:18:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155229#M20674</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-02-24T07:18:10Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155230#M20675</link>
      <description>&lt;P&gt;Neeraj,&lt;/P&gt;&lt;P&gt;The mapreduce.tar.gz file is present in HDFS under the /hdp/apps folder. I get the error shown above when I try to import the table from Teradata to the local filesystem. The import runs fine from the table to HDFS. Do you have any sample for importing from Teradata to the local filesystem?&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 24 Feb 2016 12:16:37 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155230#M20675</guid>
      <dc:creator>KBOLLAM</dc:creator>
      <dc:date>2016-02-24T12:16:37Z</dc:date>
    </item>
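A minimal sketch of an options file for the Teradata-to-HDFS import described above as working. The host, credentials path, and table names here are hypothetical placeholders, not values from this thread; the local-FS variant the poster attempts is what fails.

```text
# Hypothetical Sqoop options file (one argument per line), run as:
#   sqoop --options-file td_import.opts
import
--connect
jdbc:teradata://<teradata-host>/database=MY_DB
--connection-manager
org.apache.sqoop.teradata.TeradataConnManager
--username
myuser
--password-file
/user/myuser/.teradata.password
--table
MY_TABLE
--target-dir
/user/myuser/MY_TABLE
--fields-terminated-by
'\t'
```

With --target-dir the result lands in HDFS, which this thread confirms works; swapping in an -fs file:///... destination is what triggers the failure reported above.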
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155231#M20676</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/693/kbollam.html" nodeid="693"&gt;@Kiran Bollam&lt;/A&gt;&lt;P&gt; Why do you want to export into localFS? &lt;/P&gt;&lt;P&gt;I doubt if local FS will work. If you want to get data from HDFS then follow &lt;A target="_blank" href="http://stackoverflow.com/questions/17837871/how-to-copy-file-from-hdfs-to-the-local-file-system"&gt;this&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 24 Feb 2016 12:39:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155231#M20676</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-02-24T12:39:03Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155232#M20677</link>
      <description>&lt;P&gt;Ah, I see below that you want to write to the local filesystem. Why would you do that? No, this will not work.&lt;/P&gt;&lt;P&gt;If you want to unload to the local filesystem, Teradata provides client and unload utilities for that. If you use Sqoop, you are using MapReduce to store data in HDFS.&lt;/P&gt;&lt;P&gt;So no, this will not work.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Feb 2016 17:23:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155232#M20677</guid>
      <dc:creator>bleonhardi</dc:creator>
      <dc:date>2016-02-24T17:23:02Z</dc:date>
    </item>
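The two-step approach this reply points toward (Sqoop to HDFS, then copy down) can be sketched as shell commands. These require a working Hadoop client and cluster; the host, database, user, and path names are hypothetical placeholders.

```
# Sketch: import to HDFS first, then pull the result down to the
# local filesystem. Names and paths are hypothetical.
sqoop import \
  --connect jdbc:teradata://<teradata-host>/database=MY_DB \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username myuser -P \
  --table MY_TABLE \
  --target-dir /user/myuser/MY_TABLE

# Copy the imported files from HDFS to a local directory, then
# optionally remove the HDFS copy to avoid duplicating the data.
hdfs dfs -get /user/myuser/MY_TABLE /home/myuser/MY_TABLE
hdfs dfs -rm -r /user/myuser/MY_TABLE
```

This keeps the MapReduce job writing to HDFS (which the connector expects) and uses a plain filesystem copy for the local-FS requirement.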
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155233#M20678</link>
      <description>&lt;P&gt;&lt;A href="https://community.hortonworks.com/users/693/kbollam.html"&gt;@Kiran Bollam&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Can you paste your code in here?&lt;/P&gt;</description>
      <pubDate>Thu, 25 Feb 2016 01:32:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155233#M20678</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2016-02-25T01:32:56Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155234#M20679</link>
      <description>&lt;P&gt;@Neeraj,&lt;/P&gt;&lt;P&gt;Instead of setting up fastimport, we would like to leverage the existing Sqoop setup. The Apache site mentions we can use the "local FS" option to import to the local filesystem, but it is not working: &lt;A href="https://sqoop.apache.org/docs/1.4.1-incubating/SqoopUserGuide.html"&gt;Apache sqoop&lt;/A&gt;&lt;/P&gt;&lt;P&gt;We don't want to duplicate the data in HDFS and the local FS.&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;P&gt;Kiran&lt;/P&gt;</description>
      <pubDate>Thu, 25 Feb 2016 02:17:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155234#M20679</guid>
      <dc:creator>KBOLLAM</dc:creator>
      <dc:date>2016-02-25T02:17:18Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155235#M20680</link>
      <description>&lt;P&gt;@Geoffrey, here is the code. I'm using an options file to run Sqoop.&lt;/P&gt;&lt;P&gt;import
-fs
file:////home/kbollam/temp/
--connect
jdbc:teradata://172.19.7.22/database=BIGDATA_POC 
--connection-manager
org.apache.sqoop.teradata.TeradataConnManager
--username
xxxxx
--password
xxxxx
--fields-terminated-by
'\0x021'
--lines-terminated-by
'\n'
--table
tableName&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Thu, 25 Feb 2016 06:34:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155235#M20680</guid>
      <dc:creator>KBOLLAM</dc:creator>
      <dc:date>2016-02-25T06:34:50Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155236#M20681</link>
      <description>&lt;P&gt;@Benjamin, when I looked at the Sqoop parameters, it said data can be downloaded to the local FS; that's why I was trying it out.&lt;/P&gt;</description>
      <pubDate>Thu, 25 Feb 2016 06:36:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155236#M20681</guid>
      <dc:creator>KBOLLAM</dc:creator>
      <dc:date>2016-02-25T06:36:14Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155237#M20682</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/693/kbollam.html" nodeid="693"&gt;@Kiran Bollam&lt;/A&gt;  I searched for local and localFS in that doc...There is no reference to local FS&lt;/P&gt;&lt;P&gt;Are you referring to this?&lt;/P&gt;&lt;PRE&gt;-fs &amp;lt;local|namenode:port&amp;gt;      specify a namenode
&lt;/PRE&gt;</description>
      <pubDate>Thu, 25 Feb 2016 10:01:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155237#M20682</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-02-25T10:01:26Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155238#M20683</link>
      <description>&lt;P&gt;Yes, &lt;A rel="user" href="https://community.cloudera.com/users/140/nsabharwal.html" nodeid="140"&gt;@Neeraj Sabharwal&lt;/A&gt;, that's what I was thinking. I also found some examples on Stack Overflow; that's why it was misleading.&lt;/P&gt;&lt;P&gt;Here is the link:&lt;/P&gt;&lt;P&gt;&lt;A href="http://stackoverflow.com/questions/27024502/how-to-import-data-using-sqoop-from-rdbms-into-local-file-system-not-hdfs"&gt;import to local&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 25 Feb 2016 23:39:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155238#M20683</guid>
      <dc:creator>KBOLLAM</dc:creator>
      <dc:date>2016-02-25T23:39:02Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155239#M20684</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/693/kbollam.html" nodeid="693"&gt;@Kiran Bollam&lt;/A&gt; Ok. Please do close the thread by accepting one of the answers&lt;/P&gt;</description>
      <pubDate>Fri, 26 Feb 2016 01:11:10 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155239#M20684</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-02-26T01:11:10Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155240#M20685</link>
      <description>&lt;P&gt;&lt;A href="https://community.hortonworks.com/users/140/nsabharwal.html"&gt;@Neeraj Sabharwal&lt;/A&gt; How do I close the thread? I mean, how do I accept the answer?&lt;/P&gt;</description>
      <pubDate>Fri, 26 Feb 2016 01:37:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155240#M20685</guid>
      <dc:creator>KBOLLAM</dc:creator>
      <dc:date>2016-02-26T01:37:08Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop Import failing for   com.teradata.connector.common.exception.ConnectorException</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155241#M20686</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/693/kbollam.html" nodeid="693"&gt;@Kiran Bollam&lt;/A&gt; You can see Accept button in the reply. You can click accept on the best answer &lt;/P&gt;</description>
      <pubDate>Fri, 26 Feb 2016 01:48:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-Import-failing-for-com-teradata-connector-common/m-p/155241#M20686</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-02-26T01:48:17Z</dc:date>
    </item>
  </channel>
</rss>

