Sqoop import failing with com.teradata.connector.common.exception.ConnectorException

Contributor

I'm trying to do a Sqoop import from Teradata into the local FS, and I get the following error.

I tried using the following import statements:

import -fs file://// --connect

import -fs local --connect
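Spelled out fully, the command looks roughly like this (a sketch; the host, credentials, and target directory are placeholders, and file:/// is the non-deprecated spelling the log below suggests):

sqoop import \
  -fs file:/// \
  --connect jdbc:teradata://<teradata-host>/database=<db> \
  --username <user> -P \
  --table PDCR_INFO_LOG \
  --target-dir /tmp/pdcr_info_log

With either form of -fs, the run fails as follows: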

Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail. Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/02/23 14:24:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950
16/02/23 14:24:19 WARN fs.FileSystem: "local" is a deprecated filesystem name. Use "file:///" instead.
16/02/23 14:24:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/02/23 14:24:19 INFO manager.SqlManager: Using default fetchSize of 1000
16/02/23 14:24:19 INFO tool.CodeGenTool: The connection manager declares that it self manages mapping between records & fields and rows & columns. No class will will be generated.
16/02/23 14:24:19 INFO teradata.TeradataConnManager: Importing from Teradata Table:PDCR_INFO_LOG
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting input file format in TeradataConfiguration to textfile
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Table name to import PDCR_INFO_LOG
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting job type in TeradataConfiguration to hdfs
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting input file format in TeradataConfiguration to textfile
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting number of mappers in TeradataConfiguration to 4
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting input batch size in TeradataConfiguration to 1000
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting input separator in TeradataConfiguration to \u0021
16/02/23 14:24:19 INFO teradata.TeradataSqoopImportHelper: Setting source table to : PDCR_INFO_LOG
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/02/23 14:24:19 INFO common.ConnectorPlugin: load plugins in jar:file:/usr/hdp/2.3.2.0-2950/sqoop/lib/teradata-connector-1.4.1-hadoop2.jar!/teradata.connector.plugins.xml
16/02/23 14:24:19 INFO processor.TeradataInputProcessor: input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at: 1456259059883
16/02/23 14:24:20 INFO utils.TeradataUtils: the input database product is Teradata
16/02/23 14:24:20 INFO utils.TeradataUtils: the input database version is 14.10
16/02/23 14:24:20 INFO utils.TeradataUtils: the jdbc driver version is 15.0
16/02/23 14:24:22 INFO processor.TeradataInputProcessor: the teradata connector for hadoop version is: 1.4.1
16/02/23 14:24:22 INFO processor.TeradataInputProcessor: input jdbc properties are jdbc:teradata://172.19.7.22/database=BIGDATA_POC_WORK_TABLES
16/02/23 14:24:23 INFO processor.TeradataInputProcessor: the number of mappers are 4
16/02/23 14:24:23 INFO processor.TeradataInputProcessor: input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at: 1456259063699
16/02/23 14:24:23 INFO processor.TeradataInputProcessor: the total elapsed time of input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 3s
16/02/23 14:24:24 INFO impl.TimelineClientImpl: Timeline service address: http://xxxx.xxxx.com:8188/ws/v1/timeline/
16/02/23 14:24:24 INFO client.RMProxy: Connecting to ResourceManager at xxx.xxxx.com/172.19.26.26:8050
16/02/23 14:24:24 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at: 1456259064288
16/02/23 14:24:24 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at: 1456259064288
16/02/23 14:24:24 INFO processor.TeradataInputProcessor: the total elapsed time of input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 0s
16/02/23 14:24:24 ERROR teradata.TeradataSqoopImportHelper: Exception running Teradata import job
com.teradata.connector.common.exception.ConnectorException: java.io.FileNotFoundException: File file:/hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz does not exist
    at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:609)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:822)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:599)
    at org.apache.hadoop.fs.DelegateToFileSystem.getFileStatus(DelegateToFileSystem.java:125)
    at org.apache.hadoop.fs.AbstractFileSystem.resolvePath(AbstractFileSystem.java:467)
    at org.apache.hadoop.fs.FilterFs.resolvePath(FilterFs.java:157)
    at org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2193)
    at org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2189)
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
    at org.apache.hadoop.fs.FileContext.resolve(FileContext.java:2189)
    at org.apache.hadoop.fs.FileContext.resolvePath(FileContext.java:601)
    at org.apache.hadoop.mapreduce.JobSubmitter.addMRFrameworkToDistributedCache(JobSubmitter.java:457)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:142)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:134)
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:56)
    at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
    at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:140)
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:56)
    at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
    at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
16/02/23 14:24:24 INFO teradata.TeradataSqoopImportHelper: Teradata import job completed with exit code 1
16/02/23 14:24:24 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Exception running Teradata import job
    at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:373)
    at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
Caused by: com.teradata.connector.common.exception.ConnectorException: java.io.FileNotFoundException: File file:/hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz does not exist
    at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:609)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:822)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:599)
    at org.apache.hadoop.fs.DelegateToFileSystem.getFileStatus(DelegateToFileSystem.java:125)
    at org.apache.hadoop.fs.AbstractFileSystem.resolvePath(AbstractFileSystem.java:467)
    at org.apache.hadoop.fs.FilterFs.resolvePath(FilterFs.java:157)
    at org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2193)
    at org.apache.hadoop.fs.FileContext$25.next(FileContext.java:2189)
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
    at org.apache.hadoop.fs.FileContext.resolve(FileContext.java:2189)
    at org.apache.hadoop.fs.FileContext.resolvePath(FileContext.java:601)
    at org.apache.hadoop.mapreduce.JobSubmitter.addMRFrameworkToDistributedCache(JobSubmitter.java:457)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:142)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:134)
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:56)
    at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
    at org.apache.sqoop.teradata.TeradataConnManager.importTable(TeradataConnManager.java:504)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:140)
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(ConnectorJobRunner.java:56)
    at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(TeradataSqoopImportHelper.java:370)
    ... 9 more

1 ACCEPTED SOLUTION

Master Mentor
@Kiran Bollam

See this https://community.hortonworks.com/articles/6161/hdfs-to-teradata-example.html

In this case the key line is:

"com.teradata.connector.common.exception.ConnectorException: java.io.FileNotFoundException: File file:/hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz does not exist"

You can locate mapreduce.tar.gz on your server and copy it to /hdp/apps/2.3.2.0-2950/mapreduce/ in HDFS:

find / -name mapreduce.tar.gz

hdfs dfs -put mapreduce.tar.gz /hdp/apps/2.3.2.0-2950/mapreduce/

For example, in my case it's under:

/usr/hdp/2.3.4.0-3485/hadoop/mapreduce.tar.gz

[hdfs@phdns01 ~]$ hdfs dfs -ls /hdp/apps/2*/mapreduce

Found 2 items

-r--r--r-- 1 hdfs hadoop 105896 2016-02-14 19:41 /hdp/apps/2.3.4.0-3485/mapreduce/hadoop-streaming.jar

-r--r--r-- 1 hdfs hadoop 210214446 2016-02-14 19:40 /hdp/apps/2.3.4.0-3485/mapreduce/mapreduce.tar.gz
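If you do copy the tarball up yourself, it is also worth matching the ownership and permissions shown in that listing. A minimal sketch, assuming the 2.3.2.0-2950 stack paths from the question and that you run it as the hdfs user:

# upload from the local HDP install (wherever find located it)
hdfs dfs -put /usr/hdp/2.3.2.0-2950/hadoop/mapreduce.tar.gz /hdp/apps/2.3.2.0-2950/mapreduce/
# match the hdfs:hadoop ownership and read-only mode from the listing
hdfs dfs -chown hdfs:hadoop /hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz
hdfs dfs -chmod 444 /hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz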


16 REPLIES

Master Guru

Does this file exist in HDFS? Is it possible that no MapReduce application can run at all?

hdfs:///hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz

If it doesn't, your cluster might have a problem. You can normally find the same file in the local installation as well:

/usr/hdp/2.3.2.0-2950/hadoop/mapreduce.tar.gz

Putting it into HDFS at the required location might help; however, it's likely that other files are not in the right place either (Tez libs, etc.).
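One way to check both halves (a sketch; the config path assumes a standard HDP layout, and the property is the one read by the addMRFrameworkToDistributedCache frame in the stack trace):

# where does job submission expect the MR framework tarball?
grep -A1 'mapreduce.application.framework.path' /etc/hadoop/conf/mapred-site.xml
# is the tarball actually at that location in HDFS?
hdfs dfs -ls /hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz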

Contributor

Yes Ben, the file exists in HDFS.

Do you have any working example of an options file for importing a table into the local file system?

Thanks

Kiran

Master Guru

AAAAH, I see below that you want to write to the local filesystem. Why would you do that? No, this will not work.

If you want to unload to the local filesystem, Teradata provides client and unload utilities for that. If you use Sqoop, you want to use MapReduce to store data in HDFS. That is in fact why you get the FileNotFoundException above: with -fs pointing at the local filesystem, job submission resolves the MapReduce framework path /hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz against file:/ instead of HDFS.

So no, this will not work.

Contributor
@Benjamin

When I looked at the Sqoop parameters, they said the data can be downloaded to the local FS; that's why I was trying it out.

Master Mentor

@Kiran Bollam

Here is an example using the Teradata connector: link


Contributor

Neeraj,

The mapreduce.tar.gz file is present in HDFS under the /hdp/apps folder. I get the error shown above when I try to import the table from Teradata to the local filesystem; the same import runs fine from the table to HDFS. Do you have any sample for importing from Teradata to the local filesystem?

Thanks

Master Mentor
@Kiran Bollam

Why do you want to export into the local FS?

I doubt that the local FS will work. If you want to get the data from HDFS, then follow this
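In other words, the usual pattern is to let Sqoop land the data in HDFS and then copy it down. A minimal sketch, with placeholder HDFS and local paths:

# after the Sqoop import has written to HDFS
hdfs dfs -get /tmp/pdcr_info_log /some/local/dir/
# or merge the part files into a single local file
hdfs dfs -getmerge /tmp/pdcr_info_log /some/local/dir/pdcr_info_log.txt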

Contributor

@Neeraj,

Instead of setting up a separate Teradata unload utility (e.g., FastExport), we would like to leverage the existing Sqoop setup. The Apache Sqoop site mentions that the local FS can be used as the import target, but it is not working.

Apache Sqoop

We don't want to duplicate the data between HDFS and the local FS.

Thanks

Kiran