
Sqoop Export failed to load into Netezza


New Contributor

I am exporting data from HDFS to Netezza using Sqoop, with many jobs running in parallel. A few of them failed with the following error, causing the Sqoop job to fail. Can anyone please help me with this issue? I have taken care of all the permissions and ownership of the tables in Netezza.

I have found a link describing a similar issue, but it doesn't have any solution.

Version details:

Sqoop 1.4.6

Parameter file

#Indicates that direct mode should be used instead of JDBC mode
#Use batch mode for underlying statement execution.
#Connection parameter for the Netezza database
#Username for the Netezza database
#Password for the Netezza database; this file is available in HDFS.
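Judging by the comments, the masked options file presumably maps to Sqoop flags like the ones below (a sketch only; every value is a placeholder, not from the original post). In a Sqoop options file, each option and each argument goes on its own line:

```
--direct
--batch
--connect
jdbc:netezza://<nz-host>:5480/<database>
--username
<nz_user>
--password-file
/user/<hdfs_user>/.nz_password
```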

Sqoop command

sqoop export --options-file ${SQOOP_PARM_FILE} \
 --table ${TAR_TABLE} \
 --export-dir ${HDFS_EXP_DIR}/${TABLE} \
 --num-mappers ${mapper_no} \
 --verbose &> ${extended_log}

Error log:

18/03/13 18:08:34 INFO mapreduce.Job: Task Id : attempt_1517179256891_920980_m_000005_2, Status : FAILED
Error: java.lang.InterruptedException
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(
        at com.sun.proxy.$Proxy14.delete(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(
        at java.lang.reflect.Method.invoke(
        at com.sun.proxy.$Proxy15.delete(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.delete(
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(
        at org.apache.hadoop.hdfs.DistributedFileSystem.delete(
        at org.apache.sqoop.util.FileUploader.uploadFilesToDFS(
        at org.apache.hadoop.mapred.MapTask.runNewMapper(
        at org.apache.hadoop.mapred.YarnChild$
        at Method)
        at org.apache.hadoop.mapred.YarnChild.main(
Caused by: java.lang.InterruptedException
        at java.util.concurrent.FutureTask.awaitDone(
        at java.util.concurrent.FutureTask.get(
        at org.apache.hadoop.ipc.Client$Connection.sendRpcRequest(
        ... 26 more

Re: Sqoop Export failed to load into Netezza

Cloudera Employee

Can you post the sqoop command (you can mask user/password/host, etc.)? First, I hope you took care of the following permissions in Netezza:

1) Database access: CREATE EXTERNAL TABLE privilege in the database for the user.

2) Table permissions for the user.

3) Write access to the HDFS target directory.

4) Also try without the "--direct" option in Sqoop, to rule out CREATE EXTERNAL TABLE permission issues and help with debugging.
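To act on suggestion 4), the export can be rerun with the connection flags spelled out and the direct-mode flag dropped, so Sqoop falls back to plain JDBC. This is a hedged sketch; the host, database, table, and path values are placeholders, not details from this thread:

```
sqoop export \
  --connect jdbc:netezza://<nz-host>:5480/<database> \
  --username <nz_user> \
  --password-file /user/<hdfs_user>/.nz_password \
  --table <TAR_TABLE> \
  --export-dir /data/export/<TABLE> \
  --num-mappers 4 \
  --batch --verbose
```

If this JDBC-mode run succeeds where the direct-mode run failed, that points at the direct connector (and its CREATE EXTERNAL TABLE requirement) rather than at table permissions.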

Re: Sqoop Export failed to load into Netezza

New Contributor

Hi @bmasna, I have updated the question with the command and the parameters used. Yes, all the permissions are taken care of in Netezza and on the HDFS directory. I am sure of this because only a few mappers failed with this error; the other mappers succeeded and loaded the data into Netezza.

Re: Sqoop Export failed to load into Netezza

Cloudera Employee

Hi @sai harshavardhan, as you said, permissions do not seem to be the issue. Can you try the following: 1) rerun the same job, to rule out transient network/timeout errors causing the interruption; 2) since you are already using the --verbose option, attach the log file (remove anything private). If you are copying across data centers, threads that run for a long time may be killed due to network issues or cross-DC traffic.
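One way to act on the rerun suggestion is to wrap the export in a small retry loop, so transient network failures get a second attempt automatically. This is a generic POSIX-shell sketch; the sqoop invocation you pass to it is a placeholder, and RETRY_MAX/RETRY_DELAY are names invented here, not Sqoop settings:

```shell
#!/bin/sh
# retry_cmd: run a command, retrying on failure.
# Usage: retry_cmd sqoop export --options-file "$SQOOP_PARM_FILE" ...
retry_cmd() {
  max_tries=${RETRY_MAX:-3}    # total attempts before giving up
  delay=${RETRY_DELAY:-30}     # seconds to wait between attempts
  n=1
  while ! "$@"; do             # rerun until the command exits 0
    if [ "$n" -ge "$max_tries" ]; then
      echo "failed after $n attempts: $*" >&2
      return 1
    fi
    n=$((n + 1))
    sleep "$delay"
  done
}
```

Called as `retry_cmd sqoop export ...`, a mapper-level failure that is really a transient network interruption gets retried instead of failing the whole workflow run.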
