
Hive current user directory not exists



I'm trying to move data from Teradata to Hadoop using the Hortonworks Teradata Connector. I believe the connector is installed properly under /usr/hdp/

I created a new user (cdwadmin) and gave them root privileges.

The error I am receiving when running my sqoop script is "Hive current user directory not exists".

17/04/24 01:47:09 ERROR teradata.TeradataSqoopImportHelper: Exception running Teradata import job
com.teradata.connector.common.exception.ConnectorException: Hive current user directory not exists
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(
    at com.teradata.connector.common.tool.ConnectorJobRunner.runJob(
    at org.apache.sqoop.teradata.TeradataSqoopImportHelper.runJob(
    at org.apache.sqoop.teradata.TeradataConnManager.importQuery(
    at org.apache.sqoop.tool.ImportTool.importTable(
    at
    at
    at
    at org.apache.sqoop.Sqoop.runSqoop(
    at org.apache.sqoop.Sqoop.runTool(
    at org.apache.sqoop.Sqoop.runTool(
    at org.apache.sqoop.Sqoop.main(
17/04/24 01:47:09 INFO teradata.TeradataSqoopImportHelper: Teradata import job completed with exit code 1

sqoop import \
  --connect "jdbc:teradata://33.33.333.33/DATABASE=SQL_CLASS" \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username <user> --password <password> \
  --query 'SELECT Subscriber_No, Street, City, "State", Zip, AreaCode, Phone FROM SQL_CLASS.Addresses WHERE $CONDITIONS' \
  -m 4 \
  --hive-import --hive-table sql_class.Addresses7 \
  --split-by Subscriber_No \
  --target-dir /user/cdwadmin/Addresses7438636T

I'm using HDP 2.4 (CentOS 6.7). My understanding is that I need to create a directory for this user, but I don't know where or how to do this. Any help would be appreciated. Thank you.



This error no longer appears after running:

hadoop fs -mkdir -p /user/cdwadmin
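For anyone hitting the same error: on most clusters the HDFS home directory also needs to be owned by the user running the job, or the import fails later with permission errors. A hedged sketch, assuming the `hdfs` superuser account and the default `hdfs` group (both may differ on your cluster):

```shell
# Create the user's HDFS home directory as the HDFS superuser,
# then hand ownership to cdwadmin so sqoop can write under it.
sudo -u hdfs hadoop fs -mkdir -p /user/cdwadmin
sudo -u hdfs hadoop fs -chown cdwadmin:hdfs /user/cdwadmin

# Verify ownership
hadoop fs -ls /user | grep cdwadmin
```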

However, I'm now getting the ClassNotFoundException below:

17/04/24 02:26:36 INFO mapreduce.Job: Task Id : attempt_1492981007746_0002_m_000001_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException


Error: org.apache.hadoop.fs.FileAlreadyExistsException: /user/cdwadmin/temp_022543/part-m-00000 for client already exists

I'll start another thread for this.
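In case it helps others who land here: `org.apache.hadoop.hive.serde2.SerDeException` ships in the Hive serde jar, and this ClassNotFoundException usually means that jar is not on the MapReduce task classpath. A hedged workaround, assuming typical HDP 2.x install paths (verify the jar location on your own cluster):

```shell
# Put the Hive client jars (including hive-serde) on the classpath
# before invoking sqoop. /usr/hdp/current/hive-client is the usual
# HDP symlink; adjust if your layout differs.
export HIVE_HOME=/usr/hdp/current/hive-client
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HIVE_HOME/lib/*"
```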

Hi @Todd Wilson,

Looks like your filenames and directories are clashing from previous failed jobs. Can you try the following:

  • delete the temp-space directory that the failed job left under /user/cdwadmin. This looks to be: /user/cdwadmin/temp_022543
  • assuming you are doing a full import from scratch, go into Hive and drop the sql_class.Addresses7 table so the import starts from a clean slate.
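The steps above might look like this on the command line (paths and table name taken from the errors earlier in the thread; adjust if yours differ):

```shell
# Remove the leftover temp directory from the failed job
hadoop fs -rm -r /user/cdwadmin/temp_022543

# Also remove the previous target directory if it exists, since the
# import's --target-dir must not already exist for a fresh run
hadoop fs -rm -r /user/cdwadmin/Addresses7438636T

# Drop the partially created Hive table so the import starts clean
hive -e 'DROP TABLE IF EXISTS sql_class.Addresses7;'
```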

Let us know how it goes