Welcome to the Cloudera Community


Sqoop import using the default HDFS location rather than the specified --target-dir

New Contributor

Hi All,

I am using the Cloudera QuickStart VM. When I run a Sqoop import into HDFS with a --target-dir specified, the data is repeatedly imported into the default HDFS location, and the target directory given in the command is ignored.

The Sqoop command I am using with SQL Server:

sqoop import --connect "jdbc:sqlserver://hostname;database=DBname;username=anyusername;password=mypassword" --table Person -- --schema Person --m 1 --target-dir /user/hdfs/abhi1/newdir


After running the above command, the output is dumped into the /user/cloudera/Person directory and not into "/user/hdfs/abhi1/newdir".

Also, if I add --append (since I am running the import more than once), it fails because the output directory already exists at /user/cloudera/Person:


11:53:26 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://quickstart.cloudera:8020/user/cloudera/Person already exists
16/06/05 11:53:26 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://quickstart.cloudera:8020/user/cloudera/Person already exists
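For context on what I have read so far: the Sqoop user guide says that everything after a bare `--` is passed through to the underlying tool or connection manager (which is why --schema works for SQL Server there), so generic Sqoop options placed after it may not be seen by Sqoop itself. A reordering I was considering (a sketch only, with my hostname/credentials as placeholders, not yet verified against my database) would be:

```shell
# Generic Sqoop options (--num-mappers, --target-dir) come BEFORE the bare "--";
# only connector-specific options like --schema go after it.
sqoop import \
  --connect "jdbc:sqlserver://hostname;database=DBname;username=anyusername;password=mypassword" \
  --table Person \
  --num-mappers 1 \
  --target-dir /user/hdfs/abhi1/newdir \
  -- --schema Person
```

Is this the correct way to order the arguments, or is something else causing the target directory to be ignored?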


Please help.


Regards,

abhi2kk
