Error: Invalid path when uploading from MariaDB to Hive Database using Sqoop

I am running the following command as the hdfs user (hdfs@master>):

sqoop import-all-tables --connect jdbc:mysql://10.11.11.15:6306/siki_asmet?serverTimezone=Asia/Jakarta --username micronics -P --hive-import --warehouse-dir /warehouse/siki --hive-database siki_ods --exclude-tables "Sheet1$" --m 1;

When I run the above command, I get the following error:

FAILED: SemanticException Line 1:17 Invalid path ''hdfs://master.lpjk.com:8020/warehouse/siki/_asdamkindo_personal_ska_pendidikan'': No files matching path hdfs://master.lpjk.com:8020/warehouse/siki/_asdamkindo_personal_ska_pendidikan (state=42000,code=40000)

But when I try to run the command again, it says the file already exists.
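To retry cleanly, the partial output left behind by the failed run can be removed first. A minimal sketch, using the path from the error message above (adjust for your cluster):

# list what the failed import left behind under the warehouse directory
hdfs dfs -ls /warehouse/siki

# remove the partial table directory so a retry does not fail with "file already exists"
hdfs dfs -rm -r /warehouse/siki/_asdamkindo_personal_ska_pendidikan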

1 ACCEPTED SOLUTION

I solved my problem. In my case, one of the tables' names started with an underscore character ("_"), and because of that, two single quotes were added automatically to the path of the HDFS directory where the copy of the file was stored.

I changed the name of the table by removing the underscore character, and now I can import the table into the Hive database. I think special characters like that are not parsed correctly by Hive or HDFS.
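For anyone hitting the same error, the fix can be sketched two ways, reusing the connection details and table name from this thread (adjust for your environment): rename the offending table on the MariaDB side before importing, or skip it entirely with --exclude-tables.

# option 1: rename the table in MariaDB so it no longer starts with "_"
mysql -h 10.11.11.15 -P 6306 -u micronics -p siki_asmet \
  -e 'RENAME TABLE _asdamkindo_personal_ska_pendidikan TO asdamkindo_personal_ska_pendidikan;'

# option 2: leave the table as-is and add it to the exclusion list
sqoop import-all-tables --connect jdbc:mysql://10.11.11.15:6306/siki_asmet?serverTimezone=Asia/Jakarta \
  --username micronics -P --hive-import --warehouse-dir /warehouse/siki \
  --hive-database siki_ods --exclude-tables "Sheet1$,_asdamkindo_personal_ska_pendidikan" --m 1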

3 REPLIES

Super Guru

Check that the user executing Sqoop is able to write to the target directory. You may need to create a service user and a directory with the proper permissions if the hdfs user cannot write to the intended directory.
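For example, a quick sketch of the check and fix for the warehouse directory used above (the owner, group, and mode here are assumptions; adjust to your setup):

# see who owns the target directory and what its permissions are
hdfs dfs -ls -d /warehouse/siki

# if needed, create it and grant the importing user write access
hdfs dfs -mkdir -p /warehouse/siki
hdfs dfs -chown -R hdfs:hdfs /warehouse/siki
hdfs dfs -chmod -R 775 /warehouse/siki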

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you would like further dialogue on this topic, please comment here, or feel free to private message me. If you have new questions related to your use case, please create a separate topic and feel free to tag me in your post.

Thanks,
Steven

Since I am executing Sqoop as the hdfs user (hdfs@master>), it should have access to all the directories, right?

If I run the command without specifying --warehouse-dir, I still get the same error.
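Without --warehouse-dir, Sqoop stages each table in a directory named after the table under the running user's HDFS home, so any leftovers from earlier runs would show up there. Assuming the default home directory for the hdfs user:

# check for leftover per-table directories from earlier attempts
hdfs dfs -ls /user/hdfs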

Also, I disabled permission checking in hdfs-site.xml by changing dfs.permissions.enabled from true to false.
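The value actually in effect can be double-checked from the command line:

# print the effective setting as the cluster sees it
hdfs getconf -confKey dfs.permissions.enabled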
