Member since 09-22-2020
9 Posts
1 Kudo Received
1 Solution
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2811 | 09-24-2020 08:34 AM |
09-24-2020
08:34 AM
1 Kudo
I solved my problem. In my case, one of the tables' names started with an underscore character ("_"), which caused two single quotes to be added automatically to the path of the HDFS directory where the copy of the file was stored. I renamed the table to remove the leading underscore, and now I can import it into the Hive database. I think special characters like that are not easily parsed by Hive or HDFS.
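A likely reason the rename fixed it: Hadoop's default input-path filter (used by Hive's MapReduce input formats) treats any file or directory whose name starts with `_` or `.` as hidden and skips it. A minimal local sketch of that filter rule, using hypothetical file names:

```shell
# Hadoop's FileInputFormat default filter skips names beginning with '_' or '.'.
# Emulate the same rule locally on two sample files (names are hypothetical):
workdir=$(mktemp -d)
touch "$workdir/_hidden_table" "$workdir/visible_table"

is_hidden() {
  case "$(basename "$1")" in
    _*|.*) return 0 ;;   # would be skipped by Hadoop's hidden-file filter
    *)     return 1 ;;
  esac
}

for f in "$workdir"/*; do
  if is_hidden "$f"; then
    echo "skipped: $(basename "$f")"
  else
    echo "kept: $(basename "$f")"
  fi
done
```

So a directory named `_asdamkindo_personal_ska_pendidikan` under the warehouse path would be invisible to Hive's load step, which is consistent with the "No files matching path" error.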
09-24-2020
08:02 AM
Since I am executing sqoop as the hdfs user (hdfs@master>), it should have access to all the directories inside it, right? If I run the command without specifying --warehouse-dir, I still get the same error. I also disabled HDFS permission checking by setting dfs.permissions.enabled from true to false in hdfs-site.xml.
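For reference, the permission switch mentioned above is spelled `dfs.permissions.enabled` in hdfs-site.xml (a minimal fragment; the NameNode must be restarted for the change to take effect):

```
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
```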
09-24-2020
02:49 AM
I am running the command as the hdfs user:

```
hdfs@master> sqoop import-all-tables --connect jdbc:mysql://10.11.11.15:6306/siki_asmet?serverTimezone=Asia/Jakarta --username micronics -P --hive-import --warehouse-dir /warehouse/siki --hive-database siki_ods --exclude-tables "Sheet1$" --m 1;
```

When I run the above command, I get the following error:

```
FAILED: SemanticException Line 1:17 Invalid path ''hdfs://master.lpjk.com:8020/warehouse/siki/_asdamkindo_personal_ska_pendidikan'': No files matching path hdfs://master.lpjk.com:8020/warehouse/siki/_asdamkindo_personal_ska_pendidikan (state=42000,code=40000)
```

But when I try to run the command again, it says the file already exists.
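The "file already exists" on retry is usually the leftover target directory from the failed attempt. A sketch of the cleanup and a single-table retry, reusing only paths and options already shown in the command above (these run against the cluster, so they are illustrative, not verified here):

```shell
# Remove the partial import output left by the failed run
hdfs dfs -rm -r /warehouse/siki/_asdamkindo_personal_ska_pendidikan

# Retry just the failing table instead of import-all-tables
sqoop import \
  --connect "jdbc:mysql://10.11.11.15:6306/siki_asmet?serverTimezone=Asia/Jakarta" \
  --username micronics -P \
  --table _asdamkindo_personal_ska_pendidikan \
  --hive-import --hive-database siki_ods \
  --warehouse-dir /warehouse/siki \
  --m 1
```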
Labels:
- Apache Hive
- Apache Sqoop
- HDFS