09-20-2018 12:58 AM
Hi All,
I am trying to import Oracle tables into Hadoop Hive using Sqoop import. The Oracle table is 32 MB.
When I run the Sqoop import, the datanode, which has 54 GB free, fills up completely every time.
How can a Sqoop import of a 32 MB Oracle table into Hive consume more than 54 GB?
Please suggest.
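To see where the space is actually going, it may help to check HDFS capacity and the replication factor: with the default replication factor of 3, every block written is stored three times, and intermediate MapReduce/Hive output adds more on top. A minimal sketch, assuming standard HDFS CLI tools are on the path (the warehouse path below is an assumption):

```shell
# Report overall HDFS capacity and per-datanode usage
hdfs dfsadmin -report

# Show space consumed under the import target
# (path is an assumption; adjust to your warehouse/target directory)
hdfs dfs -du -h /user/hive/warehouse

# Check the configured replication factor (default is 3)
hdfs getconf -confKey dfs.replication
```

These commands only report usage; they do not change anything on the cluster.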
09-23-2018 01:05 AM
Hi
The source table is 32 MB in Oracle. When I try to import it to HDFS using Sqoop, the job fails with an error saying no space is available in any of the directories.
My datanode directory /u01/dfs is 54 GB. For a 32 MB table, the Sqoop import consumes more than 54 GB.
Even after specifying --target-dir=/u02, Sqoop still writes to the datanode directory /u01/dfs; I can see /u01 reach 100% full, and then the error appears.
The source Oracle table contains BLOB columns.
Please suggest
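One point worth noting: --target-dir takes an HDFS path, not a local filesystem path, so --target-dir=/u02 points at the HDFS directory /u02. The actual blocks still land in whatever local directories dfs.datanode.data.dir names (here /u01/dfs), regardless of the target directory. A hedged sketch of an import that also caps inline LOB storage is below; the connection string, credentials, table name, and target path are all placeholders, not taken from the original post:

```shell
# Hypothetical Sqoop import; host/SID, credentials, and table name are placeholders.
# --target-dir is an HDFS path (not a local mount).
# --inline-lob-limit stores BLOBs larger than the given byte count (16 MB here)
#   in separate LOB files instead of inline in the record.
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
  --username scott -P \
  --table MY_BLOB_TABLE \
  --target-dir /user/hive/staging/my_blob_table \
  --inline-lob-limit 16777216 \
  --num-mappers 1 \
  --hive-import \
  --hive-table my_blob_table
```

Whether this resolves the disk exhaustion depends on how large the BLOBs expand to and on the HDFS replication factor; it is a starting point, not a confirmed fix.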