<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: How can I convert the BLOB data to the actual file when importing the Oracle table data to HDFS using Sqoop in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/How-can-I-convert-the-BLOB-data-to-the-actual-file-when/m-p/230158#M192008</link>
    <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/51243/jyothireddy119.html" nodeid="51243"&gt;@jyothi k&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;While migrating from an RDBMS to Hive, I came across the same scenario with BLOB and CLOB data.&lt;/P&gt;&lt;P&gt;My approach was to convert the BLOB and CLOB data to Base64 encoding (which turns any binary data into readable text) and store the result in Hive.&lt;/P&gt;&lt;PRE&gt;select UTL_ENCODE.BASE64_ENCODE(blob_column) from oracle_tbl;  -- from Oracle&lt;/PRE&gt;&lt;P&gt;This returns a Base64-encoded string, which you can store as a String in Hive/HDFS and which Sqoop can ingest as a string.&lt;/P&gt;&lt;P&gt;To convert back to a BLOB, you can use Hive's unbase64() function, or the Java Base64 package (usable in native Java apps, Spark, etc.).&lt;/P&gt;&lt;P&gt;Example:&lt;/P&gt;&lt;PRE&gt;select unbase64(converted_blob_column) from hive_table;&lt;/PRE&gt;&lt;P&gt;For native apps, you may refer to the Java docs for Base64 conversion &lt;A target="_blank" href="https://docs.oracle.com/javase/8/docs/api/java/util/Base64.html"&gt;here&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;</description>
    <pubDate>Wed, 20 Dec 2017 07:16:31 GMT</pubDate>
    <dc:creator>bkosaraju</dc:creator>
    <dc:date>2017-12-20T07:16:31Z</dc:date>
  </channel>
</rss>

