<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Sqoop hook doesn't work for atlas? - Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107125#M38091</link>
    <description>&lt;P&gt;I installed Atlas and Sqoop separately and am not using HDP.&lt;/P&gt;&lt;P&gt;After executing this command:&lt;/P&gt;&lt;PRE&gt;sqoop import -connect
jdbc:mysql://master:3306/hive -username root -password admin -table
TBLS -hive-import -hive-table sqoophook1
&lt;/PRE&gt;&lt;P&gt;The output shows that Sqoop imported the data into Hive successfully and never reported an error.&lt;/P&gt;&lt;P&gt;Then I checked the Atlas UI and searched for the sqoop_process type, but no information was found. Why?&lt;/P&gt;&lt;P&gt;Here is my configuration process:&lt;/P&gt;&lt;P&gt;Step 1: Set &amp;lt;sqoop-conf&amp;gt;/sqoop-site.xml:&lt;/P&gt;&lt;PRE&gt;&amp;lt;property&amp;gt;
&amp;lt;name&amp;gt;sqoop.job.data.publish.class&amp;lt;/name&amp;gt;
&amp;lt;value&amp;gt;org.apache.atlas.sqoop.hook.SqoopHook&amp;lt;/value&amp;gt;
&amp;lt;/property&amp;gt;&lt;/PRE&gt;&lt;P&gt;Step 2: Copy &amp;lt;atlas-conf&amp;gt;/atlas-application.properties to &amp;lt;sqoop-conf&amp;gt;.&lt;/P&gt;&lt;P&gt;Step 3: Link &amp;lt;atlas-home&amp;gt;/hook/sqoop/*.jar into the Sqoop lib directory.&lt;/P&gt;&lt;P&gt;Are these configuration steps wrong?&lt;/P&gt;&lt;P&gt;Here is the output:&lt;/P&gt;&lt;PRE&gt;sqoop import -connect jdbc:mysql://zte-1:3306/hive -username root -password admin -table TBLS -hive-import -hive-table sqoophook2
Warning: /var/local/hadoop/sqoop-1.4.6/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /var/local/hadoop/sqoop-1.4.6/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/08/23 01:04:04 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/08/23 01:04:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/08/23 01:04:04 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/08/23 01:04:04 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/08/23 01:04:05 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/08/23 01:04:05 INFO tool.CodeGenTool: Beginning code generation
16/08/23 01:04:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:04:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:04:06 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /var/local/hadoop/hadoop-2.6.0
Note: /tmp/sqoop-hdfs/compile/2606be5f25a97674311440065aac302d/TBLS.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/08/23 01:04:09 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/2606be5f25a97674311440065aac302d/TBLS.jar
16/08/23 01:04:09 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/08/23 01:04:09 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/08/23 01:04:09 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/08/23 01:04:09 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/08/23 01:04:09 INFO mapreduce.ImportJobBase: Beginning import of TBLS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/var/local/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/var/local/hadoop/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/08/23 01:04:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using  builtin-java classes where applicable
16/08/23 01:04:10 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/08/23 01:04:11 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/08/23 01:04:11 INFO client.RMProxy: Connecting to ResourceManager at zte-1/192.168.136.128:8032
16/08/23 01:04:16 INFO db.DBInputFormat: Using read commited transaction isolation
16/08/23 01:04:16 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`TBL_ID`), MAX(`TBL_ID`) FROM `TBLS`
16/08/23 01:04:17 INFO mapreduce.JobSubmitter: number of splits:4
16/08/23 01:04:17 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1471882959657_0001
16/08/23 01:04:19 INFO impl.YarnClientImpl: Submitted application application_1471882959657_0001
16/08/23 01:04:19 INFO mapreduce.Job: The url to track the job: &lt;A href="http://zte-1:8088/proxy/application_1471882959657_0001/" target="_blank"&gt;http://zte-1:8088/proxy/application_1471882959657_0001/&lt;/A&gt;
16/08/23 01:04:19 INFO mapreduce.Job: Running job: job_1471882959657_0001
16/08/23 01:04:37 INFO mapreduce.Job: Job job_1471882959657_0001 running in uber mode : false
16/08/23 01:04:37 INFO mapreduce.Job:  map 0% reduce 0%
16/08/23 01:05:05 INFO mapreduce.Job:  map 25% reduce 0%
16/08/23 01:05:07 INFO mapreduce.Job:  map 100% reduce 0%
16/08/23 01:05:08 INFO mapreduce.Job: Job job_1471882959657_0001 completed successfully
16/08/23 01:05:08 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=529788
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=426
                HDFS: Number of bytes written=171
                HDFS: Number of read operations=16
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=8
        Job Counters
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=102550
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=102550
                Total vcore-seconds taken by all map tasks=102550
                Total megabyte-seconds taken by all map tasks=105011200
        Map-Reduce Framework
                Map input records=3
                Map output records=3
                Input split bytes=426
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=1227
                CPU time spent (ms)=3640
                Physical memory (bytes) snapshot=390111232
                Virtual memory (bytes) snapshot=3376676864
                Total committed heap usage (bytes)=74018816
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=171
16/08/23 01:05:08 INFO mapreduce.ImportJobBase: Transferred 171 bytes in 57.2488 seconds (2.987 bytes/sec)
16/08/23 01:05:08 INFO mapreduce.ImportJobBase: Retrieved 3 records.
16/08/23 01:05:08 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:05:08 INFO hive.HiveImport: Loading uploaded data into Hive
16/08/23 01:05:19 INFO hive.HiveImport:
16/08/23 01:05:19 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/var/local/hadoop/hive-1.2.1/lib/hive-common-1.2.1.jar!/hive-log4j.properties
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/var/local/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/var/local/hadoop/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/08/23 01:05:31 INFO hive.HiveImport: OK
16/08/23 01:05:31 INFO hive.HiveImport: Time taken: 3.481 seconds
16/08/23 01:05:31 INFO hive.HiveImport: Loading data to table default.sqoophook2
16/08/23 01:05:33 INFO hive.HiveImport: Table default.sqoophook2 stats: [numFiles=4, totalSize=171]
16/08/23 01:05:33 INFO hive.HiveImport: OK
16/08/23 01:05:33 INFO hive.HiveImport: Time taken: 1.643 seconds
16/08/23 01:05:35 INFO hive.HiveImport: Hive import complete.
16/08/23 01:05:35 INFO hive.HiveImport: Export directory is contains the _SUCCESS file only, removing the directory.
&lt;/PRE&gt;</description>
    <pubDate>Wed, 17 Aug 2016 11:01:33 GMT</pubDate>
    <dc:creator>dreamcoding</dc:creator>
    <dc:date>2016-08-17T11:01:33Z</dc:date>
    <item>
      <title>Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107125#M38091</link>
      <description>&lt;P&gt;I installed Atlas and Sqoop separately and am not using HDP.&lt;/P&gt;&lt;P&gt;After executing this command:&lt;/P&gt;&lt;PRE&gt;sqoop import -connect
jdbc:mysql://master:3306/hive -username root -password admin -table
TBLS -hive-import -hive-table sqoophook1
&lt;/PRE&gt;&lt;P&gt;The output shows that Sqoop imported the data into Hive successfully and never reported an error.&lt;/P&gt;&lt;P&gt;Then I checked the Atlas UI and searched for the sqoop_process type, but no information was found. Why?&lt;/P&gt;&lt;P&gt;Here is my configuration process:&lt;/P&gt;&lt;P&gt;Step 1: Set &amp;lt;sqoop-conf&amp;gt;/sqoop-site.xml:&lt;/P&gt;&lt;PRE&gt;&amp;lt;property&amp;gt;
&amp;lt;name&amp;gt;sqoop.job.data.publish.class&amp;lt;/name&amp;gt;
&amp;lt;value&amp;gt;org.apache.atlas.sqoop.hook.SqoopHook&amp;lt;/value&amp;gt;
&amp;lt;/property&amp;gt;&lt;/PRE&gt;&lt;P&gt;Step 2: Copy &amp;lt;atlas-conf&amp;gt;/atlas-application.properties to &amp;lt;sqoop-conf&amp;gt;.&lt;/P&gt;&lt;P&gt;Step 3: Link &amp;lt;atlas-home&amp;gt;/hook/sqoop/*.jar into the Sqoop lib directory.&lt;/P&gt;&lt;P&gt;Are these configuration steps wrong?&lt;/P&gt;&lt;P&gt;Here is the output:&lt;/P&gt;&lt;PRE&gt;sqoop import -connect jdbc:mysql://zte-1:3306/hive -username root -password admin -table TBLS -hive-import -hive-table sqoophook2
Warning: /var/local/hadoop/sqoop-1.4.6/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /var/local/hadoop/sqoop-1.4.6/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/08/23 01:04:04 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/08/23 01:04:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/08/23 01:04:04 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/08/23 01:04:04 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/08/23 01:04:05 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/08/23 01:04:05 INFO tool.CodeGenTool: Beginning code generation
16/08/23 01:04:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:04:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:04:06 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /var/local/hadoop/hadoop-2.6.0
Note: /tmp/sqoop-hdfs/compile/2606be5f25a97674311440065aac302d/TBLS.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/08/23 01:04:09 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/2606be5f25a97674311440065aac302d/TBLS.jar
16/08/23 01:04:09 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/08/23 01:04:09 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/08/23 01:04:09 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/08/23 01:04:09 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/08/23 01:04:09 INFO mapreduce.ImportJobBase: Beginning import of TBLS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/var/local/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/var/local/hadoop/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/08/23 01:04:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using  builtin-java classes where applicable
16/08/23 01:04:10 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/08/23 01:04:11 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/08/23 01:04:11 INFO client.RMProxy: Connecting to ResourceManager at zte-1/192.168.136.128:8032
16/08/23 01:04:16 INFO db.DBInputFormat: Using read commited transaction isolation
16/08/23 01:04:16 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`TBL_ID`), MAX(`TBL_ID`) FROM `TBLS`
16/08/23 01:04:17 INFO mapreduce.JobSubmitter: number of splits:4
16/08/23 01:04:17 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1471882959657_0001
16/08/23 01:04:19 INFO impl.YarnClientImpl: Submitted application application_1471882959657_0001
16/08/23 01:04:19 INFO mapreduce.Job: The url to track the job: &lt;A href="http://zte-1:8088/proxy/application_1471882959657_0001/" target="_blank"&gt;http://zte-1:8088/proxy/application_1471882959657_0001/&lt;/A&gt;
16/08/23 01:04:19 INFO mapreduce.Job: Running job: job_1471882959657_0001
16/08/23 01:04:37 INFO mapreduce.Job: Job job_1471882959657_0001 running in uber mode : false
16/08/23 01:04:37 INFO mapreduce.Job:  map 0% reduce 0%
16/08/23 01:05:05 INFO mapreduce.Job:  map 25% reduce 0%
16/08/23 01:05:07 INFO mapreduce.Job:  map 100% reduce 0%
16/08/23 01:05:08 INFO mapreduce.Job: Job job_1471882959657_0001 completed successfully
16/08/23 01:05:08 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=529788
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=426
                HDFS: Number of bytes written=171
                HDFS: Number of read operations=16
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=8
        Job Counters
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=102550
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=102550
                Total vcore-seconds taken by all map tasks=102550
                Total megabyte-seconds taken by all map tasks=105011200
        Map-Reduce Framework
                Map input records=3
                Map output records=3
                Input split bytes=426
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=1227
                CPU time spent (ms)=3640
                Physical memory (bytes) snapshot=390111232
                Virtual memory (bytes) snapshot=3376676864
                Total committed heap usage (bytes)=74018816
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=171
16/08/23 01:05:08 INFO mapreduce.ImportJobBase: Transferred 171 bytes in 57.2488 seconds (2.987 bytes/sec)
16/08/23 01:05:08 INFO mapreduce.ImportJobBase: Retrieved 3 records.
16/08/23 01:05:08 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:05:08 INFO hive.HiveImport: Loading uploaded data into Hive
16/08/23 01:05:19 INFO hive.HiveImport:
16/08/23 01:05:19 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/var/local/hadoop/hive-1.2.1/lib/hive-common-1.2.1.jar!/hive-log4j.properties
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/var/local/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/var/local/hadoop/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/08/23 01:05:31 INFO hive.HiveImport: OK
16/08/23 01:05:31 INFO hive.HiveImport: Time taken: 3.481 seconds
16/08/23 01:05:31 INFO hive.HiveImport: Loading data to table default.sqoophook2
16/08/23 01:05:33 INFO hive.HiveImport: Table default.sqoophook2 stats: [numFiles=4, totalSize=171]
16/08/23 01:05:33 INFO hive.HiveImport: OK
16/08/23 01:05:33 INFO hive.HiveImport: Time taken: 1.643 seconds
16/08/23 01:05:35 INFO hive.HiveImport: Hive import complete.
16/08/23 01:05:35 INFO hive.HiveImport: Export directory is contains the _SUCCESS file only, removing the directory.
&lt;/PRE&gt;</description>
      <pubDate>Wed, 17 Aug 2016 11:01:33 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107125#M38091</guid>
      <dc:creator>dreamcoding</dc:creator>
      <dc:date>2016-08-17T11:01:33Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107126#M38092</link>
      <description>&lt;P&gt;
	&lt;A rel="user" href="https://community.cloudera.com/users/8733/dreamcoding.html" nodeid="8733"&gt;@Ethan Hsieh&lt;/A&gt;&lt;/P&gt;&lt;P&gt;
	Could you also confirm whether sqoop-site.xml has the REST address of the Atlas server configured?&lt;/P&gt;&lt;P&gt;A sample configuration is available &lt;A href="https://github.com/apache/incubator-atlas/blob/master/addons/sqoop-bridge/src/test/resources/sqoop-site.xml"&gt;here&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Wed, 17 Aug 2016 17:30:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107126#M38092</guid>
      <dc:creator>ckrishnakumar</dc:creator>
      <dc:date>2016-08-17T17:30:21Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107127#M38093</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/8733/dreamcoding.html" nodeid="8733"&gt;@Ethan Hsieh&lt;/A&gt;&lt;P&gt;Can you paste the console output for the executed sqoop command here? Also please make sure to add the atlas.rest.address property to the sqoop-site.xml or atlas-application.properties file and run the command to see if there is any difference.&lt;/P&gt;</description>
      <pubDate>Sat, 20 Aug 2016 02:56:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107127#M38093</guid>
      <dc:creator>apathan</dc:creator>
      <dc:date>2016-08-20T02:56:27Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107128#M38094</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/40/apathan.html" nodeid="40"&gt;@Ayub Pathan&lt;/A&gt; &lt;/P&gt;&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/459/ckrishnakumar.html" nodeid="459"&gt;@ckrishnakumar&lt;/A&gt; &lt;/P&gt;&lt;P&gt;The sqoop hook still doesn't work.  Here is the console output for the executed sqoop command:&lt;/P&gt;&lt;PRE&gt;sqoop import -connect jdbc:mysql://zte-1:3306/hive -username root -password admin -table TBLS -hive-import -hive-table sqoophook2
Warning: /var/local/hadoop/sqoop-1.4.6/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /var/local/hadoop/sqoop-1.4.6/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/08/23 01:04:04 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/08/23 01:04:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/08/23 01:04:04 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/08/23 01:04:04 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/08/23 01:04:05 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/08/23 01:04:05 INFO tool.CodeGenTool: Beginning code generation
16/08/23 01:04:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:04:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:04:06 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /var/local/hadoop/hadoop-2.6.0
Note: /tmp/sqoop-hdfs/compile/2606be5f25a97674311440065aac302d/TBLS.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/08/23 01:04:09 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/2606be5f25a97674311440065aac302d/TBLS.jar
16/08/23 01:04:09 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/08/23 01:04:09 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/08/23 01:04:09 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/08/23 01:04:09 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/08/23 01:04:09 INFO mapreduce.ImportJobBase: Beginning import of TBLS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/var/local/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/var/local/hadoop/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/08/23 01:04:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using  builtin-java classes where applicable
16/08/23 01:04:10 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/08/23 01:04:11 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/08/23 01:04:11 INFO client.RMProxy: Connecting to ResourceManager at zte-1/192.168.136.128:8032
16/08/23 01:04:16 INFO db.DBInputFormat: Using read commited transaction isolation
16/08/23 01:04:16 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`TBL_ID`), MAX(`TBL_ID`) FROM `TBLS`
16/08/23 01:04:17 INFO mapreduce.JobSubmitter: number of splits:4
16/08/23 01:04:17 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1471882959657_0001
16/08/23 01:04:19 INFO impl.YarnClientImpl: Submitted application application_1471882959657_0001
16/08/23 01:04:19 INFO mapreduce.Job: The url to track the job: &lt;A href="http://zte-1:8088/proxy/application_1471882959657_0001/" target="_blank"&gt;http://zte-1:8088/proxy/application_1471882959657_0001/&lt;/A&gt;
16/08/23 01:04:19 INFO mapreduce.Job: Running job: job_1471882959657_0001
16/08/23 01:04:37 INFO mapreduce.Job: Job job_1471882959657_0001 running in uber mode : false
16/08/23 01:04:37 INFO mapreduce.Job:  map 0% reduce 0%
16/08/23 01:05:05 INFO mapreduce.Job:  map 25% reduce 0%
16/08/23 01:05:07 INFO mapreduce.Job:  map 100% reduce 0%
16/08/23 01:05:08 INFO mapreduce.Job: Job job_1471882959657_0001 completed successfully
16/08/23 01:05:08 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=529788
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=426
                HDFS: Number of bytes written=171
                HDFS: Number of read operations=16
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=8
        Job Counters
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=102550
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=102550
                Total vcore-seconds taken by all map tasks=102550
                Total megabyte-seconds taken by all map tasks=105011200
        Map-Reduce Framework
                Map input records=3
                Map output records=3
                Input split bytes=426
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=1227
                CPU time spent (ms)=3640
                Physical memory (bytes) snapshot=390111232
                Virtual memory (bytes) snapshot=3376676864
                Total committed heap usage (bytes)=74018816
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=171
16/08/23 01:05:08 INFO mapreduce.ImportJobBase: Transferred 171 bytes in 57.2488 seconds (2.987 bytes/sec)
16/08/23 01:05:08 INFO mapreduce.ImportJobBase: Retrieved 3 records.
16/08/23 01:05:08 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `TBLS` AS t LIMIT 1
16/08/23 01:05:08 INFO hive.HiveImport: Loading uploaded data into Hive
16/08/23 01:05:19 INFO hive.HiveImport:
16/08/23 01:05:19 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/var/local/hadoop/hive-1.2.1/lib/hive-common-1.2.1.jar!/hive-log4j.properties
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/var/local/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/var/local/hadoop/hbase-1.1.5/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.
16/08/23 01:05:19 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/08/23 01:05:31 INFO hive.HiveImport: OK
16/08/23 01:05:31 INFO hive.HiveImport: Time taken: 3.481 seconds
16/08/23 01:05:31 INFO hive.HiveImport: Loading data to table default.sqoophook2
16/08/23 01:05:33 INFO hive.HiveImport: Table default.sqoophook2 stats: [numFiles=4, totalSize=171]
16/08/23 01:05:33 INFO hive.HiveImport: OK
16/08/23 01:05:33 INFO hive.HiveImport: Time taken: 1.643 seconds
16/08/23 01:05:35 INFO hive.HiveImport: Hive import complete.
16/08/23 01:05:35 INFO hive.HiveImport: Export directory is contains the _SUCCESS file only, removing the directory.&lt;/PRE&gt;</description>
      <pubDate>Mon, 22 Aug 2016 16:16:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107128#M38094</guid>
      <dc:creator>dreamcoding</dc:creator>
      <dc:date>2016-08-22T16:16:18Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107129#M38095</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/40/apathan.html" nodeid="40"&gt;@Ayub Pathan&lt;/A&gt; &lt;/P&gt;&lt;P&gt;After I added atlas.rest.address property to the sqoop-site.xml, the problem is same. I search sqoop_process in the Atlas Web UI and no result was found.&lt;/P&gt;&lt;P&gt;But the Hive hook is work, it can capture the imported hive_table which is shown in the atlas Web UI.&lt;/P&gt;&lt;P&gt;I paste the output on the next answer. And the output doesn't report any error.&lt;/P&gt;&lt;P&gt;｀｀&lt;/P&gt;&lt;P&gt;I remembered that, when I configured the Hive hook, I added some path of JARs for the HIVE_AUX_JARS_PATH. But the configuration process of Sqoop hook is lack of this step.&lt;/P&gt;&lt;P&gt; Is it necessary to add some path of JARs for sqoop? It seems that the SqoopHook Class doesn't work.&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 16:35:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107129#M38095</guid>
      <dc:creator>dreamcoding</dc:creator>
      <dc:date>2016-08-22T16:35:00Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107130#M38096</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/459/ckrishnakumar.html" nodeid="459"&gt;@ckrishnakumar&lt;/A&gt;&lt;/P&gt;&lt;P&gt;After I added atlas.rest.address property to the sqoop-site.xml, the problem is same. I search sqoop_process in the Atlas Web UI and no result was found.&lt;/P&gt;&lt;P&gt;But the Hive hook is work, it can capture the imported hive_table which is shown in the atlas Web UI.&lt;/P&gt;&lt;P&gt;I paste the output on the next answer. And the output doesn't report any error.&lt;/P&gt;&lt;P&gt;｀｀&lt;/P&gt;&lt;P&gt;I remembered that, when I configured the Hive hook, I added some path of JARs for the HIVE_AUX_JARS_PATH. But the configuration process of Sqoop hook is lack of this step.&lt;/P&gt;&lt;P&gt;Is it necessary to add some path of JARs for sqoop? It seems that the SqoopHook Class doesn't work.&lt;/P&gt;</description>
      <pubDate>Mon, 22 Aug 2016 16:36:07 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107130#M38096</guid>
      <dc:creator>dreamcoding</dc:creator>
      <dc:date>2016-08-22T16:36:07Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107131#M38097</link>
      <description>&lt;P&gt;Step 3 (linking &amp;lt;atlas-home&amp;gt;/hook/sqoop/*.jar into the Sqoop lib directory) should take care of adding the required JAR files to the Sqoop classpath.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Aug 2016 10:52:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107131#M38097</guid>
      <dc:creator>ckrishnakumar</dc:creator>
      <dc:date>2016-08-23T10:52:28Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107132#M38098</link>
      <description>&lt;P&gt;It is possible that only certain versions of Sqoop support the hook, and the output of your command doesn't seem to contain the hook's Kafka entry.&lt;/P&gt;&lt;P&gt;Would it be possible to provide the output of this command?&lt;/P&gt;&lt;PRE&gt;jar tvf sqoop-&amp;lt;version&amp;gt;.jar | grep .class | grep SqoopJobDataPublisher&lt;/PRE&gt;&lt;P&gt;The output should look like:&lt;/P&gt;&lt;PRE&gt;$ jar tvf sqoop-1.4.7-SNAPSHOT.jar | grep .class | grep SqoopJobDataPublisher
 3462 Fri Jan 22 12:15:04 IST 2016 org/apache/sqoop/SqoopJobDataPublisher$Data.class
  644 Fri Jan 22 12:15:04 IST 2016 org/apache/sqoop/SqoopJobDataPublisher.class&lt;/PRE&gt;</description>
      <pubDate>Tue, 23 Aug 2016 10:55:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107132#M38098</guid>
      <dc:creator>ckrishnakumar</dc:creator>
      <dc:date>2016-08-23T10:55:24Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107133#M38099</link>
      <description>&lt;P&gt;How should I link these JARs?&lt;/P&gt;&lt;P&gt;Should I copy them into &amp;lt;sqoop-home&amp;gt;/lib?&lt;/P&gt;&lt;P&gt;Or use the command: ln -s &amp;lt;atlas-home&amp;gt;/hook/sqoop/* &amp;lt;sqoop-home&amp;gt;/lib/ ?&lt;/P&gt;</description>
      <pubDate>Tue, 23 Aug 2016 12:49:53 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107133#M38099</guid>
      <dc:creator>dreamcoding</dc:creator>
      <dc:date>2016-08-23T12:49:53Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107134#M38100</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/459/ckrishnakumar.html" nodeid="459" target="_blank"&gt;@ckrishnakumar&lt;/A&gt; &lt;/P&gt;&lt;P&gt;The output of this command is nothing.&lt;/P&gt;&lt;P&gt;This is the result shown in the terminal. I also pasted the screenshot next to it.&lt;/P&gt;&lt;PRE&gt;[hdfs@zte-1 sqoop-1.4.6]$ ls
bin        CHANGELOG.txt  conf  ivy      lib          NOTICE.txt   README.txt       sqoop-patch-review.py  src
build.xml  COMPILING.txt  docs  ivy.xml  LICENSE.txt  pom-old.xml  sqoop-1.4.6.jar  sqoop-test-1.4.6.jar   testdata
[hdfs@zte-1 sqoop-1.4.6]$  jar tvf sqoop-1.4.6.jar | grep .class | grep SqoopJobDataPublisher
[hdfs@zte-1 sqoop-1.4.6]$ 
&lt;/PRE&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="6865-1.jpg" style="width: 1631px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/23484i982AD926A0E6F52E/image-size/medium?v=v2&amp;amp;px=400" role="button" title="6865-1.jpg" alt="6865-1.jpg" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 19 Aug 2019 11:48:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107134#M38100</guid>
      <dc:creator>dreamcoding</dc:creator>
      <dc:date>2019-08-19T11:48:03Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107135#M38101</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/8733/dreamcoding.html" nodeid="8733"&gt;@Ethan Hsieh&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Looks like this version of sqoop does not support integration with atlas. You may have to upgrade the version to sqoop-1.4.6.2.3.99.1-5.jar sandbox (HDP 2.4.0) or use sqoop-1.4.7 or later from apache.&lt;/P&gt;&lt;P&gt;Do let me know if the upgrade resolve the issue&lt;/P&gt;</description>
      <pubDate>Tue, 23 Aug 2016 13:13:44 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107135#M38101</guid>
      <dc:creator>ckrishnakumar</dc:creator>
      <dc:date>2016-08-23T13:13:44Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107136#M38102</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/459/ckrishnakumar.html" nodeid="459"&gt;@ckrishnakumar&lt;/A&gt; &lt;/P&gt;&lt;P&gt;Thank you very much. But I can find sqoop-1.4.7 in the official webside: &lt;A href="http://sqoop.apache.org/"&gt;http://sqoop.apache.org/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;This webside shows that the latest stable release is 1.4.6. It doesn't provide the access for downloading the sqoop-1.4.7.&lt;/P&gt;&lt;P&gt;Could you give me a link to download the 1.4.7 version or send me the jar to my email: dreamcoding@outlook.com&lt;/P&gt;</description>
      <pubDate>Tue, 23 Aug 2016 13:55:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107136#M38102</guid>
      <dc:creator>dreamcoding</dc:creator>
      <dc:date>2016-08-23T13:55:38Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107137#M38103</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/8733/dreamcoding.html" nodeid="8733"&gt;@Ethan Hsieh&lt;/A&gt; I have sent you the jar file. Also you will be able to build this jar file by cloning the sqoop git repo - &lt;A href="https://github.com/apache/sqoop.git"&gt;https://github.com/apache/sqoop.git&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Details of how to compile is provided under - &lt;A href="https://github.com/apache/sqoop/blob/trunk/COMPILING.txt" target="_blank"&gt;https://github.com/apache/sqoop/blob/trunk/COMPILING.txt&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 23 Aug 2016 23:27:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107137#M38103</guid>
      <dc:creator>ckrishnakumar</dc:creator>
      <dc:date>2016-08-23T23:27:29Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107138#M38104</link>
      <description>&lt;P&gt;The Sqoop hook for Atlas is not part of the 2.4.0 release.&lt;/P&gt;</description>
      <pubDate>Fri, 02 Sep 2016 13:38:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107138#M38104</guid>
      <dc:creator>vranganathan</dc:creator>
      <dc:date>2016-09-02T13:38:35Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107139#M38105</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/459/ckrishnakumar.html" nodeid="459"&gt;@Chethana Krishnakumar&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Thank you very much, after imported the 1.4.7.jar package into sqoop1.4.6 does solve the problem. But I am worried that there will be some small problems in the future, so I came up with several solutions:&lt;/P&gt;&lt;P&gt;1. I found that the version of sqoop in HDP is 1.4.6, but as I mentioned before that the sqoop1.4.6 obtained from the official is not complete, I would like to ask you that can give me a full version of the 1.4.6.&lt;/P&gt;&lt;P&gt;2. can you provide me a full version of the sqoop-1.4.7.,not just only the 1.4.7.jar package&lt;/P&gt;&lt;P&gt;3. I even tried the latest release of the sqoop-1.99.7, but the official information only saw the use of it in importing data from the relational database into HDFS, I want to know the operation steps of using it to import datafrom the relational database into Hive.&lt;/P&gt;</description>
      <pubDate>Wed, 14 Sep 2016 22:06:54 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107139#M38105</guid>
      <dc:creator>dreamcoding</dc:creator>
      <dc:date>2016-09-14T22:06:54Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107140#M38106</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/8733/dreamcoding.html" nodeid="8733"&gt;@Ethan Hsieh&lt;/A&gt; &lt;/P&gt;&lt;P&gt;1.You will now be able to find the sqoop hook with &lt;A href="http://hortonworks.com/tech-preview-hdp-2-5/"&gt;http://hortonworks.com/tech-preview-hdp-2-5/&lt;/A&gt; &lt;/P&gt;&lt;P&gt;2.I could provide you with the full version but that may not be a clean fix.Please build sqoop from latest branch on apache &lt;A href="https://github.com/apache/sqoop"&gt;here&lt;/A&gt; which would have all the changes.&lt;/P&gt;&lt;P&gt;3. Could you please post this as a different question as this is related to sqoop client&lt;/P&gt;</description>
      <pubDate>Mon, 19 Sep 2016 13:14:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107140#M38106</guid>
      <dc:creator>ckrishnakumar</dc:creator>
      <dc:date>2016-09-19T13:14:42Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107141#M38107</link>
      <description>&lt;P&gt;@&lt;A href="https://community.hortonworks.com/users/459/ckrishnakumar.html"&gt;Chethana Krishnakumar&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I have the same question.I used ant to compile the project that was downloaded from the github.But when i used the sqoop to import data from mysql to hive,the data could't be imported to hive and the atlas hook didn't work.So i want to know the sqoop project source that the 1.4.7 jar you got from was got from  github?&lt;/P&gt;</description>
      <pubDate>Tue, 20 Sep 2016 19:44:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107141#M38107</guid>
      <dc:creator>wuqi951654775</dc:creator>
      <dc:date>2016-09-20T19:44:36Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop hook doesn't work for atlas?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107142#M38108</link>
      <description>&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="7823-捕获.png" style="width: 666px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/23483iE3CDBFDFE73B51C5/image-size/medium?v=v2&amp;amp;px=400" role="button" title="7823-捕获.png" alt="7823-捕获.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;when the job finished,it gived me these messages.But I can't understand it.&lt;/P&gt;</description>
      <pubDate>Mon, 19 Aug 2019 11:47:54 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Sqoop-hook-doesn-t-work-for-atlas/m-p/107142#M38108</guid>
      <dc:creator>wuqi951654775</dc:creator>
      <dc:date>2019-08-19T11:47:54Z</dc:date>
    </item>
  </channel>
</rss>

