<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Atlas import from hive FAILED with import-hive.sh in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Atlas-import-from-hive-FAILED-with-import-hive-sh/m-p/199004#M76615</link>
    <description>&lt;P&gt;I have resolved this issue! It happens because HDP 2.6.3 ships an older version of Atlas that lacks the appropriate timeout parameters. This bug was fixed in &lt;A href="https://issues.apache.org/jira/browse/ATLAS-690" target="_blank"&gt;https://issues.apache.org/jira/browse/ATLAS-690&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I built Atlas from the master branch and ran the import with the new JARs, adding this parameter: atlas.client.readTimeoutMSecs=600000&lt;/P&gt;&lt;P&gt;Now everything works properly, without timeout exceptions.&lt;/P&gt;</description>
    <pubDate>Fri, 30 Mar 2018 15:27:22 GMT</pubDate>
    <dc:creator>kotsubinsky</dc:creator>
    <dc:date>2018-03-30T15:27:22Z</dc:date>
    <item>
      <title>Atlas import from hive FAILED with import-hive.sh</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Atlas-import-from-hive-FAILED-with-import-hive-sh/m-p/199003#M76614</link>
      <description>&lt;P&gt;I use Apache Atlas from HDP 2.6.3.&lt;/P&gt;&lt;P&gt;When I run import-hive.sh, the script imports some databases and tables (but not all of them), and is then interrupted by the following exception:&lt;/P&gt;&lt;P&gt;2018-03-29 11:14:27,257 INFO  - [main:] ~ Importing objects from DB.table1 (HiveMetaStoreBridge:433)
2018-03-29 11:15:27,356 ERROR - [main:] ~ Import failed for hive_table table1 (HiveMetaStoreBridge:326)
org.apache.atlas.hook.AtlasHookException: HiveMetaStoreBridge.getStorageDescQFName() failed.
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerTable(HiveMetaStoreBridge.java:515)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importTable(HiveMetaStoreBridge.java:289)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importTables(HiveMetaStoreBridge.java:272)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importDatabases(HiveMetaStoreBridge.java:143)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importHiveMetadata(HiveMetaStoreBridge.java:134)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.main(HiveMetaStoreBridge.java:647)
Caused by: com.sun.jersey.api.client.ClientHandlerException: java.net.SocketTimeoutException: Read timed out
        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
        at com.sun.jersey.api.client.Client.handle(Client.java:648)
        at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
        at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
        at com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
        at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:334)
        at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:311)
        at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:199)
        at org.apache.atlas.AtlasClient.callAPIWithBodyAndParams(AtlasClient.java:952)
        at org.apache.atlas.AtlasClient.updateEntity(AtlasClient.java:505)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.updateInstance(HiveMetaStoreBridge.java:526)
        at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerTable(HiveMetaStoreBridge.java:511)
        ... 5 more
Caused by: java.net.SocketTimeoutException: Read timed out
        at java.net.SocketInputStream.socketRead0(Native Method)
        at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
        at java.net.SocketInputStream.read(SocketInputStream.java:171)
        at java.net.SocketInputStream.read(SocketInputStream.java:141)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
        at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:735)
        at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:678)
        at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1587)
        at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1492)
        at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:240)
        at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:147)
        ... 16 more&lt;/P&gt;&lt;P&gt;2018-03-29 10:33:37,303 ERROR - [main:] ~ Able to import 4 tables out of 31 tables from DB. Please check logs for import errors (HiveMetaStoreBridge:279)&lt;/P&gt;&lt;P&gt;I suppose I need to increase some timeout, because the import is interrupted after 60,000 ms. Which timeout setting controls this?&lt;/P&gt;</description>
      <pubDate>Thu, 29 Mar 2018 15:39:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Atlas-import-from-hive-FAILED-with-import-hive-sh/m-p/199003#M76614</guid>
      <dc:creator>kotsubinsky</dc:creator>
      <dc:date>2018-03-29T15:39:22Z</dc:date>
    </item>
    <item>
      <title>Re: Atlas import from hive FAILED with import-hive.sh</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Atlas-import-from-hive-FAILED-with-import-hive-sh/m-p/199004#M76615</link>
      <description>&lt;P&gt;I have resolved this issue! It happens because HDP 2.6.3 ships an older version of Atlas that lacks the appropriate timeout parameters. This bug was fixed in &lt;A href="https://issues.apache.org/jira/browse/ATLAS-690" target="_blank"&gt;https://issues.apache.org/jira/browse/ATLAS-690&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I built Atlas from the master branch and ran the import with the new JARs, adding this parameter: atlas.client.readTimeoutMSecs=600000&lt;/P&gt;&lt;P&gt;Now everything works properly, without timeout exceptions.&lt;/P&gt;</description>
      <pubDate>Fri, 30 Mar 2018 15:27:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Atlas-import-from-hive-FAILED-with-import-hive-sh/m-p/199004#M76615</guid>
      <dc:creator>kotsubinsky</dc:creator>
      <dc:date>2018-03-30T15:27:22Z</dc:date>
    </item>
  </channel>
</rss>

