Member since: 11-30-2016
Posts: 85
Kudos Received: 7
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 3223 | 03-15-2017 07:33 AM |
03-06-2017
04:09 AM
@Ayub Khan Sort of. I am able to associate one entity/attribute with a trait at a time, and to do that I am following an iterative process, which is time-consuming. Can't we bulk upload/update tagged columns/entities in one go? For example:
guid_1 public
guid_2 private
guid_3 public
Please let me know the curl command to do a bulk update. I have gone through the whole doc and haven't found anything!
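A rough sketch of a scripted workaround, assuming the v1 per-entity trait endpoint (POST /api/atlas/entities/{guid}/traits) and that the public/private traits already exist as trait types in Atlas; the endpoint and body shape are not verified against this HDP build:

# Hedged sketch, not a documented Atlas bulk API: loop the per-entity trait
# association call over a "guid trait" list. Credentials, host, and the
# JSON body format are assumptions for the Atlas v1 API on HDP 2.5.
while read -r guid trait; do
  curl -u admin:admin \
       -H 'Content-Type: application/json' \
       -X POST "http://localhost:21000/api/atlas/entities/${guid}/traits" \
       -d "{\"jsonClass\":\"org.apache.atlas.typesystem.json.InstanceSerialization\$_Struct\",\"typeName\":\"${trait}\",\"values\":{}}"
done <<'EOF'
guid_1 public
guid_2 private
guid_3 public
EOF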
02-08-2017
03:04 PM
1 Kudo
In our org, we are trying to automate the process of tagging attributes (Hive columns, tables, etc.). I can create tags using the REST API. What I am not getting is how to associate a column with a tag. Any advice would be a great help! Thanks in advance, Subash
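A minimal sketch of the association call, assuming the Atlas v1 REST API shipped with HDP 2.5; the trait name PII, the GUID placeholder, and the admin:admin credentials are illustrative only, and the body format may differ on other Atlas versions:

# Attach an already-created trait (hypothetical name "PII") to a hive_column
# entity identified by its GUID. Endpoint and body shape are assumptions for
# the Atlas v1 API; verify against your Atlas version before relying on them.
GUID="REPLACE_WITH_COLUMN_GUID"
curl -u admin:admin \
     -H 'Content-Type: application/json' \
     -X POST "http://localhost:21000/api/atlas/entities/${GUID}/traits" \
     -d '{"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Struct","typeName":"PII","values":{}}'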
Labels:
- Apache Atlas
12-12-2016
06:17 PM
Hey @Sunile Manjee, thank you. I disabled the Kafka Ranger plugin and it worked. Now Atlas is capturing real-time changes occurring in Hive.
12-12-2016
02:54 PM
Hey @Mats Johansson, as you said, I have configured the JAVA_HOME path; now I am running into a time-out issue. Please find the log below:
sh import-hive.sh
/usr/lib/jvm/java/bin/java
/usr/lib/jvm/java/bin/jar
Using Hive configuration directory [/etc/hive/conf]
Log file for import is /usr/hdp/2.5.3.0-37/atlas/logs/import-hive.log
2016-12-12 14:28:03,318 INFO - [main:] ~ Looking for atlas-application.properties in classpath (ApplicationProperties:73)
2016-12-12 14:28:03,322 INFO - [main:] ~ Loading atlas-application.properties from file:/etc/hive/2.5.3.0-37/0/atlas-application.properties (ApplicationProperties:86)
2016-12-12 14:28:03,374 DEBUG - [main:] ~ Configuration loaded: (ApplicationProperties:99)
2016-12-12 14:28:03,374 DEBUG - [main:] ~ atlas.authentication.method.kerberos = False (ApplicationProperties:102)
2016-12-12 14:28:03,376 DEBUG - [main:] ~ atlas.cluster.name = governance (ApplicationProperties:102)
2016-12-12 14:28:03,376 DEBUG - [main:] ~ atlas.hook.hive.keepAliveTime = 10 (ApplicationProperties:102)
2016-12-12 14:28:03,376 DEBUG - [main:] ~ atlas.hook.hive.maxThreads = 5 (ApplicationProperties:102)
2016-12-12 14:28:03,376 DEBUG - [main:] ~ atlas.hook.hive.minThreads = 5 (ApplicationProperties:102)
2016-12-12 14:28:03,376 DEBUG - [main:] ~ atlas.hook.hive.numRetries = 3 (ApplicationProperties:102)
2016-12-12 14:28:03,376 DEBUG - [main:] ~ atlas.hook.hive.queueSize = 1000 (ApplicationProperties:102)
2016-12-12 14:28:03,376 DEBUG - [main:] ~ atlas.hook.hive.synchronous = false (ApplicationProperties:102)
2016-12-12 14:28:03,376 DEBUG - [main:] ~ atlas.kafka.bootstrap.servers = (ApplicationProperties:102)
2016-12-12 14:28:03,377 DEBUG - [main:] ~ atlas.kafka.hook.group.id = atlas (ApplicationProperties:102)
2016-12-12 14:28:03,377 DEBUG - [main:] ~ atlas.kafka.zookeeper.connect = (ApplicationProperties:102)
2016-12-12 14:28:03,377 DEBUG - [main:] ~ atlas.kafka.zookeeper.connection.timeout.ms = 200 (ApplicationProperties:102)
2016-12-12 14:28:03,377 DEBUG - [main:] ~ atlas.kafka.zookeeper.session.timeout.ms = 400 (ApplicationProperties:102)
2016-12-12 14:28:03,379 DEBUG - [main:] ~ atlas.kafka.zookeeper.sync.time.ms = 20 (ApplicationProperties:102)
2016-12-12 14:28:03,379 DEBUG - [main:] ~ atlas.notification.create.topics = True (ApplicationProperties:102)
2016-12-12 14:28:03,379 DEBUG - [main:] ~ atlas.notification.replicas = 1 (ApplicationProperties:102)
2016-12-12 14:28:03,379 DEBUG - [main:] ~ atlas.notification.topics = [ATLAS_HOOK, ATLAS_ENTITIES] (ApplicationProperties:102)
2016-12-12 14:28:03,379 DEBUG - [main:] ~ atlas.rest.address = (ApplicationProperties:102)
2016-12-12 14:28:03,380 DEBUG - [main:] ~ ==> InMemoryJAASConfiguration.init() (InMemoryJAASConfiguration:168)
2016-12-12 14:28:03,383 DEBUG - [main:] ~ ==> InMemoryJAASConfiguration.init() (InMemoryJAASConfiguration:181)
2016-12-12 14:28:03,387 DEBUG - [main:] ~ ==> InMemoryJAASConfiguration.initialize() (InMemoryJAASConfiguration:220)
2016-12-12 14:28:03,387 DEBUG - [main:] ~ <== InMemoryJAASConfiguration.initialize() (InMemoryJAASConfiguration:347)
2016-12-12 14:28:03,387 DEBUG - [main:] ~ <== InMemoryJAASConfiguration.init() (InMemoryJAASConfiguration:190)
2016-12-12 14:28:03,387 DEBUG - [main:] ~ <== InMemoryJAASConfiguration.init() (InMemoryJAASConfiguration:177)
Enter username for atlas :-
admin
Enter password for atlas :-
admin
2016-12-12 14:28:09,373 INFO - [main:] ~ Client has only one service URL, will use that for all actions: http://localhost:21000 (AtlasClient:265)
2016-12-12 14:28:10,427 WARN - [main:] ~ Unable to load native-hadoop library for your platform... using builtin-java classes where applicable (NativeCodeLoader:62)
2016-12-12 14:28:10,778 DEBUG - [main:] ~ Using resource localhost:21000/api/atlas/types/hdfs_path for 0 times (AtlasClient:784)
2016-12-12 14:28:10,804 DEBUG - [main:] ~ API localhost:21000/api/atlas/types/hdfs_path returned status 200 (AtlasClient:1191)
2016-12-12 14:28:11,010 INFO - [main:] ~ HDFS data model is already registered! (HiveMetaStoreBridge:609)
2016-12-12 14:28:11,011 DEBUG - [main:] ~ Using resource localhost:21000/api/atlas/types/hive_process for 0 times (AtlasClient:784)
2016-12-12 14:28:11,022 DEBUG - [main:] ~ API localhost:21000/api/atlas/types/hive_process returned status 200 (AtlasClient:1191)
2016-12-12 14:28:11,042 INFO - [main:] ~ Hive data model is already registered! (HiveMetaStoreBridge:624)
2016-12-12 14:28:11,042 INFO - [main:] ~ Importing hive metadata (HiveMetaStoreBridge:117)
2016-12-12 14:28:11,045 DEBUG - [main:] ~ Getting reference for database default (HiveMetaStoreBridge:211)
2016-12-12 14:28:11,046 DEBUG - [main:] ~ Using resource localhost:21000/api/atlas/entities?type=hive_db&property=qualifiedName&value=default@governance for 0 times (AtlasClient:784)
2016-12-12 14:28:11,068 DEBUG - [main:] ~ API localhost:21000/api/atlas/entities?type=hive_db&property=qualifiedName&value=default@governance returned status 200 (AtlasClient:1191)
2016-12-12 14:28:11,687 INFO - [main:] ~ Database default is already registered with id bdd91811-250d-43c7-b814-b90124960f5a. Updating it. (HiveMetaStoreBridge:157)
2016-12-12 14:28:11,687 INFO - [main:] ~ Importing objects from databaseName : default (HiveMetaStoreBridge:166)
2016-12-12 14:28:11,688 DEBUG - [main:] ~ updating instance of type hive_db (HiveMetaStoreBridge:501)
2016-12-12 14:28:11,711 DEBUG - [main:] ~ Updating entity hive_db = {
"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"id":{
"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"id":"bdd91811-250d-43c7-b814-b90124960f5a",
"version":0,
"typeName":"hive_db",
"state":"ACTIVE"
},
"typeName":"hive_db",
"values":{
"name":"default",
"location":"hdfs://localhost:8020/apps/hive/warehouse",
"description":"Default Hive database",
"ownerType":2,
"qualifiedName":"default@governance",
"owner":"public",
"clusterName":"governance",
"parameters":{
}
},
"traitNames":[
],
"traits":{
}
} (HiveMetaStoreBridge:504)
2016-12-12 14:28:11,717 DEBUG - [main:] ~ Updating entity id bdd91811-250d-43c7-b814-b90124960f5a with {
"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
"id":{
"jsonClass":"org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
"id":"bdd91811-250d-43c7-b814-b90124960f5a",
"version":0,
"typeName":"hive_db",
"state":"ACTIVE"
},
"typeName":"hive_db",
"values":{
"name":"default",
"location":"hdfs://localhost:8020/apps/hive/warehouse",
"description":"Default Hive database",
"ownerType":2,
"qualifiedName":"default@governance",
"owner":"public",
"clusterName":"governance",
"parameters":{
}
},
"traitNames":[
],
"traits":{
}
} (AtlasClient:807)
2016-12-12 14:28:11,717 DEBUG - [main:] ~ Using resource http://localhost:21000/api/atlas/entities/bdd91811-250d-43c7-b814-b90124960f5a for 0 times (AtlasClient:784)
2016-12-12 14:29:11,766 WARN - [main:] ~ Handled exception in calling api api/atlas/entities (AtlasClient:791)
com.sun.jersey.api.client.ClientHandlerException: java.net.SocketTimeoutException: Read timed out
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
at com.sun.jersey.api.client.filter.HTTPBasicAuthFilter.handle(HTTPBasicAuthFilter.java:81)
at com.sun.jersey.api.client.Client.handle(Client.java:648)
at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
at com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:1188)
at org.apache.atlas.AtlasClient.callAPIWithRetries(AtlasClient.java:785)
at org.apache.atlas.AtlasClient.callAPI(AtlasClient.java:1214)
at org.apache.atlas.AtlasClient.updateEntity(AtlasClient.java:808)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.updateInstance(HiveMetaStoreBridge.java:506)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerDatabase(HiveMetaStoreBridge.java:159)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importDatabases(HiveMetaStoreBridge.java:124)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importHiveMetadata(HiveMetaStoreBridge.java:118)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.main(HiveMetaStoreBridge.java:662)
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:152)
at java.net.SocketInputStream.read(SocketInputStream.java:122)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:690)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:633)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1371)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:468)
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:240)
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:147)
... 14 more
2016-12-12 14:29:11,768 WARN - [main:] ~ Exception's cause: class java.net.SocketTimeoutException (AtlasClient:792)
Exception in thread "main" com.sun.jersey.api.client.ClientHandlerException: java.net.SocketTimeoutException: Read timed out
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
at com.sun.jersey.api.client.filter.HTTPBasicAuthFilter.handle(HTTPBasicAuthFilter.java:81)
at com.sun.jersey.api.client.Client.handle(Client.java:648)
at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
at com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:1188)
at org.apache.atlas.AtlasClient.callAPIWithRetries(AtlasClient.java:785)
at org.apache.atlas.AtlasClient.callAPI(AtlasClient.java:1214)
at org.apache.atlas.AtlasClient.updateEntity(AtlasClient.java:808)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.updateInstance(HiveMetaStoreBridge.java:506)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerDatabase(HiveMetaStoreBridge.java:159)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importDatabases(HiveMetaStoreBridge.java:124)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importHiveMetadata(HiveMetaStoreBridge.java:118)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.main(HiveMetaStoreBridge.java:662)
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:152)
at java.net.SocketInputStream.read(SocketInputStream.java:122)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:690)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:633)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1371)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:468)
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:240)
at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:147)
... 14 more
Failed to import Hive Data Model!!!
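One detail worth noting in the log: the update call fails exactly 60 seconds after it is issued (14:28:11 to 14:29:11), which looks like a default client read timeout, i.e. the Atlas server accepts the request but does not answer in time. As a hedged diagnostic tweak only (the property names below are assumptions about the Atlas client configuration, not taken from the docs for this build), the timeouts can be raised in the atlas-application.properties that the script loads, for example:

# Assumed Atlas client properties; verify they exist in your Atlas version.
atlas.client.readTimeoutMSecs=300000
atlas.client.connectTimeoutMSecs=300000

If the call still hangs, the delay is on the server side; in this thread the root cause turned out to involve the Kafka Ranger plugin, per the 12-12-2016 06:17 PM reply above.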
12-12-2016
06:26 AM
/usr/lib/jvm/jre/bin/java and/or /usr/lib/jvm/jre/bin/jar not found on the system. Please make sure java and jar commands are available.
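For anyone hitting the same message, a minimal sketch of the fix reflected in the later log above (pointing JAVA_HOME at a full JDK under /usr/lib/jvm/java, since the JRE does not ship the jar command); the JDK path is taken from that log and may differ on your host:

# import-hive.sh needs both java and jar on the path; the JRE at
# /usr/lib/jvm/jre has no jar, so point JAVA_HOME at the JDK instead.
export JAVA_HOME=/usr/lib/jvm/java     # path as seen in the later log; adjust per host
export PATH="$JAVA_HOME/bin:$PATH"
which java jar                         # both should resolve before re-running
sh import-hive.sh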
12-12-2016
06:16 AM
Hey guys, I am trying to import Hive metadata into Apache Atlas. I ran import-hive.sh but got the notification below. Thank you, Subash
Labels:
- Apache Atlas
- Apache Hive
12-09-2016
11:51 AM
@Sindhu Hey Sindhu, can you please let me know the path of the configuration file where I need to make these changes, and which services need to be restarted after committing the changes? Thank you, Subash
12-09-2016
10:13 AM
hiveproxyhost.png snapshot attached.
12-09-2016
10:12 AM
@Rajkumar Singh Yes, I did, and I have restarted HDFS and the related services, but I am still facing the issue.
12-09-2016
09:47 AM
1 Kudo
Hey guys, I am able to query data through the admin Hive view in Ambari, and it displays the output properly. But whenever I run a query, the console displays the error "Unauthorized connection for super-user: root from IP ***.***.***.*", and the output also shows "Unauthorized connection for super-user: root from IP" for the data nodes' IP addresses. Please let me know which configuration property needs to be changed to resolve this issue. Thanks in advance, Subash
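For reference, this message usually points at the HDFS proxy-user settings for whatever user the view runs as (root here). A hedged sketch of the core-site.xml entries involved, shown in key = value form (added via Ambari under the HDFS configuration and followed by an HDFS restart; '*' is only the permissive example and the exact values depend on your security policy):

hadoop.proxyuser.root.hosts = *
hadoop.proxyuser.root.groups = *

If these are already set, as the follow-up replies above suggest, one common catch is that the hosts value must cover the host the Ambari view actually connects from.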
Labels:
- Apache Ambari
- Apache Hadoop
- Apache Hive