Member since 09-11-2015 · 269 Posts · 281 Kudos Received · 55 Solutions
12-15-2018 08:57 PM
To allow creation and editing of any other types (including your custom ones), add this property to the conf/atlas-application.properties file: atlas.ui.editable.entity.types=type1,type2,type3,...
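As a concrete sketch, assuming you want the Atlas UI to allow editing of hive_table entities and a hypothetical custom type (the type names below are illustrative placeholders, not defaults):

```
atlas.ui.editable.entity.types=hive_table,my_custom_type
```

A restart of the Atlas server is likely required for changes to atlas-application.properties to take effect.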
11-07-2018 12:48 AM
For HDP-3.x, the HBase table name is 'atlas_janus'.
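The table name is driven by the Atlas graph storage configuration; a sketch of the relevant property in atlas-application.properties, using the HDP-3.x value from this post (verify the property name against your Atlas version's configuration reference):

```
atlas.graph.storage.hbase.table=atlas_janus
```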
12-11-2018 01:24 PM
Hi Ayub, as described in step 1, is it required to create some random ids (e.g. "id":"-11893021824425525") for this JSON request to be successful?
09-13-2018 12:53 PM
At least for versions 2.6.3 and above, the section "Running import script on kerberized cluster" is wrong. You don't need to provide any of the options (properties) indicated (except perhaps the debug one, if you want debug output), because they are automatically detected and included in the script. Also, at least in 2.6.5, direct execution of the script on a Kerberized cluster will fail because of the CLASSPATH generated into the script. I had to edit this, replacing many single JAR files with a glob over their parent folder, in order for the command to run without error. If you have this problem, see my answer to the "Atlas can't see Hive tables" question.
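To illustrate the kind of CLASSPATH edit described above (the paths below are hypothetical examples, not copied from an actual generated script):

```
# Before: the generated script enumerates many individual JARs
CLASSPATH=/usr/hdp/current/hive-client/lib/hive-exec.jar:/usr/hdp/current/hive-client/lib/hive-metastore.jar:...
# After: replace each long enumeration with a glob over the parent folder
CLASSPATH=/usr/hdp/current/hive-client/lib/*:...
```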
09-26-2016 01:13 PM · 7 Kudos
This short post focuses on solving the most common issue encountered while publishing metadata to the Kafka topic for the Atlas server on a secure (Kerberized) cluster.

Issue: With the AtlasHook configured for Hive/Storm/Falcon, if you see the stack trace below in the logs of the corresponding component, it means the AtlasHook is unable to publish metadata to Kafka for Atlas consumption. The reason for this failure could be:
- The Kafka topic to which the hook is trying to publish does not exist, OR
- The Kafka topic does not have proper access control lists (ACLs) configured for the user.

org.apache.kafka.common.KafkaException: Failed to construct kafka producer
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:335)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:188)
at org.apache.atlas.kafka.KafkaNotification.createProducer(KafkaNotification.java:312)
at org.apache.atlas.kafka.KafkaNotification.sendInternal(KafkaNotification.java:220)
at org.apache.atlas.notification.AbstractNotification.send(AbstractNotification.java:84)
at org.apache.atlas.hook.AtlasHook.notifyEntitiesInternal(AtlasHook.java:126)
at org.apache.atlas.hook.AtlasHook.notifyEntities(AtlasHook.java:111)
at org.apache.atlas.hook.AtlasHook.notifyEntities(AtlasHook.java:157)
at org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:274)
at org.apache.atlas.hive.hook.HiveHook.access$200(HiveHook.java:81)
at org.apache.atlas.hive.hook.HiveHook$2.run(HiveHook.java:185)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:86)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:71)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:83)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:277)
... 15 more
Caused by: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:940)
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:69)
at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:110)
at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:46)
at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:68)
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:78)
... 18 more
Resolution: Below are the steps required in secure environments to set up the Kafka topics used by Atlas:

1. Log in with the Kafka service user identity.
2. Create the Kafka topics ATLAS_HOOK and ATLAS_ENTITIES with the following commands:
$KAFKA_HOME/bin/kafka-topics.sh --zookeeper $ZK_ENDPOINT --topic ATLAS_HOOK --create --partitions 1 --replication-factor $KAFKA_REPL_FACTOR
$KAFKA_HOME/bin/kafka-topics.sh --zookeeper $ZK_ENDPOINT --topic ATLAS_ENTITIES --create --partitions 1 --replication-factor $KAFKA_REPL_FACTOR
3. Set up ACLs on these topics with the following commands:
$KAFKA_HOME/bin/kafka-acls.sh --authorizer-properties zookeeper.connect=$ZK_ENDPOINT --add --topic ATLAS_HOOK --allow-principal User:* --producer
$KAFKA_HOME/bin/kafka-acls.sh --authorizer-properties zookeeper.connect=$ZK_ENDPOINT --add --topic ATLAS_HOOK --allow-principal User:$ATLAS_USER --consumer --group atlas
$KAFKA_HOME/bin/kafka-acls.sh --authorizer-properties zookeeper.connect=$ZK_ENDPOINT --add --topic ATLAS_ENTITIES --allow-principal User:$ATLAS_USER --producer
$KAFKA_HOME/bin/kafka-acls.sh --authorizer-properties zookeeper.connect=$ZK_ENDPOINT --add --topic ATLAS_ENTITIES --allow-principal User:$RANGER_USER --consumer --group ranger_entities_consumer
4. If Ranger authorization is enabled for Kafka, Ranger policies should be set up for the following accesses:
topic ATLAS_HOOK: { group=public; permission=publish }; { user=$ATLAS_USER; permission=consume }
topic ATLAS_ENTITIES: { user=$ATLAS_USER; permission=publish }; { user=$RANGER_USER; permission=consume }
5. Also check that the atlas-application.properties file under the hook (Storm/Hive/Falcon) component configuration directory (typically under /etc/storm/conf) has the right keytab and principal information. Below are the two properties you should look for:
atlas.jaas.KafkaClient.option.principal=<component_principal>
atlas.jaas.KafkaClient.option.keyTab=<component_keytab_path>
For example:
atlas.jaas.KafkaClient.option.principal=storm-cl1/_HOST@EXAMPLE.COM
atlas.jaas.KafkaClient.option.keyTab=/etc//keytabs/storm.headless.keytab

Where:
- KAFKA_HOME is typically /usr/hdp/current/kafka-broker
- ZK_ENDPOINT should be set to the ZooKeeper URL for Kafka
- KAFKA_REPL_FACTOR should be set to the value of the Atlas configuration 'atlas.notification.replicas'
- ATLAS_USER should be the Kerberos identity of the Atlas server, typically 'atlas'
- RANGER_USER should be the Kerberos identity of the Ranger Tagsync process, typically 'rangertagsync'
09-26-2016 10:30 AM · 1 Kudo
A good, descriptive article on how to install Atlas HA via Ambari.