
Failed to initialize Atlas client using spark-atlas-connector

Explorer

COMMAND:

spark-shell \
  --jars /home/atlas/tt/spark-atlas-connector-assembly_2.11-0.1.0-SNAPSHOT.jar \
  --master local \
  --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --conf spark.sql.streaming.streamingQueryListeners=com.hortonworks.spark.atlas.SparkAtlasStreamingQueryEventTracker


ERROR:

ERROR SparkAtlasEventTracker: Fail to initialize Atlas client, stop this listener

org.apache.atlas.AtlasServiceException: Metadata service API org.apache.atlas.AtlasClientV2$API_V2@398694a6 failed with status 401 (Unauthorized) Response Body ()

at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:395)

at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:323)

at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:239)

at org.apache.atlas.AtlasClientV2.getAllTypeDefs(AtlasClientV2.java:124)

at com.hortonworks.spark.atlas.RestAtlasClient.getAtlasTypeDefs(RestAtlasClient.scala:58)

at com.hortonworks.spark.atlas.types.SparkAtlasModel$$anonfun$checkAndGroupTypes$1.apply(SparkAtlasModel.scala:107)

at com.hortonworks.spark.atlas.types.SparkAtlasModel$$anonfun$checkAndGroupTypes$1.apply(SparkAtlasModel.scala:104)

at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)

at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)

at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)

at com.hortonworks.spark.atlas.types.SparkAtlasModel$.checkAndGroupTypes(SparkAtlasModel.scala:104)

at com.hortonworks.spark.atlas.types.SparkAtlasModel$.checkAndCreateTypes(SparkAtlasModel.scala:71)

at com.hortonworks.spark.atlas.SparkAtlasEventTracker.initializeSparkModel(SparkAtlasEventTracker.scala:108)

at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:48)

at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:39)

at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:43)

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)

at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

at java.lang.reflect.Constructor.newInstance(Constructor.java:423)

at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2747)

at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2736)

at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)

at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)

at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74)

at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)

at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)

at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2736)

at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2360)

at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2359)

at scala.Option.foreach(Option.scala:257)

at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2359)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:554)

at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)

at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934)

at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)

at scala.Option.getOrElse(Option.scala:121)

at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925)

at org.apache.spark.repl.Main$.createSparkSession(Main.scala:103)

at $line3.$read$$iw$$iw.<init>(<console>:15)

at $line3.$read$$iw.<init>(<console>:43)

at $line3.$read.<init>(<console>:45)

at $line3.$read$.<init>(<console>:49)

at $line3.$read$.<clinit>(<console>)

at $line3.$eval$.$print$lzycompute(<console>:7)

at $line3.$eval$.$print(<console>:6)

at $line3.$eval.$print(<console>)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)

at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)

at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)

at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)

at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)

at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)

at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)

at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)

at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)

at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)

at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)

at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:79)

at scala.collection.immutable.List.foreach(List.scala:381)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:79)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:79)

at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:78)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)

at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:78)

at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)

at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:77)

at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:110)

at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)

at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)

at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)

at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)

at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)

at org.apache.spark.repl.Main$.doMain(Main.scala:76)

at org.apache.spark.repl.Main$.main(Main.scala:56)

at org.apache.spark.repl.Main.main(Main.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)

at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)

at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)

at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)



New Contributor

In the class spark-atlas-connector/spark-atlas-connector/src/main/scala/com/hortonworks/spark/atlas/AtlasClientConf.scala, the default username and password are:

val CLIENT_USERNAME = ConfigEntry("atlas.client.username", "admin")
val CLIENT_PASSWORD = ConfigEntry("atlas.client.password", "admin123")

You are seeing the 401 (Unauthorized) error because the Atlas credentials configured on your system do not match these defaults. To fix this, pass an atlas-application.properties file with the correct credentials, as described in the README.
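A minimal atlas-application.properties might look like the sketch below. The two client properties follow the ConfigEntry keys quoted above; the atlas.rest.address value and the credentials are placeholders, not values from this thread — substitute your cluster's actual Atlas endpoint and login:

```properties
# Hypothetical example values -- replace with your Atlas server's
# actual endpoint and the credentials configured on your cluster.
atlas.rest.address=http://atlas-host.example.com:21000
atlas.client.username=admin
atlas.client.password=your-actual-password
```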

New Contributor

In the class spark-atlas-connector/spark-atlas-connector/src/main/scala/com/hortonworks/spark/atlas/AtlasClientConf.scala, the default username and password are:

val CLIENT_USERNAME = ConfigEntry("atlas.client.username", "admin")
val CLIENT_PASSWORD = ConfigEntry("atlas.client.password", "admin123")

I was facing the same issue because the credentials on my system did not match these defaults. I fixed it by changing the password directly in the code. The README says you can instead pass an atlas-application.properties file, but even after supplying that file I still hit the same error.
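If changing the code is not an option, one thing worth checking is whether the properties file is actually visible to the driver: the client typically loads atlas-application.properties from the driver's classpath. A hedged sketch, assuming the file lives in /etc/atlas/conf (a hypothetical path — use wherever yours resides), adds that directory via the standard --driver-class-path option:

```
# --driver-class-path is a standard spark-shell/spark-submit option;
# it puts the directory holding atlas-application.properties on the
# driver's classpath so the Atlas client can find the file.
spark-shell \
  --jars /home/atlas/tt/spark-atlas-connector-assembly_2.11-0.1.0-SNAPSHOT.jar \
  --driver-class-path /etc/atlas/conf \
  --master local \
  --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --conf spark.sql.streaming.streamingQueryListeners=com.hortonworks.spark.atlas.SparkAtlasStreamingQueryEventTracker
```

If the file is on the classpath and the 401 persists, the credentials inside it may still be wrong for your Atlas server, or (on a kerberized cluster) Atlas may be expecting Kerberos rather than basic authentication.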