Member since: 11-08-2016
15 Posts
0 Kudos Received
1 Solution
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 3479 | 05-21-2017 05:45 PM |
09-05-2017
03:50 PM
Hey @Geoffrey Shelton Okot, I did get the message "Hive Data Model imported successfully!". However, I am still not able to capture new lineage. I get the following error: Atlas hook failed due to error
java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1884)
at org.apache.atlas.hive.hook.HiveHook$2.run(HiveHook.java:195)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.atlas.hook.AtlasHookException: HiveHook.registerProcess() failed.
at org.apache.atlas.hive.hook.HiveHook.registerProcess(HiveHook.java:701)
at org.apache.atlas.hive.hook.HiveHook.collect(HiveHook.java:268)
at org.apache.atlas.hive.hook.HiveHook.access$200(HiveHook.java:83)
at org.apache.atlas.hive.hook.HiveHook$2$1.run(HiveHook.java:198)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
... 6 more
Caused by: org.apache.atlas.hook.AtlasHookException: HiveHook.processHiveEntity() failed.
at org.apache.atlas.hive.hook.HiveHook.processHiveEntity(HiveHook.java:731)
at org.apache.atlas.hive.hook.HiveHook.registerProcess(HiveHook.java:668)
... 12 more
Caused by: org.apache.atlas.hook.AtlasHookException: HiveHook.createOrUpdateEntities() failed.
at org.apache.atlas.hive.hook.HiveHook.createOrUpdateEntities(HiveHook.java:597)
at org.apache.atlas.hive.hook.HiveHook.processHiveEntity(HiveHook.java:711)
... 13 more
Caused by: org.apache.atlas.hook.AtlasHookException: HiveHook.createOrUpdateEntities() failed.
at org.apache.atlas.hive.hook.HiveHook.createOrUpdateEntities(HiveHook.java:589)
at org.apache.atlas.hive.hook.HiveHook.createOrUpdateEntities(HiveHook.java:595)
... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.InvalidTableException: Table not found values__tmp__table__8
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1213)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1183)
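For context, as far as I understand Hive rewrites an INSERT ... VALUES statement into a short-lived temporary table named values__tmp__table__N, which is the table the hook fails to look up here. A minimal illustration of the kind of statement that triggers it (the table and column names below are made up):

```sql
-- Hypothetical example: Hive stages the literal rows of an INSERT ... VALUES
-- in a temp table named values__tmp__table__1 (or __2, __3, ...), which is
-- dropped as soon as the query finishes, before the Atlas hook looks it up.
CREATE TABLE demo_tbl (id INT, name STRING);
INSERT INTO TABLE demo_tbl VALUES (1, 'a'), (2, 'b');
```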
09-05-2017
01:30 PM
I can see the tables, but when I make further changes in Hive, such as dropping a table, Atlas is not capturing them. Regarding max.request.size, do you have any more information?
09-05-2017
12:51 PM
@Geoffrey Shelton Okot I did run that script and it completed successfully. But I still see the following error: ERROR AtlasHook.java:141 - Failed to send notification - attempt #2; error=java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.RecordTooLargeException: The message is 1464396 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
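For reference, this is the producer-side setting I am considering bumping, assuming the hook picks up atlas.kafka.* properties from atlas-application.properties and passes them through to its Kafka producer (the value is just an example, roughly 2 MB; the broker's message.max.bytes would also have to be at least this large):

```properties
# atlas-application.properties (illustrative value, ~2 MB)
# Assumption: properties prefixed with atlas.kafka. are forwarded to the Kafka
# producer used by the Atlas hook, so this maps to the producer's max.request.size.
atlas.kafka.max.request.size=2097152
```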
09-02-2017
12:11 AM
Following are the hiveserver2 logs: ERROR HiveHook.java:205 - Atlas hook failed due to error
java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1884)
at org.apache.atlas.hive.hook.HiveHook$2.run(HiveHook.java:195)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.atlas.hook.AtlasHookException: HiveHook.registerProcess() failed.
at org.apache.atlas.hive.hook.HiveHook.registerProcess(HiveHook.java:701)
at org.apache.atlas.hive.hook.HiveHook.collect(HiveHook.java:268)
at org.apache.atlas.hive.hook.HiveHook.access$200(HiveHook.java:83)
at org.apache.atlas.hive.hook.HiveHook$2$1.run(HiveHook.java:198)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
... 6 more
Caused by: org.apache.atlas.hook.AtlasHookException: HiveHook.processHiveEntity() failed.
at org.apache.atlas.hive.hook.HiveHook.processHiveEntity(HiveHook.java:731)
at org.apache.atlas.hive.hook.HiveHook.registerProcess(HiveHook.java:668)
... 12 more
Caused by: org.apache.atlas.hook.AtlasHookException: HiveHook.createOrUpdateEntities() failed.
at org.apache.atlas.hive.hook.HiveHook.createOrUpdateEntities(HiveHook.java:597)
at org.apache.atlas.hive.hook.HiveHook.processHiveEntity(HiveHook.java:711)
... 13 more
Caused by: org.apache.atlas.hook.AtlasHookException: HiveHook.createOrUpdateEntities() failed.
at org.apache.atlas.hive.hook.HiveHook.createOrUpdateEntities(HiveHook.java:589)
at org.apache.atlas.hive.hook.HiveHook.createOrUpdateEntities(HiveHook.java:595)
... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.InvalidTableException: Table not found values__tmp__table__1
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1213)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1183)
at org.apache.atlas.hive.hook.HiveHook.createOrUpdateEntities(HiveHook.java:568)
Any thoughts on how to solve this?
Labels:
- Apache Atlas
- Apache Hive
- Apache Kafka
08-08-2017
09:11 AM
As mentioned in the following post, https://hortonworks.com/blog/recent-improvements-apache-zeppelin-livy-integration/, users can use livy.pyspark3 for Python 3 with Spark. Can anyone help me with which configurations need to be set for this?
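For example, I was thinking of something along these lines in livy-env.sh on the Livy host, assuming Livy forwards PYSPARK_PYTHON to the Spark sessions it starts, but I am not sure this is the right knob for %livy.pyspark3 (the Python path is just an example and may differ per node):

```bash
# livy-env.sh on the Livy server (illustrative; the python3 path is hypothetical).
# Assumption: Livy passes PYSPARK_PYTHON through to the Spark drivers/executors
# it launches, so pyspark sessions run under Python 3.
export PYSPARK_PYTHON=/usr/bin/python3
```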
Labels:
- Apache Zeppelin
06-20-2017
01:11 PM
@gnovak Thanks a lot for the response, just one last question: for proper resource sharing within one queue, does it make sense to enable the fair ordering policy with size-based-weight enabled?
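For reference, this is how I understand those two settings would look in capacity-scheduler.xml (the queue path root.default is just an example):

```xml
<!-- Illustrative capacity-scheduler.xml fragment; the queue path is an example. -->
<property>
  <name>yarn.scheduler.capacity.root.default.ordering-policy</name>
  <value>fair</value>
</property>
<property>
  <!-- Weights the fair ordering by application size/demand, so larger
       applications can receive a proportionally larger share. -->
  <name>yarn.scheduler.capacity.root.default.ordering-policy.fair.enable-size-based-weight</name>
  <value>true</value>
</property>
```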
06-19-2017
11:50 AM
I have a bit of confusion regarding these two properties:
Minimum user limit: which specifies the minimum resource share a user is guaranteed, and
Ordering policy: fifo or fair.
Now if I set the minimum user limit to 100%, how does the second property come into the picture if it is set to "fair"? Also, if I set the minimum user limit to 50% and
user1's job is utilizing 100% of the cluster resources, then
user2 submits a job that requires 20% of the cluster resources: will the resources get distributed as 80%/20%, or will it be 50%/50%?
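For concreteness, these are the two properties I am asking about, as I understand they would appear in capacity-scheduler.xml (the queue path root.default and the values are just examples):

```xml
<!-- Illustrative capacity-scheduler.xml fragment; queue path and values are examples. -->
<property>
  <!-- Each active user in the queue is guaranteed at least this share of the queue. -->
  <name>yarn.scheduler.capacity.root.default.minimum-user-limit-percent</name>
  <value>50</value>
</property>
<property>
  <!-- How applications within the queue are ordered when assigning containers: fifo or fair. -->
  <name>yarn.scheduler.capacity.root.default.ordering-policy</name>
  <value>fair</value>
</property>
```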
Labels:
- Apache YARN
05-21-2017
05:45 PM
Thanks a lot @Geoffrey Shelton Okot, @Namit Maheshwari and @snukavarapu for the support. We found the issue: one of the internal ports was already in use, so the DataNode was not coming up. We finally resolved it by killing the conflicting process and restarting the DataNode.
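For anyone hitting the same thing, this is roughly how the conflicting process can be found (1019 is the data-transfer port in our setup; use whichever of lsof/netstat is available on the host):

```bash
# Find the process currently bound to the DataNode port (1019 in our case),
# then stop it before restarting the DataNode.
lsof -i :1019
netstat -tlnp | grep ':1019'
```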
05-16-2017
07:20 AM
@snukavarapu
Thanks for the quick response. The JCE security libraries are deployed the same as on the other nodes, and krb5.conf is managed by the VASD tool.
05-15-2017
11:03 AM
Guys, we have a kerberized cluster with HDP 2.6. We had to stop and restart the machines over the weekend, but while restarting we are facing an issue with just one DataNode. The error is:
2017-05-15 09:42:41,555 ERROR datanode.DataNode (DataNode.java:secureMain(2691)) - Exception in secureMain
java.lang.RuntimeException: Cannot start secure DataNode without configuring either privileged resources or SASL RPC data transfer protection and SSL for HTTP. Using privileged resources in combination with SASL RPC data transfer protection is not supported.
at org.apache.hadoop.hdfs.server.datanode.DataNode.checkSecureConfig(DataNode.java:1354)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1224)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:456)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2590)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2492)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2539)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2684)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2708)
2017-05-15 09:42:41,557 INFO util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2017-05-15 09:42:41,560 INFO datanode.DataNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG:
We don't have the secure DataNode setup enabled. Can anyone help me out here please? Our dfs.datanode.address is 0.0.0.0:1019.
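From what I can tell, the check behind that message accepts one of two combinations, sketched below with illustrative values. Since our dfs.datanode.address is already on privileged port 1019, the first route would also need the DataNode launched as root via jsvc with HADOOP_SECURE_DN_USER set in hadoop-env.sh, which I assume is what the "secure DataNode" start refers to:

```xml
<!-- Route 1: privileged ports (both below 1024), DataNode started as root via
     jsvc with HADOOP_SECURE_DN_USER set, and dfs.data.transfer.protection NOT set.
     Values are illustrative. -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:1019</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:1022</value>
</property>

<!-- Route 2: SASL data transfer protection plus HTTPS, with non-privileged
     ports and no root/jsvc launch. -->
<property>
  <name>dfs.data.transfer.protection</name>
  <value>authentication</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
```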
Labels: