09-02-2015 10:09 AM
Hi Elena,
These are normal messages. Is it possible for you to upload the log somewhere (in the cloud: Google Drive, Dropbox, etc.) from where we can download it and take a look?
Vikas
09-03-2015 08:58 AM
Hi Vikas,
Here are the complete logs. You can download them from the link below; it's a WeTransfer link.
Let me know if you find something.
Thank you, Elena
09-03-2015 02:52 PM
Hi Elena,
So I looked at the logs, and the only instance of a query that Navigator noticed was in May:
2015-05-29 15:00:02,309 INFO hive.ql.parse.ParseDriver: Parsing command: insert OVERWRITE TABLE test1 select c1 as alias from test
Apart from that, there were no other instances in the logs. I also couldn't find an execution/error log that would show why it is failing. Would it be possible for you to send us the jobconf file for the MR job that is launched when you run the Hive query? You may have to look at the JobHistory server to get the job and the HDFS path of the configuration file, and then fetch it from HDFS.
Navigator reads this configuration file to get the Hive query and then parses it to create links to the Hive tables.
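Roughly something like this (a sketch; the job ID and dates below are placeholders, and the history location depends on mapreduce.jobhistory.done-dir, which is commonly /user/history/done on CDH):

# Find the ID of the MR job that was launched for the Hive query
mapred job -list all

# Locate its configuration file under the JobHistory done dir
# (laid out as done/<year>/<month>/<day>/<serial>); the job ID here is a placeholder
hdfs dfs -ls -R /user/history/done | grep job_1441000000000_0042_conf.xml

# Copy the configuration file out of HDFS
hdfs dfs -get /user/history/done/2015/09/03/000000/job_1441000000000_0042_conf.xml .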
Vikas
09-04-2015 06:38 AM
Hi Vikas, we looked in the JobHistory server log and finally found the problem. It was a permissions issue: the loading failed because the user did not have permission to access the files that Cloudera Navigator needed.
This is the error we found in the logs:
2015-09-03 19:01:45,022 ERROR org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager: Error while trying to scan the directory hdfs://fisv-bdvm2.gft.com:8020/staging/history/done_intermediate/root
org.apache.hadoop.security.AccessControlException: Permission denied: user=mapred, access=READ_EXECUTE, inode="/staging/history/done_intermediate/root":root:supergroup:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:151)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6287)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6269)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6194)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:4793)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:4755)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:801)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getListing(AuthorizationProviderProxyClientProtocol.java:310)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:606)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:587)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
We granted permission to the user, and now we can finally see the transformations in Navigator.
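For reference, the grant was roughly along these lines (a sketch, not the exact commands we ran; it assumes the mapred user belongs to the hadoop group, as in a default CDH install):

# Make the intermediate history files group-accessible to mapred
sudo -u hdfs hdfs dfs -chgrp -R hadoop /staging/history/done_intermediate/root

# Alternatively, if HDFS ACLs are enabled (dfs.namenode.acls.enabled=true)
sudo -u hdfs hdfs dfs -setfacl -R -m user:mapred:r-x /staging/history/done_intermediate/root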
The only transformation we could see in Navigator before (the one in May) was actually executed by another user.
Thank you very much for your time and your help in finding the right log.
Elena
06-14-2018 02:41 AM
I have the same issue. I am using Cloudera Navigator 2.11.1, but I can't see my Hive operation/transformation in the data lineage. I searched the log files and the parsing seems to be fine: "Parsing command: insert into table salesdata partition (date_of_sale) select salesperson_id,product_id,date_of_sale from salesdata_source" (from ParseDriver),
and I don't have a problem with permissions.
* My example is: create a table salesdata_source, make a partitioned table salesdata, and insert into salesdata from salesdata_source.
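A minimal sketch of that scenario (column types are guesses; the SET lines are only needed because all partition columns in the insert are dynamic):

hive -e "
CREATE TABLE salesdata_source (salesperson_id INT, product_id INT, date_of_sale STRING);

CREATE TABLE salesdata (salesperson_id INT, product_id INT)
PARTITIONED BY (date_of_sale STRING);

SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

INSERT INTO TABLE salesdata PARTITION (date_of_sale)
SELECT salesperson_id, product_id, date_of_sale FROM salesdata_source;
"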
Thank you