Member since: 01-12-2021
Posts: 10
Kudos Received: 4
Solutions: 2
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 3012 | 09-20-2022 07:30 PM |
 | 2482 | 03-08-2021 10:20 PM |
04-12-2023
08:56 PM
There might be the below possible causes:
1. If the script runs fine manually under your user, the problem may be with the binary path. Export the kinit binary path in the script: export PATH="/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/java/latest/bin:/usr/java/latest/jre/bin:/root/bin:/bin"
2. Check the permissions on the keytab file and make sure the user has the required read/write/execute permissions.
3. If the above does not work, add the ticket generation command separately in cron every 10 minutes and test the script, as in the sketch below: */10 * * * * kinit -kt /root/user.keytab user@PROD.EDH
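Combining points 1 and 3, a minimal sketch of a wrapper script called from cron could look like the following; the script location, keytab path, principal, and log file are placeholders taken from or added to the example above, so adjust them to your environment.

```
#!/bin/bash
# /root/kinit_renew.sh - renew the Kerberos ticket from cron (sketch; adjust paths and principal)
export PATH="/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/java/latest/bin:/usr/java/latest/jre/bin:/root/bin:/bin"

KEYTAB=/root/user.keytab        # keytab path from the example above
PRINCIPAL=user@PROD.EDH         # principal from the example above

# Renew the ticket and log any failure so cron-side problems stay visible
if ! kinit -kt "$KEYTAB" "$PRINCIPAL"; then
    echo "$(date) kinit failed for $PRINCIPAL" >> /var/log/kinit_renew.log
fi
```

The cron entry then calls the script every 10 minutes:

```
*/10 * * * * /root/kinit_renew.sh
```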
02-09-2023
08:16 PM
If the problem processor generates more flow files than your existing files, change the processor configuration -> Group Result from None to ALL.
09-20-2022
07:30 PM
1 Kudo
@Manus Use the below Ranger REST API to fetch the audit logs:
curl -v --insecure --anyauth --user username:password -H "Accept: application/json" -H "Content-Type: application/json" -X GET https://RANGER_HOST:6182/service/assets/accessAudit
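If the raw response is hard to read, a hedged variation is to save it to a file and pretty-print it, mirroring the json.tool pipe used in the policy example below; the host, credentials, and output path are placeholders.

```
# Fetch the access audit records and pretty-print them (sketch; replace host/credentials/path)
curl -s --insecure --anyauth --user username:password \
  -H "Accept: application/json" -H "Content-Type: application/json" \
  -X GET https://RANGER_HOST:6182/service/assets/accessAudit \
  | python -m json.tool > /tmp/ranger_audit.json
```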
02-07-2022
04:12 AM
To fetch the policy details in JSON format, use the below command:
curl -v -k -u {username} -H "Content-Type: application/json" -H "Accept: application/json" -X GET https://{Ranger_Host}:6182/service/public/v2/api/service/cm_hive/policy/ | python -m json.tool
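If only the policy names are of interest, a hedged follow-up is to save the pretty-printed output to a file (the /tmp path is just an example) and grep for the "name" fields carried by the Ranger v2 policy objects:

```
# Save the pretty-printed policy dump, then list just the policy names from it (sketch)
curl -s -k -u {username} -H "Content-Type: application/json" -H "Accept: application/json" \
  -X GET https://{Ranger_Host}:6182/service/public/v2/api/service/cm_hive/policy/ \
  | python -m json.tool > /tmp/cm_hive_policies.json
grep '"name"' /tmp/cm_hive_policies.json
```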
01-18-2022
01:20 AM
In my case, I observed that the Spark job was working fine on some hosts and hitting the above exception on a couple of worker hosts. It turned out to be an issue with the spark-submit version on those hosts: on the working hosts spark-submit reported version 2.4.7.7.1.7.0-551, while on the non-working hosts it reported version 3.1.2. I created a symbolic link to the correct spark-submit binary and the issue was resolved. ``` [root@host bin]# cd /usr/local/bin [root@host bin]# ln -s /etc/alternatives/spark-submit spark-submit ```
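Before creating the link, a quick check (a sketch, run on each affected host) is to confirm which spark-submit the PATH resolves to and what version it reports:

```
# See which spark-submit wins on the PATH and which version it reports
which spark-submit
spark-submit --version

# Compare the binary under /usr/local/bin with the alternatives target used above
ls -l /usr/local/bin/spark-submit /etc/alternatives/spark-submit
```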
12-01-2021
08:51 PM
If you are using the Postgres DB, then use the below queries to get the table information:
1. Log in to the DB server and execute the below query: ./psql -d hive -c "SELECT \"NAME\", \"TBL_NAME\" FROM \"DBS\" as a, \"TBLS\" as b where a.\"DB_ID\"=b.\"DB_ID\";" > /tmp/tables1.txt
2. Extract the database and table name columns, then count the rows:
> awk '{print $1" " $3}' /tmp/tables1.txt >> /tmp/tables.txt
> cat /tmp/tables1.txt | wc -l
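As a hedged variant on the same DBS/TBLS join, the metastore can also report the table count per database directly, which avoids the awk/wc post-processing; run it the same way from the DB server.

```
# Count tables per Hive database straight from the metastore (sketch)
./psql -d hive -c "SELECT a.\"NAME\", count(*) AS table_count FROM \"DBS\" a, \"TBLS\" b WHERE a.\"DB_ID\"=b.\"DB_ID\" GROUP BY a.\"NAME\" ORDER BY table_count DESC;"
```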
03-24-2021
11:37 PM
1 Kudo
Hello @proxim, assign the valid ownership and permissions for the file:
chown user:user /var/run/hue/hue_krb5_ccache
chmod 755 /var/run/hue/hue_krb5_ccache
Thanks, Pandurang
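A quick way to confirm the change took effect (a sketch; user:user above stands in for the actual Hue service account):

```
# Verify the owner, group, and permission bits on the credential cache
ls -l /var/run/hue/hue_krb5_ccache
stat -c '%U:%G %a %n' /var/run/hue/hue_krb5_ccache
```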
03-24-2021
03:26 AM
1 Kudo
Hello @DanHL, which Python version are you using? It looks like python-setuptools is not installed on your system. If you are using Python 3, try installing setuptools using pip3:
> pip3 install setuptools
> yum install impala-shell
Please accept the solution if it works. Thanks
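As a small check (a sketch, assuming Python 3), confirm the interpreter version and that setuptools is importable before retrying the impala-shell install:

```
# Show the Python 3 version and the installed setuptools version (fails if setuptools is missing)
python3 --version
python3 -c "import setuptools; print(setuptools.__version__)"
```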
03-08-2021
10:20 PM
1 Kudo
Hello @Amn_468, could you please check the value under HDFS -> Configuration -> File Descriptor Monitoring Thresholds and try increasing the monitoring threshold? Please find the attached screenshot for reference. Please accept the solution if it works. Thanks
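Before raising the threshold, it can help to compare the role's actual open file descriptors with its limit on the affected host; this is only a sketch, with the DataNode used as an example role.

```
# Find the DataNode PID, count its open descriptors, and show its configured limit
DN_PID=$(pgrep -f org.apache.hadoop.hdfs.server.datanode.DataNode | head -1)
ls /proc/"$DN_PID"/fd | wc -l
grep 'Max open files' /proc/"$DN_PID"/limits
```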