Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 612 | 06-04-2025 11:36 PM |
| | 1180 | 03-23-2025 05:23 AM |
| | 584 | 03-17-2025 10:18 AM |
| | 2189 | 03-05-2025 01:34 PM |
| | 1376 | 03-03-2025 01:09 PM |
10-06-2019
06:56 AM
Hi, I'm facing the same issue. Have you had any luck?
10-05-2019
04:24 AM
@erkansirin78 Great, it worked for you! If you found this answer addressed your question, please take a moment to log in, click the thumbs-up button, and mark it as the solution. That helps the Cloudera Community find the fix for these kinds of errors quickly.
10-04-2019
01:05 PM
@saivenkatg55 What are the permissions on that file? Check them with:

$ ls -l /var/log/hadoop-yarn/yarn/hadoop-yarn-nodemanager-<host_name>.org.out

They should be -rw-r--r-- 1 yarn hadoop. If they differ, set them with:

# chmod 644 /var/log/hadoop-yarn/yarn/hadoop-yarn-nodemanager-<host_name>.org.out

Ownership should be yarn:hadoop.
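A minimal Python sketch of that check-and-fix, assuming the log path from the post (the host name placeholder is not filled in) and root privileges for the ownership change:

```python
import os
import shutil
import stat
from typing import Optional

def fix_log_perms(path: str, owner: Optional[str] = "yarn",
                  group: Optional[str] = "hadoop") -> None:
    """Ensure the file is mode 644 (-rw-r--r--); optionally fix ownership."""
    mode = stat.S_IMODE(os.stat(path).st_mode)
    if mode != 0o644:
        os.chmod(path, 0o644)
    if owner or group:
        shutil.chown(path, user=owner, group=group)  # requires root

# Hypothetical usage; substitute your node's host name and run as root:
# fix_log_perms("/var/log/hadoop-yarn/yarn/hadoop-yarn-nodemanager-<host_name>.org.out")
```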
10-04-2019
05:01 AM
It works only if it's the same KDC; in my case I need a cross-realm trust. Thank you. P.S. I didn't get the notification either. Regards
09-29-2019
09:09 PM
Hi Shelton, I'm facing the issue in Hive:

2019-09-27 15:41:46,351 - Retrying after 10 seconds. Reason: Execution of '/usr/jdk64/jdk1.8.0_112/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/hdp/current/hive-server2/lib/postgresql-42.2.8.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:postgresql://inairsr542007v3.ntil.com:5432/hive' hive [PROTECTED] org.postgresql.Driver' returned 1.
ERROR: Unable to connect to the DB. Please check DB connection properties.
org.postgresql.util.PSQLException: Connection to inairsr542007v3.ntil.com:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
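A connection refusal like this usually means PostgreSQL is down, bound to a different interface, or firewalled. A small Python sketch (host and port taken from the error message above) to verify TCP reachability before retrying the Ambari check:

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Host and port from the error message above:
# print(port_open("inairsr542007v3.ntil.com", 5432))
```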
09-27-2019
03:03 AM
Take a MySQL backup before doing this operation. Then go to MySQL:

use <DB>;
select * from TBLS where tbl_name = '<table_name>';

Make a note of the TBL_ID from the output, then delete the entries from TABLE_PARAMS, TBL_COL_PRIVS, and TBL_PRIVS using a WHERE clause on that TBL_ID. After that is done, run the command below:

delete from TBLS where tbl_name = '<table_name>';

Now open the hive shell and check that the table is gone.
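The delete order described above (dependent rows first, then TBLS) can be sketched against an in-memory SQLite stand-in. The table and column names come from the post; the table name 'my_table' and the stand-in schemas are hypothetical, and the real metastore is MySQL, so treat this only as an illustration of the order of operations:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Minimal stand-ins for the metastore tables named in the post.
cur.execute("CREATE TABLE TBLS (TBL_ID INTEGER, TBL_NAME TEXT)")
cur.execute("CREATE TABLE TABLE_PARAMS (TBL_ID INTEGER, PARAM_KEY TEXT)")
cur.execute("CREATE TABLE TBL_COL_PRIVS (TBL_ID INTEGER)")
cur.execute("CREATE TABLE TBL_PRIVS (TBL_ID INTEGER)")
cur.execute("INSERT INTO TBLS VALUES (42, 'my_table')")
cur.execute("INSERT INTO TABLE_PARAMS VALUES (42, 'EXTERNAL')")
cur.execute("INSERT INTO TBL_COL_PRIVS VALUES (42)")
cur.execute("INSERT INTO TBL_PRIVS VALUES (42)")

# 1. Note the TBL_ID for the table to be dropped.
tbl_id = cur.execute("SELECT TBL_ID FROM TBLS WHERE TBL_NAME = ?",
                     ("my_table",)).fetchone()[0]

# 2. Delete dependent rows first, then the TBLS row itself.
for dep in ("TABLE_PARAMS", "TBL_COL_PRIVS", "TBL_PRIVS"):
    cur.execute(f"DELETE FROM {dep} WHERE TBL_ID = ?", (tbl_id,))
cur.execute("DELETE FROM TBLS WHERE TBL_ID = ?", (tbl_id,))
conn.commit()
```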
09-25-2019
12:14 PM
@Manoj690 Go to Ambari > Hive > CONFIGS > ADVANCED > Custom hive-site and add hive.users.in.admin.role with a comma-separated list of the users who require admin role authorization (such as the user hive). Restart the Hive services for the change to take effect. The permission-denied error should be fixed after adding hive.users.in.admin.role=hive and restarting Hive, because properties listed in hive.conf.restricted.list cannot be reset with hiveconf. Please do that and revert.
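For reference, a sketch of how that property would appear in the generated hive-site.xml (the value hive is the example user from the post; adjust the list for your environment):

```xml
<property>
  <name>hive.users.in.admin.role</name>
  <value>hive</value>
</property>
```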
09-25-2019
10:59 AM
1 Kudo
@parthk You can definitely use Sentry for RBAC-style access control in Impala; you don't strictly need Kerberos, but it's highly advised. Why? Historically, Sentry has been the weakest link in Cloudera's security architecture, which is the reason it was dropped in favor of Ranger in the upcoming CDP offering.

Having said that, Sentry's role-based access control (RBAC) is an approach to restricting system access to authorized users, whereas Kerberos with keytabs is like a biometric passport: the password is known only to the keytab and principal, which allows a process (a client) running on behalf of a principal (a user) to prove its identity to a verifier (an application server, or just server) without sending data across the network that might allow an attacker or the verifier to subsequently impersonate the principal. Kerberos optionally provides integrity and confidentiality for data sent between the client and the server.

You can safely build your cluster without Kerberos for self-study and development, but not for production. There are two types of Kerberos setup: MIT and AD. Active Directory is a directory services implementation that provides authentication, group and user management, policy administration, and more in a centralized manner. LDAP (Lightweight Directory Access Protocol) is an open, cross-platform protocol used for directory services authentication, hence the pointer in the Cloudera documentation to use LDAP/LDAPS.

HTH. Happy hadooping!
09-25-2019
10:29 AM
@elmismo999 Sqoop uses MapReduce, so first make sure it and YARN are running. Second, validate that the database and table exist with the steps below:

# mysql -u root -p[root_password]
mysql> show databases;

If the sqoop database exists, then run:

mysql> use sqoop;
mysql> show tables;

This MUST show the table result; if it doesn't, your import cannot work. Also, in your command I don't see the MySQL port (default 3306) or the root password placeholder (-P, or simply -p[root_password]):

# sqoop import --connect jdbc:mysql://127.0.0.1:3306/sqoop --username root -P --table result --target-dir /user/results10/

Can you confirm the above and revert?
09-24-2019
07:50 PM
OK, tested and successful.