Member since: 10-04-2016
Posts: 69
Kudos Received: 6
Solutions: 5

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 5145 | 03-23-2017 08:41 AM
 | 2545 | 01-26-2017 07:22 PM
 | 1781 | 12-23-2016 12:07 PM
 | 6261 | 12-21-2016 01:54 PM
 | 1505 | 12-05-2016 06:37 AM
01-15-2024
04:43 PM
Where are the Spark driver and executor logs located when a Spark job is executed in YARN cluster mode? Does the yarn logs -applicationId command capture them? I can't find log4j-active.log for the driver.
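For context, this is the sort of command I mean, as a sketch only (the application ID is a placeholder and YARN log aggregation is assumed to be enabled):

# placeholder application ID; substitute the one reported by spark-submit or the RM UI
$ yarn logs -applicationId application_1234567890123_0001 > app_logs.txt
# in cluster mode the driver runs inside the ApplicationMaster container, so its
# stdout/stderr should be somewhere in this aggregated output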
Labels:
- Apache Spark
03-30-2017
09:41 AM
Switched to JDK 1.7 and got the same issue. It seems that the JDK can't pick up the ticket from the cache.

$ export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true $HADOOP_OPTS"
$ export HADOOP_ROOT_LOGGER=TRACE,console
$ export HADOOP_JAAS_DEBUG=true
$ hdfs dfs -ls 2> /tmp/hdfsls.txt

Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
[UnixLoginModule]: succeeded importing info:
uid = 1000
gid = 1000
supp gid = 4
supp gid = 10
supp gid = 190
supp gid = 1000
Debug is true storeKey false useTicketCache true useKeyTab false doNotPrompt true ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is false principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
Acquire TGT from Cache
>>>KinitOptions cache name is /tmp/krb5cc_1000
Principal is null
null credentials from Ticket Cache
[Krb5LoginModule] authentication failed
Unable to obtain Princpal Name for authentication
[UnixLoginModule]: added UnixPrincipal, UnixNumericUserPrincipal, UnixNumericGroupPrincipal(s), to Subject
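Based on that debug output, the next thing I want to check (a sketch, not verified) is whether a ticket actually exists at the file path the JDK is reading, since klist reports a KEYRING cache instead:

# the JDK debug above reads /tmp/krb5cc_1000; see if a file cache exists there
$ ls -l /tmp/krb5cc_1000
# obtain a TGT into that file cache (principal taken from my klist output) and retry
$ kinit -c FILE:/tmp/krb5cc_1000 wzhu@AWS
$ KRB5CCNAME=FILE:/tmp/krb5cc_1000 hdfs dfs -ls /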
03-30-2017
06:57 AM
It seems that the JDK was not able to load the Kerberos ticket from the cache.

$ export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true $HADOOP_OPTS"
$ hdfs dfs -ls / 2> /tmp/hdfsls.txt

Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>>KinitOptions cache name is /tmp/krb5cc_1000

I should see extra KRB debug output, but found none.
03-30-2017
06:10 AM
P.S. I am on MIT Kerberos version 1.14.
03-29-2017
09:34 PM
I am using the CM API to install a CDH cluster on AWS with an MIT KDC and JDK 1.8u121. From the CM UI, Kerberos is working fine. I checked the CM Kerberos encryption types and they match those defined in kdc.conf.

$ sudo cat /var/kerberos/krb5kdc/kdc.conf
[kdcdefaults]
kdc_ports = 88
kdc_tcp_ports = 88
[realms]
AWS = {
#master_key_type = aes256-cts
acl_file = /var/kerberos/krb5kdc/kadm5.acl
dict_file = /usr/share/dict/words
admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab
supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal camellia256-cts:normal camellia128-cts:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
}

$ klist -ef
Ticket cache: KEYRING:persistent:1000:1000
Default principal: wzhu@AWS
Valid starting Expires Service principal
03/30/2017 00:20:37 03/31/2017 00:20:37 krbtgt/AWS@AWS
Flags: FI, Etype (skey, tkt): aes256-cts-hmac-sha1-96, aes256-cts-hmac-sha1-96

$ hdfs dfs -ls /
...
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "ip-10-1-30-107.us-west-1.compute.internal/10.1.30.107"; destination host is: "ip-10-1-30-107.us-west-1.compute.internal":8020;
...skipping...
at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:375)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:730)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:726)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:725)
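To narrow this down, I am comparing what klist sees with what a file-based default cache would contain (a sketch; the uid matches the cache name above):

# klist reports a KEYRING:persistent cache, but the Hadoop client may expect a file cache
$ klist
$ KRB5CCNAME=FILE:/tmp/krb5cc_$(id -u) klist
# check which default cache type the libraries are configured to use
$ grep default_ccache_name /etc/krb5.conf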
Labels:
- Cloudera Manager
- Kerberos
- Security
03-27-2017
02:39 PM
Thanks. For cloud deployments, we definitely want SSL/TLS for the backend DBs.
03-23-2017
08:45 AM
I can't find any instructions on this. Is there any way to enable SSL/TLS between CM services (CM, Hive, Navigator, etc.) and the backend DBs such as MySQL, PostgreSQL, and Oracle?
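To make the question concrete, here is the shape of what I am hoping for on the CM server side (a sketch only; useSSL/requireSSL are standard MySQL Connector/J URL parameters, but whether and where CM lets me set such a URL is exactly what I am asking):

# inspect the DB connection settings the installer writes for the CM server
$ sudo cat /etc/cloudera-scm-server/db.properties
# hypothetical target shape of the JDBC connection:
#   jdbc:mysql://mydb.example.com:3306/scm?useSSL=true&requireSSL=true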
Labels:
- Cloudera Manager
03-23-2017
08:41 AM
1 Kudo
Resolved by using `%`.* in the grant statement, which excludes the mysql system database. AWS RDS will not let us touch that database in its PaaS offering.
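For reference, a minimal sketch of the statement that worked, the same grant as in my earlier post below, wrapped in a mysql -e call (endpoint, user, and password are placeholders):

# `%`.* instead of *.* keeps the mysql system database out of the grant, which RDS requires
$ mysql -h "$myDB_endpoint" -u"$masterUser" -p -e \
    "grant all on \`%\`.* to 'temp'@'%' identified by 'temp' with grant option;"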
03-21-2017
10:49 AM
Since an AWS RDS MySQL instance doesn't grant the initial DB user real 'root' privileges, I need specific instructions for AWS RDS setup. I can't run the following command to create a temp user (https://www.cloudera.com/documentation/enterprise/latest/topics/cm_ig_installing_configuring_dbs.html#cmig_topic_5_1):

mysql> grant all on *.* to 'temp'@'%' identified by 'temp' with grant option;

I had to modify it as follows:

mysql> grant all on `%`.* to 'temp'@'%' identified by 'temp' with grant option;

[ec2-user@ip-x ~]$ sudo /usr/share/cmf/schema/scm_prepare_database.sh mysql -h $myDB-endpoints -utemp -ptemp --scm-host ip-x.us-west-1.compute.internal scm scm scm
JAVA_HOME=/usr/java/jdk1.8.0_121
Verifying that we can write to /etc/cloudera-scm-server
Creating SCM configuration file in /etc/cloudera-scm-server
Executing: /usr/java/jdk1.8.0_121/bin/java -cp /usr/share/java/mysql-connector-java.jar:/usr/share/java/oracle-connector-java.jar:/usr/share/cmf/schema/../lib/* com.cloudera.enterprise.dbutil.DbCommandExecutor /etc/cloudera-scm-server/db.properties com.cloudera.cmf.db.
[ main] DbCommandExecutor INFO Unable to login using supplied username/password.
[ main] DbCommandExecutor ERROR Error when connecting to database.
java.sql.SQLException: Access denied for user 'scm'@'myIP' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)[mysql-connector-java.jar:5.1.41]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3973)[mysql-connector-java.jar:5.1.41]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3909)[mysql-connector-java.jar:5.1.41]
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)[mysql-connector-java.jar:5.1.41]
at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1710)[mysql-connector-java.jar:5.1.41]
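A quick check I would add here, as a sketch (the endpoint placeholder mirrors the $myDB-endpoints value above): verify independently of the script whether the scm account can log in to RDS from this host at all.

# if this also fails with access denied, the problem is the account/grant, not the script
$ mysql -h "$myDB_endpoint" -uscm -pscm scm -e "select current_user();"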
Labels:
- Cloudera Manager
02-21-2017
08:40 AM
Thanks. With data-at-rest encryption, we have to add hdfs user access to all encryption zones. Is this the default behavior?
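To make this concrete, this is roughly how I am checking it (a sketch, not an answer; the kms-acls.xml path and the blacklist property name are assumptions and vary by deployment):

# list the existing encryption zones (requires superuser)
$ hdfs crypto -listZones
# check whether the hdfs user is blacklisted from decrypting EEKs in the KMS ACLs
$ grep -B1 -A2 'blacklist.DECRYPT_EEK' /etc/hadoop-kms/conf/kms-acls.xml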