Member since: 04-30-2017
Posts: 31
Kudos Received: 1
Solutions: 0
12-11-2017
08:33 PM
Greetings Gerdan, use memberOf or the group identity of the user where? I need more detailed information. Thanks,
12-08-2017
05:28 PM
In our environment, when LDAP Group Mapping is set to TRUE, usernames are translated into GIDs, and Ranger doesn't recognize GIDs. Any idea how to set attributes in sssd.conf so that usernames show as usernames instead of GIDs when LDAP Group Mapping is set to true?
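For reference, a minimal sssd.conf sketch of the attribute mappings that are commonly adjusted so lookups return names rather than synthesized numeric IDs; the domain name and LDAP attributes below are assumptions and have to match your directory schema, not a confirmed fix for this thread:

# /etc/sssd/sssd.conf (domain name "example.com" is a placeholder)
[domain/example.com]
id_provider = ldap
ldap_id_mapping = False          # use uidNumber/gidNumber stored in LDAP instead of generated IDs
ldap_user_name = sAMAccountName  # resolve users to their account names
ldap_group_name = sAMAccountName # resolve groups to their AD names
ldap_group_object_class = group
enumerate = False

# after editing, clear the cache and restart:
#   sss_cache -E && systemctl restart sssd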
Labels:
- Apache Ranger
11-10-2017
07:17 PM
We need an answer on how to move a few Hive tables from one cluster to another. Source version: HDP 2.5.3; destination version: HDP 2.6.2. Also, the backend databases are different: source backend database = MySQL, destination backend database = enterprise Oracle. Thanks,
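One commonly used route (a hedged sketch, not an official procedure; database, table, and NameNode names are placeholders) is Hive EXPORT/IMPORT plus DistCp. The metastore backend difference (MySQL vs. Oracle) shouldn't matter here because IMPORT recreates the metadata on the destination:

# on the source cluster (HDP 2.5.3): write a self-contained export of the table
hive -e "EXPORT TABLE mydb.mytable TO '/tmp/export/mytable';"

# copy the exported files to the destination cluster
hadoop distcp hdfs://source-nn:8020/tmp/export/mytable hdfs://dest-nn:8020/tmp/export/mytable

# on the destination cluster (HDP 2.6.2): import, which repopulates the Oracle metastore
hive -e "IMPORT TABLE mydb.mytable FROM '/tmp/export/mytable';"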
Labels:
- Apache Hive
10-12-2017
06:44 PM
Got it. For some reason Ambari created a Kerberos service check keytab with a date in the name. I renamed this on all hosts and started from scratch with kerberization, and it worked this time using the same cluster name. Thanks,
10-12-2017
12:45 PM
Hi Robert, Thanks for the quick and detailed response. In our initial planning, we didn't know that the host names would change and didn't disable Kerberos before wiping the cluster. So what issues will this cause? Thanks,
10-11-2017
11:24 PM
We rebuilt a Kerberized cluster with new hostnames. Should we give it a new cluster name, or can we use the same cluster name that is already in Active Directory? What's the cleanest way to do this without problems in Active Directory? The cluster will be in the same hadoop OU.
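If it helps, a hedged sketch (OU and domain components are placeholders, and this is only one approach, not the thread's answer) for listing and then removing the stale accounts the old cluster left in the hadoop OU before re-running the Kerberos wizard:

C:\>dsquery user "OU=hadoop,DC=example,DC=com" -limit 0

:: once you have confirmed nothing else uses those accounts
C:\>dsquery user "OU=hadoop,DC=example,DC=com" -limit 0 | dsrm -noprompt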
08-25-2017
04:55 PM
We are installing HDP 2.6.1 with RHEL 7.4. The existing backend database we used for Ambari, Ranger, Oozie, and Hive is MySQL version 5.1.73-3.el6_5, while RHEL 7.4 now ships MariaDB 5.5.56-2. Should we install the same version of MySQL, since we will continue to use the MySQL databases? Thanks,
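Not a recommendation from the thread, just a hedged sketch of how one would check what RHEL 7.4 pulled in and install MySQL from the MySQL community repo instead, if staying on MySQL (the repo RPM name/version below is an assumption):

# see what the OS already installed
rpm -qa | grep -iE 'mariadb|mysql'

# optionally remove the bundled MariaDB libs and add the MySQL community repo
yum remove -y mariadb-libs
yum localinstall -y mysql57-community-release-el7-11.noarch.rpm   # repo RPM version is an assumption
yum install -y mysql-community-server
systemctl enable --now mysqld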
08-17-2017
10:18 PM
I would expect the answer to this question to be yes, but I can't be absolutely sure. Does the minor release of an OS matter as far as HDP 2.6.1 deployment is concerned? Are RHEL 7.3 and RHEL 7.4 both supported, or only RHEL 7.3?
Tags:
- rhel
Labels:
- Hortonworks Data Platform (HDP)
08-11-2017
09:42 PM
I purposely dropped a table after performing a backup and was able to successfully restore the database in Hive. One thing I noticed is that the file was not recovered in HDFS under /apps/hive/warehouse, despite the location showing when I run DESCRIBE FORMATTED against the table. Is this behavior expected?
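That matches how a metastore-only restore behaves: the MySQL dump brings back the metadata, not the warehouse files. A hedged sketch of where one would look for the data itself (database/table paths are placeholders), assuming Trash or HDFS snapshots were enabled:

# check whether the dropped table's files landed in the hive user's Trash
hdfs dfs -ls /user/hive/.Trash/Current/apps/hive/warehouse/mydb.db/mytable

# or, if the warehouse directory is snapshottable, copy the table back from a snapshot
hdfs dfs -ls /apps/hive/warehouse/.snapshot
hdfs dfs -cp /apps/hive/warehouse/.snapshot/snap1/mydb.db/mytable /apps/hive/warehouse/mydb.db/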
08-10-2017
04:50 PM
Hi Geoffrey, I need information on how to get the backup subset from the MySQL metastore synced with the Hive warehouse data in HDFS (/apps/hive/warehouse).
08-08-2017
08:43 PM
MySQL is the metastore and the backup method.
08-08-2017
08:13 PM
What are the detailed steps involved in restoring a subset of Hive data, not the entire database?
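One hedged sketch of a per-table approach (database, table, and backup paths are placeholders, not values from this thread): pair a metastore dump with a copy of just that table's warehouse directory, so a single table can be put back without touching the rest:

# back up the metastore alongside a per-table copy of the warehouse data
mysqldump --single-transaction hive > hive_metastore_$(date +%F).sql
hdfs dfs -cp /apps/hive/warehouse/mydb.db/mytable /backups/hive/mydb.db/mytable

# to restore only that table later: recreate it (or restore the matching metastore rows),
# then copy the files back into the warehouse location
hdfs dfs -cp /backups/hive/mydb.db/mytable /apps/hive/warehouse/mydb.db/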
Labels:
- Apache Hive
07-19-2017
04:51 PM
Hi Kuldeep. I actually did a restore from the backup and was able to preserve the cluster this way. I will keep the information you sent me in case I have a similar problem moving forward. Thanks,
07-12-2017
03:49 PM
Can someone share detailed information on the best practice for performing a major OS upgrade (RHEL 6 -> RHEL 7) on a Kerberized HDP cluster (Ambari version 2.4.2.3, HDP version 2.5.3)? The system team will not be doing an in-place upgrade, and it sounds like the hosts will be wiped or moved to different hardware. The HDP components requiring databases, such as Ambari, Hive, and Oozie, are leveraging MySQL, not the default PostgreSQL database.
Thanks a million!
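Whatever upgrade path the thread settles on, a hedged sketch of the pieces that are usually backed up before hosts are wiped (database names below are defaults and may differ in your environment):

# dump the MySQL databases backing Ambari, Hive, Oozie, Ranger
for db in ambari hive oozie ranger; do
    mysqldump --single-transaction "$db" > "${db}_$(date +%F).sql"
done

# save NameNode metadata before the rebuild
hdfs dfsadmin -safemode enter
hdfs dfsadmin -saveNamespace
hdfs dfsadmin -fetchImage /backups/nn/
hdfs dfsadmin -safemode leave

# keep copies of /etc/*/conf and the Kerberos keytabs from each host as well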
07-11-2017
05:04 PM
Hi Olivier, Is this approach considered an in-place upgrade of the OS? We need to upgrade from RHEL 6 to RHEL 7, and our system team doesn't use any configuration management tools to do an in-place upgrade. It sounds like the systems/hosts in the cluster will be wiped to do the OS upgrade. Do you have any information on how to do this and preserve the Hadoop data disks? We also need to upgrade from HDP 2.5.3 to 2.6.0, and our cluster is Kerberized. What's the best approach for us to take in upgrading the OS and HDP simultaneously?
07-07-2017
05:17 PM
I will try this next week and let you know the results. Thanks, Debra,
07-07-2017
04:37 PM
Would someone please send me information on how to set up a non-Kerberized HDF cluster to communicate with a Kerberized HDP cluster? I know how to set this up on a standalone instance, but the procedures haven't proved successful on an HDF cluster.
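For what it's worth, a hedged sketch of the pieces that usually matter for this setup (paths, principal, and realm are placeholders): the HDF hosts don't have to be Kerberized themselves, but the NiFi processors that talk to HDP authenticate with a keytab against the HDP realm.

# nifi.properties on each HDF node: point NiFi at the HDP realm's krb5.conf
nifi.kerberos.krb5.file=/etc/krb5.conf

# copy the HDP cluster's client configs next to NiFi and reference them in the
# HDFS processors (PutHDFS, ListHDFS, ...):
#   Hadoop Configuration Resources = /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
#   Kerberos Principal             = nifi@EXAMPLE.COM
#   Kerberos Keytab                = /etc/security/keytabs/nifi.headless.keytab

# sanity-check the keytab from an HDF node before wiring up the processors
kinit -kt /etc/security/keytabs/nifi.headless.keytab nifi@EXAMPLE.COM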
Labels:
- Cloudera DataFlow (CDF)
07-04-2017
04:50 AM
How do I set up a non-Kerberized HDF NiFi cluster to communicate with and/or connect to a Kerberized HDP cluster?
Labels:
- Cloudera DataFlow (CDF)
05-22-2017
01:28 AM
These are the errors in the Thrift server logs:
Caused by: org.apache.thrift.transport.TTransportException: Invalid status 80
at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184)
at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
... 5 more
05-22-2017
01:19 AM
Hi Edgar, I'm having the same problem you had previously, but even after entering HTTP/_HOST@myrealm it's still not working in a Kerberos environment. Below are my settings:
hbase.thrift.support.proxyuser=true
hbase.thrift.security.qop=auth
hbase.thrift.keytab.file=/etc/security/keytabs/hbase.service.key
hbase.thrift.kerberos.principal=HTTP/_HOST@myrealm
hbase.regionserver.thrift.http=true
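Not a fix, just a hedged sanity check (keytab path copied from the settings above, realm is a placeholder) to confirm the keytab actually contains the HTTP principal the Thrift server is configured to use and that the SPN resolves for this host's FQDN:

# list the principals stored in the keytab referenced by hbase.thrift.keytab.file
klist -kt /etc/security/keytabs/hbase.service.key

# confirm the HTTP service principal is known to the KDC for this host
kvno HTTP/$(hostname -f)@MYREALM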
05-16-2017
06:08 PM
I forgot to state that I have the hue user set up to impersonate/proxy in the core-site file as well.
05-16-2017
06:07 PM
Hi - We have a Kerberized HDP 2.5.3 cluster and I have followed your instructions to the T. While I have no problems with Hive, Job Browser, and File Browser in Hue, I continue to get this error when trying to access HBase tables in Hue: Api Error: Could not start SASL: Error in sasl_client_start (-1) SASL(-1): generic failure: GSSAPI Error: Unspecified GSS failure. Minor code may provide more information (Server not found in Kerberos database)
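"Server not found in Kerberos database" usually points at a hostname/SPN mismatch. A hedged sketch of the hue.ini [hbase] block (host, port, and cluster label are placeholders) that makes Hue address the Thrift server by the same FQDN its HTTP/ principal was created for:

[hbase]
  # use the Thrift server's FQDN exactly as it appears in its Kerberos principal
  hbase_clusters=(Cluster|thriftserver.fqdn.example.com:9090)
  hbase_conf_dir=/etc/hbase/conf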
05-16-2017
05:56 PM
Hi Geoffrey, Is this the same for creating headless keytabs/principals? We are able to create keytabs with host attributes; the issue is using the same service name to create a headless account. Does the article you pointed to address this? Thanks,
05-16-2017
01:31 PM
Hi Umair, Our AD team created a headless keytab without the HOST attribute; the keytab with the same service account name with the HOST attribute broke, and the headless keytab doesn't work. What is the appropriate syntax for creating headless keytabs in AD? We created it as follows:
C:\Users\adminname>ktpass /princ serviceaccountname@domain.com /pass securepassword /mapuser serviceaccountname /pType KRB5_NT_PRINCIPAL /out serviceaccountname_headless.keytab
Targeting domain controller: hostname.domain.com
Failed to set property 'servicePrincipalName' to 'serviceaccountname' on Dn 'CN=serviceaccountname,OU=Hadoop,OU=Secure,OU=Secure,OU=Secure,DC=domain,DC=com': 0x13.
WARNING: Unable to set SPN mapping data.
If serviceaccountname already has an SPN mapping installed for serviceaccountname, this is no cause for concern.
Password successfully set!
Key created.
Output keytab to serviceaccountname_headless.keytab:
Keytab version: 0x502
keysize 57 serviceaccountname@domain.com ptype 1 (KRB5_NT_PRINCIPAL) vno 5 etype 0x17 (RC4-HMAC) keylength 16 (A000000000000000000)
This is the error received when kiniting the headless keytab:
Keytab contains no suitable keys for serviceaccountname@domain.com while getting initial credentials.
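A hedged sketch of the ktpass variant often used for headless accounts (realm case, /crypto choice, and account names below are assumptions, not your actual values). "Keytab contains no suitable keys" frequently comes from a realm-case or encryption-type mismatch between the keytab and what kinit requests, so kinit with the principal exactly as klist -kt shows it:

C:\Users\adminname>ktpass /princ serviceaccountname@DOMAIN.COM /mapuser serviceaccountname@domain.com /pass securepassword /ptype KRB5_NT_PRINCIPAL /crypto AES256-SHA1 /out serviceaccountname_headless.keytab

klist -kt serviceaccountname_headless.keytab
kinit -kt serviceaccountname_headless.keytab serviceaccountname@DOMAIN.COM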
05-04-2017
05:32 PM
In our environment we are not able to use keytabs with the same principal name on different servers. For example, nifi-1-service-keytab with a principal name of nifi can only be used on server 1, even if the keytab is renamed to nifi-2-service-keytab. nifi-2-service-keytab cannot be used on a different server with the same principal name nifi in AD.
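A hedged sketch of the pattern that is often used instead (host names, accounts, and realm are placeholders): one AD account per host, each mapped to a host-qualified principal, so each server gets its own keytab rather than reusing a single nifi principal across hosts:

C:\>ktpass /princ nifi/host1.example.com@EXAMPLE.COM /mapuser nifi-host1@example.com /pass securepassword /ptype KRB5_NT_PRINCIPAL /crypto AES256-SHA1 /out nifi-host1.service.keytab

C:\>ktpass /princ nifi/host2.example.com@EXAMPLE.COM /mapuser nifi-host2@example.com /pass securepassword /ptype KRB5_NT_PRINCIPAL /crypto AES256-SHA1 /out nifi-host2.service.keytab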
Labels:
- Hortonworks Data Platform (HDP)