Member since: 09-02-2016
Posts: 523
Kudos Received: 89
Solutions: 42
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2415 | 08-28-2018 02:00 AM
 | 2297 | 07-31-2018 06:55 AM
 | 5228 | 07-26-2018 03:02 AM
 | 2591 | 07-19-2018 02:30 AM
 | 6052 | 05-21-2018 03:42 AM
01-30-2017
12:12 PM
@csguna It is the authorized_keys file, which has nothing to do with HDFS here, so the ownership should be user:linux-group (instead of the hdfs group).
01-27-2017
01:25 PM
1 Kudo
@csguna Run the command "ssh-keygen" on the master node; it will generate two files. Copy "id_rsa.pub" to the child node, append its contents to ~/.ssh/authorized_keys there, and change that file's permission to 600. This is required for passwordless (key-based) login. You can refer to this link with the above understanding: https://www.digitalocean.com/community/tutorials/how-to-set-up-ssh-keys--2
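The steps above can be sketched as shell commands; this is only a local illustration (the demo directory and key file names are made up, and ./child_home stands in for the child node's home directory):

```shell
# Generate a key pair on the master node (no passphrase, purely for the demo):
ssh-keygen -t rsa -b 2048 -f ./id_rsa_demo -N "" -q

# On the child node: append the PUBLIC key to ~/.ssh/authorized_keys
# (simulated here with a local directory) and lock down the permissions.
mkdir -p ./child_home/.ssh
cat ./id_rsa_demo.pub >> ./child_home/.ssh/authorized_keys
chmod 700 ./child_home/.ssh
chmod 600 ./child_home/.ssh/authorized_keys

# The master node can then log in without a password:
#   ssh -i ./id_rsa_demo user@child-node
```

Note the private key (id_rsa) stays on the master node; only the .pub file travels to the child node.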
01-26-2017
07:56 AM
@majeedk Let me put it this way:
1. Do you have access to your Linux box by any chance? If so, please log in and authenticate with your keytab to make sure the keytab itself is OK.
2. If you don't have access, ask your admin (someone who does) to authenticate with your keytab and confirm there is no issue with it.
3. In the meantime, note that the error doesn't show anything about Kerberos (it will usually show a 'krb' issue if you access from the Linux box; I never tried from a Windows machine, so I'm not sure). So you should verify everything other than Kerberos: make sure the port is open, check firewall rules, and so on.
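The keytab check in steps 1–2 can be scripted; a minimal sketch (the keytab path and principal are placeholders, and the klist/kinit lines are shown as comments because they need a reachable KDC):

```shell
# Placeholder keytab path -- substitute your own.
KEYTAB="$HOME/example.keytab"

# With a reachable KDC you would verify the keytab like this:
#   klist -k -t "$KEYTAB"              # list the principals stored in it
#   kinit -k -t "$KEYTAB" user@REALM   # authenticate non-interactively
#   klist                              # confirm a ticket was granted

# A scriptable pre-check that needs no KDC at all:
if [ -r "$KEYTAB" ]; then
  echo "keytab readable"
else
  echo "keytab missing or unreadable"
fi
```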
01-26-2017
06:51 AM
@majeedk Status : Failure -Test failed: [Cloudera][HiveJDBCDriver](500164) Error initialized or created transport for authentication: [Cloudera][HiveJDBCDriver](500169) Unable to connect to server: GSS initiate failed. Is the error message you showed above the partial or the full error? If it is partial, then please check whether your error message contains any keyword like 'krb'. For example:
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
If you don't find any krb-related error, then please do not focus on Kerberos; instead focus on other points, such as whether the port has been opened between the environments, and so on.
01-25-2017
07:30 PM
@Fawze I think you didn't get my point yet. You don't need to provide a user name for Impala; instead you can secure your tables in two steps: 1. Network authentication using Kerberos. 2. Query/table-level authorization using Apache Sentry. So when you run the query via JDBC, you need to pass the user, password and keytab.
01-25-2017
06:48 PM
@Fawze The answer to your 2nd question first: you have to implement Apache Sentry to restrict user-specific query access on Impala. Work with your Hadoop admin to set it up on your cluster. Now for the first (generic) question: obviously you have to pass the parameters from the source. Apache Sentry depends on another security tool called Kerberos, which has a concept called a keytab. So when you pass the user and password from the source, you also have to pass the keytab to authenticate on the network. You can achieve your first requirement with this step. I've shared some high-level security information in the link below; hope this gives you some idea: https://community.cloudera.com/t5/Security-Apache-Sentry/Hadoop-Security-for-beginners/m-p/49876#M247 Thanks Kumar
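On the client side, the keytab-then-connect flow looks roughly like the sketch below. The host names, realm, principal and keytab path are placeholders; AuthMech=1 is the Cloudera Impala JDBC driver's Kerberos authentication mode:

```shell
# Placeholder principal and keytab path -- substitute your own.
PRINCIPAL="appuser@EXAMPLE.COM"
KEYTAB="/home/appuser/.auth/appuser.keytab"

# 1. Obtain a Kerberos ticket non-interactively from the keytab:
#      kinit -k -t "$KEYTAB" "$PRINCIPAL"
# 2. Connect with a Kerberos-enabled JDBC URL; the driver picks up
#    the ticket from the credential cache created by kinit.
JDBC_URL="jdbc:impala://impalad.example.com:21050/default;AuthMech=1;KrbRealm=EXAMPLE.COM;KrbHostFQDN=impalad.example.com;KrbServiceName=impala"
echo "$JDBC_URL"
```

With this in place, Sentry then decides, per authenticated principal, which databases and tables the query may touch.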
01-25-2017
01:46 PM
1 Kudo
@MasterOfPuppets When you configure Hive, hive-site.xml is populated with the default properties, and you can customize it by adding additional properties. Instead of editing hive-site.xml directly, I would recommend updating it via CM. You can follow these steps: CM -> Hive -> Configuration -> Advanced Category -> search for "snippet" -> it will show you the option to add additional properties. Press the + button, set Name = "hive.prewarm.enabled" and Value = "true" to enable it. Press the ? button to learn more about each property. Thanks Kumar
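For reference, the entry that the safety-valve snippet produces in hive-site.xml looks like this:

```xml
<property>
  <name>hive.prewarm.enabled</name>
  <value>true</value>
</property>
```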
01-24-2017
01:29 PM
1 Kudo
@HarishS Any DML changes will be reflected automatically, but you need to run the replace command below to reflect DDL changes in a view: CREATE OR REPLACE VIEW view_name AS query;
01-20-2017
09:22 AM
@majeedk I am using Linux and have never tried from Windows, but I still hope this gives some insight:
1. Create a keytab (for the source login) in the environment where you have Kerberos installed, and keep the keytab file somewhere like /home/user/.auth/example.keytab (change the path to the Windows equivalent).
2. Create a shell script to call kinit (change the shell and the kinit command to the Windows equivalent): kinit user@REALM -k -t /home/user/.auth/example.keytab
3. Create a cron job (or any other scheduler that suits Windows) to call the above script at a regular interval. By default a Kerberos ticket expires in a week, so you need a job that re-runs kinit periodically. Thanks Kumar
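Steps 2 and 3 above can be sketched on Linux as follows; the script name and cron schedule are illustrative, and the principal/keytab path are the post's own examples:

```shell
# Write a wrapper script that re-obtains the ticket from the keytab:
cat > ./krb_renew.sh <<'EOF'
#!/bin/sh
kinit user@REALM -k -t /home/user/.auth/example.keytab
EOF
chmod 755 ./krb_renew.sh

# Example cron entry (install with `crontab -e`) to renew daily at 01:00,
# well inside the default one-week ticket lifetime:
echo '0 1 * * * /path/to/krb_renew.sh'
```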
01-19-2017
07:47 AM
@dsss You can find all the Impala built-in functions at the link below; I don't find any option for PIVOT (please double-check): https://www.cloudera.com/documentation/enterprise/5-5-x/topics/impala_functions.html There are many ways to transpose rows to columns (and columns to rows) using plain SQL. I would suggest you follow that approach, or create a UDF (User Defined Function): https://www.cloudera.com/documentation/enterprise/5-8-x/topics/impala_udf.html Below is the JIRA ticket created with Apache to include a PIVOT option in Hive; you can see its status and comments. Some links in the comment section also show how to manually transpose columns to rows and rows to columns: https://issues.apache.org/jira/browse/HIVE-3776 Thanks Kumar
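One common manual transpose is conditional aggregation (SUM over CASE). The sketch below demonstrates the pattern with sqlite3 purely so it is runnable anywhere; the table and column names are made up, and the same SQL shape works in Impala/Hive:

```shell
# Hypothetical sales(region, quarter, amount) table; the CASE expressions
# pivot the distinct quarter values into columns.
sqlite3 :memory: <<'EOF'
CREATE TABLE sales(region TEXT, quarter TEXT, amount REAL);
INSERT INTO sales VALUES ('east','Q1',10),('east','Q2',20),('west','Q1',5);
SELECT region,
       SUM(CASE WHEN quarter='Q1' THEN amount ELSE 0 END) AS q1,
       SUM(CASE WHEN quarter='Q2' THEN amount ELSE 0 END) AS q2
FROM sales
GROUP BY region;
EOF
```

The limitation (and the reason people file tickets like HIVE-3776) is that the pivoted column values must be known and spelled out in advance.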