Member since: 11-05-2018
Posts: 18
Kudos Received: 0
Solutions: 0
10-14-2019
01:21 PM
I ran into the same error. Even though the dependencies are listed in sbt, the jars still have to be shipped explicitly with the --jars option in spark-submit. Why is this needed? Are there any workarounds?
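For reference, this is the shape of the invocation being discussed, sketched as the argument list spark-submit receives; the jar path and script name are placeholders, not from the original thread:

```python
# Hypothetical jar path and script name; substitute your own.
jars = ["/opt/jars/kudu-spark2_2.11-1.10.0.jar"]

# spark-submit does not resolve sbt build dependencies at runtime, so the
# jars must be shipped explicitly with --jars (comma-separated). Setting
# the spark.jars configuration property is an equivalent alternative.
cmd = ["spark-submit", "--jars", ",".join(jars), "my_job.py"]
print(" ".join(cmd))
```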
08-22-2019
01:50 PM
Yes - that WIP links back to KUDU-1603, which I shared earlier. I guess we will have to wait it out. Thanks for your response.
08-15-2019
02:12 PM
Could you please share how KuduContext is created in PySpark? I am aware of KUDU-1603, but I am looking for workarounds, and the Java wrapper detailed in KUDU-1603 does not work as intended.
08-15-2019
10:03 AM
Hi
I have been searching for some time for a command reference/API manual for PySpark-Kudu, without success so far. Does Cloudera have something that can be of help?
Thanks.
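In the absence of a dedicated reference, the usual PySpark entry point is the generic DataFrame reader with the Kudu data source (KuduContext itself is Scala-only, per KUDU-1603). A minimal sketch, assuming the kudu-spark connector jar is on the classpath; the master address and table name are placeholders:

```python
# Sketch only: master address and table name are hypothetical, and the
# kudu-spark connector jar must already be on the Spark classpath.
def read_kudu_table(spark, master="kudu-master:7051",
                    table="impala::default.my_table"):
    """Load a Kudu table as a DataFrame via the DataSource API,
    the PySpark-accessible path (KuduContext is Scala-only)."""
    return (spark.read
            .format("org.apache.kudu.spark.kudu")
            .option("kudu.master", master)
            .option("kudu.table", table)
            .load())
```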
Labels:
- Apache Kudu
12-21-2018
02:44 PM
Hello
We have HDFS encryption at rest enabled in our Kerberized cluster.
I am able to create an encryption zone and write data into it as the admin user (who created the key and the zone). Other users (not in the same LDAP group as that admin user) are not able to access it, even with FACLs set to rwx, because they are not authorized for [DECRYPT_EEK].
Queries
1. When a user creates an encryption zone, other users in the same Unix group get access to [DECRYPT_EEK] by default. Is this true?
2. Usually a user is able to see encrypted data (not the actual data) if he/she has read permissions on it. But this is not the case with HDFS encryption at rest: unless the user is able to decrypt the data, they are not allowed to read it at all. Is that correct?
3. Is there a way to show/display the encrypted data (without decrypting it)? If so, how?
4. Where do GENERATE_EEK and GET_METADATA fit into this concept?
The whole concept of maintaining keys in KMS/KTS and encrypting the data based on them seems to be more about blocking access to the data than about the fact that the data is encrypted.
If someone can please provide some clarity, it would be greatly appreciated. Thank you.
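On question 1: in Hadoop KMS, authorization for DECRYPT_EEK is governed by per-key ACLs in kms-acls.xml (and any whitelist entries), not by Unix group membership on the encryption zone. A minimal sketch; the key name, users, and group names below are hypothetical:

```xml
<!-- Sketch of per-key ACLs in kms-acls.xml; "mykey", the users, and the
     group names are placeholders. Value format is "users groups". -->
<property>
  <name>key.acl.mykey.DECRYPT_EEK</name>
  <value>alice,bob etl_group</value>
</property>
<property>
  <name>key.acl.mykey.GENERATE_EEK</name>
  <value>hdfs supergroup</value>
</property>
```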
12-10-2018
09:46 AM
Thanks Tim. Would you be able to advise an easy way to capture the log messages from Impyla?
12-05-2018
12:13 PM
Also, please try with the impala-shell port (e.g. 21051). It looks like port 25003 on the load balancer is used for external connections through ODBC, and 21051 for impala-shell connections.
12-05-2018
12:11 PM
Hi, I am trying to find documentation on which cursor/connection object methods are available in Impyla. It implements DB API 2.0 (PEP 249), but some members, like rowcount, messages, and errorhandler, are optional or loosely specified when implementing the DB API. Were any of the optional DB API extensions included in Impyla? If not, is there any documentation on what is available? Also, can someone please advise on how to capture the logs when using Impyla?
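Pending an official list, one way to see which optional DB API members a driver actually implements is to introspect its cursor object. The sketch below uses the stdlib sqlite3 driver as a stand-in; the same check can be run against an impyla cursor:

```python
import sqlite3

# PEP 249 members that drivers may or may not fully implement.
OPTIONAL = ["rowcount", "messages", "errorhandler",
            "lastrowid", "callproc", "nextset"]

# sqlite3 stands in here for impyla; substitute an impyla cursor
# (impala.dbapi.connect(...).cursor()) to probe that driver instead.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

available = sorted(a for a in OPTIONAL if hasattr(cur, a))
print(available)
```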
12-05-2018
11:24 AM
Any solution for this problem?