03-20-2019
09:01 AM
Ok, I figured it out. There was a mapping rule that translated my Kerberos principal name to a lower-case short name, i.e. USER1@EXAMPLE.COM becomes user1. I had entered both USER1 and USER1@EXAMPLE.COM as HBase superusers, but not user1. Tricky...
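The translation comes from the cluster's hadoop.security.auth_to_local rules. Just to illustrate the mapping itself, here is the same strip-the-realm-and-lowercase step in plain shell (an illustration only, not Hadoop's actual rule resolver; the principal is a placeholder):

```shell
principal='USER1@EXAMPLE.COM'
# Strip everything from the '@' onward, then lowercase, mimicking an
# auth_to_local rule that uses the /L (lowercase) flag.
short="$(printf '%s' "${principal%%@*}" | tr '[:upper:]' '[:lower:]')"
echo "$short"   # prints: user1
```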
03-20-2019
07:38 AM
Yes, the klist output matches the added username, and everything is in the same realm. I'm just trying to create a table with the HBase shell:

hbase(main):002:0> create 'testtable', 'colfam1'

This results in the error from my first post:

ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=<username>@<realm>, scope=default, params=[namespace=default,table=default:testtable,family=colfam1], action=CREATE)
03-14-2019
04:52 AM
I am using CDH 5.15.0; I did a rolling restart. We are using a centralized AD to authenticate and store the Kerberos principals. We don't have an AD group specifically for HBase, although I'm a member of the admin and ETL groups. Is there somewhere I need to configure HBase with a superuser group? I did add my account as an HBase Superuser, but it didn't resolve the issue.
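For reference, the underlying setting is the hbase.superuser property in hbase-site.xml, and group names are prefixed with @. A sketch only (the user and group names below are placeholders):

```
<property>
  <name>hbase.superuser</name>
  <value>user1,@hbase_admins</value>
</property>
```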
03-13-2019
01:09 PM
I'm trying to set up the ACLs for HBase now that we have enabled secure authentication. I'm using the HBase shell, but I don't have permissions to grant anything: ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=<redacted>, scope=GLOBAL, action=ADMIN) I have added my user name to the "HBase Superuser" setting under configuration and re-deployed the service, but the error persists.
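For context, the grant being attempted looks something like this in the HBase shell (the user name and permission string are placeholders); running it requires global ADMIN rights, which is exactly what the error says is missing:

```
hbase(main):001:0> grant 'some_user', 'RWXCA'
```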
Labels: Apache HBase
02-22-2019
12:15 PM
Is the date stored in Oracle as a DATE (or TIMESTAMP) field? Hive uses the Unix timestamp format, and I don't believe the two are directly compatible. You can try loading it as a string and see if that fixes the error; if so, you'll need to convert it during import if you want a Hive TIMESTAMP field.
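If you import the column as a string, a conversion along these lines can then produce a Hive TIMESTAMP. A sketch only; the table name, column name, and date format are assumptions:

```
-- Assumes the Oracle date arrived as a string like '2019-02-22 12:15:30'
SELECT CAST(from_unixtime(unix_timestamp(order_date_str, 'yyyy-MM-dd HH:mm:ss')) AS TIMESTAMP)
  FROM staging_table;
```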
02-19-2019
10:57 AM
You will need to use both dot notation and join notation: https://www.cloudera.com/documentation/enterprise/5-9-x/topics/impala_complex_types.html

From that page:

"When complex types are nested inside each other, you use a combination of joins, pseudocolumn names, and dot notation to refer to specific fields at the appropriate level. This is the most frequent form of query syntax for complex columns, because the typical use case involves two levels of complex types, such as an ARRAY of STRUCT elements."

SELECT id, phone_numbers.area_code
FROM contact_info_many_structs
INNER JOIN contact_info_many_structs.phone_numbers phone_numbers
LIMIT 3;

"You can express relationships between ARRAY and MAP columns at different levels as joins. You include comparison operators between fields at the top level and within the nested type columns so that Impala can do the appropriate join operation."
02-19-2019
10:41 AM
This error is because the user 'hive' doesn't have write access to the directory containing your data:

hdfs://nameservice1/user/hive/warehouse/ml.db/testk/test_hello.txt: Permission denied: user=hive, access=WRITE, inode="/user/kkr":kkr:kkr:drwxr-xr-x

Note that the inode in the message is the directory /user/kkr, not the file itself. Why does Hive need *write* access to *load* the data? Most likely because LOAD DATA moves the file into the warehouse rather than copying it, so the source directory has to be writable. Granting the 'hive' user write access to /user/kkr should fix the error.
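A possible fix, sketched with standard HDFS commands (path and user taken from the error above; run as a user with HDFS admin rights):

```
# Grant the 'hive' user write access to the source directory via an ACL,
# which is safer than making the directory world-writable:
hdfs dfs -setfacl -m user:hive:rwx /user/kkr

# Blunter alternative, if ACLs are not enabled on the cluster:
hdfs dfs -chmod 777 /user/kkr
```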
02-01-2019
11:55 AM
The Spark documentation notes that "long-running applications may run into issues if their run time exceeds the maximum delegation token lifetime configured in services it needs to access." You should check whether delegation tokens are enabled and whether the maximum token lifetime is set to something less than the time it takes to run your job.
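If that is the cause, the usual remedy is to hand Spark a principal and keytab so it can obtain fresh tokens itself. A sketch only (the principal, keytab path, class, and jar name are all placeholders):

```
spark-submit \
  --principal user1@EXAMPLE.COM \
  --keytab /etc/security/keytabs/user1.keytab \
  --class com.example.LongRunningJob \
  long-running-job.jar
```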