Member since: 11-03-2016
Posts: 17
Kudos Received: 2
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 3957 | 12-09-2016 01:05 AM |
01-21-2022 03:08 PM
Hi Nagaraj (or anyone), can you please share the steps if you remember? I am seeing the following error:

ERROR org.apache.hadoop.hdfs.server.namenode.ha.StandbyCheckpointer: Exception in doCheckpoint
java.io.IOException: Exception during image upload: java.lang.NoClassDefFoundError: org/apache/http/client/utils/URIBuilder
Caused by: java.lang.NoClassDefFoundError: org/apache/http/client/utils/URIBuilder
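For context, org/apache/http/client/utils/URIBuilder comes from the Apache HttpComponents httpclient library, so a NoClassDefFoundError for it usually means that jar is not on the NameNode's classpath. A rough way to check, as a sketch only (jar names vary by release):

# Expand classpath wildcards and look for the httpclient jar
hadoop classpath --glob | tr ':' '\n' | grep -i httpclient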
04-16-2018 08:25 PM
Assuming that you are referencing Cloudera Navigator Encrypt: as part of the process of encrypting a disk, you can move existing data onto the newly encrypted disk. See the navencrypt-move command.

If you are referring to HDFS Transparent Encryption, then you must create a new encryption zone in HDFS (effectively a new directory) and then copy your HDFS data into it. A lot of people ask, "How can I encrypt an existing directory?" You would have to perform a few extra steps and have plenty of available disk space (a consolidated sketch follows the list):

1. Rename the existing directory in HDFS: hdfs dfs -mv /data /data.bak
2. Set up the encryption zone for /data: hadoop key create <keyname>; hdfs dfs -mkdir /data; hdfs crypto -createZone -keyName <keyname> -path /data
3. Copy the data from /data.bak to /data: hdfs dfs -cp /data.bak/* /data/
4. Remove /data.bak: hdfs dfs -rm -R /data.bak
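Putting those steps together, a minimal sketch (the key name "mykey" is only a placeholder, and this assumes you have KMS permission to create keys plus enough free HDFS space to hold a second copy of the data while both directories exist):

# Rename the existing unencrypted directory out of the way
hdfs dfs -mv /data /data.bak
# Create an encryption key in the KMS and an empty encryption zone at /data
hadoop key create mykey
hdfs dfs -mkdir /data
hdfs crypto -createZone -keyName mykey -path /data
# Copy the old data into the zone; it is encrypted transparently on write
hdfs dfs -cp /data.bak/* /data/
# After verifying the copy, remove the unencrypted original
hdfs dfs -rm -R /data.bak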
01-25-2017 10:00 AM
1 Kudo
Thanks. I resolved the issue by giving full admin access to the user ID I created on my LDAP server. That gave my LDAP user all the privileges needed to create other permissions.