Member since
09-14-2017
Posts: 120
Kudos Received: 11
Solutions: 5
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 3157 | 06-17-2021 06:55 AM |
|  | 1925 | 01-13-2021 01:56 PM |
|  | 17205 | 11-02-2017 06:35 AM |
|  | 19037 | 10-04-2017 02:43 PM |
|  | 34440 | 09-14-2017 06:40 PM |
02-28-2019
08:01 AM
Thanks! Manually creating the .Trash directory in the user's HDFS home directory works to show "Move to Trash" in Hue.
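For reference, a minimal sketch of the manual workaround (assuming a hypothetical user alice and the hdfs superuser; adjust names to your environment):

```
# Create the per-user trash directory in the user's HDFS home (hypothetical user "alice")
sudo -u hdfs hdfs dfs -mkdir -p /user/alice/.Trash
# Make the user the owner so Hue can move deleted files into it
sudo -u hdfs hdfs dfs -chown alice:alice /user/alice/.Trash
```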
01-30-2019
08:15 AM
Using Hue 4.1 on CDH 5.16, I only see the Delete Forever button in the File Browser but not Move to Trash, even though Use Trash is enabled in the HDFS configuration. How do I enable this in Hue?
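A quick way to confirm trash is actually enabled on the cluster (a minimal sketch; run from a host with the HDFS client configuration deployed):

```
# Returns the trash retention in minutes; 0 means trash is disabled
hdfs getconf -confKey fs.trash.interval
```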
Labels:
- Cloudera Hue
- HDFS
12-19-2018
02:23 PM
@bgooley Thanks a bunch! This is good info. I now see the note below, which confirms that /usr/lib/jvm is a valid path for OpenJDK: "Note: Cloudera strongly recommends installing Oracle JDK at /usr/java/<jdk-version> and OpenJDK at /usr/lib/jvm (or /usr/lib64/jvm on SLES 12), which allows Cloudera Manager to auto-detect and use the correct JDK version." Unfortunately, the CDH 5.16 install guide does not clarify that /usr/lib/jvm is an acceptable path for OpenJDK; it makes a blanket statement that the JDK must be installed at /usr/java/jdk-version. Hopefully the doc will be updated in the future. https://www.cloudera.com/documentation/enterprise/5-16-x/topics/cdh_ig_jdk_installation.html
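For anyone verifying this on their own hosts, a minimal sketch (assuming RHEL/CentOS 7 and the OS-provided OpenJDK 8 packages):

```
# The OS package installs OpenJDK 8 under /usr/lib/jvm
sudo yum install -y java-1.8.0-openjdk java-1.8.0-openjdk-devel
# Confirm the location Cloudera Manager should auto-detect
ls -d /usr/lib/jvm/java-1.8.0-openjdk*
```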
12-19-2018
05:13 AM
As of CDH 5.16, OpenJDK is supported; you can read this in the 5.16 install guide. This may be because Oracle will no longer provide Oracle JDK updates without a commercial license starting in 2019.
12-18-2018
03:56 PM
Hi, the Cloudera Installation Guide for 5.16 says the JDK must be installed at /usr/java/jdk-version. However, when installing OpenJDK it gets installed as:
/usr/bin/java -> /etc/alternatives/java
/etc/alternatives/java -> /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.191.b12-1.el7_6.x86_64/jre/bin/java
There is no /usr/java directory created. Is this going to cause any install issues? Thanks.
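For context, the symlink chain above can be traced like this (a quick sketch on RHEL/CentOS):

```
# Resolve the full symlink chain to see where OpenJDK actually lives
readlink -f /usr/bin/java
# List the registered java alternatives and the current selection
alternatives --display java
```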
Labels:
- Cloudera Manager
10-02-2018
07:21 AM
The parameter below needs to be added in the Cloudera Manager Hue configuration, under "Hue Service Advanced Configuration Snippet (Safety Valve) for hue_safety_valve.ini":
[spark]
security_enabled=true
However, it still gave an "Unable to authenticate" error. The Hue admin screen showed a misconfiguration in Spark: "The app won't work without a running Livy Spark Server." Also, /var/log/hue/error.log contained the message: GSSError: (('Unspecified GSS failure. Minor code may provide more information', 851968), ('Server HTTP/localhost@EXAMPLE.COM not found in Kerberos database', -1765328377)) These errors were resolved by adding the following lines to the Hue safety valve above:
[spark]
security_enabled=true
livy_server_host=yourlivyhostfqdn
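Once the safety valve is in place, a quick way to confirm the Livy server is reachable over Kerberos (a hedged sketch, assuming Livy's default port 8998 and a hypothetical principal; substitute your own):

```
# Obtain a Kerberos ticket, then call the Livy REST API with SPNEGO authentication
kinit your_principal
curl --negotiate -u : http://yourlivyhostfqdn:8998/sessions
```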
09-20-2018
04:05 PM
Great! Thanks so much. It would be good if the Hive community documented the database limits, which most major databases include in their manuals.
09-20-2018
03:08 PM
Hello, does anyone know the maximum table name and column name length limits in Hive and Impala? I could not find any Hive database limits specification, which is available for most other databases. Thanks.
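One way to infer the effective limits is to inspect the Hive metastore's backing database, since identifier lengths are bounded by the column widths defined in the metastore schema (a hedged sketch, assuming a MySQL-backed metastore database named metastore):

```
# The widths of TBLS.TBL_NAME and COLUMNS_V2.COLUMN_NAME bound the usable name lengths
mysql -u hive -p -e "DESCRIBE TBLS; DESCRIBE COLUMNS_V2;" metastore
```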
Labels:
- Apache Hive
- Apache Impala
08-13-2018
02:19 PM
This issue was resolved by following the instructions at this site: http://vijayjt.blogspot.com/2016/02/how-to-connect-to-kerberised-chd-hadoop.html We need to copy the Java JCE unlimited strength policy files and the krb5.conf file under the jdk/jre/lib/security folder of the JDK that SQL Developer uses. After this, the Hive connection via Kerberos was successful.
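For reference, the copy steps look roughly like this (a hedged sketch; the SQL Developer JDK path is a hypothetical placeholder for your environment):

```
# JDK that SQL Developer launches with (hypothetical path)
SQLDEV_JDK=/opt/sqldeveloper/jdk
# Copy the JCE unlimited strength policy files (from the extracted JCE download) into the JDK's security folder
cp local_policy.jar US_export_policy.jar "$SQLDEV_JDK/jre/lib/security/"
# Copy the cluster's krb5.conf alongside them
cp /etc/krb5.conf "$SQLDEV_JDK/jre/lib/security/"
```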
08-08-2018
01:41 PM
Hello, did you resolve this SQL Developer connection error? If so, what was the solution, as I have the same issue. Thanks!