Member since: 03-23-2015
Posts: 1288
Kudos Received: 114
Solutions: 98
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4319 | 06-11-2020 02:45 PM |
| | 5930 | 05-01-2020 12:23 AM |
| | 3752 | 04-21-2020 03:38 PM |
| | 4035 | 04-14-2020 12:26 AM |
| | 2984 | 02-27-2020 05:51 PM |
12-12-2018
01:50 PM
Hi elkarel, Thanks for your quick reply. The script is useful for me as well. But it's a pity that we don't have an option to copy the default files to the workspace when the workspace is created. We should have a configuration in hue.ini (like remote_data_dir) to copy the default contents once the workspace directory is created. Thanks
12-11-2018
01:44 AM
Hi EricL, Finally we found the right direction and got the solution below: add the following to hive-site.xml on the system:

<property>
  <name>parquet.column.index.access</name>
  <value>true</value>
  <description>set parquet index access</description>
</property>

then restart Hive. After that, everything was fine. This issue left a deep impression on me; I hope this reply helps others who hit the same issue, or at least points them toward a solution. Thanks.
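If editing hive-site.xml cluster-wide isn't an option, the same property can normally also be set per-session; a minimal sketch (the table name is a placeholder):

```sql
-- Enable Parquet column access by index for this session only
SET parquet.column.index.access=true;
-- Then run the affected query in the same session
SELECT * FROM my_table LIMIT 10;
```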
12-07-2018
08:48 AM
Hi minsirv, Unfortunately, we didn't get too involved with HBase Thrift authentication. But let me share one more thing about Hue authentication: Hue's default account is "admin", and this account can cause permission-denied problems. Please try to use an account such as "hive" in Hue and avoid using "admin". I hope this hard-won experience helps you.
11-23-2018
02:20 AM
1 Kudo
Hi, When you create a directory with a relative path on the Cloudera QuickStart VM, for example: hdfs dfs -mkdir myfiles HDFS resolves the path against the current user's home directory. Running it as the 'cloudera' user therefore creates the directory under '/user/cloudera' (myfiles ends up there, not under '/'). To access this location from programs or queries, use the full URI 'hdfs://quickstart.cloudera/user/cloudera/myfiles'. Thank you
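A minimal sketch of that path behaviour (QuickStart hostname and user assumed):

```bash
# A relative path resolves against the user's HDFS home directory
hdfs dfs -mkdir myfiles
hdfs dfs -ls /user/cloudera          # 'myfiles' appears here, not under /

# From programs or queries, reference it by its full URI:
#   hdfs://quickstart.cloudera/user/cloudera/myfiles

# To create a directory directly under /, pass an absolute path (requires permissions)
hdfs dfs -mkdir /myfiles
```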
11-18-2018
10:50 PM
Hi EricL, Thank you for your useful reply. It was because the user didn't have write permission. As for the "Caused by" part, I am using a Spring Boot application and it doesn't show any other information besides the log I posted above. When I use beeline with the "!connect jdbc:hive2://localhost:10000/" command, it does report the write-access problem.
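For anyone else debugging this, a sketch of surfacing the error outside the application (the host, port, and warehouse path are assumptions):

```bash
# beeline reports the write-access error that the Spring Boot log hides
beeline
!connect jdbc:hive2://localhost:10000/

# inspect ownership and permissions on the target HDFS directory
hdfs dfs -ls /user/hive/warehouse
```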
... View more
11-04-2018
11:20 PM
Yeah, the format was different.
11-03-2018
04:54 AM
Hi Tomas, Thanks for providing the solution. Could you please accept your findings as the solution so that it is obvious to other community users? Cheers
10-31-2018
06:06 PM
Hi, The error message "Unsupported major.minor version 52.0" means the UDF jar file was compiled with a version of Java different from the one HS2 runs under. If you run the command below:

javap -verbose com/udf/StringSplitter.class | grep "major"

what number is returned? For reference, the class-file major versions for each Java release are:

Java 5 uses major version 49
Java 6 uses major version 50
Java 7 uses major version 51
Java 8 uses major version 52
Java 9 uses major version 53

So you need to compile using the same Java version that HS2 runs under. Hope the above helps.
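A sketch of fixing the mismatch, assuming HS2 runs on Java 8 and the UDF source file is StringSplitter.java:

```bash
# 52 here means the class was built for Java 8
javap -verbose com/udf/StringSplitter.class | grep "major"

# Recompile targeting Java 8 bytecode, even from a newer JDK
javac -source 1.8 -target 1.8 StringSplitter.java
```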
10-26-2018
06:57 AM
Hi, actually both session and operation timeouts are set to 6h, so this shouldn't be a problem. Thanks!
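For reference, a sketch of checking the two timeouts usually meant here (stock HiveServer2 property names assumed; 6h = 21600000 ms):

```sql
-- SET with no value prints the current setting
SET hive.server2.idle.session.timeout;
SET hive.server2.idle.operation.timeout;
```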
10-25-2018
04:05 AM
Hi, Thanks for reporting the typo. I believe you are right; it should be the Hive engine, since it is the Hive ODBC driver. I will report this internally and get it updated. The difference is basically that one connects to HiveServer2 and the other connects to Impala :).