Member since: 05-07-2018
Posts: 331
Kudos Received: 45
Solutions: 35
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 7015 | 09-12-2018 10:09 PM
 | 2735 | 09-10-2018 02:07 PM
 | 9293 | 09-08-2018 05:47 AM
 | 3067 | 09-08-2018 12:05 AM
 | 4092 | 08-15-2018 10:44 PM
03-23-2022
03:05 AM
@iamfromsky as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.
01-13-2022
10:28 PM
Hi there, is it dfs_datanode_data_dir_perm? What was your previous value for it when it couldn't write?
01-11-2021
09:35 PM
Later versions of Hive have a "sys" DB that, under the hood, connects back to the Hive metastore database (e.g. Postgres or whatever), and you can query that. Impala seems not to be able to see this sys DB, though. There is also an "information_schema" DB with a smaller and cleaner subset, but it points back to sys and is likewise not visible from Impala if you do a "show databases;". You can use "show" statements in impala-shell, but I'm not sure there is a DB to throw SQL at via ODBC/JDBC. Still looking for a way to do this in Impala.
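For what it's worth, a hedged sketch of querying that sys DB from Hive (run via beeline against HiveServer2, not Impala); the table and column names below follow the Hive 3 sys schema, which mirrors the metastore tables, but verify them against your version with "show tables in sys;" first:

```sql
-- Not visible from Impala; run in beeline against HiveServer2.
USE sys;
SHOW TABLES;

-- List tables per database straight from the metastore-backed sys DB.
-- TBLS and DBS mirror the metastore schema; names can vary by version.
SELECT d.name AS db_name, t.tbl_name, t.tbl_type
FROM sys.tbls t
JOIN sys.dbs d ON t.db_id = d.db_id
ORDER BY d.name, t.tbl_name;
```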
11-19-2020
03:18 AM
I too have the same scenario, where the column was decimal and was updated to bigint; now I'm getting an error while querying the column, even though the data type on the table and in the Parquet file are aligned. Error: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable (state=,code=0) If you have already resolved the issue, I would much appreciate it if you could let me know what worked for you. Thanks, Dinesh
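In case it helps others debugging the same cast error, a hedged sketch of one common workaround (the table and column names here are hypothetical): Hive reads the physical Parquet type, so files written as int64 fail under a decimal column and vice versa; realign the Hive column with the type actually in the files, and cast at query time where decimal semantics are still needed:

```sql
-- Hypothetical names. If the files were written as bigint (int64),
-- a decimal column declaration triggers the
-- LongWritable -> HiveDecimalWritable ClassCastException on read.
ALTER TABLE my_db.my_table CHANGE COLUMN amount amount BIGINT;

-- Where decimal semantics are still needed, cast explicitly instead:
SELECT CAST(amount AS DECIMAL(18,2)) AS amount_dec
FROM my_db.my_table;
```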
08-04-2020
07:30 AM
Could you please let us know which documentation you are talking about?
04-13-2020
12:46 AM
Hi @elkrish, was this resolved? Can you share the solution if you found one for this issue?
04-09-2020
08:26 AM
So Presto now supports ACID tables, but only for Hive 3. However, the subdirectory exception comes from a configuration on the Presto client side: in the hive.properties file in Presto's catalog directory, add "hive.recursive-directories=true".
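For reference, a minimal sketch of that catalog file; the path, connector name, and metastore host below are the usual defaults and a placeholder, so adjust for your install:

```properties
# etc/catalog/hive.properties on the Presto coordinator and workers
connector.name=hive-hadoop2
hive.metastore.uri=thrift://metastore-host.example.com:9083
# Allow reading data laid out in subdirectories (e.g. Hive ACID deltas):
hive.recursive-directories=true
```

Restart Presto after editing the catalog file so the property takes effect.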
01-30-2020
03:24 AM
How do I check if HS2 can reach port 2181?
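One quick way (a sketch; the ZooKeeper hostname below is a placeholder for your quorum member): run a TCP check from the HiveServer2 host with nc, or ask ZooKeeper directly with its four-letter "ruok" command:

```shell
# Run these from the HS2 host; replace zk-host.example.com with your ZK node.
nc -vz zk-host.example.com 2181

# Confirm ZooKeeper is actually answering on that port:
echo ruok | nc zk-host.example.com 2181   # a healthy server replies "imok"
```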
12-14-2019
05:21 AM
No, I can't view the result of this simple query.