Member since: 03-23-2015
Posts: 1288
Kudos Received: 114
Solutions: 98

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3295 | 06-11-2020 02:45 PM |
| | 5013 | 05-01-2020 12:23 AM |
| | 2815 | 04-21-2020 03:38 PM |
| | 2619 | 04-14-2020 12:26 AM |
| | 2316 | 02-27-2020 05:51 PM |
10-18-2016 03:43 AM
Hi, Did you run "kinit" as a valid user before attempting to connect to Impala from Windows?
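For example, from a command prompt with MIT Kerberos installed (a minimal sketch; the principal and realm below are placeholders, substitute your own):

kinit your_user@EXAMPLE.COM
klist

klist should show a valid ticket-granting ticket before you attempt the JDBC connection.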
10-12-2016 01:50 PM
1 Kudo
Hi, Please enable the JDBC trace log for further debugging:

jdbc:impala://{PUBLIC IP ADDRESS}:21051;AuthMech=1;KrbRealm={REALM};KrbHostFQDN={fqdn};KrbServiceName=impala;LogLevel=6;LogPath=/path/to/directory

Then provide the errors from the trace log under /path/to/directory for review. Please also look for the corresponding errors in the Impala daemon logs.
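For illustration, a filled-in URL might look like this (the host, realm, FQDN, and log path below are placeholder values, substitute your own):

jdbc:impala://203.0.113.10:21051;AuthMech=1;KrbRealm=EXAMPLE.COM;KrbHostFQDN=impalad01.example.com;KrbServiceName=impala;LogLevel=6;LogPath=C:\impala-jdbc-trace

As far as I recall, LogLevel=6 is the most verbose (trace) setting in the Cloudera JDBC driver, so remember to remove it once debugging is done.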
09-30-2016 01:44 AM
Hi, I can see that you are setting "current_date = 01-01-2015;", however you used ${hiveconf:start_date}, which I think should be ${hiveconf:current_date}. Also, when you run "hive -hiveconf start_date=current_date -f argument_script.hql" from the command line, where do you set "current_date"? If it is a shell variable set on the command line, then I think you should be using $current_date instead. As Naveen mentioned, you should first make sure that hive.variable.substitute=true. Thanks
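To illustrate the working pattern (a minimal sketch; the table name and value below are made up):

# On the command line, pass the actual value:
hive -hiveconf current_date=2015-01-01 -f argument_script.hql

-- Inside argument_script.hql, reference the same variable name:
SELECT * FROM my_table WHERE event_date = '${hiveconf:current_date}';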
09-30-2016 01:24 AM
1 Kudo
Hi, It depends on your scenario:
- do you use Cloudera Manager to manage CDH?
- do you also want to move the Sentry database?
If you use CM and do not need to move the Sentry database, then you can simply delete the current Sentry service and add a new one configured to read from the same DB. Don't forget to back up the Sentry DB first, just in case.
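For example, if the Sentry database happens to be on MySQL, a quick backup could look like this (the user and database names are placeholders):

mysqldump -u sentry_user -p sentry > sentry_backup.sql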
09-10-2016 04:50 AM
I can see that the stats for table "gwynniebee_bi.fact_recommendation_events" were not available:
| table stats: unavailable
| column stats: unavailable
Without stats, Impala cannot make accurate estimates of the resources required for the query, which can sometimes lead to OOM errors because memory usage is wrongly estimated up front. If you check the SUMMARY of the query, you will get detailed information about estimated and actual memory usage. I suggest you run "COMPUTE STATS gwynniebee_bi.fact_recommendation_events" and try again.
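For example, in impala-shell:

COMPUTE STATS gwynniebee_bi.fact_recommendation_events;
-- verify that stats are now populated
SHOW TABLE STATS gwynniebee_bi.fact_recommendation_events;
SHOW COLUMN STATS gwynniebee_bi.fact_recommendation_events;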
09-05-2016 08:50 PM
Hi Teng, I am glad that you have figured out the cause.
09-05-2016 03:37 AM
Hi Teng, This sounds like you are using a different Thrift service class. Can you please copy and paste some sample HS2 and HMS log lines that do not have thread information? I would like to confirm which class you are using. Thanks
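For reference, thread names appear in HS2/HMS logs when the log4j conversion pattern includes %t; an illustrative pattern (not necessarily your exact config) would be:

log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n

If your pattern omits %t, the thread information will be missing regardless of the service class.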
08-27-2016 12:40 AM
Hi Jais, Can you please let me know where you run your Hive query? Do you run it through Hue? If so, in most cases the staging directory will be left over even after the query finishes. This is because Hue holds the query handle open so that users can get back to it, and the cleanup of staging directories is only triggered when the query handle is closed. So the first thing I would like to check is where you run your Hive query. Thanks
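To see whether leftover staging directories are indeed accumulating, you can list them under the table's location (the path below is a placeholder):

hdfs dfs -ls /user/hive/warehouse/mydb.db/mytable | grep hive-staging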
08-27-2016 12:33 AM
Hi sim6, Do you mean that you want to run all the queries in the file through Hue's Hive Editor interface, rather than through Beeline or the Hive CLI? Can you please copy and paste the content of the script here for me to have a look? What result did you get when you tried to run this file in Hue? You mentioned you can't execute all of them; do you know which ones were executed? And how did you run this file through Hue?
05-24-2015 04:48 PM
This is a reported bug; you can work around it by breaking the "CREATE TABLE ... AS SELECT ..." statement from one step into two steps.
1) Create table first with definitions:
CREATE TABLE separator_test (
id int,
name string
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES ("separatorChar" = "\t","quoteChar"="\"","escapeChar"="\\")
STORED AS TEXTFILE;
2) Then insert data into the newly created table:
INSERT OVERWRITE TABLE separator_test SELECT * FROM table_name;
This forces Hive to bypass the bug and insert the data correctly.
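For contrast, the single-step form that triggers the bug would look something like this (same table, serde properties, and source table as above):

CREATE TABLE separator_test
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES ("separatorChar" = "\t","quoteChar"="\"","escapeChar"="\\")
STORED AS TEXTFILE
AS SELECT * FROM table_name;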