Created 07-01-2024 11:10 PM
Hello
I am struggling to set the time zone for Spark SQL sessions when submitting queries through Hue. Maybe somebody has already had this problem and knows how to solve it; I would be grateful for the help.
In a PySpark session running on Linux, the time zone is the one I want:
spark.conf.get("spark.sql.session.timeZone")
'Europe/Warsaw'
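For context, this is roughly how the session is set up (a minimal PySpark sketch, not my exact job; the app name is just a placeholder):

from pyspark.sql import SparkSession

# Build a session with an explicit SQL session time zone
spark = (SparkSession.builder
         .appName("tz-check")
         .config("spark.sql.session.timeZone", "Europe/Warsaw")
         .getOrCreate())

# Confirm the setting and see what "now" evaluates to in this session
print(spark.conf.get("spark.sql.session.timeZone"))  # Europe/Warsaw
spark.sql("SELECT current_timestamp() AS ts").show(truncate=False)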
However, when I evaluate a TIMESTAMP through Hue in a query against Spark SQL, I get a UTC timestamp.
SELECT TIMESTAMP 'now';
1 | 2024-07-02T06:01:06Z |
Regards,
Bart
Created 07-03-2024 02:54 AM
Hello @Bartlomiej,
You can change the timezone for Hue using the steps below:
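If it helps, the usual place is the time_zone entry in the [desktop] section of hue.ini (or the Hue safety valve / advanced configuration snippet in Cloudera Manager). A sketch of the relevant fragment, assuming a standard Hue install:

# hue.ini
[desktop]
# Time zone Hue uses when rendering timestamps in the UI
time_zone=Europe/Warsaw

Restart Hue after changing it so the setting takes effect.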
Let me know if this helps.
Cheers!
Created 07-03-2024 03:06 AM
Hello
Thank you for this hint. However, I have already checked that configuration and it was already set to Europe/Warsaw, the desired one.
There must be some other place to specify this, I think.
Created 07-04-2024 06:46 AM
I think the time zone setting ultimately has nothing to do with CURRENT_TIMESTAMP(). What I actually need is to call CURRENT_TIMESTAMP() or TIMESTAMP 'now' and have local time as the default result of those functions in Spark SQL when running the query through Hue.
Right now these functions return a time that is 2 hours behind, which means UTC, and I would like the default to be 2 hours later, i.e. Europe/Berlin time for example.
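As a workaround until the underlying cause is fixed, the conversion can be done explicitly in the query itself; from_utc_timestamp is a built-in Spark SQL function (the zone name here is only an example):

SELECT from_utc_timestamp(current_timestamp(), 'Europe/Warsaw') AS local_ts;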
Created 07-09-2024 11:18 PM
It looks like you are hitting the CDPD-66940 Jira. Try upgrading your CDS3 parcel to the latest version; it should resolve your issue.
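After the upgrade, a quick sanity check can be run directly from Hue (assuming Spark 3.1 or later, where current_timezone() is available):

-- Verify the Spark version, the session time zone, and the rendered "now"
SELECT version(), current_timezone(), current_timestamp();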
Created 07-15-2024 05:20 PM
@Bartlomiej Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
Thanks.
Regards,
Diana Torres
Created 10-08-2024 03:57 AM
Yes, upgrading Spark to the newest version, SPARK3-3.3.2.3.3.7190.5-2-1.p0.54391297, fixed the issue.