Member since 08-05-2024 · 21 Posts · 12 Kudos Received · 0 Solutions
09-11-2024 08:47 AM
@ggangadharan thanks for your reply. Yes, as soon as Spark sees the NUMBER data type in Oracle, it maps the DataFrame column to decimal(38,10). Then, when the precision of the value in the Oracle column exceeds what that type can hold (in our case >30 digits), Spark cannot accommodate it, because decimal(38,10) leaves only 28 digits for the integer part, hence this issue. Yes, as you said, the probable solution is to cast it to StringType.
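For anyone hitting the same overflow, here is a minimal sketch of that workaround for a Spark JDBC read; the table name, column name, and connection details below are hypothetical. Spark's customSchema option overrides the default NUMBER -> decimal(38,10) mapping so the value is fetched as text instead of overflowing:

// Hypothetical table MY_TABLE with a high-precision NUMBER column AMOUNT.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
  .option("dbtable", "MY_TABLE")
  .option("user", "app_user")
  .option("password", "app_password")
  // Override the inferred decimal(38,10) for this column only;
  // all other columns keep their default type mappings.
  .option("customSchema", "AMOUNT STRING")
  .load()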
08-28-2024 12:40 AM
1 Kudo
When writing to a statically partitioned table using HWC, the following query is internally fired to Hive through JDBC after the data has been written to a temporary location:

Spark write statement:
df.write.format(HIVE_WAREHOUSE_CONNECTOR).mode("append").option("partition", "c1='val1',c2='val2'").option("table", "t1").save();

HWC internal query:
LOAD DATA INPATH '<spark.datasource.hive.warehouse.load.staging.dir>' [OVERWRITE] INTO TABLE db.t1 PARTITION (c1='val1',c2='val2');

During static partitioning, the partition information is known at compile time, so the staging directory is created inside the partition directory.

On the other hand, when writing to a dynamically partitioned table using HWC, the following queries are internally fired to Hive through JDBC after the data has been written to a temporary location:

Spark write statement:
df.write.format(HIVE_WAREHOUSE_CONNECTOR).mode("append").option("partition", "c1='val1',c2").option("table", "t1").save();

HWC internal queries:
CREATE TEMPORARY EXTERNAL TABLE db.job_id_table(cols....) STORED AS ORC LOCATION '<spark.datasource.hive.warehouse.load.staging.dir>';
INSERT INTO TABLE t1 PARTITION (c1='val1',c2) SELECT <cols> FROM db.job_id_table;

During dynamic partitioning, the partition information is known only at runtime, so the staging directory is created at the table level. Once the DAG is completed, the MOVE TASK moves the files to their respective partitions.
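To make the write path concrete, here is a minimal end-to-end sketch of the dynamic-partition case, assuming a cluster with the HWC jar on the classpath and the standard HiveWarehouseSession builder; the table and partition names are hypothetical:

import com.hortonworks.hwc.HiveWarehouseSession
import com.hortonworks.hwc.HiveWarehouseSession.HIVE_WAREHOUSE_CONNECTOR

// Build the HWC session on top of an existing SparkSession.
val hive = HiveWarehouseSession.session(spark).build()

df.write
  .format(HIVE_WAREHOUSE_CONNECTOR)
  .mode("append")
  // c1 is static (fixed at compile time); c2 is dynamic, so its value
  // is resolved per row at runtime and the staging directory is created
  // at the table level, as described above.
  .option("partition", "c1='val1',c2")
  .option("table", "t1")
  .save()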
08-26-2024 02:15 AM
2 Kudos
Yes, it works with the PUT command in place of the GET command.
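For context, a minimal sketch of an entity update against the Atlas v2 REST API, assuming basic authentication; the host, credentials, type, and attribute values are hypothetical, and the exact endpoint and body shape should be verified against your Atlas version. The point is that writes go through PUT/POST, while GET only reads entities:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.util.Base64

object AtlasEntityUpdate {
  def main(args: Array[String]): Unit = {
    // Hypothetical credentials and host; adjust for your cluster.
    val auth = Base64.getEncoder.encodeToString("admin:admin".getBytes("UTF-8"))
    // Partial update: set a new description on an existing hive_table entity.
    val body = """{"entity": {"typeName": "hive_table", "attributes": {"description": "updated via PUT"}}}"""

    val request = HttpRequest.newBuilder()
      .uri(URI.create("http://atlas-host:21000/api/atlas/v2/entity/uniqueAttribute/type/hive_table?attr:qualifiedName=db.t1@cluster"))
      .header("Content-Type", "application/json")
      .header("Authorization", s"Basic $auth")
      .PUT(HttpRequest.BodyPublishers.ofString(body))   // PUT, not GET
      .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println(s"${response.statusCode}: ${response.body}")
  }
}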
08-22-2024 04:33 AM
1 Kudo
@bigdatacm Maybe the configuration for the Hive hook in Atlas is not properly set up. Ensure that the Atlas hook is correctly configured in the Hive configuration file (hive-site.xml), and check the Atlas and Hive logs for any errors or warnings related to the hook configuration.
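As a reference point, a minimal sketch of the hook entry commonly required in hive-site.xml; the class below is the standard Atlas Hive hook, but verify the value against your own cluster's configuration:

<property>
  <name>hive.exec.post.hooks</name>
  <value>org.apache.atlas.hive.hook.HiveHook</value>
</property>

The hook typically also needs atlas-application.properties (with atlas.rest.address pointing at the Atlas server) on the HiveServer2 classpath.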
08-21-2024 12:15 PM
Can we know why you are using the GET command to update the entity? Is it a typo?