I am using Talend Big Data Platform to implement ELT jobs with Hive tables. The jobs run on the Spark engine, and I have a problem with one Hive on Spark job.
The task is to join two tables (Parquet) and insert the output into another table (Parquet).
The job fails, but I am not able to interpret the error.
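For reference, the statement the job generates should be roughly equivalent to the following sketch (all table and column names here are placeholders, not the ones from my actual job):

```sql
-- Hypothetical sketch of the ELT step: join two Parquet-backed
-- Hive tables and insert the result into a third Parquet table.
INSERT INTO TABLE target_table
SELECT a.join_key,
       a.some_col,
       b.other_col
FROM   table_a a
JOIN   table_b b
  ON   a.join_key = b.join_key;
```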
This is the error message:
[FATAL]: datahub_rp_cli.j_rp_cli_movimenti_0_1.J_RP_CLI_MOVIMENTI - tRunJob_2 Child job running failed
Exception in component tELTHiveOutput_9
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
Searching for a solution in the official documentation and forums, I understand that SparkTask errors (with return code 1, 2, 3, etc.) act as a kind of 'wrapper' around underlying Java errors; however, I cannot access the Hive logs because of denied permissions on the server.
So, do you have a list of causes that can produce the underlying Java errors behind SparkTask return code 3?