Member since: 06-02-2020
Posts: 235
Kudos Received: 35
Solutions: 36
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 114 | 10-24-2023 09:58 PM |
|  | 134 | 10-24-2023 09:43 PM |
|  | 475 | 10-05-2023 01:44 AM |
|  | 496 | 10-02-2023 10:58 PM |
|  | 329 | 09-14-2023 06:09 PM |
11-19-2023
09:57 PM
Could someone please suggest a solution for this?
11-19-2023
07:12 PM
Hi @KPG1 Cloudera does not yet support Spark 3.4.1. Are you using open-source Apache Spark?
11-01-2023
08:07 AM
In onStageCompleted you should also see failures, along with the reason via StageInfo.failureReason(). Check the standard event-log JSON to see whether the logger stored that event; if it is not there either, your listener might have an issue. If you do not see the StageCompleted event anywhere (not even in the event log), the failure was probably more than a regular error: a major error or crash aborted the stage in a way that no StageCompleted event was ever sent. Also bear in mind that there are dedicated listeners for SQL execution with success/failure callbacks: QueryExecutionListener.
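If you want to double-check the event log directly, a quick way is to grep it for the event names. A minimal sketch below: the file path and the sample JSON line are made up for illustration, so point the variable at your real event-log file instead.

```shell
# Illustrative check of a Spark JSON event log; the path and the sample line
# are fabricated for demonstration -- use your real event-log file instead.
log=./eventlog-sample.json
printf '%s\n' '{"Event":"SparkListenerStageCompleted","Stage Info":{"Stage ID":3}}' > "$log"

# Count how many stages reported completion; a failed stage additionally
# carries a "Failure Reason" field inside its "Stage Info".
grep -c '"Event":"SparkListenerStageCompleted"' "$log"
grep -c '"Failure Reason"' "$log" || true
```

If the completion count is zero for the stage you care about, the stage never emitted a StageCompleted event at all, which matches the crash scenario described above.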
10-24-2023
09:58 PM
Hi @ali786XI Jolokia is not part of the Cloudera stack, and we do not support running Spark applications in standalone mode.
10-24-2023
09:56 PM
Hi @myzard I think you need to verify that the following are set properly: 1. the SPARK_HOME path, and 2. the Python environment paths.
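A minimal way to sanity-check both from a Linux shell. SPARK_HOME and PYSPARK_PYTHON are Spark's standard environment variables; the python3 fallback is an assumption, adjust for your install.

```shell
# 1. SPARK_HOME should point at a directory containing bin/spark-submit
if [ -n "${SPARK_HOME:-}" ] && [ -x "${SPARK_HOME}/bin/spark-submit" ]; then
  echo "SPARK_HOME looks valid: ${SPARK_HOME}"
else
  echo "SPARK_HOME is unset or does not contain bin/spark-submit"
fi

# 2. The interpreter PySpark will use (PYSPARK_PYTHON, falling back to python3)
py="${PYSPARK_PYTHON:-python3}"
if command -v "$py" >/dev/null 2>&1; then
  echo "Python found: $py"
else
  echo "Python not found on PATH: $py"
fi
```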
10-24-2023
09:49 PM
Hi @SAMSAL It sounds like you want to run the Spark application in standalone mode. Please follow these steps: 1. Install Apache Spark. 2. Start the standalone master and workers; by default the master listens on port 7077. Open the standalone web UI and confirm all workers are running as expected. 3. Once everything is running as expected, submit the Spark application, specifying the standalone master host and port 7077.
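The steps above look roughly like this on the command line. This is a sketch assuming Spark is installed under $SPARK_HOME and `<host>` stands in for your master's hostname; newer Spark releases use start-worker.sh, older ones start-slave.sh.

```shell
# Start the standalone master (listens on spark://<host>:7077 by default)
$SPARK_HOME/sbin/start-master.sh

# Start a worker and register it with the master
$SPARK_HOME/sbin/start-worker.sh spark://<host>:7077

# Check the master web UI (default http://<host>:8080), confirm the workers
# show up as ALIVE, then submit the application against the master URL:
$SPARK_HOME/bin/spark-submit --master spark://<host>:7077 app.py
```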
10-24-2023
09:44 PM
Have you tried restarting the cluster and the cluster services? You need to keep all services running, because every service depends, directly or indirectly, on the others.
10-24-2023
09:43 PM
I think you need to verify that the YARN and Spark resources are configured properly. If they are, check the Spark UI; it shows the driver memory and executor memory. If the values match what you expect, you can safely ignore the warning.
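If the values in the UI are not what you expect, you can set the memory explicitly at submit time. A sketch: the sizes and executor count are placeholders to tune for your cluster.

```shell
# Explicit driver/executor memory; the values are examples, not recommendations
spark-submit \
  --master yarn \
  --driver-memory 4g \
  --executor-memory 4g \
  --num-executors 2 \
  app.py
```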
10-24-2023
09:40 PM
Hi @adhishankarit Spark applications run on the Spark engine, not on a Tez engine, unlike Hive. You do not need to set any engine on the Spark side. If you want to run Hive queries, you can choose an engine such as Tez, Spark, or MR.
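On the Hive side, the engine is chosen per session via the hive.execution.engine property. A sketch using beeline; the JDBC URL is a placeholder for your HiveServer2 endpoint.

```shell
# Set the Hive execution engine for the session (tez, mr, or spark);
# Spark applications ignore this property entirely.
beeline -u "jdbc:hive2://<host>:10000" \
        -e "SET hive.execution.engine=tez; SELECT 1;"
```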
10-24-2023
09:36 PM
I think you don't have sufficient resources to run the job in the root.hdfs queue. Check the Resource Manager UI for pending or running jobs/applications in the root.hdfs queue, and kill any that are not required. Also verify by submitting the Spark job with smaller resource requests as a test.
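You can inspect and clean up the queue from the command line as well. Illustrative commands; the application id is a placeholder you take from the list output.

```shell
# List running applications and filter for the root.hdfs queue
yarn application -list -appStates RUNNING | grep root.hdfs

# Kill an application that is holding the queue's resources
yarn application -kill <application_id>
```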