Member since
06-02-2020
331
Posts
65
Kudos Received
49
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1199 | 07-11-2024 01:55 AM |
| | 3479 | 07-09-2024 11:18 PM |
| | 2861 | 07-09-2024 04:26 AM |
| | 2306 | 07-09-2024 03:38 AM |
| | 2520 | 06-05-2024 02:03 AM |
10-24-2023
09:43 PM
I think you need to verify that the YARN and Spark resources are configured properly. If they are, check the Spark UI; it will show the driver memory and executor memory. If the values are as expected, you can safely ignore it.
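As a sketch (the class name, jar, and sizes below are placeholders, not values from this thread), you can set the resources explicitly at submit time and then cross-check the same numbers on the Spark UI:

```shell
# Set driver and executor memory explicitly, then compare these values
# against what the Spark UI "Executors" tab reports for the application.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 4g \
  --executor-memory 8g \
  --num-executors 4 \
  --class com.example.MyApp \
  myapp.jar
```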
10-24-2023
09:40 PM
Hi @adhishankarit, Spark applications run on the Spark engine, not the Tez engine (unlike Hive). You do not need to set any execution engine on the Spark side. If you want to run Hive queries, you can choose an engine such as Tez, Spark, or MR.
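For the Hive side, a minimal sketch of choosing the engine per session (the JDBC URL and table name are placeholders):

```shell
# The execution engine is a Hive setting, chosen in the Hive session itself,
# e.g. from beeline. Spark jobs ignore this property entirely.
beeline -u "jdbc:hive2://hiveserver2-host:10000/default" \
  -e "SET hive.execution.engine=tez; SELECT COUNT(*) FROM my_table;"
```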
10-24-2023
09:36 PM
I think you don't have sufficient resources to run the job in the root.hdfs queue. Check the Resource Manager UI for pending or running applications in the root.hdfs queue, and kill any that are no longer required. Also try submitting the Spark job with smaller resource requests to test.
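The checks above can also be done from the command line; a sketch (the application ID and submit options are placeholders):

```shell
# List applications still occupying the cluster; look for entries in the
# root.hdfs queue in the output.
yarn application -list -appStates RUNNING,ACCEPTED

# Kill a stuck application that is no longer needed (ID is a placeholder).
yarn application -kill application_1690000000000_0001

# Re-test the job with minimal resource requests against the same queue.
spark-submit --queue root.hdfs \
  --driver-memory 1g --executor-memory 1g --num-executors 1 \
  --class com.example.MyApp myapp.jar
```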
10-19-2023
04:36 PM
1 Kudo
To take part in the processing, the node must have a NodeManager role, as well as the Spark Gateway and YARN Gateway roles.
10-17-2023
10:27 PM
Apache Spark is a unified analytics engine for large-scale data processing. It provides high-level APIs in Java, Scala, Python, and R, and an optimized engine that supports general execution graphs.
The table below provides links to the documentation for each Spark 3 release:

| Spark Version | Documentation Link | Release Date |
|---|---|---|
| Spark 3.0.0 | https://spark.apache.org/docs/3.0.0/ | 2020-06-16 |
| Spark 3.0.1 | https://spark.apache.org/docs/3.0.1/ | 2020-11-05 |
| Spark 3.0.2 | https://spark.apache.org/docs/3.0.2/ | 2021-02-19 |
| Spark 3.0.3 | https://spark.apache.org/docs/3.0.3/ | 2021-06-23 |
| Spark 3.1.1 | https://spark.apache.org/docs/3.1.1/ | 2021-03-02 |
| Spark 3.1.2 | https://spark.apache.org/docs/3.1.2/ | 2021-06-01 |
| Spark 3.1.3 | https://spark.apache.org/docs/3.1.3/ | 2022-02-18 |
| Spark 3.2.0 | https://spark.apache.org/docs/3.2.0/ | 2021-10-13 |
| Spark 3.2.1 | https://spark.apache.org/docs/3.2.1/ | 2022-01-26 |
| Spark 3.2.2 | https://spark.apache.org/docs/3.2.2/ | 2022-07-15 |
| Spark 3.2.3 | https://spark.apache.org/docs/3.2.3/ | 2022-11-28 |
| Spark 3.2.4 | https://spark.apache.org/docs/3.2.4/ | 2023-04-13 |
| Spark 3.3.0 | https://spark.apache.org/docs/3.3.0/ | 2022-06-17 |
| Spark 3.3.1 | https://spark.apache.org/docs/3.3.1/ | 2022-10-25 |
| Spark 3.3.2 | https://spark.apache.org/docs/3.3.2/ | 2023-02-15 |
| Spark 3.3.3 | https://spark.apache.org/docs/3.3.3/ | 2023-08-21 |
| Spark 3.4.0 | https://spark.apache.org/docs/3.4.0/ | 2023-04-13 |
| Spark 3.4.1 | https://spark.apache.org/docs/3.4.1/ | 2023-06-23 |
| Spark 3.5.0 | https://spark.apache.org/docs/3.5.0/ | 2023-09-13 |
10-05-2023
01:54 AM
Hi @hegdemahendra From CDP onwards, the Spark Thrift Server is not supported. You can try the following link; it may be useful for you: https://stackoverflow.com/questions/29227949/how-to-implement-spark-sql-pagination-query
10-05-2023
01:44 AM
Hi Team, Livy3 integration with Zeppelin is not yet supported. To use Spark 3, you need to install Python 3 and add the following parameters: PYSPARK3_PYTHON and spark.yarn.appMasterEnv.PYSPARK3_PYTHON. Reference: https://docs.cloudera.com/cdp-private-cloud-base/7.1.8/running-spark-applications/topics/spark-python-path-variables-livy.html
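A minimal sketch of setting those two parameters, assuming python3 is installed at /usr/bin/python3 on all nodes (the path and script name are placeholders):

```shell
# Point the client side at the Python 3 interpreter.
export PYSPARK3_PYTHON=/usr/bin/python3

# Pass the same interpreter to the YARN application master.
spark3-submit \
  --conf spark.yarn.appMasterEnv.PYSPARK3_PYTHON=/usr/bin/python3 \
  my_script.py
```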
10-02-2023
11:13 PM
Hi @pranav007 The required setup is a little bit complex. You can try copying the core-site.xml, hdfs-site.xml, yarn-site.xml, hive-site.xml, mapred-site.xml, and krb5 configuration files to the resources folder. In the Spark submission, you need to add two parameters, spark.driver.extraJavaOptions and spark.executor.extraJavaOptions, specifying the krb5 file location: --conf spark.driver.extraJavaOptions="-Djava.security.krb5.conf=KRB5_PATH" \
--conf spark.executor.extraJavaOptions="-Djava.security.krb5.conf=KRB5_PATH" \
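Put together, a sketch of the full submission from an external node, assuming the Hadoop client XMLs and krb5.conf have already been copied locally (all paths and the class/jar names are placeholders):

```shell
# Point the Spark client at the copied Hadoop configuration files.
export HADOOP_CONF_DIR=/path/to/copied/conf

# Submit with the krb5.conf location passed to both driver and executors.
spark-submit \
  --master yarn \
  --conf spark.driver.extraJavaOptions="-Djava.security.krb5.conf=/path/to/krb5.conf" \
  --conf spark.executor.extraJavaOptions="-Djava.security.krb5.conf=/path/to/krb5.conf" \
  --class com.example.MyApp \
  myapp.jar
```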
10-02-2023
10:58 PM
There are two solutions you can try: 1. Create one more shell operator, perform kinit, and then submit your Spark job. 2. Pass the keytab and principal to spark-submit.
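Both options sketched below; the principal, keytab path, and script name are placeholders:

```shell
# Option 1: obtain a Kerberos ticket first, then submit normally.
kinit -kt /etc/security/keytabs/user.keytab user@EXAMPLE.COM
spark-submit --master yarn my_job.py

# Option 2: let spark-submit manage the ticket itself (also handles
# renewal for long-running jobs).
spark-submit --master yarn \
  --principal user@EXAMPLE.COM \
  --keytab /etc/security/keytabs/user.keytab \
  my_job.py
```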
09-26-2023
02:42 AM
@Emanuel_MXN Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution; it will make it easier for others to find the answer in the future.