Member since
11-12-2018
218
Posts
179
Kudos Received
35
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1259 | 08-08-2025 04:22 PM |
| | 1642 | 07-11-2025 08:48 PM |
| | 2589 | 07-09-2025 09:33 PM |
| | 1543 | 04-26-2024 02:20 AM |
| | 2151 | 04-18-2024 12:35 PM |
08-31-2022
09:15 PM
Hi @nvelraj The PySpark job works locally because the pandas library is installed on your local system. When you run it on the cluster, the pandas module is not available on the worker nodes, so you get the following error: ModuleNotFoundError: No module named 'pandas' To solve the issue, install the pandas library/module on all machines, or ship a virtual environment with the job.
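One common way to avoid installing pandas on every node is to pack a virtual environment and ship it with the job, as described in the Spark documentation on Python package management. A minimal sketch, assuming YARN and the third-party `venv-pack` tool; the archive name and the job file `your_job.py` are illustrative, not from the original post:

```shell
# Build a virtualenv containing pandas and pack it into a relocatable archive.
# venv-pack is a third-party tool; names and paths here are illustrative.
python3 -m venv pyspark_venv
source pyspark_venv/bin/activate
pip install pandas venv-pack
venv-pack -o pyspark_venv.tar.gz

# Ship the archive with the job. YARN unpacks it next to each executor under
# the alias "environment", and PYSPARK_PYTHON points Spark at that Python.
export PYSPARK_PYTHON=./environment/bin/python
spark-submit \
  --archives pyspark_venv.tar.gz#environment \
  --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=./environment/bin/python \
  your_job.py
```

This keeps the cluster nodes untouched: each executor gets its own copy of the packed environment for the lifetime of the job.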
08-12-2022
04:49 AM
To solve "unable to find valid certification path to requested target", I imported the certificate into the Java truststore and restarted the Zeppelin server.

### LINUX LIST CERT
cd /usr/lib/jvm/java-11-openjdk-11.0.15.0.10-2.el8_6.x86_64/bin
./keytool -list -keystore /usr/lib/jvm/java-11-openjdk-11.0.15.0.10-2.el8_6.x86_64/lib/security/cacerts

### LINUX IMPORT CERT
./keytool -import -alias keystore_cloudera -file /var/lib/cloudera-scm-agent/agent-cert/cm-auto-global_cacerts.pem -keystore /usr/lib/jvm/java-11-openjdk-11.0.15.0.10-2.el8_6.x86_64/lib/security/cacerts
08-02-2022
06:34 PM
@Asim- With JDBC you also need HWC for managed tables. Here is an example for Spark 2; as mentioned earlier, there is no way to connect to Hive ACID tables from Apache Spark other than HWC, and HWC is not yet a supported feature for Spark 3.2 / CDS 3.2 in CDP 7.1.7. Marking this thread closed; if you have any issues related to external tables, kindly start a new Support-Questions thread for better tracking of the issue and documentation. Thanks
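Since the Spark 2 example itself does not appear above, here is a hedged sketch of what launching spark-shell with the Hive Warehouse Connector typically looks like on CDP. The assembly jar version, HiveServer2 JDBC URL, and metastore URI are placeholders that depend on your cluster, not values from this thread:

```shell
# Sketch only: launch spark-shell with HWC on Spark 2.
# <version>, <hs2-host>, and <metastore-host> are placeholders for your cluster.
spark-shell \
  --jars /opt/cloudera/parcels/CDH/lib/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://<hs2-host>:10000/default" \
  --conf spark.datasource.hive.warehouse.metastoreUri="thrift://<metastore-host>:9083" \
  --conf spark.security.credentials.hiveserver2.enabled=false

# Then, inside spark-shell (Scala), read a managed ACID table through HWC:
#   import com.hortonworks.hwc.HiveWarehouseSession
#   val hive = HiveWarehouseSession.session(spark).build()
#   hive.executeQuery("SELECT * FROM managed_table LIMIT 10").show()
```

The key point is that reads of managed tables go through `HiveWarehouseSession`, not through `spark.sql`, which only sees external tables.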
08-01-2022
10:58 PM
@paulo_klein, Have any of the replies helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
07-29-2022
06:51 PM
1 Kudo
Hi @paulo_klein, Apache Spark by default requests a delegation token for all four services: HDFS, YARN, Hive, and HBase. The message is printed as a WARN but does no harm. It occurs because no HBase jars are on the Spark classpath, so Spark is unable to get the HBase delegation token. To fix this, start spark-shell, pyspark, or spark-submit with --conf spark.security.credentials.hbase.enabled=false Example: # spark-shell --conf spark.security.credentials.hbase.enabled=false
07-26-2022
07:06 AM
@jeyaguna Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future. Thanks
07-17-2022
11:50 PM
@Seedy Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
07-11-2022
11:12 PM
@naymar, Has the reply helped resolve your issue? If so, can you kindly mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future?
07-10-2022
10:25 PM
@mamoune, Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
07-05-2022
11:51 PM
Hi @jagadeesan, I tried that as well, nothing worked.