Created on 03-11-2021 02:26 PM - edited on 03-11-2021 08:53 PM by subratadas
The following steps can be used to configure IntelliJ to run Spark jobs on the Cloudera Data Engineering (CDE) experience. This lets developers test their jobs on CDE without leaving the IDE. The steps below are shown for IntelliJ installed on a Windows PC.
Complete setting up CDE CLI with Git Bash as per this article.
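For reference, the CDE CLI reads its configuration from a YAML file (by default `~/.cde/config.yaml`, or the file pointed to by the `CDE_CONFIG` environment variable). A minimal sketch is shown below; the user name and endpoint URL are placeholders, not real values:

```yaml
# ~/.cde/config.yaml -- placeholder values, substitute your own
user: jsmith
vcluster-endpoint: https://x9b4q5vz.cde-abc123.example.cloudera.site/dex/api/v1
```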
In IntelliJ, go to File > Settings > Tools > Terminal and change the "Shell path" to the following: "C:\Program Files\Git\bin\bash.exe" --login -i
Test this setting by launching a terminal from within the IntelliJ project and running CDE CLI commands. If you are unable to run CDE CLI commands because the CDE_CONFIG environment variable is not configured, you will have the option to set or override it in the next step.
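As an illustration, a quick smoke test you might run in the IntelliJ terminal could look like the following; the config path here is an example, not a required location:

```shell
# Point the CLI at a specific config file if it is not in the default location
export CDE_CONFIG=/c/Users/jsmith/.cde/config.yaml

# List jobs on the virtual cluster; a successful response confirms the CLI,
# credentials, and endpoint are wired up correctly
cde job list
```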
Add a Run/Debug configuration using a new Shell script with the following details. Ignore the error stating that the Shell script is not found. Here you can override the CDE_CONFIG environment variable (for example, to submit to a different cluster than the system default), and use environment variables to supply arguments to your Spark job:
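For example, the script text of the Run/Debug configuration might look like the sketch below, with `SPARK_ARGS` defined as an environment variable in the same configuration. The artifact path, class name, resource flags, and the `SPARK_ARGS` variable name are all illustrative, not prescribed by CDE:

```shell
# Submit the locally built artifact to CDE; flags and values are examples
cde spark submit ./target/my-spark-job-1.0.jar \
  --class com.example.MySparkJob \
  --executor-cores 2 \
  --executor-memory 4g \
  $SPARK_ARGS
```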
Run the cde spark submit from the Run/Debug menu. You should see the job run in the terminal window in IntelliJ.