While developing some code using Structured Streaming, I came across a code generation bug that exists in Spark 2.1.1 but has been resolved in 2.2.0. Looking to make my code work on the current HDP, I saw that the latest Spark release in the Maven repository was 188.8.131.52.6.1.9-1. That version no longer has the bug I was encountering in vanilla Spark 2.1.1.
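For context, this is roughly how I'm pulling the HDP Spark build into my project. The repository URL follows the usual Hortonworks releases layout; the version string here is just a placeholder for whichever hotfix build is published:

```xml
<!-- pom.xml fragment: resolve Spark artifacts from the Hortonworks releases repo -->
<repositories>
  <repository>
    <id>hortonworks-releases</id>
    <url>http://repo.hortonworks.com/content/repositories/releases/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <!-- placeholder: substitute the actual HDP hotfix build version -->
    <version>HDP-HOTFIX-VERSION</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

With `provided` scope the hotfix Spark is only used at compile time, which is why the cluster-side distribution question below matters.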
After looking at our HDP installation, I realized that the only Spark version currently being distributed is 184.108.40.206.6.1.0-129, which sadly still has this bug. I presume the bug was fixed upon a customer request in one of the 9 hotfixes released since the 2.6.1 release. My question is: how would I distribute these hotfix Spark releases using Ambari in a maintainable manner?
Obviously, I could manually download the Spark components from the repository and build a custom installation, but I assume there's an easier way of accessing these artifacts via a package repository.
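One approach I've considered (a sketch, not something I've verified works for hotfix builds) is mirroring the hotfix packages into an internal yum repository and pointing the cluster hosts at it, e.g. with a repo definition like the following. The internal URL is hypothetical:

```ini
# /etc/yum.repos.d/hdp-hotfix.repo
# Hypothetical local mirror holding the hotfix Spark packages
[HDP-hotfix]
name=HDP hotfix packages (local mirror)
baseurl=http://repo.internal.example.com/hdp-hotfix/
enabled=1
gpgcheck=0
```

But I'm unsure whether Ambari would pick such a repository up cleanly, or whether the repository URLs need to be changed in Ambari itself so it remains the source of truth. Is there a supported way to do this?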