Local Apache Spark Context from Apache Storm

Guru

Has anyone tried to create a local Spark context within a Storm bolt to load a saved Spark model, instead of building the model from exported weights or PMML? There seems to be a Log4j dependency conflict between Storm and Spark.
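
For reference, a minimal sketch of the pattern being asked about, not a recommendation: a bolt that spins up a local SparkContext in prepare() and loads a previously saved MLlib model. The model type, tuple schema, and HDFS path below are hypothetical placeholders.

```java
import java.util.Map;

import org.apache.storm.task.OutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseRichBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.classification.LogisticRegressionModel;
import org.apache.spark.mllib.linalg.Vectors;

public class SparkModelScoringBolt extends BaseRichBolt {

    // Created in prepare(), so they are not serialized with the topology.
    private transient JavaSparkContext sc;
    private transient LogisticRegressionModel model;
    private OutputCollector collector;

    @Override
    public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
        // In-process, local-mode Spark context -- this is where the
        // Log4j/SLF4J conflict with Storm's classpath tends to surface.
        SparkConf sparkConf = new SparkConf()
                .setMaster("local[1]")
                .setAppName("storm-bolt-scoring");
        this.sc = new JavaSparkContext(sparkConf);
        // Hypothetical path to a model previously saved with model.save(...)
        this.model = LogisticRegressionModel.load(sc.sc(), "hdfs:///models/lr-model");
    }

    @Override
    public void execute(Tuple input) {
        // Assumes the upstream spout/bolt emits a double[] under "features".
        double[] features = (double[]) input.getValueByField("features");
        double score = model.predict(Vectors.dense(features));
        collector.emit(input, new Values(score));
        collector.ack(input);
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("score"));
    }
}
```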

1 ACCEPTED SOLUTION

Guru

Just looked through the Metron project, and none of the POM files seem to have a reference to Spark. I bet they are either using PMML or exporting weights. I did a bit more reading as well, and the more I think about it, the more it seems that pattern is just not a great idea. Thanks for your input.
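
For reference, the exported-weights approach mentioned here needs no Spark dependency in the bolt at all: the coefficients are saved out of Spark and applied directly. A minimal sketch for a logistic regression model, with illustrative field names:

```java
// Scores one feature vector using coefficients exported from a trained model,
// e.g. written out as a plain text file by the Spark training job.
public class ExportedLogisticRegressionScorer implements java.io.Serializable {

    private final double[] weights;
    private final double intercept;

    public ExportedLogisticRegressionScorer(double[] weights, double intercept) {
        this.weights = weights;
        this.intercept = intercept;
    }

    /** Returns the probability of the positive class for one feature vector. */
    public double score(double[] features) {
        double margin = intercept;
        for (int i = 0; i < weights.length; i++) {
            margin += weights[i] * features[i];
        }
        return 1.0 / (1.0 + Math.exp(-margin)); // logistic (sigmoid) function
    }
}
```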


4 REPLIES

Master Mentor

Have you seen this thread, Vadim? https://community.hortonworks.com/questions/24092/how-to-use-spark-mllib-model-in-storm.html

Also, if you're having dependency issues, you can use an exclusions tag in pom.xml to resolve them. https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html
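
For example, a hedged pom.xml snippet along these lines, assuming the conflicting Log4j binding comes in transitively through the Spark artifact (the artifact ID and version are only illustrative):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-mllib_2.11</artifactId>
  <version>1.6.2</version>
  <exclusions>
    <!-- Drop Spark's Log4j binding so Storm's logging setup wins on the classpath -->
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```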

Guru

Yes, and that is how I am applying my models to the demos I have built so far. I was interested in whether it is possible to create a Spark context in a Storm bolt. Sounds like the answer might be no. Is it?


@Vadim check out the Metron project; they are doing Spark model scoring in Storm, if I remember correctly from SKO: https://github.com/apache/incubator-metron
