11-25-2015 02:46 PM
I'm not entirely sure if this question should be here or in the getting started board. If I'm in the wrong place please let me know.
Anyway, my organisation has Cloudera as an "innovation" deployment where everyone develops against a single instance of most of the components of the Cloudera stack. There's a lot of manual activity involved in getting software written and tested and artefacts prepared for potential deployment.
Naturally, there's a desire to shift to a continuous delivery method as we grow into a "production" deployment.
How do you do continuous delivery?
At the moment I'm thinking of an environment where each developer has a personal development environment (i.e. a Cloudera instance plus an integrated development environment) that is hydrated (and dehydrated) and configured on demand, with all artefacts held in a permanent Team Foundation Server source control repository. Code is created and unit tests are run in this space: small scale and no integration.
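For the hydrate/dehydrate step, one option is driving it through the Cloudera Manager REST API's cluster start/stop commands. The sketch below is purely illustrative: the host, cluster name, and API version are placeholder assumptions, not anything from this thread, and it defaults to a dry run that prints the request instead of sending it.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: "hydrating"/"dehydrating" a personal Cloudera
# instance via the Cloudera Manager REST API cluster commands.
# CM_HOST, CLUSTER and the API version are assumed placeholder values.
set -euo pipefail

CM_HOST="${CM_HOST:-cm.example.org:7180}"   # assumed Cloudera Manager host
CLUSTER="${CLUSTER:-dev-cluster}"           # assumed per-developer cluster name
DRY_RUN="${DRY_RUN:-1}"                     # 1 = print the request, don't send it

cm_command() {
  # POST a cluster-level command (start or stop) to Cloudera Manager.
  local cmd="$1"
  local url="http://$CM_HOST/api/v19/clusters/$CLUSTER/commands/$cmd"
  if [ "$DRY_RUN" = "1" ]; then
    echo "POST $url"
  else
    curl -s -X POST -u "$CM_USER:$CM_PASS" "$url"
  fi
}

hydrate()   { cm_command start; }
dehydrate() { cm_command stop; }

hydrate
# ... edit code and run unit tests in this space ...
dehydrate
```

Each developer could run this on demand, so the environment only exists (and consumes resources) while they are actually working in it.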
On a half-day schedule the integration environment will be hydrated, configured, and the artefacts generated. The integration test suite kicks off and produces reports for developers, which are checked back into source control. Finally the integration environment is dehydrated.
If all the tests succeed the next level of assurance is performed in its own environment, etc; until production deployment (manual step).
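The half-day cycle above could be orchestrated as a simple gated pipeline. In this sketch every stage is a placeholder function standing in for real tooling (Cloudera Manager calls, build scripts, a TFS check-in); none of the names come from the thread, and the `trap` guarantees the environment is dehydrated even if the suite fails.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the scheduled integration cycle:
# hydrate -> build -> test -> publish reports -> dehydrate,
# promoting only on success. All stage bodies are placeholders.
set -euo pipefail

hydrate()    { echo "hydrate: provision and configure integration environment"; }
build()      { echo "build: generate deployable artefacts"; }
test_suite() { echo "test: run integration suite, write reports/"; }
publish()    { echo "publish: check test reports back into source control"; }
dehydrate()  { echo "dehydrate: tear the environment down"; }

# Dehydrate however the run ends, so a failed suite never
# leaves the shared integration environment up.
trap dehydrate EXIT

hydrate
build
if test_suite; then
  publish
  echo "PASS: promote to the next assurance level"
else
  echo "FAIL: stop the pipeline, keep the reports for developers"
fi
```

The same script shape would repeat for each assurance level, with the final production deployment left as the manual step described above.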
Is this practical/doable?
I'm also interested in how you do testing and integrate source control; but I'll hold that conversation for another thread topic.
01-07-2016 05:24 AM
I have the same question and have been unable to find any information on this. Hopefully this will kick-start a conversation about how others are managing their code within a Hadoop environment.
01-07-2016 07:05 AM
02-19-2019 10:36 AM
...and that last link is also dead.
So much of this stuff is roll-your-own. The way forward will be to see how TFS is integrated into existing development platforms (i.e. how everyone is already doing it) and then try to replicate that.