Continuous Delivery - Build true pipelines with Jenkins
Over the last few days, I built a continuous delivery pipeline solution for the project, based on open-source Jenkins and its vast arsenal of plugins.
The goal: after developers check in their code, the rest of the development cycle is ALL automated, or outsourced to someone other than the developers themselves.
Using Subversion, Jenkins checks out the code as soon as developers commit. Five pipeline stages show the progress - build, unit test, integration and functional test, code quality assurance with Sonar, and finally deployment, which is the only manual step that needs a human being (no need to be a super intelligent one) involved.
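The flow of those stages can be sketched as a simple sequence where the first failure stops the pipeline. This is only an illustration; the stage runner and its callback are hypothetical placeholders, not real Jenkins jobs.

```python
# The five stages from the post, run in order for one SVN revision.
STAGES = ["build", "unit test", "integration and functional test",
          "code quality (Sonar)", "deployment"]

def run_pipeline(revision, run_stage):
    """Run each stage in order; stop at the first failure.

    `run_stage(name, revision)` is a placeholder callback returning True on
    success. Returns the list of stages that passed, so the pipeline board
    can show how far this revision got.
    """
    passed = []
    for stage in STAGES:
        if not run_stage(stage, revision):
            break
        passed.append(stage)
    return passed

# Example: everything passes except Sonar, so deployment never runs.
print(run_pipeline(1234, lambda stage, rev: stage != "code quality (Sonar)"))
```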
At the start of each pipeline, the build shows its SVN revision number. When any test fails, developers can trace down which revision caused the failure, and who committed it. This prevents blame games and dog-fights among developers.
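Tracing a failure back to a revision boils down to finding the first build that went red after a green one. Here is a minimal sketch under assumed data; the build records and their fields are made up for illustration.

```python
# Hypothetical build history: each entry records the SVN revision that
# triggered the build, who committed it, and whether the tests passed.
builds = [
    {"revision": 101, "author": "alice", "passed": True},
    {"revision": 102, "author": "bob",   "passed": True},
    {"revision": 103, "author": "carol", "passed": False},
    {"revision": 104, "author": "dave",  "passed": False},
]

def first_failing(builds):
    """Return the first build that failed right after a passing build."""
    for prev, cur in zip(builds, builds[1:]):
        if prev["passed"] and not cur["passed"]:
            return cur
    return None

culprit = first_failing(builds)
print(culprit["revision"], culprit["author"])  # 103 carol
```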
Both developers and testers watch the pipeline screen. When a drop date or a hot fix comes in, testers just need to find the SVN revision they care about, check that the pipeline matching that revision has passed ALL the tests, then trigger the deployment button at the last stage of that successful build. They input ONLY the environment they want to deploy to, and the automated deployment script grabs the already-built artifacts and deploys them to DEV, SIT, SVT, PRE-PROD, whatever you could name ...
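That one manual step can be sketched as a parameterized deploy that refuses to run unless the revision's pipeline is fully green. The environment names come from the post; the host names and function signatures are assumptions for illustration.

```python
# Placeholder environment-to-host mapping; the real topology would differ.
ENVIRONMENTS = {"DEV": "dev-host", "SIT": "sit-host",
                "SVT": "svt-host", "PRE-PROD": "preprod-host"}

def deploy(revision, env, passed_all_stages):
    """Deploy a revision's artifacts, but only if every stage is green.

    `passed_all_stages(revision)` is a hypothetical check against the
    pipeline board; the tester supplies only the target environment.
    """
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env}")
    if not passed_all_stages(revision):
        raise RuntimeError(f"revision {revision} has not passed all stages")
    return f"deploying r{revision} artifacts to {ENVIRONMENTS[env]}"

print(deploy(1234, "SIT", lambda rev: True))
```

The point of the guard is that the same already-built artifacts go to every environment; nothing is rebuilt at deploy time.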
Thanks to Jenkins' master and slave design, build and test tasks are distributed to any slave node as it becomes available. Basically, you can create unlimited slave nodes in a virtual machine environment as your project grows.
CI is the only reliable source through which I can trust the code. It is the infrastructure of the development super highway, and should be built at the very start, before any test or production code is written.
For the deployment itself, we use the plink and pscp tools on Windows, calling the startup and shutdown scripts in Tomcat (we have to shut Tomcat down first, to keep a copy of the old release).
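Those steps can be sketched as an ordered command list: stop Tomcat, back up the old release, push the new artifact, start Tomcat. The sketch only builds the commands; the host, paths, and backup naming are assumptions, not the project's real layout.

```python
def deploy_commands(host, war, webapps="/opt/tomcat/webapps",
                    tomcat="/opt/tomcat"):
    """Build the plink/pscp command sequence for one Tomcat deployment.

    Returns the commands as strings rather than executing them, so the
    ordering (and the backup step) can be inspected and tested.
    """
    app = war.rsplit("/", 1)[-1]
    return [
        # stop Tomcat before touching the webapp
        f"plink {host} {tomcat}/bin/shutdown.sh",
        # keep a copy of the old release so we can roll back
        f"plink {host} cp -r {webapps}/{app} {webapps}/{app}.bak",
        # push the freshly built artifact from the Windows build box
        f"pscp {war} {host}:{webapps}/{app}",
        f"plink {host} {tomcat}/bin/startup.sh",
    ]

for cmd in deploy_commands("deploy@sit-host", "target/myapp.war"):
    print(cmd)
```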