AGILE MATURITY LEVELS AND THEIR CORRELATION TO INCREASED THROUGHPUT
Anil Varre
Lyudmila Mishra
Agenda
 Agile Maturity
 Project Overview
 Technology Stack
 Project Goals and Challenges
 Agile Tool Set
 Process to Achieve Maturity
 Conclusion
Agile Maturity
• The core goal of an Agile team should be to build a deployable product in every iteration, through all the phases.
• Maturity Levels as defined by us:
  Level 1 = Continuous Build
    • Unit Tests and Code Quality Tests with every commit
  Level 2 = Level 1 + Continuous Deployment
    • All components are rebuilt and release candidates are deployed to a systems integration environment at least once a day.
  Level 3 = Level 2 + Continuous Testing
    • Automated integration tests, smoke tests, security scans and performance tests run after every deployment to the systems integration environment.
  Level 4 = Level 3 + Continuous Release
    • Automated promotion of release candidates to a UAT/Pre-Prod environment with regular sign-offs is what defines “Done”.
Project Overview


Nature: Complete Re-Architecture of 10 year old
platform
Size :
 60+
team size (developers, content, design, QA)
 Duration 2 Years
 Multi-Million Dollar Budget

Team Distribution - Onsite, Offshore, and
Specialists/Consultants
Project Goals
Business Goals
 Better, Faster, Cheaper system and its delivery
Quality Goals
 Feasibility
 Accountability
 Functionality
 Testability
 Maintainability
 Simplicity
 Scalability
 Extensibility
Project Challenges
 Distributed Teams
 Organization Culture
 Poor reliability in terms of
   Functionality
   Delivery Time
   Quality
 Varied Technology Stack
Technology Stack
Complex Problem -> Simple Solution
• The Problem Statement:
  Define a way to deliver HIGH QUALITY, VALUABLE software in an EFFICIENT, FAST, RELIABLE fashion.
• The Solution Adopted:
  Develop a Build, Deploy, Test and Release process that
   is fully automated
   produces release-worthy artifacts regularly
   is repeatable and reliable
   provides quick feedback on functionality, quality, and performance
   increases visibility into the health of the project
   clearly articulates team velocity and cadence
Tool Set Adopted
• Project Management and Issue Tracking
  – JIRA
• Source Control Management
  – SVN
• Build and Deployment Tools
  – ANT
  – Shell Script Suite
  – Hudson
• Code Coverage
  – JUnit
  – Cobertura
• Code Review/Quality
  – PMD
  – CPD
  – JDepend
  – Java NCSS (Non Commenting Source Statements)
  – CCN (Cyclomatic Complexity Number, also known as the McCabe metric)
  – Veracode
• Code Testing
  – JUnit
  – Selenium
  – Fitnesse
  – JMeter
  – DTM Data Generator
• Database Change Management
  – DBDeploy
• Dependency Management
  – Ivy
  – Artifactory
  – Tattletale
• Code Profiling
  – JProfiler
• Productivity
  – Custom Eclipse Settings
  – Version Controlled Developer Image
Hudson
• Continuous Integration
• Project Health Dashboard
• Code Metrics and Quality
• Continuous Builds & Deploy
• Create New Projects
• JBoss Restarts
• Migrate Databases
• Running & Scheduling Batch Jobs
• Test Data Management in Database
• Deploying Content
• Single Push-Button Deploy
• Selenium Regression Tests
• Leader Board
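Most of these capabilities can be exposed to Hudson as thin wrappers over scripted operations. A minimal sketch, assuming each operation is an Ant target that a Hudson job runs through its Ant build step; the host, user, paths and target name below are illustrative assumptions, not details from the deck:

    <project name="ops-jobs" default="restart-jboss" basedir=".">
      <!-- Hypothetical target behind the "JBoss Restarts" Hudson job -->
      <property name="si.host"     value="si-app01"/>
      <property name="deploy.user" value="deploy"/>

      <target name="restart-jboss">
        <!-- stop the remote JBoss instance on the systems integration host -->
        <exec executable="ssh" failonerror="true">
          <arg value="${deploy.user}@${si.host}"/>
          <arg value="/opt/jboss/bin/shutdown.sh -S"/>
        </exec>
        <sleep seconds="20"/>
        <!-- start it again in the background -->
        <exec executable="ssh" failonerror="true">
          <arg value="${deploy.user}@${si.host}"/>
          <arg value="nohup /opt/jboss/bin/run.sh -b 0.0.0.0 > /dev/null 2>&amp;1 &amp;"/>
        </exec>
      </target>
    </project>

Keeping each Hudson job as a one-line call into a version-controlled script like this keeps the job configuration trivial and the real logic in source control.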
Level 1 – Achieving Continuous Build
The Practices
 Every developer runs pre-commit tests on a local environment.
 On successful pre-commit tests, the developer commits code to the central SCM repository.
 The CI server (Hudson) triggers a new build when a new revision is detected.
 The CI server repeats the unit tests, code review and code analysis, publishes reports for the project and makes them visible to all.
 On successful tests, new source packages, binaries, javadocs, etc. are published to a common repository so they may be used by downstream components.
 The CI server continues to build downstream components to ensure working software exists.
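A minimal sketch of what such a stage 1 build could look like in Ant (the build tool named two slides later). The directory layout, property names and target names are assumptions made for the example, and the Cobertura/PMD wiring is omitted; it only shows the compile, unit-test, publish flow described above:

    <project name="component-build" default="stage1" basedir=".">
      <property name="src.dir"    value="src"/>
      <property name="test.dir"   value="test"/>
      <property name="build.dir"  value="build"/>
      <property name="report.dir" value="${build.dir}/reports"/>

      <!-- junit.jar (and, in the real setup, the Cobertura/PMD task jars) assumed in lib/ -->
      <path id="compile.classpath">
        <fileset dir="lib" includes="*.jar"/>
      </path>

      <target name="compile">
        <mkdir dir="${build.dir}/classes"/>
        <javac destdir="${build.dir}/classes" debug="true">
          <src path="${src.dir}"/>
          <src path="${test.dir}"/>
          <classpath refid="compile.classpath"/>
        </javac>
      </target>

      <target name="unit-test" depends="compile">
        <mkdir dir="${report.dir}"/>
        <junit haltonfailure="true" printsummary="yes">
          <classpath>
            <path refid="compile.classpath"/>
            <pathelement location="${build.dir}/classes"/>
          </classpath>
          <formatter type="xml"/>
          <batchtest todir="${report.dir}">
            <fileset dir="${test.dir}" includes="**/*Test.java"/>
          </batchtest>
        </junit>
      </target>

      <target name="publish" depends="unit-test">
        <!-- package binaries so downstream components can pick them up (e.g. via Ivy/Artifactory) -->
        <jar destfile="${build.dir}/component.jar" basedir="${build.dir}/classes"/>
      </target>

      <target name="stage1" depends="publish"/>
    </project>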
Characteristics of the Build Scripts
 Common build scripts for all components belonging to the same technology type (see the sketch after the next list).
 Checked in to the same repository as the code base.
 Branched and tagged along with the code in order to guarantee repeatable builds.
 Continually evolve along with the codebase.
 Implemented incrementally to fulfill the goals of a specific maturity level.
 Execute in stages as the change flows through the delivery pipeline.
 Complete ideally in 90 seconds, and never in more than 5 minutes.
 Failed builds alert the entire team, including managers and testers, not just developers.
Chose Ant as the build scripting tool
• Quick and easy to develop customized scripts
• Consistent scripting language irrespective of the technology used
• Easy to maintain by a wide variety of team members
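A hedged sketch of the "common build script" characteristic: each component keeps only a tiny build.xml that sets its own properties and imports the shared, version-controlled script. The component, property and file names are illustrative assumptions:

    <project name="pricing-service" default="stage1" basedir=".">
      <!-- component-specific values only; everything else comes from the shared script -->
      <property name="component.name" value="pricing-service"/>
      <property name="component.type" value="java-war"/>
      <!-- the shared targets (compile, unit-test, analyze, publish, deploy) live in one
           place, checked in to the same SVN repository and branched/tagged with the code -->
      <import file="../build-common/common-java-build.xml"/>
    </project>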
Level 2 – Achieving Continuous Deployment
The Practices
 The CI server kicks off the Stage 1 build scripts – this time carefully tagging everything and exporting the tagged source for builds.
 The CI server publishes artifacts as release candidates on a successful build and stage 1 tests.
 The CI server executes the next stage of build scripts to create deployment packages or installers.
 The exact same installer is used for deployment into every environment – nothing is rebuilt from this point forward for a release candidate.
 The CI server reconfigures the systems integration environment using reference implementation scripts for all application containers, web servers, etc. that are checked in to source control.
 The CI server deploys/installs the new packages on the systems integration server.
 All environment-specific configurations are provided as input to the installer using configuration and property files.
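A hedged sketch of this deployment stage, assuming the installer is a zip with an install.sh entry point: the same artifact is pushed to any environment, and everything environment-specific comes from a property file selected by an env flag. Hosts, users, paths and file names are illustrative assumptions:

    <project name="deploy-installer" default="deploy" basedir=".">
      <property name="env"               value="si"/>        <!-- select with -Denv=si|qa|uat -->
      <property name="release.candidate" value="SNAPSHOT"/>  <!-- select with -Drelease.candidate=... -->

      <target name="init-env">
        <!-- defines deploy.user, deploy.host, deploy.dir, jdbc.url, ... for the chosen environment -->
        <property file="config/${env}.properties"/>
      </target>

      <target name="deploy" depends="init-env">
        <exec executable="scp" failonerror="true">
          <arg value="dist/installer-${release.candidate}.zip"/>
          <arg value="${deploy.user}@${deploy.host}:${deploy.dir}/"/>
        </exec>
        <exec executable="ssh" failonerror="true">
          <arg value="${deploy.user}@${deploy.host}"/>
          <arg value="cd ${deploy.dir} &amp;&amp; unzip -o installer-${release.candidate}.zip &amp;&amp; ./install.sh ${env}"/>
        </exec>
        <!-- remember what is on this environment so a rollback can restore the previous candidate -->
        <mkdir dir="state/${env}"/>
        <copy file="state/${env}/current.txt" tofile="state/${env}/previous.txt" failonerror="false"/>
        <echo file="state/${env}/current.txt" message="${release.candidate}"/>
      </target>
    </project>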
Characteristics of the Deployment Scripts
 Minimal deployment instructions, with appropriate levels of automation to reduce the chance of errors.
 Minimal set of deployment tools, to reduce the need for extensive training and documentation.
 Same deployment tools for each component, irrespective of technology, to keep instructions consistent and complexity low.
 Use of standardized reference implementations of software platforms to ensure cohesiveness between all environments.
 Use of automated pre-requisite environment verification testing to reduce environment-based deployment issues.
 Ability to deploy an individual sub-system when possible, without having to alter or update other sub-systems.
 Ability to deploy the entire system by chaining installation of pre-requisite software environments and all sub-systems, so the whole system can be stood up on brand new hardware easily and quickly.
 Ability to use a unified deployment process for each environment, to reduce variability in how changes are applied along the promotion path.
 Ability to roll back changes and restore the previous state of an environment easily and quickly, with few, simple instructions (see the sketch after this list).
 Ability to query the differences between the post-deployment state and the pre-deployment state.
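As a sketch of the roll-back characteristic, the deploy target above can simply be re-invoked with the previously recorded candidate; the state files and names continue the assumptions of the earlier sketch:

    <target name="rollback">
      <!-- previous.txt is written by the deploy target just before each deployment -->
      <loadfile property="previous.candidate" srcFile="state/${env}/previous.txt">
        <filterchain><striplinebreaks/></filterchain>
      </loadfile>
      <antcall target="deploy">
        <param name="release.candidate" value="${previous.candidate}"/>
      </antcall>
    </target>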
Level 3 – Achieving Continuous Testing
The Practices
 The CI server runs stage 2 tests on the systems integration server:
   integration tests,
   acceptance tests,
   performance tests
 On successful tests, installers are marked as ready to promote and archived to the “release staging area”.
 On a pre-defined schedule, the CI server promotes the latest release distribution to a QA Test environment using the exact same deployment packages and automated deployment scripts.
 The CI server repeats the acceptance tests.
 The CI server executes automated regression tests.
 The QA team performs manual exploratory tests.
 Automation remains in pause mode until the next signal from the QA team.
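A minimal sketch of the stage 2 test run, reusing the classpath and report properties from the stage 1 sketch. The test naming convention, the base-URL system property and the JMeter plan name are assumptions made for the example:

    <target name="stage2-tests" depends="integration-tests, performance-tests"/>

    <target name="integration-tests">
      <mkdir dir="${report.dir}/integration"/>
      <junit haltonfailure="true" printsummary="yes">
        <classpath>
          <path refid="compile.classpath"/>
          <pathelement location="${build.dir}/classes"/>
        </classpath>
        <!-- Selenium/Fitnesse-driven JUnit suites read the deployed environment's URL from here -->
        <sysproperty key="app.base.url" value="${si.base.url}"/>
        <formatter type="xml"/>
        <batchtest todir="${report.dir}/integration">
          <fileset dir="${test.dir}" includes="**/*IntegrationTest.java"/>
        </batchtest>
      </junit>
    </target>

    <target name="performance-tests">
      <mkdir dir="${report.dir}/perf"/>
      <!-- non-GUI JMeter run against the systems integration environment -->
      <exec executable="jmeter" failonerror="true">
        <arg line="-n -t perf/smoke-load.jmx -l ${report.dir}/perf/results.jtl"/>
      </exec>
    </target>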
Level 4 – Achieving Continuous Delivery
 Develop a deployment pipeline that satisfies business needs and reflects team cadence.
 Begin with manual steps and then script EVERYTHING incrementally.
 Push-button release to any environment.
 Use schedulers and chained scripts to move releases along the pipeline.
 Manage everything in source control:
   environments and their configurations,
   development, build and test tools,
   build and deployment scripts,
   test scripts,
   documentation.
An example
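A hedged sketch of one link in such a pipeline: a scheduled Hudson job (or one triggered after a QA sign-off) invokes a promotion target that re-deploys the exact installer already verified in systems integration. The staging-area layout and names are illustrative and build on the earlier sketches:

    <target name="promote-to-uat">
      <!-- the candidate id that passed stage 2 testing, recorded in the release staging area -->
      <loadfile property="verified.candidate" srcFile="release-staging/latest-verified.txt">
        <filterchain><striplinebreaks/></filterchain>
      </loadfile>
      <antcall target="deploy" inheritAll="false">
        <param name="env" value="uat"/>
        <param name="release.candidate" value="${verified.candidate}"/>
      </antcall>
      <!-- acceptance tests are then repeated against UAT, e.g. by a downstream Hudson job -->
    </target>

Acceptance and smoke tests are then repeated against the newly promoted environment, exactly as in Level 3.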
Lessons Learned
 Despite the following, we achieved agility:
   No formal CSM/CSP/etc. training for staff
   No dedicated product management per team
   No dedicated resources for project teams
   Not all roles staffed
 A highly self-directed and motivated team is necessary
 Tools and best practices are simply adopted – no management mandate is necessary
 Think big, but act in small incremental steps
Conclusion
 12 releases since project launch on Nov 1st
 Only one trouble ticket from end users since release!
 First attempt with a Veracode scan resulted in an AAA rating
 Performance test results consistently meet business requirements
 100% code coverage all through the project lifecycle
 <5% PMD and CPD violations throughout the project cycle
 Easy to refactor code and extend functionality
 Easy maintenance
 Self-documented architecture and code
 Easy to integrate new members into the team