
Paul Gerrard
paul@gerrardconsulting.com
gerrardconsulting.com
@paul_gerrard
Agile Test Strategy
Overview
• What is Agile Test Strategy?
• Project Profiling
• (Test Strategy as) Agile Interventions
• Test Automation
• What's Left?
• Summary
• Q&A
What is Agile Test Strategy?
Agile Strategy – an oxymoron?
Agile Test Strategy
• What do we mean by this?
1. AGILE test strategy – how to create a test strategy in an Agile way?
2. Agile TEST STRATEGY – a test strategy for an Agile project?
• We'll look at how we created an Agile approach to strategy, but we'll spend more time on strategy for an Agile project.
Google “Agile Test Strategy”
• There are plenty of recipes out there
• Most offer a selection of techniques but don't provide much guidance on how to blend them
• We need to know how to make choices, not just know what choices exist
• Strategy is a thought process, not a document
– Although you might document the ideas for reuse, as a reminder or 'for the record'.
Agile governance
• Governance is the act of governing. It relates to decisions that define expectations, grant power, or verify performance (Wikipedia)
• Define expectations – DEFINITION of need
• Grant power – DELEGATION to a project team
• Verify performance – ASSURANCE of solution.
Strategy helps you decide what to do
A. The strategy presents some decisions that can be made ahead of time
B. It defines the process, method or information that will allow decisions to be made in-project
C. It sets out the principles (or process) to follow for uncertain situations or unplanned events
• In a structured/waterfall environment, most questions are answered off-the-shelf – "A-style, ready for anything"
• In an Agile environment we might have some ready-made policies, but mostly we manage scope and adapt (C?)
Contexts of Test Strategy
[Mind-map diagram: 'Test Strategy' at the centre, surrounded by the contexts that shape it – axioms, early testing, risks, goals, culture, human resources, contract, automation, constraints, process (or lack of it), de-duplication, opportunities, skills, environment, communication, timescales, user involvement and artefacts.]
Traditional v Agile test strategy
• Traditional – structured, goal/risk-driven
– Identify stakeholders; what are their goals?
– Product risk analysis
– Allocate risks/goals to test stages
– Formulate test stage definitions (entry/exit criteria, environments, tools etc.)
• Agile – interventionist, consensus-driven
– Project profiling to set the testing theme
– Identify testing interventions (perhaps better, contributions) in the Agile process
– Test policy overlays the process; catches exceptions.
Project Profiling
Select a profile for your project first, then choose the aspects of test strategy that suit your project
Template-driven? Bah!
• So this is just a template copy-and-edit process?
• Won't you always end up with the same document?
• Profiling doesn't need to be prescriptive
– No need to write a document if you don't need to
– But if company policy or common sense dictates certain approaches, save yourself some time
– Create a set of deeper, more detailed questions to be answered (Pocketbook)
• Profilers are really just checklists: heuristic guidelines designed to help you make choices and trade-offs.
Using a Project Profiler to Derive a Test Strategy and Project Plan
(A government client example)
[Diagram: the Project Manager works through the Project Profiler and Risk Profiler, in consultation with the business, project team and Boards. The outputs are a Waterfall Test Strategy (Cerise/Orange) or a SCRUM/Agile Test Strategy (Green), identified product risks and unknowns (Blue), and test plan items for the Project Plan – tools, test environment, incident management, story guideline and assurance.]
The Project Profiler (with Test Assurance) helps Project Managers to:
• Select a project style that fits (Waterfall or Agile)
• Identify the product risks that need testing
• Identify test activities to include in project plans
• Carefully define the scope of the project
Project profiling process
1. Have the information you need to hand.
2. Project Profiler (section 3): select the statements that best match your project context. The Blue column indicates that you need more information – consult your stakeholders, team or relevant Board(s).
3. Generic Risk Profiler (section 4): consider the generic project risks – which are significant for your project? Can testing help?
4. Product Risk Profiler (section 5): consider the functional and non-functional risks that most concern your stakeholders – should they be tested?
5. Actions and Test Activities (section 6): consider the actions that prompt your 'next steps' and the test activities that should be incorporated into your project plan.
6. Create your Test Strategy from the Test Strategy Framework Template.
7. Incorporate the activities from step 5 and those identified in step 6 into your Project Plan.
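To make steps 2–7 a little more concrete, here is a minimal sketch – not from the deck – of a hypothetical helper that tallies the colours chosen against each project aspect in step 2 and suggests a project style, flagging any Blue answers for further investigation. The aspect names and colour meanings follow the profiler described here; the function and data structure are invented for illustration.

from collections import Counter

# Project styles implied by the profiler colours (see 'Project types - examples').
STYLE = {
    "Cerise": "Structured / waterfall (including COTS)",
    "Orange": "SCRUM used formally, with dedicated BA and Tester roles",
    "Green": "SCRUM used less formally, without dedicated BA/Tester roles",
}

def suggest_style(selections):
    """selections: dict mapping each project aspect to the colour of the chosen statement."""
    counts = Counter(selections.values())
    if counts.get("Blue", 0):
        # Blue means insufficient information - consult stakeholders, team or Board(s) first.
        return "Investigate further: one or more aspects are Blue"
    colour, _ = counts.most_common(1)[0]
    return STYLE[colour]

print(suggest_style({
    "Responsibility for Acceptance": "Orange",
    "Requirements (Sources of Knowledge)": "Orange",
    "Requirements Stability": "Green",
    "Visibility, Formality": "Orange",
    "External Dependencies": "Green",
}))   # -> SCRUM used formally, with dedicated BA and Tester roles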
Project Profiler (part of section 3)
Project Aspect: Responsibility for Acceptance
– Cerise: Users will take responsibility for UAT; they have UAT experience
– Orange: Users will be responsible for UAT but have no test experience
– Green: Users will take part in UAT or witness tests at critical periods, and will review the outcome
– Blue: Users are unwilling or unable to take part in UAT; reluctant to make the acceptance decision, or not known

Project Aspect: Requirements (Sources of Knowledge)
– Cerise: New system replaces a well-understood existing system; users have a clear vision of the system goals and prefer to document their requirements up front
– Orange: Users want to collaborate to jointly define requirements and meet them incrementally or iteratively
– Green: Users put the onus of requirements elicitation on the project; requirements and the solution will evolve
– Blue: Inexperienced users who are unable or unwilling to collaborate with requirements gathering

Project Aspect: Requirements Stability
– Cerise: New system is a functional replacement of an existing system or a well-defined process (requirements can be fixed early on)
– Orange: New system replaces an existing system with enhancements or an established (but not necessarily documented) process
– Green: New system supports a new business need; the business process exists but will change/evolve; users have experience of the requirements
– Blue: New system supports a new business need; the business process is not yet known; users have no experience or requirements

Project Aspect: Visibility, Formality
– Cerise: High visibility/risk to the general public; formal progress reporting required at board level; fixed scope and deliverables; formal approvals and sign-offs
– Orange: High visibility/risk to the business; formal progress reporting required; some defined deliverables, some deliverables will emerge/evolve; some approvals and sign-offs
– Green: Relatively low business risk; informal progress reporting is acceptable; a partial solution may suffice; incremental/iterative delivery
– Blue: Potentially a high-visibility, high-risk project with uncertain impact on the business

Project Aspect: External Dependencies
– Cerise: More than one, or new, external suppliers responsible for development (and supplier testing)
– Orange: Single, known supplier responsible for development (and supplier testing)
– Green: In-house development, no external dependencies
– Blue: Dependencies on external suppliers; their responsibilities or competence not yet known

Etc.
Project types - examples
• Cerise – structured, waterfall style of project (includes COTS projects)
• Orange – iterative/prototyping style of project using SCRUM in a formal way, with dedicated resources for the Business Analyst and Tester roles
• Green – a project using SCRUM in a less formal way, without dedicated resources for the Business Analyst and/or Tester roles
• Blue – statements in the Blue column describe where there is insufficient information available to identify the style of project; the recommendation must be that further investigation is required.
(Test Strategy as) Agile Interventions
I'm using Scrum/Sprint terminology, but you don't have to, of course
Interventions (government client example)
• On the following slides, we highlight 8 interventions
• Some are test phases, but some aren't
• These activities are repeated for each Sprint iteration

1. Story Challenge – as stories are added to the Product Backlog
2. Story Definition – as stories are added to a Sprint Backlog
3. Daily Stand-Up – once per day during the Sprint
4. Story Refinement – throughout the Sprint, as new information emerges
5. Developer Testing – throughout the Sprint, as the developer codes the stories
6. Integration (and incremental System) Testing – during and at the end of each sprint, including the final sprint
7. System Testing – at the end of each sprint, including the final sprint
8. User Acceptance Testing – at the end of each sprint, including the final sprint
9. Non-functional Testing and Pre-Production Testing – expected to take place on an as-needs basis.
Project Level Test Activities
(This diagram shows three sprints, but there could be more or fewer)
[Diagram: stories enter the Sprint Backlogs of Sprints 1–3; in each sprint the developed stories (new code) are integrated into the existing code base with automated testing and 6. Integration Test; 7. System Test and 8. User Test run at project level, with the scope of integration, system and user testing increasing sprint by sprint and the remaining tests completed after the final sprint.]
1. Story Challenge – suggest 'what-ifs' to challenge new stories and define story headlines
2. Story Definition – introduce scenarios to enhance the Acceptance Criteria
Test Activities in the Sprint
[Diagram: the standard Scrum cycle – a Product Backlog prioritised by the Product Owner, backlog tasks expanded by the team into the Sprint Backlog, a 2–4 week Sprint with a 24-hour Daily Scrum cycle, and a potentially shippable product increment – annotated with the in-sprint test activities below.]
3. Daily Stand-Up – report anomalies found and stories tested, amended or created
4. Story Refinement – refine scenarios to enhance story definitions; create system tests as stories, as required
5. Developer Testing – private ad-hoc tests; build and run automated unit tests
6. Integration/System Testing – incorporate automated unit tests into the CI regime; on a weekly basis and at the end of the Sprint, deploy to the System Test environment and the tester runs the system tests
4. Story Refinement (example definition)
Objectives:
– To define acceptance criteria for all stories that are included in a Sprint, as they are worked on by development
– To define scenarios that describe the tests and expected behaviours of the system
– To improve understanding of the requirement and communicate anomalies to developers
– To identify System Tests that exercise functionality of multiple stories and can be system tested in this sprint
– To assure the completeness of stories in the current Sprint
What's being tested? Stories to be included in the current Sprint
Deliverables:
– Refined story definitions with defined acceptance criteria and scenarios, where appropriate
– System tests
Responsibilities (Orange):
– Tester – challenges stories by suggesting potential scenarios, new stories, story merges and splits; performs ad-hoc testing with/on behalf of developers; assures completeness of stories
– Developers – consider stories, evaluate impact on development
– Product Owner or Analyst – collates feedback and decisions on stories
– Product Owner – approves changes to stories, accepts completeness of stories
– Scrum Master – monitors progress; evaluates impact on resources and schedules
Responsibilities (Green): Not performed in Green projects
Baseline: Story Guideline (reference 3)
Entry Criteria: On commencement of the Sprint
Exit Criteria:
– All stories within the Sprint are considered complete
– Queries, anomalies, discrepancies and inconsistencies have been eliminated or explained
– System Tests appropriate to the Sprint have been defined
– The definition of acceptance is agreed with the Product Owner
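To make the 'scenarios and system tests' deliverables above more tangible, here is a minimal sketch – not from the client material – of how a refined story's scenarios might be recorded and one of them turned into an automated system test. The pytest framework is assumed; the 'checkout' behaviour and its rules are invented purely for illustration.

import pytest

# Scenarios captured against the story during Story Refinement.
SCENARIOS = [
    "Given an empty basket, when one in-stock item is added, the total equals the item price",
    "Given a basket containing an out-of-stock item, when the customer checks out, checkout is refused",
]

def checkout(items):
    """Stand-in for the system under test (hypothetical)."""
    if any(not item["in_stock"] for item in items):
        return "refused"
    return "accepted"

@pytest.mark.parametrize("items,expected", [
    ([{"name": "book", "in_stock": True}], "accepted"),
    ([{"name": "lamp", "in_stock": False}], "refused"),
])
def test_checkout_scenarios(items, expected):
    # Each case is an executable form of one of the scenarios listed above.
    assert checkout(items) == expected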
Test Automation
Could you create an Agile Test Strategy without automation?
Brian Marick's Testing Quadrants
[Quadrant diagram: business-facing versus technology-facing tests, crossed with tests that support the team versus tests that critique the product.]
Test Automation Pyramid – Lisa Crispin's version (Google others)
• The pyramid reflects the relative numbers of tests
• Focus on unit/component tests
– Acceptance testing of 'Services' in the middle layer
• GUI tests are end-to-end
• Manual checking is the exception?
Where do you automate?
[Diagram: a browser talks over HTTP/S across the inter/intranet to the web server, application server and database server; test code drives the system through a GUI test framework and GUI test tool at the front end, through an HTTP driver at the protocol level, and through a unit test framework at the component and API level.]
1. Programmers write unit tests, or execute embedded unit tests, using a unit test framework to test components
2. Technical testers code scripts directly against the GUI test tool
3. Non-technical testers write scripts (stories/scenarios); tools experts write the interface to the GUI test framework
4. Programmers write low-level HTTP GET/POST calls through a driver that simulates a browser
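As a concrete illustration of two of those choices, the sketch below shows layer 1 (a programmer's unit test run by a unit test framework) and layer 4 (a low-level HTTP call through a driver that simulates a browser). It is only a sketch: pytest and the requests library are assumed as the framework and driver, and the component, URL and payload are made up.

import requests   # plays the role of the HTTP driver in layer 4

# Layer 1: unit test of a component, run by a unit test framework (pytest here).
def price_with_vat(net, rate=0.2):
    """Hypothetical component under test."""
    return round(net * (1 + rate), 2)

def test_price_with_vat():
    assert price_with_vat(100.0) == 120.0

# Layer 4: low-level HTTP POST through a browser-simulating driver.
def test_login_over_http():
    response = requests.post(
        "https://example.test/api/login",           # made-up endpoint
        json={"user": "demo", "password": "secret"},
        timeout=5,
    )
    assert response.status_code == 200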
Distributed testing
• Use business stories and scenarios/acceptance criteria to validate requirements
• Reuse those stories to feed an 'Acceptance-Driven Development' BDD/TDD process
• Automated tests are an anti-regression tactic
• Automated tests don't replicate manual tests; think of them as a 'change trip-wire' that triggers an alarm if tripped.
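A brief sketch of that 'change trip-wire' idea, assuming pytest; the scenario table and the quote function are invented, and in practice the table would come from the stories and acceptance criteria agreed with stakeholders.

import pytest

# Agreed business scenarios reused as a small, fast regression pack.
REGRESSION_SCENARIOS = [
    # (age, claims_in_last_3_years, expected_band)
    (25, 0, "standard"),
    (25, 3, "loaded"),
    (17, 0, "refer"),   # under-age applicants are always referred
]

def quote_premium_band(age, claims):
    """Stand-in for the behaviour the trip-wire protects (hypothetical)."""
    if age < 18:
        return "refer"
    return "loaded" if claims >= 2 else "standard"

@pytest.mark.parametrize("age,claims,expected", REGRESSION_SCENARIOS)
def test_agreed_outcomes_unchanged(age, claims, expected):
    # If a code change alters any agreed outcome, the alarm goes off;
    # the pack signals change rather than replicating manual testing.
    assert quote_premium_band(age, claims) == expected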
Deriving scenarios to test
• Story Challenge – to understand feature scope?
• Story Refinement – to get stakeholders to accept?
• Story Definition – to validate the requirement?
• Iteration Planning – to estimate the work to build this feature?
• System Testing – to system test this feature?
• Developer Testing – to unit test this feature?
Scenarios are created to meet several goals.
What’s Left?
Other aspects of test policy
• Definitions (of 'done' etc.)
• Incident management
• Test automation
• Story format, e.g. Gherkin (see the sketch below)
• Environment request and management
• Regression testing (at what levels?)
• Test deliverables and documentation.
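As an illustration of the 'story format' bullet above, here is a minimal sketch of a Gherkin-style story and its step definitions, assuming the behave library (pytest-bdd would serve equally well); the account example is invented.

# The story itself would live in a .feature file, for example:
#
#   Feature: Withdraw cash
#     Scenario: Withdrawal within balance
#       Given an account with a balance of 100
#       When the customer withdraws 40
#       Then the remaining balance is 60
#
# steps/withdraw_cash.py
from behave import given, when, then

@given("an account with a balance of {balance:d}")
def step_account(context, balance):
    context.balance = balance

@when("the customer withdraws {amount:d}")
def step_withdraw(context, amount):
    context.balance -= amount

@then("the remaining balance is {expected:d}")
def step_check(context, expected):
    assert context.balance == expected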
The Three Amigos
• Business Analyst
– Liaises with and manages stakeholders and their needs
– Transforms business requirements into specifications (at multiple levels)
• Developer
– Scopes, designs, builds, tests and delivers features
• Tester
– Challenges the thinking on the project
– Performs 'assurance in the small'.
The tester’s contribution to testing
• _________ feature/story acceptance criteria
• _________ the developers to unit test (auto)
• _________ feature/story acceptance
• _________ acceptance test
• _________ service/UI level automation
• Scope: from low-level detail to system integration
• Liaison with integration testers and feedback
• Fill in the blanks yourself; negotiate with your team.
Summary
Close
• Agile test strategy has its place, and many aspects of testing can be pre-defined
• Importantly, we use a principled approach to deal with the unexpected
• Project profiling can help
• Think of testing as interventions, rather than test phases
• The testing role is split at least three ways – the tester doesn't own testing – think TESTMASTER
• Test automation sits in the context of Specification by Example, requirements validation, BDD and TDD.
Paul Gerrard
paul@gerrardconsulting.com
gerrardconsulting.com
@paul_gerrard
Agile Test Strategy