
Connecting Business and Testing with Test Analytics

Paul Gerrard paul@gerrardconsulting.com

gerrardconsulting.com

Agenda

• Agile is Transitional

• Business Experimentation and Feedback

• Which Methodology?

• Things are Changing (everything)

• Big Data and a Definition

• A New Model for Testing

• A Model for Test Analytics

• Getting Started/Baby Steps

• Close

Agile is Transitional

Intro

• Online and mobile businesses have used highly automated processes for some years

– SMAC environments (social, mobile, analytics, cloud)

• Spec by Example, BDD, ATDD, DevOps and Continuous Delivery are gaining momentum

• But (familiar story?) ...

– Most successes in greenfield environments

– Some people are faking it

– Not much yet in legacy.

Agile is for some, not all

• Agile (with a capital A)

– works for teams with clear goals, autonomy, business insight and the skills to deliver

– Supports team-based, manual and what might also be called social processes

• The automated processes of Continuous Delivery and DevOps require a different perspective.

Post-Agile?

• Business agility, based on IT needs:

– Connected processes ...

– ...supported by high levels of automation ...

– ... driven by analytics

• This won’t be achieved until we adopt what might be called a Post-Agile approach

• This is what I want to explore.

Business Experimentation and Feedback

Emerging trend in business: data-driven experimentation

Where to put the paths?

• A new college campus was built, but one thing was still debated:

– Where in the grass should we put the paved walkways?

– Some felt the walkways should be around the edges

– Some felt the walkways should cut diagonally, connecting every building to every other building

• One professor had the winning idea...

The winning idea

• "Don't make any walkways this year. At the end of the year, look at where the grass is worn away, showing us where the students are walking.

Then just pave those paths"

• A/B, Split, Bucket, Canary tests use the same principle

• Give people a choice; then eliminate the least popular.
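To make the principle concrete, here is a minimal split-test sketch in Python. Everything in it – the variant names, the conversion check and the traffic numbers – is hypothetical; it simply shows "give people a choice, measure, keep the winner".

```python
import hashlib
import random
from collections import defaultdict

# Minimal A/B (split) test sketch: each visitor is routed to one variant,
# outcomes are counted, and the better-performing variant is kept.
VARIANTS = ["A", "B"]
visits = defaultdict(int)
conversions = defaultdict(int)

def assign_variant(visitor_id: str) -> str:
    # Hash the visitor id so a returning visitor always sees the same variant.
    digest = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16)
    return VARIANTS[digest % len(VARIANTS)]

def record(visitor_id: str, converted: bool) -> None:
    variant = assign_variant(visitor_id)
    visits[variant] += 1
    if converted:
        conversions[variant] += 1

# Simulated traffic (illustrative numbers only).
for i in range(10_000):
    vid = f"visitor-{i}"
    base_rate = 0.10 if assign_variant(vid) == "A" else 0.12
    record(vid, random.random() < base_rate)

rates = {v: conversions[v] / visits[v] for v in VARIANTS if visits[v]}
print(rates, "-> keep variant", max(rates, key=rates.get))
```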

Feedback

"We all need people who will give us feedback. That's how we improve." Bill Gates

"I think it's very important to have a feedback loop, where you're constantly thinking about what you've done and how you could be doing it better."

Elon Musk

"The shortest feedback loop I know is doing Q&A in front of an audience."

Me.

Business Experiments are just tests, aren't they?

Can you see the clown?

What are they testing here?

Business experiments aren't so dramatic

• In the context of online businesses, experiments are a way of evolving systems

• "Survival of the fittest"

• Experiments are designed

• Measurement process defined

• System is built, tested and deployed

• Measurements are captured, analysed and acted on.

Which Methodology?

Feedback loops (NOT to scale)

• Sprint – every 2-4 weeks

• Daily Scrum – daily

• Continuous Integration – hourly

• Unit Test – minutes

• Pair Programming – seconds

Feedback ranges from personal observations (pairing), through automated unit, build/integration tests, user tests and team retrospectives

Three development patterns

• Structured

• Agile

• Autonomous/Continuous

Profiles of the three patterns

Characteristic   Structured          Agile              Autonomous/Continuous
Structure        Managed             Developer led      Production Cell
Pace/cadence     Business decision   Team decision      Feedback
Leadership       Project Managed     Guided/Research    Line Managed
Definition       Fixed spec          Dynamic spec       Live Specs
Testing          Scripted            Exploratory        Automated (Auto. Test)
Measurement      Avoided             Retrospective      Pervasive
Governance       Bureaucratic        Trust-based        Pervasive Analytics/Electronic

Things are Changing (Everything)

In the Internet of Everything, everything is a thing

• In the IoE, things are sensors, actuators, controllers, aggregators, filters, mobile devices, apps, servers, cars, aeroplanes, cows, sheep, cups, spoons, knives, forks and ... people

• Our perception of systems is changing

• Our software projects are systems too

• Our development processes are things.

Our processes are things

• DevOps processes are triggered, controlled, monitored

• Process dependency and communication similar to Machine to Machine (M2M) communications

• ‘Publish and subscribe’ messaging model is more appropriate than 'command and control'

Our processes have (or are) sensors

• DevOps outputs report outcomes and status

• These outcomes are the payloads of messages from processes, collected by instrumentation in development and production

• The data collected is the basis of the analytics that drive software and business stakeholder decision-making

• (BTW, we've created our own MQTT/MongoDB server at gerdle.com)

Using message buses for DevOps http://sawconcepts.com
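As a sketch of what 'publish and subscribe' looks like in practice, here is a minimal example of a DevOps process publishing its outcome as an MQTT message using the paho-mqtt client. The broker host, topic and payload fields are illustrative assumptions, not the actual gerdle.com setup; a subscriber could collect these payloads into a store such as MongoDB for later analysis.

```python
import json
import time

import paho.mqtt.publish as publish  # pip install paho-mqtt

BROKER = "mqtt.example.com"            # hypothetical broker host
TOPIC = "devops/tests/integration"     # hypothetical topic name

# The process acts as a 'sensor': its outcome is the payload of a message
# that any interested collector can subscribe to.
payload = {
    "process": "integration-tests",
    "status": "passed",
    "tests_run": 412,
    "failures": 0,
    "finished_at": time.time(),
}

publish.single(TOPIC, json.dumps(payload), hostname=BROKER, port=1883, qos=1)
```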

Big Data (and a Definition)

Big Data is what Test Analytics collects

• Most isn't BIG, it’s just bigger than before

• More interesting is its lack of structure

• Mayer-Schönberger and Cukier:

1. Traditionally, we have dealt with small samples

2. We had to be ‘exact’

3. We were obsessed with causality

• Looking at the entire data set allows us to see details that we never could before

• Correlation, not causality.

Performance testers do ‘Big Data’

• Load/response time graphs

• Graphs of system resource (network, CPU, memory, disk, DB, middleware, services etc.)

• Usually, the tester or operations analyst has to manage and merge data from a variety of sources in a variety of unstructured formats

• Common theme: it is all ‘time-series’ data

• Performance testing and analysis is a Big Data discipline and an example of Test Analytics.
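To illustrate the "it is all time-series data" point, here is a minimal sketch that merges two monitoring sources onto a common time axis with pandas. The file names and column names are assumptions; the shape of the job – resample, align, then compare – is the point.

```python
import pandas as pd

# Two hypothetical exports: load-test response times and CPU samples,
# each in its own format but both keyed by timestamp.
resp = pd.read_csv("response_times.csv", parse_dates=["timestamp"])
cpu = pd.read_csv("cpu_samples.csv", parse_dates=["timestamp"])

# Resample both onto one-minute buckets so they line up.
resp_1m = resp.set_index("timestamp")["response_ms"].resample("1min").mean()
cpu_1m = cpu.set_index("timestamp")["cpu_pct"].resample("1min").mean()

merged = pd.concat({"response_ms": resp_1m, "cpu_pct": cpu_1m}, axis=1)
print(merged.corr())  # correlation between load behaviour and resource usage
```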

My thesis

• Data captured throughout a test and assurance process should be merged and integrated with:

– Definition data (requirements and design information)

– Production monitoring data

• … and analysed in interesting and useful ways

• Monitoring starts in development and continues into production.

A (my) Definition of Test Analytics

“The capture, integration and analysis of test and production monitoring data to inform business and software development decision-making”.

Insight to Action

(Diagram) 'Testing' data from Development and Testing and from Production feeds Test Analytics; Analysis turns it into Insight, and Insight drives Decisions.

Example: dictionary

Requirement

A customer may search for an item using any text string. If a customer selects an item the system will …

(Diagram) The requirement, the business story and its scenarios all use the feature/term "customer"; a batch update builds the Index, recording for all requirements and stories where each glossary term is used. The Glossary of Terms defines each term ("customer: A customer is…").

Glossary key: [Proposed Term]; Defined Term – Not Approved; Defined Term – Approved; <Data Item> (scenarios only)

A New Model for Testing

Test Analytics is based on models

Judgement, exploring and testing

(Diagram) We explore sources of knowledge to build test models that inform our testing. Exploring (the sources) creates the test models; Judgement tells us when our model(s) are adequate; Testing (the system) uses the test models, with feedback loops between the three activities.

New Model Testing

29-page paper: http://dev.sp.qa/download/newModel

Don’t ALM tools already do TA?

• Not really.

A Model for Test Analytics

A model for Test Analytics

• There is no fixed view

– What does Test Analytics cover?

– How should tools modularise their functionality to provide the required features?

• This is a vision for how the data MIGHT be captured across what are often siloed areas of an organisation.

• Six main areas

– Approximately sequential, but parallel changes in all six areas are possible

– 'Silo' is just a convenient term for the data contained in each area.

Data silos for test analytics (an illustration)

Stakeholder

• Stakeholders
• Business Goals and Measures
• Stakeholder involvement/engagement
• Risk

Requirements

• Requirements
• Story/feature descriptions
• Glossary of Terms and Term Usage
• Processes
• Process Paths (workflows)
• Feature change history

Development

• Code commits
• Unit Test Count
• Code Merges
• Code Complexity
• Code Coverage
• Fault density
• Code Churn

Assurance

• Manual Tests
• Generated test code (unit, integration, system)
• Application Instrumentation
• Automated Test Execution History
• Test Status

Production Application Monitoring

• Application Process Flows
• User Accesses/Activities
• Feature Calls, Response Times
• Interface calls
• Application Alerts/Failures
• Database Content
• Production Failures

Production Environment Monitoring

• System Assets
• Resource Usage Logs
• Performance Data
• System Events
• System Alerts/Failures/Incidents
• Outages
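One way to hold fragments of these silos side by side is a simple relational store, linked loosely by feature name and time rather than rigid foreign keys. The sketch below is only illustrative; the table and column names are assumptions, not a prescribed schema.

```python
import sqlite3

conn = sqlite3.connect("test_analytics.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS requirement (id TEXT PRIMARY KEY, feature TEXT, text TEXT);
CREATE TABLE IF NOT EXISTS story       (id TEXT PRIMARY KEY, feature TEXT, text TEXT);
CREATE TABLE IF NOT EXISTS test_result (id INTEGER PRIMARY KEY, feature TEXT, status TEXT, run_at TEXT);
CREATE TABLE IF NOT EXISTS commit_log  (sha TEXT PRIMARY KEY, feature TEXT, churn INTEGER, committed_at TEXT);
CREATE TABLE IF NOT EXISTS prod_event  (id INTEGER PRIMARY KEY, feature TEXT, kind TEXT, occurred_at TEXT);
""")
conn.commit()

# Example cross-silo question: which features have production failures
# but no recorded automated test results?
rows = conn.execute("""
    SELECT DISTINCT p.feature
    FROM prod_event p
    LEFT JOIN test_result t ON t.feature = p.feature
    WHERE p.kind = 'failure' AND t.id IS NULL
""").fetchall()
print(rows)
```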

Getting Started

I'm sorry – I wish I had the time.

Method

• What kind of analyses are possible?

• What decisions could these analyses support?

• Imagine you have captured data in the areas noted above

– In each silo the data items can be related

– In principle, we can link any data to any other data (through a potentially complicated set of relations)

• But in a Big Data analysis, you are not obliged to figure out these relations

• The history of two aspects of this data over time might reveal a pattern

• Think, “correlation rather than causality”
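As a sketch of "correlation rather than causality": take the history of two aspects of the data – say weekly code churn from the development silo and weekly incidents from production monitoring – and simply ask whether they move together. The numbers below are made up purely for illustration.

```python
import pandas as pd

weeks = pd.date_range("2016-01-04", periods=8, freq="W-MON")

# Hypothetical weekly figures drawn from two different silos.
code_churn = pd.Series([120, 340, 90, 400, 150, 60, 380, 200], index=weeks)
incidents = pd.Series([2, 5, 1, 7, 3, 1, 6, 4], index=weeks)

# A high correlation only says "these move together" - it does not say why.
print(code_churn.corr(incidents))
```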

Start with baby-steps

• Create a glossary of terms

• Scan your requirements documents for each glossary term and load the usage references into another database table (see the sketch after this list)

• Repeat for your stories and tests

• You can now link requirements, stories and tests. If you can’t – you have an interesting requirements anomaly:

– How are requirements and examples communicated if they don’t share common terminology or language?

– Cross-reference (or use common names for) your features in requirements, stories and tests.

• Ask developers to generate a log of usage of all or key features in their application code

• Use the same common feature names, or cross-reference them.
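Here is a minimal sketch of the baby steps above: scan requirement, story and test documents for glossary terms and record every usage in a database table. The glossary terms, folder names and file layout are assumptions for illustration.

```python
import re
import sqlite3
from pathlib import Path

glossary = ["customer", "item", "search", "order"]   # hypothetical terms

conn = sqlite3.connect("test_analytics.db")
conn.execute("""CREATE TABLE IF NOT EXISTS term_usage
                (term TEXT, document TEXT, occurrences INTEGER)""")

# Scan requirements, stories and tests for each glossary term.
for folder in ("requirements", "stories", "tests"):
    for path in Path(folder).glob("**/*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        for term in glossary:
            count = len(re.findall(r"\b%s\b" % re.escape(term), text))
            if count:
                conn.execute("INSERT INTO term_usage VALUES (?, ?, ?)",
                             (term, str(path), count))
conn.commit()

# Terms that appear in requirements but never in stories or tests point to
# an interesting requirements anomaly.
```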

Some simple reports that you could produce from your database

I'd rather show you some video: https://www.youtube.com/watch?v=cNBtDstOTmA

Close

• DevOps leads the way but I think most organisations have an opportunity here

• Treat your processes as 'things' with sensors

• Treat your data as fallible and explore it

• Experiment; ask questions of your data

• Forget scientific analyses until your data is 100% automatically generated, 100% reliably

• It (probably) never will be, so don’t worry.

Is Test Analytics Worth the Effort?

I'm afraid you'll have to try it for yourself

Connecting Business and Testing with Test Analytics

Paul Gerrard paul@gerrardconsulting.com

gerrardconsulting.com
