Lab 3 – Test Plan

by Andrew McKnight
of Team Blue
for CS 411 Spring 2012
with Prof. Brunelle
Table of Contents
1  Objectives
2  References
3  Test Plan
   3.1  Testing Approach
   3.2  Identification of Tests
   3.3  Test Schedule
   3.4  Fault Reporting and Data Recording
   3.5  Resource Requirements
   3.6  Test Environment

List of Figures
Figure 1: Prototype Major Functional Components Diagram
Figure 2: View of the presentation area
Figure 3: Schematic plan of the presentation room

List of Tables
Table 1: Identification of Tests
Table 2: Test Schedule
Table 3: Fault Reporting and Data Recording
1  Objectives
The Traffic Wizard ecosystem involves several interdependent parts, all of which must be fully functional, including their ability to communicate with one another. Risks do exist, ranging from the availability of a user base from which to gather data, to hardware malfunction, to cellular signal strength. The prototype serves several purposes with respect to risk mitigation. It helps find the bounds of normal operation of a given server configuration through experimentation. Algorithms can be troubleshot and fine-tuned against the actual volume of input they will receive in the real world. The prototype also provides a test plan that serves as the foundation for building an automated testing framework. Finally, the prototype shall serve as a demonstration of the system's capability in its own right, and as the realization of a small but significant societal tool.
2  References
McKnight, Andrew. “Lab 2 – Prototype Specifications.” CS 411. March 28, 2012.
3  Test Plan
The Traffic Wizard ecosystem is composed of several different entities, each with its own components. Databases, algorithms, smartphones and other hardware must all work together to ensure a good user experience. Each component must be tested individually and as part of the integrated system to discover potential problems and either eliminate them or minimize their impact and probability. The prototype demonstration will present a subset of the tests needed to shape Traffic Wizard into a fully functional collection of applications.
3.1  Testing Approach
Figure 1 shows the prototype’s different components and how they interact with each other. There are three tiers of testing to validate the proper function of each part. Unit tests perform basic functionality checks on individual algorithms. Integration tests combine the algorithms with the databases. System tests incorporate all aspects of Traffic Wizard to give a thorough picture of how all the parts interact together.
Figure 1: Prototype Major Functional Components Diagram
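As a hedged sketch of what the unit-test tier could look like, the aggregate-speed check (test case 2.1.1 in Table 1) might be written as follows. The class and method names here are illustrative assumptions, not the prototype's actual code.

```java
// Minimal sketch of a unit test for a hypothetical SpeedAggregator.
// Names and behavior are illustrative; the real prototype classes may differ.
import java.util.List;

public class SpeedAggregatorTest {
    // Hypothetical stand-in: averages the speeds reported at a checkpoint.
    static double aggregateSpeed(List<Double> reportedSpeeds) {
        if (reportedSpeeds.isEmpty()) {
            throw new IllegalArgumentException("no speed reports");
        }
        double sum = 0;
        for (double s : reportedSpeeds) sum += s;
        return sum / reportedSpeeds.size();
    }

    public static void main(String[] args) {
        // Test 2.1.1: verify the aggregate speed function is accurate.
        double avg = aggregateSpeed(List.of(55.0, 65.0, 60.0));
        if (Math.abs(avg - 60.0) > 1e-9) {
            throw new AssertionError("expected 60.0, got " + avg);
        }
        System.out.println("2.1.1 Test Aggregate Speeds: PASS");
    }
}
```

A check of this shape runs without the databases or the smartphone client, which is what allows the unit tier to execute before integration begins.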
3.2  Identification of Tests
Table 1 is a comprehensive list of the tests needed to prove the proper operation of all Traffic Wizard components. It is broken down into categories and subcategories to compartmentalize testing. The ordering reflects a bottom-up approach: the smallest components are tested first, then integrated and tested for interoperability.
Databases
  1.10  Virtual Checkpoint
    1.1.1  Database Structure Test – Verify the structure of all tables and fields.
  1.20  Driver Profile
  1.30  Speed Limit

Algorithms
  2.10  Speed Aggregator
    2.1.1  Test Aggregate Speeds – Determine whether the aggregate speed function is working and accurate.
  2.20  VC Reallocator
    2.2.1  Source Code – Check whether the code is written in Java or C++.
    2.2.2  Open VC Database – Test the ability to open the Virtual Checkpoint Database.
    2.2.3  Open Speed Limit Database – Test the ability to open the speed limit database.
    2.2.4  Add Checkpoint – Test the ability to add a checkpoint.
    2.2.5  Delete Checkpoint – Test the ability to delete a checkpoint.
  2.30  Route Matcher
    2.3.1  Source Code – Verify the code is written in Java or C++.
    2.3.2  VC Database Connect – Verify the Virtual Checkpoint Database is accessible.
    2.3.3  Input Parameter – Verify parameter acceptance for input coordinates.
    2.3.4  Proximity – Verify the ability to return checkpoint IDs within the vicinity.
    2.3.5  False Proximity – Verify the ability to return no checkpoint ID.
  2.40  Route Analyzer
    2.4.1  Route Analysis Accuracy Test – Verify that the Route Analysis Algorithm properly validates a route against the Virtual Checkpoint Database.
    2.4.2  Route Analysis Data Test – Verify the calculation and communication of congestion data for a user-specified route.
  2.50  Blockage Finder
    2.5.1  Source Code – Test for the code language used in the Blockage Finder algorithm.
    2.5.2  User Interface – Test the user interface used on the server for the Blockage algorithm.
    2.5.3  Accessing Information – Ensure the information received is valid.
    2.5.4  Geographical Area – Check the location through Google Maps.
    2.5.5  Virtual Checkpoints
    2.5.6  Route Analysis Result – Route Analysis algorithm
    2.5.7  Optimal Traffic
  2.60  Next Checkpoint Estimator
    2.6.1  Next Checkpoint Estimator calculations – Ensure the calculations performed by the algorithm are correct.
    2.6.2  Next Checkpoint Estimator deviation – Test the conditional branch in the algorithm that checks whether a user has deviated from a route.

Simulation Console
  3.10  Region Selection
    3.1.1  Region Support – Verify region maps are available for simulation.
    3.1.2  Arrival and Destination – Verify region maps have entry and exit points.
  3.20  Traffic Scenario Selection
    3.2.1  Scenario Support – Verify scenario options are defined and available.
    3.2.2  Scenario Scale – Verify scenarios have scalability functions.
  3.30  Driver Generator
    3.3.1  Driver Generator – Ensure that realistic proportions of drivers and users are generated, conforming to variable thresholds that can be changed by the user.
  3.40  Simulation Runtime Execution
    3.4.1  Runtime Defaults and Selections – Verify simulation defaults and the selectability of regions and scenarios.
    3.4.2  Scenario 1 Execution – Verify Scenario 1 (low congestion) can be executed to show algorithm proof.
    3.4.3  Scenario 8 Execution – Verify Scenario 8 (high congestion) can be executed to show algorithm proof.
    3.4.4  Virtual Driver Type – Verify generation of two types of virtual drivers.
  3.50  Traffic Activity Display
    3.4.1, 3.4.4  (3.4 Tests) – Verified through the Simulation Runtime Execution test cases.

Client User Interface
  4.10  Login
    4.1.1  Login – Ensure that only authorized users are able to access the main user interface functionality of the application.
  4.20  New Trip
    4.2.1  New Trip – Ensure the process of New Trip creation runs correctly or fails gracefully.
  4.30  Route Tracer
    4.3.1  Route Tracer – Test the functionality of the Route Tracer screen to ensure that illegal start/stop presses are prevented.
  4.40  Edit Trip
    4.4.1  Edit Trip – Ensure the process of editing a Trip runs correctly or fails gracefully.
  4.50  End of Trip
    4.5.1  End of Trip – Ensure the End of Trip process runs correctly and unobtrusively to the user.
  4.60  Delay Notification
    4.6.1  Delay Notification – Ensure the delay notification process runs correctly and unobtrusively to the driver.

Simulation Console Interface
  5.10  Main Menu
    5.1.1  Main Menu Test – Verify the interface has accessible buttons/tabs for features.
  5.20  Driver Profile Demo
    5.2.1  Driver Profile Database – Verify that the features of Driver Profiles have been implemented correctly.
    5.2.2  Driver Profile Screenshots – Verify that the Driver Profile demonstration utilizes appropriate GUI screenshots.
    5.2.3  Driver Profile Main Menu – Verify that the Driver Profile demonstration allows access to the main menu.
  5.30  Route Creation Demo
    5.3.1  Create / Edit – Must describe all fields required for creating a new route manually, as outlined in Requirement 3.1.4.1.3.
  5.40  Route Tracer Demo
    5.4.1  Route Tracer Demo – Show that the functionality of the Route Tracer works as expected and returns correct results.
  5.50  Traffic Simulation Window
    3.4.1, 3.4.3  (3.4 Tests) – Verified through the Simulation Runtime Execution test cases.
  5.60  Dashboard
    5.6.1  Dashboard Access – Verify accessibility from the Traffic Simulation window.
    5.6.2  Dashboard Return – Verify that the Dashboard cannot return to the Main Menu during simulation.

Table 1: Identification of Tests
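To illustrate how entries in Table 1 could feed the automated testing framework mentioned in the Objectives, the Route Matcher proximity cases (2.3.4 and 2.3.5) might be sketched as below. The `Checkpoint` record, the `checkpointsNear` method, and the sample coordinates are assumptions made for illustration, not the prototype's real classes or data.

```java
// Illustrative sketch of the Route Matcher proximity checks (2.3.4 and 2.3.5),
// using a hypothetical checkpoint record and a haversine distance.
import java.util.ArrayList;
import java.util.List;

public class ProximityCheck {
    record Checkpoint(int id, double lat, double lon) {}

    // Return the IDs of checkpoints within radiusKm of the input coordinates.
    static List<Integer> checkpointsNear(List<Checkpoint> all,
                                         double lat, double lon, double radiusKm) {
        List<Integer> ids = new ArrayList<>();
        for (Checkpoint c : all) {
            if (distanceKm(lat, lon, c.lat(), c.lon()) <= radiusKm) ids.add(c.id());
        }
        return ids;
    }

    // Great-circle distance between two lat/lon points, in kilometers.
    static double distanceKm(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1), dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 6371.0 * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    public static void main(String[] args) {
        List<Checkpoint> vcs = List.of(
            new Checkpoint(1, 36.8508, -76.2859),
            new Checkpoint(2, 38.9072, -77.0369));
        // 2.3.4 Proximity: one checkpoint within 5 km of the query point.
        System.out.println(checkpointsNear(vcs, 36.85, -76.29, 5.0));  // [1]
        // 2.3.5 False Proximity: no checkpoint IDs returned.
        System.out.println(checkpointsNear(vcs, 40.0, -80.0, 5.0));    // []
    }
}
```

The positive and negative cases share one code path, so a framework can run both from the same harness and compare returned ID lists directly.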
3.3  Test Schedule
The plan for the prototype demonstration includes a subset of the collection of
tests described in Table 1. It starts off with a quick introduction, followed by a
description of the databases and a short demonstration of their operation and contents. A
few algorithm unit tests will be performed, followed by their integration in a simulation
of Traffic Wizard’s operation in the real world. A presentation of the smartphone
application rounds out the demonstration, followed by a question and answer period.
Start Time   Duration (min)   Description                                             Test Case #
0:10         5                Database (Driver Profile and Virtual Checkpoint) Demo   1.1
0:15         10               Algorithm unit tests (via Simulation Console)           2.1 – 2.6
0:25         10               Integration simulation                                  3.1 – 3.5, 5.1 – 5.7
0:35         10               Smartphone Application Demo                             4.1 – 4.6
0:45         15               Questions

Table 2: Test Schedule
3.4  Fault Reporting and Data Recording
Thorough fault reporting is an essential part of the testing process. By documenting failures and the conditions that led up to them, error mitigation can proceed at an efficient pace. Table 3 outlines the different categories of errors and how to document each kind.
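The hardcopy forms described in Table 3 could eventually be mirrored by a structured fault record in the automated framework. This is a speculative sketch; every field name here is an assumption rather than the prototype's actual reporting format.

```java
// Hypothetical structured fault record mirroring the hardcopy form fields
// used for fault reporting; names are illustrative only.
public class FaultReport {
    final String component;   // Hardware, GUI, Database, or Algorithms
    final String testCaseId;  // e.g. "2.1.1"
    final String conditions;  // conditions leading up to the failure
    final String observed;    // what was seen on visual inspection

    FaultReport(String component, String testCaseId,
                String conditions, String observed) {
        this.component = component;
        this.testCaseId = testCaseId;
        this.conditions = conditions;
        this.observed = observed;
    }

    // One-line summary suitable for a test log.
    String summary() {
        return "[" + component + "] test " + testCaseId + ": " + observed
             + " (conditions: " + conditions + ")";
    }

    public static void main(String[] args) {
        FaultReport r = new FaultReport("Database", "1.1.1",
            "fresh schema load", "missing field in Driver Profile table");
        System.out.println(r.summary());
    }
}
```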
Component    How/Process of Recording
Hardware     Report failures through visual inspection of the smartphone device and server stations; document through hardcopy forms
GUI          Report failures through visual inspection of GUI screens in the app and simulation console; document through hardcopy forms
Database     Report failures through visual inspection of returned SQL statements; document through hardcopy forms
Algorithms   Report failures through visual inspection of output logs; document through hardcopy forms

Table 3: Fault Reporting and Data Recording

3.5  Resource Requirements
To carry out the demonstration, several pieces of equipment are needed. A projector and screen will display the various programs, slideshows and visualizations. At least one iOS device is needed to present the smartphone application, together with a VGA-out cable to connect to the projector. Access to the virtual machine provided by the systems group is needed through either a desktop or laptop computer.
3.6  Test Environment
A facility large enough to accommodate a small audience of about thirty people will be needed. A separate area for panelists will be reserved close to the projector screen and the presenters to facilitate discussion. The rest of the seating area should have a sufficient view of the projector screen.
Figure 2: View of the presentation area.
Figure 3: Schematic plan of the presentation room.