Running Head: Lab #3 – Current ITS Prototype Test Plan/Procedure
Lab #3 – Current ITS Prototype Test Plan/Procedure
Red Team
Charles J Deaver
CS411W
Prof. Janet Brunelle
11/19/2012
Table of Contents

1. INTRODUCTION
2. REFERENCES
3. TEST PLAN
   3.1. Testing Approach
   3.2. Identification of Tests
   3.3. Test Schedule
   3.4. Fault Reporting and Data Recording
   3.5. Specific Requirements
   3.6. Test Environment
4. TEST RESPONSIBILITIES
5. TEST PROCEDURES
6. TRACEABILITY TO REQUIREMENTS

List of Tables

Table 1: Test Identification Table
Table 2: Test Schedule
Table 3: Test Case Failure Report
Table 4: Test Responsibilities
1. Introduction
The city of Norfolk installed a light rail system, called The Tide, to help reduce traffic
congestion. The Tide opened with successful ridership numbers but has subsequently suffered
some losses in ridership (Hampton Roads Transit Authority, 2012). Some of these losses can be
attributed to poor communication with riders about train stop times. Minimal communication
results in lower ridership and loss of revenue for both the Hampton Roads Transit Authority
(HRT) and the businesses surrounding train stops (Southeastern Institute of Research, Inc.,
2011). The deficiency in communication is caused by limited Internet or web resources about the
train system and the lack of signs indicating arrival or departure time at train stops. The
communication shortcoming not only affects riders, but also hampers increases in ridership.
Current Intelligent Transit System (ITS) is a solution designed to help close the communication
gap by providing easy access to real-time information.
The CS411 Red Team is designing a prototype to demonstrate the capability to address
the needs of The Tide riders, local businesses, and HRT. The Current ITS prototype will have
easy-to-use, web-based applications designed to keep riders informed, give local businesses an
additional advertising platform, and allow HRT to improve operational efficiency. The real-time
data improves decision-making for both HRT and the riders and gives local businesses the
capability to direct advertisements based on location.
This test plan is a governance mechanism to aid in quality control for the prototype
implementation. It will provide a road map to ensure all of the factors determined in the design
phase are implemented, interconnected, and operational. It includes a master table listing the
major components of the system and the intended testing objectives. Additional requirements,
including essential testing prerequisites and personnel responsibilities, are also covered in this
document.
Successful execution of this plan should demonstrate effective development of the
Current ITS prototype. An efficacious test will validate the intended data
distribution to light rail riders, transit authorities, and business users as determined by the
product description. It will also provide an avenue to determine potential shortfalls and aid in
risk mitigation strategies should the product be developed for production.
2. References
Hampton Roads Transit Authority. (2012). January 26, 2012, Commission Meeting Report. Norfolk, VA.

Lab 1 – Current ITS Product Description, Version 3. (2012, September). Current ITS, Red Team, CS411W: Deaver, Charles.

Lab 2 – Current ITS Prototype Product Specification, Version 1. (2012, November). Current ITS, Red Team, CS411W: Deaver, Charles.

Southeastern Institute of Research, Inc. (2011). Hampton Roads Transit: Light Rail Marketing Research Study. Norfolk, VA: Southeastern Institute of Research, Inc.
3. Test Plan
This test plan delineates the full testing scope of the Current ITS prototype system. The
following sections contain detailed descriptions of the testing approach, test identification, test
schedule, data recording, and required resources. The testing requirements are provided in
tabular format for simplicity in execution.
3.1. Testing Approach
The Current ITS prototype testing approach will be to conduct unit, integration, and
system tests for each component. Unit testing will use control data to ensure specific functions
within the modules produce the correct output. Integration and system tests will combine
control data with scripted procedures to verify proper component interconnectivity and system
display output. The series of tests will begin at the core of the application and spiral toward the
outer edges, the end-user functions. Testing will begin with unit tests on the database to ensure
data integrity and accessibility. The next component tested will be the Decision Engine,
followed by the test harness. The final component tested will be the Web Application Engine,
utilizing simulated data from the test harness.
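As an illustration only, the sketch below shows how a database unit check of this kind might be scripted. The table name, column names, control values, and SQLite back end are assumptions made for the example, not part of the prototype; it mirrors the date/time format objective of test case 1.1 in Table 1.

    # Minimal sketch of a database-format unit check (test case 1.1).
    # The schedule table, column names, and SQLite back end are assumptions
    # for illustration only; the prototype's actual schema may differ.
    import re
    import sqlite3
    import unittest

    DATE_RE = re.compile(r"^\d{4}(0[1-9]|1[0-2])(0[1-9]|[12]\d|3[01])$")  # YYYYMMDD
    TIME_RE = re.compile(r"^([01]\d|2[0-3])[0-5]\d[0-5]\d$")              # HHMMSS

    class DateTimeFormatTest(unittest.TestCase):
        def setUp(self):
            # Control data stands in for the pre-populated prototype database.
            self.conn = sqlite3.connect(":memory:")
            self.conn.execute(
                "CREATE TABLE schedule (stop_id INTEGER, arrive_date TEXT, arrive_time TEXT)"
            )
            self.conn.executemany(
                "INSERT INTO schedule VALUES (?, ?, ?)",
                [(1, "20121119", "083000"), (2, "20121119", "174530")],
            )

        def test_date_and_time_formats(self):
            # Every stored entry must match the required date and time formats.
            for arrive_date, arrive_time in self.conn.execute(
                "SELECT arrive_date, arrive_time FROM schedule"
            ):
                self.assertRegex(arrive_date, DATE_RE)
                self.assertRegex(arrive_time, TIME_RE)

    if __name__ == "__main__":
        unittest.main()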
3.2. Identification of Tests
The full list of tests is contained in Table 1. The table is a high-level view of the testing
objectives for each component. The test cases are broken down in greater specificity in Section
5. An illustrative sketch of one of the Decision Engine checks follows the table.
Category | Component | Test Case | Description | Objective | Requirements Referenced
Unit | Database | 1.1 | Formats | Visually examine the entries for date (YYYYMMDD) and time (HHMMSS) formats. | 3.1.1.1, 3.1.3.1
Unit | Database | 1.2 | Read/Write Procedures | Test create, read, update, and delete stored procedures on the database. | 3.1.1.2, 3.1.3.1
Unit | Database | 1.3 | Refresh Times | Test updateable views for functionality and response/refresh time. |
Unit | Database | 1.4 | Read/Write Triggers | Test triggers for proper functionality on various create, read, and delete commands. |
Unit | Database | 1.5 | Schema Efficiency | Test tables for proper schema, functionality, efficiency, ease of use, and proper data values. |
Integration | Decision Engine – DB Interface | 2.1 | Database Connectivity | Test Current ITS database connection. | 3.1.2.1.iv
Integration | Decision Engine – DB Interface | 2.2 | Select Ability | Test ability to query the database and necessary tables. | 3.1.2.1.v.a, 3.1.2.1.vi.a, 3.1.2.1.vi.d, 3.1.2.1.vi.g, 3.1.2.2.iii, 3.1.2.2.iv, 3.1.2.2.v, 3.1.2.2.vii, 3.1.2.3.iii, 3.1.2.3.iv
Unit | Decision Engine – Ridership Trend Analysis | 2.3 | Input Validation | Validate input of date, time range, and stop ID. | 3.1.2.1.i, 3.1.2.1.ii
Unit | Decision Engine – Ridership Trend Analysis | 2.4 | Interval Validation | Test future/past date determination. | 3.1.2.1.iii
Unit | Decision Engine – Ridership Trend Analysis | 2.5 | Average Function Test | Test average of embark/disembark for past 15 days. | 3.1.2.1.vi.b
Unit | Decision Engine – Ridership Trend Analysis | 2.6 | Past Event Test | Test past event detection. | 3.1.2.1.vi.c
Unit | Decision Engine – Ridership Trend Analysis | 2.7 | Future Event Test | Test future event detection. | 3.1.2.1.vi.f
Unit | Decision Engine – Ridership Trend Analysis | 2.8 | Ridership Variance Function Test | Test accuracy of variance between established disembark/embark averages and past event values. | 3.1.2.1.vi.h
Unit | Decision Engine – Ridership Trend Analysis | 2.9 | Ridership Output Validation | Test output of embark/disembark to Ridership Trend Report function. | 3.1.2.1.vii
Unit | Decision Engine – Delay Impact Calculator | 2.10 | Input Validation | Validate input of GPS coordinates and dates. | 3.1.2.2.i, 3.1.2.2.ii
Unit | Decision Engine – Delay Impact Calculator | 2.11 | Train Activity Test | Test ability to determine whether a train is active. | 3.1.2.2.iii
Unit | Decision Engine – Delay Impact Calculator | 2.12 | Delay Average Calculation Test | Test accuracy of average variance from the schedule. | 3.1.2.2.vi
Unit | Decision Engine – Delay Impact Calculator | 2.13 | Alert Detection Test | Test ability to correctly identify active alerts. | 3.1.2.2.vii
Unit | Decision Engine – Delay Impact Calculator | 2.14 | Alert Delay Interval Application Test | Test accuracy of alert severity level delay interval on variance average. | 3.1.2.2.viii
Unit | Decision Engine – Delay Impact Calculator | 2.15 | Total Calculated Delay Test | Test accuracy of comparison between calculated expected time-of-arrival and schedule. | 3.1.2.2.ix
Integration | Decision Engine – Delay Impact Calculator | 2.16 | Delay Estimate Output Validation | Test output of delay time to Train Data Report Module. |
Unit | Decision Engine – Ontime Performance Reporting | 2.17 | Input Validation | Validate input of date range and stop ID. | 3.1.2.3.i, 3.1.2.3.ii
Unit | Decision Engine – Ontime Performance Reporting | 2.18 | Ontime Accuracy Test | Test accuracy of variance between past arrival times and schedule times. | 3.1.2.3.v
Integration | Decision Engine – Ontime Performance Reporting | 2.19 | Ontime Performance Output Validation | Test output of variance from the schedule to the Train Data Report Module. |
Test Harness | GPS Data tester | 3.1 | GPS stop tester | Verify each virtual stop has an associated GPS coordinate. |
Test Harness | GPS Data tester | 3.2 | GPS format test | Verify GPS coordinates are in the correct format. |
Test Harness | GPS Data tester | 3.3 | GPS route test | Verify the GPS route list contains proper GPS coordinates. |
Test Harness | GPS Data tester | 3.4 | GPS train test | Verify a train's coordinate is updated and valid. |
Test Harness | Ridership Data Control tester | 3.5 | Ridership Data generation test | Test virtual rider generation for each stop. |
Test Harness | Ridership Data Control tester | 3.6 | Ridership Data test | Ensure realistic proportions of riders are generated, conforming to variable thresholds that can be changed by the user. |
Test Harness | Train control tester | 3.7 | Train GPS test | Verify each active train returns its assigned GPS coordinate. |
Test Harness | Train control tester | 3.8 | Train sensor failure test | Verify each train has the ability to simulate sensor failure. |
Test Harness | Train control tester | 3.9 | Train outage failure | Verify the ability for a train to not return a GPS coordinate. |
Test Harness | Train control tester | 3.10 | Ridership test | Verify each train has the ability to return the current number of riders on board. |
Test Harness | Business ad control tester | 3.11 | End date test | Verify access to each advertisement's end date. | 3.1.2.4.vi
Test Harness | Business ad control tester | 3.12 | Start date test | Verify access to each advertisement's start date. |
Test Harness | Business ad control tester | 3.13 | Advertisement stop test | Verify the ability to assign and view advertisements assigned at each stop. |
Test Harness | Business ad control tester | 3.14 | Advertisement start time | Verify access to each advertisement's start time. |
Test Harness | Business ad control tester | 3.15 | Advertisement end time | Verify access to each advertisement's end time. |
Test Harness | Test harness Interface | 3.16 | Train property GUI | Verify the interface has the ability to display each train's properties. |
Test Harness | Test harness Interface | 3.17 | Train settings test | Verify the interface has the ability to edit different train settings. |
Test Harness | Test harness Interface | 3.18 | Stop property test | Verify the interface has the ability to display ridership at each stop. |
Test Harness | Test harness Interface | 3.19 | Stop property edit test | Verify the interface has the ability to edit different ridership numbers at each stop. |
System | Alert Module | 4.1 | Manage Alerts | Verify an alert can be created, modified, or closed within the HRT GUI. | 3.1.4.1
System | Alert Module | 4.2 | View Alerts | Demonstrate alerts are viewable in the module. | 3.1.4.1
Unit | Feedback Module | 4.3 | Submit Feedback | Verify a rider or business user can submit feedback via web form. | 3.1.4.2
System | System Overview Module | 4.4 | Retrieve Stop Information | Verify the DB interface provides results for stops and vehicles in operation. | 3.1.4.3
System | System Overview Module | 4.5 | Map Overlay | Demonstrate stop and vehicle information is displayed on a dynamic map. | 3.1.4.3
Unit | Google Maps Web Form | 4.6 | Google Maps Web Redirection | Verify a Google Maps search is performed for a direction request. | 3.1.4.4
System | Calendar Event Module | 4.7 | View Events | Demonstrate events are viewable within the module. | 3.1.4.5
System | Calendar Event Module | 4.8 | Manage Events | Verify events can be added, edited, or removed within the Business and HRT GUIs. | 3.1.4.5
Unit | Administration Application | 9.1.1 | Webpage Layout | Verify the user is able to insert name, desired user name, email address, and password. | 3.1.4.9.i
Unit | Administration Application | 9.1.2 | Username Validation | Ensure the username is validated and an error is returned if the username exists. | 3.1.4.9.ii
Unit | Administration Application | 9.1.3 | Username Retrieve | Prove the username is provided after the validation method. | 3.1.4.9.iii
Unit | Administration Application | 9.1.4 | User Self Password Reset | Confirm the user is able to reset a password. | 3.1.4.9.iv
Unit | Administration Application | 9.1.5 | User Management | Validate the administrator can change user information, groups, and passwords. | 3.1.4.9.v
Unit | Administration Application | 9.1.6 | User Information Update | Verify the user can edit personal information. | 3.1.4.9.vi
Unit | Authentication Module | 10.1.1 | Application Access | Confirm the user is able to access the application through one-factor authentication. | 3.1.4.10.i
Unit | Authentication Module | 10.1.2 | Token Generation | Validate proper identification token generation. | 3.1.4.10.ii
Unit | Authentication Module | 10.1.3 | Access Control | Prove access is granted only after token validation. | 3.1.4.10.iii
Unit | Authentication Module | 10.1.4 | Logging | Verify recording of a user's login time, location, authentication success or failure, and the page requested. | 3.1.4.10.vi
Integration | Data Integration Module | 11.1.1 | Database Connection | Confirm the application opens access to the database for data transfer. | 3.1.4.11.i.a
Integration | Data Integration Module | 11.1.2 | Query Transfer | Validate query and result transfer between the database and the Web Application Engine modules. | 3.1.4.11.i.b
Integration | Data Integration Module | 11.1.3 | Decision Engine Connection | Verify data is sent between the Decision Engine and the Web Application Engine. | 3.1.4.11.ii
Integration | Data Integration Module | 11.1.4 | Test Harness Connection | Validate data is sent between the Test Harness and the Web Application Engine. | 3.1.4.11.iii
Unit | Business Ad Campaign Module | x.1 | List Advertisements | Verify listing of advertisements. | 3.1.4.8.1
Unit | Business Ad Campaign Module | x.2 | Create Advertisement | Verify submission of advertisement input fields. | 3.1.4.8.3
Unit | Business Ad Campaign Module | x.3 | Edit Advertisement | Verify modification of advertisement input fields. | 3.1.4.8.2
Integration | Business Ad Campaign Module | x.4 | Database Interface | Verify interface with the database for data input/output. | 3.1.4.11.1
Unit | Ridership Trend Report | x.1 | Display Default Report | Verify display of the default ridership trend report. | 3.1.4.6.4
Unit | Ridership Trend Report | x.2 | Display Detailed Report | Verify display of a custom ridership trend report. | 3.1.4.6.4
Unit | Ridership Trend Report | x.3 | Request Custom Report | Verify input fields for a custom report query. | 3.1.4.6.2
Integration | Ridership Trend Report | x.4 | Decision Engine Interface | Verify interface with the Decision Engine for data retrieval. | 3.1.4.11.2
Unit | Train Data Report | x.1 | Display Default Report | Verify display of the default train data report. | 3.1.4.7.3
Unit | Train Data Report | x.2 | Display Detailed Report | Verify display of a custom train data report. | 3.1.4.7.3
Unit | Train Data Report | x.3 | Request Custom Report | Verify input fields for a custom report query. | 3.1.4.7.2
Integration | Train Data Report | x.4 | Decision Engine Interface | Verify interface with the Decision Engine for data retrieval. | 3.1.4.11.2
Unit | Graphical User Interface Framework | x.1 | Display Rider Modules | Verify display of modules on the Rider GUI. | 3.1.4.12.1
Unit | Graphical User Interface Framework | x.2 | Display Business Modules | Verify display of modules on the Business GUI. | 3.1.4.12.2
Unit | Graphical User Interface Framework | x.3 | Display HRT Modules | Verify display of modules on the HRT GUI. | 3.1.4.12.3
Integration | Graphical User Interface Framework | x.4 | Module Interface | Verify interface with all Web Application Engine modules. | 3.1.4.12

Table 1: Test Identification Table
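To make the tabulated objectives more concrete, the following sketch illustrates the kind of control-data check that could sit behind test cases 2.5 and 2.8 (the 15-day embark/disembark average and the ridership variance function). The record layout, function names, and expected values are illustrative assumptions, not the prototype's actual Decision Engine code.

    # Minimal sketch of a control-data check for test cases 2.5 and 2.8:
    # a 15-day embark/disembark average and the variance of a past event
    # against that average. All values here are illustrative control data.
    from statistics import mean

    # Hypothetical control data: (date, embark, disembark) counts for one stop.
    control_records = [("201211%02d" % day, 100 + day, 90 + day) for day in range(1, 16)]

    def fifteen_day_average(records):
        """Average embark and disembark counts over the supplied 15-day window (test case 2.5)."""
        return mean(r[1] for r in records), mean(r[2] for r in records)

    def event_variance(records, event_embark, event_disembark):
        """Difference between a past event's counts and the established averages (test case 2.8)."""
        avg_embark, avg_disembark = fifteen_day_average(records)
        return event_embark - avg_embark, event_disembark - avg_disembark

    # Expected values are computed by hand from the control data and compared
    # against the function output, mirroring the pass/fail criteria in Section 5.
    assert fifteen_day_average(control_records) == (108, 98)
    assert event_variance(control_records, event_embark=150, event_disembark=120) == (42, 22)
    print("Control-data checks passed")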
3.3. Test Schedule
The test will be conducted within thirty-five minutes, plus an additional ten minutes for
setup and a brief discussion of the product description and feasibility. The tests will be conducted
by component, starting with a display of the data-generating and data-housing components of the
prototype. The testing will then migrate to the functional displays to demonstrate the end-user
capabilities of the system. The breakdown of the schedule is contained in Table 2.
Start Time (minutes) | Duration (minutes) | Description | Test Cases Covered
0:00 | 10 | Feasibility |
0:10 | 5  | Database Demo | 1.0
0:15 | 10 | Algorithm Unit Tests | 2.0, 3.0
0:25 | 10 | Integration Tests (HRT, Business, Rider) | 4.1 – 4.7
0:35 | 10 | System Tests (HRT, Business, Rider) | 5.0 – 10
0:45 | 15 | Q&A |

Table 2: Test Schedule
3.4. Fault Reporting and Data Recording
Start data for the prototype and the associated tests will be pre-populated in the database.
Additional data required for test inputs will be provided by either the test engine or the actual
tester, based on the case requirement. Section 5 contains a table with the actual test cases, the
initialization conditions, and the expected results. Test successes will be marked within the
provided table, and any additional comments provided as necessary. In the event of a test failure,
a failure report will be filled out and returned to the development team for corrective action. The
tester will fill in the appropriate information as indicated by the brackets. The failure report is
shown in Table 3.
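For teams that also want to capture these reports electronically, the sketch below mirrors the Table 3 fields as a simple record. The class name, field names, and sample values are illustrative assumptions; the plan itself only requires the bracketed fields in Table 3 to be filled in by the tester.

    # Minimal sketch of an electronic failure-report record mirroring the Table 3 fields.
    # The dataclass and field names are illustrative only.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TestCaseFailureReport:
        test_category: str                      # e.g., "Unit"
        test_case: str                          # e.g., "1.1"
        case_name: str                          # e.g., "Formats"
        version: str                            # test case version
        description: str                        # test description
        purpose: str                            # test case purpose
        requirements_fulfilled: List[str] = field(default_factory=list)
        failure_conditions: str = ""            # conditions leading to the failed result
        activity: str = ""                      # test case activity
        expected_result: str = ""
        observed_result: str = ""
        severity: str = "Low"                   # Low / Med / High
        discovered_by: str = ""

    # Example usage with placeholder values; a completed report would be
    # returned to the development team for corrective action.
    report = TestCaseFailureReport(
        test_category="Unit",
        test_case="1.1",
        case_name="Formats",
        version="1.0",
        description="Database date/time format check",
        purpose="Verify YYYYMMDD and HHMMSS formats",
        requirements_fulfilled=["3.1.1.1", "3.1.3.1"],
        failure_conditions="Time column stored with ':' separators",
        expected_result="All entries match the required formats",
        observed_result="Three entries contain ':' separators",
        severity="Med",
        discovered_by="[Tester's Name]",
    )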
3.5. Specific Requirements
This test plan will require a few external factors for completion. The test will need to be
conducted on a personal computer with access to the Internet to demonstrate the system's
capabilities. Ideally this machine will reside on the Computer Science Domain at Old Dominion
University. The Internet browser version should be at least Internet Explorer 9.0 or Mozilla
Firefox 13.0. An additional computer needs to be available to perform change actions on the test
driver. No other hardware or software requirements exist.

Two people will be required to run the machines for the test. They should have a copy of
this document available for recording purposes. Test data will be predetermined or pre-populated,
but additional changes can be entered through the test driver to display additional functionality.
Current ITS – Test Case Failure Report

Test Category: [Category]                          Description: [Test Description]
Test Case: [Case #]    Case Name: [Case Name]    Version: [Case Ver #]
Purpose: [Test Case Purpose]
Requirements Fulfilled: [Req #]
Failure Conditions: [Enter conditions leading to failed result]

Test Case Activity       | Expected Result   | Observed Result
1. [Test Case Activity]  | [Expected Result] | [Actual or Observed Result]

Severity: [Low/Med/High]    Discovered By: [Tester's Name]

Table 3: Test Case Failure Report
3.6. Test Environment
The test will be conducted in the Gornto Teletechnet building at Old Dominion
University. The demonstration will be run from the instructor's desk so it can be displayed
on the wall-mounted monitors and recorded. The test driver machine can be placed at any
desk, as it will have access to the network through a wireless connection.
4. Test Responsibilities
Each member of the Red Team is responsible for conducting the tests on the portions of
the program they developed. Nate, Chris, and Dean will provide the oral presentation
accompanying the test. The list of team members and their responsibilities is provided in Table
4.
Team Member | Responsibility
Chris Coykendall | Web Application Modules
CJ Deaver | Web Application Modules
Brian Dunn | Web Application Modules & GUI Frameworks
Akeem Edwards | Test Harness Operator
Nathan Lutz | Decision Engine
Dean Maye | Database

Table 4: Test Responsibilities
5. Test Procedures
The full table of test procedures with instructions is provided in an additional document.
6. Traceability to Requirements
(This space intentionally left blank.)