LAB 3 – Surface Water Detection System Prototype Test Plan/Procedures

Running head: LAB 3 – SWDS Prototype Test Plan/Procedures
CS 411W Lab III
Prototype Test Plan/Procedure
For SWDS
Prepared by: SWDS Green Team
Date: 4/11/11
TABLE OF CONTENTS
1 OBJECTIVES
2 REFERENCES
3 TEST PLAN
  3.1 Testing Approach
  3.2 Identification of Tests
  3.3 Test Schedule
  3.4 Fault Reporting and Data Recording
  3.5 Resource Requirements
  3.6 Test Environment
  3.7 Test Responsibilities
4 TEST PROCEDURES
  4.1 Test Case Names and Identifiers
5 TRACEABILITY TO REQUIREMENTS

LIST OF FIGURES
Figure 1. E&CS Conference Room Layout

LIST OF TABLES
Table 1. Test Identification Table
Table 2. SWDS Prototype Test Schedule
Table 3. Traceability Matrix
1 OBJECTIVES (Cassandra Rothrauff)
The Surface Water Detection System (SWDS) was conceived by the Old Dominion University
(ODU) CS410 Green Group. The SWDS is a network of above-ground ultrasonic sensors able to
detect rising water levels in areas prone to flooding. The product provides physical signs to warn
drivers of dangerous roadways and a centralized data center for easy access by user-based
applications. With this system, drivers can be helped to avoid the vehicle damage and personal
injury that can result from proceeding through inundated portions of the road.
The first objective is to design the SWDS prototype. Its service is intended to demonstrate
the feasibility of using sensors that measure the distance between themselves and a target
surface. To demonstrate the use of the sensors, the prototype includes a public web application:
Bing Maps is used to display a graphical representation of water levels, and users can set up
custom RSS feeds to alert them to dangerous water levels on their routes.
The second objective is to demonstrate the SWDS service using a virtual hardware sign that
does not require any client software. The network system will communicate the measurement,
threshold, and sensor ID from each remote sensor to a centralized server. A monitor screen will
display the status of the remote sensors, and the measurements will be recorded in the database.
The purpose of this test plan and procedure is to establish the overall approach, testing
sequence, and specific tests to be conducted in order to show that the operational prototype has
achieved these objectives. The tests contained in this plan aim to show that the prototype has
met the given requirements, so that customers are able to implement solutions which utilize the
collected data. The main idea is to create an above-ground ultrasonic sensor network that can
transmit data not only to the web application, but also to physical road signs.
2 References (Katherine Kenyon)
• Lab I – SWDS Product Description.
• Lab II – Prototype Product Specification for SWDS.
3 Test Plan (Katherine Kenyon)
The following sections of the test plan will discuss the types of tests to be performed, the
testing schedule, reporting procedures, resource requirements, the testing environment, and team
member responsibilities.
3.1 Testing Approach (Katherine Kenyon)
Performance of the SWDS prototype will be verified through a combination of component
and system tests. The major functional components of the prototype are shown in Figure 1.
The system tests will exercise multiple components concurrently, obviating the need to test
every component independently. The four test categories therefore do not
correspond to the major components. The first category will cover the functionality of the SWA,
AWA, and PWS websites. These tests will demonstrate that the system allows a user to gain
access and manipulate data within the centralized database. The second testing category will
measure the system capacity and performance. These automated tests will ensure that the system
meets the performance requirements related to volume and speed. The third category will
examine the exception handling abilities of the prototype. Component requirements for updating
the database, updating graphical displays or views, and the filtering logic will all be examined by
these tests. The final category covers the authentication of each level of user control or schema
in the SWDS database.
All tests will be conducted in a controlled classroom environment using the ODU network
and hardware. Verification of data integrity and performance on the server will be tested and
reviewed through observation and monitoring of back-end logs and reports. This will ensure that
information is correctly being transmitted through the system. Performance tests will produce
reports which summarize the actual results and compare them to the requirement benchmarks.
3.2 Identification of Tests (Eric Boyd)
This section provides a listing and high-level description of the tests to be performed, and the
objectives of those tests.
Category 1 – Sensor Functionality
  1.1  Device Configuration and Data Retrieval – To verify that the sensor can be configured
       by an ODAD and local data can be copied to the ODAD.
  1.2  Data Retrieval and Storage – To verify that the CSRD can retrieve sensor data and
       store it locally.
  1.3  Filtering Logic and Data Validation – To verify that the CSRD can discard erroneous
       measurements.
  1.4  Roadside Warning Sign Control – To verify that a warning sign can be controlled by
       measurements exceeding a threshold.
  1.5  Lost Network Connectivity – To verify that an NSRD will revert to CSRD operations
       if network connectivity is lost.
  1.6  Checking for a Connection – To verify that a CSRD/NSRD regularly checks for
       network connectivity.
  1.7  Resuming Normal Functionality – To verify that an NSRD will resume NSRD
       operations after a time of lost network connectivity.
  1.8  Sending Data to the Data Acquisition Host (DAH) – To verify that an NSRD sends its
       sensor measurements over the network to the DAH.

Category 2 – Sensor Configuration and Maintenance
  2.1  Onsite Data Acquisition Device – To verify that an ODAD can physically connect to a
       CSRD and access its configuration program.
  2.2  Configuring a CSRD Locally – To verify that the CSRD can be configured via the
       ODAD.
  2.3  Downloading CSRD Local Data – To verify that the CSRD's local data can be copied
       from the ODAD.

Category 3 – Network Interaction
  3.1  Data Acquisition Host (DAH) – To verify that the NSRD and DAH interact over the
       network correctly.

Category 4 – Web Application and User Interface
  4.1  AWA NSRD Display – To verify that the AWA displays the real-time sensor
       measurements of the NSRDs on the network.
  4.2  AWA NSRD Configuration – To verify that the AWA allows for configuration of any
       NSRD on the network.
  4.3  AWA Historical Data Querying – To verify that the AWA allows for querying the
       centralized database for historical sensor data via some UI.
  4.4  Public Web Server News Feed – To verify that the PWS News Feed is generating the
       news feed automatically and correctly.
  4.5  Public Web Server Bing Maps – To verify that the basic Bing Maps functionality is
       working correctly.
  4.6  Public Web Server Custom Alerts – To verify that a user's custom alerts selection is
       being saved to the database.

Table 1. Test Identification Table
3.3 Test Schedule (Jill Mostoller)
Our team has been allotted 60 minutes to set up and demonstrate the SWDS prototype. The
setup of our prototype will be done in the initial five minutes of our time slot, followed by a
10-minute introduction of the Surface Water Detection System. After the setup and introduction,
we will test the prototype according to the testing schedule in Table 2.
Start Time  Duration   Test Event                Test            Dependencies  Comments
(H:MM)      (Minutes)                            Objectives
0:15        20         Sensor Functionality      1.1, 1.2, 1.3,  None          None
                                                 1.4, 1.5, 1.6,
                                                 1.7, 1.8
0:35        10         Sensor Configuration      2.1, 2.2, 2.3   None          None
                       and Maintenance
0:45        5          Network Interaction       3.1             None          None
0:50        10         Web Application and       4.1, 4.2,       None          None
                       User Interface            4.3, 4.4

Table 2. SWDS Prototype Test Schedule
3.4 Fault Reporting and Data Recording (Chris Meier)
The source code running the SWDS prototype includes filtering logic that checks all incoming
data against a boundary offset. Should incoming data prove erroneous by falling outside the
boundary, the filtering logic blocks the write to the database and flags the data as unacceptable.
It also holds the last known “good” value for a specified number of cycles while waiting for
readings to return to normal; the “bad” data is omitted from the database record. After that set
of cycles has completed, the filtering logic releases the block and resumes evaluating incoming
data. No test results will be recorded, but a group member will evaluate the values for
correctness.
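A minimal sketch of filtering logic of this kind, written in C#, is shown below; the class and
member names are invented for illustration and are not taken from the SWDS source. A null
return stands in for the blocked database write, so the caller would simply skip recording that
reading.

    // Illustrative boundary-offset filter for incoming sensor data.
    public class BoundaryFilter
    {
        private readonly double _lowerBound;
        private readonly double _upperBound;
        private readonly int _holdCycles;   // cycles to hold before re-evaluating
        private int _badCycles;

        // Last known "good" value, held while erroneous data is being blocked.
        public double LastGoodValue { get; private set; }

        public BoundaryFilter(double lowerBound, double upperBound, int holdCycles)
        {
            _lowerBound = lowerBound;
            _upperBound = upperBound;
            _holdCycles = holdCycles;
        }

        // Returns the value to record, or null to block the database write.
        public double? Evaluate(double reading)
        {
            if (reading >= _lowerBound && reading <= _upperBound)
            {
                _badCycles = 0;
                LastGoodValue = reading;
                return reading;              // valid: allow the write
            }
            _badCycles++;                    // flagged unacceptable; omit from record
            if (_badCycles >= _holdCycles)
                _badCycles = 0;              // set of cycles completed; release block
            return null;
        }
    }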
3.5 Resource Requirements (Chris Meier)
The SWDS prototype demonstration will consist of the AWA website, the DAH website, and the
PWS, each of which will require Microsoft SQL Server on a PC server. A database table must be
preloaded with at least two simulated sensors, and one actual sensor must be connected to the
database via the eBox. Each sensor will update a locally hosted Bing map. Each simulated sensor
will receive its data from a text file, and the actual sensor prototype will transmit its readings on
the fly.
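As a sketch of how a simulated sensor might replay its readings, the C# snippet below reads one
numeric measurement per line from a text file; the file name and the one-second pacing are
placeholders rather than details taken from the actual prototype.

    // Replays simulated sensor readings from a text file, one per line.
    using System;
    using System.IO;
    using System.Threading;

    class SimulatedSensor
    {
        static void Main()
        {
            foreach (string line in File.ReadLines("sensor1_data.txt"))
            {
                double measurement;
                if (double.TryParse(line, out measurement))
                {
                    Console.WriteLine("Simulated reading: " + measurement);
                    Thread.Sleep(1000);   // pace the replay like a live sensor
                }
            }
        }
    }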
A group member will initiate contact with the PWS from a PC in the presentation room and look
up a Bing map. Another group member will either initiate a read-in of data from file or manually
manipulate the data in order to illustrate prototype functionality. Also at this time, a group
member will adjust the water depth in the basin so that the actual sensor prototype's
functionality can be demonstrated as well. The PC must have a USB port, a network connection,
an installed web browser, and Microsoft Visual Studio with the SWDS development
environment. The development environment will consist of VisualHG-1.1.3, NUnit-2.5.9,
TortoiseHG-2.0.2, and Microsoft SQL Server 2008 R2 Express Edition with tools. Additionally,
the PC will require the eBox software in order to communicate with the device.
3.6 Test Environment (Robert Dayton)
The test environment for the SWDS prototype will be located in the Computer Science
Conference Room on the third floor of the E&CS Building at Old Dominion University.
Figure 1. E&CS Conference Room Layout
Setup for the test environment will include a number of steps:
1. Connect the remote device hardware together and ensure proper operation. These
components include the sensor and the microcontroller.
2. Turn on and connect the development PC to the remote device infrastructure via Ethernet
cable and test for connectivity.
3. Mount the remote device sensor to the water tank and ensure proper measurements are
being received.
Once the preceding test environment has been built, the following software should be prepared
for use:
1. Localized Microsoft SQL Database
2. Administrative Web Application (AWA)
3. Applications prototyped for the Public Web Server (PWS)
4. Data Injection Simulator (DIS)
3.7 Test Responsibilities (Robert Dayton)
Cassie Rothrauff will be responsible for controlling the PowerPoint presentation, which will
facilitate the demonstration by keeping it on track and organized. Two team members, Eric
Boyd and Marissa Hornbrook, will be in charge of the main presentation duties. Associated
tasks will include introducing each topic and facilitating the overall structure of the
demonstration. Jill Mostoller will introduce the tentative schedule, highlighting the main topics
and test cases to be discussed during the presentation. The overall testing approach and iterative
analysis plan will be reviewed by Katherine Kenyon. In establishing prototype component
specialists, Eric Boyd will be in charge of answering questions pertaining to the functionality of
the AWA. The filtering algorithm specifications will be covered by Chris Meier. Last but not
least, Robert Dayton will be responsible for setting up the hardware-based components of the
testing environment, maintaining safe operation of the demonstration hardware throughout the
course of the presentation, and responding to hardware-based questions from the panel with
invigorating, generalized explanations.
4 Test Procedures (Collaborative)
Each test case has been assigned a unique name and reference number. The test cases
reference the requirements detailed in Lab II, and correspond to them by number. The
requirements being tested are identified along with the necessary initialization steps, inputs,
procedures, and expected results, and are labeled according to the team member who wrote the
case. Each test case may cover one or more requirements from Lab II, but no requirement is
covered by more than one test case. Our goal is to perform comprehensive testing in the most
efficient manner possible, covering all requirements with as few test cases as possible.
4.1 Test Case Names and Identifiers
• Sensor Functionality
Test Case 1.1 – Device Configuration and Data Retrieval (Robert Dayton)
Specification References: 3.1.1
Test Level: Component
Test Type: Functional
Test Description: This test will verify that the CSRD will allow for self-configuration by an
Onsite Data Acquisition Device (ODAD) as well as facilitating the retrieval of stored data by
the ODAD.
Initialization:
1. Verify that the sensor is plugged into the microcontroller and that the microcontroller is
plugged into the eBox.
2. Turn on the microcontroller.
3. Turn on the eBox.
4. Establish connectivity to the Data Acquisition Host (DAH).
Test Inputs:
1. Configuration script parameters
2. DAC commands
Test Procedure:
1. Run the eBox DAC executable.
2. Generate a configuration script from the DAH.
3. Execute configuration setup and data retrieval commands from the DAH.
Expected Test Results:
1. Once a configuration script has been generated, the DAC will retrieve and store it
accordingly.
2. After executing the data retrieval command, the DAH will catalog and store the data
received from the DAC.
Special Instructions: None
Test Case 1.2 – Data Retrieval and Storage (Robert Dayton)
Specification References: 3.1.1.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that the Closed System Remote Device (CSRD) can
retrieve usable data from the sensor and store it accordingly.
Initialization:
1. Verify that the sensor is plugged into the microcontroller and that the microcontroller is
plugged into the eBox.
2. Turn on the microcontroller.
3. Turn on the eBox.
Test Inputs: None
Test Procedure: Run the eBox Data Acquisition Client (DAC) executable
Expected Test Results:
1. eBox DAC will open a specified COM port to establish microcontroller connectivity.
2. Microcontroller will execute the preprogrammed DAC returning sensor data to the eBox
DAC.
3. eBox DAC will store genuine sensor data after appropriate filtering procedures have been
executed.
Special Instructions: None
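The expected results hinge on the DAC opening a specified COM port. A minimal sketch of that
step, assuming the eBox DAC uses the .NET SerialPort class (the port name and baud rate are
placeholders, not values from the prototype), might look like:

    // Open a specified COM port, read one line of sensor data, and close.
    using System;
    using System.IO.Ports;

    class DacPortSketch
    {
        static void Main()
        {
            using (SerialPort port = new SerialPort("COM1", 9600))
            {
                port.Open();                    // establish microcontroller connectivity
                string raw = port.ReadLine();   // microcontroller returns sensor data
                Console.WriteLine("Raw sensor data: " + raw);
            }
        }
    }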
Test Case 1.3 – CSRD Filtering Logic and Data Validation (Chris Meier)
Specification References: 3.1.1 The CSRD will be preprogrammed with filtering logic
capable of discarding erroneous data.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that the CSRD module accepts and responds to
incoming sensor data appropriately.
Initialization:
1. DAH database table set to accept incoming data
2. AWA database table checked for connectivity
Test Inputs:
1. User Input
2. Text File
3. Sensor Acquired Data
Test Procedure: Initiate read in of sensor data and monitor PWS response and AWA
statistics in the database.
Expected Test Results:
1. Incoming data will be filtered according to whether it falls within the acceptable range
(Pass/Fail).
2. Valid data is recorded and invalid data is discarded.
Special Instructions: None
Test Case 1.4 – Roadside Warning Sign Control (Robert Dayton)
Specification References: 3.1.1.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that the CSRD will have the capability of controlling
a roadside warning sign when the applicable sensor data is received.
Initialization:
1. Verify that the sensor is plugged into the microcontroller and that the microcontroller is
plugged into the eBox.
2. Turn on the microcontroller.
3. Turn on the eBox.
Test Inputs: None
Test Procedure:
1. Run the eBox DAC executable.
2. Create a flooding situation.
3. Return the environment to a normal state.
Expected Test Results:
1. eBox DAC will open a specified COM port to establish microcontroller connectivity.
2. Microcontroller will execute the preprogrammed DAC returning sensor data to the eBox
DAC.
3. Once the filtering algorithm has returned genuine sensor data indicating a flooding
situation, the DAC will turn on the warning sign flag visible from a DAC console
window.
4. Once the filtering algorithm has returned genuine sensor data indicating that a flooding
situation has subsided, the DAC will turn off the warning sign flag.
Special Instructions: None
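A sketch of the warning-sign flag behavior these results describe is shown below; the test plan
only specifies that genuine readings crossing the threshold toggle a console-visible flag, so the
names and the threshold comparison here are illustrative.

    // Toggles the console-visible warning sign flag as filtered sensor
    // readings cross the configured flooding threshold.
    using System;

    class WarningSignSketch
    {
        private static bool _signOn;

        public static void OnFilteredReading(double measurement, double threshold)
        {
            if (measurement > threshold && !_signOn)
            {
                _signOn = true;
                Console.WriteLine("WARNING SIGN: ON");    // flooding detected
            }
            else if (measurement <= threshold && _signOn)
            {
                _signOn = false;
                Console.WriteLine("WARNING SIGN: OFF");   // flooding subsided
            }
        }
    }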
Test Case 1.5 - Lost Network Connectivity (Jill Mostoller)
Specification Requirements: 3.1.3.1 The NSRD shall revert to CSRD operations if network
connectivity is lost.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that once network connectivity is lost, the NSRD will
resume functionality as a CSRD and continue to operate and store data locally.
Initialization:
NSRD checked for connectivity to the DAH.
Test Inputs:
1. User input
2. Text file data
3. Data generated by the physical and simulated sensors.
Test Procedure: While reading in data from the sensor, user or text file, drop the network
connection to the DAH.
Expected Results:
The NSRD should continue to read in data and store the statistical information locally.
Special Instructions: None
Test Case 1.6 – Checking For a Connection (Jill Mostoller)
Specification Requirements: 3.1.3.2 The NSRD checks for network connectivity at
regularly timed intervals.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that once network connectivity is lost, the NSRD will
try to reconnect with the DAH at regularly timed intervals.
Initialization:
NSRD checked for connectivity to the DAH.
Test Inputs:
1. User input
2. Text file data
3. Data generated by the physical and simulated sensors.
Test Procedure:
While reading in data from the sensor, user or text file, drop the network connection to the
DAH.
Expected Results: The NSRD should try to reconnect with the DAH once every minute. Each
attempt to reconnect will be logged.
Special Instructions: None
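A sketch of the retry loop this test exercises, assuming the once-per-minute interval stated in
the expected results; TryConnectToDah is a hypothetical stand-in for the prototype's real
connection check, rigged here to succeed on the third attempt so the example terminates.

    // Retry the DAH connection once per minute, logging every attempt.
    using System;
    using System.Threading;

    class ReconnectSketch
    {
        static int _attempts;

        // Hypothetical stand-in: succeeds on the third attempt.
        static bool TryConnectToDah() { return ++_attempts >= 3; }

        static void Main()
        {
            bool connected = false;
            while (!connected)
            {
                connected = TryConnectToDah();
                Console.WriteLine(DateTime.Now + ": reconnect attempt, success=" + connected);
                if (!connected)
                    Thread.Sleep(TimeSpan.FromMinutes(1));   // regularly timed interval
            }
        }
    }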
Test Case 1.7 – Resuming Normal Functionality (Jill Mostoller)
Specification Requirements: 3.1.3.3 The NSRD resumes NSRD operations if, after a time
of lost network connectivity, the NSRD reestablishes network connectivity.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that once network connectivity is reestablished, the
NSRD will resume normal NSRD operations, sending its data to the DAH rather than storing it
locally.
Initialization:
NSRD is not connected to the DAH through the network.
Test Inputs:
1. User input
2. Text file data
3. Data generated by the physical and simulated sensors.
Test Procedure:
Begin reading in data while the NSRD is not networked. Reestablish the connection between
the NSRD and DAH and continue to read in sensor measurement data.
Expected Results:
The NSRD should switch from storing the data locally to sending it to the DAH to be
processed, and all data stored locally while the network was down should be forwarded to the
DAH for processing.
Special Instructions: None
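The expected behavior amounts to a store-and-forward queue. The sketch below illustrates the
idea; the queue and the SendToDah routine are invented for illustration and are not taken from
the SWDS source.

    // Readings queue locally while the network is down (CSRD mode) and are
    // flushed to the DAH, oldest first, once connectivity is reestablished.
    using System;
    using System.Collections.Generic;

    class StoreAndForwardSketch
    {
        private static readonly Queue<double> LocalStore = new Queue<double>();

        static void SendToDah(double reading)
        {
            Console.WriteLine("Sent to DAH: " + reading);   // network send placeholder
        }

        public static void HandleReading(double reading, bool networkUp)
        {
            if (!networkUp)
            {
                LocalStore.Enqueue(reading);        // CSRD mode: store locally
                return;
            }
            while (LocalStore.Count > 0)
                SendToDah(LocalStore.Dequeue());    // forward the backlog first
            SendToDah(reading);                     // resume normal NSRD operation
        }
    }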
Test Case 1.8 – Sending Data to the DAH (Jill Mostoller)
Specification Requirements: 3.1.3.4 Once the NSRD has established network connectivity, it
sends its sensor measurement data over the network to the Data Acquisition Host.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that the NSRD will send its measurement data to the
DAH for further processing.
Initialization:
NSRD is connected to the DAH through the network.
Test Inputs:
1. User input
2. Text file data
3. Data generated by the physical and simulated sensors.
Test Procedure:
Read in data from the various inputs to be sent over the network to the DAH.
Expected Results:
The NSRD should forward the data it obtains to the DAH where it can be logged.
Special Instructions: None
• Sensor Configuration and Maintenance
Test Case 2.1 - Onsite Data Acquisition Device (ODAD) (Marissa Hornbrook)
Specification References: 3.1.2.1 - An onsite operator is able to connect to the CSRD via a
physical link such as an Ethernet or serial cable, providing access to the CSRD’s configuration
program over some network protocol such as Telnet or HTTP.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that the ODAD can connect to the CSRD by some
physical link.
Initialization:
1. The CSRD must have a physical port by which to connect to the ODAD.
2. The ODAD must have a physical port by which to connect to the CSRD.
3. A physical link such as Ethernet or Serial cable must be determined.
4. A protocol such as Telnet or HTTP must be determined.
Test Inputs:
1. User Input
Test Procedure:
1. Connect the ODAD to the CSRD.
2. Initiate a transmission of data.
3. Affirm transmission.
Expected Test Results:
1. A physical link is established and data can be transmitted from the CSRD to the ODAD.
Special Instructions: None
Test Case 2.2 - Configuring a CSRD Locally (Marissa Hornbrook)
Specification References: 3.1.2.2 - An onsite operator, once connected to the CSRD, can
download sensor measurement data from the CSRD’s local storage to free device space.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that sensor measurements can be downloaded from
the CSRD to the ODAD.
Initialization:
1. The CSRD must have a physical connection to the ODAD.
Test Inputs:
1. Sensor measurement data
Test Procedure:
1. Connect the ODAD to the CSRD.
2. Initiate a transmission of sensor data from the CSRD to the ODAD.
3. Affirm receipt of data being stored in free device space.
Expected Test Results:
1. Sensor measurement data from the CSRD is now stored on the ODAD.
Special Instructions: None
Test Case 2.3 - Downloading CSRD Local Data (Marissa Hornbrook)
Specification References: 3.1.2.3 - An onsite operator, once connected to the CSRD, can
configure the CSRD via the CSRD’s configuration program.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that an operator can set configuration details on the
CSRD via the ODAD.
Initialization:
1. The CSRD must have a physical connection to the ODAD.
Test Inputs: Configuration File
Test Procedure:
1. Connect the ODAD to the CSRD.
2. Initiate transmission of configuration file to be stored on the CSRD that will set details of
system functionality.
3. Confirm that the configuration changes were made on the CSRD.
Expected Test Results:
1. The CSRD configuration settings were changed via the file passed in from the ODAD.
Special Instructions: None
• Network Interaction
Test Case 3.1 - Data Acquisition Host (DAH) (Katherine Kenyon)
Specification References: 3.1.4.1 The DAH receives data from NSRDs on the network.
3.1.4.2 The DAH logs data received from the NSRDs to a centralized database.
Test Level: Component
Test Type: Functional
Test Description: This test will verify that the DAH interacts with the NSRD correctly.
Initialization:
1. The NSRD has test data ready to send to the DAH
2. The centralized database is set-up and accessible to the DAH
Test Inputs: None
Test Procedure:
1. Issue command to NSRD to send test data.
2. Monitor reception of data at the DAH.
3. Check that the data was correctly logged into the centralized database.
Expected Test Results:
1. Transmission of data from NSRD to DAH will return a successful transmission message
(Pass/Fail)
2. Processing of data at the DAH will return a successful processing message. (Pass/Fail)
3. Recording of the data to the centralized database will return a success message.
(Pass/Fail)
Special Instructions: None
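A sketch of the logging step verified here, using the Microsoft SQL Server environment
described in Section 3.5; the connection string, table, and column names are invented for
illustration.

    // Logs one NSRD measurement into the centralized database.
    using System;
    using System.Data.SqlClient;

    class DahLoggerSketch
    {
        // Placeholder connection string; the real one would point at the
        // SQL Server 2008 R2 Express instance from Section 3.5.
        private const string ConnectionString =
            @"Data Source=.\SQLEXPRESS;Initial Catalog=SWDS;Integrated Security=True";

        public static void LogMeasurement(int sensorId, double measurement, DateTime readAt)
        {
            using (SqlConnection conn = new SqlConnection(ConnectionString))
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO SensorReadings (SensorId, Measurement, ReadAt) " +
                "VALUES (@id, @value, @at)", conn))
            {
                cmd.Parameters.AddWithValue("@id", sensorId);
                cmd.Parameters.AddWithValue("@value", measurement);
                cmd.Parameters.AddWithValue("@at", readAt);
                conn.Open();
                cmd.ExecuteNonQuery();   // success here maps to the Pass results above
            }
        }
    }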
• Web Application and User Interface
Test Case 4.1 - AWA NSRD Display (Eric Boyd)
Description: The AWA will demonstrate its ability to correctly display the current sensor
measurements from all sensors on the network. The main purpose of this feature is to provide
a real-time status from each sensor to an authorized user on the AWA.
Test Initialization:
Hardware and Software Configuration:
1. The AWA software package is deployed under Windows Internet Information Services
(IIS).
2. The SWDS centralized database is deployed on Microsoft SQL Server 2005/2008.
Preset Hardware Conditions:
1. Hardware is on and running.
2. No actual networking of hardware components is necessary. The necessary components
are software, and can be demonstrated on a single machine.
Preset Software Conditions:
1. The AWA is running in “test-mode”.
2. The SWDS centralized database is running and is populated with data. This data can
come from actual sensors or be generated by the AWA itself running in test-mode.
3. A test-user is configured in the root Web.config file of the AWA software package.
Test Procedure:
1. Open a web browser.
2. Navigate to the URL of the locally running AWA.
3. Login using the proper credentials (stored in the root Web.config).
4. Use the navigation menu to navigate to “Status”.
5. Verify that the sensor data is presented with randomly filled measurements, displaying
the proper identification number and the nomenclature/address of each sensor.
6. Verify that the data is updated automatically every ten seconds.
Expected Results: Because the sensor measurement data is being generated randomly, the
display should only serve to correctly display the sensor’s identifying information, and their
respective measurements.
Test Case 4.2 - AWA NSRD Configuration (Eric Boyd)
Description: The AWA will demonstrate its ability to allow an authorized user of the AWA
to remotely configure any one of the sensors on the (simulated) network. The following data
is configurable for any sensor on the network: an identification number,
nomenclature/address, longitude/latitude, Offset, Threshold, Increment, and IP address.
Test Initialization:
Hardware and Software Configuration:
1. The AWA software package is deployed under Windows Internet Information Services
(IIS).
2. The SWDS centralized database is deployed on Microsoft SQL Server 2005/2008.
Preset Hardware Conditions:
1. Hardware is on and running.
2. No actual networking of hardware components is necessary. The necessary components
are software, and can be demonstrated on a single machine.
Preset Software Conditions:
1. The AWA is running in “test-mode”.
2. The SWDS centralized database is running and is populated with data. This data can
come from actual sensors or be generated by the AWA itself running in test-mode.
3. A test-user is configured in the root Web.config file of the AWA software package.
Test Procedure:
1. Open a web browser.
2. Navigate to the URL of the locally running AWA.
3. Login using the proper credentials (stored in the root Web.config).
4. Use the navigation menu to navigate to “Configuration”.
5. Verify that a table displays each sensor and its corresponding configuration: an
identification number, nomenclature/address, latitude/longitude, Offset, Threshold,
Increment, and IP address.
6. Click on any one of the rows corresponding to an individual sensor.
7. On the new page that loads, edit any combination of the sensor configuration.
8. Save changes.
9. Return home by choosing “Home” on the navigation menu.
10. Re-navigate to “Configuration” and verify that the changes to the sensor’s configuration
in Step 7 have persisted.
Expected Results: Once changes have been saved for the sensor’s configuration, those
changes are reflected in the SWDS centralized database. To verify these changes have been
saved in the database, the tester should navigate to the sensor configuration page and verify
that the changes have persisted.
Test Case 4.3 - AWA Historical Data Querying (Eric Boyd)
Description: The AWA will demonstrate that it provides a sufficient (graphical) interface for
querying the SWDS centralized database for historical sensor measurement data. Queries
allow any combination of sensor’s to be queried over any specified period of time (within the
lifetime of the sensor’s installation). The output of the historical data should be in table
format; an alternate graph-format should also be available to the authorized AWA user. The
graph is a simple plot-style graph that shows the history of the sensor over the queried timeperiod.
Test Initialization:
Hardware and Software Configuration:
1. The AWA software package is deployed under Windows Internet Information Services
(IIS).
2. The SWDS centralized database is deployed on Microsoft SQL Server 2005/2008.
Preset Hardware Conditions:
1. Hardware is on and running.
2. No actual networking of hardware components is necessary. The necessary components
are software, and can be demonstrated on a single machine.
Preset Software Conditions:
1. The AWA is running in “test-mode”.
2. The SWDS centralized database is running and is populated with data. This data can
come from actual sensors or be generated by the AWA itself running in test-mode.
3. A test-user is configured in the root Web.config file of the AWA software package.
Test Procedure:
1. Open a web browser.
2. Navigate to the URL of the locally running AWA.
3. Login using the proper credentials (stored in the root Web.config).
4. Use the navigation menu to navigate to “Queries”.
5. Choose any combination of sensors, a date/time range, and which sensor configuration
data to be included in the table output.
6. Click “Run Query”.
7. Once the table output of the query is displayed, verify the correct information is being
presented.
8. Click “Show Graph”.
9. Verify that the graph displayed matches the table output.
Expected Results: When a historical query is executed against the SWDS centralized
database, the correct SQL query is generated at run-time, and properly returns the set of
requested data. The output of any query in “test-mode” will be based on randomly generated
data. The database can be explicitly queried through Microsoft SQL Server Management
Studio to verify the correct results. This test demonstrates that historical data is able to be
pulled from the SWDS centralized database, configured, and displayed both in table and
graph format.
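As an illustration of the run-time query generation described above, the sketch below builds a
parameterized history lookup for a chosen set of sensors over a date/time range; the table and
column names are invented. The returned text would then be wrapped in a SqlCommand with
@s0..., @start, and @end bound as parameters.

    // Builds a parameterized SQL query for historical sensor data.
    using System.Collections.Generic;
    using System.Linq;

    static class HistoryQuerySketch
    {
        public static string BuildHistoryQuery(IList<int> sensorIds)
        {
            // One named parameter per selected sensor (@s0, @s1, ...).
            string placeholders = string.Join(", ",
                sensorIds.Select((id, i) => "@s" + i).ToArray());
            return "SELECT SensorId, Measurement, ReadAt " +
                   "FROM SensorReadings " +
                   "WHERE SensorId IN (" + placeholders + ") " +
                   "AND ReadAt BETWEEN @start AND @end " +
                   "ORDER BY ReadAt";
        }
    }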
Test Case 4.4 - Public Web Server (PWS) News Feed (Cassandra Rothrauff)
Specification References: 3.1.6.1
Test Level: System
Test Type: Functional
Test Description: The PWS provides a news feed section on the homepage that informs users
of any current inundations in the jurisdiction.
Test Inputs: Sensor data from the database.
Test Procedure:
1. Load website.
2. Verify that the newsfeed is displaying the most recent alerts properly.
Expected Test Results:
1. The news feed displays the most recent alerts (Pass/Fail)
Special Instructions: None.
Test Case 4.5 – PWS Bing Maps (Cassandra Rothrauff)
Specification References: 3.1.6.2
Test Level: System
Test Type: Functional
Test Description: The PWS provides an interactive Bing Maps section where users can view
the real-time status of the NSRDs in the jurisdiction.
Test Inputs: Sensor data from the database.
Test Procedure:
1. Load the website.
2. Navigate to the section containing the map provided by Bing Maps.
3. Verify that the sensors are being displayed and being updated in real-time.
Expected Results:
1. The sensors display on the map (Pass/Fail)
2. Each sensor displays a number that indicates the water level (Pass/Fail)
Special Instructions: None.
Test Case 4.6 – PWS Personalized Alerts (Cassandra Rothrauff)
Specification References: 3.1.6.3
Test Level: System
Test Type: Functional
Test Description: The PWS provides a graphical user interface for allowing users to pick
which NSRDs in the jurisdiction shall be included in their own personal alert.
Test Inputs: Sensor data from the database.
Test Procedure:
1. Load website.
2. Login using public user credentials (stored in Web.config).
3. Navigate to the section “Custom Alerts”.
4. Choose any combination of the NSRDs to include in the personalized alert, and click
“Save”.
5. Navigate away from the page or log out, then return to the “Custom Alerts” section and
verify that the changes were saved.
Expected Results: The changes to the selected NSRDs are saved.
Special Instructions: None.
5 Traceability to Requirements (Marissa Hornbrook)
Requirement references from Lab II are used in the following table to demonstrate that
each requirement is covered by at least one test case. Each requirement is listed alongside the
test case (or range of test cases) that addresses it. As mentioned previously, each test case may
cover one or more requirements from Lab II, but no requirement is covered by more than one
test case.
SYSTEM  REQ. ID            COVERED BY TEST CASE(S)
CSRD    3.1.1.1 – 3.1.1.6  1.1 – 1.4 (Sensor Functionality)
ODAD    3.1.2.1            2.1
        3.1.2.2            2.2
        3.1.2.3            2.3
NSRD    3.1.3.1            1.5
        3.1.3.2            1.6
        3.1.3.3            1.7
        3.1.3.4            1.8
DAH     3.1.4.1            3.1
        3.1.4.2            3.1
AWA     3.1.5.1 – 3.1.5.4  4.1 – 4.3 (Web App./GUI)
PWS     3.1.6.1            4.4
        3.1.6.2            4.5
        3.1.6.3            4.6

Table 3. Traceability Matrix