CS 411W Lab III
Prototype Test Plan/Procedure
For
LASI
Prepared by: Dustin Patrick, Red Group
Date: 04/17/2013
Table of Contents

1. Objectives
2. References
3. Test Plan
   3.1. Testing Approach
   3.2. Identification of Tests
   3.3. Test Schedule
   3.4. Fault Reporting and Data Recording
   3.5. Resource Requirements
   3.6. Test Environment
   3.7. Test Responsibilities
4. Test Procedures
   4.1. Test Case Names and Identifiers
5. Traceability to Requirements

List of Figures

Figure 1. Major functional component diagram

List of Tables

Table 1. LASI prototype test cases by category
Table 2. LASI prototype test schedule
Table 3. Traceability matrix for the LASI prototype
1. Objectives
The purpose of this Test Plan and Procedure is to establish the overall approach, testing
sequence, and specific tests to be conducted to demonstrate successful operation of the LASI
prototype. LASI provides the user with an intuitive way to scan documents for important words
and determine the themes of those documents. The plan and procedures described in this document are
designed to verify stable, safe operational performance of the LASI prototype.
2. References
Patrick, Dustin. (2013). Lab I – LASI Product Description. Norfolk, VA: Author.
Patrick, Dustin. (2013). Lab II – Prototype Product Specification for LASI. Norfolk, VA:
Author.
3. Test Plan
The following sections of the test plan will discuss the types of tests to be performed, the
testing schedule, reporting procedures, resource requirements, the testing environment, and team
member responsibilities. These sections also outline the prototype presentation, which will
demonstrate our test cases.
3.1. Testing Approach
The performance of the LASI prototype will be verified through a series of test cases that
will be outlined below. The LASI prototype major functional component diagram is shown in
Figure 1 below.
Figure 1. Major functional component diagram (major components: Graphic User Interface, File
Management, and Algorithm, including the Part-of-Speech Tagger, Tagged File Parser, Word &
Phrase Binder, and Results Aggregator)
The prototype demonstration will include our system tests, component tests, and
integration tests. These tests will validate each of the three main sections of the prototype shown
in the major functional component diagram. The UI tests will walk through the user interface and
provide visual evidence that each of the requirements has been met. The file management tests
will also be performed through the GUI; they will visually demonstrate that the prototype
interacts properly with the file system when selecting documents and creating the appropriate
files in the appropriate locations. Many of the algorithm tests can likewise be performed through
the GUI using the displayed results of the result aggregation and binding algorithms. However,
some test cases will require a more in-depth command-line view of the data to demonstrate
functionality, because the way the results are displayed in the GUI does not provide a means of
confirming that the expected test case results were met.
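As an example of the command-line view described above, the following is a minimal sketch of
the kind of console dump that could be used to confirm an algorithm-level result the GUI does
not expose. The Word type and its members are assumptions made for illustration only; they do
not reflect the actual LASI class names.

    // Hypothetical sketch (not the actual LASI code): dump intermediate weighting
    // data to the console so a tester can check expected values from a test case.
    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class Word
    {
        public string Text { get; set; }
        public double DocumentWeight { get; set; }  // weight within a single document
        public double ProjectWeight { get; set; }   // weight across all project documents
    }

    public static class WeightDump
    {
        // Print each word's per-document and project-wide weights, highest first.
        public static void Print(IEnumerable<Word> words)
        {
            foreach (var w in words.OrderByDescending(x => x.ProjectWeight))
                Console.WriteLine($"{w.Text,-20} document: {w.DocumentWeight,8:F2}   project: {w.ProjectWeight,8:F2}");
        }
    }

    public static class Program
    {
        public static void Main()
        {
            WeightDump.Print(new[]
            {
                new Word { Text = "analysis", DocumentWeight = 3.0, ProjectWeight = 7.5 },
                new Word { Text = "theme",    DocumentWeight = 2.0, ProjectWeight = 4.0 }
            });
        }
    }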
3.2. Identification of Tests
Table 1. LASI prototype test cases by category

Category 1: User Interface
  1.1   New Project: Verify that the user has the ability to create a new project.
  1.2   Load Project: Verify that the user has the ability to load a project.
  1.3   File Type: Verify that DOC, DOCX, and TXT are the only file types that can be added to the project.
  1.4   Document Limit: Only 5 documents may be added to a single project.
  1.5   Document Queue: Documents added to the project will be displayed in a document queue, and documents can be removed from the document queue.
  1.6   Field Verification: Verify that all fields must be completed to create a new project.
  1.7   Document Display: Verify that when a project is created the user is navigated to the loaded project page and the documents are displayed.
  1.8   Adding Documents: Verify that the user can add documents to the project once it is created. The project is still limited to 5 documents at any time.
  1.9   Removing Documents: Verify that the user can remove documents from the project once it is created. The project is still limited to 1 document to start analysis.
  1.10  Canceling Analysis: Verify that the Processing Screen displays a visual indication of progress and that the user has the ability to cancel analysis.
  1.11  Results Screen: Verify that the Results Screen displays the results in multiple views.
  1.12  Top Results View: Verify that the top results view provides a visual summary of the analysis in both the individual and collective document scopes.
  1.13  Top Results Graph: Verify that the top results view provides the user with the ability to change the graphical view of the top results.
  1.14  Word Relationships View: Verify that the Word Relationships View displays visuals to show all relationships and bindings of words.
  1.15  Word Count & Weighting View: Verify that the Word Count and Weighting view presents the quantitative data used in the analysis.
  1.16  Export Results: Verify that the user is able to export results.

Category 2: File Manager
  2.1   Input File Path Verification: The File Manager shall only accept existent files in supported formats.
  2.2   Conversion of .DOC to .DOCX: The File Manager must be able to convert a DOC file to DOCX.
  2.3   Conversion of .DOCX to .TXT: The File Manager must be able to convert a DOCX file to a TXT file.
  2.4   Converting .TXT to .TAGGED: The File Manager shall invoke the SharpNLP tagger to process each TXT file.
  2.5   Backups to Project Directory: The File Manager shall provide functionality to back up the entire project directory.

Category 3: Tagged File Parser
  3.1   Accepted Formats: The Tagged File Parser shall only attempt to parse the contents of a .TAGGED file.
  3.2   Tagged File Parser at Word Level: The Tagged File Parser shall only attempt to parse the contents of a .TAGGED file.
  3.3   Tagged File Parser at Phrase Level: The Tagged File Parser shall only attempt to parse the contents of a .TAGGED file.

Category 4: Word Association
  4.1   Subject Binder Test: The Subject Binder determines which noun phrases are the subjects of verb phrases.
  4.2   Object Binder Test: The Object Binder determines which noun phrases are the direct objects and indirect objects of verb phrases.
  4.3   Object Binder Fail Test: The Object Binder test makes sure that sentences not containing a direct or indirect object are not bound as having one.
  4.4   Thesaurus Test: The Thesaurus correctly identifies synonyms.
  4.5   Adverb Binder Test: The Adverb Binder binds adverb phrases to the adjective/verb modified.

Category 5: Weighting
  5.1   Word Weight - Instantiation: A test for the initial equivalent weights of each Word.
  5.2   Phrase Weight - Instantiation: A test for the initial equivalent weights of each Phrase.
  5.3   Word Weight - Duplicates: A test for weighting multiple instances of the same Word.
  5.4   Synonyms: A test for weighting Synonyms of the same Word.
  5.5   Word Associations: A test for incrementing the weight of a Word for each association.
  5.6   Word weight increase on phrase association: The weight of a word shall be increased when a phrase is associated with it.
  5.7   Phrase weight increase on word association: The weight of a phrase shall be increased when a word is associated with it.
  5.8   Phrase weight increase on phrase association: The weight of a phrase shall be increased when another phrase is associated with it.
  5.9   Common word weight exclusion: The weights of commonly used words such as "the", "a", "to", etc. shall be excluded.
  5.10  Distance as weight modifier: The distance between words and phrases shall be used as a weight modifier.
  5.11  Word weight for individual document: Each word shall store its weight with respect to the document containing it.
  5.12  Word weight for project: Each word shall store its weight with respect to all the documents.
  5.13  Phrase weight for individual document: Each phrase shall store its weight with respect to the individual document containing it.
  5.14  Phrase weight for project: Each phrase shall store its weight with respect to all the documents.
  5.15  Aggregate weight: There shall be an aggregate weight computed across all documents.
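To make the weighting test cases more concrete, the sketch below shows how cases such as 5.1
and 5.3 might be expressed as unit tests runnable from Visual Studio. The Weigh helper and its
simple occurrence-counting rule are stand-ins invented for illustration; the actual LASI weighting
classes and rules may differ.

    // Hypothetical sketch of weighting test cases 5.1 (initial equivalent weights)
    // and 5.3 (duplicate instances increase weight), written as MSTest unit tests.
    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class WeightingTests
    {
        // Stand-in weighting rule: each distinct word's weight equals the number
        // of times it occurs in the document.
        private static Dictionary<string, double> Weigh(IEnumerable<string> words) =>
            words.GroupBy(w => w.ToLowerInvariant())
                 .ToDictionary(g => g.Key, g => (double)g.Count());

        [TestMethod] // Test case 5.1: all words begin with the same initial weight.
        public void WordWeight_Instantiation_AllWordsStartEqual()
        {
            var weights = Weigh(new[] { "analysis", "document", "theme" });
            Assert.IsTrue(weights.Values.All(w => w == weights.Values.First()));
        }

        [TestMethod] // Test case 5.3: duplicate instances of a word raise its weight.
        public void WordWeight_Duplicates_RepeatedWordWeighsMore()
        {
            var weights = Weigh(new[] { "theme", "theme", "document" });
            Assert.IsTrue(weights["theme"] > weights["document"]);
        }
    }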
3.3. Test Schedule
The LASI prototype presentation has been allotted a time frame of one hour. There will
be an approximately five-minute setup process to get the prototype to a position in which it can
be demonstrated. A ten-minute introduction will follow, and then the actual prototype
demonstration will begin. Table 2 below outlines the prototype presentation.
Table 2. LASI prototype test schedule

Start Time    Duration    Test Objective        Test Event        Dependencies   Comments
(hours:min)   (minutes)
0:15          10          User Interface        1.1, 1.3-1.16     None           None
0:25          5           File Manager          2.1-2.5           None           None
0:30          5           Tagged File Parser    3.1-3.3           None           None
0:35          15          Word Association      4.1-4.5           None           None
0:50          10          Weighting             5.1-5.15          None           None
3.4. Fault Reporting and Data Recording
Because of the binary nature of each of these tests, and the sheer number of tests to
present, there will be a single group member assigned to recording the results of our tests, and
noting whether they passed or failed. User interface test results will be validated by observing
the result of the test. Deeper functionality tests will be validated by displaying output via
command line. If a test fails or an exception is thrown during the presentation, a detailed log
will be kept recording the test in question, what failed, and why.
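As an illustration of the kind of record the designated group member might keep, the sketch
below shows a minimal pass/fail log. The type and member names are assumptions chosen for
this example and are not part of the LASI prototype.

    // Hypothetical sketch of a simple pass/fail record for the demonstration;
    // field names are illustrative assumptions only.
    using System;
    using System.Collections.Generic;

    record TestResult(string TestCaseId, bool Passed, string Notes, DateTime RecordedAt);

    class FaultLog
    {
        private readonly List<TestResult> _results = new();

        // Record the outcome of a test case; failed tests should note what
        // failed and why, per section 3.4.
        public void Record(string testCaseId, bool passed, string notes = "") =>
            _results.Add(new TestResult(testCaseId, passed, notes, DateTime.Now));

        public IEnumerable<TestResult> Failures() =>
            _results.FindAll(r => !r.Passed);
    }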
3.5. Resource Requirements
The LASI prototype will be demonstrated on a single laptop running the application. The
laptop will contain 8GB of RAM, an Intel Core i5 Processor, and a 64 bit Windows 7
installation. Because we will be displaying more than just the GUI, the laptop used in
demonstration will also need to have Visual Studio installed. There will be five pre-selected
documents that will be parsed in the demonstration to test different aspects of the algorithm. At least
one of the documents will need to be manually parsed by hand before the presentation.
3.6. Test Environment
The demonstration will occur in the first floor presentation room of Dragas. The laptop
described in the previous section will be connected to the projector in that room so that everyone
viewing the presentation will be able to see the tests being performed. One person will be behind
the laptop administering the tests. The rest of the group will be seated at the table; one by one,
in test case order, each member of the group will describe the test cases that member wrote and
how to run them. The person behind the laptop will then run each test in a way that
demonstrates whether the test was passed or failed.
3.7. Test Responsibilities
During the presentation, each member of the group who is present will be responsible for
handling the tests that fall under that member's roles and responsibilities. Three group members
will take on additional responsibilities. One group member, in addition to handling his or her test
cases, will introduce the presentation. One group member will operate the laptop, essentially
driving the presentation. One group member will record the results and faults of the tests. These
responsibilities will be assigned in our final presentation preparation on Saturday morning.
4. Test Procedures
Detailed test procedures have been prepared to ensure that the test cases can be
performed in an effective and efficient manner.
4.1. Test Case Names and Identifiers
Test case table has been printed separately.
5. Traceability Matrix
Traceability matrix has been printed separately.