2000 Systems Engineering Capstone Conference • University of Virginia
RADIONUCLIDE ANALYSIS MESSAGING ENVIRONMENT (RAME) FOR
VERIFICATION OF THE COMPREHENSIVE NUCLEAR TEST BAN TREATY
Student Team: John Green, Brian Hashemi, Robert Rauschenberg, Christopher Stacy
Faculty Advisors: Donald Brown and Stephanie Guerlain, Department of Systems Engineering
Client Advisor: Dr. L. Roger Mason, rmason@psrw.com
Pacific-Sierra Research Corporation
Arlington, VA
KEYWORDS: Human Computer Interface, Data
Warehousing, Audit Trail, Usability Testing, Treaty
Verification
ABSTRACT
Nuclear weapons have unparalleled destructive
potential. The Comprehensive Nuclear Test Ban
Treaty was developed to eliminate nuclear testing,
thereby limiting this lethal power. To effectively
administer this treaty, a verification system must detect
events and communicate information between
organizations. The RAME facilitates this process and
archives messages for future reconciliation of sample
analyses. The RAME consists of user interfaces,
data warehousing designs, and an audit trail. We
performed testing to verify the RAME design and its
usability.
INTRODUCTION
President Clinton’s signing of the Comprehensive
Nuclear Test Ban Treaty (CTBT) generated a need for
the treaty’s verification. Facilities worldwide must monitor
radioactivity to detect possible recent nuclear events
and treaty infractions. Pacific Sierra Research
Corporation (PSR) is tackling part of this formidable
task. PSR developed a verification system with
multiple aspects, including testing air samples for
radionuclide levels.
The testing process requires correspondence
between different organizations throughout the world.
In order to structure messages and facilitate
communications, PSR sponsored this systems
engineering capstone project. In the project, we
developed a Radionuclide Analysis Messaging
Environment (RAME) to standardize messages and
make communications more rigorous.
These improved communications provide
accountability and increase accuracy for treaty
verification. By tracking the messages, each person and
organization can be held accountable for aspects of the
analysis process. The RAME speeds communications
so that sample analysis can occur with as little sample
decay as possible.
In this paper, we provide an overview of
radionuclide testing and the objectives of the RAME.
We then describe the data warehousing, audit trail, and
user interface features of the RAME design. Finally, we
discuss some of the testing procedures conducted to
measure usability.
Treaty Verification
In order for nations to respect the conditions of
the treaty, the treaty must be verifiable. A network of
agencies must accurately detect violations and reliably
relay information. The United States Senate rejected
ratification of the CTBT because members questioned
treaty verification. In order for the CTBT to have clout,
a suitable verification process must be developed. PSR
is working on a nuclear test monitoring system, and
requested help developing a messaging environment for
their system.
Radionuclide Testing and Messaging with the
RAME
One important method to detect nuclear tests and
verify the treaty is monitoring radionuclide levels in the
atmosphere. This method involves air samples taken on
filter paper at field stations throughout the world.
Stations perform routine analysis of samples and
additional testing if questionable conditions exist. To
perform this additional testing, four types of
organizations must send messages throughout a
network. The field stations, labs, National Data
Centers, and International Data Center comprise the
Comprehensive Nuclear Test Ban Treaty Organization
(CTBTO). Messages throughout the CTBTO describe
the activities of samples, and organizations create
different types of messages for different functions. For
example, the International Data Center may send a
request for further analysis, and the field stations may
send a sample dispatch notification. Organizations
currently send messages over email, without regulation
or any way of tracking communications. These
messages must be standardized and archived to
maintain records of communications. The Radionuclide
Analysis Messaging Environment performs these
functions.
Treaty Verification Testing Procedures
The proposed system to accomplish radionuclide
testing consists of the following entities: 80 worldwide
field stations, 16 certified labs, National Data Centers
(NDC’s), and the International Data Center (IDC). The
field stations and certified labs comprise the
International Monitoring System (IMS). The Technical
Secretariat (TS) oversees the IMS, IDC, and other
organizations. The Global Communications
Infrastructure (GCI) connects these entities. Each
country may develop an NDC to monitor activity and
determine the country’s actions with regard to the
treaty.
Sampling starts at the field stations, which collect
air samples on filter paper for 24 hours at a time. Then
the sample is allowed to decay for 24 hours to eliminate
common cluttering radioactive materials. A field
station operator compacts the filter paper into a disk and
analyzes it with computer software, generating an email
message to send to the IDC. Each sample is assigned a
radioactivity level from 1 through 5. All of the
analysis and transitions of the sample are recorded in an
Oracle database. The IDC information is available to
NDC’s and other government organizations that may
request further analysis, depending on the level of
radioactivity.
If initial IDC analysis indicates possible recent
nuclear weapons testing, then the IDC, field stations,
certified labs, and NDC’s generate a series of messages.
The field station partitions the sample and sends parts to
multiple certified labs to conduct further analysis, and to the TS
to archive the sample. The IDC initiates message
communication with a request for lab availability. After
this message, a number of subsequent messages track
the sample through the testing procedure. Table 1
outlines the types of messages sent during sample
analysis.
Message creation follows a consistent pattern.
The process begins when a states party or other official
organization requests additional testing on a sample.
The IDC sends a Request Availability Message to a lab,
asking if the lab has the capabilities to perform analysis.
The lab responds with a Lab Availability Message. If
the response was affirmative, the IDC sends a Send
Sample Message to the field station that collected the
sample. This message tells the field station operators to
split the sample and send sections to the appropriate lab
for analysis and to the IDC for archiving. Once samples
are split, the field station sends a Dispatch Notification
Message to the receiving organizations, and the
receiving organizations send a Notification Receipt
Message to the IDC upon sample receipt. Once the lab
has received the sample, the IDC
can send a Further Analysis Message requesting certain
analysis. The lab responds with a Further Analysis
Response.
Table 1. Description of Message Types

Request Availability Message (RAM): The IDC asks a
lab if it has the capability to perform analysis.
Laboratory Availability Message (LAM): The lab
responds with a yes or no.
Send Sample Message (SSM): The IDC tells the field
station where to send the split samples.
Dispatch Notification Message (DNM): The field station
tells sample split recipients that the samples are on
their way.
Notification Receipt Message (NRM): The sample
recipients indicate receipt of the sample.
Further Analysis Message (FAM): The IDC asks the
labs to perform further analysis.
Further Analysis Response (FAR): The lab responds
with a yes or no.
Objectives
To rectify problems and inefficiencies that we
discovered during analysis of the current system, we
defined goals and objectives. Our goal was to improve
the treaty verification process, specifically the
radionuclide testing procedures, by accomplishing the
following objectives:
1) Reduce message variability. We improve system
integrity by standardizing message traffic throughout
the GCI. We create standard messages using a common
GUI provided at all sites, and limit the ability
of lab technicians to create their own formats. The
standard format facilitates message archiving and the
receiver’s understanding of the message, and ensures
inclusion of all pertinent information into the message.
2) Reduce the time required to send messages, using
standardized message formats. For example, the lab
workers can select a message type to send. Then a
message template appears with a standard format
including a set order of information.
3) Improve the security and validity of information
concerning samples. We accomplish this by restricting
message access and generation from some organization
types and creating the common interface.
4) Archive message traffic for future access and
analysis. The standard messages and interface create
information of a consistent format that can be stored in
existing Oracle databases.
These improvements will provide an efficient means of
communication between organizations during sample
testing, thereby facilitating the treaty verification
process.
System Architecture
PSR will handle the implementation of this
messaging system. Our job was to develop the message
forms and database schema to produce a messaging
environment to accomplish our objectives. The RAME
consists of web based templates that facilitate message
creation and viewing. Java applets will parse the web
forms into the existing email system for transfer. Our
relational database design will store the message data.
The RAME will exist on a system architecture as
in Figure 1. This diagram illustrates the physical
location of system components. The IDC contains the
server that stores the RAME web forms, code, the
Oracle database, the existing email system and parser.
The field stations, IDC, labs, and NDC’s have web
browsers. The web forms are the basis of our
messaging environment, as they display information in
standardized formats. When a user creates a message
through a form and sends it over the GCI in the existing
email system, the parser disassembles the message and
stores it in the database. The existing email system also
transfers the information to the web browser of the
message recipient, who will view the messages through
another web form. Users access audit trail information
from the database with the web browser. The audit trail
includes summary information about messages for a
particular sample. All information transfer occurs over
the GCI.
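The parsing step can be sketched as follows, assuming a simple “KEY: value” message body; the field names and layout here are our assumptions, not the delivered format:

```python
# Hypothetical sketch of the parser that disassembles a standardized
# RAME-style message into named fields for database storage.
# The "KEY: value" layout and the field names are our assumptions.
def parse_message(body):
    record = {}
    for line in body.strip().splitlines():
        key, _, value = line.partition(":")
        record[key.strip().upper()] = value.strip()
    return record

sample_body = """\
MESSAGE_TYPE: DNM
SAMPLE_ID: 42-A
SENDER: Field Station 07
RECIPIENT: IDC"""
```

Each parsed record then maps directly onto columns of the message tables in the database.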
MESSAGE TEMPLATES
Throughout this project we designed message
templates for each type of message that will be sent
across the system. We ascertained the contents of the
messages through interaction with our client and
analysis of current communications.

Figure 1. Physical location of communications
equipment and software

Since the development of this
messaging system is in the early phases, PSR could
account for any changes or additions that we felt
necessary. The message templates attempt to optimize
usability of the system. They are designed in a data
entry format as opposed to a generic text email. The
templates contain separated fields with radio buttons,
drop-down menus and free text fields to facilitate ease
of use. The separate fields of the templates incorporate
all the contents of the messages as outlined by PSR.
There are also separate templates for composing
messages and viewing messages. In addition, we
restricted the types of messages that the organizations
could send within the system. This restriction helps
maintain the integrity of the messaging system and
reduces the likelihood of improperly sent messages.
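The restriction of field values can be sketched as below; the template contents are our own illustration (the real templates follow the message contents outlined by PSR):

```python
# Sketch of a restricted-choice template, mirroring the radio buttons
# and drop-down menus of the web forms. The field names and lab
# identifiers are illustrative, not the delivered design.
TEMPLATE_LAM = {
    "available": {"yes", "no"},                          # radio button
    "lab_id": {f"LAB-{i:02d}" for i in range(1, 17)},    # drop-down (16 labs)
}

def validate(template, message):
    """Accept a message only if every field holds an allowed value."""
    return all(message.get(field) in allowed
               for field, allowed in template.items())
```

Rejecting out-of-range values at composition time is what keeps free text from leaking into the structured fields.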
In creating the message templates we took into
account several design issues. The first of these is the
diversity of users of the system. Specifically, we tried
to minimize the effects of varying English proficiency
levels amongst the users, minimal exposure to and
familiarity with other users of the system, and
differences in computer skills among the users. In
designing the message templates we also wanted to
ensure that we met the individual requirements of the
messages. In addition, we wanted to allow for enough
flexibility in our messages in case the system
requirements should change. And even though these
messages are standardized through these templates, we
wanted to allow for enough flexibility in each of the
messages so that pertinent information will not be left
out. Finally, we incorporated into the design the results
and recommendations from initial testing procedures on
the prototype.
DATA WAREHOUSING STRUCTURE
Within the context of this project, data
warehousing could effectively be defined as the process
of collecting, extracting, cleansing, organizing,
transforming, and archiving data for reference in some
form of decision support. While the interface and
reports are the frames for what is actually viewed by
system users, the data warehousing structure is what
allows those frames to be populated with the desired
information, and for the actual transactions to be
processed. The design for this structure differs from
most systems in a few ways. The most notable
distinction is that this design combines the online
transaction processing (OLTP) capabilities of a simple
transactional system with the maneuverability,
flexibility, and structure of a relational online analytical
processing application (ROLAP). Included in this
design are the object models for the database, the data
dictionary, the protocol for message transfer, and the
client/server system configuration.
Database Object Models
Within the data warehousing scheme, the database
serves as the back-end structure that contains all of the
data at a detailed level such that it can be referenced to
summarize information into the necessary forms and
reports on the web. In order to complete this aspect of
the design, a three-phase process of conceptualizing,
transitioning, and implementing was employed. This
made it possible to transform the enterprise-level
requirements of the system users into a plausible data
structure that could be developed into a functional
system. The design was then implemented in the
prototype as a relational database.
Conceptual Requirements
Initially, all of the information components of the
system were ascertained by conceptualizing all of the
potential correspondences that would be necessary
throughout the verification process. More specifically,
we conducted interviews with current system users at
varying levels, researched existing documentation
generated by SAIC and Pacific-Sierra Research,
and outlined a desired protocol for the new
messaging system. This helped ensure that the system
was accurately represented within the design by
incorporating several perspectives. The completion of
this conceptualization phase resulted in the finalized
message system requirements. These conceptual
requirements should not be construed as concrete data
requirements. They are simply metadata, or “data about
data” pertaining to the detection process that would be
represented in the system.
Transition Requirements
After these requirements were outlined and
refined, they were transitioned into actual data
requirements for elements that would be included in the
system. This transitioning process included breaking
down the enterprise requirements into information
needs, documenting what data elements actually
comprised the information needs, and prioritizing these
requirements. Through the transitioning process, it was
possible to develop the first few components of the
database object model: the class diagram and the
relational model. In these models, concepts,
abstractions, multiplicities, and entities that have
meaning within the system are represented as classes,
each containing attributes and behaviors. The
association lines between them also represent the
relationship of each class to others.
Implementation
Recognizing the constantly evolving needs of the
system and its users, this structure is designed to be
inherently flexible. During the implementation
phase, the object models were instituted into a
functional prototype of the system. Though this
prototype was completed in Microsoft Access for
timeliness and simplicity, the actual database will be
implemented in Oracle. Through the formats specified
in the design, “dummy” data were entered in a
normalized form to allow referential integrity to be
enforced. The prototype also afforded an opportunity to
evaluate and test the effectiveness and compatibility of
the system components’ design with the other aspects of
the system, such as the user interface. It also allowed
testing of the designed message protocol within the
dynamic model that exhibits the temporal behavior of
the system.
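A minimal sketch of this idea, using SQLite in place of Access and Oracle and table names of our own choosing, shows how normalized “dummy” data and enforced referential integrity work together:

```python
import sqlite3

# Hypothetical fragment of the relational model; the table and column
# names are our own, and SQLite stands in for the Access prototype
# and the eventual Oracle implementation.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE event (event_id INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE detection (
    detection_id INTEGER PRIMARY KEY,
    event_id INTEGER NOT NULL REFERENCES event(event_id))""")
conn.execute("""CREATE TABLE message (
    message_id INTEGER PRIMARY KEY,
    detection_id INTEGER NOT NULL REFERENCES detection(detection_id),
    message_type TEXT CHECK (message_type IN
        ('RAM','LAM','SSM','DNM','NRM','FAM','FAR')))""")

# "Dummy" data entered in normalized form
conn.execute("INSERT INTO event VALUES (1)")
conn.execute("INSERT INTO detection VALUES (10, 1)")
conn.execute("INSERT INTO message VALUES (100, 10, 'RAM')")
```

An insert that references a missing detection number, or carries an unknown message type, raises an integrity error rather than silently corrupting the archive.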
Client/Server Infrastructure
There is an obvious duality between the
objectives and the protocol for this system. Though
simplified to offer a “black-box” view of the system, the
protocol is inherently complex when considering the
potential language barriers, the relatively high employee
turnover rate, and the time-dependent nature of the
transactions. In order to effectively design this system,
it was necessary to scope a satisfactory client/server
configuration.
This client/server approach to the data
warehousing structure allows us to tailor the computer
infrastructure to meet the users’ needs. More
specifically, designing the architecture in this manner
allows users (clients) to share service resources
(servers), while running related, but independent
processes on independent computers. This also allows
asymmetric protocols within the system since clients
always initiate service from an otherwise passive server
by exchanging messages or commands. The physical
location of the clients and servers is irrelevant within
this structure, and scalability becomes a minimal
concern as additional clients or server enhancements
can be made with little impact on the overall system.
The integrity of the information is also much easier to
maintain from a central server. Finally, resolving the duality of
protocol and objectives, this configuration allows the
developers to completely encapsulate the underlying
services provided to the end-users, abstracting the
protocol complexity and allowing changes to the system
to be made transparent to end-users.
In this structure, the end-users interact with the
message environment through web browser applications
to conduct transactions. These transactions are
transferred through the Global Communications
Infrastructure (GCI) through Middleware applications
into the server where requests are processed.
At a high level, Figure 1 illustrates how the system
will perform. The end-users receive only an abstracted
“black-box” view of the system through an Object-Oriented
User Interface, which presents the system
through familiar enterprise-level objects, alleviating
potential confusion. This interface is generated through
code existing on the server for dynamic link libraries
and other interpretive code at run-time. Transactions
that are conducted are actually groups of Structured
Query Language (SQL) code that are passed through the
system. From the users’ Personal Computers,
transactions are sent to a middleware application with
an interpreter that translates the commands into
standard e-mails and transmits them. Middleware is in
essence the link between client- and server-side
applications; however, the details of the middleware
involved in this system are exogenous to the data
warehousing design.
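The translation step can be sketched as follows, using Python's standard email library; the record fields and the addresses are illustrative assumptions, not the actual middleware:

```python
from email.message import EmailMessage

# Illustrative sketch of the middleware interpreter that flattens a
# web-form transaction into a standard e-mail for transfer over the
# GCI. All field names and addresses here are our assumptions.
def transaction_to_email(record, sender, recipient):
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = record.get("MESSAGE_TYPE", "")
    msg.set_content("\n".join(f"{k}: {v}" for k, v in record.items()))
    return msg

email_msg = transaction_to_email(
    {"MESSAGE_TYPE": "SSM", "SAMPLE_ID": "42-A"},
    "idc@example.org", "station07@example.org")
```

Because the body is generated mechanically from the form fields, the receiving parser can reverse the process deterministically.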
At the next stage, the data are received by
transaction process monitors (TP Monitors) and object-request
brokers (ORB’s) before being transferred to the
central database. The server within the system should
act as a hybrid of typical transactional and object
servers. TP Monitors enable the server to manage high
volumes of clients and processes, using multiple servers
if necessary, and are especially useful if the intended
volume of users increases. The ORB’s help conduct
object related transactions through a distributed data
environment, and are of particular importance to create
a synergy of the ROLAP and OLTP features of the
system. There is also a parsing mechanism inherent to
the ORB that aids in the archiving process of data
elements. This system will rely on a server-intensive
process in order to mitigate the complexity perceived
by the users. This process should
minimize network interchanges by creating more
abstract levels of service. This is possible through
remote procedure calls and should be easy to manage
and deploy over the GCI, since most of the code would
run on the servers.
AUDIT TRAIL
The audit trail is a summary report of the analysis
process built from the information contained in the
central database. It displays the current status of the
analysis, a summary of the messages pertaining to a
particular sample, and a record of the handling of each
sample partition.
This information will be used by various groups
concerned with the radionuclide verification process.
For instance, members of the IDC will employ the audit
trail in supervising the analysis process by using it to
track the current location of each sample and to identify
any phases of the analysis that need expediting. In
addition, national officials from participating countries
will use the audit trail to ensure that the analyses are
being completed in a satisfactory manner. The audit
trail will also be used to assign responsibility for the
well-being of samples should they be lost, stolen, or
tainted, thus increasing the security of the system. In
this way, the audit trail serves to increase the overall
effectiveness of and the level of confidence in the
radionuclide analysis process.
The audit trail is organized according to the
event hierarchy present in the database design. Under
each event are listed the corresponding detection
numbers; under each detection number, the
corresponding bar codes. The status of each entity is
also given, to allow users to easily determine the current
analysis situation.
Following the status report is the analysis
summary. This contains a record of the message traffic
concerning the particular sample, including the date,
message type, sender, and recipient of each message.
Interspersed with the
message summary is text detailing the handling of the
sample partitions, including where each partition was
sent, when it was sent, and when it
was received. This provides an easy way to track the
movement of each sample.
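The grouping behind this report can be sketched as a simple rollup; the record layout is our own illustration of the event/detection hierarchy:

```python
from collections import defaultdict

# Illustrative rollup of message records into the audit-trail
# hierarchy (event -> detection -> message summary lines).
# The record fields are our assumptions, not the delivered schema.
def audit_trail(messages):
    trail = defaultdict(lambda: defaultdict(list))
    for m in messages:
        trail[m["event"]][m["detection"]].append(
            (m["date"], m["type"], m["sender"], m["recipient"]))
    return trail
```

The expandable web view then simply renders one level of this nesting at a time.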
The audit trail will be accessed from the IDC web
page over the GCI. Two different methods of access
will be allowed. First, users will be able to search for
any event or detection number they may be interested
in. Second, users will have the ability to browse
through all completed or current analyses. Once the
audit trail is selected, an initial view appears displaying
only events, detection numbers, and their statuses. This
view is expandable by clicking on the desired detection
numbers. These methods of access provide the easiest
way of presenting the appropriate information to the
user.
This design performed well in preliminary
usability testing. Testers were able to locate and
understand the information they were looking at in the
vast majority of cases. Suggestions included providing
separate sections for the message and handling
summaries, so as not to break up the flow of the
message stream and make the audit trail easier to view.
These suggestions will be included in the final design of
the system.
TESTING
System testing is important because it provides a
means for comparing the RAME with existing
communications methods. We developed the test plans
after researching software testing techniques.
Testing Objectives
Through testing, we hoped to accomplish the
following objectives:
1) Verify the system requirements with PSR. Make
sure that the requirements meet the client’s
expectations.
2) Verify that the system design meets the
requirements. Make sure that our user interface and
database design fulfill the requirements.
3) Detect any errors in the design. Test for differences
between design intentions and actual prototype behavior.
Examine data structure and database population errors to
test for differences between what was entered in the test
scenarios and what is stored in the database. Also
examine interface and screen sequences to identify any
unexpected sequences of forms accessed through reply
functions.
4) Optimize the usability of the design. Along with
fulfilling the requirements, we want to make sure that
our system has a high level of usability by the test users,
where usability is defined in terms of effectiveness,
learnability, flexibility, and attitude. We want the
system to be fast and error resistant, minimize training
times, and minimize user discomfort and frustration.
5) Gain usability statistics for comparison with the
existing system. We want to automatically track time
spent per form, total times of each test scenario, and
behavioral statistics. The RAME tests generate data
that allow us to evaluate our success in achieving
usability goals, to compare our system to existing
methods, and to provide evidence for why RAME is better
than the current system.
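The timing statistics might be reduced as in the sketch below; the log format of (form name, seconds) pairs is our illustration of the automated data collection table:

```python
# Illustrative reduction of the automated timing log into per-form
# mean times and a scenario total. The (form, seconds) log format
# is an assumption, not the actual collection table.
def timing_stats(log):
    per_form = {}
    for form, secs in log:
        per_form.setdefault(form, []).append(secs)
    means = {form: sum(v) / len(v) for form, v in per_form.items()}
    total = sum(secs for _, secs in log)
    return means, total
```

Comparable totals collected from users of the existing e-mail process would give the baseline for the comparison.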
Testing Methodologies
To accomplish the testing objectives, we
planned to use the following testing techniques:
1) informal reviews – meetings with our client to
discuss requirements
2) heuristic evaluation – feedback about our system
from a usability expert
3) simulation – test scenarios run on a prototype with
test users
4) behavioral measurements – tests conducted with user
reactions to the system
For each of these methodologies we developed a
test plan. We included the information necessary to
prepare for the test, execute the test, and evaluate the
test. We determined the objectives that the tests will
accomplish, the participants and their responsibilities,
any preparatory materials, the test activities, expected
results, post test analysis, and passing criteria.
Informal Reviews
The informal reviews occurred with our client to
verify system requirements. We discussed the proposed
requirements to verify the system according to the
flowchart of a sample’s messages. This includes all of
the messages from the IDC’s initial Request
Availability Message to the final Further Analysis
Response message. After review, we revised the
requirements to include changes recommended by PSR.
Heuristic Evaluation
We organized heuristic evaluation to optimize the
usability of the design. During this test, a usability
expert (Systems Engineering Professor Stephanie
Guerlain from the University of Virginia) stepped
through the prototype screens and scenarios to provide
feedback about usability. She made design suggestions
to improve usability. We examined the suggestions and
included them in an update of the design and prototype.
Simulations
The simulations were conducted to verify that the
system design meets the requirements, detect any errors
in the design (e.g. data structure, database population,
interface, and sequencing errors), and gain usability
statistics. We prepared a prototype of the RAME
system with the underlying data structure and data entry
forms. We included an automated data collection table
for performance statistics. We developed test scenarios
that mimic the actual flow of messages during
additional sample testing, and a test user briefing about
the overall system and the users’ particular roles in the
test.
The simulations uncovered design weaknesses.
For instance, the order of information displayed on
certain message templates caused the test users to
ignore data entry tools and focus on free text. This
limits archiving effectiveness and understandability
upon message receipt. We updated the design to fix
these flaws discovered during the simulation tests.
Behavioral Measurements
The behavioral measurements testing was done to
optimize the usability of the design. Test users voiced
concerns during the simulations, which we videotaped;
these recordings supplied the data for the behavioral
tests. Before the simulations, we gave
the test users a pre-test questionnaire concerning their
computer experience and other factors that might affect
their performance during the simulation. We also gave
the test users a post-test questionnaire that we
developed to gather their opinions about the system.
This questionnaire concerns ease of use and opinions
about RAME features. Test users indicated some
confusion about the system and were not comfortable
communicating with the RAME. We hope to alleviate
some of these concerns with subsequent RAME designs.
CONCLUSIONS
The RAME system was designed to facilitate
message communications among CTBTO members.
The environment provides message creation and
viewing templates specific to sample testing
communications. It was built to reduce message
variability, reduce the time required to send messages,
improve the security and validity of information
concerning samples, and archive message traffic for
future access and analysis. These objectives can be
accomplished with a RAME design that fulfills our
requirements and optimizes usability.
Testing demonstrated that our current design
fulfills requirements but does not optimize usability.
Further design revamping and usability testing is
necessary to produce the best possible RAME. We
must use the test results to improve our system before
final implementation. The deployment of the actual
system will be performed within a two-year time
frame, despite some uncertainty about the future of the
CTBT.
Although we must continue to improve the design,
our RAME system is already an improvement on
existing communications. We created a more rigorous
message exchange and demonstrated performance
analysis tools. The RAME is an important aid to the
success of treaty verification.
REFERENCES
Biegalski, Kendra F., et al. Formats and Protocols for
Messages. Science Applications International
Corporation International Data Center
Documentation, order number SAIC-99/3004
(PSR-99/TN1141), document IDC3.4.1Rev1.
March, 1999.
Biegalski, Steven, Director of Radionuclide Operations
at the Center for Monitoring Research (CMR).
Personal Interview. 12 Oct. 1999b.
Blaha, M., and William Premerlani. Object-Oriented
Modeling and Design for Database Applications.
Upper Saddle River, NJ: Prentice Hall, 1998.
Bohlin, Jane, PSR Employee. Personal Interview. 4
Feb. 2000.
Gibson, John E. How To Do a Systems Analysis and
Systems Analyst Decalog. Ivy, VA. July 1991.
Hetzel, Bill. The Complete Guide to Software Testing.
2nd ed. Wellesley, Massachusetts: QED
Information Sciences, Inc., 1988.
Hosticka, Bouvard, Operator of Charlottesville field
station. Personal Interview. 1 Oct. 1999.
Mason, Roger, Program Manager & Senior Scientist at
PSR. Personal Interview. 24 Sept. 1999.
“Message Type Additional Sample Analysis (ASA)
overview” handout from George Novosel, 12
Oct. 1999.
Nielsen, Jakob. Usability Engineering. San Francisco,
CA: Academic Press, 1993.
Novosel, George, Software Development Manager at
PSR. Personal Interview. 12 Oct. 1999.
Novosel, George, Software Development Manager at
PSR. Personal Interview. 4 Feb. 2000.
Orfali, R., et al. The Essential Client/Server Survival
Guide. 2nd ed. John Wiley &amp; Sons, Inc., 1996.
Palme, Jacob. Databases in Computer Messaging
Systems. Amsterdam and New York: North-Holland,
1998.
Preece, Jenny. “Supporting User Testing in
Human-Computer Interaction Design.” Lecture Notes
in Computer Science: New Results and New Trends in
Computer Science. Proceedings of the Symposium
“New Results and New Trends in Computer Science,”
20-21 June 1991. Ed. Hermann Maurer. Graz,
Austria: Springer-Verlag, 1991. 256-67.
Welbrock, Peter. Strategic Data Warehousing Principles
Using SAS Software. Cary, NC: SAS Institute
Inc., 1998.
BIOGRAPHIES
John T. Green is a fourth-year systems engineering
student from Virginia Beach, VA. He will be working
with Goldman Sachs & Co. at the end of this summer.
Brian K. Hashemi is a fourth-year systems engineering
student from Clifton, VA. He has accepted a position
with Providian Financial Corporation and will start
there at the end of June.
Robert D. Rauschenberg is a fourth-year systems
engineering student from Atlanta, GA. He will be
defending our country as an officer in the United States
Navy, in Pensacola, FL.
Christopher L. Stacy is a fourth-year systems
engineering student from Fairport, NY. He will be
working as a Systems Analyst with Deloitte Consulting
at the end of this summer.