Preliminary System Requirements Document
Virtual Reality Stimulator
Document reference: TEC-MMG/2006/73b
Date: 28 September 2006
Issue # 1.1
TABLE OF CONTENTS
1. INTRODUCTION
2. APPLICABLE AND REFERENCE DOCUMENTS
   2.1 APPLICABLE DOCUMENTS
   2.2 REFERENCE DOCUMENTS
3. DEFINITIONS AND CONVENTIONS
4. REQUIREMENTS
   4.1 VRG REQUIREMENTS
      4.1.1 Functions
      4.1.2 Generation of visual and aural stimuli
      4.1.3 Generation of complex environments with navigational capabilities
      4.1.4 Interface requirements
   4.2 VRG-H REQUIREMENTS
5. REQUIREMENTS FOR PRELIMINARY FLIGHT DESIGN
   5.1 FUNCTIONAL AND DESIGN REQUIREMENTS
   5.2 ACCOMMODATION, SIZE, WEIGHT AND LOGISTICS ALLOCATION
6. ADDITIONAL REQUIREMENTS FOR THE BREADBOARD
   6.1 BREADBOARD
   6.2 SAFETY REQUIREMENTS
   6.3 TESTING AND SCIENTIFIC ASSESSMENT
1. INTRODUCTION
This document defines the requirements applicable to the Virtual Reality Stimulator activity.
This document is Applicable Document 1 to the Virtual Reality Stimulator statement of work (ref. TEC-MMG/2006/73, Issue 1.1, dated 28 September 2006).
2. APPLICABLE AND REFERENCE DOCUMENTS
2.1 APPLICABLE DOCUMENTS
AD# 1: IEC 60601 and collateral standards as applicable.
2.2 REFERENCE DOCUMENTS
RD# 1: EPM SMIRD. EPM-OHB-RQ-0001-3A, 25/10/2002
RD# 2: MEEMM user’s manual (TBC)
RD# 3: PORTEEM user’s manual (TBC)
RD# 4: Short arm human centrifuge user’s manual and interface document (will be provided at KOM)
3. DEFINITIONS AND CONVENTIONS
This section is identical to section 3 of the SOW and is copied here for convenience.
• VRG is the acronym for the Virtual Reality Stimulator, including both the VRG-s and VRG-h.
• VRG-s means Virtual Reality Generator application software. VRG-s will allow scientists to design, test, modify and validate virtual reality-based experiments. The scenarios created by the VRG-s will then be run on an adequate platform (computer, display system, tracking and input devices) with the necessary support software and libraries.
• VRG-h is the acronym for Virtual Reality Generator hardware, i.e. the computer, display system, tracking and input devices. VRG-h will also ensure the interfacing with the various recording systems, as required.
The sections and paragraphs of the present document describe controlled requirements and specifications; therefore the following verbs are used in the specific sense indicated below:
• Shall: indicates a mandatory requirement;
• Should: indicates a preferred alternative but is not mandatory;
• Could, can: indicate a possibility;
• May: indicates an option;
• Might: indicates a suggestion;
• Will: indicates a statement of intention or fact.
4. REQUIREMENTS
4.1 VRG REQUIREMENTS
4.1.1 Functions:
4.1.1.1
VRG shall allow generating stimuli and measuring subject responses in order to assess subject (work)load, levels of stress, reaction time, discrimination capabilities¹ and the like. This feature of VRG is referred to hereinafter as “Generation of visual and aural stimuli”. Detailed requirements are in paragraph 4.1.2.
Footnote 1: Discrimination capability is defined as the ability of the subject to recognize whether a stimulus matches the one declared as pertinent by the experimenter.
4.1.1.2
VRG shall also contribute to the investigation of neuro-sensory conflicts, e.g. by assessing
subject perception of orientation and ability to move (navigate) in a given environment (e.g. a
building, a labyrinth …). This feature of VRG is referred to hereinafter as “Generation of
complex environments with navigational capabilities”. Detailed requirements are in
paragraph 4.1.3.
4.1.1.3
The VRG shall have an open design: it shall be possible (in further developments) to update
the application by including new functions.
4.1.1.4
The VRG environment shall allow the investigators to create, store, retrieve, perform and verify the experiment profiles reproducibly for testing populations of subjects, without having to modify the software.
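Note (illustrative only, not a requirement): the following Python sketch shows one possible way an experiment profile could be represented and stored as data so that profiles can be created, stored, retrieved and re-run reproducibly without modifying the software. All class, field and file names are assumptions introduced here for illustration; they are not defined by this document.

    # Illustrative sketch only: an experiment profile held as data (here JSON),
    # so new profiles do not require software modification. Names are hypothetical.
    import json
    from dataclasses import dataclass, field, asdict
    from typing import List

    @dataclass
    class StimulusSpec:
        kind: str                 # "visual", "aural" or "combined"
        shape_or_sound: str       # e.g. "circle", "tone_1kHz"
        duration_s: float         # presentation time
        pertinent: bool = False   # whether this stimulus is the pertinent one

    @dataclass
    class ExperimentProfile:
        name: str
        stimuli: List[StimulusSpec] = field(default_factory=list)

    def save_profile(profile: ExperimentProfile, path: str) -> None:
        """Store a profile so it can later be retrieved and re-run reproducibly."""
        with open(path, "w") as f:
            json.dump(asdict(profile), f, indent=2)

    def load_profile(path: str) -> ExperimentProfile:
        """Retrieve a previously stored profile."""
        with open(path) as f:
            raw = json.load(f)
        raw["stimuli"] = [StimulusSpec(**s) for s in raw["stimuli"]]
        return ExperimentProfile(**raw)

    if __name__ == "__main__":
        p = ExperimentProfile("demo", [StimulusSpec("visual", "circle", 0.5, True)])
        save_profile(p, "demo_profile.json")
        print(load_profile("demo_profile.json"))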
4.1.2 Generation of visual and aural stimuli
4.1.2.1
Stimuli can be visual, aural or a combination of both; stimuli can be simple or complex.
4.1.2.2
Simple stimulus: A simple stimulus is defined as one occurrence of the stimulus, e.g. one
shape or one sound.
4.1.2.3
Complex stimulus: a complex stimulus is defined as a sequence of stimuli. A complex stimulus can be visual only, aural only or a combined visual-aural stimulus; in the latter case, the aural stimulus can be generated before, simultaneously with or after the visual stimulus.
4.1.2.4
Visual stimuli:
4.1.2.4.1
Stimuli can range from simple geometrical shapes to patterns or pictures, plain or empty.
4.1.2.4.2
Stimuli can be presented alone or in pairs, triplets, quadruplets and so on.
4.1.2.4.3
If more than one occurrence of a visual stimulus is presented, it shall be possible for the
subject to determine its axis of symmetry (up, down, right, left, oblique)
4.1.2.4.4
Shapes can be 2D or 3D and, if 3D is used, it shall be possible to switch the
stereoscopic feature on or off.
4.1.2.4.5
It shall be possible to display the stimuli or environments either on the Head Mounted Displays or the screens (see 4.2).
4.1.2.4.6
Ground use: it shall be possible to mirror the scene displayed to the subject (Head
Mounted Display or screen) to an external screen for monitoring purposes by the
experimenter.
4.1.2.4.7
Position management: It shall be possible to present a stimulus either at a fixed position of the field of view or at a random position over a pre-defined area. It shall be possible for the experimenter to choose (specify) the display mode (fixed or random position display), the coordinates of the field of view (screen/HMD) where the stimuli are presented, and the coordinates and area of the field of view (screen/head-mounted display) for random position display (the maximum area being full screen / full field of view).
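Note (illustrative only, not a requirement): a minimal sketch of the fixed/random position selection described in 4.1.2.4.7, assuming normalised field-of-view coordinates. Function and parameter names are hypothetical.

    # Illustrative sketch only: fixed or random stimulus position within a
    # pre-defined area of the field of view. Coordinates are normalised (0..1
    # of the full screen / field of view).
    import random

    def stimulus_position(mode: str,
                          fixed_xy=(0.5, 0.5),
                          area=((0.0, 0.0), (1.0, 1.0))):
        """Return (x, y) for the next stimulus.

        mode      -- "fixed" or "random", chosen by the experimenter
        fixed_xy  -- coordinates used in fixed mode
        area      -- ((x_min, y_min), (x_max, y_max)) for random mode;
                     the maximum area is the full screen / full field of view
        """
        if mode == "fixed":
            return fixed_xy
        (x_min, y_min), (x_max, y_max) = area
        return (random.uniform(x_min, x_max), random.uniform(y_min, y_max))

    print(stimulus_position("fixed"))
    print(stimulus_position("random", area=((0.2, 0.2), (0.8, 0.8))))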
4.1.2.5 Aural stimuli: VRG shall be able to generate simple or complex sounds.
4.1.2.5.1
It shall be possible to choose the frequency of sound over the full audible spectrum.
4.1.2.5.2
Volume of aural stimuli shall be adjustable by the experimenter; note that this stimulator
is not a system for quantifying the aural function of a subject.
4.1.2.5.3
It shall be possible to generate aural stimuli to only one ear or both ears, as per
investigator selection
4.1.2.5.4
It shall be possible to build aural stimuli by importing sounds.
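Note (illustrative only, not a requirement): a minimal sketch of simple aural stimulus generation with an experimenter-chosen frequency, volume and ear routing (4.1.2.5.1 to 4.1.2.5.3). Writing a stereo WAV file is used here purely as a stand-in for the actual sound generation chain, which this document does not specify; all names are hypothetical.

    # Illustrative sketch only: generate a pure tone at a chosen frequency and
    # volume, routed to the left ear, the right ear or both ears.
    import math, struct, wave

    def write_tone(path, freq_hz=1000.0, duration_s=0.5, volume=0.5, ears="both",
                   rate=44100):
        """ears: 'left', 'right' or 'both'; volume in 0..1."""
        with wave.open(path, "w") as w:
            w.setnchannels(2)          # stereo: frame = (left, right) sample pair
            w.setsampwidth(2)          # 16-bit samples
            w.setframerate(rate)
            frames = bytearray()
            for n in range(int(duration_s * rate)):
                s = int(volume * 32767 * math.sin(2 * math.pi * freq_hz * n / rate))
                left = s if ears in ("left", "both") else 0
                right = s if ears in ("right", "both") else 0
                frames += struct.pack("<hh", left, right)
            w.writeframes(bytes(frames))

    write_tone("stimulus_left_ear.wav", freq_hz=800.0, ears="left")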
4.1.2.6 Requirements applicable to the generation of visual and/or aural stimuli
4.1.2.6.1
Time management: It shall be possible to set the generation of the stimuli so that they are either presented to the subject at fixed times (to be specified when preparing the experiment profiles) or generated at random within a time window. It shall be possible for the experimenter to specify the duration of the time window.
4.1.2.6.2
It shall be possible for the experimenter to define, among the proposed stimuli, which one (or which sequence) is pertinent (see also footnote 1). It shall be possible to present the pertinent stimulus to the subject.
4.1.2.6.3
It shall be possible for the subject to have a dry-run of the experiment.
4.1.2.6.4
It shall be possible for the experimenter to set times, areas (where applicable) and any
variable when defining the experiment, including the time during which the stimulus is
displayed.
4.1.2.6.5
It shall be possible for the experimenter to create the stimuli by choosing them from a
library, or to create them from imported data. It shall be possible to update the library
from imported data.
4.1.2.6.6
It shall be possible for the experimenter to create, save, retrieve and use experiment
profiles.
4.1.2.6.7
It shall be possible for the experimenter to set the duration of the complex stimulus, the
duration of each simple stimulus and time separating each simple stimulus.
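Note (illustrative only, not a requirement): a sketch of the time management described in 4.1.2.6.1 and 4.1.2.6.7: fixed versus random presentation times within an experimenter-defined window, and the spacing of simple stimuli within a complex stimulus. Names and parameters are hypothetical.

    # Illustrative sketch only: derive stimulus onset times for an experiment
    # profile, either at fixed times or at random within a time window, and
    # schedule the simple stimuli making up one complex stimulus.
    import random

    def presentation_times(n_stimuli, mode="fixed", fixed_times=None,
                           window_s=10.0):
        """Return a sorted list of onset times (seconds from experiment start)."""
        if mode == "fixed":
            return sorted(fixed_times or [])
        # random mode: each onset drawn uniformly inside the time window
        return sorted(random.uniform(0.0, window_s) for _ in range(n_stimuli))

    def complex_stimulus_schedule(n_simple, simple_duration_s, gap_s):
        """Onset of each simple stimulus within one complex stimulus."""
        return [i * (simple_duration_s + gap_s) for i in range(n_simple)]

    print(presentation_times(3, "fixed", fixed_times=[1.0, 4.0, 8.5]))
    print(presentation_times(3, "random", window_s=20.0))
    print(complex_stimulus_schedule(4, simple_duration_s=0.3, gap_s=0.7))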
4.1.2.7 Recording of subjects’ responses:
4.1.2.7.1
VRG shall record time-tagged subject responses as defined by the experimenters.
Subject responses to be recorded by VRG include, but are not limited to:
4.1.2.7.1.1 Characteristics of each stimulus generated and its time of generation since the beginning of the experiment
4.1.2.7.1.2 Subject reaction times (time elapsed between stimulus generation and the subject response acknowledging the stimulus)
4.1.2.7.1.3 Subject rating of the stimuli (or sequence of stimuli) as pertinent or non-pertinent - see
4.1.2.6.2
4.1.2.7.1.4 Subject’s perception of orientation and direction of motion
4.1.2.7.2
It shall be possible to export the recorded data to a spreadsheet (such as, but not only, MS Excel®) for further statistical processing.
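Note (illustrative only, not a requirement): a sketch of time-tagged response recording and export to a spreadsheet-readable CSV file (4.1.2.7). The record fields shown are examples only; the actual set of recorded variables is defined by the experimenters.

    # Illustrative sketch only: time-tag stimulus and response events and export
    # them to CSV so a spreadsheet (e.g. MS Excel) can open them for statistics.
    import csv, time

    records = []

    def log_stimulus(stim_id, characteristics, t_start):
        """Record a generated stimulus and its onset time since experiment start."""
        records.append({"event": "stimulus", "id": stim_id,
                        "characteristics": characteristics,
                        "t_since_start_s": round(time.monotonic() - t_start, 4)})

    def log_response(stim_id, rated_pertinent, t_onset, t_start):
        """Record the subject response and the reaction time for one stimulus."""
        now = time.monotonic()
        records.append({"event": "response", "id": stim_id,
                        "rated_pertinent": rated_pertinent,
                        "reaction_time_s": round(now - t_onset, 4),
                        "t_since_start_s": round(now - t_start, 4)})

    def export_csv(path):
        fieldnames = sorted({k for r in records for k in r})
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(records)

    t0 = time.monotonic()
    log_stimulus("S1", "red circle, upper-left", t0)
    log_response("S1", rated_pertinent=True, t_onset=t0, t_start=t0)
    export_csv("subject_responses.csv")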
4.1.3 Generation of complex environments with navigational capabilities
4.1.3.1 VRG shall allow generation of complex navigation environments to test the ability of a subject
to navigate in e.g. a building and/or a labyrinth, in the three directions of space.
4.1.3.1.1
It shall be possible for the experimenter to select and build the environments proposed for navigation (rooms, corridor(s), labyrinth …).
4.1.3.1.2
It shall be possible for the subject to follow a path in (navigate) the proposed environment.
4.1.3.1.3
Labyrinth: it shall be possible for the experimenter to define and select path characteristics (such as, but not only, length, width …), the characteristics of each segment (number, length, width, height), the texture of the walls, and the like (an illustrative sketch is given after this list).
4.1.3.1.4
Rooms: it shall be possible for the experimenter to specify the type of buildings or rooms to be created (e.g. an office building with rooms and corridors, or a single room with tilting and rotating capabilities).
4.1.3.1.5
Environments shall be 3D (navigation right-left, top-bottom).
4.1.3.1.6
It shall be possible to animate the environment so that the subject experiences a feeling of relative motion: it shall be possible to set the motion of the environment to be in the direction of the movement or in the opposite direction. This shall be possible whether the subject is static or navigating in the environment.
4.1.3.1.7
Upon the experimenter’s choice, it shall be possible to rotate/tilt the environments displayed in the VRG. In case VRG is used with rotating equipment (e.g. a centrifuge), it shall be possible to have the scenes displayed in the VRG rotate in the same direction as the rotation of the centrifuge or in the opposite direction. It shall be (TBC) possible to set the rotation of the VRG-displayed scene in a plane perpendicular to the plane of rotation of the rotating device (centrifuge), or in any other direction resulting from the combination of those two perpendicular directions. The tilting and rotating capabilities shall be limited only by the computer hardware. This feature shall be available whether VRG is used together with rotating equipment or not. Matching of rotation speeds between VRG-displayed scenes and rotating equipment is a design aim.
4.1.3.1.8
Use of this complex environment generator could also be envisaged to implement and test countermeasures to, e.g., balance dysfunction.
4.1.3.1.9
VRG shall allow colors or textures to be applied to the synthetically generated patterns.
4.1.3.1.10
It shall be possible to enable/disable those measurements
4.1.3.1.11
Navigation in the labyrinth or similar environments shall be done by the most appropriate means, compatible with microgravity and allowing subject performance to be measured;
4.1.3.1.12
At predefined times or at random, it shall be possible through the VRG to ask the
subject to give his/her perception of orientation and movement; subject responses shall
be time-tagged and recorded by the VRG.
4.1.3.1.13
It shall be possible to create scenes of stimulation by importing already existing material
(e.g. drawings, pictures, photographs, movies, 3D scenes, sounds …), to be used as is
or as a basis to be modified by the user.
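Note (illustrative only, not a requirement): the sketch referred to in 4.1.3.1.3, showing one possible parametric description of a labyrinth built from segments with experimenter-defined length, width, height and wall texture. The data structure and names are assumptions made for illustration.

    # Illustrative sketch only: a labyrinth described as a list of segments,
    # each with experimenter-defined dimensions and wall texture.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Segment:
        length_m: float
        width_m: float
        height_m: float
        wall_texture: str          # e.g. "brick", "plain_grey"
        turn_after_deg: float = 0  # heading change at the end of the segment

    @dataclass
    class Labyrinth:
        name: str
        segments: List[Segment]

        def total_path_length(self) -> float:
            return sum(s.length_m for s in self.segments)

    maze = Labyrinth("demo_maze", [
        Segment(5.0, 1.2, 2.5, "brick", turn_after_deg=90),
        Segment(3.0, 1.2, 2.5, "plain_grey", turn_after_deg=-90),
        Segment(4.0, 1.2, 2.5, "brick"),
    ])
    print(maze.total_path_length())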
4.1.3.2
For creating and setting up the scientific protocols, it is anticipated that VRG will be operated on a standard commercial computer platform and will use the ground versions of the VRG-h for testing and evaluation.
4.1.3.3
VRG-s shall allow the displayed environment to be controlled as a function of the inclination of the subject’s head (lateral or frontal, or roll, yaw; see 4.2.7). It shall be possible to:
4.1.3.3.1
turn on or off that feature
4.1.3.3.2
display the scenes with respect to the orientation of the head
4.1.3.3.3
design aim: introduce experimenter-defined conflict between the displayed scene and
the position of the subject’s head.
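Note (illustrative only, not a requirement): a sketch of head-inclination-driven display control (4.1.3.3), with the feature switchable on and off and with an optional experimenter-defined conflict between the subject's head position and the displayed scene. The gain/offset model of the conflict and all names are assumptions; the real head tracker interface is not defined here.

    # Illustrative sketch only: scene orientation driven by measured head roll,
    # with an on/off switch and an optional experimenter-defined conflict.
    from dataclasses import dataclass

    @dataclass
    class HeadTiltControl:
        enabled: bool = True        # 4.1.3.3.1: feature can be turned on or off
        conflict_gain: float = 1.0  # 1.0 = scene follows the head exactly (4.1.3.3.2)
        conflict_offset_deg: float = 0.0  # design aim 4.1.3.3.3: added mismatch

        def scene_roll(self, head_roll_deg: float) -> float:
            """Roll angle to apply to the displayed scene for a given head roll."""
            if not self.enabled:
                return 0.0
            return self.conflict_gain * head_roll_deg + self.conflict_offset_deg

    ctrl = HeadTiltControl(conflict_gain=1.5, conflict_offset_deg=10.0)
    print(ctrl.scene_roll(head_roll_deg=20.0))  # 40.0: deliberate conflict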
4.1.3.4
Recording of subjects’ responses:
4.1.3.4.1
VRG shall allow time-tagged subject responses to be recorded, as defined by the experimenters. Subject responses to be recorded by VRG include, but are not limited to:
4.1.3.4.2
Measuring time required for the subject to follow a path (e.g. if the subject has to
navigate in a maze), recording the intersection of the trajectories followed by the subject
with the boundaries of the path (e.g. occurrences of hitting the walls, time, trajectory),
the number of tries to find a direction and the like,
4.1.3.4.3
Measuring the subject’s perception of orientation and direction of motion (see also
4.1.3.1.12)
4.1.3.4.4
It shall be possible to export the VRG-recorded data to a spreadsheet - such as
MS Excel® - for further statistical processing.
4.1.4 Interface requirements
4.1.4.1
Interface requirements apply to either operation mode of VRG (4.1.2 Generation of visual
and aural stimuli and 4.1.3 Generation of complex environments with navigational
capabilities).
4.1.4.2
VRG shall interface with mechanical, thermal and electrical stimuli generators.
4.1.4.3
It shall be possible to interface VRG with EPM MEEMM and PORTEEM
4.1.4.4
The VRG shall generate synchronization signals towards external systems such as electroencephalographs (EEG recorders, such as, but not only, the EPM MEEMM), EOG recorders, general electrophysiological recorders (such as, but not only, the EPM PORTEEM) and, more generally, the systems used to measure and assess performances in cognitive neurophysiology (an illustrative sketch is given at the end of this section). The recorders of electrophysiological signals are not deliverables of the present study.
4.1.4.5
VRG-h, especially the parts applied to the subject, shall not impair the recording of
electrophysiology signals (especially, but not only, EEG and EOG)
4.1.4.6
VRG shall be compatible with the short arm human centrifuge (RD# 4) or treadmill (TBC)
4.1.4.7
Design aim: it should be possible to interface VRG (see-through display configuration – see
4.2.5.8) with eye tracking devices.
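Note (illustrative only, not a requirement): the sketch referred to in 4.1.4.4, showing one generic way time-stamped synchronization markers could be emitted towards an external recorder, here over UDP. The address, port and message format are placeholders; the actual interface (e.g. a TTL trigger line or the EPM MEEMM/PORTEEM interfaces) is defined by the recording systems and not by this sketch.

    # Illustrative sketch only: send time-stamped synchronization markers to an
    # external electrophysiological recorder over UDP (placeholder transport).
    import socket, time

    RECORDER_ADDR = ("192.0.2.10", 5000)   # placeholder address (TEST-NET-1)

    def send_sync_marker(label: str, sock: socket.socket) -> None:
        """Send a time-stamped synchronization marker, e.g. at stimulus onset."""
        msg = f"{time.monotonic():.6f} {label}\n".encode("ascii")
        sock.sendto(msg, RECORDER_ADDR)

    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        send_sync_marker("STIMULUS_ONSET S1", s)
        send_sync_marker("SUBJECT_RESPONSE S1", s)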
4.2 VRG-H REQUIREMENTS
4.2.1
All virtual-reality hardware shall be based on off-the-shelf equipment.
4.2.2
VRG-h shall make it possible to fulfill the requirements of section 4.1.
4.2.3
All the elements of VRG-h shall be commercially available (no development of hardware shall
be undertaken in the present activity)
4.2.4
The VRG-h shall allow the virtual reality scenes created by the VRG-s to be managed and displayed.
4.2.5
The VRG-h shall be comprised of at least:
4.2.5.1
A virtual reality head-mounted display (HMD) capable of stereoscopic display with sound
generation capabilities
4.2.5.2
A PC computer (preferably a laptop with expansion bay) with the necessary peripherals and
able to drive the above-mentioned head-mounted display;
4.2.5.3
One control (flat) screen and one (two TBC) additional flat screen to display the VRG
scenes;
4.2.5.4
Sound generation system
4.2.5.5
Loudspeakers
4.2.5.6
Stereo (sound) headsets
4.2.5.7
Design aim: hardware to allow the subject’s reaction times or responses to be computed by a voice recognition and analysis system.
4.2.5.8
Design aim: an additional see-through display (head-up display) that allows combining generation of visual stimuli and eye tracking.
4.2.5.9
Tracking system for the position of the head
4.2.5.10
Traditional computer input devices and specific input device to allow easy navigation, for
example within a maze, and to measure subject responses (e.g. joystick)
4.2.6
VRG-h shall make use of state-of-the-art components and systems
4.2.7
VRG-h shall allow the subject's head inclination to be measured in all directions by using convenient head tracking systems, and shall allow the display to be controlled according to the subject’s head tilt with respect to the subject axes.
4.2.8
Performance of displays and input devices shall be compatible with measuring subjects’ visual and aural reaction times (refer to 4.1).
5. REQUIREMENTS FOR PRELIMINARY FLIGHT DESIGN
5.1 FUNCTIONAL AND DESIGN REQUIREMENTS
5.1.1
It is intended to interface the VRG with EPM devices (RD# 2, RD# 3)
5.1.2
Particular attention shall be paid to the ergonomics: VRG shall be intuitive and easy-to-use.
5.1.3
Tracking systems shall be compatible with operation in microgravity and in the ISS
5.1.4
VRG computer shall be a laptop with an expansion bay.
5.1.5
VRG head-mounted display shall be wide angle, to fill subject’s field of view; vision of fixed
structures (VRG frames) shall be minimized (ideally eliminated)
5.1.6
Recording of the subject's responses is done via a manually activated input system (e.g. recording of a keypad strike and key), via recording of voice, or via a combination of both.
5.1.7
Navigation (e.g. navigating a maze) shall be performed with an appropriate, microgravity-compatible input device.
5.1.8
Head or body part trackers shall be microgravity-compatible
5.1.9
The flight systems shall be open, i.e. they shall be able to accommodate new COTS equipment as it becomes available (especially displays, head trackers, computers and related hardware and software).
5.2 ACCOMMODATION, SIZE, WEIGHT AND LOGISTICS ALLOCATION
5.2.1
The VRG shall ultimately interface with recording systems such as (but not only) the EPM
MEEMM and/or PORTEEM (RD# 1, RD# 2, RD# 3).
5.2.2
VRG-h shall fit into a 4PU drawer (stowed configuration - check)
6. ADDITIONAL REQUIREMENTS FOR THE BREADBOARD
6.1 BREADBOARD
6.1.1
VRG shall include data from head tracking in order to correct the display. For the purpose of the present activity, the demonstrator for head tracking can be of magnetic type. However, it shall allow electrophysiology signals such as EEG and EOG to be recorded simultaneously without distorting or adding noise to those signals. Reciprocally, the head tracking system shall operate nominally when used simultaneously with electrophysiological recorders.
6.1.2
The software chosen to code the application, and the architecture of the application, shall allow the developments undertaken in the present study to be pursued in later developments and to be easily ported to other software platforms without full re-writing of the code.
6.1.3
Design aim: subject responses and reaction times to be processed from voice responses
6.2 SAFETY REQUIREMENTS
6.2.1
VRG equipment shall comply with industry standards applicable to those systems
6.2.2
Interface of VRG with electrophysiology signal recorders shall be in accordance with AD# 1.
6.2.3
Safety assessment is contractor’s responsibility.
6.2.4
Safety assessment shall prove the VRG technology is safe to the subject and the operator. The
VRG will be used only for carrying out research protocols. The VRG will not be used for
diagnosing or treating medical conditions (diseases).
6.3 TESTING AND SCIENTIFIC ASSESSMENT
The test plan shall cover the following items:
6.3.1
Technical,
6.3.2
Safety
6.3.3
Functional
6.3.4
Scientific. The scientific assessment shall be performed in close cooperation with a specialized research laboratory (or laboratories, as per contractor’s choice) and shall demonstrate the validity of the VRG, both:
• As a flexible system allowing the investigators to define experiment profiles as required by their research,
• At least one experiment profile with each proposed type of use of VRG (4.1.2 and 4.1.3) shall be validated on a statistically significant number of subjects,
• Interface with electrophysiology recorders to be demonstrated.
End.