Code Documentation

INTERACT/REACT
Interaction in VR
Running the system
Getting Started
The INTERACT/REACT (IR) system requires two configuration files: one to set up
VRJuggler and one to set up IR. These files are passed as arguments to the IR
executable, e.g.
interact <VRJConfigFile> <INTERACTConfigFile>
For example, assuming the IR executable has been built, you can run a simple
environment in UCL's ReaCTor by typing the line
interact ./configFiles/VRJuggler/UCL_reactor_all.config
./configFiles/INTERACT/ModelLoadTest.xml
If you look at the ModelLoadTest.xml config file, you will see a number of distinct
parts, each of which configures a different part of the system:
Scene Description
Interaction
Behaviours
Recording/Player
Distribution
Let's look at each of these configuration options in turn.
Scene Description Variables
The scene is described by specifying all the models that will make up the scene.
For instance, the config line below will load the kitchen.obj model and place it in
the scene at the position and orientation specified. Note that the pickable attribute
specifies whether the model may be picked up by the user; you probably
won't want the user to pick up large objects like buildings.
<MODEL name="Kitchen" fileName="kitchen.obj" scale="1.0"
pos="0.0 0.0 0.0" rot="0.0 0.0 0.0" pickable="false"/>
If the fileName attribute is specified, the system tries to load the model from the
/models folder. If an objectName is specified instead of a fileName (as in the line
below), the system creates the model on the fly (NOTE: at present UNIT_CUBE is the
only dynamically created model type implemented). For dynamically created models
you must also specify a colour attribute to define the colour of the object.
<MODEL name="VirtualSquare" objectName="UNIT_CUBE"
scale="0.5" pos="0.0 0.0 0.0" rot="0.0 0.0 0.0"
pickable="true" colour="1.0 1.0 0.0"/>
Several models may be specified in the same config file to create a scene.
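For example, a small scene might combine a loaded model with a dynamically
created cube. The sketch below just reuses the attributes described above (the
positions are illustrative):
<MODEL name="Kitchen" fileName="kitchen.obj" scale="1.0"
pos="0.0 0.0 0.0" rot="0.0 0.0 0.0" pickable="false"/>
<MODEL name="YellowCube" objectName="UNIT_CUBE"
scale="0.5" pos="1.0 0.8 0.0" rot="0.0 0.0 0.0"
pickable="true" colour="1.0 1.0 0.0"/>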
Model Formats
The system uses the model loader plugins packaged with Performer. I tried
many file formats and associated loaders and found that the Wavefront (.obj) file
loader gave the best results (i.e. colours and textures were loaded correctly for most
models). Wavefront .obj files have an associated material file (.mtl) which defines the
model's colours and textures.
Model Conversion
Any model you wish to use which is not in the Wavefront (.obj) format can be
converted to .obj with conversion software. The VR Group has a program called
NuGraf which can convert many different model formats to .obj files, generating
a .mtl file at the same time.
Textures in .obj files only seem to work with Performer if the texture images are
in .rgb format. The UNIX program "xv" can convert many image formats to
.rgb; use it to convert all of your model's textures. You must then also
change the references to the textures in the .mtl file: go through the .mtl file and
change all texture name extensions to .rgb.
e.g. where the original .mtl file refers to
StoneTexture.jpg
change this to
StoneTexture.rgb
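If the .mtl file references many textures, a stream editor can make the change in
one pass. For instance, assuming GNU sed and .jpg source textures (the file name
is illustrative; back the file up first):
sed -i 's/\.jpg/.rgb/g' kitchen.mtl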
Once you have the model in the right format (a .obj file with its associated .mtl file),
all your texture images are in .rgb format, and the extensions in the .mtl file have
been changed to .rgb, you can load the model via the interact config file.
Interaction Variables
Interaction Metaphor
As well as a scene description, the system needs to define how users will interact
with the scene. Several interaction metaphors are implemented:
VIRTUAL_HAND, RAY_CASTING, CONE_LIANGS and CONE_ANTHONYS.
For example, to specify that VIRTUAL_HAND should be used as the interaction
metaphor, include the following line in the IR config file:
<INTERACTION_METAPHOR Name="VIRTUAL_HAND"/>
For the cone interaction metaphors, you must also specify the size of the cone. This
is achieved with the line below, where theta gives the half angle of the cone.
<INTERACTION_CONE_SIZE theta="3.5"/>
Navigation
We must also specify how the user navigates around the environment. The simplest
option is to specify no navigation, with the line below.
<NAV_METAPHOR_MASK mask0="NO_NAV" mask1="UNDEFINED"/>
At present the only travel technique implemented is the standard technique of
pointing in the direction you wish to move, specified with the line below.
<NAV_METAPHOR_MASK mask0="FREE" mask1="UNDEFINED"/>
However, there are a number of options which can constrain the user's movement.
For instance, if you wish to constrain the user to the ground plane and to a box
defining the extents of the environment, use
<NAV_METAPHOR_MASK mask0="CONSTRAIN_TO_GROUND_PLANE"
mask1="CONSTRAIN_TO_ENV_BOX"/>
The environment box is defined with the line
<ENVIRONMENT_BOX min="-4.0 0.0 -5.0" max="4.0 2.0 5.0"/>
Other Interaction Variables
If you wish objects that are potential selections to be highlighted every frame, use
<INTERACTION_OBJECT_HIGHLIGHT
highLightPotentialSelectionObjectsEveryFrame="true"/>
You may constrain the position of objects so that if they are dropped outside of the
environment box (defined above) they pop back into the box. This stops objects
from being lost to the user outside of the environment.
You may also set objects to pop up on top of a second bounding volume: for
instance, if you have a cupboard in the environment, you may wish objects dropped
inside the cupboard to pop up on top of the cupboard.
Use the line below to turn these options on.
<DROP_OBJECT_CONSTRAINTS popDroppedObjectsIntoEnvbox="true"
popDroppedObjectsOntopofCupboardBox="true"/>
and define the cupboard box with the line
<CUPBOARD_BOX min="-4.0 0.0 -5.0" max="-3.0 1.2 5.0"/>
If you wish to allow only a single object to be picked up, use the line
<EXCLUSIVE_PICKUP objectName="YellowCube"/>
And to set the user's initial position in the scene, use the line
<START_POSITION ViewReferencePoint="2.5 0.0 0.0"
ViewDirection="-1.0 0.0 0.0" ViewUp="0.0 1.0 0.0"/>
Recorder/Player Variables
The system can save a recording of all the events generated in the system (events
usually make some change to the scene graph). This recording is like a video of all
the user's movements in the world, which may be played back at a later time to see
how the user moved and interacted in the world. The recording is in the form of an
XML file and, apart from being fun for watching your movements in VR, it also
provides a very rich data set which may be used in post-experiment analysis.
To create a recording of a session, set the line below
<RECORD_PLAY_STATE Name="RECORD"/>
To specify that no recording is needed, set the line below
<RECORD_PLAY_STATE Name="NO_RECORD_OR_PLAY"/>
And to play back a recording use
<RECORD_PLAY_STATE Name="PLAY"/>
NOTE: to play a recording, interact must be given exactly the same config file as was
used at the time of recording (apart from changing the line above from RECORD to
PLAY). Also, for playback to work, you must manually place the tag
</EVENT_FILE>
at the end of the recording file. This tag should be placed automatically by the
system whenever a session ends, but it is not; so if a recording fails to load, check
that the above tag exists at the very end of the recording file.
NOTE: if a session was recorded with the start/end behaviour specified using
<BEHAV_START_END_SEQUENCE Activate="true"/>
and the user completed the specified number of trials, then the </EVENT_FILE> end
tag is placed in the recording file automatically.
Distributed Recordings
For distributed systems a separate event recording is generated for each user.
Behaviour Variables
A good VR system needs behaviours, i.e. ways in which the environment reacts to
the user's interactions. For example, you might want to turn a light on when the
switch object is selected. A full behaviour definition system would usually use some
sort of scripting language to define behaviours, but at present the IR system simply
defines hard-coded behaviours as needed by experiments. Several simple behaviours
have been defined for use in our experiments; they are not very general or reusable
outside of those experiments, so I won't describe them here. See the various
experiment config files for a demonstration of their use.
Distributed Variables
IR has fairly basic distributed functionality, allowing two users to interact in an
environment from remote sites using reliable TCP/IP transport. Let's call one user
the master node and the other user the slave node. To run the distributed system,
we include the line
<NETWORKING NetworkEnabled="true" IsMasterNode="true"
Address="128.16.64.97"/>
in the master node's config file, and the line below
<NETWORKING NetworkEnabled="true" IsMasterNode="false"
Address="128.16.64.97"/>
in the slave node's config file, where the IP address in both cases is the address of
the master node. Then run the master node app first; it will pause and wait for the
slave app to connect. The slave app can then be run to connect to the master app.
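For example, assuming two config files that differ only in their NETWORKING line
(the file names here are illustrative), first run on the master machine
interact ./configFiles/VRJuggler/UCL_reactor_all.config
./configFiles/INTERACT/TwoUserMaster.xml
and, once it is waiting, run on the slave machine
interact ./configFiles/VRJuggler/UCL_reactor_all.config
./configFiles/INTERACT/TwoUserSlave.xml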
There are many issues with conflict resolution, maintaining consistency, etc. in
distributed VR systems. IR at present attempts no conflict resolution, and
consistency is to an extent the responsibility of the programmer to maintain. All
generated events have a flag which specifies whether the results of the event are
distributed or not. All normal user interactions (via the interaction manager), such
as navigational movement and picking up or dropping objects, are distributed
automatically, but any behaviour code can choose whether its events are distributed.
The programmer therefore has two basic choices for maintaining consistency: either
run the behaviour code at one node of the distributed system and distribute all the
results, or run the behaviour code locally at every node and don't distribute the
results.
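Putting the pieces together, a single-user IR config file assembles the sections
described above. The sketch below simply combines lines shown earlier in this
document; the INTERACT_CONFIG root element name and the NetworkEnabled="false"
value are assumptions, as neither is shown above:
<INTERACT_CONFIG> <!-- root element name is an assumption -->
<!-- Scene description -->
<MODEL name="Kitchen" fileName="kitchen.obj" scale="1.0"
pos="0.0 0.0 0.0" rot="0.0 0.0 0.0" pickable="false"/>
<!-- Interaction -->
<INTERACTION_METAPHOR Name="VIRTUAL_HAND"/>
<NAV_METAPHOR_MASK mask0="CONSTRAIN_TO_GROUND_PLANE"
mask1="CONSTRAIN_TO_ENV_BOX"/>
<ENVIRONMENT_BOX min="-4.0 0.0 -5.0" max="4.0 2.0 5.0"/>
<START_POSITION ViewReferencePoint="2.5 0.0 0.0"
ViewDirection="-1.0 0.0 0.0" ViewUp="0.0 1.0 0.0"/>
<!-- Recording -->
<RECORD_PLAY_STATE Name="RECORD"/>
<!-- Distribution (disabled for a single-user session) -->
<NETWORKING NetworkEnabled="false" IsMasterNode="true"
Address="128.16.64.97"/>
</INTERACT_CONFIG>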
Extending the system (a look at the code)
Events
The IR system revolves around the generation of events, which change the scene
graph in some way (see the list of events below). Events may be generated in several
ways: user-defined behaviours may generate events based on the current state of the
scene graph; the interaction manager may generate events based on user interaction,
such as use of a navigation or interaction metaphor; if the system is distributed,
events may be received from a remote user; and if the system is in playback mode,
events are generated by loading an event file recording.
List of Events
UPDATE_NAV
UPDATE_WAND
UPDATE_HEAD
PICKUP_OBJECT
DROP_OBJECT
CHANGE_LIGHTING_INTENSITY
SET_OBJECT_DCS
REMOVE_OBJECT_DCS
ADD_OBJECT_DCS
HIGHLIGHT_OBJECT
UN_HIGHLIGHT_OBJECT
Each event has a set of associated attributes. For instance, the PICKUP_OBJECT event
needs the ID of the object to pick up, and the matrix in wand coordinates which
determines where in the wand coordinate system the object should be attached (see
the code for the attributes associated with the other events).
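Since a recording is an XML file of such events, the end of a recording might look
something like the sketch below. Only the EVENT_FILE tag is confirmed by this
document; the event element and attribute names are illustrative guesses based on
the attributes described above:
<EVENT_FILE>
...
<EVENT type="PICKUP_OBJECT" objectID="3"
wandMatrix="1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1"/>
<EVENT type="DROP_OBJECT" objectID="3"/>
</EVENT_FILE>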
System Schematic
The diagram below illustrates the flow of events through the system. One very
important point is that all events, no matter where they are generated, must pass
through the Recorder Player at some stage, so that they may be recorded for later
playback. This recording is also the main source of data for any experimental
analysis of an IR session.
[System schematic: events generated by the Behaviours, by the Interaction Manager
(driven by VRJuggler input), or received from a Remote User via the Event
Sender/Receiver all flow into the Recorder Player, which writes them to the Event
Recording File and passes them on to the SceneGraph Manager.]
Scene Graph
Contained in the scene graph manager are all the Performer nodes which form the
scene graph. The nodes are arranged in a hierarchy as shown below.
rootNode (pfGroup)
    localNavDCS (pfDCS)
        localWandDCS (pfDCS)
        localHeadDCS (pfDCS) - the localHeadModel attached here is only
                               used when viewing recorded sessions
    remoteNavDCS (pfDCS)
        remoteWandDCS (pfDCS)
        remoteHeadDCS (pfDCS)
    PMG (pfGroup) - the Pickable Model Group, containing several pickable
                    models (M)
    NotPMG (pfGroup) - the NOT Pickable Model Group, containing several NOT
                       pickable models (M)
    LightG (pfGroup) - containing several pfLightSource's

Models (M) for the wand and head hang beneath the corresponding DCS nodes
above.

The M symbol represents each loaded or created model and consists of the
following Performer nodes (see ModelLoading):

M = modelDCS (pfDCS)
        modelSCS (pfSCS)
            model (pfNode)