Proposal

A Puzzle Game Controlled by Live MIDI Input
Dylan Karas
Advisor: C. David Shaffer
Abstract
Most well-known musical games have taken on the role of performance evaluators, asking the player to perform a designated score and determining the level of success by measuring the quality of the performance. Other musical software has served as performance illustration, allowing a live or pre-recorded performance to manipulate a virtual world in a way that adds a theatrical element to the performance or gives the musician a further tool of expression. This paper examines the use of a MIDI keyboard to directly manipulate objects in the game world: while a well-executed performance leads to success, the nature of that performance is not presented to the player in musical terms, but is instead conveyed through the arrangement of the game objects, the known effects the performance has upon them, and the conditions for victory. This is achieved through a puzzle game consisting of several levels, each with its own victory conditions, and a universal set of game objects, each of which responds to the musical input in a unique way.
Introduction
In the most general sense, we can define a musical game by how it fulfills the following qualities: the
source of the music in the game, the time at which the music is created, the controller by which the player
interacts with the game, and the manner in which the game treats the music. Although there may be many
layers of music in a game, when we refer to music we speak of only that which has a direct impact on the game
world. With this definition in mind, the source of music in a game is in practice either the player or the game
itself. The music may also be played in real-time or it may be prerecorded. For our purposes, controllers can
generally be categorized as non-musical, pseudo-musical, or musical. Finally, the manner in which the game treats the music generally lies in one of two categories: either the game evaluates the music and modifies the game environment to communicate success or failure to the player, or it uses parameters of the music to modify the game environment in a non-evaluative manner.
Problem Description
My aim is to create a game with the following qualities: it receives music from the player in real-time
using a musical controller, specifically a MIDI keyboard, and maps the music to in-game effects. Moreover, the
game should be able to respond to the following qualities of the MIDI input: tempo, velocity (dynamics), key,
chord quality, interval size, individual note class, and note octave. This idea will be framed in a puzzle game
consisting of several levels, each with its own objectives. The overarching mechanic behind all levels, though,
will be the interaction of various game objects, each of which will respond to a different aspect of the musical
input. Each level will contain a different subset of these game objects, and the goal will be achieved by causing
the game objects to interact in a certain manner. Each level will also adhere to one of three categories based on
the way in which the world unfolds over time:
1.
The sandbox world; while the objects in the world may change with time, the ability to meet the
level’s objective will not be inhibited. This level type allows for experimentation without the risk of
failure, since the level is structured so that an incorrect solution will not put the game world into an
unwinnable state.
2.
The directed world; this world will be strictly timed. The player must input a desired sequence of notes
with arbitrary precision in order to succeed. Failure to meet this constraint will result in failure of the
level. There may be slight latitude in this constraint in terms of playing incorrect notes. Since all input
maps to the control of objects, if a note is pressed which has no mapping in the level, there will be no
adverse effects.
3.
The segmented world; this world will consist of a number of strictly timed segments like those of the
directed world. Failing a segment, however, will not result in failure of the level, but will return the
game state to its condition at the beginning of the failed segment. In this way, each segment may be
attempted as many times as necessary for success.
Game objects should each respond to a single note, so that the combined information of octave and
note class will uniquely identify that object’s controlling note. This note may be the sole means of controlling
that object, or it may be the root of a chord, the lowest note of an interval, or the key’s tonic. Some indication
of this will be made in-game. Controlling notes will be confined to only two octaves, allowing for a maximum
of 24 separately controlled objects in play at a time. There may also be multiple objects controlled by the same
note. With two octaves dedicated to object control, a third octave will be used for functionality external to the
game world, such as menu navigation. The primary concern of this project is the functional implementation of these features; no claim is made as to the ease of use or quality of the gameplay.
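As a concrete illustration of the note-to-object mapping just described (a sketch only, not part of the proposed implementation; the class and method names are hypothetical), the code below shows how a raw MIDI note number could be split into note class and octave and routed to either the object-control band or the menu band. It assumes the common convention in which middle C (MIDI note 60) is written C4, so that A3 corresponds to MIDI note 57.

public final class NoteRouter {

    public enum Band { OBJECT_CONTROL, MENU, IGNORED }

    private static final String[] NOTE_CLASSES =
            {"C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"};

    /** Note class ("C" through "B") of a MIDI note number. */
    public static String noteClass(int midiNote) {
        return NOTE_CLASSES[midiNote % 12];
    }

    /** Octave of a MIDI note number, with MIDI 60 treated as C4. */
    public static int octave(int midiNote) {
        return (midiNote / 12) - 1;
    }

    /** Routes a note to the object-control band (A3-G#5), the menu band (A5-G#6), or neither. */
    public static Band band(int midiNote) {
        if (midiNote >= 57 && midiNote <= 80) return Band.OBJECT_CONTROL; // A3..G#5: the 24 controlling notes
        if (midiNote >= 81 && midiNote <= 92) return Band.MENU;           // A5..G#6: menus and superficial effects
        return Band.IGNORED;
    }

    public static void main(String[] args) {
        int note = 60; // middle C
        System.out.println(noteClass(note) + octave(note) + " -> " + band(note)); // prints "C4 -> OBJECT_CONTROL"
    }
}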
Related Work
If we look at the most popular examples of musical games, most fall into the genre of rhythm games, which require the player to mimic a musical phrase or play notes as they are shown on the screen. Examining them based on the above qualities, rhythm games receive music from the player in real-time and use it to measure success. The earliest of these games, such as Parappa the Rapper [1] in 1996, made use of non-musical controllers (the standard gamepad) for input. More modern games generally make use of pseudo-musical controllers; that is, controllers which resemble musical instruments, such as the guitar-shaped
controller used in Guitar Hero [2] and the more realistic guitar, bass, drums, and keyboard used in Rock Band 3
[3]. Other games allow the use of real instruments, either through sound recording or using MIDI or a similar
protocol. An example of the first type is Visual Music, developed by the University of Calgary’s Digital Media
Lab, which listens to the player’s performance on any desired instrument and evaluates based on selected
criteria [4]. An example of the second type is Synthesia [5], an originally open-source project developed by
Nicholas Piegdon, which allows the user to play in sync with falling notes using a MIDI keyboard. All rhythm
games are examples of performance evaluators – they judge success based on the quality of the player’s
performance. Another type of musical game is the performance illustrator. These games accentuate a musical
performance by visual effects, in some way linking the visual world with the musical world.
The first type of performance illustrator is the game that acts itself as a musical instrument. In this type
of game, the interaction of objects in the game world creates music. In general terms, these games create music in real-time and use non-musical controllers; furthermore, the music has no effect on the game; rather, the game is used as an expressive tool for generating music. One of the better-known games of this type is the
2005 Nintendo game Electroplankton [6], which allows the player to interact with the Nintendo DS
touchscreen and controls in several levels to create musical and visual effects by causing the in-game objects to
interact. A similar example is Javier Sanchez and Jaroslaw Kapuscinski’s Counterlines project, which allows for
music to be created by drawing lines on a tablet [7]. The orientation and length of the drawn lines are used to
determine the produced music.
The other type of performance illustrator is that which acts as a visualization of musical input. These are not games in the usual sense, but can be considered so since they provide some form of entertainment and interactivity with a virtual world. Visualizations are sourced by the player in real-time, use musical controllers, and use the music as a mapping to in-game events. One such game is Symbiotic Symphony, a modification of Conway's classic Game of Life. In this game, input is taken from a musical instrument and the amplitude and frequency are used to determine a region of cells which are stimulated by the music [8]. The
previously mentioned Counterlines project is also an example of this type, since in addition to transforming
lines into music, it allows music to be transformed into lines.
Also in this vein is research performed by Taylor et al. into interaction with a virtual character through
musical performance [9]. In this experiment musical input was parsed through recorded voice and MIDI and
mapped onto predefined animations for the virtual character. This possesses all the same qualities as the game I
intend to create, but differs in that the control of a virtual character has no objective; there is no game-specified
condition for success or failure. Another close example is David Hindman’s Modal Kombat [10], which uses a
device called the Sonictroller that allows real musical instruments to be used as a substitute for a traditional control pad, and applies it to the game Mortal Kombat, allowing two players to compete against each other. This experiment differs from my game in that it makes no use of MIDI and that, while it does offer a musical controller, it interfaces it with games designed for traditional gamepads, which are therefore unable to make use of all the musical information provided by the instrument.
A third type of performance illustrator is one that also acts as a visualization of musical input, but
which makes use of prerecorded music rather than live music. For this I refer to the work of Juha Arrasvuori
and Jukka Holm, who have experimented with using both MIDI and MP3 files as input to affect game state [11].
In both cases existing games have been modified to accept mappings of various MIDI or MP3 parameters onto
in-game objects and processes. In this case there is a similarity to my proposed game in that the musical input is used to control many objects in the game world rather than a single character, but it only does so passively, since the player still retains control over the game's main character, and it is not performed in real-time.
Considering these various types of musical games, my proposed game would fit into the category of
performance illustrators, one marrying the philosophy of the prerecorded visualizations to control multiple
game parameters and the implementation of the live visualizations to accept musical input as the primary
means of control. The games which act as musical instruments are also influential, since once the objects in the game are set in motion by the musical input, subsequent interactions between them may elicit musical responses generated by the game. The category of performance evaluators is mentioned more for the sake of completeness and historical reasons, as these were the first games to incorporate music as an important gameplay feature.
Problem Solution
At a high level, the structure of the game will consist of three parts: the MIDI device, the translation
layer, and the control layer. Basic MIDI events will flow from the MIDI device into the translation layer; of these, only note-on and note-off events are relevant to the game's operation. These events will then be translated into higher-level control information as specified above; specifically, for each note, interval, or chord in the input sequence, one control frame is generated by the translation layer and passed to the control layer. This program structure is illustrated in Figure 1.
A given control frame will contain the following information: the current estimate of the tempo, the
current possibilities for the key, and the underlying control object, which will be a note, an interval, or a chord.
Given this control frame, the control layer will iterate through all controllable objects in the game world and
pass the control frame to those which accept the control object. In addition, any global objects which make use
of the tempo and key information will be updated.
Figure 1: Program structure. MIDI events flow from the MIDI device into the translation layer, which passes control frames to the control layer of the game.
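To make the flow of control frames more concrete, the following sketch (in Java, the implementation language chosen later in this proposal) outlines the kind of data a control frame might carry and how the control layer could dispatch it. The names ControlFrame, Controllable, and ControlLayer are illustrative placeholders rather than the project's actual API.

import java.util.List;

// Sketch of a control frame and of the control layer's dispatch loop.
final class ControlFrame {
    enum Kind { NOTE, INTERVAL, CHORD }

    final double tempoEstimate;        // current estimate of the tempo
    final List<String> possibleKeys;   // current possibilities for the key
    final Kind kind;                   // type of the underlying control object
    final int[] midiNotes;             // the note(s) making up the control object

    ControlFrame(double tempoEstimate, List<String> possibleKeys, Kind kind, int[] midiNotes) {
        this.tempoEstimate = tempoEstimate;
        this.possibleKeys = possibleKeys;
        this.kind = kind;
        this.midiNotes = midiNotes;
    }
}

interface Controllable {
    /** Does this object respond to the frame's control object (matching note, interval, or chord)? */
    boolean accepts(ControlFrame frame);

    /** Apply the object's response to an accepted frame. */
    void receive(ControlFrame frame);
}

final class ControlLayer {
    private final List<Controllable> objects;

    ControlLayer(List<Controllable> objects) {
        this.objects = objects;
    }

    /** Iterate over all controllable objects and pass the frame to those that accept it. */
    void dispatch(ControlFrame frame) {
        for (Controllable obj : objects) {
            if (obj.accepts(frame)) {
                obj.receive(frame);
            }
        }
        // Global tempo- and key-dependent objects would also be updated here.
    }
}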
In order to discuss the handling of MIDI events in the translation layer, let M = {m_i : 0 ≤ i < n} represent a series of n MIDI input messages and {t_i : 0 ≤ i < n} represent the times at which they are received, in arbitrary-length ticks. Furthermore, let M̂ = {m̂_i : 0 ≤ i < n̂} represent the series of distinct units of M, where n̂ ≤ n and a unit is one or more notes played simultaneously. Two things to consider here are the latency between the time the notes are played and the time they reach the program, and the jitter value, which determines by what amount the latency fluctuates. The latency itself is of no concern for processing the data; what is concerning is the jitter, since a sudden change in the latency can affect the ability of the program to determine simultaneous note presses. According to a 2004 study, on Windows using a USB MIDI interface, the jitter does not generally exceed 8 ms [12]. Given the prevalence of USB and advances in technology, we can reasonably expect a lower value today, or at the very least one that is no worse. The constraint this imposes is shown below. From M, the game must be able to determine:
1.
The series M̂. For this a tolerance value ε (measured in ticks) must be defined, and events that occur within that window will be considered simultaneous. Because of the jitter, it must be the case that ε > j, with j being the maximum jitter. Once ε is determined, we can do the following: when the first note m_0 is played, a new buffer is initialized. New notes are placed in this buffer until a note m_i is reached such that |t_i − t_0| > ε. At this point, the notes in the buffer are processed and sent out as a unit, and m_i is used to begin a new buffer. Then the process repeats. (A code sketch of this buffering, and of items 2 and 4, appears after this list.)
2.
The average tempo of the events, measured in units per tick. This is simply |M̂| / (t_{n−1} − t_0).
3.
The interval between any two events. Since pitch is encoded in each MIDI event, this is trivial.
4.
The quality of a chord consisting of three events. This follows from determining each interval.
5.
The key of M. This will merely be a check to see whether M contains seven distinct note classes and whether these note classes fit some key signature. Any more advanced detection is unnecessary, since objects responding to key will be limited.
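The sketch below illustrates items 1, 2, and 4 of this list under the definitions above: grouping incoming note events into units with a tolerance ε, computing the average tempo, and classifying a three-note chord by its stacked intervals. The class and method names are hypothetical, and the grouping loop is only one possible realization of the buffering scheme described in item 1.

import java.util.ArrayList;
import java.util.List;

// A single note-on event with its arrival time in ticks.
final class NoteEvent {
    final int midiNote;
    final long tick;
    NoteEvent(int midiNote, long tick) { this.midiNote = midiNote; this.tick = tick; }
}

final class TranslationSketch {

    /** Groups events into units: notes arriving within epsilon ticks of the unit's first note. */
    static List<List<NoteEvent>> groupIntoUnits(List<NoteEvent> events, long epsilon) {
        List<List<NoteEvent>> units = new ArrayList<>();
        List<NoteEvent> buffer = new ArrayList<>();
        for (NoteEvent e : events) {
            if (buffer.isEmpty() || e.tick - buffer.get(0).tick <= epsilon) {
                buffer.add(e);                 // still within the simultaneity window
            } else {
                units.add(buffer);             // close the current unit
                buffer = new ArrayList<>();
                buffer.add(e);                 // this note begins a new buffer
            }
        }
        if (!buffer.isEmpty()) units.add(buffer);
        return units;
    }

    /** Average tempo as units per tick over the span of the input messages. */
    static double averageTempo(List<List<NoteEvent>> units, long firstTick, long lastTick) {
        if (lastTick <= firstTick) return 0.0;
        return (double) units.size() / (lastTick - firstTick);
    }

    /** Quality of a three-note chord in root position, from its two stacked intervals in semitones. */
    static String chordQuality(int root, int third, int fifth) {
        int lower = third - root, upper = fifth - third;
        if (lower == 4 && upper == 3) return "major";
        if (lower == 3 && upper == 4) return "minor";
        if (lower == 3 && upper == 3) return "diminished";
        if (lower == 4 && upper == 4) return "augmented";
        return "other";
    }
}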
For the control layer, we now consider the objects in the game. Each of the game objects will consist of
three components. As already discussed, one component of these objects will be the controlling note, which
will lie inclusively in the range A3 to G#6; in addition, each game object will have a desired type of control
object (note, interval, or chord). This information will be displayed to the player in the form of a letter (the
note class) overlaid onto a symbol representing the type of control object. Preliminary concepts for these
symbols are shown in Figure 2. The third component will be the object’s response upon receiving a control
frame. This, of course, will be the defining characteristic of the object, and some ideas for responses are
discussed later. Objects can be considered to be of one of two types: those which affect the state of the level,
and those which do not. The first type will accept controlling notes in the range A3 to G#5, and will be the
most prevalent object type in the game, being the means by which the player completes levels. The second type
of game object will only accept controlling notes in the range A5 to G#6. These objects will either have
superficial effects such as altering the background music or the level’s color palette, or will be used for functions
external to the level, such as game menu navigation.
Figure 2
Concepts for object bubbles displaying control information to the player. The interval bubble displays both notes rather than the large number of possible intervals. The chord bubble displays the root and the quality, since there are only four possibilities.
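Returning to the three components just described, the sketch below shows how a game object might be represented in code, building on the ControlFrame and Controllable sketch given earlier. The accepts() check assumes, for illustration, that the first MIDI note in a control frame is the single note, the lower note of the interval, or the root of the chord.

// Sketch of a game object: a controlling note, a desired type of control object, and a response.
abstract class GameObject implements Controllable {
    private final int controllingNote;           // MIDI note in the range A3 (57) to G#6 (92)
    private final ControlFrame.Kind desiredKind; // NOTE, INTERVAL, or CHORD

    protected GameObject(int controllingNote, ControlFrame.Kind desiredKind) {
        this.controllingNote = controllingNote;
        this.desiredKind = desiredKind;
    }

    @Override
    public boolean accepts(ControlFrame frame) {
        return frame.kind == desiredKind
                && frame.midiNotes.length > 0
                && frame.midiNotes[0] == controllingNote;
    }

    /** Whether the object affects level state (A3-G#5) or is superficial/menu-level (A5-G#6). */
    public boolean affectsLevelState() {
        return controllingNote <= 80;   // G#5 under the middle-C-equals-C4 convention
    }

    @Override
    public abstract void receive(ControlFrame frame);  // the object's defining response
}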
Considering that the game’s central principle is the interaction of the game objects in a given level, it
follows that in discussing the specific objects much attention should be given to the manners in which they are
able to interact. Then there are two facets of note to any object: how its state can be affected by other objects,
and how, when it is controlled by the player, it is able to affect the states of other objects. Furthermore, it is
helpful to divide the objects into groups based on the means of their interaction. With that in mind, the object
groups are as follows:
Motion:
These objects are able to interact with each other physically, by means of colliding, repulsing, or otherwise perturbing one another.

Barrier Orb:
Affected By:
None.
Affects:
All solid moving objects.
Control Type:
Single Note
Response:
When the controlling note for this orb is played, a straight-line solid barrier is formed between it and each other orb with the same controlling note.

Ball:
Affected By:
All solid objects.
Affects:
Wire – can bridge two disconnected pieces of wire.
Control Type:
Chord
Response:
Ball changes material based on the chord quality. Possible materials include rubber, which allows the ball to bounce; iron, which allows the ball to cling to magnets; iridium, which can smash through obstacles; and copper, which allows the ball to transmit electric signals. (A code sketch of this response appears after the object list.)

Gate:
Affected By:
None.
Affects:
All solid moving objects.
Control Type:
Single Note
Response:
Playing the corresponding note opens the gate. The gate can be closed again by playing the note a second time or by waiting for a certain time period to elapse.

Launcher:
Affected By:
None.
Affects:
All solid objects.
Control Type:
Single Note
Response:
Applies a force to any object in front of it with a magnitude comparable to the velocity (dynamic) of the played note.

Fan:
Affected By:
None.
Affects:
Lightweight objects.
Objects affected by temperature.
Control Type:
Chord
Response:
Similar to the launcher; the dynamic of the chord will determine the force with which the fan blows. Also, the chord quality determines the temperature of the air that the fan blows, either cold or hot. For the case where the fan simply blows an object forward, this has no effect; however, the air temperature can be used to freeze, melt, or evaporate other game objects.
Water:
These objects are able to interact with each other by the manipulation of water.

Cloud:
Affected By:
None.
Affects:
Any object affected by falling water.
Control Type:
Chord
Info:
For non-water functionality, see cloud in the electricity group.
Response:
Responds to a diminished chord by producing rain, which falls directly down until encountering a solid object.

Bucket:
Affected By:
Falling water.
Affects:
Any object affected by falling water.
Control Type:
Single Note
Response:
Playing the controlling note causes the bucket to spill any water it is carrying directly below it.

Hose:
Affected By:
None.
Affects:
Any object affected by water.
Control Type:
Interval
Response:
The size of the played interval, with its lowest pitch as the controlling note, dictates the rate of flow of the water emanating from the hose. This size can range from a unison (just the controlling note), which turns the hose off, to an octave (P8), which is the maximum rate of flow.
Electricity:
These objects are able to interact with each other through electrical signals.

Generator:
Affected By:
None.
Affects:
Wire – provides power to the wire.
Control Type:
Single Note
Response:
Produces electricity in an amount proportional to the tempo of its controlling note.

Wire:
Affected By:
Generator – powers the wire.
Output Port – receives its current state (high or low) from the port.
Affects:
Input Port – transmits its current state to the port.
Control Type:
Single Note
Response:
Playing the note which corresponds to the wire in conjunction with the note that corresponds to an empty input or output port causes the two to connect. Similarly, playing the note which corresponds to the wire in conjunction with the note that corresponds to the port it is connected to causes the two to disconnect.

Input Port:
Affected By:
Wire – receives state from wire.
Affects:
Output Port – transmits state to corresponding output port.
Control Type:
Single Note
Response:
See Wire.

Output Port:
Affected By:
Input Port – receives state from corresponding input port.
Affects:
Wire – transmits state to wire.
Control Type:
Single Note
Response:
See Wire.

Logic Gate:
Affected By:
Input Port – receives states on which to perform logic operation.
Affects:
Output Port – transmits result of logic operation onto attached output port.
Control Type:
Single Note
Description:
Takes two input ports and combines them (with AND, OR, or XOR) into an output port.
Response:
Toggles the inversion of the output, e.g. between AND and NAND.

Cloud:
Affected By:
None.
Affects:
Any powerable object.
Control Type:
Chord
Info:
For non-electric functionality, see cloud in the water group.
Response:
Responds to an augmented chord by producing a lightning bolt, providing power to the first object it meets directly below it.

LED Battery:
Affected By:
None.
Affects:
Output Port – each LED has an output port which is set high while the LED is active.
Control Type:
Single Note (for each LED)
Response:
Each LED lights up when its controlling note is played. The sequence and timing in which the note of each LED is played is recorded. Once a certain amount of time has passed since the control of the last LED, or when the controlling note of an object outside the LED battery is played, the sequence is considered to be entered. At that point, the LEDs repeatedly light up in time with the entered sequence; while each LED is on, an attached output port is in a high logic state.
Mnemonic:
These objects require the memorization and playback of a musical phrase.

Bird:
Affected By:
None.
Affects:
Anything.
Control Type:
Single Note
Description:
The bird sings a phrase; once the bird's controlling note is played, all subsequent notes are routed directly to the bird, so long as they match the phrase in pitch and timing. If a note is played that does not match the phrase, notes begin to be once again routed to their usual objects.
Response:
If the bird's phrase is correctly matched, it will undertake some predetermined action in the game world, such as removing some physical boundary or moving some unreachable object.
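To make one of these descriptions concrete, the following sketch implements the Ball's response using the GameObject and chord-quality helpers sketched earlier. The particular mapping from chord quality to material is an arbitrary placeholder, since the proposal does not fix one; only the general mechanism (a chord-controlled material change) comes from the object list above.

// Sketch of the Ball object: its controlling chord's quality selects its material.
final class Ball extends GameObject {
    enum Material { RUBBER, IRON, IRIDIUM, COPPER }

    private Material material = Material.RUBBER;

    Ball(int controllingNote) {
        super(controllingNote, ControlFrame.Kind.CHORD);
    }

    @Override
    public void receive(ControlFrame frame) {
        int[] n = frame.midiNotes;  // assumed: root, third, and fifth of the played chord
        switch (TranslationSketch.chordQuality(n[0], n[1], n[2])) {
            case "major":      material = Material.RUBBER;  break; // bounces
            case "minor":      material = Material.IRON;    break; // clings to magnets
            case "diminished": material = Material.IRIDIUM; break; // smashes obstacles
            case "augmented":  material = Material.COPPER;  break; // conducts electric signals
            default:           /* unrecognized chord: no change */ break;
        }
    }

    Material material() { return material; }
}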
While this is by no means a finalized list of objects that will appear in the game, it does provide a
strong notion as to the types of objects which will appear and illustrates the ways in which they can affect one
another. To further show the methods by which they can be used in conjunction for puzzle solving, I will list a
few small puzzles which could serve as parts of a level or as miniature versions of a full level. For the first
example, consider a puzzle in which the objective is to power a single LED – such a puzzle could have the
following appearance:
Figure 3: A directed-world puzzle in which the objective is to power a single LED. The level contains a launcher, a ball, a gate, barrier orbs, a breakable barrier, a magnetic platform, a generator, and a wire leading to the LED.
The sequence of actions required to achieve the objective, ignoring timing, would be as follows –
launch the ball, turn the ball to rubber, activate the leftmost barrier orbs, open the gate, activate the rightmost
barrier orbs, turn the ball to iridium, turn the ball to iron, turn the ball to copper, power the generator. This
would bounce the ball through the gate, bounce the ball into the barrier and break through it, attach it to the
magnetic platform, and drop it into the gap in the wire. Since there is a strict timing element to the puzzle (failing to enter an input in time would result in the ball falling off the stage), this would fall under the category of a directed world as described earlier. By controlling the parameters of the puzzle, such as the
controlling notes of the objects, the acceleration of gravity, the time which the gate stays open before closing
and the barrier orbs stay connected, and the rotation speed of the magnetic platform, the input sequence can be
regimented to match a desired musical score. Specifically, the acceleration of gravity would control the tempo,
the time which the gate and orbs stay activated would control the timing window 𝜖 in which an input can be
successfully entered, and of course the controlling notes of the objects would correspond to the notes of the
score. The chords to transform the ball into iron and then to copper would function as the cadence of the piece.
An example of a segmented world would be a puzzle in which the objective is to propagate a signal
through a large circuit. The signal moves forward through a special wire at a fixed speed. There are several gates it must pass through before reaching the end of the stage. Whenever it reaches an AND gate, the player
must have the second input port in a high state; whenever it reaches an XOR gate, the player must have the
second input port in a low state. The world would be segmented for each gate that the signal passes through. If
the player fails to produce the desired logic state before the signal reaches the gate, the level state would reset to
the moment at which the signal passed through the preceding gate.
For an example of a sandbox world, again consider a level where the objective is to provide power to
an LED; the puzzle has the following appearance:
Figure 4: A sandbox puzzle in which the objective is to power an LED. The level contains an LED and power cell joined by wires with three gaps, three launchers, three balls, two magnetic platforms, a hose, a flume, a cloud, a fan, barrier orbs, and an LED battery wired to the launchers.
Ultimately the goal in this puzzle is to simultaneously fill all three gaps in the wire with copper balls.
To achieve this, the balls must be launched in a certain order and with certain timing from the three central launchers.
Specifically, the launchers must be fired in sequence from lowest to highest; this is achieved by playing the
proper sequence for the LED battery so that the launchers are activated correctly. Next, both the ball from the
left and the ball from the right side must be moved to their respective launchers. On the left side, first the cloud
must be frozen, then the ball turned to a non-magnetic material so as to land on the frozen cloud, and then the ball must be dropped onto and bounced from the barrier orbs onto the launcher. On the right side, the
hose must be adjusted to the correct rate of flow such that when the ball is dropped from its platform, it does
not undershoot or overshoot its launcher when falling from the end of the flume. This puzzle falls into the sandbox category since there is no one correct input sequence; the LED battery may be activated at any time,
and the two balls may be moved in any order, or even simultaneously.
Now that I have discussed the game at a high level, I will mention some details of the implementation.
First, I have decided to implement the game in Java, for several reasons. Most importantly, Java already has a
robust MIDI API which will take care of getting the MIDI events to the translation layer. While an additional library for analyzing the incoming MIDI events would be useful, after surveying several MIDI-capable libraries I found that no such facility was available. Of the libraries, JFugue does support the parsing of simultaneous note presses through its Parser interface [13], but after examination of the source code I have determined that its MIDI parser is incapable of this. Another reason I have chosen Java is that through my previous work with games in
Java I have developed a fairly extensive codebase which can easily be reused for this game, taking care of the
basics of rendering and updating the game world. Both of these will afford me greater time to focus on the most
important aspects of the game: the translation layer, the control layer, the game objects, and the levels.
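For reference, a minimal sketch of how the incoming events could be obtained through the standard javax.sound.midi API is shown below. Device selection is deliberately simplified (the first device exposing a transmitter is used); a real implementation would let the player choose the keyboard and would forward the events to the translation layer rather than printing them.

import javax.sound.midi.MidiDevice;
import javax.sound.midi.MidiMessage;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.MidiUnavailableException;
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;
import javax.sound.midi.Transmitter;

public class MidiInputSketch {
    public static void main(String[] args) throws MidiUnavailableException {
        for (MidiDevice.Info info : MidiSystem.getMidiDeviceInfo()) {
            MidiDevice device = MidiSystem.getMidiDevice(info);
            if (device.getMaxTransmitters() == 0) continue;  // device cannot send MIDI input to us

            device.open();
            Transmitter transmitter = device.getTransmitter();
            transmitter.setReceiver(new Receiver() {
                @Override
                public void send(MidiMessage message, long timeStamp) {
                    if (!(message instanceof ShortMessage)) return;
                    ShortMessage sm = (ShortMessage) message;
                    // Only note-on and note-off events are relevant to the game.
                    if (sm.getCommand() == ShortMessage.NOTE_ON && sm.getData2() > 0) {
                        System.out.println("note on  " + sm.getData1() + " velocity " + sm.getData2());
                    } else if (sm.getCommand() == ShortMessage.NOTE_OFF
                            || (sm.getCommand() == ShortMessage.NOTE_ON && sm.getData2() == 0)) {
                        System.out.println("note off " + sm.getData1());
                    }
                }

                @Override
                public void close() { }
            });
            break;  // use the first suitable device for this sketch
        }
    }
}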
Materials / Budget
The materials needed for this project are a MIDI transmission device for my computer and a portable
MIDI keyboard for demonstrating the game. The simplest and cheapest MIDI transmitters are USB 1-to-1
converters, of which I have found two that are best suited for my project, having moderately long cables and
having been well reviewed with regard to signal response time, which is vital to the playability of the game.
These are:
M-Audio UNO – Price: $35-$50; Cable Length: 10 feet
E-MU XMidi – Price: $30; Cable Length: 8 feet
Between these the UNO would be a better choice, not being very much more expensive than the
XMidi and providing an extra two feet of length (eight feet is infeasible for my current setup). As for portable
MIDI keyboards, the nature of my project requires one that is velocity sensitive and has at the very least 36 keys. With these specifications in mind, the keyboard that strikes the greatest balance between expense and performance is the M-Audio Keystation 61es, which ranges in price from $150 to $200. Combining these, a reasonable estimate for the total cost would be $250.
Timeline
Nov. 4-18:
Complete functional versions of the translation layer and control layer.
Nov. 18-Dec. 16:
Develop a “testbed” level and several objects to populate it. This will serve as the groundwork
to create real levels and offer a demonstration of the prototypical object functionality. Produce
additional improvements to the translation and control layers.
Dec. 16-Jan. 27:
Develop a level that falls under the directed world category and a hub environment to access
the various levels as they are completed. Continue to develop objects. Finalize translation and control
layers.
Jan. 27-Feb. 10:
Complete working menu framework that allows for exiting the game and returning to the hub from a
level. Develop a suite of visual effects controlled by the top octave. Continue to develop objects.
Feb. 10-Mar. 2:
Develop a level that falls under the segmented category. Finalize development of objects.
Mar. 2-Mar. 30:
Develop menu functionality to modify game parameters in order to alter aspects of the directed world
such as tempo and precision of input. Offer sets of preset parameters fitting various levels of difficulty.
Mar. 30-Apr. 13:
Develop a level that falls under the sandbox category.
[1]
Parappa the Rapper. http://en.wikipedia.org/wiki/PaRappa_the_Rapper. Accessed 4 October 2011.
[2]
Guitar Hero. http://hub.guitarhero.com/. Accessed 4 October 2011.
[3]
Rock Band. http://www.rockband.com/. Accessed 4 October 2011.
[4]
J. R. Parker and John Heerema. 2008. Audio interaction in computer mediated games. Int. J. Comput.
Games Technol. 2008, Article 1 (January 2008), 8 pages. DOI=10.1155/2008/178923
http://dx.doi.org/10.1155/2008/178923
[5]
Synthesia. http://www.synthesiagame.com/. Accessed 4 October, 2011.
[6]
Electroplankton. http://en.wikipedia.org/wiki/Electroplankton. Accessed 4 October, 2011.
[7]
Javier Sanchez and Jaroslaw Kapuscinski. 2010. Counterlines: a duet for piano and pen display.
In Proceedings of the 28th of the international conference extended abstracts on Human factors in
computing systems (CHI EA '10). ACM, New York, NY, USA, 4765-4770.
DOI=10.1145/1753846.1754228 http://doi.acm.org/10.1145/1753846.1754228
[8]
Mark Havryliv and Emiliano Vergara-Richards. 2006. From Battle Metris to Symbiotic Symphony: a
new model for musical games. In Proceedings of the 2006 international conference on Game research
and development (CyberGames '06). Murdoch University, Australia,
260-268.
[9]
Robyn Taylor, Daniel Torres, and Pierre Boulanger. 2005. Using music to interact with a virtual
character. In Proceedings of the 2005 conference on New interfaces for musical expression (NIME '05). National University of Singapore, Singapore, 220-223.
[10]
David Hindman. 2006. Modal Kombat: competition and choreography in synesthetic musical
performance, [the first ever instrument-controlled video game battle]. In Proceedings of the 2006
conference on New interfaces for musical expression (NIME '06). IRCAM - Centre Pompidou, Paris, France, 296-299.
[11]
Juha Arrasvuori and Jukka Holm. 2010. Background music reactive games. In Proceedings of the 14th
International Academic MindTrek Conference: Envisioning Future Media Environments (MindTrek
'10). ACM, New York, NY, USA, 135-142. DOI=10.1145/1930488.1930517
http://doi.acm.org/10.1145/1930488.1930517
[12]
Mark Nelson and Belinda Thom. 2004. A survey of real-time MIDI performance. In Proceedings of the
2004 conference on New interfaces for musical expression (NIME '04), Michael J. Lyons (Ed.). National University of Singapore, Singapore, 35-38.
[13]
JFugue. http://www.jfugue.org/index.html. Accessed 19 October, 2011.