The intersection of arts, science and engineering

PROFESSOR JOANN KUCHERA-MORIN
In this discussion, Professor JoAnn Kuchera-Morin explains the development of the AlloSphere,
an immersive 3D environment that allows scientists to experience their data on a sensorial level
First, can you provide an overview of the
AlloSphere project? What role have you
played in this grand venture?
I created the AlloSphere project and
instrument. It is the culmination of 28
years of my research in media language/
systems design. From my background
as a composer and a media systems
researcher, the language and instrument
have been designed based on the creative
process of musical composition and
ensemble performance. Think of the
AlloSphere as a large immersive scientific,
artistic instrument, in which researchers
– typically an interdisciplinary team of
artists, scientists and engineers – work
together, data mining very large and/or complex information.
How have you achieved holistic rethinking in aspects of the creative medium, such as data, process, perception, interaction, immersion and evaluation?

By applying the creative compositional process of sketching in building our computational language, and representing very complex information through our senses – namely, visual and audio representations that you can interact with – we are enabling the same right brain-left brain process that artists experience when they create a work of art, for scientists and other researchers. This will facilitate the uncovering of new patterns in complex information and allow scientists and engineers to work with their information, perceptually and intuitively, the way that artists do.

We can represent any information as visual and audio frequencies by mapping its vibratory spectrum into the light and sound domains. This is well understood from the mapping of heat through infrared light. For instance, when atoms vibrate they create waveforms that behave as sound waves. Scaling these down into the audio domain allows us to use our hearing to understand their complex behaviour. We are thus finding patterns in this complex information that may lead to new scientific discoveries, as well as creating new and unique art forms.

‘Big data’ is becoming a hot topic among researchers. Could you explain what it means to your project?

We believe that one of the most difficult tasks in understanding big data is the ability to quickly find new patterns in voluminous amounts of information, and to retain the information from pages and pages of numbers in one’s memory. If there is a way to translate this information into tangible visual and audio taxonomies, we may be able to retain it much more readily.
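The idea of scaling vibration frequencies down into the audible band can be sketched as a simple octave-transposition routine. This is a minimal illustration, not the AlloSphere's actual mapping code; the function name and band limits are assumptions for the sketch.

```python
def scale_to_audible(freq_hz, lo=20.0, hi=20_000.0):
    """Transpose a (positive) frequency by octaves -- repeated halving
    or doubling -- until it lands in the audible band. Octave shifts
    preserve the pitch class, so the character of the vibration
    survives the scaling. Illustrative sketch only."""
    while freq_hz > hi:
        freq_hz /= 2.0
    while freq_hz < lo:
        freq_hz *= 2.0
    return freq_hz
```

For example, a molecular vibration in the tens of terahertz scales down by a few dozen octaves to land in the kilohertz range, well inside hearing range.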
In what way do mathematical algorithms
inform visual and sonic representations
within the AlloSphere?
Information can be represented through
mathematical models which can be
translated into visual and aural frequencies.
These mappings can be very literal when
one is mapping scientific information. They
become more abstract with data such as
financial information, social networks and
economic data. For example, scientific data such as atoms and subatomic particles vibrate at frequencies outside of the visual and audio domains.
Through a mathematical transformation,
they can be scaled down into the sense
domain where they can be perceived; the
same applies for chemical and biological
information. When this information already
exists in the visual domain, a picture is not
the only way it can be represented. Another
option is to take the rasterised grid of
spatial locations that represents the picture
and construct a visual computation model
that is a geometric mesh of the image.
An example would be the fMRI output of
the brain. This will allow for simulations
that can transform the mesh in order to
make a prediction of how the brain may
change if, for instance, a tumour were to
grow in a specific area. With more abstract
information, such as financial or economic
data, a frequency has a pitch and an
amplitude that can represent how quickly
the data is moving and at what volume.
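That pitch-and-amplitude mapping for abstract data can be sketched as follows. The base frequency, semitone step and normalisation are illustrative choices for the sketch, not the AlloSphere's actual mapping.

```python
def sonify_series(values, base_hz=220.0):
    """Map an abstract data series (e.g. prices) to (pitch, amplitude)
    pairs: the step-to-step change sets the pitch (faster movement ->
    higher pitch, in semitone steps) and the magnitude of the current
    value sets the amplitude (the 'volume' of the data)."""
    semitone = 2 ** (1 / 12)                # equal-tempered semitone ratio
    peak = max(abs(v) for v in values) or 1.0
    pairs = []
    for prev, cur in zip(values, values[1:]):
        pitch = base_hz * semitone ** (cur - prev)
        amp = abs(cur) / peak               # normalised to [0, 1]
        pairs.append((pitch, amp))
    return pairs
```

A fast-rising series thus climbs in pitch while large absolute values play loudly, so movement and volume can both be heard at once.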
Finally, are you exploring any other areas
of research?
Over the past three years, my colleagues
and I have worked on a new approach to
representing, generating and transforming
wave functions of a hydrogen-like atom
based on the creative processes of musical
composition. We have attempted to
represent the state of a quantum system
and its evolution in an immersive integrated
environment as a way to obtain, for the
scientist, a better intuitive understanding
of quantum reality; and to produce, for the
artist, a powerful medium for aesthetic
investigations and creative expression that
also speaks to truth.
The ability to explore data in a multidimensional format, with rich visualisations and sounds, has been a scientific pipe dream for many
years. Imagine a place where scientists could use
their senses to study data and understand it at
a perceptual level, as if shrunken down, coming
face to face with their research. Such insights are
considered crucial for advancements in certain
areas of science and engineering, and since
1997, a group of digital media researchers at the
University of California, Santa Barbara (UCSB) has
been developing a holistic field that brings these disciplines together through discoveries in new media: an intersection of arts, science, engineering
and the creative process.
These discoveries have given birth to a new kind of
immersive laboratory, which allows for the visual
and aural representation of mathematical and
scientific data. Increasingly, scientific challenges that have traditionally been tackled analytically are now being approached computationally, with the answers represented visually.
At its most fundamental level, the AlloSphere is a digital microscope powered by a
supercomputer: a 30-foot diameter ball, set
within a three-storey, echoless cube, which
offers researchers an opportunity to synthesise,
manipulate and analyse large datasets and so-called ‘big data’ in a virtual reality environment;
one that can be interacted with on a sensorial
level. Aside from visualisation, the AlloSphere
facilitates various channels of inquiry, including
numerical simulations, data mining, abstract
data representations, systems integration and
human perception.
Leading the project is the AlloSphere’s conceiver,
Professor JoAnn Kuchera-Morin, who designed it
as a visually and aurally immersive multimedia device
for both artistic and scientific endeavour. The
AlloSphere’s original purpose was to provide an
intersection, where researchers from different
fields could share insight and collaboratively
explore phenomena such as symmetry, beauty
and pattern formation. The project is unique in its
grounding in both science and art (without being
bound to either one), urging a holistic revision
of certain components of the creative medium,
including data, process, perception, interaction,
immersion and evaluation.
INTEGRATION, DESIGN
AND THE ALLOBRAIN
The collaborative research team is highly
multidisciplinary and represents media arts, science
and engineering. Each team member performs a
specific function, as Kuchera-Morin explains: “The
media artist, who knows how to represent very
complex and abstract information visually and
sonically in time and space, works closely with
the domain researcher, who understands what is
important in their information. Working together
they make decisions on how the data will be
mapped, with the domain researcher highlighting
the most important characteristics of the data that
is unfolding spatially and in the time dimension,
and the media artist choosing the proper display
techniques to represent the data’s salient
characteristics”. Therefore, by representing data
through the senses, the media artist is channelling
a new intuitive language, which can then be
utilised by the scientist to perceive information,
using the same visual and aural information that
the artist uses to ‘create’. This innovative language
will enable scientists to more readily find patterns
in their data, as well as allow artists to make
‘truthful’ and fascinating works of art. Indeed,
one of the project’s key goals is to uncover new
worlds through simulations and visualisations;
encouraging the concept of ‘beauty as truth’.
INTELLIGENCE

A COMPUTATIONAL FRAMEWORK INTEGRATING METHODS FROM MUSIC COMPOSITION AND SKETCHING FOR LARGE SCALE SCIENTIFIC DATA VISUALIZATIONS IN THE 3D IMMERSIVE ALLOSPHERE

OBJECTIVES

This project entails the design of a new computational framework based on music compositional process and sketching to map and identify patterns in large-scale complex scientific datasets.

Sonic marking of patterns in large datasets and techniques in music composition will be explored as ideal methods to identify complex integrated layers of data, as music carries meaning on several timescales – from individual timbres and pitches, to short melodies and rhythms, all the way up to the large-scale form and structure of a work – each engaging distinct perceptual and cognitive processes.

KEY COLLABORATORS

Dr Matthew Wright, University of California, Santa Barbara

FUNDING

National Science Foundation – award no. 1047678

CONTACT

Professor JoAnn Kuchera-Morin
Principal Investigator
AlloSphere Research Facility
California Nanosystems Institute
Media Arts & Technology and Music
University of California, Santa Barbara
2209 Elings Hall
Santa Barbara, California 93106, USA

T +1 805 893 3010
E jkm@create.ucsb.edu
www.allosphere.ucsb.edu

JOANN KUCHERA-MORIN is a composer, Professor of Media Arts and Technology and of Music, and a researcher in multi-modal media systems content and facilities design. Her years of experience in digital media research led to the creation of a multi-million dollar sponsored research programme for the University of California, Santa Barbara – the Digital Media Innovation Program. The culmination of Kuchera-Morin’s creativity and research is the AlloSphere, a 30 ft diameter, three-storey-high metal sphere inside an echo-free cube, designed for immersive, interactive scientific and artistic investigation of multidimensional datasets. She earned a PhD in composition from the Eastman School of Music and serves as Director of the AlloSphere Research Facility at the University of California, Santa Barbara.

The AlloSphere’s multimodality consists of visual, audio and interactive components. 12 projectors
provide nearly 360° immersive visual projection; 54 audio speakers allow for 3D spatial audio; and a 14-camera sensor tracking system (along with other wireless devices) gives researchers the opportunity
to interface with their data – manipulating and
controlling the information which is mapped to
visual and audio frequencies. The methodology
behind this is based on the represented content; it
steers the technological design and development.
The team has found that most content or data – whether fMRI scans, atoms, financial records or works of art – shares common properties and features. The content has a landscape (a topology) that lives in a dimensional space, and the data undergoes changes whilst inhabiting this space. Kuchera-Morin proposes that the creation of a computational language – one that represents the feature space and any dynamic variations in that space – presents a holistic and interdisciplinary approach to scientific discovery and artistic expression.
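One way to picture this 'feature space plus dynamic variation' idea is as a trajectory through an N-dimensional space. The helper below is a toy sketch; the function name and the Euclidean measure are illustrative assumptions, not part of the actual computational language.

```python
import math

def dynamic_variation(path):
    """Treat content as a trajectory through an N-dimensional feature
    space: 'path' is a list of equal-length coordinate tuples sampled
    over time. Returns the displacement magnitude at each step -- a
    toy measure of how the data changes while inhabiting the space."""
    return [math.dist(a, b) for a, b in zip(path, path[1:])]
```

A media artist could then map each step's magnitude to, say, brightness or loudness, making the data's motion through its landscape directly perceivable.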
A prototype that facilitated the development of the computational language, and on which many subsequent prototypes have been built, is the AlloBrain. Utilising an array of mathematical programming tools (specifically C++ libraries), the AlloBrain project was developed using fMRI data from researcher Marcos Novak’s brain. It is a study that attempts to quantify
beauty through an investigation into the
neurophysiological reasoning behind aesthetic
appreciation. Thus, this side of the project also sees
a unique synergy between scientific endeavour and
artistic expression.
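The AlloBrain itself was built with C++ libraries, but the grid-to-mesh step described in the interview can be illustrated with a minimal Python sketch; the function name and the simple two-triangles-per-cell scheme are assumptions for illustration.

```python
def grid_to_mesh(grid):
    """Turn a rasterised 2D grid of intensity values (e.g. one fMRI
    slice) into a simple geometric mesh: one vertex (x, y, value) per
    sample, and two triangles (as vertex-index triples) per cell."""
    rows, cols = len(grid), len(grid[0])
    verts = [(x, y, grid[y][x]) for y in range(rows) for x in range(cols)]
    tris = []
    for y in range(rows - 1):
        for x in range(cols - 1):
            i = y * cols + x
            tris.append((i, i + 1, i + cols))             # upper-left triangle
            tris.append((i + 1, i + cols + 1, i + cols))  # lower-right triangle
    return verts, tris
```

Once the image lives as a mesh rather than a flat picture, simulations can deform it – for instance to predict how tissue might shift as a tumour grows.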
POTENTIAL APPLICATIONS
AND FUTURE DEVELOPMENTS
The research at UCSB is turning computation into
an interactive multimodal instrument: “If we can
make the computer into a device that will allow
us to interface with the machine more naturally
– such as gesturing with our hands and body
movements, using our voice and eye contact –
there will be a seamless connection from the real
world to that of computation, allowing us to use
technology as a natural extension to ourselves,”
Kuchera-Morin elucidates. Fantastic as it may
seem, the study will lead to the development of
several key applications, including new information
technology systems, ranging from full surround
platforms to mobile devices; interactive visual and
audio displays; and human computer interaction
systems featuring haptics and other senses.
But the potential applications do not stop there. In terms of content, the research is relevant to medical technologies, particularly in the fields of biomedicine, telemedicine, remote diagnostics and therapy, and targeted nanomedicine for treating cancer as well as other illnesses. It is also relevant to new materials for IT systems, energy-efficient materials and smart clothing.
Furthermore, it could be integrated into the
field of arts and entertainment, for example in
immersive cinemas and computer games, as well
as 3D content and interaction.
The dual focus of the project is leading to
groundbreaking developments in media design,
firstly through integrated media systems, and
secondly through multimodal representations of
complex scientific and abstract information (as
well as the creation of new art forms based on this
information). Kuchera-Morin sees a future that is
fully immersed with the technology: “As we scale
the AlloSphere instrument and the computational
language to distributed media platforms moving
to embedded systems, I see the technology
moving to wearable computing LED wallpaper –
the AlloSphere in the palm of your hand with eye
glass devices, like sunglasses, and the computer
in your pocket or embedded in the sleeve of your
shirt. The virtual and the material will be totally
integrated into a seamless space”. The implication
is not only an ability to pluck virtual information
from the ether, using only our senses to compute,
but a better understanding of such areas as
quantum information, the subatomic and the
human genome.
THE ALLOSPHERE INSTRUMENT AS VIEWED FROM ABOVE
© JASON MADARA/BERNSTEIN & ANDRIULLI