

European Network of Heads of Schools of Architecture
European Association for Architectural Education
International Conference
Rethinking the Human in Technology Driven Architecture
International Symposium
File to Factory: The Design and Fabrication of Innovative Forms in a Continuum
Host: CMA (Centre for Mediterranean Architecture),
Chania, Crete, Greece, 3-4 September 2009
This project has been carried out with the support of the European Community
and in the framework of the Socrates Programme.
The content of this project does not necessarily reflect the position of the European Community,
nor does it involve any responsibility on the part of the European Community.
European Network of Heads of Schools of Architecture - European Association for Architectural Education
International Conference
Rethinking the Human in Technology Driven Architecture
Transactions on Architectural Education No 55
Editors
Maria Voyatzaki
Constantin Spiridonidis
Cover design: Emmanouil Zaroukas
Layout design: Dimitris Apostolidis
Printed by: Charis Ltd, Thessaloniki, Greece
ISBN 978-2-930301-53-2
Copyright © 2012 by the authors and the EAAE
All rights reserved. No part of this book may be reproduced in any form, by print, photoprint,
microfilm or by any other means without written permission from the publisher.
Despite the attempt to transcribe the debates from the workshop accurately, the editors
wish to apologise in advance for any inaccuracies in the interventions of individuals that
may be attributable to the quality of the recording.
List of contents
Initiations
Maria Voyatzaki, Constantin Spiridonidis,
Kostis Oungrinis, Marianthi Liapi Greece
Rethinking the Human in Technology-Driven Architecture ............................................................ 13
Inspirations
Manuel DeLanda USA
Form-Finding through Simulated Evolution ................................................................................................ 19
Edith Ackermann USA
Emergent Mindsets in the Digital Age:
What Places for People on the go?
New media ecology / New genres of engagement ................................................................................ 29
Antonino Saggio Italy
GreenBodies - Give me an Ampoule and I will Live ................................................................................ 41
Kostas Terzidis USA
Digital Culture and Permutation Architecture ............................................................................................ 57
Reflections
Chris Younès France
Towards a Responsive Architecture:
Paradox of the Metamorphoses in Play ............................................................................................................ 69
Emmanouil Zaroukas UK
Hacking the Symb(i/o)tic field of Architecture ........................................................................................... 73
Alessio Erioli Italy
The Fashion Robot .............................................................................................................................................................. 91
Anastasios Tellios Greece
Human Parameters in Technology-Driven Design Research:
Formal, Social and Environmental Narratives ............................................................................................. 111
Xin Xia, Nimish Biloria The Netherlands
A 4EA Cognitive Approach
for Rethinking the Human in Interactive Spaces ...................................................................................... 121
Stavros Vergopoulos, Dimitris Gourdoukis Greece
Network Protocols / Architectural Protocols:
Encoded Design Processes in the Age of Control .................................................................................... 135
Socratis Yiannoudes Greece
From Machines to Machinic Assemblages:
a Conceptual Distinction between two kinds
of Adaptive Computationally-driven Architecture ................................................................................. 149
Ava Fatah gen. Schieck UK
Embodied, Mediated and Performative:
Exploring the Architectural Education in the Digital Age ................................................................. 163
Kostis Oungrinis, Marianthi Liapi Greece
Rethinking the Human:
An Educational Experiment on Sensponsive Architecture .............................................................. 175
Sophia Vyzoviti Greece
Pleat and Play ......................................................................................................................................................................... 189
Contributions
(Re)searching a critically responsive architecture
Henriette Bier The Netherlands
Reconfigurable Architecture ..................................................................................................................................... 207
Natassa Lianou, Ermis Chalvatzis UK
Krama Proto-Tower
Systemic Interactions Design Research ............................................................................................................ 211
Rita Pinto de Freitas Spain
Hybrid Architecture - Hybrid Tools ....................................................................................................................... 227
Dimitris Psychogyios Greece
Collective Design for Hybrid Museum Environments .......................................................................... 241
Georgia Voradaki, Despoina Linaraki Greece
Responsive Architecture as a Tool to Suppress the Psychological Disorders
of an Individual and Replace Medicines .......................................................................................................... 251
Anna Klara Veltsista, Nadia Charalambous Cyprus
Architectural Design Studio:
Reconsidering the Digital through Embedded Technologies ...................................................... 265
Mihaela Harmanescu Romania
Responsive Architecture through Urban Planning,
Landscape Architecture and Urban Design .................................................................................................. 277
Sally Stewart UK
Mapping the City: the possibility of developing
a rich picture place through experiments
with conventional, digital and stolen techniques of mapping .................................................... 293
Luigi Foglia, Renata Valente Italy
Rethinking the Public Space for Urban Intensive Performance Places .................................. 305
(Re)thinking a critically responsive architecture
Christian Drevet France
Form and Information ..................................................................................................................................................... 321
Konstantinos Grivas Greece
Constellation Pattern:
A Spatial Concept for Describing the ‘Extended Home’ ...................................................................... 343
Anastasia Karandinou UK
Beyond the Binary .............................................................................................................................................................. 355
Ada Kwiatkowska Poland
Architectural Interfaces of Hybrid Humans .................................................................................................. 363
Pau Pedragosa Spain
Immaterial Culture in the Age of Information Technology .............................................................. 371
Yannis Zavoleas Greece
House-as-Machine:
the Influences of Technology during early Modernism ...................................................................... 381
Claus Bech-Danielsen, Anne Beim,
Charlotte Bundgaard, Ulrik Stylsvig Madsen Denmark
Tectonic Thinking - Developing a Critical Strategy
for a Responsive and Adaptive Architecture ................................................................................................ 395
Stylianos Giamarelos Greece
Have we ever been Postmodern?
The Essential Tension within the Metamodern Condition ............................................................... 409
Ana Maria Hariton Romania
Reading-Rewriting-Learning from and Building
on an Exemplary Precedent: Bucharest's Blind Alleys ........................................................ 421
Beril Özmen Mayer Northern Cyprus
Digital Design Thinking and Moving Images Syndrome ................................................................... 439
Charalampos Politakis UK
Skeletal Apotheosis of the Human Body ......................................................................................................... 447
Lucilla Zanolari Bottelli Italy
Wall-e ............................................................................................................................................................................................. 457
(Re)scripting and fabricating a critically responsive architecture
Jan Slyk Poland
Interactive Shadow Play ................................................................................................................................................ 471
Alexandros Kallegias Greece
Machinic Formations / Designing the System ............................................................................................ 485
Philippe Marin, Philippe Liveneau, Yann Blanchi, Angelo Guiga France
Digital Materiality: an Experiment with Smart Concrete ................................................................... 501
Ioanna Symeonidou, Urs Hirschberg Austria
Developing a ‘Next Generation’ Collaborative Design Platform .................................................. 513
Ole Vanggaard, Anders Munck, Ola Wedebrunn Denmark
The Origin of Change ....................................................................................................................................................... 527
Rodanthi Vardouli, Theodora Vardouli USA, Greece
The Digital Liminal:
Reflections on the Computational Transition in Architecture ....................................................... 539
Daniel Comsa Romania
Contemporary Ways of Space Envelopment – Adaptive Architecture? ................................ 549
Lucien Denissen Belgium
Simulation of Comfort and Health ....................................................................................................................... 563
Stella Dourtme UK
Prototypical Matter: Digital Plaster ...................................................................................................................... 577
Maria Mandalaki Greece
Sustainable Design Technology:
Human Senses versus the Measuring of Environmental Conditions ....................................... 597
Lemonia Ragia, Chrisa Tsinaraki Greece
A Framework for DYNamic and Interactive Architecture (DYNIA) ............................................. 609
Teaching a critically responsive architecture
Joanna Crotch UK
Linger, Savour, Touch ....................................................................................................................................................... 619
Giovanna Franco Italy
Archetypes, Symbols, Memory
Which Tectonic in Digital Culture? ........................................................................................................................ 631
Hubert Froyen, Sylvain De Bleeckere, Evelien Verdonck Belgium
IT Tools and Human Voices in Teaching Architecture ........................................................................... 641
Susanna Fülöp Hungary
Holistic Integrated Approach of Architectural Education & Practice
Terminology & Strategy ................................................................................................................................................. 655
Jim Harrison, Cathy Dalton Ireland
Learning to Imagine the Invisible:
Using New Technologies to Enhance User-Friendly Architecture .............................................. 671
Jana Revedin Sweden
Architecture with the People
- Teaching a Critically Responsive Architecture with a Human Aim ......................................... 683
Balázs Balogh Hungary
Changes in Architecture:
Think - Design - Practice - Education .................................................................................................................. 695
Antonis Papamanolis, Katherine Liapi Greece
Thoughts on Digital Architectural Education:
in search of a digital culture of architectural design ............................................................................. 701
Peter Staub Liechtenstein
Imagining the Real ............................................................................................................................................................. 709
Anastasia Tzaka Greece
Digital design pedagogies
A critical case study approach .................................................................................................................................. 721
Alexandros Vazakas Greece
High Technology - High Aesthetics ? .................................................................................................................. 729
Helle Brabrand Denmark
Body-Space Interface
Re-scripting and re-fabricating a responsive architecture ............................................................... 741
Contributors ............................................................................................................................................................................. 749
Initiations
Rethinking the Human
in Technology-Driven Architecture
Over the past ten years the research record of architectural education institutions in
Europe has shifted significantly, from research primarily grounded in the Humanities to
research directed towards, and supported by, Information Technology and its experimentations
in architectural design, materials and construction. This shift, now a distinctive trend
in architectural research, has a direct impact on the entire construct of architectural
knowledge and design skills, as well as on the creation of the profile of the architect
and the priorities for pedagogical strategies in architectural education. The more
IT becomes ubiquitous by being integrated into almost everything people get their
hands on, the more architecture tends to absorb this technological impulse, by becoming adaptive, responsive, transformable, intelligent and customized.
These new conceptions of architecture are accompanied by new terms such as liquid, hybrid, virtual, trans, animated, seamless, interactive, emergent, parametric, algorithmic, machinic and self-generating, thus producing a new architectural culture.
That is a culture in which the terms and conceptions that have nourished architecture
for centuries are replaced by their opposites: stability and solidity replaced by change,
simplicity and clarity replaced by complexity, and space replaced nowadays by (real)
time. In the design domain, emerging techniques and methods seem to have absorbed the bulk of IT, mainly with regard to software applications, which greatly influence
the way architects think, design and visualize their ideas. Meanwhile, the area
of fabrication has been rapidly evolving so that the versatility provided by design
software can now be materialized through advanced manufacturing equipment, previously employed only by the industry. Moreover, advancements in material science
have also been supporting experimentation in that direction. Last but not least, this
new culture has progressively established its ethos in the education of the architect,
detectable in student design work, in the new nature of the design studio (lab), as
well as in the gradual devaluation or even elimination of modules related to the Humanities in architectural curricula and their replacement by modules on
scripting, biology, representation and simulation software.
The paradigm of nature, the development of more powerful, sensitive, interactive and intuitive software, as well as the ability to experiment with electronic assemblies, have facilitated an ever-growing tendency towards responsive architecture. One of
the most significant shifts of contemporary architectural thinking in our fast changing world is a strong inclination towards an innovative experimentation adaptable to
the speed and pace of changes occurring in our mind, soul and body. As a result, the
whole practice is nowadays moving towards responsiveness. Thus, design tools are
used according to user demands and needs, which are now conceived as unstable
and transformable, while fabrication methods develop to respond to design idiosyncrasies, and space is designed to respond directly to changing human behavior and
environmental conditions.
However, voices criticizing this digitalization of architectural thinking are becoming
louder. These are not only the voices of practitioners and educators who steer
clear of avant-garde ideas and experimentations but, more significantly, of those
who have been strongly involved and engaged in the development, implementation
and theorization of contemporary technology-driven architecture from its infancy. The common ground of these critics focuses on three main orientations: the design
process, the nature of the outcome, and the role of the architect. The digitalization of
the design process and its development as an imitation of the biological, morphogenetic process is questioned as to whether it can continue to be considered an act of
creation when it follows a purely mechanistic development, sterilized of the decisive
presence and the creative role of cultural values. The architectural outcome of such a
process is questioned as to whether it adequately represents our contemporary culture
when the dominant characteristic through which it gains its value is its capacity to be
passively adaptive and responsive to preprogrammed external human or environmental stimuli. Finally, what is questioned is whether the architect more as a script editorprogrammer than a thinker-maker working on values to give form to our everyday
life, can safely translate, in parametric terms and the script language, the complexity
of human senses and behaviors. The common denominator of all this questioning is a
broader concern that, by overemphasizing the technological capacity of the available
means, we risk considering the means as objectives and thus lose the human being as
the ultimate end of architectural creation. Is IT the end or a means to an end?
All the above issues are translated into new questions that have nourished research
and experimentation, triggered debate and contemplation, and influenced the practice
and education of the architect. Is it possible to find the human being in IT driven architecture? Is it possible to have an adaptive architecture in which the presence of
the human being will be more influential and decisive? Can the contemporary technological means assure a value-based responsive architecture? Can we have an architectural production, which will not only reflect some of the abilities, constructions
and properties of the alive, but also made to be receptive to the senses, the feelings,
emotions and sensations of the human being which will inhabit it? Can we use advanced information technology to protect architecture from becoming a consumable,
self-complacent object, fascinating for its elementary intelligence, admired for its advanced technical competences, attractive for its formal peculiarity but distant from
those who are invited to appropriate it by investing its spaces and forms with feelings,
aspirations, cultural attitudes, and values emerging from social life?
This volume contains essays whose authors were invited to give answers
to the above questions in the framework of the International Conference entitled “Rethinking the Human in Technology-Driven Architecture” organised in Chania, Greece
by the European Network of Heads of Schools of Architecture with the financial support of the Lifelong Learning, Erasmus, Academic Networks Program, the European
Association for Architectural Education and hosted by the School of Architecture of
the Technical University of Crete. The authors of the contributions are architects, teachers
and researchers in architecture, and their texts were produced after their presentations at the Conference, thereby incorporating the comments, remarks and outcomes of the debates that took place in the context of this event.
The reader of this volume can find a record of the research undertaken in different
parts of Europe on architectural design and the output produced by schools of architecture aiming at advancing responsive and adaptive architecture critically towards a
more sensitive involvement of human values. It also presents cases of architectural
design and fabrication where information technology is amalgamated with a values-based rethinking of architecture. Last but not least, it follows the ways in which the output
produced (or imparted) by research, practice and contemporary contemplation on architecture is (tran)scribed into and recycled in architectural design teaching practices.
The volume is structured in three main parts. The first part contains four essays that introduce, in an inspiring way, significant aspects of the reconsideration of the human
in the way we conceive and understand architecture as creation and practice. Manuel
DeLanda, Edith Ackermann, Antonino Saggio and Kostas Terzidis open with their contributions new directions and avenues for reflection on the main theme of the volume.
The second part contains essays which can be considered coherent introductions
to four different topics. The first topic is research in contemporary responsive architectural design. The second is the theoretical and philosophical consideration of
responsive architecture through contemporary conceptions of the human.
The third topic is the scripting applications that support the design of responsive
architecture. Finally, the fourth topic focuses on the teaching of responsive architecture and the nature and form of the methods and pedagogic approaches that could be
implemented. The Scientific Committee of the Conference1 selected these essays to
represent each one of these topics.
The third part contains the papers that the Scientific Committee selected
to be presented at the Conference, grouped according to the four aforementioned
topics. A broad spectrum of approaches, conceptions and views can be found on the
way contemporary architectural design becomes a subject of research, contemplation,
experimentation, materialisation and education.
The editors expect that this volume will create the conditions for an interesting
academic input on the basis of which a constructive debate can be developed on the
issue of integration, aiming at bridging the most significant but artificial separations
in our educational systems: those between architectural design modules, construction modules and theory modules. The editors also expect that the innovative approaches presented in the volume will constitute a collection of good-practice examples able to inspire
more teachers and to influence changes in our educational approaches.
Maria Voyatzaki, Aristotle University of Thessaloniki, School of Architecture,
ENHSA Coordinator
Constantin Spiridonidis, Aristotle University of Thessaloniki, School of Architecture,
ENHSA Project Leader
Kostis Oungrinis, Technical University of Crete, Faculty of Architecture
Marianthi Liapi, Technical University of Crete, Faculty of Architecture
1 The members of the Scientific Committee were (in alphabetical order): Henriette Bier (Technical
University of Delft), Ava Fatah (UCL, Bartlett, London), Christian Friedrich (Technical University
of Delft), Kostis Oungrinis (Technical University of Crete), Antonino Saggio (Rome Sapienza),
Constantin Spiridonidis (Aristotle University of Thessaloniki), Maria Voyatzaki (Aristotle University
of Thessaloniki).
Inspirations
Manuel DeLanda
New York
USA
Form-Finding
through Simulated Evolution
Algorithms are the soul of software. They are mechanical recipes for the performance
of tasks like sorting or searching, and are indispensable because computers lack the
judgment necessary to use procedures in which every step is not specified unambiguously. Search algorithms, in particular, are highly valued in computer science because many routine operations in personal computers involve looking for and finding
something: a document, an application, a web page, or just free space in a hard disk
to store a file. But more importantly search algorithms matter because many problem-solving techniques can be modeled as a search: a space of possible solutions to a
problem is constructed and a mechanical recipe is created to explore it. If the space of
possible solutions happens to include a single best solution then the process is called
an “optimization”, a term familiar to engineers. But the search space need not be structured in such a simple form so its exploration may demand a more flexible type of algorithm. While computer scientists are not normally drawn to biology for inspiration,
those concerned with the design of search algorithms are. The reason is that biological organisms may be viewed as solutions to problems posed by the environment: by
the climate or topography, by predatory or parasitic species. In other words, adapting
to a particular environment involves finding the appropriate changes (in anatomy, in
behavior) to cope with the challenges that it presents. Although individual organisms
may be said to cope with challenges throughout their lives, evolutionary biologists
are typically more interested in longer-term adaptations, that is, in solutions to environmental problems found by a given species over many generations. In the 1960s
the computer scientist John Holland looked at evolution as a process involving a
search for solutions, and abstracted its basic features from the details of its biological
implementation. Or as he put it, his task was “lifting the reproductive plans from the
specific genetic context...”. 1
The result was a new type of search algorithm, the genetic algorithm, that differed
from older procedures in that the space of solutions was not itself directly explored.
Rather the search was conducted in a space that coded for those solutions. This reflected the fact that in biology we face a double reality, that of the bodily traits of
organisms (the phenotype) and that of a coded procedure to generate those traits
(the genotype). Because the process of production of an organism can be coded into
genes the process can be repeated every generation, a repetition that is crucial to endow the entire species with the ability to find solutions. Another significant difference
is that while other search algorithms may look at one solution at a time, comparing it
to older solutions and adopting it if it is better, evolutionary searches can look simultaneously at many solutions, one for each member of the population. This captures
the insight that, in biology, the repetition of the process that generates organisms
always includes differences, differences that are distributed throughout a population
making each member a slightly different solution. When applied to algorithms this
implies that evolutionary searches are conducted not serially, one solution at a time,
but in parallel as the entire population moves across search space like a cloud. Finally,
while genetic differences are generated by random processes (mutation, sexual recombination) the environment selects only those that increase the degree to which
the solution fits the problem, giving the search process a certain directionality. This
reflects the idea that natural selection sorts out the members of the population into
those that get to leave many copies of themselves and those that do not, capturing in
the process historical information about the adequacy of the solutions.
The concept of using a search process to solve design problems is not new to architects. They can easily come up with examples of procedures that have been used in
the past to find forms using the inherent tendencies of certain materials and structures to perform analogue computations. Search spaces structured by a single optimal point, for example, have been known to mathematicians for centuries and have
been adapted by architects for design purposes. Such optimal points (minima and
maxima) were first studied in the eighteenth century by Leonhard Euler via his famous
calculus of variations. One of the first variational problems to be tackled was the so-called “catenary problem”, which can be characterized by the question: what form will
a chain find if allowed to hang freely while both its ends are constrained? Euler framed
the problem in terms of the potential energy of the gravitational forces acting on the
chain. He realized, and proved mathematically, that of all the geometrically possible
forms the one realized by actual chains is the one that minimizes this potential, that
is, that the chain will be at equilibrium when its center of gravity occupies the lowest position.2 In a sense, the hanging chain performs an analogue computation to find
this form among all the other possible ones. Among architects, it was Antonio
Gaudí who, at the turn of the twentieth century, first realized the potential of hanging chains or ropes. He used them to find the form of the arches in the facade of his
Sagrada Familia church. But chain models can be used for more complex design problems than arches or vaults:
“Chain networks showing significantly more complex forms than freely suspended
individual chains can be constructed from small pieces of chain or short bars fastened together flexibly. Freely suspended networks of this kind open up the gigantic formal world of the “heavy tents”, as the so-called gravity suspended roofs can
also be named. They can be seen in the temple and pagoda roofs of the Far East,
where they were originally made as flexible bamboo lattices. Today roofs of this
kind are made of rope nets with a wooden or lightweight concrete roof” .3
The authors of this quote are Frei Otto and Bodo Rasch of the Institute for Lightweight
Structures in Stuttgart. Frei Otto is perhaps best known for his use of soap film as a
membrane-forming liquid capable of finding minimal forms on its own. Form-finding for tent designs can also be performed with thin rubber films, knitted or woven
fabrics, and thread or wire nets, but soap film is perhaps a better illustration of the
technique. As is well known, soap film can spontaneously find the form possessing a
minimum of surface tension. Like the inverted chain, the space of possibilities associated with soap film is structured by a single optimum, a topological point that attracts
the population of soapy molecules to a specific form. Without any constraints (such as
those exerted by a frame made of wire or rope) the form that emerges is a sphere or
bubble. Adding constraints can break the symmetry of this sphere and yield a wide
variety of other minimal surfaces, such as the hyperbolic paraboloids (saddle-shaped
surfaces) that Frei Otto needed for the roof of the German Pavilion at the Expo ‘67 in
Montreal. That roof was the first of a series in which Otto deliberately used soap film
as a form-finding instrument. Despite this exemplary achievement, however, some of
Frei Otto's collaborators realized that performing form-finding procedures on search
spaces structured by a single global optimum was too constraining. Peter von Buelow, for example, argued this point by contrasting the task of engineering analysis
with that of architectural design. While in the former one normally tries to find a single
best solution, and there is the expectation that different analysts will reach basically
the same solution, in design there are always a variety of ways of solving a problem,
and different designers will typically arrive at their own solutions. In the latter case,
the search space is structured by multiple local optima, a condition that favors the use
of simulated evolution to perform form-finding. As von Buelow wrote:
“[Evolutionary search] goes beyond a set procedure of analysis to aid the designer
in exploring form-finding problems in a creative way. Unlike analysis tools it is not
intended to yield one correct solution, but rather to supply the designer with stimulating, plausible directions to consider. [Evolutionary search] is intended to be used
in the early, form-finding stages of a design problem. As such, it deliberately avoids
leading the designer to a single ‘best’ solution, but instead follows the designer’s
lead in exploring the design space.” 4
Let’s describe in some detail a typical implementation of evolutionary search. A simulation of evolution consists of the following components: a strategy to code a problem into a simulated chromosome (a way of mapping genotype into phenotype); a
procedure to discriminate good from bad solutions to that problem (a fitness function); a procedure to translate this assessment into reproductive success (a selection
function); and a set of operators to produce variation in each generation (at the very
least, mutation and sexual recombination operators). Some of these components involve human creativity while others are used in an entirely mechanical way by the
computer. Coding the problem to be solved and devising a way of correctly estimating the fitness of evolved solutions can be highly challenging tasks demanding imaginative human intervention. But once the creative decisions involved in these preparatory steps have been made the rest of the components can take care of themselves: a
population of random chromosomes, most of which start with very low fitness, is first
created; the few members of the original population that happen to be a little better
than the rest are then selected for reproduction; pairs of chromosomes are mixed in
a way that imitates sexual recombination, or single chromosomes mutated asexually,
producing a new generation; the fitness of the offspring is evaluated, and the steps
are mechanically repeated until a locally optimal solution is found.
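To make the mechanical portion of this cycle concrete, the following sketch outlines such a loop in Python. It is a minimal illustration rather than Holland's implementation: the chromosome length, population size, mutation rate and the placeholder fitness function are all assumptions chosen for brevity, and a real design problem would replace the fitness function with a decoding and evaluation of actual design variables.

```python
import random

CHROMOSOME_LENGTH = 16   # assumed fixed-length bit string
POPULATION_SIZE = 40     # assumed population size
MUTATION_RATE = 0.02
GENERATIONS = 100

def fitness(chromosome):
    """Placeholder fitness: counts ones. A real problem would decode the
    bit string into design variables and score the resulting solution."""
    return sum(chromosome)

def select_pair(population, scores):
    """Fitness-proportionate selection of two parents."""
    return random.choices(population, weights=scores, k=2)

def crossover(a, b):
    """Single-point sexual recombination of two parent chromosomes."""
    point = random.randrange(1, CHROMOSOME_LENGTH)
    return a[:point] + b[point:]

def mutate(chromosome):
    """Flip each bit with a small probability."""
    return [1 - g if random.random() < MUTATION_RATE else g for g in chromosome]

# Start from a random population of mostly low-fitness chromosomes.
population = [[random.randint(0, 1) for _ in range(CHROMOSOME_LENGTH)]
              for _ in range(POPULATION_SIZE)]

for generation in range(GENERATIONS):
    scores = [fitness(c) for c in population]
    # Breed a new generation from the fitter members of the old one.
    population = [mutate(crossover(*select_pair(population, scores)))
                  for _ in range(POPULATION_SIZE)]

best = max(population, key=fitness)
print(best, fitness(best))
```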
The creative preparatory steps - inventing a mapping between a coded problem and
a solution, and implementing a fitness function - are the points at which an artist or
designer can make the greatest contribution in this area, so we will need to describe
these two steps in more detail. The task of coding the design problem depends crucially on the nature of the simulated chromosome. In the case of genetic algorithms,
for example, strings of symbols play the role of chromosomes. This linear structure
gives them a certain similarity with their real counterparts, except that unlike real
chromosomes the length of the strings is kept fixed, and the alphabet providing the
symbols has only two entries (“one” and “zero”) instead of four (the four nucleotides
used in DNA). In other words, the chromosomes in genetic algorithms are bit strings
whose length remains constant throughout the simulation, and in which the variables
defining a given problem must be represented by ones and zeroes. If the variables
happen to be switches that can be either on or off, the coding is trivially simple: each
bit in the string represents a gene, and each gene codes for a switch. But most problems do not have this simple form. The variables may have, for example, numerical values, ranging from a minimum value to a maximum one. In this case, we must break
down the range of continuous values of each variable into a discrete series. If this series contains, say, sixteen different values, then a string four bits long will be enough,
the gene “0000” representing the minimum value and “1111” representing the maximum one. The fitness function that evaluates solutions on each generation can be
used to handle values that are out of the range, that is, to enforce the constraint that
values must belong to the allowable range by penalizing strings that violate it.5
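As a small illustration of this coding step, the sketch below (an assumption-laden example, not taken from the text) maps a four-bit gene onto sixteen evenly spaced values within a chosen range and lets the fitness function penalize decoded values that fall outside an allowable sub-range.

```python
def decode_gene(bits, low, high):
    """Map a 4-bit gene onto one of 16 evenly spaced values in [low, high]:
    '0000' -> low, '1111' -> high."""
    index = int(bits, 2)                       # 0 .. 15
    step = (high - low) / (2 ** len(bits) - 1)
    return low + index * step

def penalised_fitness(value, allowed_min, allowed_max, raw_score):
    """Enforce a range constraint by penalising out-of-range values,
    as the text suggests the fitness function can do."""
    if allowed_min <= value <= allowed_max:
        return raw_score
    return raw_score - 1000.0                  # assumed penalty weight

# Example: a gene coding some design variable between 0 and 30 (arbitrary units).
value = decode_gene("1010", low=0.0, high=30.0)   # -> 20.0
print(value, penalised_fitness(value, 5.0, 25.0, raw_score=1.0))
```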
The standard example of the kind of problem that can be solved by genetic algorithms is the control of a pipeline for natural gas. A pipeline must geographically
link the point of supply of gas to the point of delivery using a series of compressors
linked by pipes. The problem is to determine the relation between the suction pressure of each compressor and its discharge pressure (the pressure gradient between its
input and output) in such a way as to minimize the overall electrical power consumed.
Coding this problem into a form that a genetic algorithm can use involves two steps.
First, the gradient for each compressor must be given a binary representation (a bit
string long enough to give a series of numerical values) and several of these bit strings
must be concatenated into a larger one to capture the whole pipeline. Second, a fitness function must be created to evaluate the power consumption of each combination of values for different compressors, as well as to enforce global constraints, such
as the minimum or maximum pressure allowed in the pipeline. Genetic algorithms
have been shown to search the space of possibilities defined by problems like these
in a highly efficient way.6 There are different ways to adapt this approach to the task
of form-finding. The simplest one would be to define the space of possible forms in a
parametric way, so that it matches exactly the template offered by the pipeline example. Defining significant parameters that can be varied independently is not a trivial
task: a good parameter should not be a single variable, such as the height or width
of a particular design component, but a relation between different properties, at the
very least a ratio of two carefully picked variables. Another possibility, explored by the
architect John Frazer in 1971, is to adopt a modular approach to design. In one implementation, for example, Frazer created two modules (two folded plate components)
that could be oriented in 18 different ways relative to each other. Then he devised
an arbitrary code to match binary numbers to each of the modules and their transformations. Creativity enters here in the choice of pre-designed modules (they must
have a great combinatorial productivity) as well as in the choice of transformations. In
Frazer’s case the latter were simple rotations, but more complex transformations can
be used as long as they are adapted to the combinatorial capacities of the modules.7
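A hedged sketch of what such a modular code might look like is given below: a chromosome is read in fixed-width chunks, each chunk naming one of two modules and one of eighteen relative orientations. The chunk width, the module names and the way surplus codes are folded back into the eighteen orientations are hypothetical; Frazer's actual code was arbitrary and is not reproduced here.

```python
# Hypothetical decoding of a Frazer-style modular chromosome.
# Each 6-bit chunk holds: 1 bit for the module type, 5 bits for the orientation.
MODULES = ("plate_A", "plate_B")   # two folded-plate components (assumed names)
ORIENTATIONS = 18                  # relative orientations, per the text

def decode_modular(chromosome):
    """Turn a bit string into a list of (module, orientation) placement steps."""
    steps = []
    for i in range(0, len(chromosome) - 5, 6):
        chunk = chromosome[i:i + 6]
        module = MODULES[int(chunk[0])]
        orientation = int(chunk[1:], 2) % ORIENTATIONS   # fold 0..31 back into 0..17
        steps.append((module, orientation))
    return steps

print(decode_modular("010011" "100001" "001110"))
# -> [('plate_A', 1), ('plate_B', 1), ('plate_A', 14)]  (illustrative only)
```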
Frazer realized early on that the way in which one represents the design problem in
order to be able to code it into a bit string, what he calls the “generic representation”, is
a key step in the process since it implicitly defines the space that will be searched. As
he writes:
“In step one, the generic representation largely determines the range of possible
outcomes. A tight representation based on previously near-optimal solutions may
be fine for some engineering problems but might seriously inhibit the range of
more creative solutions in another domain. For example, parametrization is a valuable technique for exploring variations on a well-tried and tested theme, but it is
limited to types of variation that were anticipated when the parametrization was
established. [On the other hand] a very open representation is often difficult to imagine and can easily generate a vast search space.” 8
Given the importance of the generic representation of a design problem, and more
generally, of an adequate mapping between genotype and phenotype, architects
must consider all existing alternatives. The bit strings used by genetic algorithms not
only force the designer to find a numerical way of coding the design problem, but
the fact that the strings are of a fixed length implies that the complexity of a problem
must be specified in advance. This limits the range of problems that can be coded and
solved. Although these limitations can be mitigated by allowing the string to vary in
length (as in so-called “messy” genetic algorithms) other chromosome designs can afford more flexibility. In genetic programming, for example, chromosomes are not static strings but dynamic computer programs capable not only of varying in length but
also of breaking down a problem into a hierarchy of sub-problems, and then of literally constructing the solution to the design problem by following the evolved building procedure. The idea of using a procedural generic representation of a problem, instead
of an arbitrary numerical code for modules and transformations, may seem obvious to
any architect who has built a 3D model using a script (in, say, Maya’s MEL scripting language). But most computer languages do not allow the creation of programs in which
random substitutions of instructions can be made while the overall program remains
functional. In other words, the functionality of most scripts or programs is destroyed
after undergoing a few random mutations or sexual recombinations. There are some
languages, however, that do possess the necessary resiliency: they use mathematical functions instead of step-by-step recipes, and generate control hierarchies by recursion, that is, by defining higher-level functions in terms of lower-level ones. With
this kind of computer language the range of design problems that can be coded into
simulated chromosomes can be increased dramatically.
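By way of illustration, the sketch below represents such a procedural chromosome as a nested tree of functions and evaluates it recursively; mutation swaps a function node for another of the same arity, so the program remains runnable after the change. The function set and the example tree are invented for this sketch and are not drawn from Koza's or Frazer's systems.

```python
import math
import random

# A chromosome is a nested tuple: (function_name, child, child, ...);
# leaves are plain numbers or the variable "x".
FUNCTIONS = {"add": lambda a, b: a + b,
             "mul": lambda a, b: a * b,
             "sin": lambda a: math.sin(a)}
ARITY = {"add": 2, "mul": 2, "sin": 1}

def evaluate(node, x):
    """Recursively evaluate a program tree for a given input value."""
    if node == "x":
        return x
    if isinstance(node, (int, float)):
        return node
    name, *children = node
    return FUNCTIONS[name](*(evaluate(c, x) for c in children))

def mutate(node):
    """Replace the root function with another of the same arity,
    leaving the subtrees intact so the program remains valid."""
    if not isinstance(node, tuple):
        return node
    name, *children = node
    candidates = [f for f, a in ARITY.items() if a == ARITY[name]]
    return (random.choice(candidates), *children)

tree = ("add", ("mul", "x", 2.0), ("sin", "x"))   # encodes 2x + sin(x)
print(evaluate(tree, 1.5), evaluate(mutate(tree), 1.5))
```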
In genetic programming the creative preparatory steps include selecting the right
kind of elementary functions out of which more complex ones can be built by recursive composition, as well as the constants and variables that can act as inputs to those
functions. This elementary repertoire must fit the type of problem to be solved: if the
problem is a logical one, the elementary functions should be operators like “And”
or “Not”, while the variables should be True and False values; if it is arithmetical, the
operators should be like “Add” or “Multiply”, while the variables should be numbers
or matrices; if it is a problem of robotic motion, it must contain functions like “Move
Left” or “Move Right”, and variables specifying distances or angles; and finally, if the
problem is creating a 3D model of a building then the functions must include extrusion, surface of revolution, bending, and twisting, while the variables must be polygons or NURBS. In other words, the basic repertoire must be matched to the details of
the problem’s domain. A chromosome in genetic programming is not a linear string
but a branching graph, a “tree” in which each branching point is labeled with a function while the “leaves” are labeled with variables and constants. These tree-like graphs
capture the hierarchical relations between elementary and composite functions, and
can be manipulated by the same genetic operators (mutation, sexual recombination)
that are used in genetic algorithms. Much as the gas pipeline problem is an exemplar
of the use of genetic algorithms, the design of analog electrical circuits (filters, amplifiers, sensors) has been the area in which genetic programming has demonstrated its
full potential. Unlike digital circuits in which the design task can be automated, analog
circuits are basically handcrafted. To make the problem even more “human-like”, John
Koza, the creator of genetic programming, chose as his targets designs that had already been patented. The reason is that for a patent to be accepted it must typically
contain significant differences with respect to existing designs and these differences
must be “creative”, that is, not logically deducible from a previously patented invention. 8 Using this criterion, the designs produced by genetic programming can be classified as true inventions, rather than mere optimizations: in several cases evolutionary
search has rediscovered circuit designs that had been previously patented; in other
cases it has matched the functionality of patented designs by using novel means; and
in at least one case it has produced an entirely new patentable design.9
The repertoire of elementary functions that allow the circuit design problem to be
coded include functions that insert a new component (a resistor, a capacitor, an inductor); functions that alter the connectivity of these components (the topology of
the circuit); and functions that set the intensity (or sizing) of a component, that is, the
amount of resistance of a resistor, the capacitance of a capacitor and so on. Fitness
evaluation is more complex than in genetic algorithms because the evolved programs
must be run to construct the solution. In the case of analog circuits, once the topology and the sizing have been set for a given generation, a circuit must be built (as a
simulation) and tested. To do this, a kind of “embryo” circuit (an electrical substructure
with modifiable wires and components) is placed into a larger circuit in which no component is modifiable. Only the embryo evolves, but its placement into a larger functional setting allows it to be easily checked for viability.10 Koza decided to use existing
software to check for the functionality of the circuits, a strategy that could also be followed by designers of architectonic structures, since these need not only be assessed
for aesthetic fitness but also be evaluated as load-bearing structures. Like Koza, users
of genetic programming in architecture could have the program build 3D models in a
format that is already used by existing structural engineering software (such as finite
element analysis) and use the latter as part of the process of fitness evaluation. And
like Koza, only a certain part of a building needs to be evolved (the embryo), the rest
being a non-evolvable template into which the embryo can be placed to be checked
for structural integrity.
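A sketch of how this division of labour might look in code is given below: an evolved "embryo" fragment is combined with a fixed template and handed to a checking routine whose verdict feeds the fitness score. The checking routine here is a self-contained stand-in with fabricated numbers; in practice it would export the model and call real structural engineering software.

```python
def run_structural_check(model):
    """Stand-in for an external analysis package (e.g. a finite element run).
    In practice this would export the model and call real engineering software;
    here it returns fabricated numbers so the sketch is self-contained."""
    load = sum(model["embryo"]) + model["template_mass"]
    return {"stable": load < 50.0, "max_displacement": load / 10.0}

def evaluate_structure(embryo_genome, template_mass=20.0):
    """Fitness of an evolved 'embryo' fragment placed into a fixed template:
    non-viable structures score zero, viable ones are rewarded for stiffness."""
    model = {"embryo": embryo_genome, "template_mass": template_mass}
    report = run_structural_check(model)
    if not report["stable"]:
        return 0.0
    return 1.0 / (1.0 + report["max_displacement"])   # arbitrary scoring for the sketch

print(evaluate_structure([3.0, 4.0, 2.5]))
```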
It should be clear from these remarks that fitness evaluation is another aspect of simulated evolution that demands a creative intervention on the part of the designer.
The assessment of aesthetic fitness, in particular, can be especially difficult. One approach here is to let the designer be the fitness function: he or she is presented with
a population of solutions on every generation, perhaps one that has already been
checked for structural integrity, to be ranked by their aesthetic appeal. This approach
has the advantage that the designer has more control over the direction of the search,
steering the evolutionary process in promising directions. Peter von Buelow’s use of
simulated evolution for form-finding uses this strategy, not only allowing the user to
rank proposals as a way of measuring fitness, but also letting him or her add new variants to the population to redirect the search away from evolutionary dead ends.11 Replacing a fitness function with a human, however, has the disadvantage of making the
process painfully slow, and of limiting the evaluation of every generation to a small
subset of the entire population, a subset small enough to be displayed on a computer
screen and be surveyable at a glance. Given that aesthetic criteria are very hard to formalize it would seem that using the “eye of the beholder” is inevitable when evaluating fitness in terms of fuzzy concepts like “elegance” or “beauty”. But there is another
alternative: not a mechanical assessment of aesthetic fitness but a means to store the
taste or stylistic preferences of the designer so that they can be applied automatically.
This can be done by the use of another type of simulation called neural nets. Like Koza’s use of external software to assess the functionality of electrical circuits, this would
extend the meaning of the term “fitness function” so that it encompasses not only a
fixed criterion coded mathematically but any complex set of procedures, using any existing software, that can be reliably used to assign fitness values.
Simply put, a neural net is a learning device that maps patterns into patterns, without
any intervening representations.12 One pattern may be, for example, a sensory pattern
produced by features of the environment (captured via a video camera), while the
other may be a motor pattern, that is, a sequence of actions produced as a response
to the sensory stimulation. Learning consists in correctly matching the motor activity
to the sensory information, such as fleeing in the presence of predators. Both sensory
and motor patterns are implemented as activity patterns in simple computing units
arranged in layers (input and output layers in the simplest designs) linked to each
other by connections that can vary in strength. Unlike other implementations of machine learning a neural net is not programmed but trained. In the simplest case, the
training consists in repeatedly presenting a pattern to the input layer, activating some
units but not others, while fixing a desired activation pattern in the output layer. The
reason the output pattern is fixed in advance is that, as with animal training, the human trainer has a desired behavior that he or she is trying to elicit from the animal.
Once both activation patterns are set, the computing units in each layer can begin to
interact with each other through their connections: units that are simultaneously active will strengthen their connection to each other, and vice-versa, simultaneous inactivity will weaken a link. Thus, during training changes in the connection strengths
store information about the interactions. After many presentations of the input and
output patterns the connection strengths will converge to the combination needed
to match the activation patterns to each other. And after the training is over and the
fixed output pattern is removed the neural net will be able to reproduce it whenever
the right input pattern is present. In a sense, the neural net learns to recognize the
sensory stimulation, a recognition signaled by the production of the correct motor
response. And more importantly, the neural net can not only recognize patterns that
were included in the training set but also patterns that are similar to those.
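The sketch below is a minimal rendering of the kind of two-layer net and co-activation rule described above. The unit counts, learning rates and threshold are arbitrary assumptions, and a practical aesthetic-scoring net would be far larger and trained with standard machine-learning tools.

```python
INPUT_UNITS, OUTPUT_UNITS = 6, 3
LEARNING_RATE = 0.1

# Connection strengths between every input unit and every output unit.
weights = [[0.0] * OUTPUT_UNITS for _ in range(INPUT_UNITS)]

def train(input_pattern, desired_output, epochs=20):
    """Co-activation training as described in the text: simultaneously active
    units strengthen their connection, jointly inactive units weaken it slightly."""
    for _ in range(epochs):
        for i, x in enumerate(input_pattern):
            for j, y in enumerate(desired_output):
                if x and y:
                    weights[i][j] += LEARNING_RATE
                elif not x and not y:
                    weights[i][j] -= LEARNING_RATE * 0.5

def recall(input_pattern):
    """After training, reproduce an output activation from an input alone."""
    return [1 if sum(x * weights[i][j] for i, x in enumerate(input_pattern)) > 0.5 else 0
            for j in range(OUTPUT_UNITS)]

train([1, 0, 1, 1, 0, 0], [0, 1, 1])
print(recall([1, 0, 1, 1, 0, 0]))   # reproduces the trained output pattern [0, 1, 1]
```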
To be used as an aesthetic fitness function, a neural net needs to be trained with a set
of examples corresponding either to the designer’s taste or the stylistic preferences
associated with a particular project. A set of photographs or 3D renderings of the appropriate architectonic structures would comprise the training set, presented to the
input layer via a video camera, in the case of photographs, or in some coded form in
the case of 3D renderings. The output layer, in this case, would not have to perform
any motor response but only produce a pattern of activation representing a numerical value: a number that ranks different inputs by their aesthetic proximity to the designer’s taste or stylistic preferences. During training these numerical values would be
given explicitly by the designer (making sure that they do indeed capture his or her
aesthetic values) but after training they would be produced automatically to be used
as part of the fitness score. Using neural nets to replace the “eye of the beholder” has
advantages and disadvantages. It can greatly speed up the process since there is no
need for the designer to sit at the computer following a simulation, and it can evaluate as many evolving entities as needed, without the restriction of having to present
these to the user on the screen. On the other hand, it can constrain the search space
to those areas containing possible design solutions that are already pleasing to the
user, preventing the simulation from finding surprising forms, that is, forms that the
designer did not know he or she liked.
Another component of fitness evaluation that is important to architects concerns the kinds of
activity patterns displayed by the human users of a given architectural space. Certain
circulation patterns, for example, may be desired, with rapid and unobstructed circulation in some areas, and gatherings of small groups in other, more intimate zones. To
check whether an evolved design does indeed facilitate such circulation patterns we
need to include simulated agents that are spatially situated with respect to one another, and that can interact with the simulated walls, doors, hallways, stairs, and other
components of the 3D model of a building. Space can be structured through the use
of cellular automata, populations of simple computing machines placed on a tiled
plane (or volume) in which spatial relations like proximity are easily captured by the
sharing of edges or vertices in the tile. Traditional cellular automata, like the famous
Game of Life, use the simplest type of computing machine: finite state automata capable of carrying out computations without any memory. They can, for example, perform
multiplications as long as they do not have to carry a number. But the restriction to
memoryless automata can be removed, allowing each automaton to perform more
complex tasks. When this is done the result is called a multi-agent system, a hybrid
of cellular automata and object-oriented programming.13 With the right set of rules,
such agents can avoid collisions and plan motion paths in a given space that take into
account the opportunities and risks afforded by the physical layout of a space, as well
as the movements of neighboring agents. A small population of such agents can be
unleashed into every proposed design solution in a given generation, and a simple
piece of software can be added to check for the emergence of the desired circulation
patterns, with a score given to each candidate relative to its distance from the ideal
pattern. This score can then be added to the one produced by the neural net to determine the overall fitness value.
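As a closing sketch, the snippet below shows how the two partial scores mentioned here, an aesthetic score from a trained net and a circulation score from an agent walk-through, might be combined into a single fitness value. Both score functions are fabricated stand-ins and the weights are arbitrary; they are placeholders for whatever evaluation software a designer actually couples to the search.

```python
def aesthetic_score(design):
    """Stand-in for a trained neural net ranking a rendering of the design
    against the designer's stored preferences (0.0 .. 1.0)."""
    return min(1.0, 0.1 * len(design["rooms"]))       # fabricated heuristic for the sketch

def circulation_score(design, ideal_flow=0.8):
    """Stand-in for an agent-based walk-through: how close the simulated
    circulation pattern comes to the desired one (0.0 .. 1.0)."""
    simulated_flow = design["corridor_width"] / (1.0 + design["corridor_width"])
    return 1.0 - abs(ideal_flow - simulated_flow)

def overall_fitness(design, w_aesthetic=0.6, w_circulation=0.4):
    """Weighted sum of the two partial evaluations; weights are arbitrary."""
    return (w_aesthetic * aesthetic_score(design)
            + w_circulation * circulation_score(design))

candidate = {"rooms": ["hall", "studio", "court"], "corridor_width": 3.0}
print(overall_fitness(candidate))
```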
Thus, just as devising the right mapping between genotype and phenotype, between
coded design problems and their solutions, involves the creativity of the designer, so
implementing a good fitness function demands the imaginative coupling of multiple
simulation genres. Neither task can be accomplished by software engineers designing
general products for general audiences, since it is only the specific artist or designer
who has enough knowledge of his or her field to make the right decisions about how
to code a problem, how to unfold the possible solutions embryologically, and how to
evaluate their adequacy. There is plenty of room for individual creativity in the use of
evolutionary search as a form-finding procedure.
References
1 John Holland. Adaptation in Natural and Artificial Systems. (Cambridge: MIT Press, 1992). p. 18.
2 Stephen P. Timoshenko. History of Strength of Materials. (New York: Dover, 1983). p. 31.
3 Frei Otto, Bodo Rasch. Finding Form. (Fellbach: Axel Menges, 1995). p. 62.
4 Peter von Buelow. Using Evolutionary Algorithms to Aid Designers of Architectural Structures.
In Creative Evolutionary Systems. Edited by Peter J. Bentley and David W. Corne. (San Diego:
Academic Press, 2002). p. 317.
5 David E. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning. (Reading:
Addison-Wesley, 1989). pp. 82-85.
6 David E. Goldberg. Ibid. pp. 125-130.
7 John Frazer. Creative Design and the Generative Evolutionary Paradigm. In Creative Evolutionary Systems. Op. Cit. pp. 260-262.
8 John Frazer. Ibid. p. 257.
9 John R. Koza, Forrest H. Bennett, David Andre, Martin A. Keane. Genetic Programming: Biologically Inspired Computation that Exhibits Creativity in Producing Human-Competitive Results.
In Creative Evolutionary Systems. Op. Cit. p. 294.
10 John R. Koza. Ibid. p. 280.
11 John R. Koza. Ibid. p. 284.
12 William Bechtel and Adele Abrahamsen. Connectionism and the Mind. An Introduction to Parallel
Distributed Processing in Networks. (Cambridge: Basil Blackwell, 1991). p. 106-107.
13 Joshua M. Epstein and Robert Axtell. Growing Artificial Societies. (Cambridge: M.I.T. Press, 1996).
p. 179.
Edith K. Ackermann
MIT
USA
Emergent Mindsets
in the Digital Age:
What Places for People on the go?
New media ecology
/ New genres of engagement
Where we are—the places we occupy, however briefly—has everything to do
with who we are (E.S. Casey, Getting Back into Place, 1993, p. xiii)
Many new tools and mediations are available to today's children that we couldn't dream of when we were growing up. At the same time, the millennium generation is also facing many new challenges, which call for creative solutions. Today's youngsters are growing up in a world increasingly shielded from nature, with ever busier work and entertainment schedules, longer commutes, disappearing third places, reorganizing neighborhoods, and recomposed families. And yet the children are extraordinarily resourceful. They invent their own ways of navigating rough seas and of seizing opportunities. Much can be learned from their genres of engagement.
While wary of claims that exacerbate the divide between so-called "digital natives" (people born after 1980) and what Marc Prensky refers to as "digital immigrants" (people born and raised in the post-Gutenberg / pre-digital era) (Prensky, 2005), it is fair to say that we are witnessing a major cultural shift, or epistemological mutation, the symptoms of which are only magnified in today's youngsters' infatuations with all things digital1.
This paper addresses six areas of change that, in our view, inform how today’s
youngsters play and learn, and, more generally, how they see themselves, relate to
others, dwell in place, and treat things. Together, these areas form a framework to rethink some of our own assumptions on what it means to be literate, knowledgeable,
and creative, and what it takes to be so.
Looking into the "digital natives'" genres of engagement, and how they mediate their experience through tool-use, provides a unique window to understanding their needs, and appreciating their contributions. Our purpose is to open new venues for designers and educators to cater to the natives' strengths while, at the same time, providing for what they may be missing out on, if left on their own.
Of particular interest for the scope of this publication are the questions: How do
the “natives” find their ways into the media-saturated jungle in which they are born?
What can be done to restore a viable balance between people’s increasingly hurried
lifestyles and nomadic tendencies (desire to go places, move between worlds: physical,
virtual, digital) and an equally strong desire to feel grounded and centered? Why do "being in place" and "living in our bodies" have everything to do with who we are?
The Importance of Place in People’s Lives
Every child needs a place that they can call their own.
As philosopher Edward Casey puts it, “places are what humans make of space and
time”, and it is that making, which is of present interest (Casey, 1993, p. xiii). Places are
the homes in which we live, the distance we keep
from one another, and the squares we traverse,
or play in. They are the offices (or classrooms) in
which we work, the vehicles we travel in (car, metro, mobile home), and all the in-between, sometimes referred to as “third places” (the mall, the
café, the hotel, all the way to the schoolyard, the
park, and the gym) (Oldenburg, 1989).
In his studies on "proxemics", anthropologist Edward T. Hall has provided countless
examples of how human behavior is regulated by the hidden dimension of space
(1969) and the silent language of time (1959), and how culture itself mediates our perceptions of, and actions in, space. People need places to dwell in, depart from, and
revisit, and places, in turn, hold their own unspoken rules on how “they like to be inhabited”. They whisper to us some of the paths worth following and how to find our
ways. And it can take a while before a young child grasps the clues that set adults' expectations in different settings. Young kids speak loudly at church, wave at their parents
during a school performance, run around in restaurants, and can stare at unfamiliar
faces until they blush. Likewise, a newcomer to a culture often unknowingly infringes
upon the tacit rules, and habits, that prevail in given places, or habitats (strangers may
speak too loud in a café, sit too close to another person on a bench, or leave no time
for others to come forth).
Early on, children develop their own views on how space is to be arranged, inhabited, and shared, and how time is to be spent (often different from adults), and even
very young infants are alert to surrounding moods and tensions. They play a proactive role in seeking or avoiding contact to make themselves comfortable. Born with a
knack to keep incoming stimuli within a viable range, human infants are experts in the
art of distancing: They open up or shut down, come close or step back, and reposition
themselves to optimize exchanges. As the saying goes: when a child is interested in a
hammer the whole world looks like a nail. Children are also master navigators and explorers. They feel their way through things, stake their territory, and wander about to see
what’s on the other side of the fence, or beyond the surface of things. The children also
seek corners to rest, grounds to play, stages to perform, and safe places to return to.
To conclude, at once man-made (or, in the case of nature, domesticated) and lived
in, places constitute the grounds on which we stand, the springboards from where we
leap, and the destinations we seek to reach2. They are the culturally mediated spatiotemporal arrangements, designed (or evolved over time) for specific purposes, and
not always used as intended.
Shared Spaces, Mediated Experience – People, Places, Props
It takes a whole village to raise a child.
The contribution of thinkers like Lev Vygotsky (and others in the socio-cultural tradition) is to remind us that it takes a whole village to raise a child. And what is true of children is also true of adults. In other words, although every person in the world needs a
place they can call their own, no person would survive, let alone thrive, without being
a part of a wider community. Conversely, no community would live for long without
the active participation of its members (Vygotsky, 1962, 1978). To be is to belong and
be loved!
According to Vygotsky, children (or any other learners for that matter) can operate
at one level if left on their own (which he refers to as their ‘level of actual development’)
but at a higher level if ‘scaffolded’ by caring and knowledgeable adults, or experienced
peers3. Vygotsky's "zone of proximal development" (ZPD) constitutes that sweet spot in-between, where the child feels challenged but can succeed with the appropriate support, or guidance. Human guidance, or relational support, in the socio-cultural tradition, refers to the mutual enrichment that comes from inhabiting a place together, and
communicating with others through language (and other mediational means). Technological mediation, on the other hand, refers to the tools, artifacts, and objects (mobile
or ubiquitous, physical or digital) used by the inhabitants of a place to make their lives
more livable, enjoyable, and convenient, both for themselves and for others.
Communal places (in the sense of villages) are at once the contexts, or ambient
conditions, in which we operate, the depositories of what we (and others) have left
behind (those who were there before) and the interface between who we are, when
and where we are, and where we want to be, in relation to others. They become ours
to the extent that we feel the presence of trusted others “standing behind” us, and
available whenever needed.
In sum, besides people and place, humans also rely on tools to mediate their experience, and if given new tools and media, they won't just accomplish new tasks, but they will begin to view the world in new ways. Bruno Latour and John Law have shown
how the social dimension emerges from the interactions between both human and
non-human actors, and is maintained through associations between things. Said otherwise, people are who they are because they belong to a patterned network of heterogeneous materials. In Law’s words: “If you took away my computer, my colleagues,
my office, my books, my desk, my telephone I wouldn't be a sociologist writing papers, delivering lectures, and producing 'knowledge'" (Law, 1992, p. 4).
What may be different in this day and age, as compared to previous generations, is that most of us belong to more than one community or "village" at once, and that no one seems to stick to any one in particular for very long. Instead, we live our lives in-between, and we move across realms: physical, virtual, digital. We do so at an ever faster
pace and, more often than not, we go places without even moving our physical bodies
(Abbas, 2011). These new forms of mobility, and the sense of “dematerialization” that
comes with living our lives on the screen (Turkle, 2011) call at once for stronger anchors and more flexible ties, for safe harbors and new routes.
Who are the Natives? What’s to be Learned
Today’s kids don’t always play, learn, or mingle the ways we expect them to.
What if they did so in their own clever ways that we don’t always see?
Based on recent research findings and our own work on children’s uses of digital technologies, we have identified six areas of change where there seems to be more going
on than the usual generational gap. Each constitutes a dimension that, in conjunction
with others, informs how the natives play and learn, and, more generally, how they
see themselves, relate to others, treat things, and use place. Each captures a core generational trait that runs across findings (Ito et al., 2010; MacArthur ILLM, 2009; Jenkins,
2010). Together, these traits provide a framework to understand youngsters’ needs
and aspirations, appreciate their contributions, and rethink some of our own assumptions about their infatuations. The dimensions are:
Plural identities / Fluid selves
New ways of being - Shifting boundaries between ME / NOT ME
Identity refers to distinct personal traits that remain persistent, and recognizable, over
time and across contexts. More than in previous generations, today’s children seem
to exist and evolve in multiple realms: physical, virtual, and digital. Their sense of self
is at once more fluid and more distributed. In their play, the youngsters may take on
different personae, or masks, which they then incorporate as a host of voices-within.
While putting on a mask and moving in and out of character are not new (carnival, bal
masqué, role play), digital environments have this particularity that they let you put on
several hats at once! You can simultaneously exhibit different facets of self in different
contexts (often shielded from one another) and in each you are taken at face value.
Sharism - New ways of relating - A growing precedence of co-creation
over individual construction and personal elaboration.
Another noticeable generational trait is that the “natives” don’t seem to first think and
then act, or first try out things for themselves, and then share them with others. Instead, they mingle before they make, and they share before they draft! The natives are
eager to collect and disseminate half-baked ideas and creations—either found or self-made—which they then bounce around, often at a fast pace, instead of keeping them
to themselves. They often do so with kindred spirits, present or absent, before they
seek help from knowledgeable adults, or more experienced peers.
Border-crossing - New ways of way-making and path-finding–
Taking a walk on the wild side!
More than in previous generations, today’s youngsters are constantly on the move.
They zap between channels, roam about online, and surf between worlds (physical,
virtual, digital). The “natives” may feel at home in more than one place, or not live in a
place in particular. Some seek their grounds in virtual habitats or videogames. Others
carry along the stuff they care about (hence, ever heavier backpacks) and ask for cellphones and other mobile devices to stay connected while on the go. Others yet, especially the children who live in split families, will request that parents buy replicas of a
preferred toy, so that it awaits them whenever they stay over. Their urge to cross
borders, geographic and cultural, puts an end to the notions of home and territory, as
we know them.
Literacies beyond print— New ways of saying it - Deep shifts in what it means
to be literate. Write to Sp[w]rite (speakwrite). Notate to annotate.
With the proliferation of digital presentation and authoring tools, the gap is slowly but
surely closing between the acts of reading and writing, as well as between speech and
writing. Writing is an ever quicker assembly of cut-and-pasted fragments, a blending
of text, images, and sounds. And reading turns into ever more meticulous acts of highlighting, earmarking, and extracting bits for later use. Annotations and editing are the
new mid-ground between reading and writing. “Texting”, on the other hand, is about
typing-to-speak, and since typing is slow, the youngsters invent many clever ways to
speed it up or, alternatively, to make it short (using emoticons, tweeting, re-inventing
spelling). Lastly, today’s authors rarely start from scratch. Instead, they borrow from
those who inspire them and they address those whose opinions matter. And if time permits, they will happily reassemble and remix what they found in order to add their
mark (Ackermann, 2008).
Enactments and simulation - Gaming and simuling - New ways of playing it safe!
The use of the word “simuling” requires some explanation. Unlike simulating, which
implies the faithful reproduction of an original in an attempt to mimic an existing reality (e.g., a professional flight simulator), ‘simuling’ is here meant as the creation of an
alternative world, virtual or physical, that is ‘true’ or believable in its own right. More
than in previous generations, today’s youngsters expect the tools they use to provide
them with immediate feedback, and most important, the tools should let them undo
previous moves (recover) and keep track of what they are doing (revisit). This "good-enough-mother" quality of digital tools (attentive, responsive, and forgiving) breeds
a culture of iteration (try again, build on top) and of playful exploration (go for it, no
move is fatal) in ways that pre-digital tools hardly could. In the worlds of gaming and
simuling, like in children’s pretend play, you are always given a second chance!
“Bricolage and fabrication” - Makers, Hackers, Hobbyists –
New rapports to things.
A bricoleur is a Jack-of-all-trades who knows how to make do with whatever is at hand.
Comparing the bricoleur and the engineer, Levi-Strauss portrayed the bricoleur as
being adept at many tasks and putting together in new ways preexisting things that
usually don't belong together (1962). More than in previous generations, today's youngsters are
bricoleurs, eager to collect, repurpose, and trade things, preferably tangible although
not necessarily. Like today’s authors, they find great pleasure in tweaking what they
find (giving it a second life or extra powers). And as they perfect their skills, our master “pro-ams” (professional amateurs) invent many new ways of making things (crafting, fabricating); making things ‘do things’ (controlling, programming); and repurposing things. If given a chance (and provided with the appropriate support), today’s kids
won't merely consume and dispose. Instead, they will create and recycle. In other words, they
may care!
To conclude,
While not all youth may exhibit the "neomillennial" traits as here described, the trends
are significant enough to be worth paying attention to. This is especially needed at a
time when larger societal efforts are being undertaken to address the gaps between
the natives’ interests and genres of engagement and the concerns of the older folks in
charge of their upbringing.
21st century Requirements4: What Support for the Natives? –
Restoring Inner Balance
Today's youngsters, we have seen, entertain an altogether different rapport to one another and to the world—man-made or natural, digital or physical, animate or inanimate. This, in turn, brings about new ways of being, thinking and doing, and new expectations that ideas and artifacts can be borrowed, re-purposed, and recycled, and
that the tools we use should be responsive and forgiving. Questions remain: How to
live a life in-between and still feel rooted and centered? How to stay connected while
on the go? What to carry along or leave behind? How to share and with whom? What
does it mean to settle in a place that is anonymous, temporary, not yours?
While emerging practices, mindsets, and lifestyles hold great potential, they also
breed new tensions that call for re-adjustment. Boldly put, there may be a price to
be paid for being constantly on the go yet without moving one’s body, and perpetually connected yet out of touch! In what follows,
we shift focus from what the natives bring to the
table in order to what they may be missing on, if
left drifting on their own. For each of the generational traits previously identified, we highlight a
contrasting urge, or compensatory strategy, that
is already observable for those who know how
to listen, and the importance of which cannot be
under-estimated.
Plural identities vs located self / Fluid selves vs feeling centered
Fluid selves may foster empathy, yet for a person to feel whole also requires that she
be centered, and find her voice. In the cultural psychology of self, Ciaran Benson welcomes a certain containment of what he calls the "located self", arguing that the primary function of the self is to orient and efficiently stabilize a person's erring within the flux of ever-changing experience (Benson, 2001, p. 4). One of the biggest challenges for today's youngsters is to remain true to who they are (keep their identities), and
stand behind their ideas as they are moving in and out of character, and putting on
different masks.
Sharism – Connected vs dependent / Attached vs detached
In today’s participatory cultures, being in it together seems more important than doing
it yourself (from DIY to DIT) and, as a consequence, the nature of social formations and
group loyalties are changing fast: Groups are easy to start and sustain. Yet they are
also more volatile and tend to dissolve faster (Jenkins, 2010). This in turn changes how
the youngsters think about personal accountability and negotiate the terms of their
commitments. Lastly, a culture of Sharism, like any consensual group, calls for trustworthy allies, and, while forthcoming and warm, it tends to blur the line between intimate, private and public. Hence the need to learn who to let in on what, and who
better to let out!
Border-crossing - Mobile vs grounded / Dislocated vs rooted /
Erring vs anchored
Whether chosen or forced upon them, the displacements and relocations characteristic of
today’s youngsters’ lifestyles have a profound impact on how they transit and settle,
on what they bring along or leave behind, what they choose to remember or forget,
and when (and with whom) to engage or keep at a distance (Kristeva, 1991). A big
challenge for the millennium generation is to find new ways of reconciling their desire
for evasion and their need to remain securely attached.
Literacies beyond print - Cut-and-paste vs. authoring /
Borrow and pass on vs create and own
One of the biggest problems among educators these days is to come to grips with
what they conceive of as students’ plagiarism: Today’s youngsters’ tendencies to
pick up and pass on ready-made imports, without taking the time to get mindfully engaged (Langer, 1997), let alone add value to what they borrow! It is our view
that borrowing and addressing are quite alright as long as: 1) the incoming bits are
“massaged” long enough to become an expression of who we are, and 2) the borrowed sources are recognized, mentioned, and brought to a next step through our
intervention.
Gaming, simuling - Make it up vs Play it out
Moving in and out of pretense - Most children are quick to know when something is
for good or for fun. And they usually won’t mistake an invitation into fantasyland for
the “real” thing. Instead, they see pretense as an opportunity to play out otherwise
‘dangerous’ ideas on make-believe ground, which in turn allows them to then return
to “real life” better prepared, refreshed, and stronger. This said, today’s culture of simulacra, as exhibited in some reality shows, docudramas, and subliminal ad campaigns
(including political self-promotion ads) can be confusing—especially if no readable
clues are given on just how fake or truthful they are. Today’s kids need the space-time
to critically reflect on what they are made to believe by whom, for what purpose,
through which medium.
Making/ Hacking / Crafting - Do it fast vs Do it well!
Starting from scratch vs. Composing with what’s there. While digital technologies
open up new possibilities to break down objects into subparts and reassemble them
in an attempt to curtail, or recuperate from this breakdown, hacking alone won’t make
for a culture of caring (Sennett, 2008). In other words, even “doers” can be careless if
“hurried”: Things well done require that one slows down, dwells in, and composes
with what's there. Craftsmanship, in the digital age, has everything to do with today's
youngsters’ willingness and abilities to develop an “intelligent hand” at the service of
a “grounded self” and a “playful mind”. Craftsmanship, to Sennett, is an enduring, basic
human impulse, the desire to do a job well for its own sake (Sennett, 2008. p. 9).5
Guidelines for Educators and Designers:
From a Fragmented Field to a Learning Ecology
New environmental qualities are needed to make up for the unsettling consequences of both desired and imposed physical as well as mental "displacements". By "displacements," I here mean the shifting habits, habitats, and the sense of disconnect (estrangement, detachment) that emerge from leading ever more nomadic lives (Abbas, 2011).
It comes as no surprise, then, that increasing numbers of people are already actively
reclaiming their bodies, and territories, in a quest to live a fuller life. And it is not just
the natives, but the immigrants themselves, who come to rally under a new common
call (I put words in the neo-natives' mouths):
Make me able to explore and show my creative skills locally, globally, anytime, anywhere but please don’t forget: I do have a body, and I like to use it! I’m exuberant!
I'm physical: so, let me unleash my imagination (transport, teleport me), but also
make me touch, feel, and move (ground me)!
This new form of collective wisdom, the signs of which are popping up everywhere,
is likely to turn the tide faster than we think. Today’s natives, ironically, are often more
steeped in the concrete, more ecologically minded, and more caring of the grounds on which we
stand.
Digital natives to Digital wisdom6 - Reclaiming the body, Rethinking the territory
We think the way we think because we have the bodies that we have,
and live in the world in which we live (Lakoff and Johnson, 1980)
People of all ages and walks of life need places to dwell, traverse, furbish, and get lost
and found again. They need occasions to gather and disperse, and the time for things
well done. And more than ever, people need to use their bodies for more than just
"mousing" their avatars around in Second Life, driving cars to go places, or taking the
escalator to go to the gym!
No doubt, roaming about in cyberspace, or talking to friends on Facebook, can be
enjoyable and, in many cases, is as beneficial as hanging out in the street. Likewise,
following beaten paths feels different than venturing into the unknown, with no possibility of return, and doing so on- or offline has different risks attached. Each makes
for a unique experience. While in today’s new media ecologies there is place for any
combination of the above, it is the wisdom of not getting stuck in a single mode, or
genre of engagement, that matters7.
The concept of learning ecology, as defined by Urie Bronfenbrenner, offers a useful framework for the design of educational settings, and to assess the ambient qualities
needed to foster everyone’s physical, emotional, and mental wellbeing, especially in
times of uncertainty.
Bronfenbrenner called his theory “bio-ecological systems theory” to emphasize
the notion that, indeed, a child's body itself is a primary environment that fuels her development (Bronfenbrenner, 2004). Our bodies are our "closest" life support system,
mobility system, and mediation: that with which, and through which, we perceive and
interact with our world. To study a person’s development then, we must look not only
at her immediate environment (personalized and relational interactions), but consider
the well-being of her mind/body (health, nutrition, exercise) as well as the impact of
the larger, more remote, environmental conditions that affect our physical, mental, and
emotional well-being (fields of forces that propagate through distributed networks of
influences). What is true of the growing child is equally true of the learning adult.
Bronfenbrenner's bio-ecological model comprises four layers, or circles of influence, plus a temporal dimension, which together form a tightly interwoven system:
1. The microsystem (the layer closest to the person; it includes relationships and interactions with the immediate surroundings, including her body).
2. The mesosystem (one step removed; it enables the child to distinguish between first-hand interactions, or close relationships).
3. The exosystem (the larger social/relational and built/natural environments in which we are embedded—yet have no direct control over).
4. The macrosystem (the outermost layer, comprised of cultural values, customs, and laws, whose effects have a cascading influence throughout the other layers).
5. The chronosystem (the dimension of time, and transitions over time; influences at this level can be external, such as the timing of a parent's illness, or internal, such as the physiological changes that occur during puberty, and they can be sudden or progressive, cyclic or incremental).
Important in Bronfenbrenner’s model are the notions that: (1) bi-directional influences
at the micro-layer are strongest and have the greatest impact on the growing child; (2)
mutual influences can be direct or indirect, and their impact mastered differently over
time; and (3) interactions at the outer levels, while out of our control, still impact us
very directly.
To Conclude
In recent years, ecological approaches to human development have regained momentum, which comes as no surprise at a time when educators and policy makers are
tackling highly complex issues, such as the balance of needs and resources, the uses
of digital media and technologies, and the need to provide new venues and opportunities for students' academic success, youth development, and lifelong learning (Herr-Stephenson, Rhoten, Perkel, & Sims, 2011).
It is our belief that a host of new initiatives, programs, and institutions will need to
be developed, including places, times, and occasions for incidental learning (that are
not intentionally designed to teach, at least in the didactic sense of the term). And existing programs will have to be readjusted to fit the needs of all learners, everywhere,
and all the time. The role of digital media and technologies will be significant, but
more than anything else it is the learner’s physical and mental health and wellbeing
that will become a priority.
Learning, in the future, will be about the art of living as much as the science of
learning, long-term benefits as much as short-term achievements, and sustainability
beyond development. The gardener's metaphor, as exemplified in Alan Kay's Vivarium
Project will prevail over the information-processing paradigm, still dominant in educational parlance8.
A “vivarium,” in Kay’s sense, is an enclosure or reserve, especially designed for keeping plants and animals alive and thriving in their “natural” habitat (“natural” is not here
meant in opposition to the built but as a viable niche, or ecosystem, for its inhabitants). The role of the children and researchers in the project was to observe and study
what makes [and how to make] their vivarium a “better place” for its dwellers! Initially
launched in the eighties to gain insights into the design of human-machine interfaces, the project differs from others of its kind (OLPC, LOGO, Microworlds) in that the children were working in a real "augmented garden". Most important, Kay didn't just use the gardening metaphor as an entry point to using computers, but the computer, and many other tools, as a means to keep the garden alive. And to Alan, a primary
school seemed an excellent choice because younger children are still “in their bodies”,
steeped in the here and now, and open to their senses. Their thinking is not yet bound
by adult certainties and conventions.
It is also Alan, working with children, who reminded us of the obvious, regarding
digital technologies: "we adults call technology any tool that was invented – after I am born :)" Not so for kids! One could write an entire new essay, just on that!
Acknowledgements
Many thanks to Maria Voyatzaki, Constantin Spiridonidis, Marianthi Liapi and Kostis Oungrinis, and
the participants in the 2011 “sense[res]ponsive environments for children” workshop, in Chania,
Greece. I extend my thanks to Yasmine Abbas, Ciaran Benson, and Sherry Turkle, as well as to my
colleagues from the LEGO Learning Institutes, Exploratorium Science Museum, and AIANY, whose
works are an inspiration to my thinking about people and places.
Notes
1 We call the changes “epistemic” because they question how pre-digital cultures have come to
define knowledge and to think about thinking itself, and how their views on how to promote
everyone’s potentials are projected on those who don’t think like them!
2 Ciaran Benson defines place as a "humanized personalized space", and he uses the term place-time to indicate that "in personal and collective memory certain places are inexorably constituted by their […] connections with, and embodiment of, certain moments in experiential
time […] Place situates time by giving it a local habitation. Time arises from places and passes
between them (Benson, 2001, p. 6).
3 Scaffolding is about supporting learners to achieve beyond existing capabilities by giving
them a ‘step up’ through questions, pointers, or encouragement, rather than direct instruction.
Ultimately, the learner should reach a point where they won't need the scaffolding support. In this case, the mere knowledge, or perception, that there are trusted others on whom one can rely, becomes enough to support self-reliance.
4 There is much talk about 21st century skills and standards these days, and much research is being devoted to redefining what today's youngsters ought to know, or learn, in order to become
active and successful players in tomorrow’s world (Jenkins, 2009; Weigel, James & Gardner,
2009). While important, such guidelines often emerge from adult projections and as a result,
they tend to downplay what the youngsters themselves are contributing. As mentioned earlier,
our focus as a psychologist is on the emergent traits, as exhibited by the natives, more than
on adult projections.
5 The craftsman establishes an intimate connection between head, eyes, hands, and tools. And
as he perfects his art, the materials at hand speak back to him through their resistances, ambiguities, and by the ways they change as circumstances change. An enlightened craftsman is
one who falls in love with the materials and becomes so fluent in using his tools that he feels
at one with them. According to Sennett, such appreciation and fluency are in no way contrary
to play, since it is in play that we find the source of the dialogue the craftsman conducts with
materials, such as clay, wood, or glass.
6 We borrow this title from Prensky's latest book of the same name, published by Corwin, 2011.
7 The term "new media ecology" refers to environments in which traditional tools and mediations intersect with, are augmented by, and in some cases mimic or invert, their digital counterparts. "Genres of engagement" refers to the ways the digital natives (and immigrants) navigate, stake, inhabit and furbish today's hybrid media environments—with the assumption that no one ever lives in one realm, mode, or channel, alone. Instead, we all move between realms: physical, virtual, and digital alike.
8 For more on the history of the Vivarium Project, visit http://www.beanblossom.in.us/larryy/
VivHist.html
Bibliography
Abbas, Y. (2011) Le Néo-Nomadisme: Mobilités partage, Transformations Identitaires et Urbaines.
Editions FYP (France).
Ackermann, E. (2008). “Notation chez l’enfant : Du Graphique au numérique”. In Apprendre demain:
Sciences cognitives et éducation à l’ère numérique (Eds. Andler, D., Guerry, B). Paris: Hatier. (p. 77-94).
Benson, C. (2001) The Cultural Psychology of Self. London & New York: Routledge.
Bronfenbrenner, U., (2004) Making Human Beings Human: Bioecological Perspectives on Human
Development. Sage Publications.
Casey, E.S. (1993) Getting back into place: Toward a renewed understanding of the place-world. Bloomington: Indiana University Press.
Herr-Stephenson, Rhoten, Perkel, and Sims (2011). Digital media and technology in afterschool
programs, libraries, and museums. The MIT Press.
Ito, M. et al. (2010). Hanging Out, Messing Around, Geeking Out. Cambridge, MA: The MIT Press.
Jenkins, H. (2010), “Confronting the Challenges of Participatory Culture: Media Education for the
21st Century.” White paper. The John D. and Catherine T. MacArthur Foundation.
Kristeva, J. (1991) Strangers to Ourselves. New York: Columbia University Press.
Lakoff G. and Johnson, M. (1980) Metaphors we live by. Chicago & London. University of Chicago
Press.
Langer, E. (1997). The power of mindful learning. Cambridge, MA. Perseus Books.
Latour, B. (2005). Reassembling the Social. Oxford: Oxford University Press.
Law, J. (1992). Notes on the theory of the actor network. Systems Practice (p. 379-393).
Lévi-Strauss, C. (1962). La pensée sauvage (in English: The Savage Mind, 1966).
Oldenburg, R. (1989). The Great Good Place. New York: Paragon House.
Prensky, M. (2005). "Don't bother me mom: I am learning!" New York: Paragon House.
Sennett, R. (2008) The craftsman. Yale University Press.
Turkle. S. (2011) Alone Together. New York: Basic Books.
Vygotsky, L. S. (1962) Thought and Language. Cambridge, MA: MIT Press.
Vygotsky, L. S. (1978) Mind in society. Cambridge, MA: Harvard University Press.
Weigel, M.; James, C.; Gardner, H (2009) Learning: Peering Backward and Looking Forward in the
Digital Era. The International Journal of Learning and Media, Volume 1, Number 1, Cambridge, MA,
MIT Press.
Antonino Saggio
Sapienza University of Rome
Italy
GreenBodies
Give me an Ampoule
and I will Live
This essay summarizes my research over the last five years and is projected to become the next chapter of my last EAAE-ENHSA keynote speech, delivered five years ago.1
When I delivered that speech, according to my usual practice, I created a hidden
link, which expanded the lecture so that the audience could see the images, read the
texts, access further pages and, when the conference was over, listen to the audio, all
from this one link.2 This use of the Internet is relevant because content and container
are interwoven. If I want to speak about processes, interconnections, ecological systematic thinking and IT, how can I do it with a linear (and private) slide presentation?
We are on the web; let’s share, and particularly use the inner philosophy of electronics:
Interconnections.3
The 2005 keynote was entitled “Give me a cord and I will build.... Construction, Ethics, Geometry and Information Technology”.4 This one is entitled “Green Bodies. Give
Me an Ampoule...and I Will Live. Rethinking the human: Ecosystems for today’s architectures.” It is evident that key words have shifted from: “construction,” “ethics” and “information technology” to “ecosystems” and “green bodies.” The two main key words
have also changed.
“Cord,” which was used then as a symbol of geometry and construction and at the
same time as the instrument to build, has transmuted into:
“Ampoule” as the symbol of life and at the same time as the instrument with
which to create ecosystems.
The title “Give Me an Ampoule...and I Will Live” should begin to create the mental
framework in which we are moving in this essay. I apologize for the length and complexity of some passages. It is more challenging to go along new lines of research
than to present well established ones.
This essay is organized in seven parts. Each part is a “city” which we can inhabit for
all our scientific life or just look at briefly from an airplane. Nevertheless, all seven cities are part of a common territory. It is a system of relationships to facilitate the birth
of design ideas, which are relevant to our topic. Here are the seven parts of the talk:
1. Hybridization between Systems of Architecture and Systems of Nature.
2. Parallel Lines Do Meet. The Awareness of Limited Resources.
3. Processes, not Objects.
4. Synergy. Vernadsky + Buckminster Fuller = John Allen’s Scientific Experiment.
5. Biosphere 2 and Closed systems.
6. Current research.
7. Principles of Green Bodies.
Hybridization between Systems of Architecture and Systems of Nature
The idea of today is that architecture must become a reactive landscape,
complex, animated and alive in a process of combination with other elements of technology and of the environment.
The aspect of hybridizing the natural and the artificial is thus moving
towards the center of the conception of architecture nowadays.
The nature intended in this concept is no longer floral or “art nouveau-style;” neither is it the nature of the masters of Organic Architecture,
counterpoint to the mechanical and industrial world. Current concepts
of nature have in fact become much more complex, much more difficult,
much more “hidden.” This nature is also investigated by architects and
designers with an anti-romantic eye through the formalisms of contemporary science (fractals, DNA, atoms, the leaps of an expanding universe,
the relationship between life and matter, topological geometry, animated
forms), in other words, through the categories of complexity. Hidden in
this context are the figures of flows, the wave, whirlpools, crevasses and
liquid crystals; fluidity becomes the keyword. It describes the constant mutation of information and places architecture face-to-face with the most
advanced research frontiers, from biology to engineering, to the new fertile areas of superimposition such as morphogenesis, bioengineering or
biotechnology.
IT endows architecture with reactive systems capable of simulating
types of behavior in nature, in reacting to climate, usage flows and ultimately also emotional behavior, and so offers a new phase of esthetic
research.
The approach described above opens the path to different research. In order to better
understand the idea of hybridization between architecture and nature, I went back to
a moment in which there were no separations between man and land, construction and nature, rational and magical. It was a moment in which the interconnections
among things were more important than the things per se.
The Etruscans had an integral, magical, heuristic relation with nature. Vie Cave are
the most relevant examples of this attitude. They are long, human-excavated processional streets down which the dead were brought to sepulture. At the same time, the
Vie Cave were used to celebrate nature. For the Etruscans, nature speaks. She lives
and breathes in a sphere shared with all the other creatures. Nature is alive.
Humans, animals and land were interrelated, interconnected; they were part of the
same “system.” The governing forces of this system could not be explained by “analytical” reductionist means but only by “ecological” ones (i.e. based on interconnections,
therefore antiscientific from a positivist, reductionist, analytical point of view). This is
the central concept derived from this research path. Hybridization is not only a “formal” device; it is rooted in profound ecological thinking. It is an action that is part of
an “ecosystem.”
From a more direct and "architectural" point of view, the Etruscan civilization is the civilization of the "section," because it is the section that celebrates the marriage between the earth
and human artifacts.
The “plan” is the symbol (and the instrument) of the Roman military and expansionist attitude. If the plan is the symbol (and the instrument!) of rational domination,
the section is the symbol/instrument of ecological inhabitation. If the Etruscans hybridized architecture and nature through the section, the Romans "posed" independent objects on the land: Aqueducts, streets, and bridges.
Later on, towards the beginning of the nineteenth century, all the world of mechanical artificiality related to the Industrial Revolution developed that "rational" idea of domination and infinite conquest much further.
If an “ecosystems” approach to architecture should take place, then
architecture must belong simultaneously to the land and to the cloud
(i.e., Information Technology). This interconnection is the crisis and
the challenge in front of us.
The proposal for a Museum for Francesco Borromini in Rome5 is a good example of
how these ideas of Land, Architecture and IT may take shape today. This final thesis
starts from the notion that “Modernity is what turns crisis into a value and gives rise to
an aesthetics of rupture.”6 The crisis that precipitated this project was the fracture provoked by a thruway in the old historical park of Villa Pamphilj. From historical research,
the presence of Francesco Borromini emerged in the planning and design of the villa.
The Doria Pamphiljs were indeed his clients for the Piazza Navona Palace in downtown Rome. From the Borromini presence emerged the brief: A mixed use project
that, as a driving force, proposes a Museum dedicated to Roman Baroque architecture
- MOB. The project’s development was based on the use of a diagram inspired by one
of Borromini’s ceilings. It was an inspired choice. As the ceiling lines “connect” the different walls of Cappella dei Magi, in Rome, the same family of lines may connect the
opposite sides of the park. Borromini's drawing was pulled and stretched to adapt to
the site that had been cut by the thruway. The project idea developed as a membrane
structure, half natural and half artificial that was modeled along the diagram’s lines.
The architecture is indeed a hybrid: Building, land, bridge and nature at the same
time.
This architecture belongs, at the same time, to the Clouds of Information
Technology.
In this process of hybridization the catalyst role is, of course, that of Information Technology, which is the key for an entire group of connected reasons.
In the first place, the information era provides an overall different model of the city and urban landscape, as well as in part the surrounding territory that has mixed uses with overlapping flows, open 24 hours a day for
production, leisure, social, and residential activities, where natural and artificial elements are woven together with the combination of functions and
uses.
In the second place, information technology supplies the “mathematical models” to investigate the chemical, physical, biological, and geological
complexity of nature. These simulation models permit structuring new relationships in projects that consider reasoning and dynamics. In this process,
information technology supplies the essential tools for first creating, then
designing, and finally constructing designs conceived with these complex
systemic approaches.
In the third place, information technology endows architecture with reactive systems capable of simulating natural behavior in their reaction to
weather, flows and usage, as well as to ultimately emotional behavior, and
thus offers a new phase of investigation into a concept of landscape that
is not just “simulated” in architecture but actually and physically represents
several aspects. This means defining an environment and an architecture
that not only evoke the formative rules of landscape and nature, but also
propose environments capable of interacting and evolving. In this context,
information technology enters directly into the fiber itself of new buildings,
first by digitally designing them, and later building them using new construction techniques, but above all by exploiting dynamic electronic interconnections to create environments that react to variations in real situations and flows to form a sort of IT landscape in new buildings.7
Not only is IT indispensable in the process of ideation and formation of many projects today; in many cases the architecture incorporates contemporary electronic technologies to actively partner in the environment. In the case of MOB, the edifice transforms and purifies the air, heavily polluted by the passage of cars underneath it. The building becomes active and can be seen as a living being.
Among the research work produced by NitroSaggio is the built prototype of “A New
Primitive Hut.” This is a good example of an architecture that creates a hybrid half natural and half electronic environment. The movement of the person in this new “hut”
molds the environment by interactively changing the sound. In this way, "the occupant shapes information while he/she moves demonstrating that the current idea of
space ’is’ also informational.” 8
Parallel Lines do Meet
The Concept of Limited Resources
The idea of the city for the Functionalist CIAM (the International Congress
of Modern Architecture) evoked a city in constant centrifugal movement as
if it were a flywheel that could “youthfully” and mechanically expand, absorbing pieces of the surrounding territory. We know this model has entered
a crisis period over the past few decades for a whole range of reasons, not
the least the awareness of the limited nature of resources and the birth of
an ecological consciousness. As we have mentioned, the presence of the
information era has contributed greatly to this because the change in the
production model (robotization, miniaturization, the decentralization of
heavy, polluting industries) creates new opportunities and frees up resources. In particular, the great industrial areas becoming available create the
possibility of an epochal reclamation project. Reclamation is an essential
key word here since green spaces, nature, and park facilities can now be introduced into areas frequently filled with high-density construction. At the
same time, large natural areas must be conserved and respected and not
eroded infinitely by the undifferentiated expansion of new suburbs even if
they are supplied with wireless broadband.
More specifically, if CIAM’s idea of nature was “green,” i.e., something
that resembled a patchwork on a plane where green zones contrasted with
residential, industrial, or office areas, the modern concept is one of landscape (cf. Landscape); in other words, a much more complex idea that sees
nature and constructed areas “together,” a constant hybridization between
the formative rules of the urban landscape and the architecture itself of
buildings. To sum up, architecture and urban planning themselves make
up today’s landscape. Architecture takes what it does not have, absorbs it,
transforms it, makes it its own, and reconstructs a new idea of nature.9
We do not believe in the presence of unlimited resources and therefore architecture
and the city cannot indefinitely expand. The vision of the never-ending railways at
the conquest of the Far West or of the Urban highway extended towards the horizon
shaped a long phase of architecture and urbanism. It was an idea still embodied in
the Deconstructivist movement of the eighties and the nineties. You may recall Between Zero and Infinity by Daniel Libeskind, for example.
To think about ecosystems for today’s architecture I propose another formula, another vision: instead of “From Zero to Infinity” I propose: “Parallel
lines do meet”.
We have to change our point of view once again. “Parallel lines do meet,” means that
we live in a closed system with limited resources. We do not live in the never-ending
flat plateau of Euclidian math, but in the curvilinear, negotiable topological world of
planet earth! We “are” in this closed system, we are in this planet, and in planet earth
parallel lines meet.
Not only are we in a world of limited resources, we are also in a world in which our
actions can kill or mend the world. If we continue to perforate the earth, for example,
it is rather clear that we are going to kill it in the end. The Earth's crust is not only a metaphor here: Just think of the "cycle" of petroleum. But at the same time we can think of actions that can mend, ameliorate, and be compatible. And architecture, together with science, must be at the forefront of this search.
Systems: Processes not Objects
The idea of the functionalist city was implicitly tied to the idea of the assembly line that organized a series of operations to be performed sequentially so as to achieve efficiency in the production cycle. Each phase
was constantly perfected and optimized to then move onto a subsequent
phase.
But the concept of “before and after,” “cause and effect,” “if … then,” related to mechanized, serial production has now been replaced by a concept
of simultaneous processes, subdivision of cycles, the presence of alternatives, in other words of “what...if.”
The network that diffuses, interrelates, interconnects, and makes the
development of processes both global and local has inevitably replaced the
figure of the line.
The aim of the production system is no longer the uniformity and homogeneity of the final result (guaranteed by constantly greater improvement in the various production phases) but exactly the opposite. It is the
personalization of the product based on individually activating several different connections each time in the informational network.10
We want to focus on issues related to education and curricula. Up to the recent past,
architecture was expected to produce primarily artifacts, i.e., objects. To produce objects in the industrial era, the assembly line was the way to go and the flowchart, the
model from industrial production, moved to education. Accordingly, teaching was
chopped into areas and subareas following the same principle used for industrial production. But if "we have to change a lot," as professors Spiridonidis and Voyatzaki stated, and if we want to address the issue of Ecosystems, we have to modify that.
We have to start addressing teaching through the development of
“processes” and not “objects.” Electronic and ecological thinking are
both based on interconnections. Architecture should not produce one
“object” but a series of methods to implement relationships and families of solutions.
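By way of illustration only (the example is not taken from the essay), here is a minimal sketch of that shift: a short script that does not describe one object but returns a whole family of candidate solutions from a set of parameters. The class name, the dimensions and the sinusoidal roof profile are hypothetical assumptions made for the sketch.

```python
from dataclasses import dataclass
from itertools import product
from math import pi, sin


@dataclass
class Variant:
    """One member of a parametric family of (hypothetical) roof profiles."""
    span: float        # metres
    amplitude: float   # metres of undulation
    waves: int         # number of undulations across the span

    def profile(self, samples: int = 9) -> list:
        """Heights of a simple sinusoidal roof line, sampled across the span."""
        return [round(self.amplitude * sin(self.waves * pi * i / (samples - 1)), 2)
                for i in range(samples)]


def family(spans, amplitudes, waves):
    """The 'process': every combination of parameters is a candidate solution."""
    return [Variant(s, a, w) for s, a, w in product(spans, amplitudes, waves)]


if __name__ == "__main__":
    for v in family(spans=[12.0, 18.0], amplitudes=[0.5, 1.5], waves=[1, 2]):
        print(v, v.profile())
```

Changing the parameter lists regenerates the entire family, which is precisely the sense in which the script, rather than any single variant, is the design.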
Therefore, scripting and parametric design, so popular these days, are not only a fashion; they are rooted in this shift from object to process! It is a Copernican revolution. This brings us to two other rather important factors of change.
The first one is that contemporary teaching must be more oriented towards the
development of “Projects.” In a world dominated by information, and with extremely
easy access to knowledge, what becomes critical are the motivation, the methodology and the instruments to study. If we create projects that motivate students by their
inner strength and necessity, if we teach how to structure the search for information, if we provide basic guidelines on the direction in which to look and on the complexity behind it, students will educate themselves. Recently this pedagogy of "teaching
by projects” has been proven highly successful.11
The other interesting factor of modification is what we call, in our teaching jargon,
“maratonda.” This reverses the old “in-out” linear flow and substitutes a circular, cyclical
“in-out-in” process which is attentive to the use and re-use of resources.
Here is one example. This project is called “Place Less. From Playground to Urbanground: Monitoring and recycling”12 and the crisis was the condition of homelessness
in Rome, the pollution and the way to have more intelligent and creative tourism for
the young. Well, you may think that we are crazy. How can all these three elements be
put together?
The solution was rather interesting. The students designed a little mobile device
(car, cart and bicycle at the same time) that had different positions. The user can cycle, collect, trash, recycle, and check the pollution or just tour. The mobile device can
be offered to homeless or poor people to earn some extra money, or can be rented.
The device also has a place to rest in a sort of urban park that is organized as a movable
landscape.
We do not ask, "design a vehicle for the homeless." We frame knowledge and challenges; we provide methods and instruments. The students find "a crisis." We work with them to shape the concept, articulate the brief, and develop the project. Pedagogically, students "do
not learn by doing” in Deweyan terms, but rather “learn by necessity
and by desire”.
In this process students face a number of issues, study different matters, and develop
specific skills. This example refers to a class based on the relationship between IT and
Architecture. When a more direct architectural design is required, other approaches
are developed, but we cannot address them here for lack of time.13
Synergy
Vernadsky + Buckminster Fuller = John Allen’s Scientific Experiment
Now let’s go to the more typical cultural-informative part of the conference. I want
briefly to talk about a fundamental project for the creation of Ecosystems for Today’s
Architectures.
In 2006 I met John Allen, the inventor of the scientific project Biosphere 2. I consider myself lucky because I entered the ecological world with one of the top ecologists.
Through Allen I understood things that I could not get "just" by studying the literature.
What follows are some of these findings.
Let me underline one of the most important principles of ecological and systematic thinking: synergy. I understand synergy as "biological mathematics." While in algebraic mathematics 1+1 = 2, in Synergy 1+1 can make 3 or 4 or 5, or -1 or -2. If the minimum principle of synergy applies, then 1+1 equals 3.
Now, 1 + 1 = 3 is a good formula of “creativity.” Creativity is a term
that applies to creative thinking as well as to the most creative action
of all: The “creation” of life.
Is it not true that in sexual reproduction 1+1 makes 3? Starting to think in these terms
opens new doors indeed. For example, architecture is one of the greatest synergies
one can think of. We take raw materials and by putting them together we increase
the value of the product. We cannot calculate the “cost” without algebra, but we
cannot understand “the value” without a feeling of the synergetic process we went
through to create it. We create energy, to a level that is impossible to create with a
normal sum.
John Allen created an incredible synergy between two men. The two men were,
on one side, Buckminster Fuller, and on the other Vladimir Vernadsky.
Allen, by 1971, was already calling his ranch in Santa Fe, New Mexico, Synergia
Ranch. It was his vision of life and also a homage to the chapter “Synergy” dedicated to the subject by R. Buckminster Fuller in his coeval volume Operating Manual for
Spaceship Earth. Nowadays, interest in Buckminster Fuller has been revived but for my
generation he was almost completely cut out, as if he did not exist. He was considered
a strange, humanistic engineering fellow who wanted to put humans and technology
together!
Allen and Buckminster Fuller had a strong relationship in the last phase of the latter's life and many ideas took form. His book Operating Manual ... is a fundamental book, some kind of "manifesto" of ecological thinking. It is a small book, important to read, where several ideas are interconnected. Crucial is the idea of finite resources and of the closed system: the Spaceship Earth of the title of the book.
Second is the need to be interdisciplinary, a concept that Bucky takes from the culture of sea people. Sea people must know everything, from stars to winds, to underwater rocks, to geography and animals, to cultural habits, religions and languages. Buckminster Fuller dedicates fantastic pages to pirates. Let us not forget, on the other side, that the technology for "the rest of us," the one we are using today, was created by a small group of people at Apple in Cupertino under a pirate flag.
So the idea of “energy” created by an interdisciplinary group of people “closed”
in Renaissance Florence or in Apple’s “Texaco Towers” applies quite well to Bucky’s
thinking.
Allen underlined some aspects of Buckminster Fuller's method through an algorithm that lays out a synergetic approach. Here is the citation:
If you take the synergetic overall approach then proceed to a comprehensive anticipatory design;
if you’ve started on this, then make detailed macro-comprehensive and
micro-incisive studies;
if these are completed, then proceed to do more with less; ephemeralize;
if you’ve ephemeralized, then computerize to check rationality and to
communicate;
if you’ve computerized then check if you’ve increased the wealth of all
involved.
(...) This algorithm constitutes his greatest contribution to dealing with the
challenges coming toward humanity in the next century, a time of great
planetary troubles, which he metaphorically referred to as humanity’s final
examination.14
Do not forget therefore these five steps: 1. Comprehensive design ("have the whole"), 2. Macro and micro tests, 3. Do more with less, 4. Computerize and 5. Ensure the increased value!
If there is an indispensable point of reference for Ecosystems for
Today’s Architectures, it is represented by Buckminster Fuller.
And here comes the second man whose contribution allowed Allen’s synergetic
invention:
Vladimir Vernadsky (1863 – 1945) was a Ukrainian Soviet mineralogist and geochemist who is considered one of the founders of geochemistry, biogeochemistry, and of radiogeology. His ideas of Noosphere were an
important contribution to Russian cosmism. He also founded the National
Academy of Science of Ukraine. He is most noted for his 1926 book The
Biosphere in which he inadvertently worked to popularize Eduard Suess’
1885 term biosphere, by hypothesizing that life is the geological force that
shapes the earth. In 1943 he was awarded the Stalin Prize.15
We will talk about Biosphere later; for now let's notice that Russians - via Vernadsky
- use the word “cosmos” while Americans use the word “space.” The difference is important because the idea of Cosmos underlines that forces “are together,” they are interconnected and interrelated. Technically, Vernadsky was the first to prove that “oxygen,
nitrogen and carbon dioxide in the Earth’s atmosphere result from biological processes.” 16 This finding gives shape to the development of the concept that the World can
be seen as a series of interlocking spheres. They belong, for Vernadsky, to the sphere
of life (which is of course called “Biosphere”), to the sphere of Geochemistry, to those
spheres of cultural knowledge and technology.
If cosmos is “solid,” space is “empty”. If cosmos is regulated by complex and probabilistic interrelationships, the absolute Newtonian
laws of physics can govern space, if space implies the possibility of an
unlimited expansion; Cosmos implies the necessity of the coexistence
of different forces.
Now, not only does animal behavior influence the inanimate sphere, but the cultural and technological spheres also influence the biosphere. This is the crucial aspect of this approach. As
Allen clearly underlined to me, in an ecological approach there is no such thing as the
environment on the one hand, and man on the other. The concept of environment is
anti-ecological by definition, whilst ecology is about the interconnections!
John Allen, a geologist like Vernadsky, and at the same time a personality profoundly connected to literature, put together the operative, profound, revolutionary, nonconformist, holistic thought of Bucky and his geodesic technique with a philosophy stemming from a distant and, in fact, politically opposed culture in the era of the USA-USSR Cold War. Vernadsky achieved cosmic reasoning and saw geological, biological, atmospheric and human phenomena as an interacting whole of forces and forms.
After the invention and construction of the Synergia Ranch, Allen built a ship, following Bucky's understanding of the interdisciplinary practices of sea people. Called
the Heraclitus, the vessel has since 1974 been circumnavigating the world collecting
data from all its different spheres. But the great achievement of Allen and his Ecotechnics group was the ideation in the eighties (after a series of interdisciplinary conferences, and the construction of other preliminary projects) of Biosphere 2: a great,
probably the greatest and most interesting ecological experiment ever built.
Biosphere 2 is a 3.14-acre (12,700 m2) structure originally built to be an
artificial, materially-closed ecological system in Oracle, Arizona (USA) by
Space Biosphere Ventures, a joint venture whose principal officers were
John P. Allen, inventor and Executive Director, and Margret Augustine, CEO.
Constructed between 1987 and 1991, it was used to explore the complex
web of interactions within life systems in a structure that included five areas based on natural biomes and an agricultural area and human living/
working space to study the interactions between humans, farming and
technology with the rest of nature. [2] It also explored the possible use of
closed biospheres in space colonization, and allowed the study and manipulation of a biosphere without harming Earth.17
Biosphere 2 has little to share with the greenhouses that have been built around the
world - the most famous one being the Eden Project in Cornwall, Great Britain, by Nicholas Grimshaw. These projects can be considered very interesting from an architectural point of view, but they are not "ecosystems," they are not "scientific experiments:"
Biosphere 2 is different! It was built as an experiment and it did work. Not only were
dozens of patents on different issues created, but also Biosphere 2 was fully tested.
Eight people lived in this completely closed system not for one but two years!
Biosphere 2 and the Closed System
At the core of this project there was the ingenious intuition that the idea of the biosphere as promulgated by Vernadsky could be combined with the ecological observation and technical inventions of Fuller.
Biosphere 2 was thus built in 1991 at Oracle in the desert near Tucson, Arizona, and
still affirms itself as an extraordinary work of both engineering and ecological science.
Allen, assisted by a team of numerous consultants, among whom the architect Margaret Augustine and the engineer William Dempster should especially be remembered, realized a project in the image and likeness of the terrestrial biosphere, so that an interacting whole of geological, ecological and human forces, formed of seven biomes (ecologically balanced systems), could serve to study systematic phenomena.
Biosphere 2 was based on these dynamically balanced systems where carefully studied percentages of plants, microbes, water, animals and air were in a cycle of
continuous regeneration. Through complex research with many experts specializing
in different areas, the seven biomes were thus determined (from the Amazon forest
to the Great Coral Reef, from the anthropological Mediterranean environment to the
same ocean’s marine environment) all housed within great glass paneled surfaces that
covered an area of more than a hectare. Living and relaxation areas and laboratories
were also integrated into the structure.
The experiment allowed, among other things, the patenting of various systems and technologies that brought the recycling of water and of human and animal waste up to 100%, as well as the autonomous generation of food and a minimal loss of air inside the great closed environment.
Eight scientists, including Mark Nelson and Roy Walford, lived sealed up in this environment for two years, experimenting with its efficiency.
After this period, Biosphere 2 was handed over to Columbia University and then to the University of Arizona, which modified its structure. Nonetheless, this extraordinary
event marked the basis of a possible systematic development of architecture, an architecture that need not necessarily be connected to infrastructural networks but is
autonomous with regard to its own vital and energetic cycle.18
This is a picture of the scientist Dr. Clair Folsome,19 who in the mid-sixties did the first experiment to prove the perpetuation and development of life in a closed system. It is the key image of this talk. Folsome's work proves that water, air and microorganisms can be in equilibrium for a long time if they are sealed in a closed environment. An ampoule is an image closer to our earth and its atmosphere than is a never-ending railway track! Give me an Ampoule… and I will live.
From this link20 it is possible to access a site created by the Italian photographer Toni Garbasso, and to watch and navigate in a spectacular 3D immersion of Biosphere 2, which is currently managed by the University of Arizona. Unfortunately, today many
of the scientific aspects of Biosphere 2 have been dismissed. The cruel destruction of
the scientific data and material of the project, even the removal of the original soil, of
all plants and seeds, was an act that at some moment in the future will be the subject
of a movie.
Biosphere 2 was built with donors' contributions and money from a private developer who was seeking the possibility of using the technology in a field of increasing interest, including for NASA. But a couple of years after its completion a terrible attack was mounted against it. The establishment could not accept the idea of the ecological "system" as shown in Biosphere 2 because it was a real challenge to the way the current economic "system" operates. One system was set against another. Just imagine what it means to prove, in a real experiment of that magnitude, how to avoid the use of pesticides or of any other chemical products in agriculture. Try to imagine what this means for the huge market of chemicals in agriculture. Energy is another issue, as are the recycling of water, the use of waste, and so on. At that moment the Internet did not yet exist and the media were controlled from the top, with very little possibility to react. A converging attack of the media, governmental ecologists and politics succeeded in moving the original creators out of the project and, exactly as happened with the Apple Macintosh when Jobs was fired, in destroying the basis of the project. I recently published a book on the history of the last century - Architecture and Modernity. From Bauhaus to IT Revolution, Carocci 2010 - and in this book I proudly included Takis Zenetos, Samuel Mockbee, Paolo Soleri and the history of Biosphere 2 and John Allen.
If Bucky can be an indispensable reference, I think that Biosphere 2 is a fundamental example to study in order to address Ecosystems for Today's Architectures.
Some Examples and Current Research
New designers seek to give form to an idea of architecture born out of systems of dynamic interconnections, interrelations, mutations, and topological or parametrical geometries, typical of the world of information
technology. A whole series of architects are giving shape to a sort of hybrid
environment between nature and technology. Although this may not have
the clarity of that “collectively shared” representation assumed by the early
works of Hadid, Gehry or Eisenman, its features have already been outlined.
This notion of a computerized landscape is closely linked with contemporary scientific methods of investigation and simulation. Structured
through information technology, this idea uses the term “complexity” as
a sort of key word. At various times it can show typhoons, cloud formations, the reproductive mechanisms of DNA, or sedimentation of crevasses or terrestrial masses. But the difference between this generation and
the previous is that these experiments are not performed with sketches
or metaphorical images, but are investigated directly through computer
simulations. The genetic mechanisms of various phenomena are studied
and formalized (i.e., interpreted with mathematical equations) in these
simulations.
The mathematical formalization guaranteed by information technology leads to the birth of real project strategies (particle systems, attractors,
modifiers, etc.) that guide and conceptualize the logic for developing the
project. In this case, computer technology is not a tool for realizing a complex landscape considered independently from electronic media, but rather
it studies phenomena taken from the world and matter, and by formalizing
these phenomena identifies variations that slowly but inexorably lead to
new concepts of architecture, in an inextricable weave between the object
of study, computer modeling, and architecture.
We can pinpoint, very briefly, some of the current architectural research in this area.
One case of interest is surely Francois Roche and R&Sie(n). Roche is working towards
an idea of architecture as a hybrid body. Recently we discussed Biosphere 2 and he
was not very interested. Roche may sometimes be too focused on the formalization
in architecture of outside aspects of nature rather than on the inner functioning of
ecosystems. Many are waiting for a small but convincing built project from him, but
I think his work is crucial and must be studied seriously. A younger emerging group
is Ecologic Studio, formed by two Italians, Poletto and Pasquero, who moved to London. They came out of the Emergent Technologies Masters Unit at the AA headed by
Michael Hensel, who has been on the forefront of the idea of combining engineering
with Information Technology and ecological thinking. One of the best examples of an
innovative approach is the work of the Polish scholar-scientist and artist Zbigniew Oksiuta, who collaborates with Max Planck Lab in Cologne, Germany. Oksiuta is developing prototypes of habitable spaces that grow from artificial material in water. These
new structures are not only habitable, but can be used in different contexts and circumstances and, in some cases, they can also be edible. I was very impressed by Unit
23 led by Bob Sheil and Emmanuel Vercruysse at the Bartlett School, UCL, for their capability to create prototypes of cyclical ecological behaviors within high design and graphic standards. I have dealt, in depth, with several of these groups in the last book I edited. In this book there are essays by members of the Nitro group that go into great detail to describe the above-mentioned current research.
So, Green Bodies, in the end. In order to give life to "something" we, as highly symbolic beings, must give it a name. Giving a name means recognizing that, among infinite and often accidental creations, only that one is what we really desire, the one we were looking for. Giving a name is an inscription in the sphere of desires!
So we have named the long trail of this lecture "Green Bodies." Green Bodies share at least six fundamental characteristics:
1. Green Bodies are not "add-on" or "plug-in" technological support for environmentally sound buildings but, on the contrary, represent a different and complete rethinking of the very idea of building. Green Bodies are living and dying organisms.
2. Green Bodies are generated through a process of Convergence. This means that we are aware of the role, in the Biosphere, of all interconnected spheres, including the cultural, technological and historical ones.
3. Green Bodies are strategically designed based on Buckminster Fuller's five-rule algorithm.
4. Green Bodies are capable of intelligent, interactive, even emotional behaviors.
These behaviors become an active part of the world.
5. To describe, design or - even better - generate Green Bodies, creators must use appropriate verbs: not only the old verbs (to fold, to bend, to graft) that metaphorically relate to the form of land as in the land architecture phase, but also really
organic verbs. Green Bodies do sleep, smile, breathe, and sweat. Bucky wrote “I
Seem to Be a Verb” in 1970.
6. Each generation of Green Bodies generates - in a process of natural evolution - new specimens.
You can interpret these six points in many ways. They could implement an "operating manual," a soft manifesto, a checklist, a chart to add to, modify or expand, the index of our next book, or the topics for a 2017 talk.
Notes
1 The conference was entitled “(Re) searching and Redefining the Content and Methods of
Teaching Construction in the new Digital Era" EAAE-ENHSA at ETSAV in Vallès, Barcelona, 22
September 2005.
2 http://www.arc1.uniroma1.it/saggio/Conferenze/Creta/
3 Among many other books in the "IT Revolution in Architecture" series there appeared The Architecture of Intelligence (Birkhauser, Basel, 2001) by one of the best continuators of Marshall McLuhan, Derrick de Kerckhove. See http://www.arc1.uniroma1.it/saggio/it/
4 EAAE-ENHSA ETSAV Barcelona 22 September 2005 published in AAVV Maria Voyatzaki (ed),
(Re)searching and Redefining the Content and Methods of Construction teaching in the new digital
era, EAAE-ENHSA, Athens 2005, pp. 13-34 see http://www.arc1.uniroma1.it/saggio/conferenze/
Barc/Eaae05.htm
5 Matteo Alfonsi Thesis, Antonino Saggio Advisor, Univ. La Sapienza, Facoltà di Architettura
L. Quaroni Roma 2006 “MOB il museo dell’opera Borrominiana, una macchina atmosferica
per trattare la crisi di villa Pamphilj”. See http://www.arc1.uniroma1.it/saggio/didattica/
Tesidilaurea/Alfonsi/
6 See Antonino Saggio, La rivoluzione informatica in architettura, Carocci, Roma 2007 (English
translation: The IT Revolution in Architecture. Thoughts on a Paradigm Shift, 2008) The phrase
was originally pronounced by Bruno Zevi.
7 In A. Saggio, The IT Revolution in Architecture. Thoughts on a Paradigm Shift cited p. 47.
8 See chapter “Information” in A. Saggio, The IT Revolution in Architecture. Thoughts on a Paradigm Shift.
9 Ibidem p. 38.
10 A. Saggio, The IT Revolution in Architecture. Thoughts on a Paradigm Shift, cited p. 35.
11 One author who devoted many talks and books to the topic is Ken Robinson. For example, The Element: How Finding Your Passion Changes Everything (with Lou Aronica), Viking, 2009. A successful case is the one created in Great Britain by the "Studio Schools," an English private organization, part of The Young Foundation, which created projects such as the Open University. A clear talk on this was given on TED by Geoff Mulgan. In more technical terms this approach can also be referred to as "Project based learning".
12 See http://www.arc1.uniroma1.it/saggio/DIDATTICA/Cad/2006/Ass/FInale/ authors are students Agnese Canziani, Alessandra Cao, Chiara Conte, Giustino Di Cunzolo, Maria Ragosta. A.
Saggio’s Course on IT fifth year Sapienza University of Rome 2006.
13 I am referring to “Urban Voids” and “Urban Green Lines” teaching strategies on urban and
architectural designs. The second project can be partially seen on my web pages and it is in
publication. I talked on several occasions of Urban Voids, one in English is in “Paradigms in
Architecture and Architectural Education” Conference: Assignments assuring Competences
La Antigua Guatemala, ENHSA South America. www.arc1.uniroma1.it/saggio/Conferenze/
Guatemala/
14 John Allen, “Buckminster Fuller’s Synergetic Algorithm and Challenges of the Twenty-First
Century” Speech delivered by for Buckminster Fuller Memorial at U.S. International University,
San Diego June 4, 1996 http://www.biospheres.com/pubjabucky.html
15 From http://en.wikipedia.org/wiki/Vladimir_Vernadsky
16 Ibid.
17 http://en.wikipedia.org/wiki/Biosphere_2
18 An important source of study on this question is John Allen's Me and the Biospheres: A Memoir by the Inventor of Biosphere 2, Synergetic Press, Santa Fe, 2009. The book provides the opportunity to follow in detail the history and conquests of this and others of Allen's projects. There are few architects and engineers, I'm sure, who know what I'm talking about, but thanks to the Web and Wikipedia in particular, in-depth information is available to all.
“John Polk Allen (born 6 May 1929, Carnegie, Oklahoma) [1] is a systems’ ecologist and engineer, metallurgist, adventurer and writer [2]. He is best known as the inventor and Director of
Research of Biosphere 2, the world’s largest laboratory of global ecology, and was the founder
of Synergia Ranch. Allen is a proponent of the science of biospherics.
Allen currently serves as Chairman of Global Ecotechnics, and a director of Biospheric Design
and of Institute of Ecotechnics. He is a Fellow of the Royal Geographical Society, the Linnean
Society, and the Explorers’ Club.
In the early sixties, John Allen worked on regional development projects with David Lilienthal’s
Development Resources Corporation in the U.S., Iran, and Ivory Coast where he became an
expert in complex regional development. Before that, he headed a special metals’ team at
Allegheny-Ludlum Steel Corporation, which developed over thirty alloys to product status.
He has led expeditions studying ecology, particularly the ecology of early civilizations: Nigeria,
Iraq, Iran, Afghanistan, Uzbekistan, Tibet, Turkey, India, and the Altiplano.
He studied anthropology and history at Northwestern, Stanford, and Oklahoma Universities,
and served in the U.S. Army’s Engineering Corps as a machinist. He graduated from Colorado
School of Mines and received an MBA with High Distinction from the Harvard Business School.
Under the pen name of Johnny Dolphin, he has chronicled his personal history alongside the
social history of his many destinations in novels, poetry, short stories and plays. “ from http://
en.wikipedia.org/wiki/John_P._Allen
There is much literature inside and outside the web; it is interesting to note this section that is
hosted by Columbia University itself http://www.columbia.edu/cu/21stC/issue-2.1/specmain.htm
19 Here is a bibliography http://www.biospheres.com/histfolsome1.html
20 http://www.studioargento.com/biosphere2/
21 Architettura & Information Technology, (eds. A. Saggio, ) Mancosu, Rome 2011.
Kostas Terzidis
Graduate School of Design
Harvard University
USA
Digital Culture
and Permutation Architecture
This essay may sound contradictory at times, not because of the lack of information or
the complexity thereof but rather because of a shift in linguistic meaning in the technological jargon today. Over the last few decades, words have changed meaning in
such a way that the same word means something completely different from what it used to mean a few years ago. For instance, consider the title of this essay: digital culture. Even the phrase "digital culture" is a contradiction. On the one hand, culture
can be defined as something entirely human that involves arts, literature, religion, or
philosophy. It is the subjective realization, understanding, and expression of a group
of humans at a particular time in history. On the other hand, digital is something that
is objective, quantifiable, neutral and therefore non subjective. So, from the very beginning we’re called upon to define the relationship between two antithetical terms,
digital and culture. It is almost the same as trying to define what “subjective objectivity” really means.
Given my Greek heritage, it would be appropriate to start with a myth, an ancient
Greek myth which may illustrate metaphorically this contradiction. It is the myth of
Theseus’ ship or rather the paradox of Theseus’ ship. Theseus was a hero in the ancient
Greek mythology that had done many great deeds, such as defending the people
of Athens from monsters, daemons, and thieves, going down to the island of Crete
and killing the Minotaur, saving people from disasters and famine, founding cities,
and many more. Theseus did his heroic deeds by using his ship that he loved dearly. It was that ship that took him to foreign lands, let him escape from dangers, and
opened new sailing paths for him. When he got old, he anchored his ship at the port
of Athens. But the ship was made out of wood and over time it started to deteriorate.
First, the sail fell off. So, Theseus ordered it to be replaced. Then the mast collapsed, only to be replaced immediately. Then the oars, the ropes, the flags, and then the hull fell apart. Eventually, at some point the entire ship was replaced with new parts and, even though it looked the same, none of the original parts was present. And so, a question arises: which one is Theseus' ship? The ship that he sees in front of him today or the ship that remains in his memory? Is the deck that he is walking on today the same deck as the one that he jumped on when he was fleeing from the Cretans years ago? Is the sail he sees today on the ship the same sail that years ago caused his father's death? Is that mast the same one he climbed to see his beloved Ariadne when he fell in love with her? Why does it matter which one is the real one? Is not what we see, touch, smell, hear, and taste the same thing as that which is in our minds? If we still use the same words to identify things, are they really the same? Or
is there something deeper, something behind the visual appearances that contains a
meaning that cannot be defined with words?
Design in the last few years has gone through a similar transformation. Words that
were used as recently as twenty years ago mean today something entirely different if
not antithetical. Technical terms that are used in design to convey a concept, a technique, or a process have changed in such a way that their meaning is completely different, leading to confusion, misunderstandings, and misconceptions. Let us consider
the process of design as a sequence of actions starting with an inspiration, followed
by a model that is then rendered and finally presented. So, we can say with a certain
degree of certainty that even today the process of design has a starting point in
the world of ideas and is progressively materialized into a more specific form that is
then sent for implementation. I will define these stages using the following words:
inspiration, modeling, rendering, and presentation. So, within this paradigm, design
starts with an idea, a concept, an inspiration that initiates the process. Then the designer needs to make the idea more specific by using a pencil and paper to sketch
out the main form. Then more details are added in order to produce a working model. Next, the designer is in a position to render the model in the order to convey the
material and formal qualities. However, in the world of design today the process of
modeling has been replaced by computer programs such as autoCAD, Rhino, or Maya.
What used to be done manually using paper and pencil has been distanced by using
a mouse and a virtual screen. Moreover, the process of producing a model has been
enhanced, corrected, altered, and modified often with no direct control by the user of
the software. Similarly, rendering used to be a tedious manual process involving artistic skills, perspective geometry, painting and, occasionally, collage, not to mention being time-consuming and expensive. Today, computer programs such as V-Ray, RenderZone, or Maxwell provide virtual reality representations that often exceed the real not
only in the accuracy of depiction but also in their ability to extend reality into artificial, illusory, and fantastic worlds. Meanwhile, the speed, efficiency, and cost of such
rendering mechanisms are far distanced from their original manual process. Further,
the techniques, processes, and methods of presentation of models have also altered so much from the world of manual presentation that the terms used today are of no help in denoting what really is happening and are therefore confusing and misleading. For example, there is little if anything photographic about Photoshop, and the word Illustrator offers no connection to the profession of an illustrator, at least as it is remembered some twenty years ago. Similarly, Rhino, Grasshopper, Maya, or Max
are part of a nomenclature that provides very little to address, define, and explain the
logic, structure, and potential of digital systems.
As a consequence, words in the vocabulary of the designer today have changed
meaning, especially with the emergence and application of computational methods. Most of the terms have been replaced by computational counterparts and we
should probably take that as a sign that something is happening in the world of design, something very important, fundamental, and profound that may have strong
influences and repercussions. From Photoshop filters to modeling applications, and from simulation programs to virtual reality animation, and even more mundane tasks that used to need a certain talent to take on, such as rendering, paper cutting, or 3D printing/sculpting, the list of tasks diminishes day by day as they are replaced by their computational counterparts. What used to be a basis for judging somebody a talent or a genius is no longer applicable. Dexterity, adeptness, memorization, fast calculation, or aptitude are no longer skills to seek in a designer, or reasons to admire a designer as a genius. The focus has shifted far away from what it used to be towards new territories. In the process many take advantage of the ephemeral awe that the new computational tools bring to design by using them as means to establish a new concept or form, only for it to be revealed later that their power was based on the tool they used and not on their own intellectual ability. After all, the tool was developed by somebody else, the programmer, who, perhaps, should be considered the innovator if not the genius.
As a result of the use and abuse of design tools, many have started to worry about the direction that design will take in the coming years. As, one by one, all design tasks are becoming computational, some regard this as a danger, misfortune, or misappropriation of what design should be, and others as a liberation, freedom, and power towards what design should be: i.e. conceptualization. According to the second group,
the designer does not need to worry anymore about the construction documents,
schedules, databases, modeling, rendering, animation, etc. and can now concentrate
on what is most important: the concept. But what if that is also replaced? What if one
day a new piece of software appears that allows one to input the building program
and it produces valid designs, i.e. plans, elevations, and sections that work? And, worse, what if they are better than anything the designer would ever do by himself or herself? Even though most designers would never admit that something is better than what they would have designed, what if deep inside they admit the opposite? What
then? Are we still going to continue demonizing the computer and seeking to promote geniuses when they really don’t exist? Or should we reconsider our knowledge,
terms, concepts, processes, and methodologies and seek for new answers rather than
old reassurances?
However, it may be that the above dilemma is important only because we are still considering design in terms of an old paradigm, one based on human intelligence and initiative. Is it possible that this paradigm is not valid any more? Is it possible that design is more than just a human activity and as such can be performed by non-humans? Let
me illustrate what I mean with a few examples: if I, as a designer, want to draw a dot
on a piece of paper, most likely what a person would do would be to take a pen or a
pencil and lower it on the canvas, marking a dot. But the process involves, apart from mechanical actions, an intellectual determination of the process of lowering the arm and pointing. Strangely enough, even though at first sight the process appears to be random, most of the process is predetermined in the brain as the hand moves down. The process can be said to be similar when using a digital tool instead of a pencil. Suppose you are faced with a canvas in Photoshop and you select a pen and then move
the cursor on the screen until you press down on the screen leaving a mark. I see little
difference between that and the physical process. Now, consider the following commands
on a computer system: x = 20, y = 30, and point(x, y). This will draw a point at location 20, 30. Now replace these commands with the following: x = random(0,100), y = random(0,100), and point(x, y). I assume that the canvas is 100x100 pixels wide. Also, I assume that a command called random(min, max) exists that can produce a number, unpredictable to me, within a range set between min and max. Now, there is a lack of control/prediction of where a dot will show up. I know that I will see a dot but it is
almost impossible to predict its location in advance. Consider also the following commands: x = random(0,100), y = random(0,100), and if x > 50 and y<50 then point(x, y).
Now, I am not only uncertain about the location of a dot on the canvas but I am not
even sure if I will see a dot at all. That is, in the case x <= 50 then point() will not be
activated. You may start to distinguish a difference between the human world and the
computationally driven random world. There is a thin blue line that separates the two.
The first is the human world with its intentions, mistakes, aspirations, etc., a world we have been familiar with for thousands of years. The second world is new, non-human, encountered for the first time; alien and strange. Please cross the line between
predictable and unpredictable.
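The three command sequences just described translate almost verbatim into any scripting language. The sketch below is a Python transcription (the original reads like Processing-style pseudocode); point() here simply records a dot on a notional 100x100 canvas rather than drawing it.

```python
# A minimal Python transcription of the three command sequences described above,
# assuming a 100x100 canvas; point() records the dot instead of drawing it.
import random

dots = []  # stand-in for the canvas

def point(x, y):
    """Mark a dot at (x, y) on the 100x100 canvas."""
    dots.append((x, y))

# 1. Fully determined: the dot's location is known in advance.
x, y = 20, 30
point(x, y)

# 2. Random placement: a dot will certainly appear, but its location is unpredictable.
x = random.randint(0, 100)
y = random.randint(0, 100)
point(x, y)

# 3. Conditional random placement: even the dot's existence is unpredictable.
x = random.randint(0, 100)
y = random.randint(0, 100)
if x > 50 and y < 50:
    point(x, y)

print(dots)
```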
Now let’s implement this theory using a simple human task, that of solving a puzzle.
Suppose that you are presented with a puzzle composed of ten pieces that eventually fit into a rectangular canvas. Any human, consider for example a child, will start by selecting the first piece and placing it in the canvas, then the next one and placing it, then the next, and so on until either all pieces match or, in case there is an impasse, taking out a piece or two and rearranging until a match is found. This process may take a few seconds or minutes depending on the complexity of the puzzle or the capabilities of the solver, and it is considered a task of human intelligence, or intelligence in general.
Now consider the following possibility. I take the pieces of the puzzle and toss them
in the air, let them fall and hope that a match is found. If it does not work I do it again;
and again; and again. Over and over; hoping for a match. What are the chances that
a match will occur? Most people will say impossible. And yet simple logic may reveal
that while a match is unlikely to come soon, there is a very small chance that it
may occur. Given enough time there is a possibility, once in a billion perhaps, that it
will happen. However, nobody will try this method mainly because there is no time
to wait. But with a computer such logic starts to be applicable. A billion for example
is not such a big number. Think of GHz: those are billions of cycles per second. So, let’s
try this process through a simulation shown to you on the screen. In the first trial it
took 1252 unsuccessful attempts to get a match, taking virtually only two seconds.
Next time it took 2619 unsuccessful attempts until a perfect match occurred. So, if you
were to choose between the two methods you are faced with the following dilemma:
should I employ my intelligence and take several minutes to solve the problem or use
a mindless chance mechanism and solve the same problem in just a few seconds? To
some people this is a very fundamental question.
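A minimal sketch of this "toss and hope" strategy is given below, assuming distinguishable pieces and a match defined as the pieces landing in one fixed order. With ten distinguishable pieces the expected number of tosses is 10! (about 3.6 million), so the counts quoted here (1252, 2619) evidently come from a puzzle with far fewer distinct configurations; the demo call uses six pieces to keep the run short.

```python
import random

def solve_by_tossing(n_pieces=10, seed=None):
    """Shuffle the pieces blindly until they happen to land in the target order;
    return the number of unsuccessful attempts before the match."""
    rng = random.Random(seed)
    target = list(range(n_pieces))  # the solved arrangement
    pieces = target[:]
    failures = 0
    while True:
        rng.shuffle(pieces)         # one 'toss' of all the pieces
        if pieces == target:
            return failures
        failures += 1

# 10 pieces => 10! ~ 3.6 million orderings; 6 pieces keeps the demo fast.
print(solve_by_tossing(6))
```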
Here is another problem, related to the previous one. In how many possible ways can we solve the puzzle? Are there infinitely many (I don't like that word) or is there a specific number of possible permutations? Let's take a set of 9 positions arranged in a 3x3 grid and assume that each position can be either black or white. What are the chances that a cross configuration will occur? By the way, the pattern we are looking for (when laid out) is 010111010. One way to find out is to start doing random configurations until we get a match. But that may involve repeated patterns and it may take considerably more time than using a different method. The second method uses a simple enumeration of permutations, starting with 000000000, then 000000001, then
000000010, and so on. The pattern we are looking for, that is 010111010, will occur
somewhere between 000000000 and 111111111. All possible combinations are 512,
or 2 to the power of 9. The pattern we are looking for comes after 325 attempts (depending on the method of enumeration) (Fig. 1).
Now in design, although not the same, we have a similar process. In design, and, in
particular, architectural design, the problem that a designer is called upon to solve can
be regarded as a problem of permutations, that is, the rearrangement of design elements within a set of discrete positions, such as a grid, until a solution is found that
satisfies a set of criteria. Traditionally, such arrangements are done by human designers who base their decision making either on intuition (from the point of view of the
designer) or on random sampling until valid solutions are found. However, in both
cases the solution found may be an acceptable one but cannot be labeled as “the best
possible solution” due to the subjective or arbitrary nature of the selection process.
In contrast, an exhaustive list of permutation-based arrangements will eventually reveal the "best solution" since it will exclude all other possible solutions. For example, consider the design of a simple bathroom in an architectural plan consisting of four fixtures: a sink, a toilet, a shower, and a door, arranged in a 2x2 grid. The slide (below)
illustrates all possible arrangements of such a simple four-fixture bathroom. The
number of non-repetitive, rotationally-specific arrangements is only 384. However, after eliminating all arrangements that have a toilet seat facing a door and eliminating
any arrangement that uses more than 6 meters of pipelines (i.e. choosing the least expensive ones) the number of successful bathrooms is only 8. It can be claimed therefore that these eight bathroom configurations are indeed the best possible ones since
they exclude anything else. Of course, we may have to redefine the term “best” and
apply it only to quantitative criteria and pertinent only to the number of possible permutations. In other words, given the number of all possible permutations, the resulting 8 are the ones that satisfy our constraining criteria and are therefore considered to
be the best (Fig. 2).
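The bathroom example can be read as a plain enumerate-and-filter loop. The sketch below is not Terzidis's program: the orientation model and the two constraint predicates (a toilet facing the door, a maximum pipe run) are simplified stand-ins invented for illustration, so the counts it prints will not reproduce the 384 and 8 quoted above; only the overall structure, generating every placement-and-orientation combination and keeping those that pass the constraints, is the point.

```python
# Schematic enumerate-and-filter sketch of the bathroom example (illustrative only).
from itertools import permutations, product

FIXTURES = ["sink", "toilet", "shower", "door"]
CELLS = [(0, 0), (1, 0), (0, 1), (1, 1)]          # 2x2 grid positions
ORIENTATIONS = ["N", "E", "S", "W"]

def toilet_faces_door(layout):
    """Hypothetical check: reject layouts where the toilet points at the door's cell."""
    (tx, ty), t_or = layout["toilet"]
    (dx, dy), _ = layout["door"]
    facing = {"N": (tx, ty - 1), "S": (tx, ty + 1), "E": (tx + 1, ty), "W": (tx - 1, ty)}
    return facing[t_or] == (dx, dy)

def pipe_length(layout):
    """Hypothetical cost: taxicab distance of the wet fixtures from the corner riser."""
    return sum(abs(x) + abs(y) for (x, y), _ in
               (layout["sink"], layout["toilet"], layout["shower"]))

valid = []
for placement in permutations(CELLS):                 # which fixture goes in which cell
    for orients in product(ORIENTATIONS, repeat=4):   # one orientation per fixture
        layout = {f: (placement[i], orients[i]) for i, f in enumerate(FIXTURES)}
        if not toilet_faces_door(layout) and pipe_length(layout) <= 3:
            valid.append(layout)

print(len(valid), "layouts survive the two constraints")
```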
Let’s now take another example. Consider a sample architectural problem, relatively
simple for the time and size of this essay. I will try to demonstrate the use of permutations as a method for the automatic generation of building plans. In this case, consider a site (b) that is divided into a grid system (a). Let’s also consider a list of spaces
to be placed within the limits of the site (c) and an adjacency matrix to determine the
placement conditions and neighboring relations of these spaces (d). One way of solving this problem is to stochastically place spaces within the grid until all spaces fit
and the constraints are satisfied. The slide (below) shows such a problem and a sample solution (f) (Fig. 3).
Fig. 2
All the possible arrangements of a simple four-fixture bathroom (a). These non-repetitive, rotationally-specific arrangements are 384. However, after eliminating all arrangements that have a
toilet seat facing a door and eliminating any arrangement that uses more than 6m of pipelines (i.e.
choosing the least expensive ones) the number of successful bathrooms is only 8 (b).
Fig. 3
A site (b) that is divided into a
grid system (a), a list of spaces
to be placed within the limits
of the site (c) and an adjacency
matrix to determine the placement conditions and neighboring relations of these spaces (d).
A sample solution is shown in (f).
So, let’s run this algorithm and see the results. As you can see after 274 random attempts a solutions is found. If we do it again, another solution is obtained and so on.
According to this algorithm, each space is associated to a list that contains all other
spaces sorted according to their degree of desirable neighborhood. Then each unit
of each space is selected from the list and then one-by-one placed randomly in the
site until they fit in the site and the neighboring conditions are met. If it fails then the
process is repeated. Since the total number of units of all spaces is equal to the site’s
grid units, there will always be a fit. To illustrate the point, in (a) nine randomly generated plans are shown as a result of this algorithm. Then each plan is extruded into
an architectural structure (b) to be potentially stacked into floors. While the algorithm
can generate many different solutions, as shown in (d), my research will seek to produce all possible solutions, that is, all possible permutations. If that happens, then we can select from the exhaustive permutation list the ones that best fit the programmatic, economic, ecological, aesthetic or other criteria of the client.
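A simplified sketch of the stochastic placement loop described above is given below; it is not the author's code. The site size, the space program and the adjacency wishes are invented demo values, and the acceptance test only checks edge-adjacency, but the generate-test-retry structure is the one the text describes.

```python
# Stochastic placement sketch: scatter space units over a grid site at random,
# keep the first layout that satisfies the adjacency wishes (demo values only).
import random

SITE_W, SITE_H = 4, 4
SPACES = {"living": 6, "kitchen": 3, "bed": 4, "bath": 3}   # units per space, total 16
ADJACENT = {("living", "kitchen"), ("bed", "bath")}          # pairs that must touch

def random_layout(rng):
    """Assign every grid cell a space label, respecting only the unit counts."""
    cells = [name for name, units in SPACES.items() for _ in range(units)]
    rng.shuffle(cells)
    return {(x, y): cells[y * SITE_W + x] for x in range(SITE_W) for y in range(SITE_H)}

def touches(layout, a, b):
    """True if some unit of space a shares an edge with a unit of space b."""
    for (x, y), name in layout.items():
        if name != a:
            continue
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if layout.get((nx, ny)) == b:
                return True
    return False

rng = random.Random(0)
attempts = 0
while True:
    attempts += 1
    plan = random_layout(rng)
    if all(touches(plan, a, b) for a, b in ADJACENT):
        break

print(f"valid plan found after {attempts} attempts")
```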
While the previous example is quite simplistic compared to the number of possible
arrangements involved in an actual architectural design, nevertheless it illustrates the
potential of a system of permutations and presents a different way of approaching
architectural design. As I mentioned earlier, the speculation of this work is to detect,
test, and implement the use of exhaustive permutations as a means of synthesis for
architectural plans of buildings. Such an effort involves the risk of increased complexity as the number of permutations increases exponentially. While the number of all possible permutations can be pre-estimated using polynomial theory, the actual production of these arrangements may involve algorithms that are NP-complete, that is, possible to solve but requiring an extremely long time to execute. As an alternative,
brute force techniques can be used instead but, of course, those would depend on
the computational power of the computers used.
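On pre-estimating the size of the search space: for a site of n grid cells filled by spaces with fixed unit counts, the number of distinct labelings is the multinomial coefficient n!/(u1!·u2!·…), which can be computed before any enumeration is attempted. The figures below use the invented demo program from the previous sketch, purely to show how quickly the count grows.

```python
# Pre-estimate the number of distinct grid labelings as a multinomial coefficient,
# n! / (u1! * u2! * ...), before attempting any enumeration. Demo values only.
from math import factorial

def count_labelings(unit_counts):
    n = sum(unit_counts)
    total = factorial(n)
    for u in unit_counts:
        total //= factorial(u)
    return total

print(count_labelings([6, 3, 4, 3]))      # 16-cell demo site: 33,633,600 labelings
print(count_labelings([20, 15, 15, 14]))  # a modest 8x8 floor is already astronomical
```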
I would like to conclude now with a few arguments based on my essay so far. The computer is not a tool. It is an intellectual entity and as such can simulate human thinking,
producing inferior, similar, or even superior results to those of a human mind. Some
people do not like that comparison and perhaps there is a merit in that assessment.
Then perhaps a better way to describe the computer is that of complementary, alien,
or different. Perhaps it has a different way of thinking, a new way, a strange way. I
would like to believe that it is complementary in the sense that it can address many of
the things we cannot, or better, do not have enough time to deal with. So you should
not treat it as a tool. That is my advice. Treat it as something else; a different thing.
This brings me to my next point; that of the human mind. I am afraid that it is limited.
Whether we like it, believe it, or accept it, it is true. Factually true. If you do not believe
me try dividing 22 by 7. Or try to plot all the connections on a social network. Or think
of architectural complexity in a skyscraper. We are not as smart as we think. Yet, honestly, I do not want to be too smart, I do not want to learn everything, do everything.
But I would like to know that one day I can break out of that limited world and do
something more. Not alone but with help. Well, that is what the computer is: a ticket
to that world. It is not a device that replicates what you already know. That would be
redundant and useless. That would simply be like re-inventing the wheel.
Fig. 4
Three stochastically generated plans that fulfill the requirements of the architectural program
(a). These plans were then extruded (b), and stacked into a building (c). In contrast, (d) shows a
portion of what all possible solutions would look like.
Unfortunately, that is what some designers do. Many think that computers are screens that
replace light rays with pixels. I would like to suggest doing things that you cannot do
or think in this world. Try to reach areas of intellectual capacity that you do not have
but can obtain through a computer. Try to involve them in doing things better than you can do yourself, especially when it comes to random processing, where you can have
things happen that you cannot predict. That is the true essence of what a computer is
or does. We should stop this whole idea that computers are inferior to human intellect
as if there is some sort of a competition going on so as to prove to our colleagues that
we are superior to the machine. It is ridiculous and should not even happen.
Finally, I would like to offer an experiment that was established as a test of intelligence and is referred to as the Turing test. According to the experiment, if something, no matter what it is made out of or looks like, behaves convincingly as intelligent, then it is so. By definition. So, if a computer offers you a solution that inspires an awe of intelligence, maybe that is the case. In the original experiment, which is a theoretical one so far, a human converses with an entity that is hidden on the other side of a partition, without knowing whether it is another human, a computer, or something else.
The point of the experiment is to detach the eyes from the form or connotation that is
usually associated with intelligent beings; those could be non-human provided they
pass the test. You do not know what you are talking to: it could be a human or a computer. But you can’t see it. You cannot be influenced by the form, shape, or voice of
your interlocutor. If intelligence is what you are seeking, then its container should
be irrelevant.
So, that being said, I would like to offer a concluding remark: something significant is going on, and digital culture is emerging as a new prism for looking at the world, asking us to redefine almost all of our established terms. This may be the biggest opportunity ever in the history of humanity, so please do not miss it.
Reflections
Chris Younès
École Nationale Supérieure d'Architecture
Paris-La Villette
France
Towards
a Responsive Architecture:
Paradox of the Metamorphoses
in Play
Like the sciences of nature and man, architecture has been overtaken by the techniques of information and communication. Professionals can no longer ignore these
means that not only offer significant savings in time and effort for those using them
but also help them gain in performance. But what kind of performance are we talking
about? The digital tool is also the bearer of a certain vision of the world or at least a
certain conception of the relationship between man and nature, between the subject
and the object on which it acts. We must not forget that it was the bow and the arrow which led to man’s dominance over animals, and that the lever inspired the idea
of shifting the earth from its location, a hypothesis briefly evoked as being fact by
Descartes.
Re-thinking and Exploring the Possible
between Rationalization, Imagination and Mutation
The different communications focus mainly on the possibles to explore, experiment
and transmit in the training of students. The rationalization of the process of design
and production in architecture, thanks to the contributions of artificial intelligence,
highlights other modalities of the project itself. And this by instrumental, methodical
and conceptual mediations which determine it as both techniques of anticipation and
critical assessment. It should be noted that according to Antoine Picon numerous aspects of digital design are linked to different episteme, from scientific rationalisation
to digital imagination, thus underlining the tensions between mobilized theoretical
models.
Rethinking humanity is to better situate and grasp the place which is given to
these technologies in the work of architects. Digital techniques are still too recent to
reveal all their secrets, but already they have turned upside down the relationships
between people and nations by reducing space, by multiplying contacts between individuals, by displacing borders and by shrinking the planet to the size of a village.
Maybe their specificity is even more striking in that they seem to interact even more
on humans rather than on the objects on which they act. This is so true that for the
first time in its history, homo sapiens are in competition with machines on what has
always set them apart: their intelligence. We have known and have often celebrated,
since the beginning of modernity, the marvels which represented automatic machines: but these wonders were barely more than their displacement or their carrying
out a few actions, or uttering a few words which we could consider with amusement,
as they were so far away from being capable of imitating the least complex behaviour
of animals, let alone that of humans. And yet, nowadays it is a completely different
story, as in a certain manner, the computer has extended the brain of man and software has become indispensable in activities such as writing, calculating, drawing or
even building, but also that of living. The Internet network is becoming increasingly
closer to the human brain and we are already asking ourselves whether it will be able
to develop in an autonomous manner or even manage to recreate itself. We can even
go as far as imagining that one day computers will relieve humans from intellectual work in the same way that the first machines relieved humans from manual work.
Furthermore, a disembodied experience risks substituting itself for a perceptive and
emotional experience in a body situated here and now. Where are men and women in
all this? It should be noted first of all that there is a notable difference between classical technique, that of the invention and use of machines, and computer-based techniques. The first concerns a diptych: spirit and matter. The design and construction of
machines was an admirable testimony of Plato’s thoughts. It is indeed the spirit which
triumphs over matter. It is the sign of an ‘unveiling’ of man in which intelligence masters and overcomes materials. With information technology, the concept of materials
becomes hazier. Man now finds himself in a new and shifting landscape, that of the
virtual, with the risk, when leaving the sensitive to install himself in the intelligible,
that the body is no longer needed. Husserl has in particular analyzed how sciences are
putting into peril1 a certain relationship in the world by veiling the carnal world. The
construction of this never before seen disparity has left humans disorientated, confronted by a world of ideas which are distinct from the sensitive world which nevertheless gives it its shape. What asserts itself is the power of constructing the human
spirit which attempts to give the illusion that it is reality itself.
The Issue of Management by Creative Synergies of another Type
The key question of 'responsive architecture' implies a strong preoccupation in terms of managing life spaces. Processing complexity, installing clouds of sensitive sensors, favouring interactivities, collaborative forms, finding balances, should contribute to meeting the challenge of a sustainable architecture whilst at the same time transcending the imposed break between spirit and body.
Concerns linked to the devastations of ecosystems and heritages, to the finiteness
of planet earth and the precariousness of human life threatened by a crazy excessiveness have led numerous contributors to ask themselves what are the appropriate
strategies for transforming living spaces and what sustainable relationships are to be
established between nature, techne and society. It is only in the artificial, which means
‘made by art’ (arte facere), designating skilfulness, know-how, cunning, that there is
the possibility of a menacing promethean desire but also the possibility of inventing
symbiotic mechanisms.
Education is particularly necessary when imagining other possibles, when establishing synergies of another type between the dynamics of culture and those of nature, by taking into account both the limits and resources specific to each milieu. This
requires the deciphering of the ‘already here’, the harnessing of coming forces. The
act of creation is also an act of resistance as stated by Gilles Deleuze2 against forms of
deadly exploitation. All the more so since an architect is always confronted by what he
or she is responsible for regarding the habitation of his or her fellow humans. It should
not be forgotten that no technique is capable of acting on ethics. This issue is that
of meaning, the senses, and values. Like all inventions, information technologies are
a double-edged sword which can liberate man or enslave him. It is therefore vital to
take measure today.
Definition and Indefinition of Humans3
The metamorphoses which operate with and within digital culture invite us to refer
back to anthropology. Is it the swansong before being swept along by waves or a true
watershed which expresses itself with the concern of building with care the habitation of humans, in other words by grasping at the same time all the environmental, economic, social, political, technical, ethical and aesthetic dimensions?
The question of a ‘technology-driven architecture’ naturally has a relationship with
the definition of man, which has been endlessly deferred and postponed. The ancient
definitions have not purely and simply become outdated but they are inscribed in the
sustainable future and form part of a quest which is endlessly renewed and which engages the infinite existence of the search for meaning.
Notes
1 Husserl, E., La crise des sciences européennes, Gallimard, 2004.
2 Deleuze, ‘What is the act of creation’, conference at FEMIS, 1997.
3 Goetz, B., Madec, Ph., Younès, C. Indefinition of architecture. La Villette, 2009.
Emmanouil Zaroukas
Computing and Design, CECA
University of East London
UK
Hacking the Symb(i/o)tic field
of Architecture
All Watched Over By Machines of Loving Grace
I like to think (and
the sooner the better!)
of a cybernetic meadow
where mammals and computers
live together in mutually
programming harmony
like pure water
touching clear sky.
I like to think
(right now, please!)
of a cybernetic forest
filled with pines and electronics
where deer stroll peacefully
past computers
as if they were flowers
with spinning blossoms.
I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.
Richard Brautigan, 1967
In 1967 Brautigan had a very clear view of how the natural world would be in harmony with post-war technological developments. The human, freed from labor, would be able to sustain a balanced and peaceful world in an ecology watched over by machines: the dream of machines of control that would eliminate the irrational aspect of human beings by taking away from them the privilege to decide, while being able to think for themselves. Forty-five years later, the machines of loving grace are established in almost every aspect of human life. Architecture as pedagogy, as research, as practice and as inhabitation is flooded with technological machines that watch and control the complexity of contemporary life. Technology is here forming a new context, and Brautigan's insight has become reality. Building on the developments of the last sixty years in systems theory, cybernetics and complexity theory, architects discuss, design and build an architecture that is adaptive, performative, parametric and responsive. This approach rests on the basic premises of Brautigan's view: a view that understands any entity in a systemic manner, capable of interacting, communicating and being affected within the environment in which it is embedded.
This essay aims to further explore this relation of an entity to its environment in order to rethink the human in a technology-driven architecture. To that end it takes a philosophical and theoretical position and brings together two concepts rather peculiar and unfamiliar to the discipline of architecture: that of symbiosis and that of hacking. The concept of symb(i/o)sis is proposed as a framework for thinking the human as meshed in a complex machinic assemblage, and the concept of hacking as an ethos that allows the creation of new assemblages and the exploration of their capacities. The aim of the essay is therefore to rethink Brautigan's insight under a new ontological and epistemological perspective. Diagramming processes of emergence, following the dynamism of assemblages and moving through the ontological scheme of the French philosopher Gilles Deleuze, the question of the emergence of the organism in the environment will be reposed.
The first part of the essay unfolds three types of emergence, the synchronic, the diachronic and the transversal, as they have been read by John Protevi. Before giving a full explanation of them, two moves in ostensibly opposite directions will be made. A short description of the basic concepts of Dynamical Systems Theory (DST) and a rough sketch of Deleuze's method of dramatization will be coupled, not only to reveal a shared common ground but in order to create a new framework in which the concept of emergence will be re-examined. It is not the aim of this essay to trace a historical development of the concept of Emergence1.
The second half of the essay becomes more specific, and the discussion focuses on the use of computers and computational processes in the way architects design and people inhabit architecture. The concept of Emergence, and the paradigmatic shift in the ontological and epistemological approach that this conception entails, positions the architect in a new domain. In this domain the architect could take the role not of the mere designer of a system but that of a constructive hacker. The idea of hacking is introduced not as a disruptive but rather as a constructive praxis, bringing new things together and opening new domains. Hacking, therefore, stripped of any connotation of criminality, becomes a highly critical act in the symb(i/o)tic field of architectural design.
Emergence
The concept of emergence has no doubt gained a lot of attention in architectural academia and industry despite its conceptual density in scientific domains. A property or a behavior of a whole is emergent when it is produced by the interaction and self-organisation of component parts at a lower level. Emergence hardly has any rigid definition, and its concept has still not been fully explained. It comes in different flavors, at times as weak, strong, synchronic, diachronic or other, as different disciplines and scientific fields attempt to grasp it. Architects and architectural research groups around the globe tend to use a 'unified' conception that entails a bottom-up approach to their design methodologies, where an unpredictable new pattern or form occurs through low-level local interactions without its occurrence being externally commanded. The scope of this essay is to consider Emergence under the perspective of Complexity Theory and Dynamical Systems Theory.
Paul Humphreys distinguishes two broad categories of Emergence: the Diachronic and the Synchronic. "The first approach primarily but not exclusively, emphasizes the emergence of novel phenomena across time; the second emphasizes the coexistence of novel 'higher level' objects or properties with objects or properties existing at some 'lower level'." (Humphreys 2008, p. 431). Synchronic and Diachronic emergence appear to have different time spans and refer to different processes, although they are not completely separate. John Protevi attempts to hold those processes together in their distinction by giving his own definition: "The construction of functional structures in complex systems (diachronic) achieves a (synchronic) focus of systematic behavior as they constrain the behavior of individual components." (Protevi 2006, p. 19).
The articulation of the concept of Emergence appears to be crucial in the work of Gilles Deleuze, although it is not explicitly referred to by him. Deleuze's interest in Emergence comes through two main sources: firstly, his preoccupation with Leibniz's philosophy2, monadology and differential calculus, and secondly, his interest in the almost concomitant research on complex systems and the philosophy of technology developed by Gilbert Simondon (1989). Deleuze constructs a method in which emergence has an important role in the schematization and construction of his ontology. In a paper delivered in 1967 he calls it the method of Dramatization. Later, in the 1980s, Deleuze joins with Felix Guattari and together they push the concept of dramatization further in order to construct a new category, that of transversal emergence. This third category will be explained in some detail later, but for the moment it suffices to note that Deleuze and Guattari are interested in the movements that travel transversally, forming assemblages from biological, social and technical components.
Dynamical Systems Theory
Avoiding any mysticism in the attempt to grasp the mechanisms of emergence, a group of neo-realist philosophers and scientists turn to Dynamical Systems Theory to explain emergence as the formation of attractors in complex dynamic systems. In sketching out the basic concepts of the theory, a common origin will also be established from which Deleuze's method of Dramatization develops. That origin goes back to Henri Poincaré, one of the founders of the theory of dynamical systems, who developed the geometric model for studying physically, biologically or socially observed systems (Abraham and Shaw, 1985a).
According to this theory every observed system can be modeled by being defined in terms of "interesting variables" or "degrees of freedom". A damped pendulum, having two degrees of freedom related to its movement, the velocity and the offset angle, is a good example to illustrate this. It is clear that the "interesting variables" are relevant to the ways in which the object can change. The choice of variables is up to the modeler and the level of abstraction of the system under study. Every system has a phase space in which the range of the system's behavior is represented and which is constructed using a manifold, an n-dimensional mathematical object. In the case of the damped pendulum the manifold is a two-dimensional object. Any point of the manifold represents the global condition of the system. If the system is followed across time, the different states the system goes through construct its trajectory. A computer simulates the process and maps together in the same graph the relation of the velocity and the offset angle. What is mapped is basically the behavior of the system and the construction of trajectories.
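The following is a minimal computational sketch of such a simulation, written here in Python purely for illustration (the parameter values, step size and initial condition are assumptions of this example and are not taken from Abraham and Shaw): it integrates a damped pendulum and records its trajectory in the (angle, velocity) phase space, where the points spiral towards the point attractor at the origin.

import math

# Minimal damped pendulum simulation (all values are illustrative assumptions).
# State: offset angle theta [rad] and angular velocity omega [rad/s].
g, length, damping = 9.81, 1.0, 0.5      # assumed physical parameters
dt, steps = 0.01, 3000                   # assumed integration settings

theta, omega = 1.0, 0.0                  # assumed initial condition
trajectory = []                          # points in the (theta, omega) phase space

for _ in range(steps):
    # Equations of motion: d(theta)/dt = omega,
    # d(omega)/dt = -(g/length)*sin(theta) - damping*omega
    alpha = -(g / length) * math.sin(theta) - damping * omega
    theta += omega * dt                  # simple Euler step
    omega += alpha * dt
    trajectory.append((theta, omega))

# The trajectory spirals towards the point attractor at (0, 0).
print(trajectory[0], trajectory[-1])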
A more complex example is a predator-prey computational model, in which trajectories of a particular configuration are created (Abraham and Shaw 1985a, p. 83). These configurations have a specific shape, called an attractor, representing a pattern of behavior of the real system. Attractors can be distinguished into three types (a minimal computational sketch of the second type follows the list):
1. Point attractors (stable or steady-state systems), e.g. crystal growth
2. Circular or periodic attractors (oscillating systems), e.g. any predator-prey system
3. Chaotic or strange attractors (turbulent or 'chaotic' systems).
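As a rough illustration of the second type, the sketch below simulates a standard Lotka-Volterra predator-prey model (the equations are the textbook ones, but the coefficients and initial populations are assumed for illustration and are not the specific configuration discussed by Abraham and Shaw). Plotting prey against predator numbers traces a closed loop in the phase space, the kind of oscillating behavior associated with a periodic attractor.

# Minimal Lotka-Volterra predator-prey model (coefficients are illustrative assumptions).
a, b, c, d = 1.0, 0.1, 1.5, 0.075        # prey growth, predation, predator death, conversion
dt, steps = 0.001, 50000

prey, predator = 10.0, 5.0               # assumed initial populations
orbit = []                               # points in the (prey, predator) phase space

for _ in range(steps):
    dprey = (a * prey - b * prey * predator) * dt
    dpred = (-c * predator + d * prey * predator) * dt
    prey += dprey
    predator += dpred
    orbit.append((round(prey, 2), round(predator, 2)))

# The points trace a closed cycle rather than settling to a single state.
print(orbit[0], orbit[len(orbit) // 2], orbit[-1])

Forward Euler drifts slightly over long runs; a smaller step or a Runge-Kutta integrator would trace the cycle more faithfully, but the periodic pattern is already visible here.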
In the case of the damped pendulum a point attractor, approached by a spiraling trajectory, is observed. There are cases, however, where the attractor can be either periodic or chaotic3. The layout of attractors in the phase space, which describes the layout of the patterns of behavior of the system, is defined by the layout of singularities, which are mathematical objects that define the topological structure of the manifold. A singularity, in mathematical terms, is a point where "the graph of the function changes direction... as it reaches local minima or local maxima" (Protevi 2006, p. 24). A singularity is a turn that represents a threshold where the real system changes qualitatively.
This singular turn defines the limit of the basin of attraction and therefore the existence of an attractor. Unlike the trajectory, which is composed of actual states of the system, a singularity belongs to a vector field of directions composed of values of rates of change at given instants. At the zones of crisis where the system is perturbed there is an irreducible element of 'chance' or unpredictability, whereby the system can move into different basins of attraction under the effect of external triggers. Therefore "[s]ingularities may undergo a symmetry breaking transition and be converted into another one. These transitions are called bifurcations." (Delanda 2002, p. 18). Chance and unpredictability are constitutive properties of any dynamic system, which is nonetheless thoroughly deterministic. A state space structured by one point attractor, for example, may bifurcate into another with two such attractors, or a point attractor may bifurcate into a periodic one.
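A minimal numerical illustration of such a symmetry-breaking transition (the system and all values here are assumed for the example and are not drawn from the cited sources) is the one-dimensional flow dx/dt = mu*x - x^3: for mu < 0 every trajectory settles on a single point attractor at x = 0, while for mu > 0 that attractor bifurcates into two point attractors at plus and minus the square root of mu.

# Pitchfork bifurcation sketch: dx/dt = mu*x - x**3 (illustrative, assumed system).
def settle(x0, mu, dt=0.01, steps=5000):
    # Integrate from initial state x0 and return the state the trajectory settles into.
    x = x0
    for _ in range(steps):
        x += (mu * x - x ** 3) * dt
    return round(x, 3)

for mu in (-1.0, 1.0):
    finals = {settle(x0, mu) for x0 in (-2.0, -0.5, 0.5, 2.0)}
    print("mu =", mu, "-> trajectories settle at", sorted(finals))
# Expected: for mu = -1.0 a single attractor near 0;
# for mu = 1.0 two attractors near -1 and 1 (the symmetry is broken).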
What has been illustrated so far is that the interesting variables, when they are coupled in differential relations, like that between the velocity and the offset angle in the case of the damped pendulum, can produce different mathematical entities that describe the system at every instant. However, it is the production of the singularities and their respective attractors that explains the mechanism of the emergence of the behavior of the system4.
Sketching Dramatization
The description of the main concepts of dynamical systems theory sets the basis for an understanding of the method of dramatization and of Deleuze's interest in opening up ontology to creation and novelty. It is around the dynamic aspect of the world that he constructs an entire ontological scheme. What follows is a rough sketch of the method of dramatization, which in the course of the essay will be explained further through the different types of emergence.
“... [Dynamisms] always presuppose a field in which they are produced, outside of
which they would not be produced... this field is intensive, that is it implies differences
of intensity distributed at different depths" (Deleuze, 2002, pp. 96-97).
Deleuze remarks that experience accesses the world as already developed in extensions, that is, as fully formed entities, covering at the same time all intensities under qualities. Those pure intensities which are "enveloped in an intensive spatium" (2002, p. 97) are everything that exists prior to the formation of extensions and qualities, parts and species. Intensity is considered as "the unequal in itself" (2002, p. 97). It is pure difference in itself formed in series. At the depth of this spatium the intensive field of individuation is constituted. This field is populated by series of intensities. Those series, however, enter into communication, and the task of bringing the series into a coupling is taken on by an important operator in Deleuze's scheme. This operator is called the dark precursor, and its role is to bring differences into communication, which, in turn, will differentiate further, producing "disparate series" (2002, p. 98) or differences of differences. The coupled intensities 'resonate' and the further differentiation causes "inevitable forward movements" (2002, p. 98).
Subjects are not completely absent from the scene of dramatization. The Deleuzian subjects are more patients than agents. They are rough sketches rather than fully qualified and composed organisms. The larval subjects are the ones that can sustain the dynamisms of the intensive field. Deleuze turns to embryology and speaks of the embryo as the larval subject that is the only one capable of resolving the dynamisms of the egg.
“Dynamisms... constitute not so much a picture as a group of abstract lines coming
from the unexpected and formless depth. These abstract lines constitute a drama...”
(2002, p. 98).
Deleuze elaborates his scheme further and reveals the "two inseparable aspects of differentiation – specification and partition, the qualification of species and the organisation of an extension" (2002, p. 99). These two aspects, although inseparable, are, on the one hand, the differential relations among elements and, on the other hand, the distribution of the singularities. So what he calls an Idea5 should be looked at from two points of view: from the point of view of the variation of differential relations and from the point of view of the distribution of singularities corresponding to a particular set of values. These two points of view of the Idea help to concretise two issues. First, Ideas have no actuality; they are only virtual6. The communications that form the differential relations and the distribution of the singularities, which construct the vector field of the dynamic system, are different mathematical entities (real but not actual) from the integral curves or trajectories which trace the actual states of the system. The multiplicity of the Idea is what for Deleuze constitutes his ontological modality of the virtual. The second issue clarified is that Ideas are actualised "precisely only insofar as its differential relations are incarnated in species or separate qualities, and insofar as the concomitant singularities are incarnated in an extension that corresponds to this quality" (2002, p. 100). These two operations of course bear no resemblance whatsoever to the incarnated qualities or extensions. The differential relations between entities at one level do not resemble the qualities of the emergent outcome, and in the same way the distribution of singularities is not represented in the formed extensions. For this reason this virtual part of the entities is independent of the underlying mechanism of emergence. Manuel Delanda states that emergent phenomena can be explained through a mechanism-independent approach. Deleuze states, on the same matter, that the virtual realm formed by the dynamisms in the intensive field is "not connected with any particular example borrowed from a physical or biological system, but articulate the categories of every system in general" (2002, p. 98).
To conclude this sketchy presentation of the method of dramatization it will be
necessary to get to the actual realm; the formed entities or objects with extensions
and qualities. It seems, therefore, that for Deleuze things have two halves: the virtual
half (or Ideal) that is constituted by the differential relations and the distribution of
singularities, and the actual half constituted by the qualities and parts. If the latter half
has the capacities to affect the environment in which those entities are embedded,
the former half has the capacity to be affected by it. For that matter the method of dramatization and the construction of the ontological modality of the virtual favor openness, change, contingency and novelty. The two processes of differential relations and differentiation, under the operation of the dark precursor and the distribution of singularities, "expresses not only a mathematico-biological complex, but the very condition of any cosmology, as the halves of the object" (2002, p. 102) and guarantee the openness of the object to change. Based on this view, the concept of emergence will be re-approached and the whole relationship of an entity to its environment will be re-framed.
Synchronic Emergence - Diachronic Emergence - Transversal Emergence
John Protevi gives a definition that grants both the difference and the inseparability of Synchronic and Diachronic emergence: "A synchronically emergent structure is that which enables focused systematic behavior through constraining the action of component parts" (Protevi 2006, p. 19). It could be argued that Synchronic emergence could be considered as the emergence of 'order out of chaos', as Ilya Prigogine and Isabelle Stengers wrote in 1984. The synchronic level of emergence mainly focuses on the systemic behavior as the functional structures constrain the behavior of the individual components. This mereological level distinction is relevant for the case of synchronic emergence and explains the relation between the properties of the micro-level and the emergent properties of the macro-level. In other words, synchronic emergence focuses on part-to-whole relations. This mereological understanding
has caused confusion regarding the direction of determination, whether from the whole or from the components. The argument holds for that group of researchers who grant a mutual operation of both bottom-up and top-down processes for the local/global determination of entities. John Protevi takes a critical stance towards this position, very much inspired by Deleuze and especially by Delanda's reading of the latter. Protevi points out that the upward causality (bottom-up) and the downward causality (top-down), which are braided together in what is known by Varela and Thompson as 'reciprocal causality', involve a problematic reading of causation in the understanding of synchronic emergence. This problematic reading arises from an implied purposefulness, a necessity of self-reproduction and self-preservation of the process, which locks the system in circularity and grants it its totality and its identity. It thereby creates an understanding of downward causation as an efficient causality that emanates from a reified totality whose purpose is its self-preservation. He goes on to claim that the mutual constitution of upward and downward causality can be accepted as long as a reworked definition of downward causality is possible. He searches for this reworked understanding of downward causality in Deleuze's reading of the Stoics and in Delanda's clarification of the Deleuzian concept of 'quasi-causality'.
Delanda argues that Deleuze approaches the question of necessity by splitting the
causal link:
“One of the boldest moments of the Stoic thought involves the splitting of causal
relation. Causes are referred in depth to a unity, which is proper to them, and effects
maintain at the surface specific relations of another sort. Destiny is primarily the unity
and the link of physical causes among themselves. Incorporeal effects are obviously
subject to destiny, to the extent that they are the effect of these causes. But to the extent that they differ in nature from these causes, they enter, with one another, into relations of quasi-causality. Together they enter into a relation with a quasi cause, which
is, itself, incorporeal and assure them a very special independence, not exactly with
respect to destiny, but rather with respect to necessity, which normally would have to
follow destiny. The Stoic paradox is to affirm destiny and to deny necessity” (Deleuze
1969, p. 194).
Emmanouil Zaroukas UK
79
Delanda fleshes out this point by using terminology from dynamical systems theory and the method of dramatization: "[On the] one hand, processes of individuation are defined as sequences of causes (every effect will be the cause of another effect) while singularities become pure incorporeal effects of those series of causes" (Delanda 2002, p. 52). The upward causation, that is, the bottom-up determination, is roughly identified at this point. Delanda, after Deleuze, goes on by substituting quasi-causality for downward causation. He continues: "on the other hand, these pure effects are viewed as having a quasi-causal capacity to affect causal processes" (2002, p. 52). This reworking of causality through the split gives Deleuze the capacity to separate determinism, which links causes to other causes, from strict necessity, from a final cause.
With the introduction of quasi-causality, Protevi claims that Deleuze dispenses with the false problem of downward causation conceived "as an efficient causation issuing from a reified totality" (Protevi 2006, p. 30), whether this is the mind, the organism or society; in other words, totalities that imply necessity. It is the distribution and formation of singularities and attractors that resolve the issue of downward causation. Therefore, it is quasi-causality that funnels the trajectories into a basin of attraction. In the method of dramatization the operation of quasi-causality was named the dark precursor, the ontological entity that brings disparate series into communication. "The task which the quasi-causal operator must accomplish is to create among the infinite series springing from each singularity 'resonances and echoes' that is the most ethereal or least corporeal of relations" (Delanda 2002, p. 84). What Delanda, after Deleuze, names resonances and echoes is none other than the resonances and forward movements mentioned in the method of dramatization. To gain a more detailed view it is appropriate to look more closely at the side of complexity theory. While resonances can be understood as mutually stimulating couplings of divergent series, the notions of echoes or forward movements are more associated with the positive feedback that ramifies the series emanating from the singularities7.
Those forward movements allow differences of intensity "to ensure their affirmative divergence" (2002, p. 205), that is, the positive feedback loops that keep the system open and able to ramify its differences further. Therefore, while the forward movements ensure the bifurcations of the system, which create new singularities, in other words through which the system learns new skills, resonances induce convergence in the series, capable of actualising a larval subject in extensions and qualities. Those forward movements are contained in every primitive communication among disparate series and are only possible due to the presence of 'fluctuations', the behavior of materials near a phase transition (2002, p. 86). These fluctuations may effect a correlation of the disparate series (rates of differences). In every material system there are variables that are not fixed but rather fluctuate around a singularity. When the system is poised in an equilibrium state those fluctuations are minimal, and so is the potential of their communication. In states where the system is heading towards a phase transition, the fluctuations increase up to a critical point where the potential of transmitting information is maximized. At this point the system maximizes its potential to 'evolve', to change and to restructure its phase space by producing new singularities, new attractors. Two issues are important to mention at this point: first, the capacity of systems to approach but not to actualize this transition, and
second, the fact that this phenomenon is independent of the physical mechanisms underlying it. In other words, the singularity itself is "mechanism-independent" (Delanda, 2011). The dark precursor, with its quasi-causal operation on the disparate series, allows systems to break free from necessity and to open to change and novelty, which is described by the second type of emergence, called diachronic8. At the level of diachronic emergence systems evolve by learning new patterns of behavior, by acquiring new skills. What Deleuze calls the event, which dramatises, is this truly creative process that opens to novelty. The series formed by differential relations resonate with other series via affirmative divergence, keeping the system open to variation and to the creation of new attractors and new singular points. It is the production of new patterns and thresholds of behavior that matters. It is for that reason that Deleuze's ontology is an ontology of the virtual: a philosophy half plunged in the virtual and half in the actual realm.
If the actual half is the part that has the capacities to affect the environment in which an entity is embedded, then the virtual half is the capacity of the system to be affected and in turn to produce new extensions and qualities. The virtual domain, the intensive field of individuation, is where the intensities operate and their series resonate, as has already been highlighted in the method of dramatization. But the communication between intensities does not necessarily take place between components of the same stratum9. An intensive field of individuation is constructed through communications beyond the extensions of the system, which forms assemblages with other systems through "relations of exteriority"10. Deleuze finds an isomorphism, and an interest in biological organisms, through the work on symbiogenesis by Lynn Margulis (1998). In her research on eukaryotic cells she considered them as organic assemblages formed across evolutionary processes. There is no need for this transversal communication between organisms, or better just entities, to be constrained to the biological stratum. Those transversal movements cut diagonally across the intensive fields of individuation of biological, social and technological entities. Meshing the virtual realms of entities belonging to different strata, therefore, forms a machinic assemblage. The enlargement of the virtual half, that is, of the differential relations and the distribution of singularities, entails a restructuring of the phase space of the assemblage. Based on the conception of transversal movement and on the presence of singularities in explaining the mechanism of emergence, Protevi comes up with a third type of emergence, called transversal. The idea of transversality that underpins the concept of 'machinic assemblages' complicates the understanding of diachronic emergence discussed before, and eventually folds the hierarchy presupposed by synchronic emergence into a flat meshwork.
Emergence and Organism
Autopoiesis and Mach[i]nic Assemblages
It is exactly those last two readings of the mechanism of Emergence, the Diachronic and the Transversal, that provide an insight into repositioning the organism in the environment. Synchronic Emergence has dominated the discussion mainly through the concept of autopoiesis. Autopoiesis has been defined by Humberto Maturana and Francisco Varela in the famous dictum of the "organizationally closed but structurally open" (Maturana and Varela 1980) organism. Although it is adequate to capture the
phenomenon of synchronic emergence, autopoiesis is not able to follow the tendency of the organism to change on a diachronic scale. Autopoiesis can be seen only as an instance of synchronic emergence dedicated to homeostatic processes. If the analysis stops here, then it is assumed that organization has a fixed transcendental identity. Keith Ansell Pearson identifies the problematic aspect of autopoiesis. He states: "if there is to be autopoiesis then it has to be conceived as operating on the machinic level" (Pearson 1999, p. 170). Pearson explores the work of Gilles Deleuze and Felix Guattari and makes use of their conceptual armature to extend the concept of autopoiesis. By introducing the 'machinic' (Guattari, 1995) character of any organism, there is an attempt to dislocate its system from the often too conservative homeostatic circularity and to open it to an ongoing testing and experimenting with the limits of its organization, in order to see the virtual distribution of singularities, or what Manuel Delanda calls "tendencies". Deleuze and Guattari advance the theory of autopoiesis by opening it up to chance and change, by working with the half of reality that is plunged into the virtual realm; a realm in which chaos becomes a constitutive element of any creative process.
This is why the organism for Deleuze is an enemy. Any closed and stratified system is an enemy, but an enemy that is not to be denied and eliminated. What has to be revealed, while remaining hidden, is its virtuality. The actual realm, with entities in extension and with qualities formed, should be approached cautiously: "staying stratified - organized, signified, subjected - is not the worst that can happen" (Deleuze and Guattari 1980, p. 161). The actualised entities enter into relations exercising their capacities to affect and be affected, and the drama reverberates in a continuum of dynamisms11.
The question is therefore transformed from what an entity/organism is into what an entity/organism can do, when, where and how. The 'Animal-stalks-at-five-o'clock' (Deleuze and Guattari 1980, p. 263) is not Heidegger's 'Being-in-the-World' or Bateson's 'organism-in-its-environment'12 but Deleuze's take on Spinoza, of a "body [that] affects or is affected by other bodies" (Deleuze 1970, p. 123). This Spinozian image of thought can best be described in symbiotic complexes, in the machinic assemblages of biological, social and technological components. For that matter Symbiosis does not have the same charge as Brautigan's, or as the dream of ultra-control of cybernetics in past decades. The fact that cybernetic systems can be programmed and reprogrammed gave rise to the belief that technology and machines can extend and replace humans in a peaceful and balanced environment. Yet any communication at the intensive field of individuation under the operation of the dark precursor, any symbiotic complex, is a "violent encounter". It is not 'where mammals and computers live together in mutually programming harmony', it is not about 'deer stroll[ing] peacefully past computers', nor is it the case of 'all watched over by machines of loving grace'. It is about affecting and being affected. The system or the body enters into a composition with other affects while remaining open to being affected. The body lives in its own domain, and series of intensities propagate and enter, or not, into entrainment/resonances with other series. "We know nothing about a body until we know what it can do, in other words, what its affects are, how they can or cannot enter into composition with other affects, with the affects of another body, either to destroy that body or to be destroyed by it, either to exchange actions and passions with it or to join with it in composing a more powerful body" (Deleuze and Guattari 1980, p. 257).
82
ENHSA - EAAE no 55 Rethinking the Human in Technology Driven Architecture
Deleuze’s and Guattari’s advancement of autopoietic theory is an attempt to open the
entity/organism to its affective realm. In doing so it constructs a ‘machine’ 13 that is prior to the distinction of the human and the technical device.
Three different types of emergence have been presented so far, in an attempt to
explain the mechanism of emergence through the framework of Dynamical Systems
theory. Investing on Deleuze’s method of dramatization and his new ontological modality of the virtual the essay opens emergence through its diachronic and transversal
reading to change and novelty by advancing the premises of autopoietic theory. That
openness constitutes the theory of machinc heterogenesis and that of machinic assemblages. Approaching any entity as a machinic assemblage that meshes with multiple other assemblages what is revealed is its affective realm. The realm that is formed
from the depths of differential relations and the distibution of singularities of the assemblages or in other words from the depths of the vrtual of a symbiotic complex.
What is now necessary to explore is the virtual realm that will be achieved by concretising the approach to symbiotic complexes of human and technological components
in the field of architecture.
Symb(i/o)sis and the Praxis of Hacking
Code, programming and computational processes uncover a powerful domain, providing architects with an opportunity to work, think and experiment at the level of the intensive fields of individuation, prior to any extensities and qualities. The introduction of computation and programming into design methodologies has marked a paradigmatic shift from top-down to bottom-up processes; a shift that need not be plunged into a synchronic view of emergence that perpetuates a balanced homeostasis. Brautigan's view has been partially realised, proving that current technology has resolved many of the technical issues of his time. What has not been realised, however, is the immanent optimism of the controlled, balanced and peaceful symbiosis that the cybernetic view was striving to establish through the communication of individuals. The issue in question here is, therefore, more philosophical than technical, and largely engages with the ontological and epistemological changes that this techno-scientific explosion in its turn brings to thinking, practicing and inhabiting architecture. Those changes have been roughly revealed earlier by presenting the method of dramatization and by reworking the autopoietic circularity of cybernetic organisms. What remains to be seen, however, is how this reworked hypothesis is concretised in the symbiotic complexes in the field of architecture populated with computational entities. The ubiquitous presence of computers and digital code is apparent in contemporary research in architecture, following the same pattern as in all aspects of life.
A computer should be seen neither as a facilitator nor as a partner but as a symbiont: a universal machine able to set up a stage for a drama of computations and data calculations, where code develops its own 'machine'. What has become apparent through the method of dramatization and the presence of the incorporeal entity of the dark precursor is that a certain kind of communication takes place at a pre-individual level; at the level where Ideas in the Deleuzian sense exist as distinct differential relations through data manipulation. But what is preserved in the Idea, within itself, is a zone of dark and obscure undifferentiation. The differentiated Idea, the actualized, is clear, but the undifferentiated Idea is distinct and obscure. It is in the presence of the dark
precursor that obfuscation is created. To put it in other words, code has a Dionysian drunkenness within its Apollonian performance. It does not only operate as an agent and performer; it has an immanent patient side. Take the example of objects in the Object Oriented programming paradigm. They have often been named demons, participating collectively in what can be described as a Pandemonium. The Pandemonium becomes the common ground where the demons act and execute the formalised logic of the algorithm; but at the same time, while they are exercising their capacities with other demons, they fall into 'violent encounters', they exchange actions and passions, composing an emergent higher entity (a toy computational sketch of this double aspect follows this paragraph). The demon is an agent and a patient at the same time, and it is this latter characteristic that plunges code into the virtual, constituting in parallel a new machine. It is the demon's passion that allows for the conception of diachronic emergence and of transversal communications between heterogeneous elements. It is not, therefore, proposed to retain the distinction between computer/code and the human and to link them through communication at a phenomenological level. What is proposed is that humans and computers are already elements of a 'machine': a collective machine that includes human bodies and technological devices in transversal communications at the pre-individual domain of resonances and echoes. It can therefore be argued that the computer is not a facilitator that merely executes the architect's intuitions. Nor can it be thought of merely as a partner that manages relations between different subjects. It appears, rather, as a symbiont capable of setting up states of affairs for pre-subjective transformations. A machinic assemblage is constructed in which the emergence of another world takes place. This is what is proposed here with the idea of Symb(i/o)sis as a transversal emergent complex. Architects' preoccupation with generative coding mostly relies on the phenomenological level, where the human subject perceives the production of new qualities and traits. But the affective aspect rethinks perception as micro-brains or micro-perceptions at the level of intensities and dynamisms. Deleuze's reading of Spinoza's affect takes place at a pre-individual level where only the rough sketches of larval subjects exist to sustain affects before they become accountable, before they become accessible to sensory awareness.
The theme of symbiotic relationships between biological organisms and technological machines is not something new. Joseph Carl Robnett Licklider wrote a paper on 'Man-Computer Symbiosis' in 1960. Licklider (1960) saw in computers a potential. He insisted, though, that in order to harvest this potential the master-slave relationship which characterizes contemporary paradigms of interaction has to be replaced by the notion of a partnership, or better yet, by that of symbiosis, in which the evolutionary paths of humans and machines interact. In this symbiotic complex the human acquires a new position. The human is no longer the master who dictates his/her own thinking patterns to the machines, nor does he/she disappear from the loop, as the Artificial Intelligence program had planned and in many cases has achieved. The aim of symbiosis, according to Licklider, is to let computers facilitate formulative thinking as they now facilitate the solution of formulated problems. It is a radical change of thinking patterns that, for computer scientists like E.W. Dijkstra (1972), is required if a human is to deserve a computer. Gordon Pask's Musicolour project (Haque, 2007) liberates interaction and interactive installations from this master-slave relationship. The Musicolour Machine is a performance system constructed in 1953. Music produced by a band functions as input for a lighting system that illuminates the
stage of the performance in concert with the audio. Pask's Musicolour is programmed to respond to certain frequencies. The moment, however, at which the music falls into a repetitive pattern, Musicolour reacts by producing lighting patterns that are discordant with the music: it receives inputs from a different source and improvises. It is then the turn of the band to respond by renewing the conversation with the machine.
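A crude sketch of this kind of 'bored machine' logic is given below; it is a speculative reconstruction for illustration only (Pask's original Musicolour was an analogue device whose actual circuitry worked quite differently, and every name and threshold here is invented). The program follows the music by mapping a dominant frequency band to a light, but once the input becomes too repetitive it breaks the mapping and improvises.

import random

# Speculative, highly simplified Musicolour-like loop (names and thresholds invented).
# Input: a sequence of dominant frequency-band indices extracted from the music.
# Output: one light index per step.
def musicolour(band_sequence, num_lights=4, boredom_threshold=4):
    recent, lights = [], []
    for band in band_sequence:
        recent = (recent + [band])[-boredom_threshold:]
        bored = len(recent) == boredom_threshold and len(set(recent)) == 1
        if bored:
            # The machine 'improvises': it answers a repetitive input discordantly.
            lights.append(random.choice([b for b in range(num_lights) if b != band]))
        else:
            # Otherwise it follows the music: band n drives light n.
            lights.append(band)
    return lights

random.seed(1)
print(musicolour([0, 1, 2, 2, 2, 2, 2, 3, 1, 1, 1, 1]))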
It is Francois Roche, with his project "Une architecture des humeurs", who takes Gordon Pask a step further, by liberating communication and interaction from a conversation between a subject and an object, or better between subjects, individuals who can refer to their subjective realm of sensory awareness. Francois Roche researches the contingency of the humors of the inhabitant on the habitat itself. He moves at a pre-individual level, at the stage of dynamisms that presupposes an intensive field. "At last, a habitat that reacts to your impulses... More precisely... it is itself the vector... synchronized with your body, your arteries, your blood, your sexual organs, your pulsating organism... and you become a thing, an element among the rest, an element in fusion, porous... which breathes and yearns to be its own environment... Here everything combines and intertwines. Everything is here, its happening now, a movement happening now... Let yourself go..." (Roche, 2010). The symb(i/o)sis among algorithms, humors, building and inhabitants forms a machinic assemblage. Meshed together in the virtual realm through resonances, the assemblage is poised at the edge of chaos and is open to variation and to the creation of new singularities.
The new ontological scheme proposed here engages architects and inhabitants in encounters that are far from the subjectivist relativism of phenomenology. Symb(i/o)sis14 reveals the non-human in the human and all sorts of micro-brains that can be found everywhere. This is not to deny the human or to take a position against him/her. What is denied is a formed and organised perceiving subject as the basic unit of architecture's constitution; such a subject is only produced through the existence of a larval subject capable of sustaining the immanent dynamisms. Biological and technological multiplicities therefore enter into affective 'violent encounters'. There is not a subject perceiving an object but only "unique, singular individuals differing in spatiotemporal scale but not in ontological status" (Delanda 2002, p. 58). Once the correlation of the perceiving subject and the object is dissolved in favor of multiplicities existing at a pre-individual level, architecture no longer acquires spatiotemporality via the presence of a perceiving subject. In a flat ontology architectural space does not result from a human perceptual action (sensori-motor perception) on a programmed structure; it is the state of affairs of entities and their capacities to affect and be affected.
It is in this symb(i/o)tic field that the architect becomes a hacker. The introduction of hacking here has an affirmative connotation and should be thought of as a constructive rather than a disruptive praxis; a praxis that brings new things together, experimenting and creating new connections. Hacking, therefore, is introduced stripped of any connotation of criminality; on the contrary, it becomes a highly critical act in the field of architectural design. Hacking intervenes at the level of the virtual, the level of the intensive field of differences before the operation of the dark precursor. For this reason hacking becomes highly experimental, with no possibility of controlling and steering any system according to a predefined necessity, and therefore it affects only the destiny of the system. McKenzie Wark in 'A Hacker Manifesto' notes that hacking is "to produce or apply the abstract to information and express the possibility of new worlds, beyond necessity" (Wark 2004, para. 014). The practice of hacking is therefore
becoming relevant to the attempt to dismantle the autopoietic system and its conservative circularity, by testing and experimenting with the limits of its organization and creating the conditions for the emergence of new singularities. "Hackers create the possibility of new things entering the world" (2004, para. 004). They open up the possibilities for new domains where new singularities will be incarnated into new extensities, and therefore new capacities of the system might find a way to be expressed. Every hack produces a common plane for different things to enter into relation, and therefore differentiates the normal functioning of their shared goal. The action of hacking pushes the system beyond a certain threshold where the echoes, that is the positive feedback, will trigger a diachronic emergence in the transversal domain of the assemblages. It is for this reason that hacking operates on the virtual in order to transform the actual.
“The virtual is the true domain of the hacker. It is from the virtual that
the hacker produces ever-new expressions of the actual. To the hacker, what is represented as being real is always partial, limited, perhaps even false. To the hacker there is always a surplus of possibility
expressed in what is actual, the surplus of the virtual. This is the inexhaustible domain of what is real without being actual, what is not but
which may be. To hack is to release the virtual into the actual, to express
the difference of the real.” (2004, para. 074).
If architecture can be thought of as transversally emerging through a symb(i/o)tic complex of dynamical systems, then the architect/hacker, as part of the 'machine', is able to affect the stage of dynamisms where potential capacities for the production of spatial arrangements emerge. It is, therefore, not a purpose or a transcendent idea that governs the production of space and form, but the quasi-causality of the distribution of singularities in a system's phase space, which the architect-hacker partially affects. It is about setting the stage for possible differential relations of interacting computational entities to be produced. In other words, it is about setting up the stage for a drama to happen in the symb(i/o)tic field of architecture.
Closing Remarks
The essay has attempted to approach critically human individuation in a technology-driven architecture. The critical position of the essay does not go against this reality in any dialectical manner. On the contrary, it follows the same direction only to push it further, to its limits. It is on this premise that the essay re-approaches emergence, not through a perspective of synchronic homeostasis but by moving forward to include in its conception diachronic emergence and transversal movements. As a consequence of this conceptual disposition, two of the main presuppositions of Brautigan's poem can be revisited: first, the much-dreamed-of 'loving grace' of the machines, which promises a homeostatic cybernetic symbiosis with technology, and secondly, its implied ontological position, which rests on individual, fully formed organisms like deer, computers and humans. The essay has attempted, therefore, to explore the consequent dissolution of the distinction between the human and the machine in favor of a flat ontology that forms machines everywhere. The machinic assemblages do not deny autopoiesis; they only move it in a direction that opens it up to change and novelty. In order to understand the machinic character of the elements that form assemblages, the essay revisited Deleuze's
method of dramatization and his construction of the virtual domain, populated by differential relations and distributed singularities. These last two terms, as loans from dynamical systems theory, were explained. A rough layout of the main terminology helped to establish a position adequate enough to follow Deleuze's constructive philosophical movements in the realm of the virtual. It is through the virtual that parts and species emerge in a very singular way, and therefore the general conception of an entity-in-its-environment is concretised as the 'Animal-stalks-at-five-o'clock'. Moreover, it is through the conception of the virtual that Spinoza's idea of affecting and being affected enters Deleuze's thought, not as an organismic metaphor but as "the very condition of any cosmology".
The essay aimed to explore an affective realm that distributes micro-brains everywhere. A realm that reveals the non-human perception in the human, affords micro-perceptions to algorithmic code, and finally establishes machines between those independent and heterogeneous terms. According to this view, therefore, the non-human of the human enters as agent and patient into a Pandemonium with other demons. Architects, inhabitants and code form symb(i/o)tic complexes at levels different from that of a phenomenological approach. At this level the architect preoccupied with code works towards the distribution of micro-perception across the machine, rather than by investing in the inhabitant's macro-perceptions of the final product. Architects therefore become hackers, only to constructively set the stage for a drama to happen in the symb(i/o)tic field of architecture.
Notes
1 Delanda’s “Intensive Science and Virtual Philosophy” will be proved an extremely valuable
material in order to uncover Deleuze’s links with dynamic systems theory and therefore to
rethink the emergence of an entity and the role of its environment. See (Delanda 2002) Further
conceptual developments can be found in his latest book “Philosophy and Simulation”. See
(Delanda 2011) especially pp. 1-6.
2 Mainly through his work on “The Fold: Leibniz and the Baroque” (Deleuze, 2002) and through
his lectures at Vincennes (1980).
3 It is worth making the distinction that chaos in this case is not used as a lay term, as something that is completely random and arbitrary, something that denies the production of behavior. A chaotic attractor, on the contrary, is the product of a system that undergoes certain processes. The most famous chaotic attractor was discovered by Lorenz during simulations of global weather patterns (Abraham and Shaw, 1985b, p. 85).
4 It is not the aim of this paper to discuss the benefits of dynamical systems theory in the prediction of patterns of behavior, as happens in the fields of weather, the stock market and the economy. The influence of Poincaré on Deleuze's understanding of the world's dynamism is apparent in the latter's schematization of the method of dramatization, and for that matter Dynamical Systems Theory proves to be a resource for Deleuze in forming his ontology.
5 Virtual multiplicities are Ideas. “A multiplicity is a nested set of vector fields related to each
other by symmetry-breaking bifurcations, together with the distributions of attractors which
define each of its embedded levels. This definition separates out the part of the model which
carries information about the actual world (trajectories as series of possible states) from the
part which is, in principle never actualised”(Delanda 2002, p. 30).
Emmanouil Zaroukas UK
87
6 For further elaboration on the distinction between the virtual/actual and the possible/real
see (Deleuze, 1968) especially pages 260-261.
7 Feedback mechanisms are ubiquitous in systems everywhere, and their study can be conceptualised in two ways. "The positive and negative feedback concepts are easily generalized to networks of multiple causal relations. A causal link between two variables, A → B, is positive if an increase (decrease) in A produces an increase (decrease) in B. It is negative, if an increase produces a decrease, and vice versa" (Heylighen, 2001).
Negative feedback loops return the system to its normal functioning. The system responds to a fluctuation caused by either internal or external perturbations, and if the threshold which controls the response is crossed, the system is able to quickly return to its normal pattern. The stable functioning of the system is restored while the perturbations are simply corrected. Negative feedback loops therefore sustain the organization of the system through what are known as homeostatic mechanisms. There are, however, cases where the fluctuations overcome the threshold and push the system towards another pattern in its repertoire. In this case the system is either unable to sustain the perturbation and therefore dies (brittle systems), or it overwhelms its stereotyped patterns and is pushed beyond the thresholds of its normative zone, where instead of death the system learns by creating new attractors which imply new behaviors (resilient systems). "While negative feedback is the essential condition for stability, positive feedback are responsible for growth, self-organization, and the amplification of weak signals" (Heylighen 2001).
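A minimal numerical contrast between the two regimes (the equation and the coefficients are illustrative assumptions of this example, not taken from Heylighen) is sketched below: with a negative feedback coefficient a perturbation decays back towards the set point, while with a positive coefficient the same perturbation is amplified.

# Negative vs positive feedback on a single variable x around a set point (illustrative).
def respond(x, setpoint, k, steps=8):
    # k < 0: negative feedback (the deviation is damped);
    # k > 0: positive feedback (the deviation grows).
    history = [round(x, 3)]
    for _ in range(steps):
        x = x + k * (x - setpoint)       # response proportional to the deviation
        history.append(round(x, 3))
    return history

print("negative feedback:", respond(1.5, setpoint=1.0, k=-0.3))   # decays towards 1.0
print("positive feedback:", respond(1.5, setpoint=1.0, k=+0.3))   # deviation amplifies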
8 Steven Shaviro will pose the following question: “Can we imagine a form of self-organization
that is not also self-preservation and self-reproduction?” The essay attempts to give an answer
by restating the question at a diachronic level of emergence that is being accompanied by
the appropriate reworkings of downward causality. See (Shaviro 2009, pp. 71-98).
9 The essay is aligned here with Protevi's terminology of homeostratic and heterostratic diachronic transversal emergence (Protevi 2006, p. 30).
10 Deleuze takes a critical stance towards Hegel's 'relations of interiority', where parts are determined by their relations to the whole, which constitutes a unified totality. If a part is detached from this totality it ceases to be what it is. Deleuze therefore invests in 'relations of exteriority', where relations among parts are relevant to the capacities that can be exercised at any singular moment. See (Delanda 2006, p. 10).
11 Based on the capacities of the entity to affect and be affected, Deleuze builds a critique of the organism that will eventually expand to the social and political realm. Ian Buchanan summarises this by stating that Deleuze's intention is to reveal "the desire for the subject to transcendent the given while still be constituted in the given". See more in (Buchanan, 2000).
12 Brett Buchanan has provided an interesting commentary and comparison between Heidegger's and Deleuze's ideas of an entity embedded in its environment (Buchanan 2008, p. 44). Deleuze comes considerably closer to Gregory Bateson's ideas, both investing in the concept of difference (Bateson 2000, p. 457).
13 Deleuze and Guattari develop their own term 'machine' in their volume Anti-Oedipus. They move from a critique of Marx's reading of machines in order to develop their own concept. In the appendix to Anti-Oedipus, which is not available in the English translation, they state: "We proceed not from a metaphorical use of the word [machine], but rather from an (indistinct) hypothesis about its origin: the way that arbitrary elements are made to form machines through recursion and communication; the existence of a machinic phylum" (Deleuze and Guattari 1977, p. 442). The text has been translated from the Greek by the author.
14 Symb(i/o)sis can be thought of in terms of the 'Superfold'. In the same manner that the fold involves relations of exteriority, the Superfold opens those relations to new strata that take in the genetic code, third-generation machines like computers, and a 'non-language' of a-signifying semiotics. "It would be neither the fold nor the unfold that would constitute the active mechanism, but something like the Superfold, as borne out by the foldings proper to the chains of the genetic code, and the potential of silicon in third-generation machines, as well as by the contours of a sentence in modern literature..." (Deleuze 1986, p. 131).
Bibliography
Abraham, R. and Shaw, R. 1985a. Dynamics – The Geometry of Behavior. Part 1: Periodic Behavior.
California: Aerial Press
Abraham, R. and Shaw, R. 1985b. Dynamics – The Geometry of Behavior. Part 2: Chaotic Behavior.
California: Aerial Press.
Bateson, G., 2000. Steps to an Ecology of Mind. London: The University Chicago Press.
Brautigan, R., 1967. All Watched Over by Machines of Loving Grace. San Francisco, California: The Communication Company.
Buchanan, B., 2008. Onto-Ethologies: The Animal Environments of Uexkull, Heidegger, Merleau-Ponty and Deleuze. New York: State University of New York Press.
Buchanan, I., 2000. Transcendental Empiricist Ethics. In Deleuzism: A Metacommentary. Edinburgh: Edinburgh University Press.
Delanda, M. 2002. Intensive Science and Virtual Philosophy. London: Continuum.
Delanda, M. 2006. A New Philosophy of Society: Assemblage Theory and Social Complexity. London:
Continuum.
Delanda, M. 2011. Philosophy and Simulation. The Emergence of Synthetic Reason. London:
Continuum.
Deleuze, G., 1968. Difference et Repetition. Paris: PUF. Translated as Difference and Repetition by Paul Patton, 1994. London: Continuum.
Deleuze, G., 1969. Logique du Sens. Paris: Minuit. Translated as The Logic of Sense by Mark Lester
with Charles Stivale, 1990. London: Continuum.
Deleuze, G., 1970. Spinoza. Philosophie pratique. Paris: PUF. Translated as Spinoza. Practical Philosophy by Robert Hurley, 1988. San Francisco: City Light Book.
Deleuze, G., 1980. Final Year at Vincennes. Cours Vincennes (online). http://www.webdeleuze.com/
php/texte.php?cle=50&groupe=Leibniz&langue=2 [Accessed 17 May 2010].
Deleuze, G., 1986. Foucault. Paris: Minuit. Translated as Foucault by Sean Hand, 1988. London: The
Athlone Press.
Deleuze, G., 1988. Le Pli: Leibniz et le Baroque. Paris: Minuit. Translated as The Fold: Leibniz and the
Baroque by Tom Conley, 1993. London: The Athlone Press.
Deleuze, G., 2002. Desert Islands and other Texts 1953-1974. Translated by Michael Taormina. New
York: Semiotext(e).
Deleuze, G. and Guattari, F., 1972. L’ Anti-Oedipe: Capitalisme et Schizophrenie. Paris: Minuit. Translated as Anti-Oedipus: Capitalism and Schizophrenia by Robert Hurley, Mark Seem, and Helen R.
Lane, 1984. London: Continuum.
Deleuze, G. and Guattari, F., 1980. Mille Plateaux: Capitalisme et Schizophrénie. Paris: Minuit. Translated as A Thousand Plateaus: Capitalism and Schizophrenia by Brian Massumi, 1988. London: The
Athlone Press.
Dijkstra, E.W., 1972. The Humble Programmer. ACM Turing Lectures (online). 22 Mar 2007, http://
userweb.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340.html [Accessed 26 Mar. 2009].
Guattari, F., 1992. Chaosmose. Paris: Éditions Galilée. Translated as Chaosmosis: An Ethico-Aesthetic
Paradigm by Paul Bains and Julian Pefanis. Indianapolis: Indiana University Press.
Haque, U., 2007. The Architectural Relevance of Gordon Pask. Archit Design, 77. pp. 54–61.
Hensel, M. and Menges, A., 2009. The heterogeneous Space of Morpho-Ecologies. In Space Reader:
Heterogeneous Space in Architecture, eds. Hensel et al. London: John Wiley & Sons Ltd.
Humphreys, P., 2008. Synchronic and Diachronic Emergence. Minds and Machines, 18, no. 4, p. 431.
Licklider, J.C.R., 1960. Man-Computer Symbiosis. [Online] Available at: <http://groups.csail.mit.
edu/medg/people/psz/Licklider.html> [Accessed 10 June 2011].
Margulis, L., 1998. Symbiotic Planet. A New Look at Evolution. Amherst, Massachusetts: Basic Books.
Massumi, B., 2002. A Shock to Thought: Expression after Deleuze and Guattari. London: Routledge.
Massumi, B., 2008. Of Microperception and Micropolitics. In Micropolitics: Exploring Ethico-Aesthetics.
Inflexions: A journal for research-Creation. October, 3.
Maturana, H.R. and Varela, F.J., 1980. Autopoiesis and Cognition. London: D. Reidel Publishing
Company.
Pearson, K.A., 1999. Germinal Life: The Difference and Repetition of Deleuze. London: Routledge.
Prigogine, I. and Stengers, I., 1984. Order out of Chaos. New York: Bantam Books.
Protevi, J., 2006. Deleuze, Guattari, and Emergence. In Paragraph: A Journal of Modern Critical
Theory, 29.2, Jul, pp. 19-39.
Roche, F., 2010. An architecture "des humeurs". [online] Available at: <http://www.new-territories.com/blog/architecturedeshumeurs/?p=757> [Accessed 10 August 2011].
Shaviro, S., 2009. Without Criteria: Kant, Whitehead, Deleuze, and Aesthetics. Cambridge: The MIT Press.
Simondon, G., 1989. The Position of the Problem of Ontogenesis. In Parrhesia, 2009, 7. pp. 4-16.
Wark, M., 2004. A Hacker Manifesto. London: Harvard University Press.
Alessio Erioli
DAPT Faculty of Engineering
University of Bologna
Italy
The Fashion Robot
The conception of the body in the post-digital age commonly accepts the idea that the body itself is no longer bound to its genetic set and a Renaissance image. The idea of anthropometrics - man as the measure of all things - that spanned anthropic culture from Vitruvius to Le Corbusier has given way to the idea of a species within an ecosystem. The body aesthetic, untied from classicism, increasingly integrates the
concepts of prosthetics, mutation, and progressive dissolution of the established
boundaries of humanness (figure 1). The same ideas of body purity and integrity are
disrupted from the biological and techno-social perspectives (which are different aspects of the same interconnected plexus): human bodies change perpetually as their constituent cells are periodically replaced and are, as a matter of fact, the environment for the smallest living organisms on earth. We are symbiotic with our viral and bacterial pool; to give one example, E. coli allows us to digest food in the intestine, but the microbial communities in our body are so numerous that they are referred to as the microbiome, and they have their own ways to communicate and organize themselves to pursue certain goals (NextNature, 2009). So symbiotic, in fact, that it is doubtful whether our bodies are more ours than theirs (Science Daily, 2008). Our
genetic set, unlike those of other animals, does not include the development of specific organs, skills or capacities that enable us to survive in specific environments; humans do not belong to any particular one, and according to our anatomy and physiology
there is no such thing as a “natural” habitat for us. Yet we are inextricably connected
with the environment on all levels of complexity; in each of these, information exchange is continuous (from the horizontal exchange - species to species - of genetic
material up to the sophisticated transmission of values through language and cultural
systems). We are entangled with a universe of information, which at once amplifies
our sensory reach and further blurs the boundary of our bodies. We can as well say
that our species lives in an ecosystem of information, which lies at the bottom of physical things (Wheeler, 1990) and manifests and flows as embedded property (like the
form of a chemical molecule or the color of a fruit’s skin) as well as through coded
systems (such as language, a technology, or its evolved, second order effect: culture).
Ever since our earliest ancestors, we have used technology (which pre-dates our humanness) as a means of environmental interaction; we took advantage of it to engineer nature in order to enlarge our ecological niche. This also implies improving our capacity to manage a higher level of complexity and sophistication in the architecture of information,
which changed our physical body (see for example how the development of cooking
food triggered the development of a bigger brain - Weinstock, 2010 - p. 151 & 157)
and mutually influenced the way we inform our architectures, both in the morphological and spatial organization and in their use as a medium and storage for our cultural
products (including social organization).
Technology, then, isn't an added feature that implicitly subjugates or undermines the human condition; rather, it is a symbiotic dimension that is inextricable from it: our environmental adaptation strategy is to build systems (NextNature, 2011), and we are technological beings by nature. But whoever makes use of a system becomes part of the system: the price for the advantage of technology is our dependence on it.
Technology is co-evolving with humanity, or, in another fashion, humanity and technology are catalysts of their own co-evolution. An outdated yet popular idea (linked
to industrial and mechanical age concepts) is that which sees the "pure" body belonging to the biological world attacked and deformed by the technological world of prosthetics as additions.
Fig. 1
Bart Hess - “A Hunt for High Tech”
- seeks to harness both nature and
technology and create armoured
skin and fur for a new human archetype incorporating animalistic
and fetishistic instincts.
The more realistic idea of a web of relations symbiotically intertwined with its microbiome and the technium disrupts at once the ideas of "purity" and "artificial" (which are complementary), as the notions of "growing" and "breeding" new products gradually substitute that of "making", the typical mark of man up to the industrial age. The notion of machine, in its philosophical meaning of an assemblage capable of self-organization and self-reproduction processes, provides a better
description of the upcoming generation of systems we will interact with.
“In Mechanism and Biological Explanation [Maturana 1970], Humberto Maturana
and Francisco Varela argue that machines and biological forms are very closely
related — so closely, in fact, that biologists can reasonably claim living systems
are machines. This is not meant merely as a pedagogical metaphor, but rather as
a rigorous analogy, which emphasizes important symmetries, and even better, expresses concisely specific experimental and theoretical aims. In what sense, then,
are living systems machines?
A machine is defined by a group of abstract operations, satisfying certain specific conditions. An abstract machine is this system of inter-relations which is itself
independent of the actual components which ‘realize’ the machine. A fishing boat
can be made from many kinds of wood, sailed on many bodies of water, used to
store many species of fish; a game of tag can be played with an arbitrary number
of arbitrary people in any suitable space. What matters is not the specificity of a
given component but the specificity of its relationships.”
(Fractal Ontology, 2011 citing Varela and Maturana, 1972)
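By way of a loose programming analogy (not drawn from the source, and offered only as a sketch), the distinction between an abstract machine and its concrete realizations resembles that between an interface and the many components that can satisfy it: what the caller depends on is the relation, not the material.

```python
from typing import Protocol

# Loose analogy, not the authors' formalism: an 'abstract machine' as a
# system of relations that any component satisfying them can realize.

class Vessel(Protocol):
    def carry(self, cargo: str) -> str: ...

class OakBoat:
    def carry(self, cargo: str) -> str:
        return f"oak hull carrying {cargo}"

class PineBoat:
    def carry(self, cargo: str) -> str:
        return f"pine hull carrying {cargo}"

def fishing_trip(vessel: Vessel) -> str:
    # The 'machine' is defined by the relation (carry), not by the wood.
    return vessel.carry("fish")

print(fishing_trip(OakBoat()))
print(fishing_trip(PineBoat()))
```

The fishing boat of the quotation plays the same role: the wood changes, the relational definition of the machine does not.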
Contemporary technology is establishing a pervasive and ubiquitous connection between an active space of information (whether physically embedded or coded) and
our metabolic functions, moving from an idea of human as a body+mind whole to
that of a system of relations: it is the set of those relations that engenders an ontology
of human condition as a dynamically interconnected system with its environment at
large, an interdependent part of it. Among those systems, the world of prosthetics and
robotics has evolved to a point where the threshold between biology and mechanics
has become heavily blurred: they are more and more similar to biological organs, in
behavior and structure. The man-machine duality is overcome by the idea that every
biological species is a wet computer (BBC, 2010 and Wikipedia, 2010) and, on the other hand, computers as we know them today are just primitive forms of a new biological species. The latest developments in medical research, such as 3D printing cellular tissues
and organs (like those developed in London medical labs by Professor Alexander
Seifalian - see Medical Daily, 2011), envision new scenarios related to the possibility of choreographing growth processes through multi-agent systems and neural networks,
where the threshold between categories (cause and effect in particular when thinking
of self-organizing processes such as catalytic webs in the context of nature, considered
as a Complex Adaptive System) is a blurred boundary. So is the one between living and
non-living: the building of synthetic life is a matter of scale within the system we are operating upon; the differences between living systems and machines as we see them today have more to do with the scale of complexity of the relational system.
All organisms are modular: life always consists of sub-organisms which are involved together in a biological network. The interrelations between organ and organism form a series of feedback loops, forming a cascading and complex surface.
Each organ parasites off the next, but this segmentation is not spontaneous. Rather, it is development itself, the decoupling of non-communicating spaces for the organization of divergent series. Creative evolution, self-organization and modularity are the same idea.
(Fractal Ontology, 2011)
A ‘machine’ is defined by the boundary between itself and its environment; it is divided from an infinitely complex exterior. Communication within a machine-system operates by selecting only a limited amount of all information available outside (reduction of complexity), and the criterion according to which information is selected and
processed is meaning (in other words, only relevant information is selected, and relevance is assessed by evolutionary drivers coupling emergent self-organization and
environmental pressures). Machines process meaning, producing desire.
As biological machines, our metabolic modularity shows a strong hierarchical bias towards our brain: although it represents only 2% of body weight, it receives 15% of the cardiac output, 20% of total body oxygen consumption, and 25% of total body glucose
utilization (Magistretti, Pellerin and Martin, 2000); moreover, a gram of brain tissue
uses 22 times the amount of metabolic energy as a gram of muscle tissue (Weinstock,
2010 citing Aiello, 1997). This primacy in human physiology suggests an analogous importance in biasing relevance (meaning) and desire in the developmental processes of
metabolic modularity.
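The figures just cited can be made concrete with a small back-of-the-envelope sketch; it only restates the quoted percentages and derives the per-gram ratios from them (the muscle comparison quoted from Weinstock/Aiello is not recomputed here).

```python
# Back-of-the-envelope illustration of the brain's metabolic bias, using only
# the percentages quoted above (Magistretti, Pellerin and Martin, 2000).

BRAIN_MASS_SHARE = 0.02     # brain ~2% of body weight
BRAIN_O2_SHARE = 0.20       # ~20% of total body oxygen consumption
BRAIN_GLUCOSE_SHARE = 0.25  # ~25% of total body glucose utilization

# Demand per gram of brain relative to an "average" gram of body tissue:
o2_bias = BRAIN_O2_SHARE / BRAIN_MASS_SHARE            # = 10x
glucose_bias = BRAIN_GLUCOSE_SHARE / BRAIN_MASS_SHARE  # = 12.5x

print(f"oxygen demand per gram of brain:  {o2_bias:.1f}x the body average")
print(f"glucose demand per gram of brain: {glucose_bias:.1f}x the body average")
```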
The theory of the development of metabolic modularity is called morphogenesis.
(Fractal Ontology, 2011)
Morphogenesis regards both the processes and the outcomes of the generation of
form; the capacity of transmitting morphogenetic patterns through communication implies an experience of form. Given the metabolic importance of our brain, it
plays a dominant role in such experience. Aesthetics is the dimension where such
experience takes place, in which the outcomes of morphogenetic systems generate patterns which interact with cultural systems, informing or affecting their evolution as well as being affected by them at once. Aesthetic experience happens when
all the senses operate at their climax, "when you're resonating with the excitement of this thing you're experiencing" (Robinson, 2010); it is direct and intensely subjective, psychological and sometimes psychophysical. The aesthetic object, according to Susanne Langer, has a virtual character: although it exists in reality as itself, it also has
a semblance of a palpable experience; semblance “liberates perception--and with
it, the power of conception--from all practical purposes, and lets the mind dwell
on the sheer appearance of things” (Langer, 1953, pp. 48-9). The fact that psychology, emotion, and phenomenal experience of the art object are inextricably bound
suggests that highly individuated subjective experience is embedded into human
consciousness.
Aesthetics is then the moment of extreme sophistication in which a cultural production system is generated and then reaches the tipping point after which it deploys itself, acting on the very relationships that individuals establish with forms themselves (attraction or repulsion, two aspects of behavioral ecology). It is in this sense that fashion is intended: as a moment of fascination in which beauty (or repulsion) acquires the status of function, extending behavioral ecology beyond mere physiology, functional efficiency and processes of environmental reaction. Blushing, for instance, is the involuntary reddening of one's face, a rush of blood into certain vessels started by a particular thought or emotion, which can also be self-induced. Such phenomena testify to how our brain is involved emotionally and not only rationally when driving our choices, and to how aesthetics can affect physical behavior. The physical body is
tied and tends to an image, a vision, which becomes an attractor. Recent research
and newborn fields in cognitive neuroscience, such as neuroaesthetics, which is "concerned with the neural underpinnings of aesthetic experience of beauty, particularly
in visual art” (Cinzia, D.D. & Vittorio, G., 2009), are trying to disclose the relational patterns of aesthetic experiences.
Cultural evolution has produced an imaginary of bodies and machines far beyond
their purely functional dimension, especially in the pop universes of cinema and comics. Those examples, especially the ones which cross the threshold of sophistication so as to engender an aesthetic experience and a possible future, become as viable as real examples insofar as they become attractors of future developments and behaviors which can convey novelty into everyday habits and sensibility.
The Otaku world in particular has been so far the most fertile ground on which robotics, aesthetics, cultural pulses and social models have interwoven and proliferated.
Takashi Murakami’s work is a significant example in the world of art, sublimating in
his works the unsolved conflict among sexual repression (where eroticism and sexuality
represent the orbital of interaction between biology and cultural systems), the obsession with perfection, neoteny in the manga graphics, and the fear of and morbid attraction to
cyborgs, tech and mecha (all that is a fusion or hybridization between flesh and metal)
in an uncompromised and vibrating tension. One significant example in this perspective is the sculpture triptych "KO2 Second Mission" (figure 2), which depicts the transformation of a robot girl into a spaceship with extremely refined detail and an elegant interconnection of metallic and organic parts. The obsession with losing humanness and the flesh-machine clash have produced a great number of examples of progressively refined metabolic modularity in that sense, from the purely mechanical (giant robots) to the cybernetic (Casshern, where a boy renounces his humanity to fight the legacy of his father and seek redemption for both) up to biological, genetic and parasitic body manipulation, each one with its own set of sophisticated morphogenetic expressions that resonated in the cultural domain. The Guyver, called "bio-booster armor", carries the idea of parasitism and symbiosis (the "armor" is really an alien species) with a continuum between the technological and biological dimensions.
Reiser and Umemoto, in “Atlas of Novel Tectonics”, defined architecture as a “mediator between matter and events” (Reiser+Umemoto, 2006). Armors, clothes, prosthetics are mediators between body and space (physical and event space); they embody
spatial articulation and differentiation among body parts and the functions they perform, affecting the deployment of capacities in the system in relation to possible contingent situations, facilitating, amplifying or inhibiting degrees of freedom and/or external exchange. Moreover, they respond to environmental demands such as protection
and communication (patterns used as language or information transmission, such as
transparencies used to modulate degrees of privacy and/or seduction for mating).
In the dimension of ornament, elements and figures that declare their identity and belonging are used as instruments for the harmonization and articulation of the system itself beyond mere function: ornament, made of more or less figurative patterns,
builds up an added layer of aesthetic sophistication; in some cultures this function is
exerted directly on the skin, through tattoos.
Each of these variables influences the others in an intensive and turbulent field which evolves in space-time and where these influences engender emergent patterns and self-organization processes. Dresses, armors, prosthetics are metabolism mediators,
expanders and enhancers.
A beautiful illustration of this multiplicitous role is the case of XV century armors
(some of which were carefully catalogued by Viollet-le-Duc in his “Dictionnaire raisonné du mobilier français de l’époque carlovingienne à la renaissance”, part 5 & 6, 1874)
in the invention and evolution of specific interconnecting parts (such as the elbow
or knee plate or the horse-rider compound leg plate depicted in figure 3). These war
outfits represent the peak expression of steel-plate armor evolution before the invention of firearms such as the arquebus, after which they were progressively abandoned
because of their ineffectiveness against such weapons (see Arquebus in Wikipedia,
2010): their primary protection function (which required stiffness and body coverage
against spears) had to be mediated with body anatomy (modulation of stiffness and
articulation for movement) and physiology (lightness, ventilation), thus engendering over time ornaments which strengthened the armor surface through corrugation
(allowing for a lighter plate with equal or improved resistance) and the use of scaled
Fig. 2
Takashi Murakami - “KO2 - Second
Mission”.
Fig. 3
horse-rider compound leg plate in Viollet-le-Duc
“Dictionnaire raisonné du mobilier français de
l’époque carlovingienne a la renaissance”, part
6. image © Universitätsbibliothek Heidelberg,
license type: Creative Commons (by author, non
commercial, share alike).
connected plates with increasingly sophisticated joints where movement was required
(up to gauntlet’s fingers). Evolved control of fire temperature stability and mastery of
steelwork pushed by military needs produced a self-sustaining takeoff in terms of ornament, differentiation and articulation of parts, bringing the armor to the level of sophistication in which it became a performative and cultural metabolic extension: etches and corrugations in the armors, as well as specific interconnection parts, became
a layer of information embedding narratives of the knight’s personal history, family,
legacy and successes in battles. Since relations are always mutual, many family badges
were shields, which is evidence of a tight relationship between war and daily life
(defend and conquer); such identification in vessels and logos has commercial and
military origins (which are, by the way, two different ways of negotiation).
Among the contemporary versions of the plate-armored knight, Marvel Comics
character Iron Man (in his recent blockbuster movie version, see figure 4) embodies
the most faithful version of the XV century armor evolution: the apex of an extensive concept of architecture made of discrete elements, perfectly forged and fitted according to
the body proportions, an extension and amplification of human possibilities wrapped
in a luxurious, shiny chrome outfit. An even more recent version of the armor as metabolic enhancer that
also deals with concepts of complexity and biomimetics is represented by the Nanosuit conceived for
the Crysis videogame (figure 5), where nanomachines
act as agents which interact directly with the body’s
metabolic functions in order to empower or protect
the body, going through self-organized swarm configurations where specialization of functions derives
from local differentiation of the agents themselves.
Although fictional, these examples have reached (due
to the increasing demand for realism in contemporary sci-fi stories) a level of detail and specificity of parts that provides an aesthetic product which is not only as valid as a real one, but potentially also envisions a possibility yet to happen (it would not be the first, nor the
last case of a fictional apparatus that gives research
the visionary push and direction toward a forthcoming innovation), an aesthetic attractor. See for example the case of the jetpack: the concept appeared in popular culture, particularly science fiction, long
before the technology became practical (Wikipedia,
2010).
Analogous relations show up in the case of clothes, where technological evolution (from the first knots holding together animal skins to the increasingly
Fig. 4
Iron Man prototype armor
- © MARVEL comics group.
Fig. 5
Crysis U.S. Nanosuit - © Crytech.
sophisticated interweaving of threads to produce fabric and explore their aesthetic
expressions) cohered with the response to metabolic functions and cultural transmission. The African dress in figure 6, coming from the museum of Gibellina in Sicily,
uses as ornament a pattern which is a diagram for field irrigation: complex information is stored and transmitted through generations by means of garments, which
themselves carry in their fabric, sewing and construction information about how to
Fig. 6
African dress from Gibellina museum - photo by
Alessio Erioli.
Fig. 7
Iris Van Herpen in collaboration with Daniel Widrig - escapism collection - © Iris Van Herpen.
Fig. 8
Iris Van Herpen in collaboration with Daniel Widrig - escapism collection - © Iris Van Herpen.
support thermal regulation, respiration, movement and protection, as well as social status, gender, religion, etc. Without forgetting or underestimating the innovative contributions of Issey Miyake, Hussein Chalayan and Alexander McQueen (especially his unforgettable Spring-Summer 1999 collection show, where two 7-axis robotic arms and a model make an exquisite display of playful, innocent, erotic surrender to technology
as art expression in fashion - Ebesp, 2006), a singular example in the contemporary
fashion scene, for its connections with architecture, is the case of the Crystallization and
Escapism collections by Iris Van Herpen in collaboration with Daniel Widrig (figures 7
and 8 respectively): modularity, self-similarity, topological abstract machines and formation processes are conveyed through the exploration of the collection’s concept
and materialized through a range of techniques that include traditional leatherwork
side by side with rapid prototyping. Descending along the material scale down to
the chemical molecules, the work of Manel Torres’ “fabrican” (figure 9) represents another interesting evolution: a sprayable fabric which can be modulated, deployed and
graduated not only according to each individual's specifics, but again materializing
an expanded aesthetic of the body which is enhanced by means of scaffolds, artifacts,
underlying structures; free from the constraints of plain woven fabric or discrete elements, this clothing technique brings up an aesthetic that entirely relies on intensities,
modulating material computation through density and thickness, and behaving closely like human skin through fiber layering. In both cases aesthetics and signification
rely on the interplay between body and artifact without the embodiment of more or
less explicit messages.
Sonja Bäumel’s “(IN)visible membrane” (figure 10) goes even further, working on
the threshold of consciousness of our body as symbiotic with our microbiome, envisioning a second living layer on our skin based on the interaction between individuals
and the surroundings, a project that confronted scientific data and methods with fashion design: colonies of bacteria on our skin can sense the surroundings and produce growth patterns providing the right protection in various situations, a hidden membrane between body and environment with multiple actualizations.
In her own words:
What happens if we make the micro world of the human body perceivable?
I want to confront people with the fact that our body is a large host of bacteria and
that a balanced perception of the body is closely linked with a balanced perception
of the self.
(Bäumel, 2011)
Prosthetics is generally intended as the branch of medicine dealing with the production and use of artificial body parts that allow amputees to bring their capacities level with those of a non-amputee: retrofitting amputee soldiers after World War I (see figure 11) was a way of re-humanizing them by giving them a new purpose, a role to make them again part of society. In a more general perspective (looking at its etymology: προσθετικός, "adding; repletive; giving additional power"), we can consider a prosthetic any kind of device able to augment human capacities with which we establish
a symbiotic relation: smartphones, contact lenses, exoskeletons, vehicles. Again, the
world of fiction provides us with a vision of a possible future through the viral campaign of the videogame "Deus Ex: Human Revolution": SARIF Industries does not exist, but
Fig. 9
Manel Torres - “fabrican” sprayable fabric - © Fabrican Ltd.
Fig. 10
Sonja Bäumel - “(IN)visible membrane” © Sonja Bäumel.
Fig. 11
Amputee soldier retrofitted to help in reconstruction service - © photo by American Red Cross,
National Geographic Stock.
experts and scientists predict that the sophistication of their "augmentations" (figure 12)
could be reached in 2027 (the year in which the game is set), if not before (DeusExOfficial, 2011). It’s not really necessary to reach the point when people spontaneously
accept having their healthy limbs replaced by SARIF's augmentations (one of the game scenarios) to understand that fairing for an amputee is about restoring
Fig. 12
SARIF industries augmentations - © EIDOS/Square Enix.
Fig. 13
Bespoke innovations prosthetics - © Bespoke innovations.
a self-confident body image as much as a lost performance (and, in both cases, we do not necessarily mean the original ones): Bespoke Innovations (figure 13) works on this line
of development for their prosthetic limbs, providing also the possibility to customize
them with several material combinations as well as unique patterns. Latest developments on the front of performance and function are placing the new limbs ever closer
to the threshold of a new biological organ. Again, borrowing from the world of science fiction, the stunning set of organs that Robin Williams/Andrew Martin designs for
himself to become human in "Bicentennial Man" (the movie is in itself a curious case in which ideas about artificial life are conveyed with unusual confidence through a Hollywood family movie) represents a possible frontier for research as well as an eye-candy-rich plethora of synthetic-organic complex systems morphologies (see for example
the heart in figure 14).
In all cases (armors, clothes, prosthetics), at the peak of their sophistication the whole
set of parts (no matter what their scale or linkage type) is harmonically assembled
(meaning assemblage as set of relations) in order to form a coherent system in terms
of functionality, aesthetics, communication as well as being the result of environmental selection pressures.
When metabolism encompasses cultural processes in its complex information exchanges, the biological and cultural worlds interweave in taking part in morphogenetic and evolutionary processes (such as in the previously mentioned case of extended
behavioral ecology and aesthetics). When a morphogenetic system (be it genetically
grown or crafted) reaches the level of sophistication of aesthetic experience, it triggers a cascade of emergent feedback loops among biology, the technium (as Kevin
Fig. 14
Bicentennial man heart - © Sheep’s Clothing production.
Kelly names the kingdom of technological evolution - Kelly, 2010) and the cultural
dimension.
Architectures can be considered extensions of our collective metabolism, to which
we outsourced part of our physiology (i.e. thermal regulation is partially performed
through our buildings, widening our ecological niche and allowing us to spend more
metabolic energy in other activities and colonize a wider range of environments), our
protection (Manuel Delanda considers cities as our “mineralized exoskeleton” - Delanda, 2000), our memory of social organization (architecture is also a communication device itself, integrating and interacting with language and culture, storing and
transmitting information about biological, social and spatial organization from one
generation to the next: separations and organizations in cities responded to many
pressures, including social classes, norms about hygiene and sorting for classification).
Architecture tackles and organizes the same instances of environmental mediators on
the scale of a community as an emergent result of individual interaction: through spatial
distribution pattern articulation it choreographs fluxes of matter-energy-information
as a result of the forces interacting with the milieu upon which it is based: environmental pressure, culture, patterns of social interaction, resources, etc. Steering away
from the mere concept of rigid protection and mitigation artifacts, it should become
a more sensible mediator for interconnected relationships in a complex field of environmental and cultural pressures, as well as a new dynamic subject itself, using form
as agency for environmental adaptation and construction of relevant (meaningful)
open sets of new relations. The relations that forge meaning in a reality aware of complexity are no longer tied to a formal repertoire made of a finite set of entities; rather, forms form themselves dynamically, evolving together with the effects of their own interactions: aesthetic judgment relates to form as agency and to how, depending on context and on the relations established with other interacting and exogenous forces that can be at work on it, meaning can change and differentiate.
In the course of Architectural Design 3 we searched for where environmental negotiation,
cultural pressures and body-space relations find their developmental landscapes. The
studio investigated, using environmental pressures as a breeding terrain, the aesthetics of environmental modulators, with the aim of promoting the exercise of students' personal sensibility in the face of the emergent potential within the complex relationships among form, process and performance; such sensibility was then applied to morphogenetic systems as models of integration for physiology, culture, social pattern and environment. The course investigated specific cases where the interconnected
environmental conditions and cultural pressures are the culture medium for systems
to thrive and grow onto. The ongoing processes and results are visible online on the
course blog (http://a3-tfr.blogspot.com/). The case-study presented here (figures 15 to
20) regards the design of a research center on aesthetics and nanotechnologies, connecting scientific research and the performative arts, providing strong interconnection and interaction environments, as well as exhibition and event spaces and a high porosity and connection among the various parts. The project is situated in a garden which works as connective tissue among three streets of different circulation relevance and activities such as the museum of modern art of Bologna (MAMbo), the historic cinema and film restoration center and a night club. Using sports car
Fig. 15-16
Winding Ridges - project developed in the Architectural Design 3 course, Faculty of Engineering Università di Bologna - prof. Alessio Erioli - group: 5core (Matteo Cominetti, Marco Magnani, Luca
Pedrielli, Gianluca Tabellini, Francesco Tizzani).
Fig. 17-20
Winding Ridges - project developed in the Architectural Design 3 course, Faculty of Engineering Università di Bologna - prof. Alessio Erioli - group: 5core (Matteo Cominetti, Marco Magnani, Luca
Pedrielli, Gianluca Tabellini, Francesco Tizzani).
prototypes as the environmental mediator upon which to base the morphogenetic study, a broad taxonomy of elements such as ridges, intakes and scoops provided the basic pool of morphology-performance-aesthetic relations to work on in order to design the morphogenetic system that colonized the territory, driven by environmental variables such as radiation, natural ventilation and terrain slope. The project aims to go beyond pure performative reaction to environmental stimuli, promoting itself as an autonomous, proactive ordering principle in which environmental pressures fuel a complex morphogenesis that channels performance through its own aesthetic reasons.
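The actual workflow is documented on the course blog; purely as a hypothetical sketch of the kind of logic described (agents colonizing a terrain grid under a weighted blend of environmental fields), the colonization loop might be caricatured as follows. Field values, weights and grid size are invented for illustration and are not taken from the project.

```python
import random

# Hypothetical sketch of an environment-driven colonization process:
# agents settle on the cells of a terrain grid where a weighted blend of
# environmental variables (radiation, ventilation, slope) is most favourable.

SIZE = 20
radiation   = [[random.random() for _ in range(SIZE)] for _ in range(SIZE)]
ventilation = [[random.random() for _ in range(SIZE)] for _ in range(SIZE)]
slope       = [[random.random() for _ in range(SIZE)] for _ in range(SIZE)]

def fitness(x, y):
    # Favour low radiation, high natural ventilation and gentle slope
    # (weights are arbitrary placeholders).
    return (1 - radiation[y][x]) * 0.4 + ventilation[y][x] * 0.4 + (1 - slope[y][x]) * 0.2

def colonize(n_agents=40):
    occupied = set()
    for _ in range(n_agents):
        # Each agent settles on the best still-free cell adjacent to the
        # colony (anywhere, for the first agent): growth by accretion.
        candidates = [(x, y) for x in range(SIZE) for y in range(SIZE)
                      if (x, y) not in occupied and
                         (not occupied or
                          any(abs(x - ox) + abs(y - oy) == 1 for ox, oy in occupied))]
        best = max(candidates, key=lambda c: fitness(*c))
        occupied.add(best)
    return occupied

print(sorted(colonize()))
```

A real workflow would of course replace the random fields with measured radiation, ventilation and slope maps and couple the fitness function to the taxonomy of ridges, intakes and scoops described above.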
References
BBC, 2010. Chemical computer that mimics neurons to be created. [online] available at <http://news.
bbc.co.uk/2/hi/8452196.stm> [accessed 11 january 2010].
Bäumel, S., 2010. (In)visible. [online] available at <http://www.sonjabaeumel.at/index.
php?content=im>
Cinzia, D.D. & Vittorio, G., 2009. Neuroaesthetics: a review. Current Opinion in Neurobiology, 19(6),
pp.682-687.
De Landa, M., 1997. A Thousand Years of Nonlinear History. New York: Zone Books.
DeusExOfficial, 2011. Deus Ex: The Eyeborg Documentary. [video online] available at <http://www.
youtube.com/watch?v=TW78wbN-WuU> [accessed 12 july 2011].
Ebesp. Alexander McQueen Spring Summer 1999 - Creating an art piece. [online] <http://www.
youtube.com/watch?v=reK0A1XIjKA> [accessed march 2010].
Fractal Ontology. Machinic Autopoiesis. [online] available at <http://fractalontology.wordpress.
com/2007/10/11/machinic-autopoesis/> [accessed 5 february 2011].
Fractal Ontology. Abstract Machine. [online] available at < http://fractalontology.wordpress.com/
category/deleuze-and-guattari/abstract-machine/ > [accessed 5 february 2011].
Kelly, K., 2010. What Technology Wants, Viking Adult.
Langer, S., 1953. Feeling and Form.
Medical Daily. Human Body Parts Growing in Jar May Save Lives. [online] available at <http://
www.medicaldaily.com/news/20111031/7717/man-made-body-parts-frankenstein-replacementorgans-synthetic-body-parts-synthetic-organs-trans.htm> [accessed October 31, 2011].
Metapedia. Cultural Hybridity Discussion, Fall 2009: week 8. [online] available at <http://www.
metapedia.com/wiki/index.php?title=Cultural_Hybridity_Discussion_Fall_2009%3A_Week_8>
[accessed march 2011].
NextNature, 2009. How do bacteria communicate. [online] available at <http://www.nextnature.
net/2009/04/how-do-bacteria-communicate/> [accessed 5 November 2010].
NextNature, 2011. System animals. [online] available at <http://www.nextnature.net/2011/04/
system-animals/> [accessed April 2011].
Reiser, J. and Umemoto, N., 2006. Atlas of Novel Tectonics 1st ed., Princeton Architectural Press.
Robinson, K. Changing education paradigms, 5:50-6:10. [video online] available at <http://www.
ted.com/talks/ken_robinson_changing_education_paradigms.html> [accessed 12 January 2011]
Science Daily, 2008. Humans Have Ten Times More Bacteria Than Human Cells: How Do Microbial Communities Affect Human Health?. [online] available at <http://www.sciencedaily.com/
releases/2008/06/080603085914.htm> [accessed February 2011].
Varela, F., Maturana, H. Mechanism and Biological Explanation. Philosophy of Science, Vol. 39, No.
3. (Sep., 1972), pp. 378-382.
Viollet-le-Duc, Eugène-Emmanuel. Dictionnaire raisonné du mobilier français de l’époque carlovingienne a la renaissance (Part 5 & 6), Paris, 1874. [online] available respectively at <http://digi.ub.uniheidelberg.de/diglit/viollet1874bd5/> and <http://digi.ub.uni-heidelberg.de/diglit/viollet1875>
[accessed February 2011].
Weinstock, M., 2010. The Architecture of Emergence: The Evolution of Form in Nature and Civilisation,
John Wiley & Sons.
Wheeler, John A., 1990. Information, physics, quantum: The search for links. Complexity, Entropy, and
the Physics of Information, Addison Wesley.
Wikipedia, 2010. Arquebus. [online] available at <http://en.wikipedia.org/wiki/Arquebus> [accessed February 2011].
Wikipedia, 2010. Jetpack. [online] available at <http://en.wikipedia.org/wiki/Jet_pack> [accessed
September 2011].
Wikipedia, 2010. Wet Computer. [online] available at <http://en.wikipedia.org/wiki/Wetware_computer> [accessed 10 November 2010].
Anastasios Tellios
School of Architecture
Aristotle University of Thessaloniki
Greece
Human Parameters
in Technology-Driven
Design Research:
Formal, Social
and Environmental Narratives
This paper attempts to address issues referring to the once again redefined relation between architecture and technology. As this relation grows deeper and becomes irreversible, a certain degree of intellectual as well as physical osmosis occurs
between the worlds of IT and architecture. Architecture absorbs the information
technology tools and methodologies and gradually begins to digest and internalize
them. The shift from humanities towards technology may be an evident outcome
of this osmosis. Proof of this underlying procedure is given and documented in the curricula of Schools of Architecture and relevant research institutes
around the world.
Architecture, though, has never - and should never have - resigned from being a space-investigating act. Its main task is the conception, design and production of space. The
definition of space in this context is that of space being the vessel of human activity
and social interaction.
There may have been a distinct 'feel-good' character to the values that have defined and followed architecture so far, such as permanence,
solidity, stability and clarity. These values have guided the architect’s practice almost
forever, but now they do not seem to be valid anymore. Furthermore, there is a certain light-heartedness among circles of criticism in completely renouncing them as flat
and narrow-minded. These values, though, have never been a flat projection of a similarly flat internal discussion. They have been theoretical platforms, proved and tested
for centuries. They were always closely tied to the existing technological niveau that prevailed and supported civilization at the time.
There is an evident theoretical frame of a general consensus and encouragement
of the development and enrichment of new technologies and advanced digital design techniques. This is the current state of the art for the education and training of
architects and this should be clearly stated here. The almost apologetic comment,
therefore, projected in the previous paragraph, may sound as if it resonates with a different
family of opinions and reflecting the way that architecture and the built space have
been treated in previous times. It is just as fair, though, for this kind of theoretical
opposition to be stated. Since technology constantly evolves and re-shapes the cultural construct as we know it in an unprecedented way, architecture inevitably has
to follow and re-adjust itself responsively. This kind of procedure has been happening forever and still is. What should be pointed out, though, is that when it comes to
architecture, it is happening with a certain phase difference, compared to the technological advances. This phase difference has to do with architecture’s special features.
These features are related to architecture’s apparent geometry, its mass and solidity,
its very physicality as an essential component of the built environment. Therefore, a certain amount of the criticism that this 'old school' approach is receiving is largely a projection of forces of intellectual inertia, and not an internal or intrinsic intellectual stubbornness of architecture. And it definitely doesn't imply any possible inadequacy of
the architectural intellect per se. Architecture just needs time to understand any new
status and adapt and comply with new rules set each time.
A new Understanding of Architecture: Complexity and Nature
What stands before us, as architects, is a promising, yet only almost new understanding of the space-defining procedure. This new understanding of the act of architecture
allows the re-emergence of a typical array of issues. These issues have belonged to
architecture's agenda for centuries. They address a repertoire of quasi-classical bipoles, such as old-new, natural-artificial, etc. The validity of these bipoles presupposes
also the validity of all the corresponding hybridization techniques between the poles
each time.
According to a certain line of thinkers, the various parameters of architecture can still
be successfully defined by G. Broadbent’s (Broadbent, 1974) four ‘deep structures’ as
an explanatory scheme for architecture. This set of factors consists of:
a. The functional factor (architecture as container of human activity)
b. The aesthetic factor (architecture as a symbolic object)
c. The ecological factor (architecture as the converter of natural resources)
d. The physical factor (architecture as interacting within the built world)
Among others, one can clearly underline the formal, the social and the environmental parameter. Previous definitions of architecture have considered these parameters as separate entities to be judged autonomously. The recent and simultaneous evolutions of architecture and IT have urged that the architectural discussion
addresses the architectural procedure as a whole. There is a strong unifying element
acting as a catalyst today. This unifying element ultimately describes the way architecture is discussed, designed, scripted and fabricated today, not necessarily following
this order.
This element is of a bifold character. It incorporates two substantial, core issues,
questions that once again emerge in the architectural discussion. The first question is
about how to handle complexity, and the second question has to do with what to do
with nature. This is a direct cross-reference back to the space-investigating task of architecture, as stated in the beginning. Once again, architecture has to deal with complexity and its relation with nature.
Complexity has been the object of architectural thinking various times in the past. In later years, following the structural waves of modernism, post-modern theorists as well as the avant-garde of the time attempted to decipher complexity and then incorporate it in architecture in a coherent manner. This has not always been done effectively, even though at the time it received a positive public welcome. Robert Venturi, during his mature, post-modern culmination, tried to address complexity when designing his 'Vanna Venturi House' (Venturi, 2002). He presented an obvious overlap of elements running throughout the design procedure,
from the organization of space to the elaboration of the plans and the articulation
of the elevations. Some decades later, Frank Gehry offered an alternative version of complexity in his iconic Guggenheim Museum in Bilbao. Against the underlying theoretical discussion of deconstruction at the time, Gehry presented an unprecedented spatial definition of a complex architecture. Both approaches have had a deep influence on architecture and they are already distinctive moments of modern architectural
history. As interpretations of complexity, characteristic of specific and defined architectural genres, they are both educated, eloquent and strict. Still, though, they seem
rather confined in their strictness and offer only a faceted image of complexity and its
dynamics.
Compared to these examples, natural systems that possess and incorporate complexity in their entirety act in a different way. The ancient city of Khara-Khoto, once a prosperous urban system, operated as a city for centuries. When the regional climate changed, the local ecology collapsed (Weinstock, 2010). Ecologies shifted and changed, and a more flexible natural system, more adaptive to complexity, prevailed. Sand dunes now cover the ancient city, providing a natural topography of waving sand structures. The mere naivety of this example of natural prevalence following
an almost Darwinian procedure projects a non-educated, yet efficient spatial action.
Research Agenda: Spatial Investigations and the Limits of Design
The intellectual frame placed above is one part of the agenda of a design studio at
the School of Architecture of Aristotle University of Thessaloniki (AUTh, Greece). The
other parts are related to advanced design tools and methodologies. The design studio is called '2S1 62: Spatial investigations and the limits of design'. A further series of
Diploma Projects and Diploma Theoretical Researches is being organized and implemented under this scheme.
The object of the research being conducted is related to the procedure of architectural design and the investigation of its origins and its limits. As stated in the studio's
curriculum, it attempts to connect architectural design, as a deep, creative procedure
with the broad fields of innovation, the study of structures (biological, technological,
etc.), the scientific observation (microscopic, molecular, macroscopic, etc.), as well as
other scientific and creative fields, using advanced technologies for digital design and
spatial representation. The aim is to understand the dynamics of space and its qualities, the challenge of established building schemes, the experimental process of complex and sometimes unexpected alternative functional programs. The purpose is the
final proposal of innovative spatial situations, through comprehensive, synthetic architectural narratives.
Particular emphasis is placed on encouraging the development of personal design
vocabularies, portraying flexibility and resourcefulness in responding to complex spatial requirements. The studio’s design direction can be coded in the following: the use
of advanced design techniques, the employment of spatial narratives and a complete
freedom of expressive means. The studio always tries to look at a ‘big picture’, with architecture being a part of a broader environment, physical and intellectual.
Within the research frame of a design studio routine, various projects seem to be
‘talking’ about all these aspects of the design through new IT tools. The ‘human’ as a
notion but also as a bodily incarnation of life is put under focus. The methodology for designing a 'living' piece of spatial arrangement, the social interaction of biologically-driven design elements and the adaptation of the human inside the physical environment are design approaches implemented within the frame of the design studio and
additional diploma-level design research.
Ultimately, the ‘human’ is related to a ‘living’ architecture. This living quality has
a certain connection with the human as an entity, its scale, its needs, its ethics let’s
say, its very human features. It is quite far, though, from metabolist approaches as
they have appeared in past decades and represented by important architects. What
is really investigated here is a new relation with the physical environment. The architect’s work is considered as a procedure articulated with the physical environment
and eventually being part of it. The physical environment does not stand outside the
frame. It is not a sterilized entity besides architecture but an organic element inside
the very core of architecture.
Three different design approaches will be presented. They address, respectively, a formal, a social and an environmental parameter. All three projects research by design the issues framed and described above. This is done through a specific trend of 'design speculation' using advanced tools and the use of spatial narratives to support the presented schemes.
‘Birth ponds, in_vitro/arch’
Design studio: 2S1 62, Spatial Investigations and the limits of design, 2010-2011
Student: Giorgos Tsakiridis
The project investigates the possibility of designing the ‘body’ of architecture, thus the
ability of a formal, spatial and structural language not just to imitate a physical (biological) body, but to generate a physical presence through the ‘reading’ of biology’s
generative techniques. The designed spatial device is not a smart skin or surface but an entire spatial organ. The project addresses the issue of geotropism, that is, the ability of plants to respond to gravity, and the cycles of life involved.
Fig. 1
Giorgos Tsakiridis, ‘Birth ponds, in_vitro/arch’, Design studio: 2S1 62, Spatial Investigations and the
limits of design, supervisor: A. Tellios, School of Architecture, Aristotle University of Thessaloniki,
2010-2011.
Architecture is considered as a parasite to nature and not vice versa. What is investigated here is the possibility for a form to evolve, mutate and grow, each time adjusting
itself to certain functional and spatial needs. Form-generating natural policies are replacing the sometimes infertile form-finding parametric techniques. The aim is to find
ways to incorporate biological techniques and mechanisms into architectural design
procedures.
The specific project is about the design of ephemeral constructions. Placed in
between the boundaries of science fiction and reality, it attempts to introduce the
‘living’ factor in architectural design. The creation of form emerges from the controlled development, growth and eventually deterioration of plant mass, the -genetically
mutated- cells of which are not able to follow the law of geotropism. By suggesting a
continuously changeable process, instead of a final outcome, it investigates the dynamics of space and its qualities, to the point where the significance of the 'cycle of
life’ is not only connected to the users along with their activities, but also to the ‘containers’ of life.
The designed product is not imitating a physical, biological body. Instead, the
project focuses on attaining and perhaps rationalizing a structural, morphological, formal genius found in nature. This genius is ultimately transcribed into architectural
design.
‘Spatial Cochleas’
Design studio: 2S1 62, Spatial Investigations and the limits of design, 2010-2011
Student: Alexandros Charidis
This project explores the potential of choreographing the behaviour of architecture through the social interaction of a spatial, biologically-driven device with the everyday lives of humans. The project uses the given space and architecture. It researches the capacity to create an open-source system, able to absorb and physically respond to social stimuli and re-adjust itself. The designed mechanism is closely
related to the behavior of people and elements of architecture. This is investigated
through the social interaction of a spatial, biologically-driven device with the everyday lives and routines of humans.
The scientific search of the project is initiated by the ability of the human auditory system to distinguish specific sound frequencies within a wide frequency range.
This property is explained by auditory psychophysics and theories of 'Auditory Scene
Analysis’ (Bregman, 1990). Using similar sets of theories, the protocol of a biotropic
organism is being articulated.
The Cochlea is a parasitic mechanism which represents the spatial mapping of frequencies inside the environment in which it resides and on which it 'feeds'. Through certain hierarchical procedures the mechanism evolves and allows for direct modification of functionality by its users through an 'open-source' language. The Cochlea interferes with the social habits of humans occupying space. It is present in their places of gathering and their passages. Moving and standing are characterized by different architectural qualities. The Cochlea reads these human habits, and their distinct frequencies. It then responds and re-adjusts its very physical presence and the size and
position of its elements.
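As a purely hypothetical illustration of the sensing loop described above (detecting a dominant frequency in ambient activity and mapping it onto the opening of one spatial element), a minimal sketch might look like the following; the sample rate, frequency band and mapping are invented and are not taken from the project.

```python
import numpy as np

# Hypothetical sketch of the 'reading' loop: a short ambient sound buffer is
# analysed for its dominant frequency, and that frequency is mapped onto the
# opening factor of one spatial element. All parameters are illustrative.

SAMPLE_RATE = 8000  # Hz, assumed

def dominant_frequency(signal):
    # Strongest spectral component of the buffer, in Hz.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE)
    return freqs[np.argmax(spectrum)]

def element_opening(freq, f_min=50.0, f_max=2000.0):
    # Map the detected frequency linearly onto a 0..1 opening factor.
    return (np.clip(freq, f_min, f_max) - f_min) / (f_max - f_min)

# Example: a 440 Hz 'gathering' hum produces a partially open element.
t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE)
buffer = np.sin(2 * np.pi * 440 * t)
f = dominant_frequency(buffer)
print(f"dominant ~ {f:.0f} Hz, opening factor = {element_opening(f):.2f}")
```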
Fig. 2
Alexandros Charidis, ‘Spatial Cochleas’, Design studio: 2S1 62, Spatial Investigations and the limits of
design, supervisor: A. Tellios, School of Architecture, Aristotle University of Thessaloniki, 2010-2011.
‘Inhabiting fragile territories: a little story for a little house’
Diploma design project, 2010-2011
Student: Stylianos Psaltis
This diploma design research project speculates about the generation of spatial hybrids, the fulfilment of a functional program and their adaptation to a physical surrounding. Natural elements are suggested to inspire the generation of key architectural components that will eventually lead to the composition of a new man-built environment and territory. The design scheme involves an essential symbiotic narrative.
This is the basic architectural concept of the ‘inhabitation’ of nature. This procedure is
supposed to enable and activate space so as to deliver and facilitate specific human
life and activity.
Initially this project was developed using actual tactile techniques, molding clay and plaster by hand. A certain speculation was conducted as to how to frame human, visceral wisdom, delivered instinctively onto matter by the movement of the hand's muscles while molding clay models, and by interacting with nature using
hand labor. Then the physical result was once again examined and re-articulated using digital tools. This allowed the speculation of a constantly evolving and expanding
formal condition, where apart from measurable elements, other nonmaterial qualities
were incorporated into the digital model.
This all happens within the boundaries of a speculative, interactive and non-linear relationship. As the architectural product emerges and is produced by the elements of the landscape, it is at the same time actively reproducing that very landscape and, to a certain extent, giving physical ‘birth’ to it. An organic ‘fight’ takes place between materials and human actions.
Fig. 3
Stylianos Psaltis, ‘Inhabiting fragile territories: a little story for a little house’, Diploma design project, supervisor: A. Tellios, School of Architecture, Aristotle University of Thessaloniki, 2010-2011.
Fig. 4
Stylianos Psaltis, ‘Inhabiting fragile territories: a little story for a little house’, Diploma design project, supervisor: A. Tellios, School of Architecture, Aristotle University of Thessaloniki, 2010-2011.
The project speculates on ways of directly domesticating nature and of ‘gently’ incorporating it to the maximum possible
extent. Human existence depends materially on the surrounding territory and at the
same time natural elements depend essentially on human action, craft and labor.
What is subconsciously investigated here is a series of notions so far only superficially touched upon by the discussion of ‘digital architecture’: the insuperable, the imaginary, the mysterious, the emotional, even the illogical, the experiential or the subconscious, and their possible projection within a virtual, digital space and world. Additionally, a fresh environmental agenda is hereby presented and supported, one that stands apart from existing ‘green’, and sometimes insufficient, commercial architectural solutions.
Conclusion
Technology-driven design research is already an established practice and the use of its mostly digital tools can be considered a given condition. Sometimes, though, strict obedience to tools, features and capabilities creates myopic and shortsighted design outcomes. This is why a formulated and popular digital Puritanism already surrounds these territories of architecture. The projects produced under this trend are sometimes too involved with an obviously controllable micro scale, far down the design procedure.
It may therefore appear necessary to take a step back, so as to be able to receive and comprehend a broader view of the design event. It may resemble a step backwards, but ultimately this stepping back is a systematic stepping sideways, or a constant habit of always moving a bit sideways so as to gain as many perspectives on the same subject as possible while still looking forwards. Despite the evident shift of focus from the field of humanities to that of technology, design research within the design studios still seems to pursue a personal approach, closely related to spatial narratives.
Inside these narratives, nature is not a decorative element of human habitation on earth; an intelligent environmental approach is therefore required (Tellios, 2011). The surrounding world is a complex set of elements, and so are the lives of humans; our social behavior inside designed architectural space therefore depends closely on decisions made by the architect. In this frame, technology offers the tools to essentially draw wisdom out of nature and to inspire a new formal presence for the built environment.
There is, once more, a re-invented cultural role to be identified within young researchers’ approaches to architecture. Some of these projects constitute intelligent, almost bodily incarnations of cultural and social needs and values, encouraging new versions of social life and interaction and leaving ever less room for sceptical approaches. The suggested architecture has the capacity to self-regulate while confronting the massive evolution of its essential design tools, thus constituting a concise, holistic spatial event.
References
Anestidis I., Siopidis I., Tellios A. (superv.), Deterministic Chaos: Emerging natural structures, Diploma
theoretical research, School of Architecture, Aristotle University of Thessaloniki, 2010-2011.
Bregman, A. S., Auditory Scene Analysis, The Perceptual Organization of Sound, Bradford Book, 1990.
Broadbent G., ‘The deep structures of architecture’, in ‘Design in Architecture’, Wiley and Sons, London, 1974.
Charidis A., Tellios, A. (superv.), ‘Spatial Cochleas’, Design studio: 2S1 62, Spatial Investigations and
the limits of design, School of Architecture, Aristotle University of Thessaloniki, 2010-2011.
Colletti M., AD: Exuberance: new virtuosity in contemporary architecture, Wiley, London, 2010.
Cruz, M. and Pike S., AD: Neoplasmatic design, Wiley, London, 2008.
Davies, C., Key houses of the twentieth century, W. W. Norton & Co Inc, New York, 2006.
Hensel M., Menges A. and Weinstock M., AD: Techniques and Technologies in Morphogenetic Design,
Wiley, London, 2006.
Lynn G., Animate Form, Princeton Architectural Press, New York, 1999.
Psaltis S., Tellios A. (superv.), ‘Inhabiting fragile territories: a little story for a little house’, Diploma design
project, School of Architecture, Aristotle University of Thessaloniki, 2010-2011.
Sheil B., AD: Protoarchitecture: Analogue and Digital Hybrids, Wiley, London, 2008.
Tellios A., Synecdoches: architecture, image, spatial representation, Epikentro, Thessaloniki, 2011.
Tsakiridis G., Tellios A. (superv.), ‘Birth ponds, in_vitro/arch’, Design studio: 2S1 62, Spatial Investigations and the limits of design, School of Architecture, Aristotle University of Thessaloniki, 2010-2011.
Venturi, R., Complexity and Contradiction in Architecture, The Museum of Modern Art, New York,
2002.
Weinstock M., The Architecture of Emergence: The Evolution of Form in Nature and Civilisation, Wiley,
London, 2010.
Xin Xia
Nimish Biloria
Faculty of Architecture
Delft University of Technology
The Netherlands
A 4EA Cognitive Approach
for Rethinking the Human
in Interactive Spaces
Introduction – From Interactive Space to Human Cognition and beyond
This research is initiated by the research question: How is it possible to physically and mentally involve humans as an intrinsic part of interactive architectural spaces, in order to formulate a mutual, feedback-based system within which interactive spaces serve as an extension of human cognition?
The paper examines the definition and characteristics of interaction and attempts to connect the disciplines of interactive architecture and the cognitive sciences, with a focus on the 4EA cognitive approach. Following the concepts of the 4EA school, which professes that the mind, the body and the world form an integrated interactive cognitive system, the paper focuses on the theoretical support of the Shared Circuits Model (SCM) and its application. It aims to apply the SCM to an interactive architectural space design case study that shares its key elements: dynamic information-processing loops, non-linearity, bottom-up emergence and unpredictability.
True Interaction or Rich Interaction
It has been more than 40 years since Myron Krueger developed “Glowflow” in 1969,
which offered visitors the possibility of modifying visual and sonic parameters by
means of pressure-sensitive sensors (Giannetti, 2004). Since then, rapidly developing technology (sensors, software, programs, actuators, manufacturing...) has turned more and more fictions into reality and, apart from enabling complex geometry-driven architectural and interior projects, has made it possible for a great number of so-called “interactive architecture” projects to be realized (Fox & Kemp, 2009). On careful observation, however, most of these projects operate as “active” or “reactive” systems rather than truly “interactive” ones. Many such “active” works fail to challenge and extend people’s understanding of the nature of the space, to stimulate meaning making, or to evoke emotions, and therefore miss the essence of human interaction, which could have been a significant contribution to developing human-cognition-driven architectural works.
The question of the characteristics of interaction and interactivity thus becomes important to investigate. Background research on the definition of interaction reveals interaction to be a bidirectional or even multi-directional process: interaction is a term used to describe the action that occurs when two or more objects have an effect upon one another. The generation of a two-way effect or dialogue, instead of a one-way causal effect or monologue, is the basic underlying principle of interaction. Interactivity as a phenomenon is akin to the degree of responsiveness, and is examined as a communication process in which each message is related to the previous messages exchanged, and to the relation of those messages to the messages preceding them. This implies that, in interactive and reactive response, the roles of the sender and the receiver are interchangeable with each subsequent message. A basic condition for interactivity to prevail is thus a coherent response from each communicant. In the field of architecture this understanding is reflected by architects such as Prof. Kas Oosterhuis, who states that “Interactive Architecture can be defined as the art of building relationships between built components and second, as building relations between people and built components” (Oosterhuis, 2006, p. 4).
Also, in their seminal book Interactive Architecture, Michael Fox and Miles Kemp state that “A truly interactive system is a multiple-loop system in which one enters into a conversation: a continual and constructive information exchange. As people interact with architecture, they should not be thought of as ‘users’ but instead as ‘participants’” (Fox & Kemp, 2009, p. 13). Interaction, therefore, should be a conversation.
One of the most interesting aspects of interaction, however, is its ability to shape experience, as well as to influence lifestyles and behaviors. Fox and Kemp state that “Architectural space can take advantage of an audience locally, regionally, and globally by re-conceptualizing the role that the physical environment plays in shaping the viewer’s experience” (2009, p. 138). According to Fox and Kemp, interactive architecture can create an enhanced spatial experience, and can create a dialogue amongst the inhabitants based on either satisfying an interpretation of goal states or creating a new emergent state based on ambiguous assumptions of desires (Fox & Kemp, 2009, p. 138). Last but not least, because of the bidirectional nature of interactivity, at a systems level it involves a careful negotiation between top-down and bottom-up processes, which results in emergent, unpredictable yet organized communication as an inherent feature. This produces an innately human-like communication rather than a mechanical input-output feedback loop.
In the contemporary context, owing to enhancements in technology that enable people to develop real-time communication with spatial environments, the agenda for architecture, and more specifically for identifying the intricacies of spatial adaptability and understanding them in terms of meaning making and emotional response, has escalated. Could an interactive space by its very nature be concerned with people’s experience and response, or in other words with human cognition? The research aim is thus geared towards using a cognitive approach to explore interactivity between architectural spaces and human beings, from the perspective of the participants within the space instead of that of the designer. The central idea behind the research revolves around the connection of interaction design and human cognition for a meaningful integration of multisensory technology into interactive spaces.
4EA Cognitive Approach
Before focusing on the 4EA cognitive approach, let us first briefly review what a cognitive approach is. Cognitive science is revolutionizing our understanding of the human mind by providing new accounts of human rationality and consciousness, perceptions, emotions, and desires (http://en.wikipedia.org/wiki/Cognitive_science). Cognitive psychology investigates internal mental processes such as problem solving, memory, language, attention and planning (http://en.wikipedia.org/wiki/Cognitive_psychology). Certain basic concepts of cognitive psychology could provide a logical grounding for interaction design. These include mental models, mapping, interface metaphors, and affordances.
Among the many branches of cognitive science there is a young one, different from other theories of cognition such as cognitivism, computationalism and Cartesian dualism (http://en.wikipedia.org/wiki/Enactivism_(psychology)), which is called the 4EA cognitive school. It emphasizes the importance of the involvement of the human body and human emotion in the process of human interaction with the environment. Body, mind and environment: these three key components and their interrelationship constitute interactive spaces as well as the 4EA cognitive approach.
Traditionally, philosophers and scientists of the mind have regarded perception as input from the world to the mind, action as output from the mind to the world, and cognition as a sandwiched piece in between. This is why Susan Hurley called it the “classical sandwich” (Hurley, 1998, p. 21). As a critique of this “classical sandwich” approach, the 4EA cognitive approach argues that cognition is Embodied, Embedded, Enactive, Extended and Affective. 4EA thinkers see the vast majority of cognition as the real-time interaction of a distributed and differential system composed of brain, body and world. The 4EA school of thought sees cognition as the direction of organismic action via the integration/resolution of differential fields immanent to extended/distributed/differential neuro-somatic-environmental systems (Protevi, 2010, p. 170). Cognitive activity therefore consists of an immediate, on-line interaction with the environment. Interactive architecture, owing to the real-time exchange of information between the user and the space, is thus an appropriate field to which the 4EA-based nature of cognitive activity can be applied. To this end, a concise understanding of the Embodied, Embedded, Enactive, Extended and Affective modes of cognition comprising the 4EA approach was examined:
1. Embodied cognition
Embodiment is required for multiple phenomena. In their book Handbook of Cognitive Science: An Embodied Approach, Calvo and Gomila argue that any living being must be realized in some form that can maintain itself in its environment(s). Action and interaction are constrained and enabled not only by the environment, but also by the specific forms of embodiment that must engage in those interactions (2008, p. 38). Embodied cognition emphasizes the body as one of the key points for the philosophical investigation of cognition. Experimental psychology and cognitive science have traditionally viewed the mind as an abstract information processor whose connections to the surrounding environment are of little importance. An embodied cognition viewpoint, by contrast, commits to the idea that “the mind must be understood in the context of its relationship to a physical body that interacts with the world” (Wilson 2002, p. 625).
2. Embedded cognition
Sometimes “embedded cognition” is replaced by the term “situated cognition”. It
states that intelligent behavior emerges out of the interplay between brain, body
and world. The world is not just the ‘play-ground’ on which the brain is acting.
Rather, brain, body and world are equally important factors in the explanation of
how particular intelligent behaviors come about in practice (http://en.wikipedia.
org/wiki/Embodied_embedded_cognition).
3. Enactive cognition
Enactive cognition emphasizes the emergent self-organization of organisms and of the human mind through interaction with their environment. It means emergence through action: new resources are generated in the process of acting.
4. Extended Cognition
The extended mind begins with the question “where does the mind stop and the rest of the world begin?” It emphasizes the active role of the environment in driving cognitive processes. The inspiration of extended cognition for the designer of an interactive space is that, no matter how complex the space is and how advanced the applied techniques are, the space is only a part of the cognitive system.
5. Affective Cognition
Recent affective neuroscience and psychology have reported that human affect and emotional experience play a significant and useful role in human learning and decision-making. In normal individuals, the covert biases related to previous
emotional experience of comparable situations assist the reasoning process and
facilitate the efficient processing of knowledge and logic necessary for conscious
decisions (Ahn and Picard, 2005).
Affective cognition unfolds in a social context between embodied subjects formed
by that context. But “context” is too static: there are multiple levels and time-scales
involved. That is, in de-personalizing affective cognition, we see bodies in concrete
situations act in real time with response capacities that have been generated over developmental time-scales as produced by multiple subjectivation practices in a distributed/differential social field. Thus a sense-making encounter, a de-personalizing case
of affective cognition, is an emergent functional structure, a resolution of a dynamic
differential field operating at multiple levels and different time-scales as those bodies
navigate the potentials for the formation of new assemblages (Protevi, 2010, p. 183).
The five elements of 4EA cognitive theory are closely related to each other, are often used in combination and frequently overlap. The term “embedded cognition” is often replaced by the term “situated cognition”. Embodied and embedded cognition are often mentioned together; there is even the term “embodied embedded cognition” (EEC). Some scholars focus on “embodied cognition” but also emphasize the relation between “thinking” and “doing”, which actually incorporates the concept of “enactive” cognition. Others work on the relation between action and emotion, which combines the domains of “enactive cognition” and “affective cognition”.
Our cognitive activity consists of immediate, on-line interaction with the environment. “For the enactive approach, cognition is embodied action. In a concrete and practical sense, a cognitive system is embodied to the extent to which its activity depends non-trivially on the body. This is close to expressing a tautology: cognition cannot but be embodied” (De Jaegher and Di Paolo, 2007). Ideas of enactive cognition link several themes centred around the role of life, self-organization, experience and the animate body in shaping cognition as an ongoing and situated activity (De Jaegher and Di Paolo, 2007).
The cognitive process refers to the interaction sequences within the micro and the macro scales of our cognitive system. A cognitive system was defined by Nakashima as a system of interactions between a human (agent) who has thoughts in mind and the surrounding environment: “S/he perceives something from the environment, has thoughts in mind and does something to the environment” (Bilda and Candy, p. 2). This means that cognition is embodied and enactive.
Shared Circuits Model (SCM)
Disagreeing with the “classical sandwich” conception of the mind (which regards perception as input from world to mind, action as output from mind to world, and cognition as sandwiched in between), Susan Hurley, together with Rodney Brooks, argued that perception and action share dynamic information-processing resources as embodied agents interact with their environments. Cognitive resources and structure can emerge, layer by layer, from informational dynamics, enabling both perception and action. Hurley developed the Shared Circuits Model (SCM), a five-layered heuristic model whose layers can be combined in various ways to frame specific hypotheses (Hurley, 2008, p. 1).
The SCM shows how information for important cognitive capacities of persons can have a foundation in the dynamic co-enabling of perception and action (Hurley, 2008). Specifically, materials for active perception can generate cognitively significant resources: the action/perception, self/world and actual/possible distinctions, among others. The SCM explains how imitation, deliberation, and mindreading can be enabled by sub-personal mechanisms of control, mirroring, and simulation. It connects shared informational dynamics for perception and action with shared informational dynamics for self and other, while also showing how the action/perception, self/other and actual/possible distinctions can be overlaid on these shared informational dynamics. The SCM is a flexible, heuristic model in five layers that can be combined in various ways to frame specific ontogenetic or phylogenetic hypotheses (Hurley, 2008, p. 1).
Fig. 1
SCM Layer 1 - Basic adaptive feedback control. SCM’s starting point is dynamic online motor control,
whereby an organism is closely attuned to its embedding environment through sensorimotor
feedback (Hurley, 2008, p.19).
Control depends on dynamic relations among inputs and outputs. Information about
inputs is not segregated from information about outputs; this blending of information is
preserved and extended in the informational dynamics of further layers. Perception and
action arise from and share this fundamental informational dynamics (Hurley 1998;
2001).
Fig. 2
SCM Layer 2: Simulative prediction of effects for improved control. Add online predictions of sensory feedback from ongoing motor output. Online predictive simulation improves instrumental control and provides information distinguishing action by the agent from perception of the world (Hurley, 2008, p.19).
In layer two, the biggest difference is that, in addition to the external feedback loop, there is also an internal feedback loop, which generates another input signal.
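To make the information dynamics of these first two layers more tangible, the following minimal Python sketch (our own illustrative paraphrase, with hypothetical names and coefficients, not Hurley's formal model) couples an external sensorimotor loop with an internal loop that predicts the sensory consequences of the agent's own output and feeds that prediction back as an additional, endogenous input:

```python
# Illustrative sketch of SCM layers 1-2 (hypothetical names and numbers,
# not Hurley's formal model): an external sensorimotor loop plus an
# internal predictive loop generating a second, endogenous input signal.

def predict_feedback(output):
    """Layer 2: internal forward model predicting the sensory effect of our own output."""
    return 0.9 * output            # assumed, simplified world model

def world_response(output):
    """The environment's actual response to the motor output (unknown to the agent)."""
    return 0.8 * output

def control_loop(target, steps=20):
    output, sensed = 0.0, 0.0
    for _ in range(steps):
        predicted = predict_feedback(output)   # internal feedback loop (layer 2)
        error = target - sensed                # external feedback loop (layer 1)
        # information about inputs and outputs is blended, not segregated:
        output += 0.5 * error + 0.1 * (sensed - predicted)
        sensed = world_response(output)        # new exogenous input from the world
    return sensed

print(round(control_loop(target=1.0), 3))      # settles close to the target
```

In layer 1 only the external loop would be present; layer 2 adds the comparison between predicted and sensed feedback, which is what allows the agent to distinguish its own action from perception of the world.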
Fig. 3
Layer 3: Mirroring for priming, emulation, and imitation. Mirroring reverses layer 2’s predictive associations, so that observing movements generates motor signals in the observer that tend to cause
similar movements. Various mirroring structures can enable various forms of copying, with various
functions. If mirroring preserves novel means/ends structure of observed actions, it can enable
imitative learning. But mirroring provides intersubjective information in a subpersonal “first-person
plural,” without distinction or inference between own and others’ similar acts (Hurley, 2008, p.19).
In layer three, the target is no longer clear, but there is richer exogenous input, which contains others’ similar acts or evoking objects.
Susan Hurley views cognition as embodied and situated (2008, p. 2). The same critique of the “classical sandwich” brings her closer to her philosophical ally Andy Clark, an embodied embedded cognition scholar. The SCM was developed under the concepts of 4EA cognition; its bottom-up emergence and dynamic information-processing loops are deepened by Hurley to a position that falls in line with 4EA cognitive theory.
Application of 4EA Approaches to Interactive Architecture
What can a 4EA cognitive approach contribute to interactive space studies, and what is its significance for this research undertaking? To answer these questions, three aspects are outlined.
Firstly, as mentioned earlier, a 4EA cognitive approach provides the designer with a wider view of interactive spaces and a more tolerant attitude while designing them. “The environment is part of the cognitive system” (Wilson, 2002, p. 629). Mind, body and the world together formulate an entire real-time emergent interactive system. When developing the “world”, how can we ignore the human body and the human mind? When talking about making connections between things and people, how can we isolate ourselves from the perspective of people and deal only with sensors, input-output information processing, programs and so on? Interactive space designers should therefore be aware that they are actually designing a part of the cognitive system, and that this part plays a role in the entire interaction sequence as a self-evolving, proactive system.
Secondly, existing 4EA research results and conclusions could be applied as guidelines in the design process. Studies of embodied cognition and the embodied mind show that the form of the human body largely determines the nature of the human mind. Aspects of the body shape all aspects of cognition, such as ideas, thoughts, concepts and categories. These aspects include the perceptual system, the intuitions that underlie the ability to move, activities and interactions with our environment, and the native understanding of the world that is built into the body and the brain (http://www.sccs.swarthmore.edu/users/08/ajb/tmve/wiki100k/docs/Embodied_philosophy.html).
Thirdly, we could use the structure of 4EA research models to define intricate information design models for interactive space. In this article we emphasize this re-appropriation of the SCM-based idea for a chosen interactive architecture case study.
Application of SCM to Interactive Architecture
Most applications of cognitive science and cognitive psychology adopt the common conception of perception and action as separate from, and peripheral to, central cognition, which is neither sufficient nor powerful enough for a dynamic and unpredictable environment. Although many other cognitive models exist, they fall under the “classical sandwich” approach, in which information processing happens as a linear input-output loop. In the case of interactive environments, which invariably have to deal with unpredictable situations and constant data exchange, a suitable methodology for developing a systemic understanding of cognition can be found in the Shared Circuits Model (SCM).
The SCM is intensely relevant to the nature of interactive spaces, and inspiring, mainly due to the following characteristics:
1. Dynamic information-processing loops
As mentioned earlier, true interaction contains dynamic multiple loops for information processing. According to Susan Hurley, in the SCM perception and action share dynamic information-processing resources as embodied agents interact with their environments (Hurley, 2008, p. 2). This quality, and the possibility of creating multi-loop scenarios for a dynamic space with constant information-exchange requirements, make it suitable for consideration.
2. Non-linearity
In his book A Thousand Years of Nonlinear History, Manuel De Landa states that “Attractors and bifurcations are features of any system in which the dynamics are not only far from equilibrium but also nonlinear, that is, in which there are strong mutual interactions (or feedback) between components” (p. 14). The internal looped information-exchange sequences within the SCM model itself replicate this non-linear mode of communication in real time, not only amongst its system components but also with the context within which it is embedded. In interactive architecture, too, componential interaction is the key to producing complex behaviors. The SCM and its looped system architecture could thus be a beneficial platform for such investigations.
3. Bottom-up emergence
Gordon Pask argues that “Rather than an environment that strictly interprets our desires, an environment should allow users to take a bottom-up role in configuring their surroundings in a malleable way without specific goals” (Interactive Architecture, p. 14). Very much in line with the idea of enactive cognition, Susan Hurley also argues that cognitive resources and structure can emerge bottom-up, layer by layer, from informational dynamics, enabling both perception and action (Hurley, 2008, p. 2). One of the aims of the SCM is to illustrate the philosophical view that embodied cognition can emerge from active perception, avoiding the “classical sandwich” architecture, which insulates central cognition from the world between twin buffers of perceptual input and behavioural output (Hurley, 2008, p. 12). Interactive architecture and its real-time information-processing aspect likewise necessitate the generation of bottom-up emergent spatial behavior for developing a bi-directional communication between space and subject. The SCM could thus be a suitable model to explore.
4. Unpredictability (the possible instead of the actual)
The SCM addresses the “how possibly?” rather than the “how actually?” question. It provides a higher-order theoretical model, but it also provides generic heuristic resources for framing specific first-order hypotheses and predictions about specific ontogenetic or phylogenetic stages. Its five layers can be re-ordered in formulating specific first-order hypotheses (Hurley, 2008, p. 3). Interaction design, its associated pro-active nature and thus the unpredictability of spatial adaptations (physical as well as ambient), as a by-product of spatial attempts to develop bi-directional communication, make the SCM and its structuring intriguing.
In order to illustrate the above-mentioned characteristics and to re-appropriate the SCM structure for interactive spaces, a case study analyzing the InteractiveWall, an interactive architecture project, is conducted. Hyperbody, Delft University of Technology, developed the InteractiveWall in 2009. It was commissioned by the German pneumatics company Festo for its presentation at the Hannover Messe 2009. For Hyperbody, the motivation for the development of interactive architecture is a response to the rising demand for programmable, multi-mediated and customisable environmental conditions in the digital age (Hosale and Kievid, 2010, p. 55).
Fig. 4
People interacting with the InteractiveWall. Copyright Festo AG & Co. KG, photos Walter Fogel.
The InteractiveWall is a collection of seven vertical wall-like components which can dynamically transform their physical and ambient behaviour in real time in a continuous interaction scenario. At its structural core, each wall component comprises a kinetic skeleton connected to two sensors, forty-eight channels of LED lights and embedded speakers. As soon as a person is detected by the embedded sensors approaching one of the wall panels, a real-time update is triggered: the panel physically bends in the opposite direction, coupled with a change in the lighting and sound pattern. The physical and ambient change triggered within one panel is communicated in real time to the neighboring panels, which in turn results in a multi-modal feedback loop. The seven installation components thus interact constantly with their context and generate emergent behaviors as a negotiation with the context, amongst each other, and with the users who visit the installation.
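As a rough illustration of this behaviour (and only that: all names, thresholds and coefficients below are hypothetical, not the actual Hyperbody/Festo control software), a Python sketch of one panel bending away from a detected visitor, coupling its light and sound output to the same stimulus and passing an attenuated signal to its neighbours could look as follows:

```python
# Hypothetical sketch of the multi-panel feedback idea; not the actual
# InteractiveWall control software.

class Panel:
    def __init__(self, index):
        self.index = index
        self.bend = 0.0        # degrees of physical bending
        self.light = 0.0       # light/sound intensity, 0..1
        self.neighbors = []    # adjacent Panel objects

    def sense(self, visitor_distance):
        """Closer visitors produce a stronger stimulus (assumed 3 m sensing range)."""
        return max(0.0, 1.0 - visitor_distance / 3.0)

    def react(self, visitor_distance):
        stimulus = self.sense(visitor_distance)
        self.bend = 30.0 * stimulus          # bend away from the visitor
        self.light = stimulus                # couple light/sound to the same stimulus
        for n in self.neighbors:             # propagate an attenuated signal,
            n.receive(stimulus * 0.5)        # producing the multi-modal feedback loop

    def receive(self, stimulus):
        # Each panel negotiates the incoming signal with its own current state.
        self.bend = max(self.bend, 15.0 * stimulus)
        self.light = max(self.light, stimulus)

panels = [Panel(i) for i in range(7)]
for a, b in zip(panels, panels[1:]):
    a.neighbors.append(b)
    b.neighbors.append(a)

panels[3].react(visitor_distance=1.0)        # a visitor approaches the middle panel
print([round(p.bend, 1) for p in panels])
```

Because each panel negotiates incoming signals against its own current state, the same approach can yield different overall configurations at different moments, which is exactly the point made in the next paragraph.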
One of the designers of the InteractiveWall explained that when the same participant approaches the same panel for the second time, the panel reacts differently, because the movement pattern of the neighboring panels at that point in time might differ from the last time. Internal data communication and its negotiation amongst the seven wall units are thus bound to vary. Apart from this, the system also has a memory (case-base) and thus remembers which physical and ambient adaptations have been actuated previously, just like two persons having a conversation on the same topic for the second time, who might speak about things differently because they remember what was mentioned and discussed last time. This learning ability, and the connected adaptation-based negotiations that maintain a sustained level of interest within the audience, present a key difference between interactive projects and active projects.
Fig. 5
One panel bending due to the approach of a participant. Copyright Festo AG & Co. KG, photos Walter Fogel.
Fig. 6
LED light patterns on the panels. Copyright Festo AG & Co. KG, photos Walter Fogel.
Taking the InteractiveWall project as an example, and extrapolating the information-structuring principles within layers 2 and 3 of the SCM, we designed a restructured SCM “map” which displays the information-processing loops for the case of the InteractiveWall (Fig. 7). Only layers 2 and 3 were of specific interest for this research, since layers 4 and 5 of the SCM address the off-line situation, in which the output is inhibited and the interaction does not happen in real time. The InteractiveWall, however, presents a real-time interaction scenario and thus these layers were not applicable. Layer 1 is already an inclusive part of layer 2, and thus the main emphasis was on the restructuring of layers 2 and 3. The restructured “map” has two main components: the upper circle, a combination of layers 2 and 3, representing the “self” of one participant of the InteractiveWall installation, and the bottom circle, which represents the living system of the InteractiveWall.
Fig. 7
Restructured Shared Circuits Model for the InteractiveWall project.
The elements constituting this restructured “map” are the following (a minimal code sketch of these loops is given after the list):
1. In the first circle there is no specific target for the participant, since in many interactive spaces, when participants are confronted with the space for the first time, they do not have a clear target of what they are going to do with this environment.
2. The exogenous input, namely the static or moving space, generates input signal 1 for the participant.
3. This input signal goes to the comparator and generates an output. Part of the output goes through the internal loops, by way of “mirroring, simulation enabling action understanding” or “predictive simulation of effects for improved control”, to subsequently become input signal 2. The other part of the output goes through the external feedback loop, becoming the participant’s own action, which provides a new input, input signal 3.
4. When there is more than one participant, others’ similar acts (the other participants with similar actions) or evoking objects also become input signal 4.
5. Looking at the bottom circle, meanwhile, the output of the upper circle which goes through the external feedback loop and becomes the participant’s own action provides input signal 1 of the living system of the InteractiveWall.
6. In the bottom circle there is a target. In the case of the InteractiveWall, the target is to attract people, to be surrounded by people. Input signal 1 goes to the comparator, which checks whether this target has been achieved and provides this reading as an output.
7. The output also goes through two loops: the internal feedback loop and the external feedback loop. The part that goes through the internal feedback loop becomes a new input signal to the living system, input signal 2. The output which goes through the external feedback loop becomes the new input signal(s) to the “self” of the participant, in the form of structural bending of the walls, changing light patterns and changing sound.
8. The other participants’ similar acts, or evoking objects, also send input signals to the living system of the installation.
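A minimal Python paraphrase of these eight steps (with hypothetical weights and names; it restates Fig. 7 rather than implementing the installation) can make the coupling of the two circles explicit:

```python
# Hypothetical paraphrase of the restructured SCM "map" (Fig. 7):
# the participant's "self" (upper circle) and the InteractiveWall's
# living system (bottom circle) exchange signals through external loops,
# while each also feeds part of its output back through an internal loop.

def participant_step(space_state, own_internal, others_acts):
    # inputs 1, 2 and 4: exogenous space state, internal loop, others' acts
    output = 0.5 * space_state + 0.3 * own_internal + 0.2 * others_acts
    internal = 0.5 * output   # internal loop: predictive simulation / mirroring (input 2)
    action = 0.5 * output     # external loop: the participant's own action (input 3 / wall input 1)
    return action, internal

def wall_step(participant_action, wall_internal, target=1.0):
    # the wall's target is to attract people; the comparator measures the shortfall
    error = target - participant_action
    output = 0.6 * error + 0.4 * wall_internal
    internal = 0.5 * output       # internal feedback loop of the living system (input 2)
    adaptation = 0.5 * output     # external loop: bending, light and sound back to the participant
    return adaptation, internal

space_state, p_internal, w_internal, others = 0.2, 0.0, 0.0, 0.1
for step in range(5):
    action, p_internal = participant_step(space_state, p_internal, others)
    space_state, w_internal = wall_step(action, w_internal)
    print(f"step {step}: participant action={action:.2f}, wall adaptation={space_state:.2f}")
```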
Conclusion
This restructured “map” is a first attempt to test the applicability of SCM-based cognitive models to interactive spaces. The inherent complexity of the model stems from the fact that the interactive installation has to deal with multiple participants and engulfs not only the user but also its own changing state as its immediate contextual interaction sources. The model can be further enhanced by connecting each physical and ambient output to cognitive modes of perception and emotion generation, and can thus become an intensive study for analyzing interactive spaces and their behavioral profiles.
Apart from analyzing existing projects, the SCM can also be developed as a precursor to the design development stages. Such models can precisely outline the underlying DNA, in other words the cognitive basis and its connection with each interaction routine, in a meaningful manner even before the actual prototyping phase begins. With such a “map”, designers can have a clear overview of how information is processed, generated and perceived through multiple dynamic loops between the interactive space and its participants. Apart from instilling a mutually participatory nature in an interactive space, this idea of evolving a real-time learning and responding system, which can trigger emergent interaction routines almost like a human being, can help enhance how people as well as spaces connect to each other. The notion of the subject-object relationship, and the perception that architectural spaces in essence serve as containers or shells for habitation, can thus be challenged by incorporating cognitive modes of understanding and designing interactions.
The attempts to understand the 4EA school of thought and to derive appropriate SCM-based meta-models for interactive architecture presented in this paper are the beginnings of an intensive investigation to intricately connect the disciplines of cognitive science and interactive architecture. We strongly believe that the two disciplines would benefit greatly from each other as regards how the human mind can perceive architectural space as a living/caring entity, and how architecture and interaction design as disciplines can benefit by intricately involving human cognition as the basis for generating real-time communicating spatial features as well as ambient characteristics. Further research within this framework will thus involve designing new SCM-based interaction models first, in order to realize in a bottom-up manner the connectivity between how we think, imagine and experience space. This will then be followed by physically building interactive spatial systems, both physical and digital, in order to analyze user interaction within such dynamic environments.
References
Giannetti, C., “Aesthetic Paradigms of Media Art“, Available through: Media, Art, Net website http://
www.medienkunstnetz.de/themes/aesthetics_of_the_digital/aesthetic_paradigms/ [Accessed 9
May 2011].
Calvo, P. and Gomila, T. (eds.) 2008. Handbook of Cognitive Science, An embodied Approach, Amsterdam; Boston; London: Elsevier Science.
Robbins, P. and Aydede, M. (eds.) 2009. The Cambridge handbook of situated cognition. Cambridge
University Press.
Hurley, S., 2008. “The shared circuits model (SCM): How control, mirroring, and simulation can
enable imitation, deliberation, and mindreading”, in Behavioral and Brain Sciences (2008) 31, 1–58.
Hurley, S., 1998. Consciousness in Action, Cambridge, Harvard University Press.
Oosterhuis, K. and Xia, X., (eds.) 2006. Interactive Architecture #1, Rotterdam: Episode Publisher.
Wilson, M., “Six views of embodied cognition”, in Psychonomic Bulletin & Review, 2002, 9 (4), 625-636.
Fox, M. and Kemp, M., 2009. Interactive Architecture, Princeton Architectural Press.
Protevi, J., 2011. “Deleuze and Wexler: Thinking Brain, Body, and Affect in Social Context”, in Hauptmann, D. and Neidich, W., (eds.) Cognitive Architecture. From Biopolitics to Noopolitics. Architecture & Mind in the Age of Communication and Information. 2011. 169-183.
Hosale, M., and Kievid, C., “Modulating Territories, Penetrating Boundaries”, in Footprint, 2010. 55-67.
De Jaegher, H. and Di Paolo, E., “Participatory Sense-Making, An Enactive Approach to Social Cognition”,
in Phenomenology and the Cognitive Sciences, 2007.
Bilda, Z., Candy, L., and Edmonds, E., 2007, “An embodied cognition framework for interactive
experience”, in CoDesign, Volume 3, Issue 2.
Clark, A., Being There: Putting Brain, Body and World Together Again. 1997, Cambridge, MA: MIT Press.
Ahn, H. and Picard, R., “Affective-Cognitive Learning and Decision Making: A Motivational Reward
Framework For Affective Agents”, 2005, in Tao, J., Tan, T., Picard, R. (Eds.) Affective Computing and
Intelligent Interaction. Beijing.
Stavros Vergopoulos
Dimitris Gourdoukis
School of Architecture
Aristotle University of Thessaloniki
Greece
Network Protocols /
Architectural Protocols:
Encoded Design Processes
in the Age of Control
Protocol is, of course, a word with a long history. It is of Greek origin and literally means ‘the first page’. It was initially used to describe a piece of paper attached to the beginning of a book, containing its summary. From that point on, the word took on several different meanings, in many cases having to do with a set of rules regulating specific or appropriate behaviors. However, for the scope of this article, we will focus on the meaning that the word has in the context of computer science. There, protocol “refers specifically to standards governing the implementation of specific technologies [...] it is a language that regulates flow, directs netspace, codes relationships, and connects life-forms.” (Galloway, 2004). To simplify, a protocol is a description of a set of procedures to be used when communicating; in particular, it refers to rules regulating the communication between computers. A protocol provides the necessary framework inside which different computers can exchange information.
The protocol “at large”
Before looking at an example of a protocol in the context of computer science, it might be useful to ‘zoom out’, in order to see how the idea of the protocol, stemming from computer science, becomes an important component of the way modern societies are organized. To achieve that, we will rely on the periodization of human history from the 18th century onwards proposed by Michel Foucault (1995, 1990) and extended by Gilles Deleuze (1997). Foucault names the societies up to the 18th century sovereign societies. With the French revolution we move to the discipline societies and finally, according to Deleuze this time, we are now in what he calls ‘control societies’ or ‘societies of control’. According to Deleuze, then, control is one of the essential concepts of modern societies. Following that idea, the present article revolves around the concept of control and the means to achieve it.
Deleuze (1997), expanding on the above periodization, associates each of the three societies with a specific kind of machine: simple machines for the sovereign societies, thermodynamic machines for the discipline societies and computers for the control societies. Protocol, according to Alexander Galloway (2004), comes to play the role of the management style of the control societies. In that sense protocol replaces, or has the same function as, hierarchy in the case of sovereign societies and bureaucracy in the case of discipline societies. Taking this thinking forward, there is also a different type of network associated with each of the three: centralized networks for the sovereign societies, decentralized networks for the discipline societies and distributed networks for the control societies.
So why is the protocol so important in that context, or how does it become the management style of our era? To answer that question, it is useful to keep in mind the three kinds of networks mentioned above and their visual representation, which can be seen in fig. 1 (Baran, 1962). In the diagram, the dots represent nodes while the lines represent communication routes. In the centralized network, there is a central node to which all other nodes are connected. The center commands, the rest obey; communication is achieved through sovereignty. In the decentralized network, we have multiple centers, but still a hierarchical organization. Discipline allows communication between the nodes, but there is still a linear transmission of the message from top to bottom, or from center to periphery, through the existence of several more localized centers. In distributed networks, however, each node can be virtually connected to any other node. But in order for this to be possible, all nodes need to share a common language. This is where protocol enters the picture: it is the framework necessary for the distributed system to exist. Consequently the protocol needs to be accepted by every node that wants to be part of the network.
Fig. 1
Baran, P. Illustrations of Centralized, Decentralized, and Distributed Networks, 1962 (Baran 1962).
Internet Protocols
With the above theoretical outline in mind, we are going to ‘zoom in’ again, in order to see an example of some of the most important protocols in use today: the internet protocols, which govern (in other words, exert control over) a large amount of our everyday activities.
“The Internet Protocol is designed for use in interconnected systems of packetswitched computer communication networks. […]The internet protocol provides for transmitting blocks of data called datagrams from sources to destinations, where sources and
destinations are hosts identified by fixed length addresses.” (Postel, 1981). Every communication on the internet – where ‘communication’ is any data exchange, in effect
any action that takes place online – is rendered possible through a bundle of several
protocols operating at different levels that facilitate the transfer of information on the internet. In brief: at the local level we have a web page containing text and images. These are marked up in the HTML (HyperText Mark-up Language) protocol. In turn, HTTP (HyperText Transfer Protocol) encapsulates this HTML object and allows it to be served by an internet host. Then, both client and host must abide by TCP (Transmission Control Protocol), which divides the data into small packets to be sent and recomposed. TCP is itself nested inside IP (Internet Protocol), which is in charge of actually moving the data packets from one machine to another. And finally the entire bundle is physically transmitted according to the rules of the physical medium itself.
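This nesting can be observed from any networked machine. As an illustration only (not a description of any specific implementation), the short Python sketch below writes an HTTP request by hand and hands it to a TCP socket; the operating system in turn wraps the TCP stream in IP packets and the physical medium carries the whole bundle:

```python
# Illustration of the protocol nesting: an HTTP request (application layer)
# written by hand and handed to a TCP socket (transport layer); the operating
# system wraps the TCP segments in IP packets, and the physical medium
# carries them.
import socket

host = "example.com"  # a generic example host
request = (
    "GET / HTTP/1.1\r\n"          # HTTP: the application-level protocol
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((host, 80)) as tcp:   # TCP connection, carried over IP
    tcp.sendall(request.encode("ascii"))
    response = b""
    while chunk := tcp.recv(4096):
        response += chunk

print(response.split(b"\r\n")[0].decode())  # status line, e.g. "HTTP/1.1 200 OK"
# The body of the response is the HTML object, itself structured
# according to the HTML mark-up rules.
```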
This formation of protocols lies behind the distributed character of the internet and is the necessary condition for each machine to be able to connect with any other machine in the network. Therefore, any node that wants to be part of the network needs
to accept the above described protocols: “The model of operation is that an internet
module resides in each host engaged in internet communication and in each gateway
that interconnects networks. These modules share common rules for interpreting address
fields and for fragmenting and assembling internet datagrams. In addition, these modules (especially in gateways) have procedures for making routing decisions and other
functions.” (Postel, 1981). Any node that does not accept the protocols does not exist on the net, while any node that abides by the protocols becomes of equal value with any other node. Such an organization works in contrast to more traditional, hierarchical modes of organization, where the routes that data transmission can follow are specifically defined. On the internet, data can take any possible direction and therefore potentially reach any point in the system. This horizontal organization, especially in the first years of the net, led to a certain kind of euphoria that looked at the internet as a medium of absolute freedom, a totally democratic medium able to transcend the limitations of our current societies, themselves defined by the strict hierarchies that we encounter in their ‘physical version’. As Alexander Galloway (2004) also points out in his book, this is a misconception, and it is important to see why: the TCP/IP protocol coexists with the Domain Name System (DNS), a large decentralized database that maps network names to network addresses. “The Domain Name Space and Resource Records,[...]
are specifications for a tree structured name space and data associated with the names.
Conceptually, each node and leaf of the domain name space tree names a set of information, and query operations are attempts to extract specific types of information from a
particular set.” (Mockapetris, 1987). At the top of the tree structure are the so-called roots, which are specific centralized servers: “There are over a dozen root servers located around the world in places like Japan and Europe, as well as in several U.S. locations.” (Galloway, 2004, p. 9). Those roots then branch into the different domains, which branch again in turn, in essence following the structure of the familiar web address read from right to left.
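From the client side the traversal of this tree is usually hidden behind a single library call; a minimal Python illustration (the host name below is just an example) is:

```python
# Minimal illustration of a DNS lookup: the resolver walks the name-space tree
# (root -> "org" -> "example") and returns the network address mapped to the name.
import socket

name = "example.org"
address = socket.gethostbyname(name)    # the stub resolver delegates the tree traversal
print(name, "->", address)

# Reading the written address from right to left mirrors the tree structure:
# (root) -> "org" -> "example"
print(list(reversed(name.split("."))))  # ['org', 'example']
```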
TCP/IP is the result of the action of autonomous agents. The DNS is the result of a predefined structure. The two have to function together in order for the internet to work: “To grasp ‘protocol’ is to grasp the technical and the political dynamics of TCP/IP and DNS at the same time.” (Thacker, 2004). It is through the hierarchical form of the DNS that one gains access to the horizontal organization of TCP/IP. Strict hierarchy has to coexist with the distributed nature of the web in order for the whole system to function. Horizontal control is always subject to the vertical control applied by the DNS.
Characteristics
Alexander Galloway (2004, p. 82) points out seven basic characteristics of protocol: it is a system of distributed management; it facilitates peer-to-peer relationships between autonomous entities; it is anti-hierarchy and anti-authority; it engenders localized, not centralized, decision making; it is robust, flexible and universal; it can accommodate massive contingency; and it is the outcome (not the antecedent) of distributed behaviour.
Those characteristics refer to the operating level of the protocol and are necessary in order to understand its nature. However, in order to start thinking about protocols in the context of architecture, it might be useful to also locate the characteristics of the protocol at an ontological level. Therefore, for a set of procedures to be eligible to be characterized as a protocol, the following characteristics also need to be present:
1. Code. A protocol needs to be encoded, and it needs to be so in a specific language that can be understood or translated if necessary. As mentioned before, the protocol gains its importance from the fact that it provides a common language for all nodes so that they can communicate with each other. That common language is achieved through code.
2. Automation. A protocol needs to be executed automatically in order for the whole process to take place. There cannot be decisions, interpretations or doubts ‘on the fly’ when the protocol is executed. The protocol has to be made out of predefined routines that get executed when specific conditions are met.
3. Machines. As a result of (1) and (2), there is a need for a machine that guarantees the correct reading of the code and its automatic execution. Machinic implementation of the protocol guarantees that it will always be executed in the same way, without deviations or exceptions.
All the above characteristics, both operational and ontological, render the protocol the outline that may give rise to bottom-up, self-organized processes. Since protocols allow communication between autonomous agents at will, they provide the framework for bottom-up processes and self-organization. And through self-organization we may arrive at the emergence of specific behaviors or patterns, behaviors or patterns that were not predicted by the protocol and were not its intention. An unlimited field of possibilities arises that needs to be explored.
With all the above in mind, what might an architectural protocol be, and why should we try to understand what an architectural protocol could be in the first place? Trying to answer that question, we are going to look at three examples, the first two being older architectural projects, the last one a contemporary field in architectural computation.
The Generator
The Generator was a project designed by Cedric Price, in collaboration with John Frazer, from 1976 to 1979. The project, which was never built, was a design for a retreat center at the White Oak Plantation on the coastal Georgia-Florida border. It consists of 150 12x12’ cubes distributed on a grid system, along with bridges, catwalks, glazing and sliding glass doors. The cubes and all the other elements of the composition were moveable.
Fig. 2
Price, C. Generator 1979. Drawing (Price 1984).
A crane was also in place that was able to move the cubes according
to the requests of the residents (Price, 1984). But “what made Generator different was
that it had some rudimentary self-awareness and knowledge. Its architecture was machine readable.” (Spiller 2008, p. 208). In other words, Price and his team designed a computer program that controlled the way the building would function. In Frazer’s (1995, p. 9) words: “…we produced a computer program to organize the
layout of the site in response to program requirements, and in addition suggested that a
single-chip micro-processor should be embedded in every component of the building, to
make it the controlling processor. This would have created an ‘intelligent’ building which
controlled its own organization in response to use. If not changed, the building would
have become ‘bored’ and proposed alternative arrangements for evaluation, learning
how to improve its own organization on the basis of this experience”. Each of the microprocessors communicated the position of its cube to the computer, which in turn used that information as the input for the program it was running. The program had three main routines: (1) keeping an archive of the already used configurations; (2) helping the users with the design of new configurations; (3) proposing new configurations by itself when the setting remained unchanged for a long period of time.
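The following Python sketch is a purely hypothetical reconstruction of these three routines as described above; it is not Price's and Frazer's original program, and all names and numbers are illustrative:

```python
# Hypothetical reconstruction of the Generator's three routines;
# not the original Price/Frazer program.
import random

archive = []            # routine 1: archive of already-used configurations
current = tuple(sorted(random.sample(range(150), 20)))   # occupied grid positions
unchanged_steps = 0

def record(configuration):
    """Routine 1: keep an archive of every configuration that has been used."""
    archive.append(configuration)

def assist_user(requested_change):
    """Routine 2: help users design a configuration (here: simply apply the request)."""
    global current, unchanged_steps
    current = requested_change
    unchanged_steps = 0
    record(current)

def maybe_get_bored(threshold=10):
    """Routine 3: if nothing changes for too long, propose a new layout autonomously."""
    global current, unchanged_steps
    unchanged_steps += 1
    if unchanged_steps > threshold:
        current = tuple(sorted(random.sample(range(150), 20)))
        unchanged_steps = 0
        record(current)
        return True
    return False

record(current)
for day in range(30):           # with no user input, boredom eventually triggers proposals
    if maybe_get_bored():
        print(f"day {day}: bored - new configuration proposed")
print(f"{len(archive)} configurations archived")
```

The third routine is what keeps the process running automatically, which is the point taken up in the next paragraph.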
In the context described in this article we can see the software that the Generator’s computer would have run (the code), augmented by the microprocessors and the crane (the machines), as a protocol regulating the relations between the users and the cubes, where the last of the above-mentioned routines guarantees the automation of the process: even if the users do not change the configuration, the program will do it by itself, making sure that the whole process keeps running. It is this protocol that produces the building, and it does so over time. A new ‘building’ is produced each time the users or the computer decide to change the configuration of the cubes, according to the behaviors that the protocol allows.
Fig. 3
Price, C. Generator 1979. Model Photographs (Price 1984).
Fig. 4
Price, C. Generator 1979. Drawing (Price 1984).
The Flatwriter
The Flatwriter is a project developed by Yona Friedman in 1970. It was “conceived
but not executed for the world’s fair at Osaka [and it] allows the individual to select and
print out his future housing preferences. He can locate his dwelling within a given infrastructure of services and be warned of the possible consequences of his decisions”
(Friedman, 2006, p. 129). The Flatwriter was a machine, in the shape of a typewriter,
that was intended to be an interface used by a prospective owner of a house in order to design - or better: decide - the exact configuration of that house. The first
part of the process was involving the perspective owner using the 53 keys of the
‘typewriter’ in order to decide the configuration of the spaces in his future house.
For every plan only 9 keys had to be used, thus allowing millions of different configurations through the 53 available keys. After that first phase the Flatwriter printed a plan of the configured space, and the user moved on to a second keyboard where “he had to indicate how often he is in the habit of going to his different rooms” (Friedman, 2006, p. 132). In other words, the user provided the machine with input concerning the way that he or she planned to use the house.
At this point the Flatwriter gave information back to the user. The software installed on the machine checked for possible problems arising from the comparison of the created plan (in the first keyboard) and the user’s habits (entered in the second keyboard). The user was able to reconfigure his space according to the feedback that he received. Then, on a TV screen, the user decided where to place his house on the existing infrastructure grid. The Flatwriter
again provided feedback concerning the problems that the placement might create.
Fig. 5
Friedman, Y. Flatwriter 1970: keyboard layout (Friedman 2006, p. 131).
At this point the software compared the placement decided by the user against the general configuration of the settlement (the houses already placed by other users) and provided feedback both to the new user and to those who already owned a house in the same area. The software also made sure that evaluations of the new house against all the necessary parameters, like access, natural light, natural ventilation, orientation etc., were put forward. Through that process
a solution accepted by both the software and the user was reached and the flat could be constructed. Friedman suggests that the Flatwriter could also take care of all the necessary permits for the construction of the house, thus automating the whole process, from design to construction.
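The feedback loop just described can be illustrated with a deliberately simple Python sketch. The key assignments, rules and data structures below are invented for the purpose of illustration; Friedman’s actual key codes and evaluation criteria are not reproduced here.

# Hypothetical sketch of the Flatwriter's feedback loop; rooms, rules and
# coordinates are illustrative assumptions, not Friedman's specification.

def check_plan(plan, habits):
    """Compare the keyed-in plan with the declared habits and return warnings."""
    warnings = []
    for room, visits_per_day in habits.items():
        if room not in plan:
            warnings.append(f"You visit the {room} {visits_per_day}x/day but did not include it.")
    return warnings

def check_placement(position, settlement, plan):
    """Compare a chosen position against the houses already placed by other users."""
    warnings = []
    if position in settlement:
        warnings.append("Position already occupied by another resident.")
    neighbours = [p for p in settlement if abs(p[0] - position[0]) + abs(p[1] - position[1]) == 1]
    if len(neighbours) >= 3 and "terrace" in plan:
        warnings.append("Little natural light for the terrace: three or more adjacent neighbours.")
    return warnings

# One pass of the loop: design, declare habits, revise, then place on the grid.
plan = ["kitchen", "bedroom", "living room"]                  # first keyboard
habits = {"kitchen": 6, "bedroom": 2, "bathroom": 4}          # second keyboard
for w in check_plan(plan, habits):
    print("Flatwriter:", w)                                   # e.g. a missing bathroom
plan.append("bathroom")                                       # the user revises the plan
settlement = {(3, 4), (3, 5), (4, 4)}                         # houses placed by other users
for w in check_placement((3, 3), settlement, plan):
    print("Flatwriter:", w)

As in Friedman’s description, the machine never designs on its own: it only evaluates and warns, and the loop continues until user and software converge on an acceptable solution.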
Friedman, in the Flatwriter project, even though most of it remained on paper, envisioned a machine that would create the necessary framework for all the possible
relations needed in the design process for a house to be accommodated. Through
the Flatwriter communication becomes possible between all the people involved
(future residents, neighbors, authorities, constructors) while all the necessary considerations are taken into account. The software (code) makes sure that the house
will fulfill all the requirements that a house should fulfill, while the typewriter (machine) ensures that the process will be unobstructed by checking for all
those requirements (automation). Therefore, we could also understand the way
that the Flatwriter functions as a kind of architectural protocol, accommodating and automating all the relationships present in the process leading to the design of a
house.
Fig. 6
Friedman, Y. Flatwriter 1970: organizational diagram (Friedman 2006, p. 131).
Fig. 7
Friedman, Y. Flatwriter 1970: diagram (Friedman 2006, p. 132).
Information Modeling
While the two previous examples illustrate two - historical - cases that might give us
some clues as to what an architectural protocol could be, contemporary architectural
practice - as a practice that operates within a protocologically controlled society - already has its own protocols in place. When, during the ‘90s, architectural academia
started to experiment with animation techniques using software developed for the film industry, software targeted specifically at the building industry and concerned with construction management was being developed at the same time. The above developments, in academia and in practice, were happening almost completely independently. Meanwhile, scripting (lower-level computer programming) became more and more
prominent in architectural experimentations while more and more experimental designs started to exit the virtual space of the computer in order to be constructed. That
led to the convergence of the two different directions. Today, Building Information Modeling packages can be employed in the design process from the very beginning and therefore ensure a much greater integration between conceptual design and its execution.
According to the US National Institute of Building Science “BIM refers to the use of
the concepts and practices of open and interoperable information exchanges, emerging
technologies, new business structures and influencing the re-engineering of processes in
ways that dramatically reduce multiple forms of waste in the building industry.” (Garber
2009, p. 9). In essence, BIM is a set of protocols that accommodates the communication between (a) databases concerning all the different quantitative aspects of a design project - from environmental data and program requirements all the way to materials and cost estimations - and (b) all the different people involved in the design process - from the architects and the engineers to the owners and the constructors. Therefore BIM is a tool that offers control over the large network of nodes that constitutes the architectural process, starting from conceptual design and ending with the actual construction. It provides the necessary common language
for all those nodes to communicate and exchange information in a streamlined and
seamless way.
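The role of BIM as a common language can be illustrated, in a deliberately simplified way, by a toy shared model that several actors read and update. The element fields, unit rates and function names below are invented for the example and do not correspond to any actual BIM package or schema.

# Toy illustration of BIM as a shared, structured model that different
# actors read and update; all fields and rates are invented for the example.

building_model = {
    "wall-01": {"type": "wall", "area_m2": 24.0, "material": "concrete", "u_value": 0.35},
    "win-07":  {"type": "window", "area_m2": 3.6, "material": "double glazing", "u_value": 1.1},
}

RATES = {"concrete": 90, "double glazing": 320}   # invented unit costs per m2

def architect_updates(model):
    # The architect enlarges a window; the change is made once, in the shared model.
    model["win-07"]["area_m2"] = 5.2

def engineer_checks(model, u_limit=1.2):
    # The engineer reads the same model to verify thermal performance.
    return [eid for eid, e in model.items() if e["u_value"] > u_limit]

def estimator_costs(model):
    # The cost estimator prices the same elements without re-measuring anything.
    return sum(e["area_m2"] * RATES.get(e["material"], 0) for e in model.values())

architect_updates(building_model)
print("Elements failing the U-value limit:", engineer_checks(building_model))
print("Estimated cost:", estimator_costs(building_model))

Every node works on the same structured data, which is precisely where the streamlining - and, as argued below, the question of who defines that structure - comes from.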
If we think about the three different kinds of networks described in the first part
of this article, one might think that a distributed one would be a logical result of the
interconnection of all the different nodes of the architectural practice. It seems reasonable that any of the parameters of the design process should be able to be connected to any other. One could argue that BIM offers this possibility. In that sense a
look at figure 8 might prove a little confusing. The image is taken from the latest Autodesk Revit promotional brochure. This diagram of course represents a clearly centralized system, with the software - BIM - at the center. One could argue that the diagram is simplified for the sake of clarity; however, at this point it might be useful to
think of the different levels of control that might exist.
In a recent issue of the Architectural Design journal dedicated to BIM, Chuck Eastman
(2009, p. 53) describes how preliminary designs for courthouses in the US can now be reviewed through building information models rather than traditional drawings.
He notes: “The concept design can be generated using any of the GSA-approved BIM design tools. Currently these include Revit, Bentley Architecture and ArchiCAD.” While such a
process might be much more efficient than the traditional one, it also renders specific
software packages not as tools that are able to help, but as tools that are required in
order to design. Similarly we read in the Revit’s promotional brochure: “Autodesk® 3ds
Max® Design software can be used to conduct indoor lighting analysis in support of LEED®
8.1 certification.” (Autodesk 2011). Protocological thinking needs to go through standardization, the common language necessary for all the nodes to communicate.
Fig. 8
Autodesk Diagram explaining Autodesk Revit’s function (Autodesk 2011).
However, through standardization, specific software packages - and therefore specific corporations - are gaining control over the design process. Therefore, the diagram from
the Autodesk Revit brochure might be telling the truth: it is the software that is at the center, that is in control.
It is true that BIM software gives the architect control over the local parameters that he has to work with. But at the same time the architect is giving part of the control over the overall design process away to the creators of the software. At this point things become more serious, and it might be useful to think of the example of the internet protocols: TCP/IP - a horizontal protocol - coexisting with DNS - a vertical, hierarchical protocol. Any node might be able to communicate with any other node, but only through a specific hierarchy predefined by others. It might be illuminating to contrast the protocols we see in Cedric Price’s and Yona Friedman’s examples with the BIM protocols. In the first two cases the architect is the creator of the protocol in order to control the design process and its outcome. In the latter case the architect is the user of protocols that have been created by others; therefore control is passing over to them. “The contradiction at the heart of the protocol is that it has
to standardize in order to liberate. It has to be fascistic and unilateral in order to be utopian” (Galloway, 2004, p. 95).
The BIM example is extremely useful in order to understand that: (a) architects are already making extensive use of protocols in their everyday practice; (b) protocological approaches do not necessarily result in horizontal processes and bottom-up modes of organization. As with the internet, what initially looks liberating might be hiding several drawbacks; (c) architectural protocols raise a number of important issues that are in essence political. As is always the case with control, there are questions of power and authority at play, which need to be considered and examined.
Four possible Directions
Believing that protocols might provide the means to develop novel architectural processes, organized from the bottom up, that will favor diversity and differentiation
over standardization, we propose four possible directions that architecture might take
in its encounter with protocological logic and which need to be studied further. One
would be the development of custom protocols: much in the spirit of the Generator and the Flatwriter, protocols that are project-specific, created by the architect in order
to accommodate the needs that he encounters each time. A second approach would
be the introduction of ambiguity in the protocols: aesthetics, political agendas, historical information and any other kind of non-quantitative data might be another way to
produce highly differentiated results through protocols. A third direction would be
the exploitation - or hacking - of the existing, corporate protocols. Since architects are
not programmers, and they cannot create tools as efficient as large corporations can,
they can use those tools in ways different from those for which they were designed, thus exploiting them. A last option could be the use of collectively developed protocols: in other words, open source software, developed at large by the community
that will use it, which will not follow a centralized direction but will instead follow a
route that will emerge out of the interaction of its own users. Of course, those four directions need to be studied thoroughly in order to better understand their potentials
and characteristics. However, they seem to be promising in several ways.
Rethinking the Human
Closing this article we will try to consider the position of the human in a protocologically driven society, and consequently in a protocologically driven architecture. For that, it might be useful to go back to the periodization described at the beginning of the article, as developed by Michel Foucault and Gilles Deleuze. Each one of the three societies had a central, conceptual figure around which it was organized. Sovereign societies revolved around the concept of God: a supreme power that justified the strict, top-to-bottom hierarchy of the organizational scheme. At some point, as Nietzsche has very eloquently shown, “God died” and was replaced by man. The concept of man, as supported by the humanistic principles that were developed after the French Revolution, replaced that of God and became again the reason - or the alibi - behind the decentralized organization of the societies. So what could be the entity
that replaces the human in our contemporary, ‘distributed’ societies? It might be adequate to just change the number of the subject in order to introduce the multiplicity
that distributed networks impose. It is the humans - in plural and understood more as
a multiple entity - that is taking the place of man. Or better, what Antonio Negri and Michael Hardt (2005) call the ‘multitude’: a concept taken from Spinoza that might better describe an ‘organism’ in which each of its ‘cells’, and their accumulation, are of equal importance, producing a new entity with characteristics that emerge out of their collective behavior.
Bibliography
Autodesk, 2011. Revit Architecture Overview Brochure. Available at: http://www.cadac.com/nl/brochures/Documents/revit_architecture_overview_brochure_a4.pdf [Accessed November 1, 2011].
Baran, P., 1962. On Distributed Communications Networks. Available at: www.rand.org/pubs/
papers/2005/P2626.pdf [Accessed October 3, 2011].
Deleuze, G., 1997. Negotiations 1972-1990, New York: Columbia University Press.
Eastman, C., 2009. Automated Assessment of Early Concept Designs. Architectural Design: Closing
the Gap, 79(2).
Foucault, M., 1995. Discipline & Punish: The Birth of the Prison 2nd ed., Vintage.
Foucault, M., 1990. The History of Sexuality, Vol. 1: An Introduction, Vintage.
Frazer, J., 1995. An Evolutionary Architecture, London: Architectural Association.
Friedman, Y., 2006. Yona Friedman: Pro Domo, Actar.
Galloway, A.R., 2004. Protocol: How Control Exists After Decentralization, Cambridge, Mass: MIT Press.
Garber, R., 2009. Optimisation Stories. The Impact of Building Information Modelling on Contemporary Design Practice. Architectural Design: Closing the Gap, 79(2).
Hardt, M. & Negri, A., 2005. Multitude: War and Democracy in the Age of Empire, New York: Penguin
Books.
Mockapetris, P.V., 1987. Domain names - concepts and facilities. Available at: http://tools.ietf.org/
html/rfc1034 [Accessed November 14, 2011].
Postel, J., 1981. Internet Protocol. Available at: http://tools.ietf.org/html/rfc791 [Accessed November 8, 2011].
Price, C., 1984. Cedric Price, London: Architectural Association.
Spiller, N., 2008. Visionary Architecture: Blueprints of the Modern Imagination, Thames & Hudson.
Thacker, E., 2004. Foreword: Protocol Is as Protocol Does. In Protocol: How Control Exists After Decentralization. Cambridge, Mass: MIT Press.
Socratis Yiannoudes
Department of Architecture
Technical University of Crete
Greece
From Machines
to Machinic Assemblages:
a Conceptual Distinction
between two kinds
of Adaptive Computationally-driven
Architecture
Since the 1990s, research groups, courses and workshops in schools of architecture
have been working on the design and potential implementation of adaptive computationally-driven architecture. This kind of architecture involves physical structures
able, in theory, to adapt to constantly changing needs and environmental conditions,
through the use of kinetic mechanisms and embedded computation (wireless networks, micro-processors and sensor-actuator effectors) (Fox, n.d.; Fox and Yeh, n.d.;
Fox, 2010). The term “adaptive computationally-driven architecture”, however, would
also refer to the so-called Intelligent Environments, the applications of the field of Ambient Intelligence (AmI), which has also been a developing research project since the
1990s, albeit outside the architecture discipline. Intelligent environments are spaces
able to adapt autonomously and proactively to personalized needs, as well as changes of the habits of their occupants through the use of ubiquitous computing, i.e. environmental information feedback, activity recognition / detection and learning, memory and proactive anticipation mechanisms (Monekosso, Paolo and Yoshinori, 2008).
The ideas behind the above developments and practices are not new, since they
can be traced back to the visionary projects of the ‘50s and ‘60s architectural avantgarde; projects such as Cedric Price’s Fun Palace, Archigram’s Living 1990 [fig.1] and
Control and Choice Dwelling as well as Constant’s New Babylon, characterized by an
obsession with technology, systems, interfaces, responsiveness and indeterminacy,
were imagined as paradigms of a flexible, transformable, self-regulating and adaptive
architecture. These projects were inspired by the emerging new science of cybernetics, the founder of which, Norbert Wiener, in his book The Human Use of Human Beings
(1950), described how information feedback was central to the creation of environmentally responsive machines (Hughes, 2000, p. 98). Architecture as a cybernetic system could thus, in theory, be changed at will by its inhabitants, in order to “adapt
to the changing desires of the human communities that inhabit it” (Colquhoun, 2002, pp.
225-226).
Yet, although the vision was for a really responsive and adaptive architecture, a
close examination of current practices of adaptive architecture inevitably leads to confusion as to the degree of adaptation they are capable of. Looking at those kinetic
Fig. 1
structures discussed in William Zuk and Roger Clark’s book Kinetic Architecture (1970)1,
as well as at recent computationally-driven structures, such as those constructed by
the Hyperbody Research Group at TUDelft led by Kas Oosterhuis [fig.2], we sense that
their adaptability is limited by the range of the transformational constraints of their
structure and components. Furthermore, most applications of intelligent environments seem to be able to adapt to a prescribed range of human activities and needs.
Bearing this problematic in mind, in this paper we would like to propose a conceptual
distinction between two kinds of adaptive architecture.
The first can be conceptualized in terms of a “machine”, characterized by deterministic function and preprogrammed behavior. The second can be conceptualized in
terms of a “machinic assemblage”, a concept coming from Deleuzean philosophy to
describe a specific functional arrangement of heterogeneous flows and interactions
between technical and social entities, with indeterminate capacities. We have to bear
in mind that the machine, as a technological artifact, has been an object of continuous philosophical debate historically and a model to explain and conceptualize cultural and physical phenomena.2 For instance, cybernetics and the adaptive, homeostatic and “conversational” machines that were produced during its early period by
scientists such as William Grey Walter, Ross Ashby and Gordon Pask, challenged age-old Cartesian epistemology and ontological boundaries in culture that had prevailed
in philosophical thinking throughout modernity (Pickering, 2010). Therefore the subsequent architectural projects and practices which were inspired by cybernetics and
cybernetic machines, such as Archigram’s and Price’s, as well as the more recent intelligent environments and interactive architecture applications, should be re-examined
and discussed in the context of cybernetic machine models and other philosophical
concepts (such as the assemblage) which take issue with the cybernetics agenda.
Fig. 2
Machines
In his famous lecture of 1947 entitled “Machine and Organism” (first published in
1952), the French philosopher and historian of science Georges Canguilhem attempted to explain machine constructions in different terms than just being the result and
application of scientific activity and theorems. He proposed an understanding of machines by making reference to and studying the structure and functions of organisms,
a research agenda put forth a year later by cybernetics. In this way, Canguilhem pointed to a reversal of the mechanistic philosophy and biology, which sought to explain
organisms in terms of mechanistic principles. This reductionist model of machines
comes from a traditional theory of mechanics, such as that in Franz Reuleaux’s Theoretische Kinematik: Grundzüge einer Theorie des Maschinenwesens (Braunschweig: Vieweg,
1875). In such theories machines are mechanical devices functioning within narrowly defined limits and their every movement holds up to certain norms, measures or
estimates:
A machine can be defined as a man-made, artificial construction, which essentially
functions by virtue of mechanical operations. A mechanism is made of a group of
mobile solid parts that work together in such a way that their movement does not
threaten the integrity of the unit as a whole. A mechanism therefore consists of
movable parts that work together and periodically return to a set relation with respect to each other. It consists of interlinking parts, each of which has a determinable degree of freedom of movement... The fact that these varying degrees of freedom of movement can be quantified means that they can serve as tangible guides
for measuring, for setting limits on the amount of movement that can be expected
between any two interacting solid objects (Canguilhem, 1992, p. 46).
Thus, the machine operates uniformly and unidirectionally toward completing a particular activity presenting finality and purposiveness, an understanding of machines
traced in Aristotle’s Politics (Canguilhem, 1992, p. 57).
Kinetic structures fit well within this paradigm and conceptualization of technological devices. Kinetic structures are physical constructions consisting of moveable
interconnected parts which can rearrange their relative positions, according to demand, either manually or through feedback systems of control. The result is a significant overall mechanical change of the physical configuration of the structures, which
is determined by the set relations of their internal components and their inbuilt kinetic mechanisms. The latter may range from mechanisms of deployment, folding and
extension to rolling, sliding and nesting techniques, and from scissor-type mechanisms and inflatables, to tensile and transergetic systems. Thus, the determinable degree of freedom of movement that the structure is able to produce means that the
range of possible functional changes that may occur is predictable; this is because the
form can only “respond to a range of functional changes possible within the initial envelop limitations” (Zuk and Clark, 1970, p. 98). Therefore, flexibility, although dependent
on the multiple transformational states of the structure, is confined within a narrow
range of alternatives, predefined by design.
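A toy computational sketch makes this point tangible. For a hypothetical kinetic facade of four panels, each restricted by its hinge mechanism to three fold angles, every reachable configuration can be enumerated in advance; the numbers are invented, but the finiteness is the point.

# Toy sketch: a kinetic facade of 4 panels, each restricted by its mechanism
# to three fold angles. All values are invented; what matters is that every
# reachable configuration is already fixed at the design stage.

from itertools import product

PANELS = 4
ALLOWED_ANGLES = (0, 45, 90)   # degrees of fold permitted by the assumed hinge mechanism

configurations = list(product(ALLOWED_ANGLES, repeat=PANELS))
print(len(configurations), "possible states")   # 3**4 = 81, and never more

def respond_to_sun(sun_altitude_deg):
    """Pick the predefined state that best shades for a given sun altitude."""
    target = min(ALLOWED_ANGLES, key=lambda a: abs(a - sun_altitude_deg))
    return (target,) * PANELS

print(respond_to_sun(52))   # -> (45, 45, 45, 45): a choice among fixed alternatives

However rich the feedback controlling such a structure, its response is always a selection from a state space that was closed the moment the mechanism was drawn.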
Flexibility was an emerging idea in architectural discourse during the post-war
period to deal with the uncertainty and rapid development in the Western world
brought forth by the constant social and economic changes. However, as Adrian Forty
argues, the application of flexibility in architectural design gave architects the illusion
that they can sustain and extend their control on buildings even after the period of
their real responsibility, the design stage (Forty, 2004, p. 143). Thus, flexibility in architecture is by definition limited by design, i.e. it is predetermined. While this paradigm of flexibility is a functionalist one, i.e. deterministic, avant-garde cybernetics-inspired architects and architecture groups of the same period, such as Archigram, envisaged a
more radical approach to flexibility. They postulated the concepts of ‘indeterminacy’
and ‘open-endedness’ in architecture, and thought that its inhabitants would have an
active and participatory role in the determination of its configuration and functions
(Sadler, 2005; Hughes, 2000). As we have argued elsewhere (Yiannoudes, in press), the
Archigram project, as well as Cedric Price’s Fun Palace and Constant’s New Babylon,
despite their different conceptual, material and visual aspects, constituted an alternative paradigm of flexibility, one which Adrian Forty considers to be not a characteristic of buildings but of use (Forty, 2004). In such a case, flexibility is not determined by
design or technical means but by the participatory and constructive engagement of
those who use space, namely ‘users’.3
This discrepancy, between the functionalist paradigm of flexibility -which can be
understood in terms of the machine described above- and its alternative, characterizes, as we will argue, two distinct branches of contemporary intelligent environments.
Intelligent environments (IEs)
Ambient intelligence (AmI) is a vision of the field of computer science aiming by definition at the creation of spaces, the so-called Intelligent Environments, able to respond
to the presence and activities of people in an adaptive and proactive way by supporting and enhancing their life through smart devices (Streitz, 2006). Although automatic
buildings have been around since the 1950s and 1960s,4 intelligent environments are
different because they have developed complex and adaptive ways to enhance domestic habitation through the use of ubiquitous computing (neural nets and fuzzy
logic supported by networks of intelligent agent-based systems) and user-friendly interfaces (Ahola, 2001). Without attempting to enter the perpetual debate over the meaning of intelligence as analyzed in the fields of Artificial Intelligence and Cognitive Science, it is sufficient to say that, in the context of AmI, it refers to autonomously functioning systems
able to provide automated services, assessing situations and human needs in order to
optimize control and performance in architectural space. Such systems use environmental information -acquired through activity recognition / detection- as feedback to
obtain knowledge and experience, through learning, memory and proactive anticipation mechanisms, in order to adapt to personalized needs as well as changes of user
habits (Cook and Das, 2007). Such environments, a.k.a. autonomous intelligent environments, would include the iDorm, an experimental student apartment developed
at Essex University; the PlaceLab, developed by the House_n research program at MIT; the
Adaptive Home, and the MavHome.
The system of an intelligent environment models the activities and behavior of users,5 in order to produce rules to determine and optimize its performance. It is thus
able to adapt to new behaviors and respond accordingly. However, although intelligent environments can adapt to changes of the habits and activities of their occupants, the system’s responses, its performing functions, are predetermined by design.
In effect, these functions are limited by the range of human or environmental activities that the system is programmed to be able to recognize, supported by the intelligent agents’ capacities and the knowledge it obtains in time through learning. Furthermore, the high level of system autonomy in mainstream intelligent environments
does not permit any modification of the system’s rules, thus restricting any appropriation of its functions, either by users or other agents. Therefore, the system’s capacity for adaptation is predetermined by design, which means that it is located within a
functionalist paradigm of flexibility.
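A minimal sketch - not the actual software of the iDorm, PlaceLab or MavHome, whose implementations differ - can illustrate this logic: the system learns when an activity usually occurs, yet can only answer with responses drawn from a repertoire fixed at design time.

# Sketch of an autonomous intelligent environment (illustrative only): habits are
# learned, but the possible responses are a closed, preprogrammed repertoire.

from collections import defaultdict

RESPONSES = {                      # the preprogrammed repertoire - nothing outside it
    "wake_up": "raise blinds, start coffee machine",
    "watch_tv": "dim lights to 30%",
    "leave_home": "switch everything off, arm alarm",
}

observations = defaultdict(list)   # activity -> list of observed hours

def observe(activity, hour):
    """Activity recognition output feeds the model of user habits."""
    observations[activity].append(hour)

def anticipate(hour):
    """Proactively trigger the canned response whose learned time matches the hour."""
    for activity, hours in observations.items():
        if hours and abs(sum(hours) / len(hours) - hour) < 1:
            return RESPONSES[activity]
    return None

for h in (7, 7, 8, 22, 23):
    observe("wake_up" if h < 12 else "watch_tv", h)

print(anticipate(7))    # -> "raise blinds, start coffee machine"
print(anticipate(15))   # -> None: outside what the system was designed to recognize

The learning changes when a response fires, never what the system is able to do, which is exactly the sense in which such environments remain within the functionalist paradigm.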
A different set of Ambient Intelligence applications, the so-called end-user driven intelligent environments, constitutes an alternative direction to the functionalist paradigm of flexibility because their functionality is not dependent on the pre-programmed rules of the system. Such environments constitute “ecologies” of interacting technological and human entities - artifacts, human users, and software - demonstrating open, indeterminate and multiple functionalities. Therefore, as we will argue, they can be conceptualized in terms other than those of the mechanical machine. Before
describing their specific characteristics, we will discuss the concept of the machinic assemblage, in order to attempt to conceptualize end-user driven IEs in this term.
Machinic Assemblages
French philosopher Gilles Deleuze proposed the theory of the assemblage (agencement in French). According to Manuel DeLanda (2006, pp. 8-9) this theory is the main
theoretical alternative to the age-old organismic metaphor in sociological theory
which still exerts significant influence in most schools of sociology. The assemblage,
in Deleuzian thinking,6 is a cybernetic conception referring to a distributed arrangement/set up of functional relations, liaisons and affiliations among and across heterogeneous concrete parts and processes. Defined only pragmatically, in terms of components and actions, “it arises when converging forces enable a particular amalgam
of material and semiotic flows to self-assemble and work together” (Johnston, 2008, p.
117). The whole, in the assemblage, is characterized by relations of exteriority, implying that a component part of an assemblage may be detached from it and plugged
into a different assemblage, in which its interactions are different. These relations of
exteriority imply a certain autonomy for the components, which are necessarily heterogeneous, either material, cognitive, affective, or social, while relations can change
without the terms changing. Thus, the relations that constitute the whole can never
be explained by the properties of the component parts (DeLanda, 2006, pp. 10-11).
The term ‘machinic’ in the machinic assemblage, as Felix Guattari, Deleuze’s co-writer, has explained, points to a very different conception of the machine. The machine is
hereby not necessarily a mechanical device but an ensemble of heterogeneous parts
and processes whose connections work together to enable flows of matter, energy
and signs. Thus, the term ‘machinic’ refers to this working together of the assemblage
which includes humans and technologies (Johnston, 2008, p. 112). In this conception of the machine, the machine is opened out towards its machinic environment and
maintains all sorts of relationships with social constituents and individual subjectivities (Guattari, 1993). Thus the machinic assemblage contains particular technical devices inseparably embedded and operating within a continuum of related elements,
networks of materials, processes, systems and infrastructure (both technical and sociopolitical) (Guattari, 1992).
Within the machinic assemblage, ontological boundaries between beings and
things, humans and machines, nature and technology, are subverted. Such fundamental dualities, which have preoccupied modernity and all philosophy up until
Heidegger, and which were radically and seriously challenged by the early cybernetics discourses and practices,7 are no longer valid within the assemblage. Deleuze and
Guattari in particular, postulate the existence of a special realm in the assemblage,
which they call the machinic phylum, a term which suggests “a conjunction or interface between the organic and the nonorganic, a form of ‘life’ that combines properties of
both” and which “cuts across the opposition between history and nature, the human and
the non-human” (Johnston, 2008, p. 107). In Guattari’s thinking, the machinic assemblage crosses ontological boundaries acquiring consistency through non-linear and
autopoietic processes which run through its heterogeneous and diverse components,
either social or technical (Guattari, 1992, p. 352).
End-user driven intelligent environments can be conceptualized in terms of the
machinic assemblage because they are constituted and actualized by the participatory engagement and synergetic interaction of heterogeneous components - artifacts,
spaces, people, objects and software.
End-user driven Intelligent Environments
End-user driven IEs provide the tools to construct personal AmI applications in domestic environments that are adaptable to users’ indeterminate needs and changing
circumstances. In particular, they can empower users to appropriate the system and
its operational rules in a creative and even improvisational way through user friendly
interfaces. This approach can be traced in a conceptual path alternative to traditional
HCI research, within which system designers are no longer in control of interaction;
instead they focus on techniques to allow user appropriation and improvisational use
of the artifacts, so that they will be able to act through those artifacts, i.e. to modify and deconstruct their interface, relocating their constituting elements (Dourish,
2001).
Following this path, end-user driven environments such as PiP (pervasive interactive programming) and e-Gadgets, enable users to program and appropriate their environment by “building” their own virtual appliances according to their special desires.
Such approaches deconstruct the system, allowing the users to choose and combine
different device functions thus forming “virtual pseudo-devices” (Meta-Appliances
– Applications) (Chin, Callaghan and Clarke, 2008). In the e-Gadgets case [fig. 3], users would choose the combinations connecting plugs of smart components graphically, namely flexible blocks with no predetermined capacities and an “open” function
mechanism (eGadgets), creating synapses and rules of the “if…then” type (Kameas,
Mavrommati and Markopoulos, 2004; e-Gadgets, n.d.). The actions of the users in such
environments are not always successful in terms of their functional outcome, because
this outcome depends on the dialectical relationship, the ‘conversational’ process, between user and environment (the technological artifacts that constitute it). This conceptualization of architecture as a cooperative partner in conversation with its users,
rather than as a functionalist machine/tool, was postulated by cybernetics pioneer
Gordon Pask back in 1969:
Fig. 3
The high point of functionalism is the concept of a house as a ‘machine for living
in’. But the bias is towards a machine that acts as a tool serving the inhabitant. This
notion will, I believe, be refined into the concept of an environment with which the
inhabitant cooperates and in which he can externalize his mental processes (Pask,
1969, p. 496.).
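A small sketch may help to clarify how such a ‘conversational’ composition differs from the preprogrammed repertoire of the autonomous systems described earlier: in the spirit of the e-Gadgets plugs and synapses, the inhabitant wires artifacts together with rules of his or her own making. The classes and rule syntax below are invented for illustration and do not reproduce the actual e-Gadgets middleware.

# Sketch in the spirit of e-Gadgets-style composition (classes and rule syntax are
# invented): users wire output plugs of one artifact to input plugs of another
# through "if ... then ..." synapses of their own making.

class Artifact:
    def __init__(self, name):
        self.name, self.state, self.synapses = name, {}, []

    def connect(self, out_plug, condition, target, in_plug, value):
        """User-defined synapse: if this plug satisfies the condition, set a plug elsewhere."""
        self.synapses.append((out_plug, condition, target, in_plug, value))

    def update(self, plug, reading):
        self.state[plug] = reading
        for out_plug, condition, target, in_plug, value in self.synapses:
            if out_plug == plug and condition(reading):
                target.state[in_plug] = value
                print(f"{self.name}.{out_plug} -> {target.name}.{in_plug} = {value}")

# An improvised "virtual appliance" built by the inhabitant, not by the system designer:
couch = Artifact("couch")
lamp = Artifact("lamp")
mp3 = Artifact("mp3-player")
couch.connect("occupied", lambda v: v is True, lamp, "brightness", 30)
couch.connect("occupied", lambda v: v is True, mp3, "play", "ambient playlist")

couch.update("occupied", True)   # sitting down dims the lamp and starts the music

Here the functional outcome emerges from the wiring the user improvises, and may well fail or surprise, rather than from a repertoire of responses fixed in advance.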
The functional capacities of an end-user driven intelligent environment derive from
the interactions of heterogeneous components, things and people. The environment
constitutes an ‘ecology’, where actors, either artifacts or subjects/users/designers,
cooperate for the optimization of the behavior of the whole. Zaharakis and Kameas
(2007) discussing the potential of end-user driven environments such as e-Gadgets
have argued for a conceptualization based on biological paradigms thinking of them
as symbiotic ecologies of heterogeneous actors, where intelligence is collective and
an emergent outcome of their interactions.8 The subject -the user- is part of that en156
ENHSA - EAAE no 55 Rethinking the Human in Technology Driven Architecture
vironment and his/her embodied actions within it, are essential for the way he/she
relates to it. The functionalities of the environment are underdetermined, its behaviors emergent and its functional potential is ‘open’. These observations lead us to the
second concept that we would like to discuss and defines another dimension of the
assemblage; that is the concept of the virtual and the virtual machine.
Virtual Machines
As John Johnston explains, Deleuze, following Bergson, developed the concept of
the virtual to describe the workings of the assemblage’s piloting function, the
abstract machine, which points to the assemblage’s virtual dimension (Johnston,
2008, p. 119). The virtual, in Bergson, refers to the enabling of the creative aspect of
time, involving a truly open-ended and indeterminate model of the future, and the
creation of new lines of actualization contrary to linear views of causality and mechanical repetition of physical law. Unlike the possible, which defines a process in
which a structure acquires reality out of a set of predefined forms, the virtual defines
a process in which an open problem of creation is solved in a variety of ways, with
actual unanticipated forms emerging in the process. The virtual does not represent
something real but is a real that is yet to come -not effectuated-, it is ideal but not abstract in any transcendent Platonic sense. Thus, in Deleuze’s words “The virtual is not
opposed to the real but to the actual”, since the virtual and the actual are two different
ways of being (Lister, et al., 2003, p. 364). This virtual dimension of the assemblage is
“its power to actualize a virtual process, power or force, that while not physically present
or actually expressed immanently draws and steers the assemblage’s ‘cutting edges of
deterritorialization’ along a vector or gradient leading to the creation of a new reality”
(Johnston, 2008, pp. 119-120).
In the book Time Travels, discussing the Bergsonian and Deleuzean concept of the
virtual, Elizabeth Grosz concludes:
Insofar as time, history, change, the future need to be reviewed in the light of this
Bergsonian disordering of linear or predictable temporality, perhaps the open-endedness of the concept of the virtual may prove central in reinvigorating the concept
of an open future by refusing to tie it to the realization of possibilities (the following
of a plan) and linking it to the unpredictable, uncertain actualization of virtualities
(Grosz, 2005, p. 110).
Bringing the concept of the virtual into architecture she further asks: “What does the
idea of virtuality… offer to architecture?” and she answers: “The idea of an indeterminate,
unspecifiable future, open-endedness… the promise of endless openness” and “the usage
of spaces outside their conventional function” (Grosz, 2001, p. 88).
Indeterminacy and open-endedness; concepts that appear in Archigram’s rhetoric (Cook, 1999, p. 78) but have also been influential in the development of projects
of adaptive architecture of the same period such as Cedric Price’s Fun Palace [fig.4].
As Stanley Mathews argues, the assumed indeterminacy of both program and form
in the Fun Palace turns it into what Alan Turing in 1936 termed “a Universal Machine”, predating the contemporary computer: that is, a machine capable of simulating the
behavior and the function of many different devices. In contemporary terms, this is a
Fig. 4
virtual machine, namely a functionally underdetermined complex machine, never actualized as the totality of the virtual functions of which it is capable (Mathews 2006;
Lister, 2003, pp. 360-64). Like the virtual machine, “Virtual architecture would be similarly flexible and capable of emulating the behavior of different buildings” (Mathews, 2006,
p. 42). Thus, the Fun Palace signifies a shift from the functionalist (deterministic) view of
flexibility -conceptualized through the mechanical machine-, towards an alternative
view of the machine and its relation to architecture. This concept, we argue, can also
be applied to contemporary adaptive architecture projects such as end-user driven
IEs, as discussed.
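The universal machine analogy can be made concrete with a deliberately minimal sketch: the function below has no fixed purpose of its own and simply emulates whichever behaviour description it is given, two different ‘buildings’ here being nothing more than two such descriptions. The names and behaviour tables are invented for illustration.

# Minimal illustration of the 'universal machine' analogy: the 'hardware' below has
# no fixed purpose; it emulates whichever behaviour description it is given.

def universal_machine(behaviour, state, event):
    """Run an arbitrary behaviour table: (state, event) -> (new_state, action)."""
    new_state, action = behaviour.get((state, event), (state, "do nothing"))
    return new_state, action

# Two different 'buildings' expressed as behaviour descriptions fed to the same machine.
theatre = {("empty", "crowd arrives"): ("performance", "unfold seating, dim lights")}
market  = {("empty", "crowd arrives"): ("trading", "roll out stalls, open roof")}

print(universal_machine(theatre, "empty", "crowd arrives"))
print(universal_machine(market, "empty", "crowd arrives"))

What remains virtual is the totality of behaviours the machine could take on; any single run only ever actualizes one of them.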
Computational Assemblages
Although the terms machinic assemblage and virtual machine seem at first glance appropriate to conceptualize end-user driven intelligent environments, the assemblage
being a model of the set-up whereas the virtual machine being its function, there is
a crucial point that has to be made. In Deleuze and Guattari’s philosophy the assemblage works by conjoining two reciprocally interacting realms: the machinic assemblage on the one hand and a different type of assemblage on the other, the collective
assemblage of enunciation. While the machinic assemblage involves bodies and physical things, the collective assemblage involves semiotic expressions, signs, gestures
which hold the assemblage in a state of dynamic equilibrium. As Johnston states, this is essentially a cybernetic conception and, in a loose sense, self-organising (though Deleuze and Guattari do not use the term). Agency is completely distributed - no centrally located seat of power, control or intentionality allocates and guides functionality.
Although within end-user driven IEs, the subject/user is the primary decision maker
-in other words he/she takes creative, autonomous and intentional action according
to his/her desires- the environment taken as a whole including human actors and
technological artifacts, presents agency in a distributed manner. Both humans and artifacts can act on the environment, changing its rules and hence its overall functions
and behavior according to human desires or the intelligent agents’ pre-programmed
motives.
This loosely self-organizing character of the assemblage leads us to question the
conceptualization of adaptive architecture in terms of the machinic assemblage. We
would rather use a different term proposed by John Johnston in his book The Allure
of Machinic Life (2008) where he examines the role of software and computational
processes in the assemblage. The computational assemblage links physical computational devices, bodies and artifacts with the coding and informational processes that
they instantiate and execute. We come to think that the computational assemblage is
a better term to conceptualize adaptive computational architecture and, in particular,
set ups such as end-user driven intelligent environments where all the above components, either material or immaterial seem to co-exist and interact (Johnston, 2008, p.
123).
Conclusion and Discussion
In this paper we attempted to explore the potential to conceptualize adaptive architecture in terms of the concepts of the machine and the machinic assemblage –in other words to propose a system of thought within which to discuss this kind of architecture. We also attempted to contextualize this exploration within avant-garde practices
and cybernetics discourse to show that the debate and vision has a historical reference. We have to admit though that our attempt to conceptualize adaptive computational architecture in terms of machinic and computational assemblages or virtual machines was rather sketchy despite the fact that we described these terms as accurately
as possible. Therefore the reliability of our argument can be easily criticized. However,
since, as discussed in this paper, adaptive computationally driven architecture is not a
tool, a modernist functionalist machine, but a functionally open-ended, indeterminate
and “virtual” system, an active agent in cooperative interaction with its occupants,
then novel concepts, systems of thought and even theories are urgently needed. Our
hope, then, is that this paper will raise a fruitful debate around these theoretical issues
which may lead to the exploration of specific architectural configurations, spatial systems and devices.
Notes
1 Although kinetic architectural elements and structures have existed since antiquity and in
different cultures they were more widely recognized and developed throughout the second
half of the 20th century due to the rapid changes in the western way of life. In particular, from
the Second World War till recently, kinetic structures such as those transformable lightweight
and deployable or portable environments, built by architects and firms such as Buckminster
Fuller, Hoberman Associates and FTL Happold, to name a few, have sought to resolve economic, practical or ecological problems of the construction industry, and respond to issues
of survival and nomadic dwelling. For discussion, categories and overview see Kronenburg
(1996); Oungrinis (2009); Robbin (1996).
2 For instance see Georges Canguilhem’s Machine and Organism.
3 The term “user” in this context would not only mean the person that has the potential to use
space but also to appropriate it, inhabit it, determine its manner of use at will, creatively reinterpret it or even abuse it (Forty, 2004, pp. 312-315).
4 For instance the All Electric House built by General Electric Company in Kansas in 1953 involved
remote-controlled assistive services such as switching the lights on and off, watering the garden or
coffee making.
5 Of course there are drawbacks to this capacity including difficulties in recognizing simultaneous activities or activities performed by more than one user.
6 DeLanda explains that in Deleuze and Guattari’s work, the pages dedicated to assemblage
theory are relatively few and the definitions of the related concepts are dispersed throughout
their different texts and hardly interpreted in a straightforward manner. Therefore, DeLanda’s
books A New Philosophy of Society: Assemblage Theory and Social Complexity and Intensive Science and Virtual Philosophy attempted to reconstruct Deleuzian ontology and the theory of
assemblages.
7 On how cybernetic practices in the 1940s and ‘50s proposed and staged a novel non-modern
ontology challenging traditional human boundaries see Pickering (2010).
8 This discussion can be informed by post-cognitivist theories and the social studies of technology, such as Actor Network Theory, distributed cognition theory, activity theory and embodied
cognition/interaction theories, which, despite their differences, seem to conceive the world as
a hybrid techno-social environment consisting of heterogeneous associations, liaisons and
synergies between humans and non-humans (tools/machines/artifacts/technologies). For a
thorough attempt to contextualize intelligent environments and adaptive architecture within
these theories see Yiannoudes (in press).
Bibliography and references
Ahola, J., 2001. Ambient Intelligence. [online] ERCIM News. Available at: <http://www.ercim.org/
publication/Ercim_News/enw47/intro.html> [Accessed 17 July 2007].
Canguilhem, G., 1952. Machine and organism. In: J. Cray and S. Kwinter, eds. 1992. Incorporations.
New York: Zone Books, pp. 45-69.
Chin, J., Callaghan, V. and Clarke, G., 2008. A Programming-by-Example Approach to Customising
Digital Homes. The Fourth International Conference on Intelligent Environments. [CD] Seattle, USA,
21-22 July 2008. Instn Engg & Tech.
Colquhoun, A., 2002. Modern Architecture. Oxford/New York: Oxford University Press.
Cook, P. ed., 1999. Archigram. New York: Princeton Architectural Press.
Cook, D. and Das, S., 2007. How Smart are our Environments? An updated look at the state of the
art. Pervasive and Mobile Computing, 3(3), pp. 53-73.
DeLanda, M., 2006. A New Philosophy of Society: Assemblage Theory and Social Complexity. New
York: Continuum.
Dourish, P., 2001. Where the Action is: The Foundations of Embodied Interaction. London/Cambridge
MA: MIT Press.
e-Gadgets, n.d. e-Gadgets. [online] Available at: <http://www.extrovert-gadgets.net/intro> [Accessed 13 February 2009].
Forty, A., 2004. Words and Buildings: A Vocabulary of Modern Architecture. London: Thames & Hudson.
Fox, M., n.d. Beyond Kinetic. [online] Kinetic Design Group. Available at: <http://kdg.mit.edu/Pdf/
Beyond.pdf> [Accessed 30 January 2006].
Fox, M. and Yeh, B., n.d. Intelligent Kinetic Systems. [online] Kinetic Design Group. Available at:
<http://kdg.mit.edu/Projects/pap01.html> [Accessed 30 January 2006].
Fox, M., 2010. Catching up with the Past: A Small Contribution to a Long History of Interactive
Environments. Footprint: Delft School of Design Journal, Spring Issue: Digitally Driven Architecture,
6, pp. 5-18.
Grosz, E., 2001. Architecture from the Outside: Essays on Virtual and Real Space. Cambridge MA/
London: The MIT Press.
Grosz, E., 2005. Time Travels: Feminism, Nature, Power. Australia: Allen & Unwin
Guattari, F., 1993. On Machines. In: A.E. Benjamin, ed. 1995. Complexity. Architecture, Art, Philosophy.
JPVA No 6, pp. 8-12.
Guattari, F., 1992. Machinic Heterogenesis. In: W.W. Braham and J.A. Hale, eds. 2007. Rethinking
Technology: A Reader in Architectural Theory. Oxon/New York: Routledge, pp. 342-355.
Hughes, J., 2000. The Indeterminate Building. In: J. Hughes and S. Sadler, eds. 2000. Non-Plan: Essays
on Freedom Participation and Change in Modern Architecture and Urbanism. Oxford: Architectural
Press, pp. 90-103.
Hyperbody Research Group, n.d. TUDelft. [online] Available at: <http://www.tudelft.nl/live/pagina.
jsp?id=2e0eef38-3970-4fc0-8e0b-1ce04928c500&lang=nl> [Accessed 17 March 2005]
ISTAG, n.d. Ambient Intelligence: from Vision to Reality. [online] European Commission CORDIS. Available at: <ftp://ftp.cordis.europa.eu/pub/ist/docs/istag-ist2003_consolidated_report.pdf> [Accessed 8 March 2005].
Johnston, J., 2008. The Allure of Machinic Life: Cybernetics, Artificial Life, and the New AI. Cambridge
MA/London: MIT Press.
Kameas, A., Mavrommati, I. and Markopoulos, P., 2004. Computing in Tangible: Using Artifacts as
Components of Ambient Intelligence Environments. In: G. Riva, F. Vatalaro, F. Davide and M. Alcaniz, eds. 2004. Ambient Intelligence: The evolution of Technology, Communication and Cognition.
Amsterdam/Oxford/Fairfax: IOS Press, pp. 121-142.
Kronenburg, R., 1996. Portable Architecture. Oxford: The Architectural Press.
Lister, M. et al., 2003. New Media: A Critical Introduction. London/New York: Routledge.
Mathews, S., 2006. The Fun Palace as Virtual Architecture: Cedric Price and the Practices of Indeterminacy. Journal of Architectural Education, 59(3), pp. 39-48.
Monekosso, D., Paolo, R. and Yoshinori, K. eds., 2008. Intelligent Environments: Methods, Algorithms
and Applications. London: Springer.
Oungrinis, K., 2009. Structural Morphology and Kinetic Structures in Transformable Spaces (in Greek). Ph.D. Aristotle University of Thessaloniki.
Pask, G., 1969. The Architectural Relevance of Cybernetics. Architectural Design, 9, pp. 494-496.
Pickering, A., 2010. The Cybernetic Brain: Sketches of Another Future. Chicago/London: University
of Chicago Press.
Sadler, S., 2005. Archigram: Architecture without Architecture. Cambridge MA/London: MIT Press.
Saggio, A., 2002. How. In: F. De Luca and M. Nardini, eds. 2002. Behind the Scenes: Avant-Garde
Techniques in Contemporary Design. Basel: Birkhauser, pp. 5-7.
Streitz, N., 2006. Designing Interaction for Smart Environments: Ambient Intelligence and the
Disappearing Computer. In: The Second International Conference on Intelligent Environments. 1,
Athens, Greece, 5-6 July 2006. Athens: On Demand.
Yiannoudes, S., (in press) Adaptive Architecture: Capacities and Design Factors of Transformable and “Intelligent” Spaces (in Greek). Athens: ION Press.
Yiannoudes, S., 2011. The Archigram vision in the context of Intelligent Environments and its current potential. In: The Seventh International Conference on Intelligent Environments. Nottingham,
United Kingdom, 25-28 July 2011. IOS Press.
Zaharakis, I. and Kameas, A., 2007. Social Intelligence as the Means for achieving Emergent Interactive Behaviour in Ubiquitous Environments. In: J. Jacko, ed. Human- Computer Interaction, Part II.
4551, Berlin Heidelberg: Springer-Verlag, pp. 1018-1029.
Zuk, W. and Clark, R.H., 1970. Kinetic Architecture. New York: Nostrand Reinhold.
Ava Fatah gen. Schieck
The Bartlett, UCL, London
UK
Embodied, Mediated
and Performative:
Exploring the Architectural
Education in the Digital Age
This paper looks at architecture and teaching architecture in the digital age. It aims to
outline how rapid developments in information and communication technology and
its applications in architecture have introduced new opportunities and challenges. It
will also consider how the rise of pervasive media technologies has resulted in a hybrid structure of Mixed Reality Architecture (Schnädelbach et al, 2007) and Augmented Environments (Aurigi and De Cindio, 2008) and how this will influence the role of
architectural education in order to prepare architects for new challenges in the field.
At the heart of the 21st century ‘interactive’ and increasingly ‘adaptive’ architecture
will become part of our experience, forming an important shift in rethinking the built
environment, the public space and its social importance. Digital media augments everyday interactions, creating visual and auditory interaction spaces that enable various types of embodied and performative experiences as we interact within a shared
space.
In parallel, architectural schools have witnessed a growing importance of information and media technologies in their curricula. Schoen has already identified some of
the dilemmas posed by the expanding horizon of knowledge within the architectural
field (Schoen, 1985). He stressed the need to reflect on how this will impact architectural education. More significantly, he identified a dilemma facing architectural
schools as they start to recognise the increasing importance of new fields of knowledge to the education they must provide:
“architecture may try to incorporate them in a way that imitates the technical education in other fields, thereby turning its back on the tradition of the architectural
studio. Or, out of a wish to remain true to a certain view of that tradition – and to the
image of the architect….architecture may turn its back on the rising demands for
technical education” (Schoen, 1985, p. 86).
Building on this, I argue that as we find ways to incorporate digital media and
computation in architecture, we must rethink the position of architectural education
and how best to prepare architects for the challenges in the field. More significantly,
recent advancements in pervasive systems have made depth sensing and contactless
gesture-based interaction available to anyone. It is likely that this will have an impact
on current and future research in pervasive computing and mediated architecture (either by supporting or replacing many existing approaches) introducing a significant
shift in the way we design and generate technologically mediated interactions within
the architectural space.
In this respect, I suggest that understanding the body, its movement and how it
relates to its subjective space will open up the possibility to better understand and
design augmented spaces in order to capture, respond to and regulate users’ experience mediated through digital technologies.
In the following section I present in detail emergent developments in the field followed by an outline of my teaching approach on the MSc Adaptive Architecture and
Computation at UCL.
Setting the Scene: the Embedded
Advancements in architecture and computing have evolved together to a surprising degree since the 1990s: Virtual Reality ‘VR’ represents one of these developments,
which remained mainly of academic interest for a number of reasons such as cost and
the unfeasibility of working in a completely immersive environment. VR was followed
by an emergent interest in combining real with virtual environments, allowing people
to interact with digital information within the real architectural space, and bringing
live digital imagery into the real world. It was argued that this type of Mixed Reality
interaction offers people a better sense of ‘embodiment’, a state of being in the world,
which fits more naturally with the way people act and interact with everyday objects
compared to interacting with more abstract virtual representations (Dourish, 2001). At
the turn of the millennium the focus has shifted again and attention has been drawn
to the increasing significance of interactivity beyond the scale of the task at hand. Recently we have witnessed a shift in the focus of interest from ‘cyberspace’ to ‘ubiquitous computing’; digital technology is built into our environments, embedded in our
devices, everywhere. Increasingly these technologies are networked.
Ubiquitous computing (also known as ‘ambient, physical, embedded, environmental
or pervasive computing’) was first introduced by technology visionary Mark Weiser.
He envisioned a world of fully connected devices with cheap wireless networks where
information is accessible everywhere. A world in which computers and information
technologies become indistinguishable from everyday life: ‘any time, any where and
always on’ (Weiser, 1991). Weiser proposed that in a pervasive computing environment, computers and information processing become ordinary, and penetrate into
every object in our daily lives. In this respect as digital technology becomes invisibly
embedded in everyday things, even more activities become mediated. Architecture
acquired a digital layer, which involves the design of organisations, services and communications. Networks extend rather than replace architecture (McCullough, 2004).
More significantly, now for the first time in the short history of pervasive computing, depth sensing is available to anyone using off-the-shelf technology (www.xbox.com/en-gb/kinect) to support activity recognition and contactless user interaction. It is likely that Kinect will replace traditional computer vision approaches, and this will have a crucial impact on current research on adaptive and responsive environments.
I believe that building pervasive systems into our built environment requires a
new way of thinking about architectural design, digital information and how it interweaves with the built environment. I argue that as we find ways to incorporate digital
media and computation in architectural teaching, we must rethink the role of architectural education. We need to develop new ways of teaching that extend beyond the methods conventionally applied within architectural education and Human-Computer Interaction (HCI). In this respect, architecture and interaction design together can help compose the necessary framework for better integration.
Building on this, and as the relationship between the human and her environment is significantly reshaped through digital information and pervasive embedded computing, we need to gain a better understanding of the human body and of human movement through space, in particular when it is mediated through pervasive technologies. This is reflected throughout my experimental approach on the 'Body as
Interface’ module, which explores methods ranging from capturing the human form
and body movement to designing a performative and interactive experience using
computer vision and depth sensing.
In the following section I investigate the embodied and performative nature of mediated interactions and present my approach on the ‘Embodied and Embedded Technologies’ unit, which covers both the body scale and the city scale. In this paper I will
focus on the body scale.
Mediated Space: The Embodied and the Performative
In her book ‘Architecture and Narrative’ Sophia Psarra examines the ways in which
buildings are shaped, used and perceived. She outlines that buildings are defined through a thinking mind that arranges, organizes and creates relationships between the parts and the whole, and that they are experienced through movement and use. According to Psarra, conceptual ordering and spatial and social narrative are essential to the way we experience and design buildings (Psarra, 2009). Within a supportive physical built environment and temporal continuity, people perform a place ballet: a set of integrated gestures and movements that maintain a particular aim within a habitual space-time routine in everyday life (Seamon, 1979).
Fig. 1
Collective performative experience: situations of mediated urban interactions highlight aspects
of social proximity and shared encounters among friends and strangers (image: Cloud Gate, Chicago, 2004).
However, with the advent of pervasive and mobile computing, the introduction of situated technologies in our space may motivate and modify social interactions or stimulate new performative behaviours. It may interrupt the habitual nature of everyday
interactions, creating new stages on which people can play out their engagements
mediated by the new media technologies. People are no longer limited to the role of
the spectator but are rather participants who are active in defining the emergent collective experience.
On the other hand, areas of architectural design and HCI (Human Computer Interaction) are merging. Areas of architectural research are continuously shifting as the new
developments affect our physical environment in ways that extend digital media into
the physical and temporal aspects of architecture. Accordingly, designers and educators attempt to embrace the ongoing developments and exploit every opportunity to
redefine our profession.
Fig. 2
Embodied interactions with Rock & Pop Museum’s displays using full body movement, Norway,
2010.
Architectural Education in the Digital Age
A systematic approach to designing the physical and digital environments as an integral system calls for a coming together of Architecture and Interaction Design. Key to
this interdisciplinary integration is the concept of space (with its social protocols, conventions and attached values). More significantly, I argue that placing the human at the core of adaptive buildings, and understanding issues relating to the inhabitants (including mediated, embodied and performative interactions) and their impact on the quality of the inhabitants' experience, is essential to advancement in the field.
In his book Entangled, Chris Salter demonstrates that technologies from the mechanical to the computational— from a ‘ballet of objects and lights’ to contemporary
technologically-enabled 'responsive environments'—have been entangled with performance across a wide range of disciplines. Merleau-Ponty, for his part, stated that the lived foundation of this human world is perception, which he relates to the human body. According to him, it is a basic human experience in which embodiment is the condition for us to objectify reality: a body that expresses, acts in and is aware of a world that normally responds with patterns, meaning, and contextual presence (Merleau-Ponty, 1962).
We have contributed elsewhere to the understanding of situated interactions and
shared encounters using a digital artefact as a stage that mediates social interactions
and performative play. We outlined that the nature of these mediated interactions and
their appropriateness are tied to the nature of space as well as the affordances offered
by the technology. More importantly, our findings demonstrated the importance of
taking into account full body movement and performative interactions as an essential
factor of human experience (Fatah gen Schieck et al, 2008, 2010).
I argue that understanding the body and how it relates to its subjective space, i.e. looking at space-in-the-body (Laban and Ullmann, 1974) in addition to the body in space, posture and gesture, will open up the possibility of better understanding and exploiting body movement and users' behaviour, in order to capture, respond to and regulate users' experience as it is mediated through digital and pervasive technologies. This is reflected throughout my teaching on the MSc AAC.
Introduction to Adaptive Architecture and Computation
The MSc AAC is a one-year taught course in the field of digital architectural design, which has been running since 2005. Teaching on the course draws on a multidisciplinary milieu of two established groups at the Bartlett: the Space group, which explores different ways of investigating the spaces and forms of architecture, and the VR Centre, which uses computing technology in the field of architecture. The course comprises five taught modules and two studio units (taught in parallel during the two teaching terms), followed by a thesis, which is supervised from May to September. The five modules are: a) Design as a Knowledge-Based Process, b) Introduction to Programming, c) Computational Analysis, d) Computational Synthesis and e) Morphogenetic Programming. These taught modules aim to challenge the students to think about
how computation can improve the design and use of architecture. The studio-based units give the students a hands-on opportunity to create spatial approaches using computational sketches. My approach in particular explores how we can challenge the way we relate to the built environment as it is mediated through pervasive digital technologies. This is achieved through implementing project-based learning that is carried out in the real world (on the actual site itself) within the public space (Fatah gen. Schieck, 2008).
Fig. 3
Project MINIMAL COMPLEXITY by Vlad Tenu (MSc AAC alumnus 2009), commission and assembly by Tex-Fab.
In the remainder of the paper I will present my teaching approach on the unit 'Embodied and Embedded Technologies'. The unit consists of two modules: 'Body as Interface', which investigates the body scale and focuses on the human body and how architectural space is generated through body movement, and 'City as Interface', which investigates the city scale and focuses on the city as the space of potential interactions (person to person, person to digital, person to physical, digital to physical, and digital to digital). In this paper I will focus on the module 'Body as Interface'.
The Body as Interface
This module introduces students to digital space as part of the architectural space
and as an interface between people and other people, i.e. as a facilitator. It investigates the complex relationship between the architectural space, the digital space
and the human body, and the way that this is mediated by and mediates people’s
relationship to each other. Bearing in mind that the majority of the MSc AAC students come predominantly from an architectural design background, it seems that
the challenge is to incorporate these foundations in an understanding of architectural research that will help achieve a better understanding of different aspects and
the complexity of social and technologically mediated interactions in relation to the
spatial context in which it occurs. As a result, and building on the reflective nature of
architectural education, I have introduced a project based learning approach in an
attempt to integrate digital media and computation combined with understanding
the nature of the body and the subjective space it generates. This is primarily motivated by the design studio culture, a culture that represents a tradition of education by reflection-in-action and on the spot experimenting reflecting a generalised
setting for learning-by-doing, which integrates the three environments of: teaching,
discovery, and application. The premise is that adopting an approach based on the
design studio culture will foster a creative and challenging environment that will in
turn encourage critical thinking providing values of optimism, sharing, engagement
and innovation. The proposed approach differs from traditional architectural education, however, in that the project is implemented in the real setting, which requires
applying a range of methods from interpretative-ethnographic to experimental approaches. In this way the students themselves are active participants in the learning
experience; they play an active role that helps in shaping and understanding the final
outcome.
In the following section, I present the module project. It was first introduced during
2009 and implemented in a studio session in 2010 and 2011.
The Project
The project was carried out during the last four weeks of the module. It is supported by diverse teaching modes, including workshops, lectures, and tutorials. The project is evaluated formally through an audio-visual presentation and a real-time demo of the interactive installation.
During the ‘Body as Interface’ workshop1 we apply a multidisciplinary approach
where architectural space, body movement, performance, dance and interactive
design come together. We start with a focus on practical physical exercises led by dance practitioners, through which the students gain an understanding of how we embody space and of the practical aspects related to space orientation and gravity organization. The aim is to open the body and mind, break routines and rethink various aspects that we take for granted. We explore movement in terms of its spatial content
(space-in-the-body) and aim to develop a dynamic moving body conscious of personal space and aware of spatial relationships between bodies.
Fig. 4
MSc AAC students explore the relationship between space perception and body movement: how
they relate to their own space and how they relate to other people and the environment around
them (proxemics).
The physical exercises are followed by a talk exploring the complexity of the processes involved in building our perception of space, with a focus on the central role played by the body in this process. Finally, in order to reveal how the human body acts in space, the students test an interactive sound installation2. The installation is
experienced by a single person in total darkness, where her/his movements modulate
the soundscapes. As soon as the person enters, sounds begin to circle around them
in response to their movements. A real dialogue emerges between the body’s movement and the space through the sound, engaging the participant in space and revealing how the body acts in space. The outcome of this experience is captured and
analysed and the collected data and material act as a starting point for the student
project.
Initial outcomes indicated that different bodies (different people) generate different types of movements when moving without visual cues.
Fig. 5
A richly physiological and perceptual system producing spatial and temporal trace form
Finally, the students start working on their own project. Building on the workshop
outcome and based on the data collected during the experiment with the sound
installation, they are asked to find ways to represent the personal kinesphere (i.e. the space surrounding the body, or the movement space). The aim is to explore the relationship between space awareness, perception and body movement: how people relate to their own space and how they relate to other people (proxemics), to their subjective space and to the environment around them (Hall, 1966).
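One way to begin such a representation is sketched below: a minimal Python example, assuming the captured movement has already been exported as per-frame joint positions in (x, y, z) coordinates. The function name, data layout and sector binning are illustrative assumptions rather than the students' actual code; the per-sector maxima simply give a rough envelope of the movement space around the body centre.

```python
import math

def kinesphere_radii(frames, sectors=12):
    """Approximate the personal kinesphere as the maximum reach per
    horizontal direction sector, measured from the body centre.
    `frames` is a list of dicts: {"centre": (x, y, z), "joints": [(x, y, z), ...]},
    with coordinates assumed to come from a depth-sensor skeleton."""
    radii = [0.0] * sectors
    for frame in frames:
        cx, cy, cz = frame["centre"]
        for jx, jy, jz in frame["joints"]:
            dx, dy, dz = jx - cx, jy - cy, jz - cz
            reach = math.sqrt(dx * dx + dy * dy + dz * dz)
            # Bin the reach by its horizontal direction (azimuth around the body).
            azimuth = math.atan2(dz, dx) % (2 * math.pi)
            sector = int(azimuth / (2 * math.pi) * sectors) % sectors
            radii[sector] = max(radii[sector], reach)
    return radii

# Toy usage with two captured frames (coordinates invented for illustration).
frames = [
    {"centre": (0.0, 1.0, 0.0), "joints": [(0.6, 1.4, 0.1), (-0.3, 0.2, -0.4)]},
    {"centre": (0.1, 1.0, 0.0), "joints": [(0.9, 1.5, 0.2), (-0.5, 0.3, -0.6)]},
]
print(kinesphere_radii(frames, sectors=8))
```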
Body movement and spatial representation were explored through student studies using different methodologies. For instance, one of the projects created an installation in order to explore interactions with 2D planes (see figure below), while another looked at how different types of perceptual categorization of space would produce different ways of occupying and relating to the architectural space.
Fig. 6
Visual perception and bodily action operate interdependently in our experience of space. This project aims to clarify the participants' relation to their own space (developed as an interactive project using computer vision and Processing - MSc AAC 2010).
Fig. 7
Photography and motion pictures in the closing years of the 19th century revealed the microfluctuations of human movement. Here, an attempt to drive movement capture through movement speed.
For the first time it is now possible to draw in 3D space using depth sensing. A group
of students studied different types of body movements mediated through interactive
installations. This was demonstrated through works carried out using depth sensing (Kinect) in order to generate 3D drawings in space using different body parts.
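The underlying idea can be sketched as follows, assuming per-frame joint positions are already delivered by the depth-sensor skeleton tracking (the student projects themselves were built with Kinect and Processing); the class name, the tracked part names and the jitter threshold are assumptions made for illustration only.

```python
import math

class SpaceDrawing:
    """Accumulates the 3D trajectory of selected body parts into polylines,
    so that a hand or a foot tracked by a depth sensor 'draws' in space."""

    def __init__(self, parts=("left_hand", "right_hand", "left_foot", "right_foot"),
                 min_step=0.02):
        self.min_step = min_step                  # ignore jitter smaller than this (capture units)
        self.strokes = {part: [] for part in parts}

    def add_frame(self, joints):
        """`joints` maps a body-part name to its (x, y, z) position for one frame."""
        for part, stroke in self.strokes.items():
            if part not in joints:
                continue                          # part not tracked in this frame
            p = joints[part]
            if stroke and math.dist(p, stroke[-1]) < self.min_step:
                continue                          # too small a movement to record
            stroke.append(p)

# Toy usage with hand positions that might come from successive frames.
drawing = SpaceDrawing(parts=("right_hand",))
for p in [(0.0, 1.2, 1.8), (0.05, 1.25, 1.8), (0.2, 1.4, 1.7)]:
    drawing.add_frame({"right_hand": p})
print(drawing.strokes["right_hand"])
```

Different body signatures, as in Fig. 8, then emerge simply from which parts are tracked and how fast they move.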
Fig. 8
Different body signatures through drawings in 3D space with hands and feet by 4 different people
(developed through using Kinect and processing - MSc AAC 2011).
In summary, the students' work demonstrated a growing understanding of the importance of the subjective body (and how it relates to its own space) and of its relation to the surroundings when designing interactive and mediated experiences. This has implications for designing people's experience of mediated and adaptive environments, which could be explored further.
Conclusion
New technologies and methodologies are constantly being incorporated into architectural practice and education. These provide tools that promote new modes of engagement with older questions. However, there are issues and risks inherent in such
adoptions. Among these is the possibility that the potential of new technologies may
not be fully developed.
In this paper I have presented my approach for the unit ’Embodied and Embedded
Technologies’ as part of the MSc AAC course at The Bartlett, UCL. I focused in particular on my approach in teaching the module ‘Body as Interface’. This approach is in-
inspired by the design studio culture and way of thinking, which integrates the three
environments of: teaching, discovery (research), and application (practice).
I presented an attempt to develop new ways of teaching that extend beyond the methods conventionally applied within traditional architectural education. The aim is to draw attention to the possibilities offered by the integration of new technologies into architectural teaching, in a module that explores the relationship between the human body, the architectural space and performative interactions mediated through technology. I argued that the relationship between the human and her environment is significantly reshaped through digital information and pervasive embedded computing, and suggested that we need to gain a better understanding of the human body and of human movement through space, in particular when mediated through pervasive technologies.
More significantly, I outlined that for the first time in the short history of pervasive computing depth sensing is available to anyone, and it is likely that this will have a crucial impact on current research on adaptive and responsive built environments.
Our current teaching approach in interaction design to students from architectural
backgrounds focuses on introducing body awareness using dance training methods.
It takes the position that body awareness is key to the education of architects and
designers in the digital age, in particular when designing the inhabitant’s experience
mediated through technologically driven environments. Another aspect is gaining an understanding of the importance of the subjective body (and how it relates to its own space) in the design of interactive and mediated experiences. Finally, the approach recognises the importance of body movement as a medium for designing the quality of experience, in particular in interacting and engaging with responsive and adaptive environments.
This experience has offered a rewarding opportunity as well as challenges and
dilemmas. Considerably more investigations and deeper reflections are required in
order to address the questions raised through the module, which will ultimately help inform a truly engaging and inspiring learning environment that will better prepare designers and architects for an ever-changing world in the digital age.
Acknowledgements
We thank Prarthana Jaganath (Aedas), Armando Menicacci (Châlons sur Saone, France) and Christian Delecluse (Ecole Spéciale d’Architecture, France) for their invaluable contribution. We would
like to acknowledge the efforts of the MSc Adaptive Architecture and Computation 2010-11 and 2011-12 cohorts.
Notes
1 The workshop was developed through a collaboration between Ava Fatah (The Bartlett, London), Christian Delecluse (ESA, Paris) and Armando Menicacci (Châlons sur Saone).
2 The installation was designed by Christian Delecluse, an architect and interaction designer from the Ecole Spéciale d'Architecture (ESA), Paris.
References
Aurigi, A., and De Cindio, F., 2008, Augmented urban spaces: articulating the physical and electronic
city. Aldershot: Ashgate.
Benford, S., and Giannachi, G., 2011, Performing Mixed Reality, The MIT Press.
Bianchi-Berthouze, N., Cairns, P., Cox, A., Jennett, C., and Kim, W.W., 2006, On posture as a modality
for expressing and recognizing emotions, Workshop on the role of emotion in HCI 2006.
Body as Interface workshop on the MSc Adaptive Architecture and Computation (2008-2011) at
UCL: http://www.bartlett.ucl.ac.uk/graduate/programmes/postgraduate/mscdiploma-adaptive-architecture-and-computation.
Dourish, P., 2001, Where the Action Is: The foundations of embodied interaction, MIT Press.
Fatah gen. Schieck, A., O’Neill, E., and Kataras, P., 2010. Exploring Embodied Mediated Performative
Interactions in Urban Space. In: Proc. UbiComp 2010, Copenhagen.
Fatah gen. Schieck, A., 2008, Exploring Architectural Education in the Digital Age. In: eCAADe 08
Architecture 'in computro': integrating methods & techniques, Antwerp.
Fatah gen. Schieck, A., Briones, C., and Mottram, C., 2008, The Urban Screen as a Socialising Platform:
Exploring the Role of Place within the Urban Space. In: Eckhardt, F., Geelhaar, J., Colini, L., Willis,
K.S., Chorianopoulos, K., and Henning, R., (Eds.) MEDIACITY – Situations, Practices and Encounters,
Frank & Timme.
Hall, E.T., 1966, The Hidden Dimension, Anchor Books.
Kinect: http://www.xbox.com/en-gb/kinect.
Laban, R. and Ullmann, L. (Eds.), 1974, The Language of Movement: A Guidebook to Choreutics,
(publisher unknown).
McCullough, M., 2004, Digital Ground: Architecture, Pervasive Computing, and Environmental
knowing. MIT Press, Cambridge, MA.
Merleau-Ponty, M., 1962, The Phenomenology of Perception, trans. by Colin Smith, New York: Humanities Press.
Psarra, S., 2009, Architecture and Narrative: The Formation of Space and Cultural Meaning,
Routledge.
Salter, C., 2010, Entangled: Technology and the Transformation of Performance, The MIT Press.
Schnädelbach, H., Penn, A., and Steadman, P., 2007, Mixed reality architecture: a dynamic architectural topology, in Space Syntax Symposium 07, Istanbul, Turkey.
Schoen, D. A., 1985, The design studio: an exploration of its traditions and potentials, London: RIBA
Publications for RIBA Building Industry Trust.
Seamon, D., 1979, A Geography of the lifeworld: Movement, rest and encounter. St. Martin’s Press,
New York.
Weiser, M., 1991, The Computer for the 21st Century, Scientific American, 265(3).
Konstantinos-Alketas Oungrinis
Marianthi Liapi
Department of Architecture
Technical University of Crete
Greece
Rethinking the Human:
An Educational Experiment
on Sensponsive Architecture
Workshop Theme
Over the past 10 years, autonomy, adaptability, customization and communication
have been the most common words used to describe the qualities of Information
Technology devices that facilitate everyday activities. IT is now ubiquitous, integrated
with a multitude of objects in the contemporary, fast-paced lifestyle. It is not strange, then, that the spearhead of architectural research today engages with more elaborate and sophisticated issues aiming toward the integration of IT systems into the living space. The know-how to achieve such a feat is available and the potential for architecture is significant. Already, innovators in the field from around the world have
created a test bed for the integration of IT into the core of the production of space.
These research efforts open up the way for architecture to extend its inquiry beyond
the Vitruvian triptych and design spatial behaviors. Embedded interactivity in places
that were long regarded inert exhibits new possibilities for the human experience.
Intelligent control systems are able to enhance the functionality of space, create
provocative aesthetics and instigate radical changes in everyday life as we know it.
Moreover, contemporary social conditions seem to be addressed better through the
acquired connectivity.
The development of more powerful and intuitive software as well as the ability to
experiment with 'approachable' electronic assemblies has facilitated an ever-growing tendency toward responsive environments. Virtual, actual or hybrid, they are the new exciting
thing, becoming widespread and common globally through art installations and architectural applications. Particularly in architecture, the design tools aim to respond to
users’ needs, fabrication methods are developed to respond to design idiosyncrasies
and space is designed to respond to human behavior and environmental conditions.
As we explore this ability and understand better how to control and how to apply IT
systems, we inadvertently reach a threshold. The animatic effect within a (retinal) informationally rich era may become a goal in its own right. This fact marks a point where the
difference between recreation and architecture must be identified so that responsiveness can be applied and have a useful, aesthetically appealing and symbolically rich
outcome. Moreover, responsiveness must be carefully planned through time in order
to exhibit an intriguing multiplicity in its manifestation. The next logical step then is
to put sense to this response and create a context for the way space performs and the
way it learns from the past: a sensponsive architecture.
Ever since the rise and the swarm-like diffusion of personal media, space has been
gradually dislocated in the human mind to become the background in people’s activities. Even haute architecture examples remain briefly in the cognitive spotlight before
they begin to lose their importance. Ongoing research in this field has revealed that
when a sensponsive approach is adopted, space acquires two new, human-centered
features that elevate its importance. Firstly, it becomes a medium with embedded
smart assemblies able to connect with all kinds of advanced mobile media, either at the architectural or at the urban scale, providing a continuous relationship between
designer, owner and space. Secondly, it is equipped with tools to regulate the environment toward ad hoc conditions the users desire.
While responsive systems react repeatedly with certain programmed actions, a
sensponsive system chooses the location and the type of the response. The most crucial factors in advancing from simple responsiveness to a response with a sense is time
and timing. It is also crucial to differentiate reaction from behavior. In this sense, a simple set of rules for designing a sensponsive environment is the following (a minimal sketch of such an assessment loop follows the list):
- Avoid an immediate reaction.
- Develop and exhibit a process of assessment.
- Define the context for action.
- Describe the context for reaction by defining intentions.
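A minimal sketch of how the first of these rules might translate into code is given below, assuming a generic presence sensor; the sensor reading and the print statements are placeholders rather than a real device API, and the window size and thresholds are invented for illustration.

```python
import random
import statistics
import time

def read_presence_sensor():
    # Placeholder for a real sensor reading (e.g. a value arriving from an Arduino over serial).
    return random.random()

def assess(readings):
    """Define the context for action from a whole window of readings,
    instead of reacting to any single one."""
    level = statistics.mean(readings)
    spread = statistics.pstdev(readings)
    if level > 0.7 and spread < 0.1:
        return "calm, sustained presence: slow ambient change"
    if spread > 0.3:
        return "lively activity: short playful response"
    return None  # no clear context: the space stays still rather than reacting

def sensponsive_loop(cycles=3, window=10, period=0.05):
    for _ in range(cycles):
        # 1. Avoid an immediate reaction: first gather readings over time.
        readings = []
        for _ in range(window):
            readings.append(read_presence_sensor())
            time.sleep(period)
        # 2./3. Assess the readings, then act only if a context has been identified.
        response = assess(readings)
        if response is not None:
            print("responding:", response)
        else:
            print("observing, no response yet")

sensponsive_loop()
```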
The most important aspect that justifies such an elaborate design approach is the increased level of communication that can be achieved between people and their surrounding space. People have always been communicating with their environment,
either artificial or natural, although through a passive process of attributing sentimental or psychological values to objects and elements. Active communication was first
manifested by automatons but remained restricted to certain reactions that rendered
them uninteresting (and thus with no communicational value) after a period of time.
Active communication with a plethora of unanticipated yet helpful responses has the
ability to augment the emotional connection people create with space. There should
be limits, though, in order to avoid entering the "uncanny valley" (Mori, 1970). As long as people perceive a discrete presence in space they can relate to it more easily through time, ascribing to it character as well as intentions.
Scenarios and Workflow
The workshop theme was further specialized to direct the design outcome toward
sensponsive environments for children and in particular the creation of intriguing
spaces with responsive partitions that can help them acquire a better sense of the
world through play. Why this particular user group? To begin with, there is a plethora
of scientific data to support detailed design explorations for children’s spaces. Moreover, children are experts in exploring as well as composing, driven mainly by necessity
and not by design, to better place themselves in a grown-up world. They ceaselessly
imagine and create alternatives of things that already exist, pushing the boundaries
of known concepts like creativity and simulation, imagination and knowledge. Lastly, there is a very distinctive behavior required (the frame for creating a meaningful response) for these spatial arrangements: the environment must act as a tutor, continuously providing new stimuli for the children to re-ignite the game process and keep the educational value high. In this sense, the goal was to create sensponsive environments that could be integrated in educational and/or recreational spaces as playful learning mediums, able to engage children in their programmed activities through
fun.
Each one of the three student teams was assigned a specific working scenario related to how children perceive, experience and develop a meaningful understanding
of the world around them. Those scenarios were described as follows:
1. Life-cycles: Create sensponsive spatial configurations to help children understand
the notions of time and change through play.
2. Storytelling: Create sensponsive spatial configurations to help children develop
their narrative and memory skills through play.
3. Traces: Create sensponsive spatial configurations to help children understand their
body by getting feedback from the movement of its parts through play.
The teams were encouraged to adopt a bottom-up approach with certain key-points
that guaranteed a smooth process. During the initial conceptual phase they were all
asked to figure out an intriguing way to play out the assigned scenario. This included
the development of a design scheme with the appropriate functional, aesthetic and
symbolic value that could creatively facilitate the activity and enhance the experience
of the children involved. It was crucial from this point to agree upon the intensity of
the activity as well as the basic design parameters. Drawings and small conceptual
mock-ups were necessary for each team to communicate the prevailing ideas.
After getting feedback on the conceptual phase, each team had to shift gears and
begin to work on two distinct parallel explorations, one related to the actual form and
the structural details of the sensponsive spatial configurations and the other to the
development of the code for their behavioral patterns. The proposed steps were:
- Define the exact parameters for each activity (spatial and behavioral design)
and evaluate the time frame in which the spatial elements can act discretely and
beneficially.
- Choose the appropriate type of sensors to capture the quantitative and qualitative
features of the set parameters.
- Set up a crude neural network to process this data.
- Synchronize this information with the design environment.
- Test the process of "identify activity (sensor input) - exhibit response (mechanism output) - get feedback and fine-tune - loop" (a minimal sketch of this cycle follows the list).
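The cycle in the final step can be sketched as follows, with a single hand-written artificial neuron standing in for the crude neural network; the sensor values, weights and servo mapping are invented for illustration and do not reproduce any team's actual code.

```python
import math

def identify_activity(sensor_values, weights, bias):
    """A crude single-neuron 'network': a weighted sum of sensor readings
    squashed to 0..1, read as how intense the detected activity is."""
    s = sum(w * x for w, x in zip(weights, sensor_values)) + bias
    return 1.0 / (1.0 + math.exp(-s))

def exhibit_response(intensity):
    # Stand-in for the mechanism output, e.g. a servo angle between 0 and 90 degrees.
    return round(90 * intensity)

def fine_tune(weights, sensor_values, error, rate=0.1):
    # Feedback step: nudge the weights so the next response fits the activity better.
    return [w + rate * error * x for w, x in zip(weights, sensor_values)]

weights, bias = [0.5, -0.2, 0.8], 0.0
for sensor_values, desired in [([0.9, 0.1, 0.7], 0.8), ([0.2, 0.8, 0.1], 0.2)]:
    intensity = identify_activity(sensor_values, weights, bias)   # identify activity
    print("servo angle:", exhibit_response(intensity))            # exhibit response
    weights = fine_tune(weights, sensor_values, desired - intensity)  # feedback, then loop
```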
The final phase involved the assembly of the basic electronic infrastructure and the
fabrication of the necessary parts to create working prototypes, scaled or 1:1, with
character and intentions. The physical models were made out of simple materials, such as fabric, paper, plastic and wood, using the Department's fabrication laboratory, which is equipped with two laser cutters, a CNC router and a 3D printer. The kinetic parts were
controlled via Arduino assemblies. Moreover, all teams were asked to visualize their
work on the aforementioned phases on an exhibition panel.
Scenario 01: Sensponsive Life-Cycles
The first scenario touched upon the concept of life-cycles. It is a very crucial notion
that children ceaselessly explore in order to comprehend temporal events. They
gradually adjust their inner clock to different time frames and as they grasp the sense
of time, they identify and understand the difference between short and long-term
changes, the meaning of duration as well as the act of projecting oneself or other
things in time.
The students researched ways to spatialize time in a child’s world, as well as ways
to materially support a child’s perception of speed and duration. They identified some
basic dualities, such as the looped life-cycle vs. the spiral life-cycle, the notion of real
time vs. relative time and the distinction between short and long periods of (real)
time. Their design project is an autonomous unit equipped with a sentient system that
develops its personality according to the stimuli of its environment. By employing Arduino assemblies with proximity, movement, acoustic, touch/pressure and light sensors, the unit uses colors and motion to communicate with children. It is also designed
to form a network with other similar units, creating an urban entertainment installation located within a large urban setting and allowing a variety of responses by letting the units act alone, in a cluster or as a whole. Children are introduced to a staged playtime, watching the cumulative effects of their actions proportionally affect the network. In the long run, entertainment, urban landscaping and in-depth stimulation are
the project’s aims and desires.
Fig. 1
The exterior surface of the unit has a long-term memory capacity, while the interior one has a short memory depending on the activity that takes place (haptic, somatic, acoustic). When multiple units form a local cluster or an urban-scale network they communicate with each other, exhibiting collective behavior.
Fig. 2
The overall form is designed to provide a variety of visual cues for children through time.
The students relied on both analog and digital features in order to creatively address the multiplicity incorporated into their scenario. The notion of real time, for example, was passively addressed by the changes in the daily and the seasonal cycles. The form of the unit itself had the design characteristics of a sundial, which helped
in accentuating the passage of time, throughout the day as well as during the year,
with the occurring shadows being the visual results caused by the sun’s movement.
The inability to affect this cycle also points out its looped character.
Fig. 3
The fabrication process.
Fig. 4
The unit attributes its sensponsive abilities to a variety of sensors controlled in an Arduino
environment.
The notion of relative time, on the other hand, was planned to be unveiled by the
type and the duration of the activity within the unit, evoking diverse experiences like
curiosity, anticipation, surprise and fun. There were two layers of programmed interaction based on a series of digital assemblies. The first layer could be affected immediately by the actions of the children, feeding them with short-term changes and
rewards. The second layer was designed to assess the actions taking place and react
accordingly through time, in this way helping children to develop the hardest skill, which is to understand long-term changes and effects. Having the ability to affect the actions
of the unit, and consequently of the network, children were also able in this case to
grasp the sense of the spiral cycle.
Fig. 5
A working prototype on a 1:5 scale.
Scenario 02: Sensponsive Storytelling
Our second scenario involved the concept of storytelling. It constitutes an integral
part in children’s activities as it facilitates role-playing and stage-setting. Moreover, it is
an essential tool for children to practice and develop their narrative and memory skills.
Storytelling can be enacted anywhere and it can only be limited by the imagination.
Nature, spaces, objects and people, in a variety of combinations, form an inexhaustible pool of storytelling material.
Fig. 6
In this project children are introduced to open-ended stage configurations that contribute to their
immersion in the game. Anticipation, surprise, balance, creation, pleasure, knowledge, adaptation and imagination are all phases a child will experience in the course of using these 'storytelling' cubes.
After researching their options, the students soon realized that they needed a medium that would be able to trigger storytelling activities, to help the children play them
out as well as to provide an unanticipated feature for the sake of open-endedness.
Moreover, that medium needed to be regarded as a toy. They elaborated on different
types of forms, scales, and assemblies before settling on an archetypal form, the cube. Keeping in mind the simple analogy of one cube per child, they proceeded with their project based upon an interesting mixture of analog digitality that took the cube beyond the traditional notion of the 'object'. As with the previous team, this
one also employed two layers of design, the conventional one based on formal and
tactile variations and the hi-tech one based on sensor-actuator assemblies. As a result,
the cube could be regarded and used both as a passive toy, offering the traditional
repertoire of such playthings, as well as an active toy, able to behave as a sensponsive playmate. Regarding the latter, the students decided to introduce time in a wider
sense, allowing also for processes of reflection and understanding to take place, apart
from the immediate responses. The possible combinations of the cube’s different attributes guaranteed the open-ended medium required.
Fig. 7
Fabricating actual-sized models.
The cubic form of the produced object offered a variety of advantages, ranging from
the simplicity and the recognizability of the form to the potential visual and spatial
complexity of having multiple such cubes connected, creating a whole larger than
the sum of its parts. The materiality of the surfaces could be combined to produce
homogenous or heterogeneous displays as well as playful patterns for the children to
use for climbing, sitting, hiding and other actions. Moreover, each cube is penetrated
on two of its sides, either opposite or adjacent, encouraging a 'connect the dots' type of play that can easily lead to the creation of complex three-dimensional
structures. This structural connection is achieved with the use of magnets.
Fig. 8
Whenever a change in temperature takes place, the cube gives out a colored light, ranging between
blue and red, and it may enhance this output with tones of green according to the surrounding
soundscape. Moreover, each cube is programmed to express extreme emotions, such as boredom
and over-stimulation, by emanating a clicking sound. A feeling of happiness can also be communicated when a lot of cubes assemble together through the blinking of their lights.
Regarding the intelligence embedded in the cubes, the students concentrated on the
definition of simple rules. Each cube is programmed to respond to light, sound, temperature and pressure, producing light, color and sound effects. A cube can react to a
variety of stimuli coming from the children, other neighboring cubes as well as from
changes in the environmental conditions, serving as a very potent medium for children to constantly reinvent their own physical and sensorial space, both on an individual and a collective level.
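A minimal sketch of the kind of simple rule illustrated in Fig. 8 is given below: a temperature reading mapped to a blue-to-red colour, with the surrounding soundscape adding tones of green. The temperature range and sound scale are assumptions made for illustration; on the prototypes such a rule would presumably live in the code driving the cubes' sensor-actuator assemblies.

```python
def clamp(x, lo=0.0, hi=1.0):
    return max(lo, min(hi, x))

def cube_colour(temperature_c, sound_level):
    """Map a temperature reading to a blue-to-red colour and let the
    surrounding soundscape add tones of green, as an 8-bit (R, G, B) triple.
    The 15-35 degC range and the 0-1 sound level are illustrative assumptions."""
    warmth = clamp((temperature_c - 15.0) / 20.0)   # 15 degC -> 0 (blue), 35 degC -> 1 (red)
    red = int(255 * warmth)
    blue = int(255 * (1.0 - warmth))
    green = int(255 * clamp(sound_level))           # louder surroundings -> more green
    return (red, green, blue)

print(cube_colour(18.0, 0.1))   # cool and quiet: mostly blue
print(cube_colour(32.0, 0.6))   # warm and noisy: red with green tones
```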
Fig. 9
Toddlers examining the working prototypes. In the long run, the Cubes aim toward a 'play-in-progress' situation of endless possibilities.
There are multiple possible reconfigurations that create twists and unexpected
events, urging children to adapt to the new conditions and evolve their story: the
change of colors creates sudden points of interest, the matching of textures provides
a purpose, cubes left unattended cry for attention while rearrangements incite new
responses from the connected cubes. The continuous change and the urge to adapt are the main elements of the sensponsiveness attributed to this project.
Scenario 03: Sensponsive Traces
The third scenario addressed the theme of traces. A variety of activities are connected to this theme, like creating and leaving traces over time, knowingly or by chance,
following them, changing them or just observing their temporary existence. When it
comes to children, most of the time traces trigger a playful activity. At this point it is crucial to point out that play is considered the main exploration mechanism for children in their quest to understand their own bodies as well as the world around them. Incorporating traces into play constitutes a principal means in their way of learning.
The students researched ways to guide children into understanding that an activity they are involved with affects not only the surrounding environment with the traces it produces but also the play parameters and the decisions of another person. They
designed an artificial terrain where they could control the quality of the environment
to ensure the inscribing effect of the activity in space. Their project had to rely heavily
on technology and on mediums that would help them evade the barrier of a synchronized locality. In other words, technology was able to render the trace-producing activity visible for a longer period of time, in a place different from where it initially took place. It could also provide children with additional feedback and opportunities for reflection. These feedback elements can make concepts that are elusive for children, such as the sense of time or the type and speed of movement, easier to discern.
Fig. 10
Diagrams depicting the design intentions.
Fig. 11
A very inspiring visualization of the atmosphere created during play.
Design-wise, the students came up with a parti pris that allowed for an increased level
of interaction between children, without being obscured by added complexity.
They created an intelligent surface with its two sides acting as floor and ceiling, separating two otherwise isolated levels. Both sides of the surface consist of sensor-actuator assemblies with piezo-electric tiles and flexible fabrics. The part of the surface that
comprises the floor of the upper level can be actuated by the activities of one or more
children, initiating a series of chained reactions. Pressure forces on the floor turn on
a moving laser spotlight that maps the traces of movement on the lower level floor.
By chasing the light, children on the lower level not only play along with the activity above them, without even knowing it, but also affect the geometry of both their roof and the
top level’s floor. The changed terrain affects in turn the movement of children on both
levels, constantly changing the way they leave traces. The intensity of the movement
is depicted through the degree of deformation of the surface. After a while the trace
begins to diminish, and the surface is gradually restored mechanically to its original
state.
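The trace-and-decay behaviour just described can be sketched as follows: pressure on a tile raises a local deformation value, tiles deformed above a threshold are where the laser spotlight points on the level below, and at every time step the mechanical restoration pulls the surface back toward its flat state. The grid size, decay rate and threshold are invented for illustration.

```python
class TraceSurface:
    """A grid of tiles whose deformation records pressure as a trace and
    then relaxes back toward the flat state, as in the Traces prototype."""

    def __init__(self, rows, cols, decay=0.9):
        self.decay = decay                      # fraction of deformation kept per time step
        self.tiles = [[0.0] * cols for _ in range(rows)]

    def press(self, row, col, force):
        # Pressure adds to the local deformation (capped at 1.0 = fully deformed).
        self.tiles[row][col] = min(1.0, self.tiles[row][col] + force)

    def step(self):
        # Each time step, the mechanical restoration pulls every tile back toward flat.
        for row in self.tiles:
            for c in range(len(row)):
                row[c] *= self.decay

    def laser_targets(self, threshold=0.2):
        # Tiles still deformed above the threshold are where the laser spotlight points.
        return [(r, c) for r, row in enumerate(self.tiles)
                for c, d in enumerate(row) if d > threshold]

surface = TraceSurface(rows=4, cols=4)
surface.press(1, 2, 0.8)        # a child steps on one tile
for _ in range(5):
    surface.step()
print(surface.laser_targets())  # the trace is still visible for a while, then fades
```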
Fig. 12
The upper layer of the intelligent surface provides a tactile landscape that changes its relative
altitude through nodes. It is a foldable rigid surface, made of a stable yet elastic material. The lower
layer of the surface is made from an elastic fabric, cut in patterns so as to simulate a natural cover,
resembling tree leaves or spider nets.
Fig. 13
The working prototype as it was exhibited during the conference.
The whole system urges children to experiment with this type of interaction. With
time and through a ‘cause and effect’ narrative they will begin to understand the way
those traces are made, the way their actions affect a wider system and they will eventually refine their playing skills.
Conclusions
The whole process allowed for the evaluation of two main issues, the pedagogic approach and the research framework. For the first part, in the aftermath of the workshop one can clearly acknowledge the fact that a large team of tutors with distinctive
specializations and skills worked in a complementary and very productive way. Throughout the duration of the workshop, the tutors would form and re-form smaller groups
based on the specialization background the emerging problems required and dealt
with them promptly. The high quality of the end-result would not have been achieved
without this conglomeration. Regarding the evaluation of the chosen research framework, it is also evident that in order for architecture to move beyond the established borders and achieve an augmented new state of being useful and collaborative, it requires a multi-disciplinary approach along with an in-depth trial-and-error analysis
of the optimum way to integrate IT in people’s living space. This feature can only be
achieved by diverse groups of researchers and eager students.
A very interesting observation came up at the exhibition of the workshop prototypes to the public during the conference. It has to do with the fact that people,
regardless of their enthusiasm for things that react and move, either regard an animated environment as an artistic installation or, in the best-case scenario, as an environment with predetermined repetitive behavior, in which they lose interest in
the long run. So, even though there is no immediate visible difference in the integration of sensponsive logic into responsive systems (technically speaking), the difference lies in the context that emerges from the use of such spaces through time. In this
sense, the sensponsive approach requires time in order to be truly evaluated by its
users, a fact that seems to be returning to a true architectural value where buildings
are “felt” and “understood” through time and not just judged from their appearance.
A media enhanced architecture, then, should not only lead to decisions that affect the
design but also, more importantly perhaps, it should follow how people accept artificial environments into their everyday lives by providing continuous feedback on their
effectiveness.
Acknowledgements
We would like to extend our gratitude to our teaching collaborators in the workshop, presented
here in alphabetical order: Edith Ackermann [MIT School of Architecture/University of Aix-Marseille], Panayiotis Chatzitsakyris [AUTh], Christian Friedrich [TU Delft], Dimitris Gourdoukis [AUTh],
Peter Schmitt [MIT Media Lab], Susanne Seitinger [MIT Media Lab], Constantin Spiridonidis [AUTh],
Kostas Terzidis [Harvard Graduate School of Design], Maria Voyatzaki [AUTh], Socratis Yiannoudes
[TUC], Emmanouil Zaroukas [University of East London]. Additionally, we would like to express our
appreciation for all the students who participated in this event, coming to Chania from various
places around the world: (A-Z order) Alexiadou Sotiria, Anagnostopoulos Georgios, Andresakis
Georgios, Apostolakeas Vasilis, Apostolopoulos Ioannis, Charvalakis Panagiotis, Chronopoulou
Eleni, Gavra Xara, Gourdoukis Christos, Haratsis Danae, Himpe Caroline, Inglezakis Georgios, Kantarzis Michalis, Koumpli Eleni, Kourounis Iraklis, Kowalczyk Paulina Anna, Letsiou Eleni, Mairopoulos
Dimitris, Maistralis Vaggelis, Pankratov Tanya, Papalexopoulos Vasilis, Papavasileiou Nikos, Paschidi
Mariana, Preari Zoi, Psaltis Stelios, Saylan Turker Naci, Svoronos Theodore, Tomara Avra, Tsesmetzis
Vasilis, Zhou Junhua.
Notes
- A descriptive video presentation of the workshop can be found online at http://senseresponsive.blogspot.com/
- Our research on Sensponsive Design is carried out at the TUC Research Laboratory for the
Creation of Transformable Architecture, Kinetic Systems and Intelligent Environments in collaboration with the TUC Electronic and Computer Engineering Department. Our ongoing
projects are titled Spirit and Ghost, conducted in collaboration with D. Mairopoulos, a final-year
student of Architecture.
References
Ackermann, E., 2010. Constructivism(s): Shared roots, crossed paths, multiple legacies. In: Proceedings of the Constructionism 2010 conference. Paris, France 16-20 August 2010. Paris: American
University of Paris.
Ackermann, E., 2004. Time, and changes over time: A child’s view. In: H. H. Knoop, ed. 2004. Children, play, and time: Essays on the art of transforming stress to joyful challenges. Denmark: Danish
University of Education Press, pp. 101-113.
Lund, H. H., Klitboo, T., and Jessen, C., 2005. Playware technology for physically activating play. In:
Artificial life and robotics 9 (4), 165-174.
Mori, M., 1970. The uncanny valley. Energy 7(4), pp.33-35. Translated by K. F. MacDorman and T.
Minato.
Oungrinis, K. A., 2006. Transformations: Paradigms for designing transformable spaces. Cambridge,
MA: Harvard GSD Design and Technologies Report Series.
Turkle, S., 2005. The second self: Computers and the human spirit. Cambridge, MA: MIT Press.
Sophia Vyzoviti
Department of Architecture
University of Thessaly
Greece
Pleat and Play
The paper discusses ways to embed a critically responsive design methodology in architectural education and illustrates the argument with a case of a deployable prototype fabrication. While robotics and structural engineering are today considered the
dominant constituents of a responsive architectural prototype, the paper focuses on
the architectural layers of responsive form generation in terms of flexibility and programmatic instability. In compliance with the conference theme the paper also poses
a critical question on the overriding technological paradigm and suggests possibilities for low-tech adaptable and responsive prototypes actuated by human agents. Considering shape control in adaptable prototypes, some kinetic and polymorphic examples are discussed, drawing from the bibliography on folded plate structures and origami-based deployables as well as the author's past research1 (Vyzoviti, 2011) in
form generation experiments which employ the continuous surface, its aggregates,
and its subdivisions as a tool. In this framework the paper attempts to make a contribution towards a responsive design process agenda in architectural education focusing on three positions:
• Physical modeling as a complement and as a challenge to digital modeling,
• Participatory form finding as a collective repository of form generation rules that
can be shared between members of a creative community,
• Behavioral research as data to be incorporated in form generation processes, describing systems of human activities as algorithms.
The argument is illustrated with the fabrication of a deployable shell prototype based
on a geometric transformation of the 'Miura-ori' origami pattern. Form-finding is introduced as collective play between the pleated shell and a small group of architecture students. Considered as a material exploration and a topological propaedeutic to a technologically equipped prototype, this prototype draws knowledge from research on deployable structures developed from membrane folding and explores in its design the biaxial shortening of a plane into a developable double-corrugation surface. Further, considering creativity as the ultimate intrinsically human activity and exploring the architectural potential of the fabricated prototype, improvisational group play is introduced as a program, evoking the paradox of rule-following that determines all human activity.
Activated by human agents, the performance of the deployable prototype exceeds its
expectations, manifesting spatial enclosures and containments that were not predicted in its original design, enriching and augmenting its architectural predicaments.
Responsive Surface
According to Tristan d’Estrée Sterk (2003) what is commonly defined as responsive
architecture is “a type of architecture that has the ability to alter its form in response to
changing conditions”2 and comprises a research field that combines interdisciplinary
knowledge from architecture, robotics, artificial intelligence, and structural engineering. The term responsive is usually combined with the term adaptive, broadening the
definition for architecture as a system which changes its structure, behaviour or resources according to demand. A state-of-the-art outlook on adaptive architecture reveals four thematic foci of research in progress3: dynamic façade systems, transformable structures, bio-inspired materials, and intelligent actuation technology. Excluding
intelligent eco-systems and actuation technology, an integrated research entity can be established around the notion of surface, with emphasis on the geometric and structural components.
Since the early 1990’s the notion of surface has evolved into a formal trait among
the avant-garde architectural discourses. Conceptualized within the Deleuzian ontology of the fold it has become associated with diagrammatic techniques and digital
morphogenesis, prolifically materializing in the projected and the built by continuity,
curvature, smooth layering and manipulations of the ground. Liberating architects
from typology and the order of the box, the focus on surface processes diversifies
architectural research in exploring and making explicit form defining strategies that
envision a redefinition of architectural performance. As a diagrammatic practice that
appeals to morphogeneticists as well as urban conceptualists, the surface has acquired
conceptual, operational and material depth.
If we consider the current term structural façade (or structural skin), the modernist dichotomy between the ornamental-geometric character of the building envelope and its structural support appears to have dissolved. In the architectural production of
the past decade the single surface has evolved into an omnipotent architectural tool
combining landform, structure, ornament, aperture, and environmental comfort. Stan
Allen (2009) attends to the innovative capacity of single surface implying that “green
roofs, artificial mountains and geological forms; buildings you walk on or over; networks
of ramps and warped surfaces; buildings that carve into the ground or landscapes lifted
high into the air” that may seem commonplace in architecture today have been exacerbated by the single surface projects of the 1990’s and particularly that “the emergent
field landscape urbanism predicated on a series of formal experiments carried out in the
early 1990s by architects working with strategies of folding, surface manipulation and the
creation of artificial terrains”.4 In this context I would argue that it would be valid to investigate the responsive or adaptive potential of surface as an integral architectural
strategy with a radical programmatic function in terms of indeterminacy and flexibility, beyond dynamic façade design.
The responsive surface has a distinguished set of intrinsic properties. The responsive surface is able to change shape – in terms of its overall form or texture – and over different time-scales, and therefore it is dynamic. Its dynamics are embedded within the design and fabrication process and also manifest as interactivity and/or kinetics in the completed work. Surface kinematics is enhanced by surface geometry, through patterning: tessellations of a single surface or assemblages of kinetic components. Surface geometry is becoming increasingly smart. Surface kinetics is actuated through intelligent components comprising systems of sensors, servo mechanisms, pistons, sliders, LEDs, or lower-tech configurations such as four-bar linkages, gliders, etc.
According to Tristan d'Estrée Sterk (2006)5, structural systems used within responsive building envelopes and exoskeleton frameworks must have controllable rigidity, must be lightweight and must be capable of undergoing asymmetrical deformations.
These essential characteristics work together to provide the most robust and flexible
outcomes and form a core set of principles that can be applied in the development
of all successful responsive architectures. Tensegrity structures comprise the prime
research field for adaptive structures – as exemplified in the adaptive tensegrity of
orambra6 – with a large number of potential applications, due to their small transportation or storage volume, tunable stiffness properties, active vibration damping and
deployment or configuration control. However, I intend to focus on, and invest a certain depth in, another genre of potentially responsive or adaptive surfaces that integrate structural and formal characteristics and manifest kinetic performance while maintaining continuity and cohesion according to the single surface principle: the origami-based deployables.
Origami based Deployables
Another genre of kinetic and potentially responsive surfaces is the folded and tessellated surfaces whose geometry originates in Origami, the ancient Japanese art of paper folding. Folded surfaces most usually encountered in architecture are static structures, folded plate shells, with applications as roofs, load-bearing walls, or combinations of both. Martin Bechthold (2007), who investigates cementitious composite folded plate systems, defines shells7 as material-efficient structurally active surfaces that are extremely thin in proportion to their load-bearing capacity for large spans, and epitomizes their aesthetics in the early built works of Eduardo Torroja, Pier Luigi Nervi and Félix Candela. Folded plate structures that are kinetic are rarely realized in architecture despite their high performance for wide spans and their formal qualities. Martin Trautz and Arne Künstler (2009)8 investigate possibilities to combine the advantages of folded plate structures with the possibility of a reversible building element through folding and unfolding, by considering the existing plate thickness and associated stiffness that constrains the kinetic behaviour of origami-originating patterns.
The most relevant – in architectural terms – kinetic property of origami patterns and their derivatives is deployment. Origami based deployables transform from a two-dimensional into a three-dimensional kinetic surface and then into a flat-packaged one. In this sequence of transformations deployment occurs by translation and by rotation. When constructed of thin sheets of paper, origami based deployables are extremely lightweight yet rigid, manifesting overall zero Gaussian curvature. Further – as we will discover in the last part of the presentation – they bear the potential for secondary asymmetrical deformations. Therefore origami based deployables bear the potential to constitute a research repository for adaptive – responsive surfaces.
Origami based deployables of architectural relevance are tessellated kinetic surfaces based on 4-fold mechanisms that manifest synchronous deployment. The most prominent pattern in this respect is the Miura-ori. Devised by Prof. Koryo Miura of Tokyo University’s Institute of Space and Aeronautical Science in 1980, the Miura-ori folding technique allows the unfolding and refolding of a surface in one action9.
The Miura-ori pattern comprises a grid of creases arranged along zig-zag lines at angles of small deviation to the vertical. According to Pellegrino and Vincent (2001, p. 64)10, who compare the behaviour of a sheet surface folded according to an orthogonal grid – as applied in road maps – and the Miura-ori pattern, the zig-zag geometry introduces a radical change in the unfolding behaviour of the surface compared to the orthogonal grid: that of synchronous – rather than sequential – deployment. Folding a sheet according to Miura-ori manifests biaxial shortening of the plane into a flat-foldable yet rigid double-corrugation surface.
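As an illustration of how such a pattern can be prescribed numerically, the sketch below generates the vertices and crease segments of a Miura-ori style grid of parallelograms. The dimensions, the zig-zag angle and the exact phase of the mountain/valley assignment are assumptions of this sketch, not values taken from the text.

```python
import math

def miura_crease_pattern(rows, cols, a=1.0, b=1.0, theta_deg=15.0):
    """Generate vertices and crease segments of a Miura-ori style pattern.

    a, b : parallelogram side lengths; theta_deg : zig-zag deviation angle.
    Returns (vertices, creases); each crease is (key1, key2, 'M' or 'V'),
    where the keys index into the vertices dictionary.
    """
    d = a * math.tan(math.radians(theta_deg))     # vertical offset creating the zig-zag
    verts = {}
    for i in range(rows + 1):
        for j in range(cols + 1):
            verts[(i, j)] = (j * a, i * b + (j % 2) * d)

    creases = []
    for i in range(rows + 1):
        for j in range(cols):                      # zig-zag creases: same sign along a row
            creases.append(((i, j), (i, j + 1), 'M' if i % 2 == 0 else 'V'))
    for i in range(rows):
        for j in range(cols + 1):                  # short creases: sign alternates (one common phase)
            creases.append(((i, j), (i + 1, j), 'M' if (i + j) % 2 == 0 else 'V'))
    return verts, creases

verts, creases = miura_crease_pattern(rows=4, cols=6)
print(len(verts), "vertices,", len(creases), "creases")
```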
Complementary to the Miura-ori, there are two more origami patterns based on 4-fold mechanisms that appear in the literature on origami-derived surface structures: the Yoshimura pattern and the Diagonal pattern. Buri and Weinand (2008)11
outline two fundamental geometric characteristics in the structure of these patterns with operational potential: their parallel corrugations and the principle of the reverse fold. They proceed through geometric analysis and employ digital form-finding software to generate variations of complex folded patterns along two generative paths: the section profile and the corrugation line. Buri and Weinand (2008) prove that numerical models of complex folded plate structures enable them to be milled by Computer Numerically Controlled machines and fabricated from cross-laminated timber panels. However, despite the innovative form-generation contribution of their research to the domain, the prototypes produced are static structures.
Pellegrino and Vincent (2001, p. 60)12 deliver the proof of one fundamental constraint for all deployable membranes modelled as zero-thickness surfaces. The ‘inextensionality constraint’ is a necessary condition for flat-foldability, also known as compact packaging. Accordingly, the inextensional folding of a membrane requires that whenever different creases meet at a common point there should be at least four folds, of which three have one sign and one fold has the opposite sign. The inextensionality constraint can be applied to origami based deployable surface patterns, enriching the basic origami rule that crease patterns are two-colourable, that is, that every crease is either concave (also known as a mountain) or convex (also known as a valley). Therefore prototypical origami patterns such as the Miura-ori, the Yoshimura and the Diagonal, as well as their geometric transformations, intrinsically bear the potential to become deployable surfaces.
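A minimal sketch of how this local condition could be checked computationally is given below. The degree-four case described in the text (three creases of one sign, one of the other) is applied here in its general form, Maekawa's theorem, which requires the numbers of mountain and valley creases at each interior vertex to differ by exactly two; the crease data structure and the vertex list are assumptions of the sketch (compatible with the Miura-ori generator above).

```python
from collections import defaultdict

def check_inextensionality(creases, interior_vertices):
    """Check the local flat-foldability condition at interior vertices.

    creases : iterable of (v1, v2, sign) with sign 'M' (mountain) or 'V' (valley).
    At each interior vertex there must be at least four creases, and the
    mountain/valley counts must differ by exactly two (Maekawa's theorem,
    whose degree-4 case is the three-to-one rule quoted in the text).
    Returns a list of (vertex, mountains, valleys) that violate the rule.
    """
    incident = defaultdict(list)
    for v1, v2, sign in creases:
        incident[v1].append(sign)
        incident[v2].append(sign)

    problems = []
    for v in interior_vertices:
        signs = incident[v]
        m, n = signs.count('M'), signs.count('V')
        if len(signs) < 4 or abs(m - n) != 2:
            problems.append((v, m, n))
    return problems
```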
Digital Simulation vs Material Computing
Origami based deployable surfaces have until recently resisted digital simulation accessible to architects, their production relying exclusively on relentlessly disciplined sequences of paper folding and their representation in mathematical models. Nevertheless, in the past few years the possibility of simulating the biaxial shortening of a plane into a flat-foldable and developable double-corrugation surface has become available through a user-friendly interface. Tomohiro Tachi (2009)13 has developed a system for computer-based interactive simulation of origami, based on a rigid origami model that manifests the continuous process of folding a piece of paper into a folded shape by calculating the configuration from the crease pattern. Rigid Origami Simulator is a GUI program for Windows that simulates the kinematics of rigid origami from the origami crease pattern, helping users to understand the structure of the origami model and the way to fold the model from the crease pattern. It requires as input a minimum necessary representation of a folded surface: a two-dimensional crease pattern which complies with the two-colourable rule – the notation of convex and concave creases – in a flat state. Rigid Origami Simulator is open source and facilitates the formation of an online interest community.
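The simulator itself solves the full constrained kinematics of the whole crease pattern; as a minimal, hedged illustration of its underlying rigid-facet assumption, the sketch below merely rotates the vertices of one facet about a crease axis by a prescribed dihedral angle, using Rodrigues' rotation formula. The facet coordinates and fold angle are illustrative, not taken from any of the prototypes discussed, and the sketch is not a reimplementation of Tachi's software.

```python
import numpy as np

def rotate_about_crease(points, p, q, angle):
    """Rigidly rotate facet vertices about the crease line through p and q.

    points : (n, 3) array of facet vertices
    p, q   : 3D points defining the crease (hinge) axis
    angle  : dihedral fold angle in radians
    Uses Rodrigues' rotation formula; the facet itself stays undeformed,
    which is the basic move of a rigid origami model.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    u = (q - p) / np.linalg.norm(q - p)          # unit axis direction
    v = np.asarray(points, float) - p            # positions relative to a point on the axis
    cos_a, sin_a = np.cos(angle), np.sin(angle)
    rotated = (v * cos_a
               + np.cross(u, v) * sin_a
               + np.outer(v @ u, u) * (1.0 - cos_a))
    return rotated + p

# Example: fold a unit square facet about the x-axis by 90 degrees
facet = [[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]]
print(rotate_about_crease(facet, [0, 0, 0], [1, 0, 0], np.pi / 2))
```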
My contribution to the field of research in origami based deployable surfaces comprises form-finding experiments conducted primarily in analogue media, incorporating the basic morphogenetic rules – parallel corrugations, the principle of the reverse fold and the inextensionality constraint – in hands-on paper folding experiments (Fig. 1). A state-of-the-art definition of the research methodology conforms to the notion of material computing. Patrik Schumacher (2007) defines material computing as “analogue form-finding processes that can complement the new digital design tools
Fig. 1
Origami based deployable surfaces and patterns (source: Vyzoviti 2011).
and that might in fact be described as quasi-physical form-finding processes”.14 In this context the crease pattern of the deployable surface is not the starting point, or generator, of the form-finding procedure, but rather one of the findings of the experiment. The derivative crease patterns, analysed and reconfigured in order to comply with the two-colourable rule, can become input for the Rigid Origami Simulator, enter a cycle of digital transformations of the folded form and become a numeric model within state-of-the-art software. In the case of origami based deployable surfaces physical modelling is a complement as well as a challenge to digital modelling.
The particular experiments with origami based deployable surfaces in analogue media confront the geometry of surface transformations with material effects, incorporating factors such as gravity, time, and human participation. A series of origami based deployable surface prototypes produced under my supervision in the academic studio15 challenge materiality through an agnostic eclecticism of surface treatments. If the crease pattern comprises ‘intelligible matter’16, full of potentiality, the deployable surface investigates numerous possible actualities. Prototypes may be fabricated in rigid or soft sheets as well as in combinations of both (Fig. 2). Paperboard is the most commonly employed rigid surface material: opaline paper above 220 g (2.1) combines rigidity with foldability, while scored corrugated paperboard (2.2), specifically reinforced with duct tape along its scores (2.3), is suitable for up-scaled prototypes. Some of the better composites in the current collection of deployable surface prototypes17 (Vyzoviti, 2011) combine soft and hard materials: nylon membrane reinforced with plastic rods (2.4) or lycra fabric reinforced with cardboard facets of considerable thickness (2.5).
Fig. 2
Origami based deployable prototypes (source: Vyzoviti 2011).
The performance of these origami based surface prototypes in terms of kinematics manifests translational and rotational deployment, flat packaging, axial rotations and, more rarely, axial revolutions. Translational deployment produces the spatial effect of oscillating tubular forms that contract and expand while maintaining constant enclosure. Rotational deployment produces the spatial effect of oscillating spherical forms that allow for variable boundary enclosure. Flat packaging produces the spatial effect of a diminishing footprint. Revolutions and axial rotations produce the spatial effect of alternating concave and convex enclosures.
Traditionally, origami artefacts derive from disciplined, time-ordered sequences of folding actions that transform a flat sheet of thin paper into a three-dimensional object. Origami is art, science and craft. Origami instructive notations are delivered in practical terms, as single surface transformations which can be easily conveyed in sets of simple rules. The hands-on production of the Miura-ori pattern, for example, incorporates geometric principles within a distinct set of acts: pleat, crease, reverse, unfold, fold up or down. Such sets of embodied – or hands-on – geometry, which constitute its basic know-how, can be considered elementary form-finding knowledge components that can easily be disseminated, diagrammatically described, algorithmically prescribed or even narrated. Partaking in a collective repository of form generation rules or simple morphogenetic algorithms, they may be shared between members of a creative community and become scaffolds for collective creativity. Therefore origami based patterns can be considered as matter which is not physical, operating as participatory form-finding devices.
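By way of illustration only, such an algorithmically prescribed recipe might be encoded as a short list of named acts that can be narrated, diagrammed or scripted and shared within a creative community. The action names and parameters below are hypothetical, loosely following the pleat–crease–reverse–unfold–fold vocabulary mentioned above; they are not a notation used in the workshops described.

```python
# A hypothetical, minimal notation for a hands-on folding recipe: each step is
# an (action, parameters) pair that could be narrated, diagrammed or scripted.
MIURA_LIKE_RECIPE = [
    ("pleat",   {"direction": "horizontal", "divisions": 5}),   # accordion pleats
    ("crease",  {"direction": "diagonal",   "angle_deg": 15}),  # zig-zag creases
    ("reverse", {"every_other_row": True}),                     # reverse-fold alternate rows
    ("unfold",  {}),                                            # open to reveal the crease pattern
    ("fold",    {"mode": "collapse"}),                          # collapse into the folded form
]

def narrate(recipe):
    """Turn a shared folding recipe into plain-language instructions."""
    for i, (action, params) in enumerate(recipe, 1):
        detail = ", ".join(f"{k}={v}" for k, v in params.items())
        print(f"Step {i}: {action}" + (f" ({detail})" if detail else ""))

narrate(MIURA_LIKE_RECIPE)
```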
Deployable Shell Prototype
The case that I use to illustrate the argument is a kinetic deployable shell fabricated together with students during an EASA summer workshop18. The prototype shell demonstrates the kinetic potential of a minimum-thickness double-corrugation surface. Further, the prototype shell challenges the potential of non-architectural matter (corrugated paperboard) to acquire architectural properties through geometric patterning, pleated according to an origami based crease pattern. It also verifies the potential of hands-on paper folding experiments to generate patterns for deployable surfaces,
capable of becoming the intelligible matter of digital simulations. Finally, the prototype shell investigates the performance of a deployable shell in the context of user interaction, in this case partaking in improvisational group play. The installation of the prototype in a low-tech context poses a critique of the dominant norm that actuated shape change and control occur exclusively in the fields of robotics and artificial intelligence. It also poses a critical question about the desires and intentions of possible users of responsive architecture.
The pattern selected for the prototype derived from a hands-on paper folding session with the EASA students. It was first created in lightweight opaline paper. Its deployment ability is based on a four-fold mechanism, and its pattern comprises a geometric recombination of the Miura-ori and Yoshimura origami patterns. The digital simulation of the pattern’s deployment process is a post-production made only recently with Rigid Origami Simulator19 and demonstrates in discrete steps the transformation of the two-dimensional crease pattern into the three-dimensional folded form and further into flat packaging. Two formal states comprise the minimum and maximum values the object achieves: the flat state (two-dimensional pattern) and the flat-package state (inextensional folding). In between, the object acquires a variety of vault configurations producing an oscillating tubular spatial effect. Synchronous translational deployment is evident in all steps of the transformation from vault to flat package (Fig. 3).
The fabrication of the prototype shell explored the potential of a physical double-corrugation surface in terms of kinematics and formal effects. The architectural – or programmatic – objective in this case was a manipulable and deployable space-enclosing object, larger than a piece of furniture and smaller than a room. The selected pattern was scaled to fit – in vault configuration – the height of a sitting adult or a standing child. The shell was fabricated in corrugated paperboard and duct tape. It was produced by assembling components of 100x70 sheets of double-wall corrugated board. The sheets were scored according to the crease pattern on both surfaces of the board. Three layers compose double-wall corrugated board: the external liners and the corrugated medium of flutes. The external liners enable the board to receive scores along valley creases and cuts along mountain creases, in compliance with the two-colourable origami rule. Component sheets were initially folded according to the pattern and then assembled in a semi-flat state (Fig. 4). Mountain creases were covered with duct tape, ensuring that the shell remained cohesive and establishing continuity between independent pleated components. Two identical prototype shells were fabricated.
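As a small bookkeeping sketch of this step, the function below maps the two-colourable crease assignment onto the fabrication operations described above (scores along valleys; cuts, later taped, along mountains). Which liner of the board receives each operation is an assumption of the sketch, not a detail given in the text.

```python
def fabrication_plan(creases):
    """Map a two-colourable crease list onto operations on corrugated board.

    creases : iterable of (v1, v2, sign) with sign 'V' (valley) or 'M' (mountain).
    Following the text: valleys are scored, mountains are cut and then
    reinforced with duct tape.  The choice of liner is an assumption.
    """
    plan = []
    for v1, v2, sign in creases:
        if sign == 'V':
            plan.append((v1, v2, "score liner along crease"))
        else:  # 'M'
            plan.append((v1, v2, "cut liner along crease, reinforce with duct tape"))
    return plan
```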
Initially, the prototype shells were considered complete when folded into a vault configuration, smoothly performing translational and rotational deployment. They were carried flat-packaged to the installation location. Form-finding was further introduced as improvisational collective play within the small group of participants (Fig. 5). The group explored the deployment ability of each shell in an indeterminate series of transformations, alternating with overall enfolding of the pleated shell in order to produce minimum inhabitable spaces. Considering creativity the ultimate intrinsically human activity, improvisational group play was introduced as a form generation program, an activity that actuated the pleated surface, challenging the limits of its deployment while exploring the kinetic behaviour of the fabricated prototype. Partaking in a series of transformations, the pleated shell became animated, constantly affected by the improvisational choreography informing and deforming it.
Fig. 3
Pleated prototype deployment simulation with Rigid Origami Simulator © Tomohiro Tachi 2007-2009 (source: Vyzoviti 2011).
Fig. 4
Pleated prototype component assembly (source: EASA007 archives).
Fig. 5
Stills from pleated prototype collective performance (source: Vyzoviti 2011).
Fig. 6
Pleated prototype folded into minimum inhabitable capsule (source: Vyzoviti 2011).
Throughout the performance the prototype remained cohesive and self-supporting, asserting the potential of corrugated paperboard to physically substantiate an origami based deployable pattern. Complementary to the translational deployment predicted in the digital simulations, counterintuitive or unexpected configurations were also observed. Noteworthy configurations of the shell were achieved in between stages of translational and rotational deployment, such as asymmetrical deployment and tilting of the deployment axis. Enfolding of the pleated shell produced spatial enclosures and containments that were later evaluated as minimum inhabitable capsules (Fig. 6). Activated by human agents, the deployable surface surpassed the group’s initial expectations, enriching and augmenting its architectural performance.
Multiple shape configurations that appeared during the performance were not considered in advance when designing the prototype shell. The digital simulation of the deployable pattern is not adequate to predict (beyond translational deployment) the full spectrum of shape change in the physical prototype animated through human activation. The prototype shell’s multiple responses to improvisational group moves are not a case of efficient causality. Formal instability and plurality are triggered by human interaction within the group and between the group and the kinetic prototype. The pleated surface generated shape changes operating as a morphological automaton. This performance can be partly attributed to the intrinsic behaviour of the material pattern (synchronous deployment) and partly to material failure, since sheet thickness and surface fatigue may also influence the transformations of the prototype. The physical deployable prototype and the group can be considered as one unity – pleat | play – and described as one complex functional structure. I would argue that in terms of kinematics and shape change the unity pleat | play possesses an emergent property. The behaviour of the whole in terms of kinematics and polymorphics cannot be reduced to the sum of its parts. As Manuel DeLanda (2011, p. 385) defines it, “a whole is said to be emergent if it is produced by causal interactions among its component parts. Those interactions, in which the parts exercise their capacities to affect and be affected, constitute the mechanism of emergence behind the properties of the whole.”20 And this emergent property renders the pleat | play experiment extremely relevant to the debate on rethinking the human factor in technology-driven architecture.
Concluding Remarks
In conclusion, rather than strictly summarizing the preceding argument I would like to
submit some propositions on the way digital technology may be embedded in architectural education today. In order to educate in the field of architecture today we need
to create. As new technologies facilitate prototype production through digital fabrication apparatuses, our role as educators has shifted towards inventing ways to sustain and promote a culture of making rather than of emulating projects through drawings. Computer Aided Design interfaces have liberated architectural form finding from the tediousness of technical drawing, as in the traditional plan – section – elevation – model compositional sequence, providing utilities for immediate three-dimensional form-finding experiments with digital dynamic models. Paradoxically, physical modelling as material computing, or as analogue ramifications of digital modes of representation21 (Davis, 2006, p. 98), has become extremely relevant in current architectural form finding. And even more paradoxically, analogue materiality has contributed to digital morphogenesis some pre-digital techniques and modes of thinking, such as bricolage and making do with collective repositories of form generation knowledge (scripts, rules, patterns, etc.). Form finding in the digital era has become not only more immediate but also more collective and collaborative.
In addition, workshop teaching as an educational model, which is becoming increasingly popular today, also equips and facilitates educators in creating.
Workshops accelerate and intensify standard design processes in the field of architecture providing the frame for exhaustive investigation of one single topic, in brief and
in depth. Workshops allow diversions from and enrichment of standard architectural
content by forming interdisciplinary teams. Their experimental nature promotes risk
taking. The capacity of problem solving or unveiling all parameters that structure a
problem increases significantly through collaboration. Workshops function as social
condensers providing platforms for community bonding. Collective creativity sessions
are enjoyable and rewarding. Evaluating the deployable prototype fabrication and installation of the previous section in terms of educational benefits, I can conclude that the benefits were not only knowledge-based but also enhanced communication and collaboration skills. There are two necessary conditions to bind a group of architecture students into a creative community: working together in creative sessions towards a collective product and sharing a palette of comprehensible and achievable
forms. These conditions ought to be considered towards a design methodology that
critically embeds new technologies in the academic design studio.
Notes
1 Vyzoviti, S, 2011. Soft Shells: Porous and Deployable Architectural Screens. Amsterdam: BIS
Publishers.
2 Sterk, TdE, 2003. Building Upon Negroponte: A Hybridized Model Of Control Suitable For
Responsive Architecture available at http://www.sciencedirect.com/science/article/pii/
S0926580504000822 (accessed 28-1-2011).
3 Adaptive Architecture, 2011. International Conference. 3-5 March 2011. London: Building Center.
available at http://www.buildingcentre.co.uk/adaptivearchitecture/adaptive.html (accessed
1-11-2011).
4 Landform Building -Architecture’s New Terrain. 2009. A working Conference at Princeton University School of Architecture. 18 April 2009. available at http://soa.princeton.edu/landform/
(accessed 28-1-2011).
5 Sterk, TdE, 2006. Shape Control in Responsive Architectural Structures – Current Reasons and
Challenges. In 4th World Conference on Structural Control and Monitoring (accessed 11-08-2011).
6 www.orambra.com (accessed 11-08-2011).
7 Bechthold, M., 2007. Surface Structures in the Digital Age: Studies in Ferrocement. In Lloyd Thomas, K. ed. 2007. Material Matters: Architecture and Material Practice. London: Routledge, pp. 139-150.
8 Trautz, M, and Künstler, A. 2009. Deployable folded plate structures – Folding patterns based
on 4-fold-mechanism using stiff plates. In International Association for Shell and Spatial Structures (IASS) Symposium. 28 September – 2 October 2009. Spain: Universidad Politecnica de
Valencia. available at: http://trako.arch.rwth-aachen.de/publikationen/aufsaetze.php (accessed
11-10-2011).
9 Inspired by origami, Koryo Miura and Masamori Sakamaki created the Miura-ori folding technique to allow people to unfold a map in one action and refold it with minimal hassle. Available at http://www.designsojourn.com/the-miura-ori-map (accessed 11-10-2011).
10 Pellegrino, S, and Vincent, J. 2001. How to fold a membrane. In Pellegrino, S. ed. 2001. Deployable Structures. Wien – New York: Springer, pp. 59-75.
11 Buri, H, and Weinand, Y. 2008. Origami – Folded Plate Structures, Architecture. In 10th World Conference on Timber Engineering, 2-5 June 2008, Japan: Miyazaki. Available at http://infoscience.
epfl.ch/record/118687 (accessed 11-10-2011).
12 Ibid, Pellegrino and Vincent, 2001, p: 60.
13 Tachi, T, 2009.Rigid Origami Simulator Software. Available at http://www.tsg.ne.jp/TT/cg/ (accessed 1-8-2011).
14 Schumacher, P, 2007. Engineering Elegance. In Kara, H. ed, 2008. Design Engineering, London:
AKT. Available at http://www.patrikschumacher.com/Texts/Engineering%20Elegance.html
(accessed 28-1-2011).
15 Supersurfaces Design Elective. 2007-2009. Department of Architecture, University of Thessaly
Greece. Available at www.supersurfaces-supersurfaces.blogspot.com
16 Aristotle in Metaphysics argues that mathematical objects have matter, noetic or intelligible.
Therefore geometric patterns can be matter without being physical.
17 Ibid, Vyzoviti, 2011, pp: 57-87.
18 Supersurfaces Workshop, 2007. In European Architecture Students Assembly EASA007_Final
Report: City Index: Greece: Elefsina. 2010. Greece: School of Architecture National Technical.
19 Ibid, Tachi, 2009.
20 DeLanda, M. 2011. Emergence, Causality and Realism. In Bryant, L, Srnicek, N and Harman, G. eds. The Speculative Turn: Continental Materialism and Realism. Melbourne: re.press, pp. 381-392.
21 Davis, W, 2006. How to Make Analogies in a Digital Age. In October 117, October Magazine,
Ltd. and Massachusetts Institute of Technology, pp: 71–98.
Contributions
(Re)searching a Critically
Responsive Architecture
Henriette Bier
Faculty of Architecture, TU Delft
The Netherlands
Reconfigurable Architecture
Technological and conceptual advances in fields such as artificial intelligence, robotics, and material science have enabled responsive architectural environments to be implemented and tested in the last decade by means of virtual and physical prototypes; these incorporate digital control, namely sensing-actuating mechanisms, enabling them to interact with their users and surroundings in real time through physical or sensory change and variation1.
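As a minimal, hedged illustration of such a sensing-actuating mechanism – not a description of Hyperbody's actual implementation – the sketch below runs a simple proportional sense-decide-actuate loop; the sensor and actuator functions, the target value and the gain are placeholders for whatever hardware interface a prototype exposes.

```python
import time

def control_loop(read_sensor, drive_actuator, target=0.5, gain=0.8,
                 period=0.1, steps=100):
    """A minimal proportional sensing-actuating loop.

    read_sensor    : callable returning the current environmental/user signal
    drive_actuator : callable applying a correction to the physical component
    target, gain, period, steps : illustrative control parameters
    """
    for _ in range(steps):
        value = read_sensor()                  # e.g. proximity of a user
        correction = gain * (target - value)   # proportional response
        drive_actuator(correction)             # e.g. extend or retract an actuator
        time.sleep(period)                     # pace the loop at roughly 1/period Hz
```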
In this context, Hyperbody at TU Delft has been pushing interactive architecture forward by developing prototypes at building-component scale. While these prototypes2 obviously point towards a paradigm shift from static towards dynamic, reconfigurable architecture (Fig. 1 & 2), they operate not at building but at building-component scale (façade, wall, etc.) and do not address socio-economic or environmental aspects that affect society at large.
Fig. 1
TUD students working on the development of the MuscleTower
II’s reconfigurable system.
The aim is, therefore, to develop reconfigurable architecture at building scale that addresses, with consideration of environmental impact, issues such as inefficient use of built space and overpopulation. In this context, the application of robotics to architecture advances research in distributed autonomous mechatronic systems, becoming a test bed for the development of new robotic building systems that aim to be energy-efficiently resizable and spatially expandable or contractible, as well as kinetically moveable or moving3. They furthermore aim to generate and use energy gained from solar and wind power4, which implies that their ecological footprint is minimized and their economic efficiency increases significantly due to the maximized 24/7 multiple use of built space as well as sustainable operability.
Fig. 2
InteractiveWall developed by Hyperbody in collaboration with Festo employs sensor-actuator
technologies.
In conclusion, societal problems such as overpopulation5 and the currently limited use (25%) of built space can be successfully addressed by means of reconfigurable architecture, while robotics applied to architecture not only enables the advancement of distributed autonomous mechatronic systems but also becomes a test bed for the development of new building operation technologies. The aim is, therefore, to develop and test new approaches for the control and regulation of autonomous systems and to apply them to architecture in order to offer solutions for the efficient use of built space.
Reconfigurable, robotic architecture is, as discussed in this paper, a technology bearer as well as a test bed for distributed, autonomous systems developed in disciplines such as automation and robotics, implying not only the transfer of intelligent mechatronic strategies to architecture but also the development of new concepts and ideas for architecture, where reconfiguration is accomplished collectively by smaller subsystems, such as building components, operating in cooperation.
Notes & References
1 H. Bier and T. Knight, ‘Digitally-driven Architecture’ in 6th Footprint issue edited by H. Bier and
T. Knight (Delft: Stichting Footprint, 2010), pp. 1-4.
2 Relevant examples of prototypes for reconfigurable architecture built at TUD are: - http://www.
youtube.com/watch?v=PVz2LIxrdKc&feature=fvw - InteractiveWall TUD in collaboration with
Festo, - http://www.youtube.com/watch?v=PVz2LIxrdKc - and MuscleTower - TUD [accessed
3 January 2011].
3 Archibots workshop at UBICOMP 2009, group #4 with inter al. H. Bier, J. Walker and J. Lipton
- http://www.archibots.org/ - [accessed 3 January 2011].
4 H. Bier and R. Schmehl, ‘Archi- and Kite-bots’ paper on integrated sustainable energy generation
for reconfigurable architecture to be published in IA #5 on ‘Robotics in Architecture’ edited by
H. Bier and K. Oosterhuis (Heijningen: Jap Sam Books, 2011).
5 The UN forecasts that today’s urban population of 3.2 billion will rise to nearly 5 billion by
2030 - http://www.un.org/esa/population/publications/wpm/wpm2001.pdf - when 3 out of
5 people will live in cities [accessed 12 May 2011].
Acknowledgements
Projects presented in this paper have benefited, inter alia, from the contributions of the Protospace and Hyperbody teams, including the students participating in the projects. Detailed credits for individual
projects can be found on the Hyperbody website (http://www.hyperbody.nl - accessed 25 March
2011).
Ermis Chalvatzis
Natassa Lianou
Zaha Hadid Architects
UK
Krama Proto-Tower
Systemic Interactions
Design Research
KRAMA is the exploration of a Proto-Tower which, as its name implies (from the Greek ΚΡΑΜΑ // krama = alloy), is a synthesis, a mixture which develops relationships between multiple systems and subsystems1 to ultimately produce one interconnected soma, one organism able to perform as one unit with its inherent intelligence and rules, serving and adapting to human needs. Similar to a living organism, our research proposes a total parametric creature which establishes feedback loops between human needs and spatial expression, producing various phenotypes. Thus, by probing relationships between each system and subsystem of the Proto-Tower we produce multiple variations, offering plateaux for selection.
Our essay follows the meticulous juxtaposition of structural, aesthetic, environmental, spatial, and psychological systems, exploring each one’s relationship with its neighboring subsystem.
Site Specific
Krama Proto-Tower performs in extreme weather conditions, more specifically in Tromso, in northern Norway. Tromso is located among the northern fjords of Norway, inside the Arctic Circle. It is the largest urban area in Northern Norway and it experiences a subarctic climate. The mean temperature is 3°C throughout the year. Due to its location, Tromso experiences the Midnight Sun for 6 months and the Polar Night for the other 6 months of the year. Thus during June and July there is no darkness at all, while during December and January it is dark all the time.
Focusing on the social data of our site, Tromso has a population of 67,000 people. Recent climate research clearly shows that climate change is altering the fabric of the entire polar region. As the ice melts, the ecosystem of the Arctic is being transformed. At the same time, opportunities are expanding for economic activities and for the development of the Arctic region’s plentiful natural resources. In sum, the Arctic is undergoing a great transformation, and our tower can therefore be part of this transformation.2
Simultaneously, according to the UNDP’s most recent Human Development Report, Norway has the highest standard of living in the world.3 Despite the high quality of life, Norway also has a high suicide rate. The lack of daylight correlates with depression and suicide, and Norway’s climate, with its long, dark winters, is a main cause of this high suicide rate. People of Scandinavia in general suffer from Seasonal Affective Disorder (SAD) and experience depression in the absence of sunlight, which may lead to suicidal tendencies.4 The low urban densities of the cities and the isolated pattern of inhabitation are also parameters that affect suicidal tendencies, together with the high consumption of antidepressants and alcoholism.
Thus, although Norway scores very high on standards of living, Norwegians are more pessimistic than poorer societies. Moreover, during winter people go to hospitals and receive light treatment in order to absorb some light, retain energy and avoid depression. Simultaneously, due to the harsh climate, people stay in their houses for days, becoming isolated from their neighbors and from the whole of society.
The lack of sunlight and the harsh climate affect the psychological state and the relationships between people, and that is why Norway has high suicide and depression rates.
Local Phenomenon_ Aurora Borealis
Tromso is the best place for someone to experience the great local phenomenon, the aurora borealis, also known as the northern lights. These dancing lights are luminous glows of the upper atmosphere caused by energetic particles that enter the atmosphere from the Sun and hit molecules and atoms of the atmosphere. Tromso moves into the aurora zone around 5 pm and moves out again around midnight, and these luminous glows are most evident during winter, when there is total darkness.5
Proto-Tower scenario
Thus, based on the site, social and climate data of Tromso, KRAMA deals with the creation of towers which propose a new type of habitation in this area, a kind of social network able to accommodate at least half the population of the city and to “protect” it from the harsh climate during winter. Through our research we aim to produce a satellite city within the city, based on social interaction and light. Our proto-tower is a Krama, which is also its name, meaning that it is a synthesis, a mixture which develops relationships between multiple systems, producing one interconnected soma able to perform as one unit with its inherent intelligence and rules. Krama consists of various systems and subsystems, each with its inherent autonomous logic, that come together, adapt their bodies in relationship with their neighbors and create one articulated and interconnected global system able to perform as one living organism.6 Moreover, the ground is a fundamental parameter which influences and is influenced by the tower topology, since it performs as the topological surface giving birth to the tower’s own body.
Thus, as we spread our seeds into the ground, the ground networks generate populations of fibres with micro-intelligence which recognize each other. With their inherent intelligence they perform as structural elements, as the feet of the proto-tower, and by communicating and exchanging data or matter they differentiate according to fitness criteria of shape variation and to the proto parameters. Finally, the skeletal patterns and the whole field of towers – with the slabs, the navigation system and the skin – perform as one interconnected system made of luminous veins throughout their body, which, according to their local behaviors, produce stronger and weaker areas and thus closed and open spaces.
All the systems should develop relationships with the rest of the systems or subsystems, in order to be able to communicate and adapt to every situation. Everything is a matter of relationships, from the social character of the building to the structural scenarios of the systems and the subsystems. Thus, as Karl Chu argues that “information is the currency of life”,7 we should implant the communication data inside the seeds, in order to let our creature evolve and mutate based on its inherent intelligence.
Focusing on the function of our proto-tower, it is a mixed-use tower, where residential apartments, hotel rooms, retail outlets and offices are distributed within the skeleton, while at ground level there are additional public facilities.
Simultaneously, we are proposing a seasonal occupation of the residential zone, where during winter people can live together inside the tower, while during summer they can spread back out into the city and make it vivid again. At the same time, the
Fig. 1-2
Proto -Tower scenario.
rest of the functions will continue in use, adapting and transforming their program according to seasonal needs.
On the other hand, our design goals deal with an ecological perspective of this building in relationship with its environment, adapting to the extreme weather conditions of the area. We aim to achieve a structural performance which, from an engineering point of view, engages with this extreme weather environment, introducing relationships between our main parameters (fibre profile, foot condition, fibre distribution, etc.) and the environmental constraints (wind, rain, local phenomena, etc.).
Proto Relationships
Our journey begins by observing the natural vertical growth of a plant and the process it follows in order to create one strong body, able to hold and control itself during its vertical growth. Thus, starting with the skeleton, which is triggered by the properties of the natural growing system, we continue with the remaining subsystems, such as the connection systems, the slabs, the navigation and the skin of the proto-tower. While all the systems and subsystems follow their own inherent logic, when they meet each other they communicate, they exchange data and eventually they adapt to the various situations, globally as well as locally. Therefore, we define the proto relationships, after experimenting and exploring various paths within each system, that determine the local and global behavior of each organ when it meets another one, according to the situation, the location and the function. Finally, all the autonomous organs eventually become one interconnected organism, referring to the initial natural process of the vine’s growth as a controlled method of producing structural stability with extra living qualities.
Thus, our creature can exist as an autonomous structure but also as a member of a cluster, able to adapt to external stimuli as a whole, not only globally but locally too.
Our journey to the proto-tower is triggered, at its initial state, by the growth of the vine. The vine has the characteristic of developing along a vertical axis when it meets a stable vertical element, until it creates a stable and strong body and starts growing on itself. It has the ability to create structural cores and evolve on them through time. Moreover, the ground and its roots are fundamental for its vertical growth.
Therefore we conceived the helical geometry as a fundamental parameter, while we focused on the concentration of the fibres in parts where the plant needs more support, and that gives us the bundling technique as our main structural system.
Our goal is to create a new “topos”, where the ground condition will trigger the vertical growth and vice versa. This new topos is like a gradient vector field, where the internal forces cultivate the growth of the seed, which will produce the new emerging cores. At the next stage, the vine will expand and evolve on the new terrain, growing along the vertical axis where the core-skeletons dictate it.
The new Topos
Based on the above parameters we produce a new topos, where the ground networks produce populations of fibres which can perform as structural elements, as the feet
of the proto-tower and which, by communicating and exchanging data or matter, can differentiate according to the fitness criteria of shape variations. We aim to introduce building structures that work as one unit, based on the growth of fibres which, according to their bundle degree, produce stronger and weaker areas and thus closed and open spaces.
So, we aim to produce internal networks which will be adaptive to their neighbors, meaning that they will be able to connect and communicate within a population, while the flows and networks of the city will live around or even inside the proposed towers.
Since the spiral growth of the vine could not on its own be a static structural system, our task is to change or enrich its geometry in order to make it work.
Thus, the concentration of fibres in places where the plant needs more support is the fundamental parameter for our vertical growth. The bundles of our fibres occur in order to support the structure, while they produce transparent and opaque areas throughout the skeleton. Moreover, their thickness plays a major role, since when we apply the double or triple layer technique the fibres of each layer adapt according to their structural role. So we produce hierarchies not only between the fibres but also between the layers.
Fitness criteria
We are developing a skeletal system which is able to adapt to volume and shape transformations, while the whole system can readjust to input scenarios. Thus, here we project potential shape deformations that start breaking the symmetry while responding to given fitness criteria.
Code based fibres
Our main system is the skeleton. Since our main idea deals with fibre growth, the helix and the bundles, we focus on the study of bundle degrees, meaning the frequency of touching, and thus we produce catalogues of skeletal phenotypes.
Our goal is to develop one structural language between the main skeleton and its structural connections and densities. Thus, we re-explore possible configurations and dynamic hierarchies between the fibres that emerge.
We produce codes which, based on our rules, trigger the concentration of fibres. The curves are generated together at the same time and, as they grow along helical geometries, they are always checking their distances.
Finally, we produce curves based on scripted helical geometries which are connected through bundling and through the exchange of material between the internal and external layers. Simultaneously, we introduce slabs which emerge according to the distances between the fibres. Therefore, the skeletal systems begin to be controllable and emerge based on the given rules, while they start to put the whole skeleton in communication with its internal layer as the two initial agents of the creature.
Thus, by defining the proto skeletal set-up, we determine the initial rules, the code of the seed. We implant in it the ‘information of life’,8 in order for it to evolve and mutate on that basis.
Our code has the intelligence to adapt to the differentiation of shapes by checking distances and adding more fibres, while it bundles and connects them with the existing ones. So we have produced a minimum proto system which, according to the structural needs, adds more fibres and thus more support.
Next we introduce a population of connection systems, which will grow in between the layers and will convert the skeletal system into one interconnected structural building able to accommodate the remaining sub-systems of slabs, navigation and skin. Thus, we explore various types of connections which could constitute different versions of our connection systems. While exploring the connection systems, we produced various methods and geometries which could constitute this “communication layer” between the two main layers; finally we combine two distinct systems which can coexist within the structure according to the local and global situation, the function and the quality of the space. Therefore, the connections are the first subsystem that works on the basis of its inherent logic while being able to adapt to and follow the various situations of the skeleton.
Fig. 3-5
Code based fibres.
Fig. 6
Code based fibres.
Fig. 7
Code based fibres.
Secondary Connections System
Simultaneously, we need one more subsystem to accommodate the slab distribution, one which will develop where the programmatic functions dictate. So our secondary subsystem is a simple pin connection system which increases its
Fig. 8
Secondary Connections System.
depth at the middle area, while, as it moves towards the edges, it tapers out in section as well as in plan. Simultaneously, through the various designs of this rib connection we also explore new design techniques and details which convert this structural element into a beam of light within the space, able to produce atmospheric qualities and sensations, not only for the visitors but for the residents as well.
Fig. 9-10
Secondary Connections System.
Internal geography
Focusing on the slabs, we produce a geometry which touches the skeleton but also allows our skin to exist not only as a surface but also as a façade in depth. Thus, the slabs are also one sub-system able to adapt according to the fibres of the skeleton, constituting the imprinted micro-geography of the fibrous skeleton, the spatial trace of the choreography of the performative skeletal veins, whose valleys produce atriums while allowing the sun to enter and illuminate the interior of our creature.
Finally, when we combine the two subsystems of slabs and ribs, we decide that the slabs should produce a visible reactive spatial behavior. Thus, this wrinkle effect highlights the enosis – the union – of these two systems.
Fig. 11-12
Internal geography.
Performative Veins
The unique local effect of the Aurora Borealis influenced us to conceive the skin as an atmosphere of the tower, similar to the earth’s atmosphere. So our proposal for the skin is to enclose this atmosphere in cases: pneumatic membranes, ETFE cushions, which evolve in between the existing fibres of the skeleton, so that this living organism is able to insulate, protect, breathe and harvest energy. Indeed, this inflated layer can produce energy through piezoelectric technology, which means that in places where the wind hits the skin it can convert this energy into electricity through its oscillation, while the same logic applies during rainfall.
Moreover, in areas where we need closed spaces, the profile, based on its micro-intelligence, aggregates its elements and produces opaque qualities, while in areas where two fibres detach, the ETFE cushion pops out in between them.
Fig. 13
Performative Veins.
Navigation
The skeletal fibres, and thus their profiles, perform as the main navigation system, where a cluster of fibres accommodates the elevators. Since our fibres taper out as they grow upwards while changing angle, we develop a proto set-up of how the fibres will adapt to the elevator’s plug-in. We produce a population of fibres which cluster and produce the elevator shaft, while they decrease their size only at the external area of the shaft, keeping the space steady in that way and able to accommodate one or more capsules. This proto navigation system provides a panoramic journey to the residents and the visitors, since it moves on magnetic belts and thus there is no need for extra enclosed mechanical volumes.
Moreover, the fibres can perform as environmental elements. Firstly, by deforming the profile on the internal side, the skeleton can perform as a passive heating system on the south, where during the day the fibres collect heat from the sun, while during the night they give it back to the apartment. By deforming their faces externally, in areas where the wind speeds up, we can produce buffering zones which protect the building’s surface, while in areas where there is sun and wind at the same time we deform the profile on both sides and produce this double effect simultaneously.
Moreover, some of the fibres can perform as water collectors and thus produce drainage networks, water collection networks and water flow networks, which transfer the water to the apartments and the other functions.
Light Operator
Finally, we produce a network of luminous veins which transfer light from outside to inside and from the top to the bottom of the tower. Simultaneously, while the ETFE cushions harvest energy, the skeleton converts this energy into electricity and triggers the luminous network, while the light reacts with the aerogel insulation material and produces emergent light and color effects on the façade.
Fig. 14
Light Operator.
KRAMA vers. 1.0
Finally, based on the proto relationships that we have developed, we combine all the systems and subsystems of the tower in order to produce our mixture, our synthesis, the first version of the Krama Proto-Tower.
Thus, in this proto apparatus of a cluster of floors we combine all the above relationships into one interconnected system, where all the systems and subsystems of skeleton, connections, ribs, slabs, navigation, skin qualities and luminous materialities are articulated and interconnected under their proto set-up relationships and produce tectonic spaces able to trigger social interactions and personal interconnections. Entering the occupiable spaces, we enter the body of our creature where all
Fig. 15
KRAMA vers. 1.0.
of its organs are visible and where the systems project their transformations in relationship with their neighbors, through their global or local mutated topologies.
Fibrous Organism
The tower is dependent on its ground, since together they produce the neo-topos where the ground’s fabric generates our creature. Finally, these two levels, the horizontal and the vertical, perform as one interconnected organism which is generated, evolves, mutates and adapts as a whole, based on its inherent systems of growth and communication.
Krama Variations
Finally, through the proto relationships that we have developed, we produce variations of the proto-tower. While different output editions of this organism are produced, there is a critical parameter that differentiates one from the other: the parameter of mutation. Mutation is triggered by adaptational behaviors, which all together trigger the evolutionary process. Architecture benefits from the differentiation of the phenotypes, simply because there is a population of versions of solutions to a single problem. The process of evaluation and selection is based on the fitness criteria that we set in our generative code. Finally, it is a game of analogous models between the actual, the digital and the physical.9
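The evaluate-and-select step described here can be sketched, purely for illustration, as a tiny generate–score–keep loop; the generator, the fitness function and the population size below are stand-ins, not the authors' actual generative code.

```python
import random

def select_phenotypes(generate, fitness, population=20, keep=3, seed=42):
    """Sketch of the evaluate-and-select step: produce a population of
    phenotype variations from a generative rule, score them against fitness
    criteria, and keep the fittest few."""
    random.seed(seed)
    variants = [generate() for _ in range(population)]
    return sorted(variants, key=fitness, reverse=True)[:keep]

# Illustrative use with stand-in generator and fitness functions:
# phenotypes are plain numbers, and fitness rewards proximity to 0.7.
best = select_phenotypes(generate=lambda: random.uniform(0, 1),
                         fitness=lambda x: -abs(x - 0.7))
print(best)
```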
Fig. 16-17
Fibrous Organism.
What, though, is the role of aesthetics in the evolutionary process? Could it be an additional rule in our code? Maybe. But as Karl Chu states: “...the beauty is in the code...”10. Similarly, the aesthetics of the vine do not concern its beauty merely as a form, but the degree to which it responds to the reasons for which it was created. Thus, we argue that the beauty of the encapsulated data of our seed could be the key to the fittest and most adaptable architectural outcomes.
Notes
1 Schumacher Patrik, The Autopoiesis of Architecture, John Wiley & Sons Ltd., London 2010.
2 Huebert Rob. “A new sea”, The need for a regional agreement on management and conservation of the arctic marine environment, WWF, January 2008. October 2010
<http://assets.panda.org/downloads/a_new_sea_jan08_final_11jan08.pdf>.
3 “Top quality of life in Norway”, News of Norway, issue 3, 2001. November 2010. <http://www.
norway.org/ARCHIVE/News/archive/2001/200103report/>.
4 Soreff Stephen. “Suicide”, emedicine, November 2010.
5 Pettit Don. “Auroras, Dancing in the night”, Earth Observatory, Nasa, 2 January 2004.
<http://earthobservatory.nasa.gov/Features/ISSAurora/>.
6 Krama Proto-tower scenario is based on Patrik Schumacher’s Parametricism agenda.
7 Chu Karl, “The Unconscious Density of Capital” (Architecture in Vitro/ Machinic in Vivo), in Neil
Leach(ed), Designing for a Digital World (Chichester, 2002), pp. 127-133.
8 Frazer John, “A natural model for architecture”, An Evolutionary Architecture, (Architectural
Association, London, 1995).
9 Frazer John, “A natural model for architecture”, An Evolutionary Architecture, (Architectural
Association, London, 1995).
10 Chu Karl, “The Unconscious Density of Capital” (Architecture in Vitro/ Machinic in Vivo), in Neil
Leach (ed), Designing for a Digital World (Chichester, 2002), pp. 127-133.
Rita Pinto de Freitas
School of Architecture of Barcelona
Polytechnic University of Catalonia
Spain
Hybrid Architecture
Hybrid Tools
The analysis of hybrid architecture and its commitment to the built world highlights the value of hybrid design tools, which include, among others, digital design tools.
Hybrid tools widen the range of possibilities in anticipating the qualities of the
physical space, without subordinating the definition of the qualities of space to the
tools.
On the Concept of Hybrid
The biological and mathematical foundations of the hybridisation process were laid
down in the 18th and 19th centuries by Kölreuter and Mendel, in parallel to the observation and description of hybrid vigour, or heterosis, which is the ”phenomenon by
which hybrids dispose of more vigour, fertility etc. than their progenitors.”1
It is this phenomenon -heterosis- which gives meaning to the act of hybridising.
The goal of the process of hybridisation is to attain new qualities -or new levels of
intensity- by means of the crossbreeding of diverse elements.
In hybridisation, unlike what happens in
the ‘collage,’ the original parts are no longer
recognisable in the new being. The original
components disappear as autonomous elements in the formation of a new entity.
Unlike addition, hybridisation generates
a new totality2 with its own identity and
characteristics.
The hybrid object “ascends to a richer
and more elemental totality, invigorated
by a poetic union of its minor parts.” 3
Fig. 1
Bull Head. Pablo Picasso, 1942.
On Hybrid Architecture
Definition
The use of the term ‘hybrid’ in architecture has continuously expanded and multiplied,
and this overuse has given it multiple meanings based on more or less rigorous interpretations. Few bibliographical references to hybrid architecture exist, but the
most widely documented meaning understands hybrid as “inherently multifunctional
buildings”4, where most cases are examples of addition rather than hybridisation.
Transferring the original concept of hybridisation to the field of architecture defines
architectural hybridisation as a process that, through the act of crossbreeding (or unifying) diverse architectural natures or elements, makes the attainment of a new reality
Fig. 2
Hybridisation vs. Addition.
possible – a reality with its own identity and new architectural qualities that do not
exist if the hybridised elements are considered individually and separately.
Hybrid architecture is then the result of the hybridisation of three diverse natures in one intervention: object-related nature, landscape-related nature and infrastructure-related nature.
Consequently, an architectural intervention is defined as hybrid when it is at once object, landscape and infrastructure – an architectural intervention that simultaneously meets three conditions:
• It is a physical intervention that, as a result of a project, proposes an architectural
space generated on the basis of human intervention.
• It is an architectural intervention, which is at the same time a landscape, beyond
simply being an object placed within the landscape: using a variety of possible
mechanisms (fusion, transformation, reconfiguration...), the architectural intervention integrates inseparably into the landscape.
• It is at once an architectural intervention and an infrastructure, beyond its connection to infrastructure: in transforming into a section of infrastructure itself, the architectural object becomes a part of the infrastructure and incorporates its laws
and mechanisms of functioning.
In the process of architectural hybridisation, the borders between architecture, landscape and infrastructure disappear in order to achieve a common architectural reality
that simultaneously possesses this triple nature.
In their diagram, “Habitat in Landscape / Habitat is Landscape”, Alison and Peter Smithson depict a first step towards diluting the borders between object and
landscape.5
Fig. 3
Ideograms
a. Alison and Peter Smithson Diagram: “Habitat in Landscape, Habitat is Landscape”
b. Rosalind Krauss Diagram: Hybrid Morphologies
c. Rita Pinto de Freitas Diagram: Hybrid Architecture
In the late 90s, taking it one step further, Marc Angèlil and Anna Klingmann describe
‘hybrid morphology’ as a reality in which “the borders between architecture, infrastructure and landscape dissolve, while the notion of the architectural object as a
closed entity”6 spreads, and they illustrate this with a diagram by Rosalind Krauss.
They add the infrastructural dimension and assert the dissolution of the three dimensions -architecture, landscape and infrastructure- whereby each of the realities
takes on qualities of the others in order to define itself.
In this description, architecture is associated with the architectural object, and its
three dimensions interact, without losing either their autonomy or their specificity.
The concept of hybrid architecture takes a third step by which the direct association of architecture with the architectural object disappears, and in which the elements that make up the triple reality of object, landscape and infrastructure lose their
autonomy when they are hybridised in architecture.
Elements of landscape, sections of infrastructures and the architectural object together make up the architecture, i.e. hybrid architecture.
Hybrid architecture, driven by the fact that it concentrates a triple object-, landscape- and infrastructure-related nature in a single architectural intervention, generates architectural answers with very specific features.
Its analysis outlines architectural hybridisation's ability to widen the conceptual frame of architectonic concepts consubstantial with architecture in all periods, while at the same time turning the qualities of hybrid architecture into project tools that serve architecture in general, both in the definition of the qualities of space and in the genesis of the architectonic project.
Of the different concepts on which the notion of the hybrid has an impact, this presentation will focus on the concept of context and on the concept of space.
Both will be exemplified through two examples of hybrid architecture: the port in Sta. Cruz de Tenerife by FOA and the Museum of Anthropology and Human Evolution in Murcia by Federico Soriano.
Hybrid architecture and CONTEXT
Fig. 4
Hybrid architecture and CONTEXT.
Considering that landscape and infrastructure are elements that belong simultaneously to both architecture and context, the inseparable relationship between architectural intervention and context emerges as a core issue with respect to the hybrid’s identity, and the description of that relationship becomes indispensable for its
outline.
In hybrid architecture, the context-intervention relationship works in two ways:
the hybrid incorporates the environment by abstracting and extracting certain of its
qualities into the genesis of the project; while at the same time it exerts a -transformative- impact on the same environment by means of its subsequent incorporation into
its physical reality. “In this impossible duality of integrating and highlighting, of reflecting what exists and of denoting something new, lies its complex birth.”7
The double and simultaneous relationship of incorporation and transformation
converts the hybrid into a ‘revealing’ element of selected qualities of the context.
These three issues -the replication, transformation and revelation of the context- structure this subchapter on the relationship of interdependence existing between
hybrid and context.
Fig. 5
Port, Sta. Cruz de Tenerife_Foreign Office Architects (FOA), 2000.
The intervention takes on the structure and the width of the streets that descend from
the mountain towards the sea -parallel to one another and perpendicular to the coastline- and generates a system of land strips that make up the structure of the proposed
space.
While ‘replicating’ the structure of the streets, this spatial structure generates new
spatial solutions and joins the proposed space with the existing streets and becomes
its extension.
The new ensemble of space -with renewed unity and identity- incorporates, moreover, two other qualities extracted from the environment: ‘Plaza de España’ as a site of
confluence of the transversal streets, and the continuation of a marginal street course
-Av. Marítima i Cinturón perimetral- which runs parallel for almost the entire perimeter
of the island of Tenerife.
Fig. 6
Museum of Anthropology and Human Evolution, Torre Pacheco, Murcia_Federico
Soriano, 2006.
“The colour (of Mount Cabezo Gordo) is special, those who might call it reddish, upon
a second look at photographs, see that it has changed to a golden colour. Against its
volume, the empty voids of the abandoned mines emerge like black shadows. [...] The
project means to refer to these reflections.
It does not want to lean on the mountain, pierce or violate it.
It has looked for an intermediate position, neither far away from the access road
and thus separated, meaning that there would be no intimacy between the two, nor
on top of the Cabezo, endangering its history and what is still hidden in its interior. [...]
But the project also wants to gain some attention. It has to be a singular piece on a
singular mountain. [...] Cabezo Gordo itself wants to be seen from far away.” 8
The facade is conceived as a new surface of Mount Cabezo Gordo. The definition of
its surface takes on the colour, the folds, the density, the irregularity of the black openings in reference to holes of the mines...
Seen from the road, the two surfaces are superimposed, recreating a visual reference to Mount Cabezo Gordo in the landscape.
The borders of Mount Cabezo Gordo shift and are situated along the new borders
of the proposed construction; while at the same time, the borders of the proposed
construction are situated along the borders of the mountain.
Hybrid architecture and SPACE
Fig. 7
Hybrid architecture and SPACE.
Mobility and the demand for physical continuity as decisive elements in the spatial
configuration of a hybrid require an investigation of spatial solutions capable of resolving changes in level without abandoning continuity and favouring the implementation of spatial order as suggested by Paul Virilio in the 60s: oblique order.9
The oblique order and the topological dimension are the main topics of this quality:
the conception of space as an intrinsically three-dimensional reality and the abandonment of the superiority of some spatial ‘directions’ over others.
In 1965, Paul Virilio introduces a third spatial order, the oblique order.
“After the horizontal order of the rural habitat, of the agricultural era and the vertical order of the urban habitat of the industrial era, consequentially and logically -‘topologically’ one could say- the third, oblique order of the post-industrial meta-city arrived.” 10
Oblique order is not the sum of the other orders, but the creation of a third order, of a
new order.
An order that goes beyond the articulation of horizontal surfaces with vertical surfaces by ”the introduction of the inclined surface as a space that simultaneously accommodates space for circulation and space for habitation.” 11
With the introduction of the oblique order -associated with the introduction of the inclined surface as fundamental element of the project- it is possible to overcome the
polarisation and the confrontation of vertical order and horizontal order, while at the
same time offering new spatial possibilities.
Fig. 8
Habitable circulation, Paul Virilio.
Oblique order allows spaces of more static use and spaces of circulation to be linked and integrated in hierarchical equality, without some of them prevailing over others.
As to the use of space, oblique order multiplies the surface available for incorporating diverse uses and allows the interchangeable inclusion of programmes of a predominantly dynamic or predominantly static character.
Contemporary architectural discourse witnesses a progressive increase in the use of the term topology to describe certain artificial landscapes; at the same time, this term is gradually displacing the -until recently more often used- term topography.
Taking the simplification of comparing these two terms to an extreme, in the field
of architecture one could associate topography to the qualities that describe the deformation of a surface, and topology to the qualities that describe the deformation of
a space.
Physical continuity as the decisive quality of the space required by the hybrid introduces -as has been described previously- the search for spatial solutions capable of
resolving the change in conditions of the space in general and the change in level in
particular.
The three-dimensional deformation of surface (and of mass) -topological deformation- is one of the mechanisms that allows the achievement of this continuity with the
ability to confront any direction of the space whatsoever.
Topologically ‘deformed’ space stops being subjected to the dominance of verticality and horizontality, so as to take on diagonalisation, “presenting a diagonal structure
instead of a structure determined by gravity.” 12
In deforming a spatial system made up of two-dimensional surfaces -with the objective of maximising the continuity of surfaces- ‘three-dimensional surfaces’ can be
created.
The three-dimensional surfaces are ‘topological surfaces’ with the ability to articulate diverse surfaces, levels and directionality in a single physical reality without its
continuity being interrupted.
Fig. 9
Port, Sta. Cruz de Tenerife_FOA, 2000.
Fig. 10
Port, Sta. Cruz de Tenerife_FOA, 2000.
With the proposal for the port of Santa Cruz de Tenerife as a ‘three-dimensional surface,’ the space does not allow its graphical representation in independent horizontal ‘floor plans.’ However, the two-dimensional representation of the entirety of the surfaces that compose the three-dimensional surface has enabled the marking of the areas reserved for circulation and the detection of where they superimpose on one another (knowing that beyond their superposition they are also connected with each other).
The two-dimensional representation transforms into a map of densities with regard to the flows of circulation.
Fig. 11
Museum of Anthropology and Human Evolution, Torre Pacheco, Murcia_Federico Soriano, 2006.
The primary programme, occupying almost the entire available surface of the ‘Museu
de Antropología y de la Evolución Humana’ is a route: descending from the upper level, where visitors begin their visit, replicating the structure of access to the abandoned
mines existing in Mount Cabezo Gordo.
“Its interior spatial form brings to mind the perceptive conditions of ‘Sima de las
Palomas’ and the descending array of the exhibition rooms represents a journey into
the depths of the origins of mankind.” 13
The entire surface is at once a space of circulation, a space of exhibition, a space of contemplation...
The habitable space becomes at once static and mobile, contemplative and active.
Value achieved through hybridisation
The main value achieved through hybridisation lies in the responsibility and ability of hybrid architecture to configure the common space beyond the physical area of intervention:
• Implicit in the value of the hybrid is the value of an architectural practice that assumes all its responsibility in the configuration of the qualities of the common
space, as well as all its potential for urban transformation and reconfiguration of
the landscape.
• Implicit in the value of the hybrid as well is the value of an architectural practice
that is conscious of its inevitable impact on physical reality -beyond the limit of its
area of intervention- and that maximises its commitment to that reality.
• Implicit in the value of the hybrid is the value of an architectural practice that
wants to shorten the distance between the disciplines of the architectural project,
urbanism and landscape design.
On Hybrid Tools
Conceptual tools
Once formulated, the qualities of hybrids turn into main conceptual tools concerning
architectural design in general and the genesis of the project in particular.
Independently of the graphic tools, these conceptual tools are responsible for significant initial project decisions: attitudes towards ineludible questions of architecture such as the relation with the context, limits, the relation with the ground, scale, mobility...; decisions on the spatial structures adopted, on the strategies of order followed,...
These decisions, linked to the genesis of the project, should be materialised in a coherent way through the formal definition of space.
Graphic tools – Digital tools
The graphic tools in general -which include digital and non-digital tools- serve this formal definition, the formal control of the space of hybrid architecture, of architecture in
general.
Subordinated to the conceptual tools that have a particularly strong presence in
the genesis of the project, the graphic tools allow the proposal to “receive” a form.
In this process, the use of digital tools allows an anticipation, control and visualisation of spaces that would not be possible with the use of traditional graphic tools alone. They widen the spatial possibilities that can be proposed by architects, which means that they widen the spatial possibilities that can be conceived by architects.
It is important, however, to give these tools the role that belongs to them in the design process: the role of a tool.
A tool is an instrument that serves a goal, but should never be confused with the goal itself.
Hybrid tools
Talking about design tools, about graphic tools, a confrontation of digital vs. non-digital could emerge.
But this would be a false polarity: digital tools are simply “new” tools that have been added to the set of tools that already existed in architectural design, and they should not be placed in opposition to it.
However, given the level of automatism that digital tools can have, and faced with tools whose potential goes far beyond what we know or are able to use, a real and important polarity does emerge:
conceptualisation of the design process vs. digitalisation of the design process.
And it is in the dissolution of this polarity that hybrid tools emerge: the design tool that uses all the potential of the digital tools -and of all other graphic or material tools- without becoming detached from the conceptual tools is the one that can assure criteria, coherence and intention throughout the whole process by which space is generated.
The formal definition of space reformulates the conceptual tools, and the conceptual tools ‘guide’ the digital tools, in a continuous and interdependent process.
The conceptual tools and the complete set of graphic and material tools together form the hybrid tools and, as was said at the beginning about hybrid architecture, we can now say that the hybrid tool “ascends to a richer and more elemental totality”14.
Notes
1 Diccionari de la Enciclopèdia Catalana, 1997. Barcelona: Enciclopèdia Catalana, S.A. [from Fr. hybride, and this from the Latin hýbrida, id.].
2 von Mende, J. and Ruby, A., 2000. Hybrid Hybris. Daidalos, no. 74, pp. 80-85.
3 Kenneth, I. K., 1985. Heterotic Architecture. In: Hybrid Buildings. Pamphlet Architecture, no. 11. New York, San Francisco: William Stout Architectural Books.
4 dos Passos, J., 1985. Hybrid Buildings. In: Hybrid Buildings. Pamphlet Architecture, no. 11. New York, San Francisco: William Stout Architectural Books.
5 Smithson, A., 1968. Team 10 Primer. London: Studio Vista.
6 Angélil, M. and Klingmann, A., 1999. Hybrid Morphologies - Infrastructure, Architecture, Landscape. Daidalos, no. 73, pp. 16-25.
7 Soriano, F., 2007. Museu de Antropología y de la Evolución Humana, Torre Pacheco, Murcia. El Croquis, no. 137, pp. 252-359.
8 Soriano, F., 2007. Museu de Antropología y de la Evolución Humana, Torre Pacheco, Murcia. El Croquis, no. 137, pp. 252-359.
9 Virilio, P., 1995. La Función Oblicua. BAU, no. 13 (2nd sem.).
10 Virilio, P., 1995. La Función Oblicua. BAU, no. 13 (2nd sem.).
11 Virilio, P. and Parent, C., 2000. Architecture Principe 1966 und 1996. Paris: Les Éditions de l'Imprimeur.
12 Zaera, A. and Moussavi, F., 1998. La reformulació del sol. Quaderns d'Arquitectura i Urbanisme, no. 220, pp. 36-41.
13 Soriano, F., 2007. Museu de Antropología y de la Evolución Humana, Torre Pacheco, Murcia. El Croquis, no. 137, pp. 252-260.
14 Kenneth, I. K., 1985. Heterotic Architecture. In: Hybrid Buildings. Pamphlet Architecture, no. 11. New York, San Francisco: William Stout Architectural Books.
Dimitris Psychogyios
Department of Architecture, NTUA
Greece
Collective Design
for Hybrid Museum
Environments
Terminology
(Hybrid) The study aims to examine the physical and the digital museum space at the same time, within a common context. It attempts to map the intermediate space.
(Museum) In particular this research focuses on designing environments with cultural
content, such as museums, archaeological sites, exhibition areas etc.
(Environments) The paper proposes the term “Environment” instead of the term “Space”, aiming to argue that the deployment of information technology in the existing physical space shifts the concept of space and the way it is designed. For example, in the City Museum, the outdoor area (the city) can be transformed into a Museum “space” and, vice versa, the urban experience can be transferred to the Museum.
Description of the Research Problem
(The museum in transition) In recent years, changes in the organization and direction of the Museum have been swift and extreme. The identity of the museum is changing and is constantly being explored. If one looks at history, one finds repeated changes in its role and in the way museums work. (The continuum, Design-Construction-Use) The penetration of information technologies into museums seems to play a strong role both in their identity and in their design. Papalexopoulos1 (2006, pp. 95-102) states that there is a shift in the making and construction of the “digital”, which has abandoned the search for virtual spaces and is now asking for an unbroken connection with the materiality of things. The “digital” does not make sections, but falls within the existing and gradually changes it. The new field is formed by interfering with the materiality of the construction process on the one hand and the materiality of the construction on the other. These observations raise questions such as: How do we design a museum that is in transition? How do we deal with the continuum Design - Construction - Use in the case of museums? And, finally, how are museums designed and how do they function under the new living conditions?
Explanation of the importance and relevance of the problem
The changes brought about by information technologies in the museum are located
in two combined areas. The first relates to the very purpose of the museum. The second refers to the “digital” design tools of the museum. (The Potentiality of the Museum) The introduction of digital technology has confronted architects with two new facts. The operation that the museum is called to “accommodate” is almost always that of a network hub, while its main characteristic is change over time. (The Potentiality of Museum Design) As far as multi-disciplinarity is concerned, design platforms that ensure common access of multiple disciplines to the production of the final result are gradually developing; these do not lead to a static solution but to a parametric, dynamic solution that is open to continuous change. Regarding interaction, a continuous circular process of planning is apparent, in which we “cross” first from the design information to the production information, then to the construction information and finally to the use information, and the process begins again from the start.
In his blog2 about knowledge communities, Linardos Rylmon states (translation by the author):
“...The post-Fordism period (methods of production of the Ford company
named, especially from Marxists scholars, Fordism) is marked by the definitive doubt of this segregation between manual, fragmented and nonspecialized work of the great majority of workers and by mental work in
the field of production management and organization, which was only
fed by the educational system and the scientific research system. Several
parallel developments have contributed to this break. The continuing rise
of the educational level of the working population, and therefore the ability to question the Taylors’ (Frederick Winslow Taylor, 1856-1915) organization and its pace, led to renewed forms of work organization with the
workers involvement in management processes and organization (like the
Japanese model) and then the wider involvement of production employees in innovation processes related to technology, organization, or quality
of goods or services. The engagement with the administration, organization and innovation, overflowed the entrenched organizational and social
management space, to become the object of activity of extensive layers of
employees, leading to the overthrow of the type of working relationship
of Fordism period (methods of production of the Ford company named,
especially from Marxists scholars, Fordism): new employees should, however, either acquire the appropriate incentives for stronger bonds with
authorities, or be transferred to an area of uncertainty and insecurity, to
be persuaded in this way that their cognitive abilities do not in any way
imply that they have changed in their advantage the balance of power.
The spread of education and in particular the higher education in large
sections of young people, and the expansion of research activities within
or outside the universities, fueled procedures of obtaining and producing knowledge, that no longer concern the social space of employers and
management, neither they respond strictly to the reproductive needs of
capitalist societies. In the large movements that marked the postwar period, such as the anti-colonial, the anti-imperialist, the movement for human rights and the movement for the environment, not only the student
world participated, but also a calculable part of the scientific potential of
educational and research institutions, in conjunction with initiatives of educated citizens who could now massively exploit knowledge arising from
the scientific world. This period marked the strong tendency for liberation
of the movements from the political parties’ hierarchy and their autonomisation with the use of cognitive capital coming from the institutions of
knowledge production. It also marked the development of autonomous
social initiatives, in the forms of social economy enterprises, NGOs, or independent “knowledge communities”. And [it marked] the extension of the
importance of individual paths in acquiring knowledge and experience
within education and production systems, a development that appears
to be inherent to the functioning of the market mechanisms, but is actually a new possibility of individual contributions to social initiatives and
innovations...”
Museums form official organizations of knowledge production and therefore cannot remain untouched by the developments described above. Museums are involved in initiatives of citizens who can now massively take advantage of knowledge. Also, within the museum organization itself, social initiatives that are sometimes less and sometimes more autonomous grow, in the form of social economy enterprises, NGOs or even autonomous “knowledge communities”. Likewise, through the museum institution itself, individual paths to obtaining knowledge and experience are being developed. The question arising from the above is which design tools the architect can use to respond to this fluid situation.
Analysis of Existing Solutions or the Methods
(Linear vs Non-Linear Design Process) With the entry of information technology into architectural design there has been a shift in the design process. From the linear process (design - construction - use), whose main feature was the distinction between activities in time and “mechanical” assembly, we now pass to non-linear, generally continuous processes, in which the several stages are difficult to distinguish. Design information is now also construction and use information. (Architectural Creativity vs Participatory Design) A second shift establishes design as a participatory process and not as the result of the “concept” of an architect or of “architectural creativity”. Participatory design can be distinguished at two levels. The first concerns the relationship of the architect with the other disciplines of the project (employees / civil engineers / engineering consultants / etc.). In this case, digital technologies are integrated into the design and provide a common field of work. They also enter building construction and allow constant communication between design and construction. The second level of participatory design concerns the relationship between the architect and the user. Once more, in this case the role of information technology is catalytic. The basic assumption is that the user, who is not disposed to determine his needs in advance, finds himself at the centre of the design problem.
Presentation of some Signs of new Proposals and Methodologies
Some signs of new methodologies that can be used for the design of Hybrid Museum Environments are the following. Collective Design, as an education process of social learning arising from the public's participation in decision making. Participation is not a goal in itself but an ‘education process’ (social learning) arising from public participation in decision making, that is, the progressive development of the properties of a complex system consisting of people belonging to different social groups - sometimes in conflict - working towards completion on a common domain. Therefore public education (social learning) is both a process and a result (Bouzit and Loubier, 2004). The nature of the participatory design process depends on what is designed and on the aim of the initiative. In some cases, the public needs only to be informed about specific initiatives or some aspects of these initiatives, while in other cases the public's opinion is necessary in order to improve decisions and to ensure the sustainability of the initiative (Van Jaarsveld Romy, 2001).
Knowledge and Innovation Communities, as an opportunity for individual contribution to social initiatives and innovations. A KIC, as mentioned on the web site of the European Institute of Innovation and Technology3, is a highly integrated, creative and excellence-driven partnership which brings together the fields of education, technology, research, business and entrepreneurship, in order to produce new innovations and new innovation models that inspire others to emulate them. They are to become key
drivers of sustainable economic growth and competitiveness across Europe through
world-leading innovation. The KICs will be driving effective “translation” between partners in ideas, technology, culture, and business models, and will create new business
for existing industry and for new endeavours.
Living Labs4 as User centred research methods. A living lab is a research concept. A living lab is a user-centred, open-innovation ecosystem, often operating in a territorial
context (e.g. city, agglomeration, region), integrating concurrent research and innovation processes within a public-private people partnership. The concept is based on a
systematic user co-creation approach integrating research and innovation processes.
These are integrated through the co-creation, exploration, experimentation and evaluation of innovative ideas, scenarios, concepts and related technological artefacts in
real-life use cases. Such use cases involve user communities, not only as observed subjects but also as a source of creation. This approach allows all the involved stakeholders
to concurrently consider both the global performance of a product or service and its
potential adoption by users.
Signs of new proposals
We will look at four digital platforms with cultural content, focusing mainly on the possibility they offer for public participation in the design, production and dissemination of digital cultural content.
Google Art Project, as mentioned in Wikipedia5, is an online compilation of high-resolution images of artworks from galleries worldwide, as well as a virtual tour of the galleries in which they are housed. The project was launched on 1 February 2011 by Google, and includes works in the Tate Gallery, London; the Metropolitan Museum of Art, New York City; and the Uffizi, Florence. The “walk-through” feature of the project uses Google's Street View technology. The project includes 16 images over one gigapixel in size (over 1 billion pixels); the largest, Ivanov's The Apparition of Christ to the People, is over 12 gigapixels. By comparison, a typical digital camera takes pictures at 10 megapixels, or about 1000 times smaller in area. The platform is completely closed, both for the curators and for the audience. The audience has the ability to navigate through predefined options and study, at a really great resolution, only some of the museums' artifacts. The audience is a spectator.
Empedia, as mentioned on its official web site6, offers cultural organizations an effective way to develop guides. Cuttlefish Multimedia are working with museums across the East Midlands to help create guides for Empedia. This activity is helping them to develop the necessary authoring tools and features for the public release
later this year. Ultimately, museums will be able to upload content and manage their
own guides. Empedia can be used to deliver interactive content for venues, attractions
and exhibitions. Empedia supports text, images, audio and video and provides simple mapping tools for creating location and trail content, what they term ‘locative content’. Empedia is suitable for cultural trails in towns, cities or remote rural locations and
for exhibition guides indoors or out. Empedia really comes into its own when using
the free Empedia iPhone App. The website and iPhone App are fully integrated so that
Guides created on Empedia are automatically published to the iPhone App. Essentially
this means that museums and organisations can publish to iPhones in just one click!
Empedia works on PC, Mac and is optimised for iPad displays. Mobile content can be
viewed on iPhone or iPod Touch and there are plans for Android support later this
year. In this case the audience is a spectator. The platform is a tool for exhibition curators. The aim is to allow the curator to organize the content which will be accessible
via the internet, on any personal mobile devices.
Museum of the City, as mentioned on its official web site7, is a digital platform where everyone is a curator. It is a virtual museum about the world's cities - past, present and future. It is about the design of cities and the people who live in
them. It will be a forum for the exchange of knowledge, ideas, and informed opinion
between specialists and civilians. It is a collaborative project of Portland State University and the Museum of the City, Inc., an Oregon 501(c)(3) nonprofit corporation. As
mentioned on the web site: “A PLACE TO VISIT” - As you learn new things, make notes about the type of exhibit you might wish to create and share with others, here in our virtual museum. “A PLACE TO SHARE your City with Others” - Unlike most museums, the
Museum of the City is a virtual space, a digital museum with electronic galleries in
which citizens, students, scholars, curators, and professionals in the study, planning &
design of cities are invited to submit their own exhibits. The Exhibit Information Page
explains how to share your knowledge & discoveries with all of us. In this case the platform enables co-curation. One can download an exhibit tutorial8 and create a personal narration for the city museum. It is important to note that audience members, as Contributors to the Museum, agree to the following terms9:
• You state you have the right and authority to share and contribute images and
text that you are providing to the Museum of the City. You have obtained the necessary authorizations and copyright clearances if you did not create the image or
text yourself.
• You agree the Museum of the City may use, for as long or as briefly as it wants, the
images and text in your exhibit. The Museum of the City may display the images
and text on its website or use the images and text in any way it chooses.
• You agree the Museum of the City may edit your text for length, syntax, spelling or
English to make it understandable to global users.
• You agree that the Museum of the City may use your name as the Exhibit Creator
or as a Contributor.
• You agree the Museum of the City has the right to remove any image or text it
deems offensive or in violation of copyright law.
• You agree the Museum of the City may accept contributions to the exhibit if the
Museum of the City believes the contribution will make the exhibit more interesting or meaningful to its visitors.
A meipi, as mentioned on the web site10, is a collaborative space where users can upload
information and content around a map. Each meipi has a particular context, which
can be local (when the entries are related to a specific area), or thematic (when the
content is associated with a particular idea). How can a meipi be useful? A meipi allows a group of users to share information around a place or a topic. It can be very
useful for collaborative dynamics, workshops, associations, enterprises, groups of
friends, artistic actions... In meipi.org they already have several meipis created by different users. They cover different areas and topics, showing what a meipi can offer.
Registration is not necessary in order to navigate through the different public meipis
and access the information contained in them. Registration is needed to upload entries, to rank and comment on them, and also to create meipis. The registration process is very quick and easy, and it has to be done only once for all the meipis where
you want to participate; it is not necessary to register each time. One can open a new
meipi in Meipi.org or install a meipi on his own server. In this case the platform is free.
There is no differentiation between the curator and the audience. Anyone can upload content, but nobody else can modify it.
Conclusion
We note, through these four examples, that there is a wide range in the possibility of public participation in the design, production and dissemination of digital cultural content. Using the classification of the audience as stated in the study by Brown A. and Novak-Leonard J.11, we observe that these digital platforms provide corresponding levels of freedom to the public. Thus we have the following categories of public involvement
in the production of cultural content: spectating, enhanced engagement, crowdsourcing, co-curation, the audience as curator and of course a wide range of intermediate
and combinatorial categories.
• Spectating is fundamentally an act of receiving a finished cultural product. It is
therefore outside the realm of participatory arts practice.
• Enhanced engagement. Educational or “enrichment” programs may activate the
creative mind, but for the most part do not involve creative expression on the part
of the audience member.
• Crowdsourcing. The audience becomes active in choosing or contributing towards a cultural product.
• Co-curation. Audience members contribute something to a curatorial experience curated by a professional curator.
• Audience as curator. Audience members substantially take control of the curator
experience; focus shifts from the product to the process of creation.
Notes
1 Papalexopoulos, D., 2006. The representation of the continuum: Design-Construction-Use. In: V.
Trova, K. Manolidis, G. Papakostantinou, ed. 2006. The representation as a vehicle to architectural
thinking. Athens: Futura, pp. 95-102.
2 Linardos-Rylmon, P., 2010. Γνώση, εργασία και συλλογική δράση [Knowledge, work and collective action]. Μέσα από το πλήθος [Within the multitude], [blog] 18 April 2011. Available at: <http://withinthemultitude.blogspot.com/p/blog-page_4815.html> [Accessed 21 November 2011]. (Translation by the author.)
3 Knowledge and Innovation Communities: overview, 2011. European Institute of Innovation and Technology. [online] Available at: <http://eit.europa.eu/kics1/knowledge-and-innovationcommunities/overview.html> [Accessed 21 November 2011].
4 Wikipedia, 2011. Living Lab [online] Available at: < http://en.wikipedia.org/wiki/Living_lab>
[Accessed 21 November 2011].
5 Wikipedia, 2011. Google Art Project [online] Available at: <http://en.wikipedia.org/wiki/Google_Art_Project> [Accessed 21 November 2011].
6 Empedia, 2011. Empedia info [online] Available at: <http://empedia.info/about> [Accessed 21
November 2011].
7 Museum of the city, 2011. About the museum [online] Available at: <http://www.museumofthecity.org/> [Accessed 21 November 2011].
8 Museum of the City, 2011. Building an exhibit for the museum of the city. [online] Available
at: <http://museumofthecity.org/sites/default/files/Exhibit%20Tutorial_0.pdf> [Accessed 21
November 2011].
9 Museum of the city, 2011. User account [online] Available at: <http://www.museumofthecity.
org/user/register> [Accessed 21 November 2011].
10 Meipi. Collaborative spaces production over a map, 2011. What is meipi [online] Available at: <http://www.meipi.com/lang/en/about> [Accessed 21 November 2011].
11 Brown, A. and Novak-Leonard, J., in partnership with Gilbride, S., 2011. Getting In On the Act: How arts groups are creating opportunities for active participation. [online] The James Irvine Foundation. Available at: <http://irvine.org/images/stories/pdf/grantmaking/Getting-in-on-the-act2011OCT19.pdf> [Accessed 21 November 2011].
Georgia Voradaki
Despoina Linaraki
Technical University of Crete
School of Architecture
Greece
Responsive Architecture
as a Tool to Suppress
the Psychological Disorders
of an Individual
and Replace Medicines
This research follows a multidisciplinary approach through the fields of architecture,
psychology and pharmaceutical science in order to investigate how the immediate or the broader spatial environment could suppress human psychological disorders and
eliminate the use of medicines to some extent, based on the principles of transformable architecture. It aims at helping individuals overcome difficulties that contemporary lifestyle imposes on them and stressful conditions which they experience. These
difficulties may lead them to overuse of drugs (e.g., antidepressants, antipsychotics
and others) and addiction.
Today, the statistics indicate that humans suffer from a variety of psychological
disorders. The National Institute of Mental Health in the U.S.A. has studied the percentages of the American population, and the outcome of the research has shown that 26.2% of adults in the U.S. (that is, 57.7 million people) suffer from a diagnosable psychological disease during a given year (NIMH, 2010). Moreover, one in five families has a member suffering from such diseases. Those disorders, like depression, phobias, bipolar disorder and schizophrenia, are caused by everyday situations that all humans may experience: stress due to hard work, an argument with a beloved person, or a separation could be possible reasons. Professor George Piperopoulos points out that one in four persons in Greece may experience some kind of neurosis or psychosomatic disorder due to contemporary social and economic conditions. Furthermore,
“Internet addiction can cause serious psychological diseases, mostly in young ages” (Piperopoulos, 2010).
According to the Citizens Commission on Human Rights International, approximately 100,000,000 people use drugs daily in order to feel better (CCHRGR, 2011) and a great number of them are addicted to the overuse of medicines (Kastellakis, 2007). The continuous use of drugs affects the human organism in a negative way, and a withdrawal syndrome can appear after the therapy period, as well. In the beginning, the consumption of this kind of chemical substances creates a sense of pleasure and
satisfaction, but, after a short time period, the human body and the brain get used to
them and the individual needs more and more doses (Nestoros, 2007) (Fig. 1).
The research evaluates all the above facts and the importance of psychological health
and pursues an innovative solution to the problem. Concisely, it is suggested that the
appropriate management of the spatial qualities could help the human overcome
everyday stressful situations and eliminate the use of medicines to a feasible extent.
Initially, the human sensory systems, the endocrine system and human feelings were examined, as well as their interrelation and the role of the brain in this process, in order to make the function of the human organism clear and understandable. The whole procedure, from the moment a stimulant is perceived to the final production of behavior and feelings as a response to that stimulant, was then analyzed. To achieve that, knowledge from the fields of Biology, Pharmacy and Psychology has been combined. In addition, Biological Psychology studies the relation
between the nervous system and human behavior, and the endocrine system, as well
(Kalat, 2001).
The research has shown that the human body has a plethora of different kinds
of receptors, and each kind of them is specialized in recognizing specific stimulants.
When the individual perceives a stimulant from his environment through his senses,
the information is transferred through the nerves from the Peripheral Nervous System
to the Central Nervous System and then to the brain. The brain processes the information and commands the Central Nervous System and then the Peripheral Nervous
System to produce specific hormones and behavior. While the brain causes the reaction, the endocrine system and the nervous system are responsible for carrying out
the reaction. Hence, the process that takes place is as follows:
Stimulant > Peripheral Nervous System > Central Nervous System > Brain >
> Central Nervous System > Peripheral Nervous System >
> Reaction (hormones and behavior) (Fig. 2)
The hormones are the link between the human senses and the feelings, the mechanism that converts the message of the stimulant to emotions. Therefore, it was further
studied how the senses of vision, smell, touch and hearing work and the role of specific hormones in changing the individual's mood (Fig. 3). The choice of the hormones examined in this research is based on the feelings which each one of them causes when
released in the blood and these are melatonin, dopamine, endorphins, oxytocin, serotonin, cortisol and adrenaline (Fig. 4).
Melatonin is released in the blood and controls the transition from sleep to wakefulness and vice versa in a safe way (Cardinali, 1998: 9-15). Darkness contributes to the production of melatonin while light prevents it. When the human is exposed to bright light during the nighttime or to dim light during the day, the production of melatonin is affected and, as a result, the human organism does not function properly. According to research, a possible disorder of its levels is related to seasonal affective
depression and other important symptoms, such as insomnia and fatigue (Vander,
Sherman, Luciano, and Tsakopoulos, 2001: 398).
Dopamine is responsible for several functions in the brain. It influences the behavior, the motion, the sleep, the mood, the motivation and the learning. Insufficient
quantity of it can cause Parkinson’s disease which makes the individual unable to
move normally (Fahn, 2006). Dopamine is also connected with the motivation of the
human to engage in certain activities and with the feeling of pleasure.
Endorphins are produced during exercising or when the human feels excitement,
pain or love. They resemble opium substances and have the ability to cause analgesia and euphoria and to suppress the feelings of stress and aggressive behavior.
Oxytocin is the hormone of love and trust. It can be produced by a caress, and an experiment conducted at the Medical School of the University of Miami showed that after a thirty-minute massage on the neck, the oxytocin levels are increased and the feelings of depression and stress are reduced (Medical School of Miami).
Serotonin controls the human’s mood, sleep and appetite. It calms the body and
causes euphoria. Research shows that low serotonin levels are related to depression, and the psychologist Molly Crockett (2009) of the University of Cambridge showed that there is a connection between abnormal serotonin levels and the aggressive
behavior of the individual (Crockett, 2009: 76-86).
The hormone cortisol is released when the human experiences stressful conditions (Kolb and Whishaw, 2009: 310-2). It is the response of the organism to stress and it activates the human metabolism (Neave, 2008: 201-4). However, its continual excretion due to stress can have undesirable consequences.
The feelings examined in this research are aggressive behavior, anger, stress, depression and fear; they are all negative feelings with social roots, which the human often feels unable to deal with, and so turns to drugs (Fig. 5). Nowadays, studies indicate that hormones can affect human feelings to the extent
that senses do (Neave, 2008: 53-5). The results led to the conclusion that the human
senses receive stimulants from the surrounding space which are transferred to the
brain through the nervous system. Then, the brain orders the excretion of the corresponding hormones as a response to the stimulants. Ultimately, the fluctuation in the levels
of the hormones in the blood causes changes in the human’s feelings. According to
the above procedure occurring in the body, it is an indisputable fact that architecture could suppress psychological disorders, since the spatial stimulants offered to the human can be controlled. The spatial qualities, for example the red color or the smooth texture of a surface, have a great impact on the user's emotions. Thus, an attempt was made to investigate how certain qualities are related to certain feelings, based on scientific experiments already conducted (Sommer, 2008). The spatial qualities examined
are the textures and the temperature in a room, the music or the noise levels in it, the
smells effused, the lighting and the colors prevailing on the surfaces (Fig. 6).
More specifically, the bibliographic research has shown that smooth and flat textures, like glass or plastic materials, evoke a pleasant and calm feeling. They imply clearness and serenity and make the individual feel relaxed. Additionally, they suppress stress and anger by increasing the serotonin levels (http://www.bookrags.com/tandf/texture-perception-tf/, 2010). Rough textures, for example stone textures, correspond to violent and hard attributes and they intensify the contrast between light and shadow. When the user touches such surfaces in his surroundings, he is urged to action and feels energetic, and the hormone dopamine is produced. However, high levels of it in the blood can cause anger. Hence, rough textures could be useful in some cases, for instance in a gym, but they should be avoided in the case of education rooms. Lastly, soft, fluffy and unshaped textures enhance a sense of warmth,
safety and privacy. They cause an increase in the levels of oxytocin and the human
feels protected and he can easily trust someone, as well. Soft textures are pleasant and
appealing to the human senses. When they are touched by the individual, they could
be curative and reduce fear (Fig. 7).
Regarding the temperature of a room, the suitable temperature for the human when he rests is 18-20 °C and when he works it is 15-18 °C. In cases of completely improper temperature, the human is not able to cope with his activities and the brain
commands adrenaline excretion. As a result, he might feel nervous, display aggressive behavior, or even experience stress (Bell, Green, Fisher and Baum, 2005: 169-72) (Fig. 8).
It has been scientifically shown that music has a positive impact in cases of stress,
insomnia, depression, psychosomatic disorders, educational difficulties, pregnancy
and social rehabilitation (Dritsas, 2007). Harmonic sounds activate brain areas that
are related to pleasure and satisfaction and they reduce cortisol levels and thus stress.
They also stimulate the production of endorphins (Fig. 9).
Furthermore, high noise levels are the reason for many psychological and bodily
disorders in contemporary cities. The traffic noise, the noise of building construction
or the noise due to housework affect the nervous system and cause an increase in
adrenaline and cortisol levels, resulting in stress, aggressive behavior and symptoms
like headaches, nausea, fatigue, insomnia and nervous shock. According to calculations, one in three Greeks suffers from psychological problems because of environmental issues. Also, a study conducted in the Lund University Hospital on a sample of 28,000 people showed that exposure to noise over 60 dB increases the danger of hypertension by 25%, while noise over 64 dB increases it by 90% (http://ixoripansi.gr,
2011) (Fig. 10).
Regarding the sense of smell, aromatherapy has a great impact on the brain and the body because it awakens memories and emotions. It contributes to the reduction of stress, which may lead to numerous pathological problems, such as depression, apathy and melancholy. Although there are hundreds of different essential oils, not all of them have been studied for their beneficial attributes. However, a study held at Ohio State University in 2008 among 56 healthy people has shown that the smell of lemon can improve the mood notably and helps the human relax, since
it increases the serotonin levels in the blood (http://neuroscientificallychallenged.
blogspot.com, 2008). Several studies have proved that lavender has a soothing effect
on stress and insomnia and it promotes the reduction of endorphins. Furthermore,
during an experiment in 2003, people working in a room aromatized with rosemary scent were more vigilant and had better performance than those working in odorless spaces (Moss, Cook, Wesnes, and Duckett, 2003). In addition to the previous experiment, another study conducted in 2007 concluded that after breathing rosemary for five minutes, the levels of cortisol in the human organism decrease and thus the feeling of stress is suppressed (Moss, Cook, Wesnes, and Duckett, 2003) (Fig. 11).
The quality and the quantity of the lighting to which the individual is exposed daily control several biological functions, like the biorhythms and the production of hormones, as well as the mood and the emotional state of the human. This happens because two hormones affected by light, melatonin and serotonin, are crucial to psychological health, even for people with vision problems or for the blind. More specifically, low light levels or lack of light cause the excretion of melatonin and prevent the production of serotonin, and thus the human is prepared for sleep. In contrast, bright light suppresses the production of melatonin and increases the serotonin levels, urging the human to action. An imbalance in the normal hormonal levels of the body, that is,
the continuous exposure to light during nighttime or the lack of adequate light during daytime, can cause symptoms such as insomnia, fatigue, stomachache, headache,
and psychological disorders such as depression and seasonal affective disorder (SAD)
(Vander, Sherman, Luciano, and Tsakopoulos, 2001). The usual treatment for these cases is phototherapy with natural or artificial light and the prescription of medicines that
change the serotonin levels (Fig. 12).
Colors have been proved to change the brain activity, as well as the production of
the hormones related to the mood and the energy levels. It is considered that every
color interacts with the endocrine system in a different way, in order to support or to
prevent the production of certain hormones. Dr. Willard R. Daggett and Steven J. Gertel investigated how colors affect
the human and concluded that specific colors and shapes have an impact
on students’ health, feelings, behavior
and performance, depending on their
culture, age, sex and the subject they
work on (Daggett, Cobble, and Gertel,
2008). In research on the behavior of prisoners, raised violence was found as a response to the colors red and yellow, since they increase the levels of dopamine and adrenaline. In contrast, blue and green increase serotonin levels and relax the human, and
pink has a soothing action, since it was
found to help the production of oxytocin – the hormone responsible for serenity and not stress- and to suppress
the aggressive behavior of the prisoners (Daggett, Cobble, and Gertel, 2008).
Relevant studies have shown that red light increases the strength of the athletes and
improves their performance, supplying them with fast energy, while blue light offers
stable energy to them. Finally, orange contributes to the production of endorphins
and helps the individual relax and eliminate feelings of anger (Fig. 13).
Based on the studies that have been described above and other similar cases of
research, a table named Neurobiological Network (Fig. 14) has been compiled, containing certain spatial qualities and their consequences on the human organism and
feelings. This table can constitute a useful tool in the hands of architects and designers, helping them create human-centred spaces. The architectural qualities that have been included affect the individual, and any change in them has an impact on the human's hormonal levels and emotional balance. Textures, temperature, music, noise, smells, light and colors are only some indicative spatial qualities, and the table can be enriched through further research.
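A table of this kind can also be encoded directly as a data structure, so that it can later drive the responsive behaviour proposed in the next section. The sketch below is a hypothetical, simplified encoding in Python of a few of the relations reported above; the entries, names and helper function are illustrative assumptions introduced here, not part of the original table, and they would need to be calibrated against the full Neurobiological Network and the cited studies.

# Hypothetical, simplified encoding of part of the Neurobiological Network table.
# Each spatial quality is mapped to the hormone it is reported to promote and to the
# negative feelings it is reported to counteract; all values are illustrative only.
NEUROBIOLOGICAL_NETWORK = {
    "smooth_texture":   {"promotes": "serotonin",  "counteracts": ["stress", "anger"]},
    "soft_texture":     {"promotes": "oxytocin",   "counteracts": ["fear"]},
    "rough_texture":    {"promotes": "dopamine",   "counteracts": []},  # energising; excess may cause anger
    "blue_green_color": {"promotes": "serotonin",  "counteracts": ["stress", "aggressive behavior"]},
    "pink_color":       {"promotes": "oxytocin",   "counteracts": ["aggressive behavior"]},
    "orange_color":     {"promotes": "endorphins", "counteracts": ["anger"]},
    "bright_light":     {"promotes": "serotonin",  "counteracts": ["depression"]},
    "lemon_scent":      {"promotes": "serotonin",  "counteracts": ["stress"]},
    "harmonic_music":   {"promotes": "endorphins", "counteracts": ["stress"]},
}

def qualities_for(feeling):
    """Return the spatial qualities reported to counteract a given negative feeling."""
    return [quality for quality, effect in NEUROBIOLOGICAL_NETWORK.items()
            if feeling in effect["counteracts"]]

# Example: qualities_for("stress")
# -> ['smooth_texture', 'blue_green_color', 'lemon_scent', 'harmonic_music']

Encoding the table in this way keeps it readable and editable by designers, while also making it directly usable by a responsive system such as the one proposed below.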
Neurospace System
This research suggests the creation of the Neurospace System between the human
and the space in order to face and suppress negative feelings. This system will allow
the space to change as the user's feelings alter through time. This is possible if sensors installed in a room and programmed to sense the user's hormonal levels, and thus his emotional state, respond by ordering the space to change its qualities according to the table and so help the individual overcome negative situations. By changing the spatial qualities, the human senses are activated and send messages to the brain for the production of certain hormones. Therefore, the human can avoid the use, or at least the overuse, of medicines when encountering aggressive behavior, anger,
stress, depression and fear (Fig. 15).
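To make the proposed loop more concrete, the following sketch outlines, again in Python, one possible control cycle for such a system: read the hormonal levels from the sensors, infer the dominant negative feeling, look up counteracting spatial adjustments in a table such as the one sketched above, and command the corresponding actuators. All thresholds, rules, function names and devices are hypothetical placeholders introduced for illustration only; the actual sensing and actuation technologies available at present are discussed below.

import time

# Illustrative thresholds for inferring a dominant negative feeling from normalised
# hormone readings (values between 0 and 1); these are assumptions, not clinical data.
RULES = [
    ("stress",              lambda h: h.get("cortisol", 0.0) > 0.7),
    ("depression",          lambda h: h.get("serotonin", 1.0) < 0.3),
    ("aggressive behavior", lambda h: h.get("adrenaline", 0.0) > 0.8),
]

# Spatial adjustments per feeling, derived from a table such as the Neurobiological Network.
ADJUSTMENTS = {
    "stress":              {"color": "blue", "scent": "lemon", "music": "harmonic"},
    "depression":          {"light": "bright", "color": "orange"},
    "aggressive behavior": {"color": "pink", "texture": "soft"},
}

def infer_feeling(hormones):
    """Return the first negative feeling whose rule matches the sensor readings."""
    for feeling, rule in RULES:
        if rule(hormones):
            return feeling
    return None

def control_cycle(read_sensors, apply_to_room, period_s=60.0):
    """Continuously sense the user's state and reconfigure the room accordingly."""
    while True:
        hormones = read_sensors()                # e.g. readings from a wearable or saliva sensor
        feeling = infer_feeling(hormones)
        if feeling is not None:
            apply_to_room(ADJUSTMENTS[feeling])  # drive the lighting, color, scent and music actuators
        time.sleep(period_s)

In practice, the read_sensors and apply_to_room callables would wrap whatever sensing devices and responsive materials are available, some of which are reviewed in what follows.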
Technology is developing continually, offering new ways of realizing the Neurospace System. For the moment, this has been achieved only to some extent. Specifically, the estimation of the hormonal levels of a user in a room is possible through the analysis of saliva,
urine or blood. Additionally, the Bayer Company has constructed the CONTOUR® USB,
a glucose meter for those suffering from diabetes, which consists of the CONTOUR test strips and the MICROLET™2 lancing device. Its use is simple and fast, and similar devices could easily be used for hormones. Researchers at the University of California, Los Angeles (UCLA), in collaboration with Aneeve Nanotechnologies LLC, have already been trying to create special sensors for progesterone and estrogens, based on Bayer's idea. The sensors will be portable, easy to use and low-cost. The researchers aim at the controlled dosing of medicines (http://www.medicalnewstoday.com, 2010).
Regarding the materials available for the construction of a space that could support such functions, microprocessors programmed according to the table can be placed in the room to control its attributes. Some examples of materials follow.
Fig. 15
Fig. 16
Fig. 17
Fig. 18
Fig. 19
Textures: There are materials that can
change their nature by storing heat and
alter from smooth to rough or soft
(Oungrinis, 2010).
Temperature: The room temperature
can be controlled by air-conditioning
systems, but there are also thermoelectric materials that can cause variations
in temperature due to changes in the
electric field (Oungrinis, 2010).
Music/Noise: The quality of the music in
a room is affected by the materials in
the surroundings. For instance, a concrete wall has different acoustics than
a wooden one, due to their different absorption abilities. A room with several hard surfaces, like stone and concrete,
has worse acoustics than a room which
has wallpaper and carpets on the floor. The same factors affect the noise levels, as well (Oungrinis, 2010).
Fig. 20
Smells: The Japanese company Shimizu, which works on scented spaces, has incorporated odor into the air-conditioning system so that it is diffused through the space. Murayama Noboru,
Yamauchi Satoshi and Suzuki Koichi created a control system for olfactory stimulants,
which can be installed in the room and remove undesirable smells or emit satisfactory
scents. Other Chinese companies have manufactured materials suitable for the floor
that are infused with different smells and when someone walks on them, odors are
diffused in the air (Oungrinis, 2010).
Light: There are numerous materials related to light. Luminescent materials can convert energy into light at low temperatures; phosphorescent materials convert energy into light with a delay; electroluminescent materials convert electricity into visible light; and fluorescent materials absorb radiation of one wavelength and then radiate light. When more light is needed, these materials applied to different surfaces can provide it (Oungrinis, 2010).
Color: The color of a room can be changed using thermochromic materials, which alter according to the temperature; photochromic materials, which alter according to the lighting; or electrochromic materials, which change color when exposed to voltage. Electrochromic materials consist of a weave of thin metal wires connected to a battery. Additionally, optical fibers can be embedded within other materials, like a concrete wall, and emit light in various colors (Oungrinis, 2010).
Figures 16-20 give indicative examples of spaces that transform their qualities in order to counter aggressive behavior, anger, stress, depression and fear respectively.
The energy needed for the above spatial changes is produced by the human himself. Body motion produces kinetic energy, body temperature produces thermal energy and chemical processes produce chemical energy. Researchers in the U.S.A. have created nanofibers that are embedded in clothing and produce electric energy when the human moves. The nanofibers exploit piezoelectric phenomena, which convert mechanical energy into electrical energy. As mentioned in Nature magazine, the scientists claim that these materials could be used in tents or other constructions in order to exploit wind power.
These are only a few of the ways in which the results of this research can be applied, and technology continually offers new ones through innovative smart materials and systems. Responsive architecture can contribute to an improved environment for individuals. Human beings sometimes feel vulnerable, and architecture can help them improve their psychological health and avoid the irrational use of medication, at least in everyday situations.
References
Bell, P., Green, T., Fisher, J., & Baum, A. (2005). Environmental psychology. Orlando: Lawrence Erlbaum.
Cardinali, D.P. (1998). The human body circadian: How the biologic clock influences sleep and
emotion. Neuroendocrinology Letters, 21, 9–15.
CCHRGR, Available: http://cchrgr.blogspot.com (11 November 2011)
Crockett, M.J. (2009, June). The Neurochemistry of Fairness Clarifying the Link between Serotonin
and Prosocial Behavior. Annals of the New York Academy of Sciences, 1167, 76-86.
Daggett, W.R., Cobble, J.E., & Gertel, S.J. (2008). Color in an Optimum Learning Environment. New
York: International Center for Leadership in Education.
Dritsas (2007, March). Lecture «Medicine and Music», Evangelismos Hospital, Greek Association of Professional Music-Therapists, 3.
Fahn, S. (2006). The History of Levodopa as it Pertains to Parkinson’s disease. Movement Disorder
Society’s 10th International Congress of Parkinson’s Disease and Movement Disorders on November
1, 2006, in Kyoto, Japan.
Kalat, J.W. (2001). Biological Psychology, Books A & B. Translated by Kastelakis & Christidis. Athens: Ellin Publications.
Kastelakis, A. (2007). Physiology of Behavior. University of Crete, School of Social Sciences, Department of Psychology.
Kolb, B., & Whishaw, I.Q. (2009). Brain and Behavior. Translated by Kastelakis, A.A., & Panagis, G. Athens: Medical Publications Pasxalidis.
Moss, M., Cook, J., Wesnes, K., & Duckett, P. (2003). Aromas of rosemary and lavender essential oils
differentially affect cognition and mood in healthy adults. The International Journal of Neuroscience.
Neave, N. (2008). Hormones and Behavior, A Psychological Approach. Cambridge: Cambridge University Press.
Nestoros, N., I. (2007). Notes on Neuropharmacy. University of Crete, School of Social Sciences,
Department of Psychology.
Oungrinis, K. (2010). Lecture «Smart Materials» in the course «Contemporary Materials and Construction Methods», Technical University of Crete, School of Architecture.
Piperopoulos, G. (2010, October). Lecture «Neurosis – Psychosomatic Disorders», Open University, Chania.
Sommer, R. (2008). Personal Space. The Behavioral Basis of Design. Bristol: Bosko Books.
Vander, M.D., Sherman, J., Luciano, D., & Tsakopoulos, M. (2001). Physiology of the Human, Books A & B. Athens: Medical Publications Pasxalidis.
http://ixoripansi.gr/ (retrieved: 07.02.2011).
http://neuroscientificallychallenged.blogspot.com (March 2008).
http://www.bookrags.com/tandf/texture-perception-tf/ (03 November 2010).
http:// www.medicalnewstoday.com (11 January 2010).
http://www.nimh.nih.gov (04 August 2010).
University of Miami Medical School, Touch Research Institute. Available: http://www6.miami.edu
Anna Klara Veltsista
Nadia Charalambous
University of Cyprus, School of Engineering
Department of Architecture
Nicosia
Cyprus
Architectural Design Studio:
Reconsidering the Digital
through Embedded Technologies
The design studio has undoubtedly been at the core of architectural design education since its inception in the nineteenth century. The overriding primacy given to the
studio as the main forum for creative exploration, interaction and assimilation remains
a common characteristic of schools of architecture. During the past decades the traditional design studio has come under considerable criticism, driven by social, cultural, epistemological and economic factors, by developments in knowledge, research and technology, and by the increased use of information technology and computer-aided
design. However, we believe that it can still be rightfully considered as the foundation
of architectural education albeit in a possibly different form. To understand why this is,
we attempt in this paper a sober reflection upon the possible role of digital technologies in design, through a diploma project, developed in a studio environment “dominated” in many aspects by the use of information technology.
Underlying the nature of the design process today is the question of the relationship between the digital and the analogue worlds, which forms a central issue for architectural research. It is acknowledged that digital technology has brought a radical
change in the contextual frameworks in which architecture and architectural production are normally placed. Examining such issues, recent research work suggested
an interesting finding, that advances in digital technologies are paving the way to
achieve “integrated design” -a type of practice in which various disciplines involved in
building design work together to achieve efficiency and other benefits. These technologies enable the designers to collaborate, visualize, research and modify building
performance with relatively high accuracy. If such an approach gradually becomes
widespread, architectural education, and the studio in particular, needs to take it into consideration. Per Olaf Fjeld1 recently pointed out that architecture is evolving far more into an infrastructure capable of taking on a variety of spatial and functional programs than into an actual physical edifice. Critical thinking thus becomes an essential instrument in a research-based architectural education, which needs to actively navigate towards strengthening reflective and inventive capacity. This approach requires the
integration of technology and allied disciplines at the outset of the design process.
The application of such an integrated approach to the studio, building on individual and research-based knowledge throughout the design process, was explored
through a diploma proposal addressing the future of domestic space in the light of
contemporary digital developments. The architectural design was pursued within an
interdisciplinary context of development, where the building morphology, construction and materials were considered as equally important design parameters, and were
investigated interactively from the initial design stages.
These parameters were explored through both digital and analogue forms of representation, across the different scales of design. The diploma project presented used multiple iterative conversions between analogue and digital media to investigate the design concept through material, texture, structure and shape in a detailed
and intuitive way. The proposal sought to examine and at the same time question the
use of digital technologies as a facilitator and catalyst for the promotion of the underlying pedagogical objective of integrated design and research-produced knowledge,
at every stage of the design process. A design methodology, enhancing an explorative design process and employing representation as research, was also explored, as
suggested by Reinhardt (2008). An integrative approach was used, in which design
elements and partial solutions in different media (analogue and digital design techniques) were cross-referenced and re-informed each other.
An important goal which preoccupied the student throughout the process was to
explore the effects of such a digital approach on the design process itself and on the final building proposed, as well as on her own role as an architect. The whole process and
final proposal suggests that there is a way to create a softer, more human interface
both to the process of design and the final building outcome which adopts a digital
approach. The designer's individualism, an integrated design methodology and research-based design which takes into consideration the complexity of the human senses and behavior, are possibly three of the most important issues central to the approach towards architectural education in general and the architectural studio in particular, in relation to digital technologies. The paper thus attempts to explore the interrelation between a) the architects' individualism, b) research-based findings during
the design process through experimentation and c) an integrated design approach,
where information technology is integrated from an early design stage with morphology, construction and other design parameters, underpinned by human values and
needs. Through a presentation of the diploma thesis proposal, we discuss a number
of important findings in relation to the present and possibly future form of the design
studio.
We suggest that the designer’s individual knowledge and intuition, the presence
of the human being that is, may not only be the basis for creative action, but also the
basis for understanding and interpretation. By taking into account design constraints,
architects rely on individual knowledge brought together with research-based knowledge, through a cyclical process of experimentation, evaluation, self reflection and
redefinition. Design informed and enriched at every stage by a research-based process, might well be the transmission and transformation of the designer’s individual
knowledge, a process of elaboration and discovery which facilitates and enhances design creativity and possibly allows for a multiplicity of approaches through a range of
possibilities.
Domestic Architecture and the Digital World
The contemporary dweller is faced with many, perhaps dramatic, changes - economic,
social, and technological; changes which also have spatial consequences. The concept
of contemporary living in general and domestic space in particular, seems to remain
quite static, based on ideas of domestic space organization suggested in the early
parts of the 20th c.2 We seem to continue to produce dwelling models which do not
correspond to the contemporary user. Domestic space organization has always responded – or should respond – to its time, but this is not the case today. This problem creates the need for a new house model which will respond to the new conditions and demands of a digital era. The architect today should understand that his role has changed; he should design for the new needs and according to them. The new program cannot be determined in detail, nor even conceived clearly. It arises from the new conditions and has aspects that reflect the complexity of social relations, space and time. In short, the change should be a deliberate move that rejects any attempt at standardization of needs and spaces. We should recognize that the dwelling has
different meaning for every one of us; it refers to different qualities and different relations; a digital approach needs to respond to this “subjectivity”.
For some, the real house can be everywhere as long as its user can have the sense
of intimacy. Nevertheless, we cannot actually dwell in this sense of intimacy. The reworking of dwelling relies on two basic points: the redefinition of the idea of the house, and the multiple and complex components of everyday needs. In a world that never stops evolving, it is hard not to comment on architecture's new nature. The architectural space seems to be changing into something new, something resulting from the
constantly changing reality. The present is undeniably a period of radical transition.
This new era has a hybrid form, something between the analogue world and the digital culture that dominates our everyday life. In this hybrid form of life, we still have
the same needs, the same choices to make, but different spaces to experience and a
new kind of architecture to deal with.
What comes to mind when talking about architectural experience is the balance between the analogue and the digital world. Today, the texture of both the city and the building is becoming more digital, introducing dimensions such as networks, information and time, and leaving behind limits and fixed spaces. Time becomes a variable
of its own, not a general notion that nobody dares to ‘touch’. We have the media to
control it, to change it and to choose the right moment for everything. Spaces, public
or private, can connect with each other through the new technologies, overcoming
physical distance. All these media are around us every day, in our houses. As William
J. Mitchell3 proposes in his book e-topia, 'the online world which once consisted of
ephemeral and disconnected fragments has become increasingly persistent, interconnected, and unified’. A global network of spaces, buildings and users can be created at any
time, changing the dynamics of architecture and the role of the architect. Everyone
can be connected to these networks while being at home. New relations emerge and
the user, who is surrounded by all these networks, is in constant pursuit of the sense
of intimacy in his private world. The activation of these networks becomes possible
with information, and the building, which is programmed to accept this information, transforms it and changes its shape in real time. This kind of design can evoke
new experiences for the user, experiences that are made possible by information, and
the interaction of the user with all these networks from his home.
As mentioned, networks can now be considered as the “new limits”4. The right
“node” can make a building successful or not. The first step is to make sure that we
connect the house with the right networks. The sense of intimacy can only be created
when our house becomes a node in one or more networks. It is then that people can satisfy everyday needs different from those we traditionally knew; needs that relate to
the digital era. We then need to explore what happens with the sense of privacy when
all these networks intrude one’s house. Most of today’s social networks that include
interaction have several filters that can be controlled by the user so as to still feel safe
and private in his house. His own choice transforms his house to more or less private.
This project is not concerned with what makes users reveal their privacy in their domestic environment. What concerns us as architects is who exactly this "contemporary" user might be and how his needs could be met through the incorporation of digital technologies which do not ignore the human factor. So, the dweller
under consideration might be anyone who lives in the so called megalopolis, where
nothing works according to a strict program or rhythm; people who are away from
their family, friends; people who work at home or even people who due to a disability
are obliged to stay at home. These people still have the need to communicate with
the exterior in a way that provides full experience of their physical environment.
THE ubiHOME PROJECT
An Interactive Installation for the Virtual Connection of Domestic Spaces
The project sets out to explore the ways in which contemporary domestic space is affected by the "presence" of digital technologies. In what ways does the dwelling respond to this new kind of technology, and how might the dweller-user under study redefine spatial and social behavior? The objective of this diploma research
has therefore been to create a new scenario of today’s experience of the user in his
house, in a digital era where architectural design seems to undergo dramatic changes.
The proposal suggested that there is need for a new dwelling model, in an era fully
digitalized and characterized by both the analogue and digital worlds. It suggested
the introduction of the digital technology in the domestic world, in a way that can be
naturally experienced and manipulated by the user, through an experimental virtual
connection of two houses in order to achieve a different way of communication facilitated and possibly demanded by recent technological advances.
Following an initial, thorough investigation in relation to the aforementioned
questions, the project set out to explore which could possibly be the most suitable
way to incorporate digital technology in the design of domestic space in such a way
that the user could interact through it with exterior space in a natural, three dimensional way. In other words, how could digital technologies facilitating social interaction and communication with the external environment, be incorporated in the domestic space in a natural, “silent” way which takes into consideration users’ needs and
patterns of everyday living.
Embedded Technologies
Digital networks such as Facebook or Twitter allow social networking and
interaction from the interior of the house. People inform others about what they do
or where they are in a specific moment. In some cases, such as Skype, three dimensional images of the surrounding physical environment are introduced through the
communication of users. Instead of just using flat screens, keyboards and mice, people
should be able to interact with their computers and other devices by moving around
and through real physical objects. In short by “acting” naturally.
The Mixed Reality Architecture5 project for example, dynamically links and overlays
physical and virtual spaces. By constructing and controlling co-presence, architecture
creates the potential for social interaction on which the reproduction of social forms,
such as organizational or community structures, and the generation of new forms ultimately depend. The importance of MRA lies in the potential it provides to
enable remote communication and interaction between people and groups, in ways
that are directly analogous to those offered by physical architecture. These new architectural forms afford near instant access to non-adjacent parts and, as Virilio points
out, the distinction between near and far becomes irrelevant here: the spaces travelled across are lost and become invisible. The spatiality of communication across physical and virtual environments will support social interaction and can convert digital communication media into a more generative form, familiar from physical spaces.
Fig. 1
Mixed Reality Architecture.
In a similar line of thinking, The Urban Carpet6 aimed to create a physical scenario where people can interact, by means of technology, with the urban and social environment. The urban carpet is a new kind of interactive technological platform, an LED urban carpet in a public space that could enhance social awareness and interaction
between people nearby. A prototype carpet was set in three different locations of the
city of Bath, investigating how people move, congregate and socialize round the interactive installation.
In the MIT Media Lab, a project called Sixth Sense is currently exploring ways to make technology tangible. It is a non-architectural project with a very intelligent approach to how the nature of technology might be seen from now on. Sixth Sense frees information by seamlessly integrating it with reality, thus making the entire world your computer. Hiroshi Ishii, from the MIT Media Lab, wants people to interact with their computers and other devices by moving around and by handling real physical objects. In short, by doing what comes naturally. For Ishii, the point is not just to make something that works; his careful attention to material and design makes it clear that he believes the experience of interacting should be pleasing, not just functional. On the other hand, the user himself demands that the technology around him be tangible. From his early years he has been in touch with every form of information and technology, and as a result he can adapt to any evolution. It is no accident that the Apple technology which introduced touch screens into our everyday life became popular.
A social reading of the potential application of such technologies in the 21st century
would inevitably raise issues about the perception of the users in such spaces. Privacy is an issue that could drive such an application to failure. Good knowledge of how
technology functions will lead to the best control of it. We need to give the users the
choice to control it at any time. Architecture means nothing if the user is not comfortable in it. In the case of ubiquitous computing, we need to ensure that it can be manipulated by people in the best possible way facilitating contemporary forms of social
interaction in an integrated way.
Fig. 2
Sixth Sense at the MIT Media Lab.
Ubiquitous Computing
According to Mark Weiser, the world is the next interface and computers live out in the world with people, as part of a challenging integration of human factors, computer science, engineering and the social sciences. Weiser describes this technology as optimally 'calm' when it recedes into the background of our lives. It transcends traditional
user interfaces by being part of small devices and appliances, but also large-scale
walls, buildings, and furniture. Computers allow us to capture the movement of matter and to manipulate and redeploy physical traits, qualities, and behaviors into novel
composites. We should redirect research from the screen-based simulations that have
predominated in the last fifteen years, towards the considerably greater intelligence
that is already impressed into matter, including the investigation of new scientific and
industrial processes and the new materials in which these processes are embedded.
Tobi Schneidler designed the REMOTEHOME, a communication system extending the idea of home as a private and situated space to one that connects homes in two different cities. While communication and media technologies, including mobile phones
and instant messaging, are already creating new scenarios of sharing friendship and
intimacy over long distances, what would happen if real-time mediated communication were to become part of our everyday environment, the spaces we inhabit, the furniture we use and the items we cherish? In this case, a model apartment was set up at
the Science Museum in London and the Raumlabor in Berlin. Remote audiences could
participate and interact with each other in real time via sensory furniture which detects and distributes impressions rather than information about the inhabitants. Those
cues of occupation are then transmitted via the internet to the other side, where they
surface through kinetic, tangible features and light installations. This way, the home
stretches beyond borders, and helps friends to stay in touch, literally, through tangible
and sensual communication, an emotional and intuitive form of presence.
Interaction Design in Contemporary Domestic Space
How could interactive design be incorporated in contemporary domestic environments in ways such as those described above? In ways which will seem as natural as
breathing? The two main aspects one needs to consider are the social relations the
resident has with the exterior of his home, and the ways in which information filters into the domestic space through several means. Concerning the social relations,
it is easy to imagine a resident who chats on Facebook or Skype from his house, or a resident who is trying to synchronize his activities with those of others. On the other hand, information also has a significant power over the resident. The real-world phenomena of both the interior and the exterior can be coded by the appropriate technology, which "gives out" to the house the information as well as ways to manipulate it.
The need for such ways of communication through the embedded technologies
was then explored. As the house is the basic space that hosts these activities, one can
foresee the new possibilities that arise. The experience of space during those activities and the communication between distant dwellers are the focus of this proposal.
The question now turns to the most appropriate ways through which digital communication with the external world can be naturally embedded in the spaces we dwell,
taking into consideration basic domestic concepts such as intimacy, privacy, safety.
Embedding digital technology into the building is suggested to be a way worth exploring. It is suggested that information and communication technology become an
integrated part of the domestic environment, potentially available for anyone at all
times. Traditional user interfaces will be transcended by being part of small devices
and appliances, but also large-scale walls, buildings, and furniture.
The Virtuality of the Domestic World
Interaction design in the house
The basic idea is that every dwelling can be synchronized with others in terms of certain domestic activities. Furthermore, when we choose to connect with another house
and synchronize a particular activity, the distance is no longer an obstacle. The connection can be virtual, proposing a global network of shared actions which can act
regardless of time or distance.
Fig. 3
The two houses under study.
Creating a shared space
Creating a shared space presupposes at least one common activity and two different physical spaces. After the choice of the activity, every dweller will have a different sense of his space. The users encounter their houses with different limits this time, with virtual extensions that allow the communication they need along with the spatial sense of this communication.
Fig. 4
Creating a shared space.
What do we share?
For every house that our house is connected to, a different layer of nodes/activities is
created. The shared activities are defined by the dwellers. Those may be the same or
different. All the layers are added together and appear in the house as nodes of a bigger network of virtually connected houses.
All the layers of shared activities appear, forming a unique mass of connections,
including the nodes one can use in his house.
Fig. 5
The “carpet” of shared nodes.
Shared activities
Every dwelling can be synchronized with others in terms of certain domestic activities. Furthermore, when we choose to connect with another house and synchronize a
particular activity, the distance is no longer an obstacle. The connection can be virtual,
proposing a global network of shared actions which can act regardless of the time or the
distance between the physical domestic spaces and the dwellers.
Fig. 6
Choice of activities.
Who do we share with?
Each house contains certain shared activities which are determined by the network we choose to interact with. Online and offline activities create a unique carpet
of nodes from several layers of activities, different for each house. The traditional relation between private and public is thus redefined and explored. This relation is filtered
through digital media, so that the user may choose if and to what degree he will reveal
his house to the public, or in our case to other private spaces.
Fig. 7
Choice of action is either online or offline, securing the sense of privacy.
Conclusion
Design can no longer be considered as simply “building”; it is more associated with
the production of ideas, routines, contexts, entire social and cultural environments
and processes. Architects are now inventors of those intervening films that seem to
coat everything these days and that one used to call interfaces. Electronics, computation and network connectivity, are increasingly embedded into the objects of our
everyday life. Unless we know how they are designed we won’t have control of what
they do, or what they can do to us. This is why digital technologies need to be treated
by architects as a way to improve design and its final outcome, incorporating humane values in the process. Human senses and behavior need to be decisive factors in
any proposal which adopts digital technologies, even more in the case of domestic
space. Architects as well as potential users can then have a more influential and decisive presence. The designer’s individual knowledge and intuition, the presence of the
human being that is, will then not only be the basis for creative action, but also the
basis for understanding and interpretation of a digital era.
References
Monographs
Lawson, B. (2003) How Designers Think, Architectural Press.
Lawson, B. (2004) What Designers Know, Architectural Press.
Mitchell, W. (1999) e-topia: "Urban Life, Jim – But Not As We Know It", MIT Press.
Schön, D. (1985) The Design Studio: an exploration of its traditions and potentials, RIBA Building
Industry Trust, London.
Vitruvius (1960) The Ten Books on Architecture, translated by Morgan M.H. Harvard University Press.
Vrichea, A. (2003) Living and Dwelling, Ellinika Grammata, Greece.
Papers
Briones, C. (2006) LEDs urban carpet: a portable interactive installation for urban environments. Masters thesis, UCL.
Charalambous N. and Hadjisoteriou M. (2007) “Introductory Architectural Design Studio: (Re)Searching for a New Approach” proceedings of the European Network of Heads of Schools of Architecture
(ENHSA) and the European Association for Architectural Education (EAEE) conference: “Teaching and
Experimenting with Architectural Design: Advances in Technology and Changes in Pedagogy”, pp.
Darke, J. (1984) “The Primary Generator and the Design Process”, Developments in Design Methodology, Nigel Cross, Open University, John Wiley & Sons, pp. 175-188.
Moggridge, B. (2007) "People and Prototypes: Elements of the Design Process" in Moggridge, ed.
Designing Interactions, Cambridge, MIT Press, pp. 729-735.
Per Olaf Fjeld (2008) "Changes of paradigms in the basic understanding of architectural research",
in the ARCC/EAAE conference Architecture and the Digital World, Copenhagen.
Rendell J. (2004) “Architectural Research and Disciplinarity” in Architectural Research Quarterly,
vol.8, no.2, pp. 141-147.
Reinhardt, D. (2008) “Representation as research: Design Model and Media Rotation”, The Journal
of Architecture, 13:2, 185-201.
Sanvido, V.E., Norton, K.J. (1994). Integrated Design-Process Model, Management in Engineering,
10(5), pp. 55-62.
Schnädelbach, H., Penn, A. and Steadman, P. (2007) Mixed Reality Architecture: a dynamic architectural topology. Presented at: 6th International Space Syntax Symposium 2007, Istanbul, Turkey.
Notes
1 Per Olaf Fjeld (2008)
2 Vrichea, A. (2003)
3 Mitchell, W. (1999)
4 Ibid.
5 Mixed Reality Architecture (2007)
6 Urban Carpet (2006)
Mihaela Hărmănescu
Faculty of Urbanism
University of Architecture and Urbanism “Ion Mincu”
Bucureşti, România
Responsive Architecture
through Urban Planning,
Landscape Architecture
and Urban Design
Theoretical Framework
The social and cultural forces associated with globalization have overwritten local
social and cultural practices and globalization has generated a world of restless landscapes in which the more places change the more they seem to look alike and the
less they are able to retain a distinctive sense of place. The connection between rapid industrialization, the acceleration of everyday life, and the decline in the quality of collective life that was first noted by Walter Benjamin (1936, pp. 217-252) has been intensified as post-industrial society has become fragmented and reorganized by the accelerating powers of information technology and transmission. These pressures are closely connected with the materialism of post-industrial society and, as a result, a work-spend cycle has become fundamental to the economic and social dynamics of contemporary society. Speed has become a hallmark of many aspects of planning and strategy, as reflected – and indeed, prompted – by advertising. In advertisements, the pace of people's consumption is often linked to the pleasure they appear to be experiencing; speed and busyness of schedules are transformed from negatives into symptoms of laudable, well-adjusted and fulfilling lifestyles.
Starting from the nineties, in many European cities urban projects have been a primary tool to establish exceptional measures in planning and policy procedures and
also to establish a Neo-liberal urban policy. Great Urban Projects (GUP) were the policy tools to exploit parts of the city, starting from alliances between real estate lobby
groups and politics, favouring a repositioning of economic powers and relationships
between public and private sector.
In this context, the alignment of the political and technical apparatus necessary for the achievement of these GUP has given an important role both to the traditional
city experts (urban planners, economists, architects, mobility experts, engineers) and
to the new professional figures (Development Manager in particular).
Almost all medium-sized European cities are involved in these processes, sharing a similar development perspective: equipping cities for the new international scenarios that lie ahead in terms of competition. The traditional city of clean duality (figure/ground) has disappeared, overtaken by a meshwork of interactions, and "Big Events" (particularly sporting, cultural and leisure events) have been a natural medium for these new forms of urban development.
The most evident effects of this approach have been related to brownfield regeneration and rebuilding within the city, and to the development of wide areas outside the city in response to new demands of culture, tradition, social life, infrastructure, transport and logistics, particularly in harbours and waterfronts, historical areas and points of interest, because the model pattern and basic unit of urban complexity is based on hierarchical levels of complexity of the territory, from micro-scale to macro-scale.
Fig. 1
Urban evolution and complexity.
The continuous transformations of the city's behavior, a sequence of past events that relate to practical reality, are activating an intense urban development, a headquarters of functions and abstract phenomena.
Romanian Background
For three decades now, a change has been under way in Europe in the processes of transformation of our cities: from urban planning (based on regulation) to urban governance (based on a strategic approach and on operational tools). The intention of this paper is to reflect on how it would be possible to attribute a more prominent and efficient role to urban planning, both on the educational side and in planning practice.
Fig. 2
Romanian Planning Framework - General Context.
There are differences between the urban plan, which has a guidance and regulatory role (General UP, Zonal UP, Detailed UP), and the urban project, which is essentially an architectural project but related to neighborhoods (form, composition, structure, context…) and based on the regulations fixed by the Zonal UP and Detailed UP. The latter can itself also be a Zonal UP or Detailed UP, because an urban tissue that can flexibly respond to all local input factors, as well as accommodate desired (planned) goals, is tested against reality to be flexible, functional and permeable.
Fig. 3
Validation of the reality.
The argument for this approach is the contemporary city (Thom Mayne, 2011): the city is an evolving process (Leach, 2009) which does not oppose nature but is inherently part of it. It is not a sum of its elements but an interaction of complex systems, the result of an extreme richness and fragility of striking nature, and nature here is not as chaotic and random as it seems at first glance.
Never static, the contemporary city is dynamic, unstable, and increasingly difficult to trace as a linear process. While cities have traditionally provided stable and hierarchical spatial organizations appropriate to the once relatively uniform nature of social composition and concentrated political power, the contemporary city has liquefied into a dispersed urbanity – poly-nucleated attractors, or downtowns, in which architecture is one more network with infrastructure as its vector of mobility. The urban space is changing dramatically and loses more and more of its coherence. The urban fabric is constantly erased and replaced with new structures.
Educational – Planning Practice Relationship
Dynamic Components and Land Use
Cities themselves perform operations of stratification, destratification and restratification on the flows that traverse them. Henri Lefebvre (2004) describes this organic aesthetic of the flows of life in a city, and its continuous restlessness and unpredictability, as rhythms having a "maritime" quality: "There on the square, there
is something maritime about the rhythms. Currents traverse the masses. Streams
break off, which bring or take away new participants. Some of them go towards the
jaws of the monster, which gobbles them down in order quite quickly to throw them
back up. The tide invades the immense square, then withdraws: flux and reflux.” (Lefebvre, 2004).
Fig. 4
Collective form - complex behavior.
While we comfortably allow environmental models to influence our perception of urban structure, eventually we must translate patterns of human behavior into urban systems and space. The practice of urban planning, which has traditionally been aligned with permanence and stability, must change to accommodate and take advantage of
the rapid changes and increased complexities of contemporary reality.
Spaces are no longer seen as flat and linear but, as De Landa (2000) says, as a flow of matter-energy animated from within by self-organization processes. Furthermore,
it is necessary to introduce the notion of self-organization, which became a crucial
tool of comprehension for today’s complex world (defined by gradients or intensities
of values, with no “deterministic” boundaries).
Focusing on environmental evolution, which produces increasingly complex life
forms over time, the city is a field of permanent genesis; the constant flux of its systems is the means by which its social structure evolves with ever-greater complexity.
Systems never get simpler.
Our time suffers from an inability to organize the possibilities that it has itself produced. While we have relied principally on the quantitative and controlled frameworks of physics and geometry to define and manage the seemingly incompressible, the qualitative and approximate world of biology is now emerging as a more useful model of both scientific and metaphysical explanation.
Developments in the life sciences, ecology, mathematics, systems theory and computation have effected a paradigm shift in how we conceive of organizational processes.
Fig. 5
Complex spatial urban body.
“The contemporary urban environment is composed and recomposed by each individual
every day around literal and virtual itineraries, and not in relation to a fixed arrangement
of places” (Pope, 1996, p32).
The urban system is an imperfect system: in its territorial distribution, spatially, functionally, and so on. Its imperfections become principles generating urban form, which evolves by self-organization and is found at different levels, layers, subsystems and urban systems, constantly networking and subject to the same general conditioning and local-global constraints.
The actual urban material is a concept that considers the exchange of substance in the urban body as an expression of information, but it is important to have interior/exterior and exterior/interior permeability for its evolution. Landscape planning, urban planning and architecture give shape to information, in the sense of information emission, reception, exchange and generation.
Integration, Development and Conservation
Alongside all these exchanges, in the last decade new digital technologies have evolved from being simply representational tools invested in the depiction of existing models of urban space to becoming significant performative machines that have transformed
the ways in which we both conceive and configure space and material. These tools
for design, simulation, and fabrication, have enabled the emergence of new digital
diagrams and parametric landscapes—often emulating genetic and iterative dynamic
evolutionary processes—that are not only radically changing the ways in which we integrate disparate types of information into the design process, but are also significantly altering the methodological strategies that we use for design, fabrication and planning. This technology offers an alternative method of urban production: it designs a flexible framework of relational systems within which activities, events and programs can organically play themselves out, engages the premise of continuous process over static form and, in doing so, presents ways to activate the city.
In the Faculty of Urbanism (Urban Planning and Landscape Design), University of Architecture and Urbanism "Ion Mincu", Bucureşti, the current models of space produced by our students are far more continuous, variant and complex; this is specifically a result of the tools we are using to produce them, an inevitable by-product of the ever-expanding capacities of digital computation and related fabrication technologies as
these intersect with theoretical trajectories that long ago dismantled the social, functional and technological truths of the early part of this century.
The notion of architectural responsiveness through urban planning, landscape architecture and urban design relates to occupants and their activities, to the city, and to
change, as well as to the outside climate. It reflects an organization of society in terms
of a complex division of labor, high levels of technology, high mobility, interdependence of its members in fulfilling economic functions and social relations.
The Department of Urbanism and Country Planning coordinates an important part of this field, in both the theoretical and the design educational process, at all academic levels/degrees (Faculties, School of Advanced Studies). The courses and the urban and landscape studios, adapted to the particular aims of the curriculum of each unit, are generally articulated following the increasing complexity of the education, through synthetic knowledge of planning and landscape architecture, urban renewal, town administration and management.
The main lecture topics are: Urban structures; Urban composition; Urban project; Sociology / Sociology of dwelling; Landscape architecture; Town administration; Recycling of the built stock; Territory planning; IT in Urbanism; Town Traffic.
Fig. 6
Educational research method.
For our teaching program this process could start by looking at three key issues:
- the knowledge that informs design and decision making
- the environmental performance targets to be achieved
- the environmental attributes of the urban context.
Education should take a driving role in this evolution. We need to move beyond the
technical fixes perpetrated by current practice and start extending the architectural
vocabulary towards expressing the temporality of natural and operational cycles in
more diverse and creative ways. Permanently enriched by laying emphasis on the socio-human, aesthetic and economic components of all these aspects, the aim is to develop the comprehensive perspective of the architect-planner-to-be.
We are concerned to introduce elements of Parametric Urbanism, which takes the paradigm and tools of parametric design into the domain of urban planning. If the power of the parametric is usually associated with the succession of design changes, with its ability to produce variations of a single building or to generate versions of building components for a complex building geometry that does not allow for the repetition of elements, what we consider as Parametric Urbanism suggests that these techniques of versioning can be applied to an array of buildings, so that a new version does not replace an older version but instead comes to join and extend the field of simultaneous versions, building up a complex urban field (Schumacher, 2006).
In their projects, the students parametrically explored possible future configurations for the study area (the urban plan) and tried to push for a solution that would maximize the performance (density, height, shading, access to natural elements and connectivity) of the whole region, also addressing the architecture of the buildings, while leaving ample potential for unplanned emergent evolution.
Through these educational exercises, our interest is to envision an urban tissue that could flexibly respond to all local input factors as well as accommodate desired (planned) goals: to differentiate and create a unique lattice that allows for surprise and yet is easily mapped due to its inner space-partitioning algorithm, and to cycle through the analysis and continuously change the main axial map of the area to maximize the integration of the buildings.
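For readers unfamiliar with what "integration" means here: it is the space-syntax measure computed in this workflow with Depthmap, which rewards axial lines that are topologically close to all others. The toy sketch below uses simple closeness on an axial-line adjacency graph as a stand-in; real integration applies further normalisation, and the example graph is invented, so treat this only as an illustration.

```python
# Toy stand-in for the integration analysis run in Depthmap: closeness on an
# axial-line adjacency graph. The example graph is hypothetical.
from collections import deque

def mean_depth(graph, start):
    """Mean topological depth (BFS distance) from one axial line to all others."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph[node]:
            if neighbour not in depths:
                depths[neighbour] = depths[node] + 1
                queue.append(neighbour)
    others = [d for n, d in depths.items() if n != start]
    return sum(others) / len(others)

def integration_proxy(graph):
    """Lower mean depth means better integrated; return 1 / mean depth per line."""
    return {line: 1.0 / mean_depth(graph, line) for line in graph}

# A small axial map: line 0 crosses lines 1-3, which touch only line 0.
axial = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
print(integration_proxy(axial))  # line 0 scores highest
```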
The students' exercises concentrate on making the plan both functional and flexible, making it more permeable and responsive to the given input, also creating an organic urban tissue (the proposed grid can have an almost infinite number of combinations, much like the "ilot ouvert" concept), and on defining and integrating the architectural object (the building or buildings themselves) through the urban planning.
Students' Exercises
Project 1: Urban-development-proposal
Academic year: IVth, 2010, Architecture
Students: Dimitrie Stefanescu, Dragos Mila
Project Teachers: prof. phd. arch. Tiberiu Florescu, assist. urb. Sebastian Guta
Description: Research on the proposed project (Dimitrie, 2010) shows that the students' approach started from Chaos Theory. They explored possible future configurations for the study area concerned (450 ha, at the intersection of two main planned road-infrastructure extensions) and tried to push for a solution that would maximize the performance (density, height, shading, access to natural elements and connectivity) of the whole region while leaving ample potential for unplanned emergent evolution. Their goal was to envision an urban tissue that could flexibly respond to all local input factors as well as accommodate desired (planned) goals, creating neighborhood typologies based on local factors. A universal 130 m by 130 m grid was proposed and then deformed to differentiate and create a unique lattice that allows for surprise and yet is easily mapped due to its inner space-partitioning algorithm.
They iteratively cycled through several circulation analyses (using Depthmap) and continuously changed the main axial map of the area to maximize integration, using Rhino and its parametric plugin Grasshopper for the main work and Depthmap for the integration analysis.
“Our task on the project was to create several “islands” that we will connect on
multiple layers (peripheral roads, axial roads and pedestrian walkways). After the “islands” were established we proposed central public spaces for each one, spaces that
thanks to Dimitrie’s excellent programing skills (Grasshoper and Rhino) are distorting
the initial urban grid of 130 m by 130 m making it more permeable and responsive
to the given input, creating an organic urban tissue.We concentrated on making the
plan both functional (easy to walk and easy to drive from one part to another) and
flexible (the proposed grid can have an almost infinit number of combinations, quite
the ilot ouvert kind of concept).”
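The grid distortion the students scripted in Grasshopper/Rhino can be approximated in a few lines of plain Python. The sketch below is not their code: the attractor positions, fall-off radius and displacement strength are invented, and it simply pushes a regular 130 m lattice of points away from the central public spaces with a distance-based fall-off, which is the general kind of operation the quote above describes.

```python
# Plain-Python approximation of an attractor-based grid distortion.
# All numeric values and attractor positions are hypothetical.
import math

def make_grid(cols, rows, spacing=130.0):
    """Regular lattice of points at 130 m spacing."""
    return [(i * spacing, j * spacing) for i in range(cols) for j in range(rows)]

def deform(points, attractors, strength=60.0, radius=400.0):
    """Displace each grid point radially away from nearby attractor points."""
    warped = []
    for x, y in points:
        dx = dy = 0.0
        for ax, ay in attractors:
            d = math.hypot(x - ax, y - ay)
            if 0.0 < d < radius:
                falloff = (1.0 - d / radius) * strength
                dx += (x - ax) / d * falloff
                dy += (y - ay) / d * falloff
        warped.append((x + dx, y + dy))
    return warped

grid = make_grid(6, 6)                            # 6 x 6 lattice
public_spaces = [(260.0, 260.0), (520.0, 650.0)]  # hypothetical "island" centres
warped = deform(grid, public_spaces)
```

In a workflow like the one described, a distortion of this kind would be rerun after each circulation analysis, so the lattice stays easily mapped while remaining responsive to the public-space inputs.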
Project 1
Table 1
Project 1
Table 2
Project 2: REA Competition, Baku, Azerbaijan, Urban-development-proposal
Academic year: 1st year Master Degree - Urban Project, 2009
Students: Gavrila Silviu, Ioana Ichim
Project Teachers: prof. phd. arch. Tiberiu Florescu, assist. urb. Sebastian Guta
Project 2
Table 1
Project 2
Table 2
Project 3: REA Competition, University Campus, Bucharest, Romania
Academic year: 2nd year Master Degree- Urban Project, 2010
Students: Nasui Laura, Damian Alexandru
Project Teachers: prof. phd. arch. Tiberiu Florescu, assist. urb. Sebastian Guta
The project is one example of the exponential complexity of the system of the iconic computer circuit. The concept is a link between the idea of a university campus and a computer circuit: the urban structure is the inputs/outputs of the circuit, transformed into a network, now identifiable only through graphics.
Project 3
Table 1
Project 4: REA Competition, University Campus, Bucharest, Romania
Academic year: 2nd year Master Degree- Landscape and Territory, 2010
Students: Irina Pata, Petrisor
Project Teachers: prof. phd. arch. Tiberiu Florescu, assist. arh. Mihaela Hărmănescu
The project is an approach to a "paradigm shift": a territory crossed by university research: concepts, values, perceptions and practices are shared by the campus community, "which forms a particular vision of reality that is the basis of the way the community organizes itself" (Capra, Kuhn, 1996).
In keeping with this new conceptual framework, urban formation is now understood as an accumulation of spontaneous, non-sequential elements that overlap and fragment into integrated networks along with finance, migration, communication and resources, all of which evolve and mutate at precarious whim.
Conclusion
The true territory for innovation in urban planning, then, lies not in platonic solids, but rather in the design of operational strategies that deal with the multiple and overlapping forces of a highly complex and entirely uncertain "collective form" (Fumihiko, 1964).
Ben Van Berkel (1992) considered that “structures are changing today; they are
losing their specific separate properties and are defined more by how they relate to
the organization of the whole and how you relate to them; you zoom in to solids, you
fluctuate along evanescent distances, space opens up around you; any variety of mutations are possible, all unquantifiable, orderless, dimensionless, happening in a fluidium” .
References
Benjamin, Walter. (1996) The Work of Art in the Age of Mechanical Reproduction, 1936, pp. 217-252.
Ben Van Berkel (1992) monographs.
Capra, Kuhn. (1997) The Web of Life: A New Scientific Understanding of Living Systems, Anchor Books,
New York.
Cerasella Craciun. (2008) Metabolismul urban. O abordare neconventionala a organismului urban [Urban Metabolism: An Unconventional Approach to the Urban Organism].
Leong, Sze Tsung. (2001) "Gruen Urbanism", chapter in Project on the City 2, Harvard Design School, edited by Chuihua Judy Chung, Jeffrey Inaba, Rem Koolhaas, Sze Tsung Leong, Taschen.
De Landa, Manuel. (2000) A Thousand Years of Nonlinear History, London: Zone Books.
Fumihiko Maki. (1964) Investigations in Collective Form, St. Louis: School of Architecture, Washington University.
Leach, Neil. (2009) "Swarm Urbanism", AD: Digital Cities, July/August.
Lefebvre, Henri. (2004) Rhythmanalysis: Space, Time and Everyday Life, London: Continuum.
Pope, Albert. (1996) Ladders, Houston: Rice University School of Architecture; New York: Princeton Architectural Press, p. 32.
Thom Mayne, Stan Allen. (2011) Combinatory urbanism: the complex behavior of collective form
Culver City, CA : Stray Dog Café.
Project 4
Table 1
Schumacher, Patrik. (2006) Swarm Urbanism, July/August.
Marcuse, P. and Van Kempen, R. (eds) (2000) Globalising Cities: A New Spatial Order? Oxford: Blackwell.
Project 1: Urban development proposal: http://improved.ro/blog/2010/01/urban-developement-proposal/
Project 2: REA Competition Baku, Azerbaijan, urban development proposal: http://tibiflorescu.ro/showproject.php?proj=1263035539&cat=proiecte
Sally Stewart
Mackintosh School of Architecture
Glasgow School of Art
UK
Mapping the City:
the possibility of developing
a rich picture of place
through experiments
with conventional, digital
and stolen techniques of mapping
“One’s destination is never a place, but rather a new way of looking at things.” ( Miller)
Mapping the City is a short elective course on offer to postgraduate students across
Glasgow School of Art. We work together over a period of 15 weeks, normally through
eight two-hour sessions. Students come from a range of disciplines, and the course
itself could be considered to sit outside any particular discipline – it is constructed and
hosted by the Mackintosh School of Architecture but is not exclusive to architects.
A key factor in the design of the elective course was to identify an area of common ground, relevant and attractive to a range of disciplines, in this case the city, as a departure point from which to examine how the city figures in our individual and collaborative practice. The course offers the opportunity to consider ways of seeing, documenting and understanding place, a reasonable point of shared interest, and to understand how we are influenced by place and can in turn influence it. This is achieved through a series of weekly tasks or activities generating a portfolio of work.
Recent cohorts have included architectural students, environmental artists, graphic designers, painters, a graphic novelist, a sound designer and a sculptor, giving a wide-ranging group of participants from very varied academic, professional, cultural and social backgrounds.
Originally designed to be a series of activities that could offer a point of common interest and discussion, now in its second year it is moving towards something different and, I believe, more interesting: producing a rich picture of the city we encounter daily. As Katherine Harmon notes, "To orientate is to hop back and forth between landscape and time, geography and emotion, knowledge and behavior."
Structure
A series of activities and tasks is set, initially designed to find out what individual interests are, the types and variety of practices apparent in the student group, the range of techniques they are familiar with and use regularly, and the boundaries of their practice, if known.
The structure of the course is simple, with each session following a similar format. An introduction or brief lecture leads to a task that can be done there and then, or one set as homework for the following session. Each subsequent session then begins with a show and tell of the previous session's task and a discussion of emerging points.
Students are asked to consider that the work they show each session is in its final form. All that is required is to undertake the work as it comes up and then gather it in a folio for a final submission. The following guidelines are offered from the outset:
• Think of each piece as a final piece.
• Do it well once, don’t hedge your bets or rely on going back to it to refine it.
• Focus on the task in hand and put your whole energy in doing the task at the time.
• Become aware of what you can produce in the given time and take control of the
time given.
• The work need not be revisited – you set the goals.
• The discussion of work is just that – it is not an assessment – we all play an equal part in the discussion. Value what your colleagues say.
Having identified what students are interested in, their skill set and level of curiosity, the focus now shifts to providing space to explore emerging issues and shared practices – or practices that can be shared or appropriated but have not been up to that point. Students become familiar with the pace of the tasks and the fact that they are varied and stretch them in different directions. As each task has its own focus and context there is no hierarchy to them. Together they build to give the potential of a rich picture. It is up to the student where they take them as single tasks or how they conceive of them as a set.
In all, students undertook ten tasks, some short and sharp, like warm-ups within the sessions themselves, others requiring preparation, contemplation and more sustained engagement. It is clear through feedback that the mix in pace and types of tasks allowed everyone to find the reassurance of some form of practice that was familiar, as well as tasks they had never encountered before. The mix, and the open nature of the discussion – not a critique wherever possible – helped students to embrace the less familiar or unknown. The lack of pressure to revisit and constantly correct work in actual fact encouraged students to be more self-critical. It also appears to allow students to take more risks and be experimental rather than reliant on tried and tested methods and solutions. Some of the activities are individual and others require working in small groups or pairs. In either situation, each student must make his or her own record of the outcome for the portfolio. As many of the tasks were new to the students, either in form or in the way they were to be undertaken, their previous experience or performance was no guarantee of current success.
This session the course ended with a final presentation of each portfolio to the peer group. Students were asked to take control of publishing their portfolio on the wall, to consider whether there was an emerging theme within their work, and to reflect on what had interested them – not what they had got right or wrong. To do this some students revisited aspects of their work, reordered the sequence, downplayed some aspects and highlighted others. In effect they took control of how they wished it to be received.
Engaging with the Digital
The course does not presuppose the types of media students use in making work; the choice is open, depending on the techniques present in their existing practice or indeed those they may wish to begin to use. The activities undertaken have scope to be interpreted and realized in many different ways, and indeed some students use the opportunity to test new ways of making, documenting and capturing the activities and experiences the course provokes.
Being situated in an art school, programmes are predominantly studio-based if not studio-led. The digital now forms part of most students' working method, whether to research ideas or background information, to generate the work itself or to document, publish and disseminate it. It is rare, however, for a student to have no ability to sketch, draw or make by hand as an integral part of their practice. It is interesting, therefore, to see through an elective course such as Mapping the City, one drawing students from across the range of postgraduate programmes, how students use digital media and techniques and to what end.
Early on it became apparent that the first point of departure for many students on projects or tasks is the Internet search. Information is readily available, but the problem becomes how to sift and evaluate it, and how to become conscious of the sifting that has already been done on your behalf by the algorithm (many students were unaware of this). The source of information, and therefore its provenance, becomes a key issue. Questioning and interrogating information, opinions and facts became a concern for students, particularly where it was data they were coming across for the first time or, in the case of Glasgow, a context with which they were largely unfamiliar. Fact checking, seeking corroboration and researching more systematically, perhaps with some skepticism, have become more common and visible as the course goes on. If the information exists on a Google map it must be correct, mustn't it? As the students realise that it is not the tutor's role to correct their work, they have taken more responsibility for this themselves and even developed a lack of tolerance for peers who do not and who compromise in this. The issue is not one of getting something wrong, but rather of ensuring you have confidence in the evidence you build your arguments on. So information becomes triangulated, anomalies are exposed rather than suppressed, and robust methods of working are shared.
Another aspect of digital work, its finish or gloss, was also a cause of discussion. As our eye becomes more accustomed to the clarity and qualities of digitally generated images and drawings, we take this quality as an indication of authority and authenticity. As our use of, and reliance on, computers and the internet increases, it is easy to forget these are just tools and filters of reality rather than reality itself. Rather than with the digital, the issue lies with us and our expectations. In the case of work generated through the elective course, as students became more comfortable with the lack of a single prescribed type of output, and aware that different media were used in quite different ways dependent on the author's core discipline, there was more evidence of trying different forms of output depending on the nature of the task, or as a way of diversifying from their habitual media. This worked in both directions: one student who had little confidence in his own drawing skills and had been wholly reliant on manipulating images through Photoshop began to enjoy the freedom and the rough-but-ready nature of his own drawing, while others posted (and therefore published) video and images on the internet for the first time, buoyed by the feedback from complete strangers – a new form and experience of peer review.
An awareness of these issues, and the desire to respond to them, emerged through time.
Tasks and Provocations
Any attempt to map the city cannot rely on one approach alone. The task at hand is too complex, too subtle and too varied. As Roland Barthes notes, "This city can be known only by an activity of an ethnographic kind: you must orientate yourself in it not by book, by address, but by walking, by sight, by habit, by experience..." (Barthes, p. 36).
The task for the tutor is to imaginatively construct a sequence of activities that in their execution might bring the student closer to generating their own image of the city. The sequence offered through Mapping the City is not exhaustive, but subject to change, being added to and altered along the way depending on the success or otherwise of each preceding task. Success is measured in this case by the level of interest generated in the students and the work it provokes. In many instances, students, through thoughtful deployment of the digital, produced highly engaging and innovative responses to the problem set. The following are a selection of the activities undertaken and the resulting work.
Route through the city
The first session involves the group undertaking a guided city walk, starting at the School of Art itself and weaving through the grid-iron Blythswood area downtown, ending at the Lighthouse, centre for architecture and design, both starting point and finish being buildings by Charles Rennie Mackintosh. As the walk progresses we rewind Glasgow's history, traced through the surrounding buildings and public realm: the pavement, the facades and the skyline. Students were asked to make a map of the route, or of a similar one of their own devising. One student, through tracking Google Street View, discovered it was possible to undertake the route in its entirety purely using the still images this provided, and by combining them as a stop animation produced a film of the walk. The result became the first of a series of Vimeo posts (Haddad, J).
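As a rough sketch of the kind of assembly involved, and not the student's actual workflow, a script along the following lines could stitch a folder of saved street-view stills into a stop-motion clip; the folder name, file naming and frame rate are assumptions made for illustration.

```python
# Hypothetical sketch: turn a folder of saved street-view stills into a stop-motion clip.
# Assumes JPEGs whose alphabetical order matches the order of the walk.
import glob
import cv2

frames = sorted(glob.glob("streetview_stills/*.jpg"))  # assumed folder and naming
if not frames:
    raise SystemExit("no stills found")

first = cv2.imread(frames[0])
height, width = first.shape[:2]

# A low frame rate keeps the jerky, step-by-step feel of a stop animation.
writer = cv2.VideoWriter("city_walk.mp4",
                         cv2.VideoWriter_fourcc(*"mp4v"), 4, (width, height))
for path in frames:
    still = cv2.imread(path)
    still = cv2.resize(still, (width, height))  # guard against odd-sized frames
    writer.write(still)
writer.release()
```

The same stills, exported at a higher frame rate, would read as a smoother walk-through rather than a stop animation.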
Another student, when retracing the walk, became aware of the level of scrutiny by CCTV cameras, apparently everywhere. Their response was to produce a map of the cameras to highlight their extent and the level of surveillance going on. However, the map itself suggested a new route between the starting and end points where the walker disappeared from view, using lanes and alleys to achieve the same result, albeit below the radar.
Dérive / drift
Students were required to undertake a form of drift or dérive, the sense of which is prompted by a text by Guy Debord, beginning from a given but unfamiliar point in the city. In this form the drift has few but important rules. It is a collaboration in which the movement of the pair of drifters is not arbitrary but based on discussion and agreement. The resulting journey is documented with a single-use camera, with each change of direction being logged with one shot, and where no editing is possible. No other devices – digital cameras, phones and so on – are allowed, thus reducing the potential for constant editing, wayfinding and revising of image, direction and degree of disorientation. In effect it is a challenge to encounter the city by letting yourself get lost.
One couple discovered that they were unable to follow the route suggested by their surroundings from a starting point south of the river by Ibrox, an area of the city in flux and awaiting regeneration. The visible became the unobtainable as landmarks, with their power of suggestion and visual cueing, became cut adrift from the original street pattern and grain. Having mapped their journey on a contemporary map they then layered earlier versions, revealing the extent of change in the sense of the place as well as in the physical fabric.
Sequences and sequencing
Explorations were made into the nature of sequences of drawings, where activities dependent on a strict or defined order could be replicated by illustrating the key actions or points of transition. Working from the graphic cook-strips of Len Deighton, initial sequences tested activities as varied as dancing a salsa, making hot sauce or walking through a well-known place. Animation allowed the drawings or photos to come to life, given a pace and timeframe and the potential for looping, re-sequencing and reanimation. This offers one technique to revisit the city, or a fragment of it, and to revise its scenography and reshape the experience of moving through it.
Series
The potential for a series of images to provide a description of place was prompted by the woodblock Ukiyo-e prints of Hiroshige, with One Hundred Famous Views of Edo made between 1856 and 1858, and the Thirty-Six Views of Mount Fuji made by Hokusai in the seven years from 1826. More than just a succession of images, the two series are structured around a single rule or conceit – the former recording and celebrating the diversity of life and activity in a city through images selected for their opportunity to display such characteristics, a type of loose survey, the latter anchored by a single visual constant with an ever-changing foreground. Just as Hiroshige used this opportunity to play with composition and graphic quality, which provoked Van Gogh to repaint one view to better understand its structure (Plum Blossom in Kameido Park, no. 30, Spring sequence, One Hundred Famous Views of Edo, 1857, copied by Vincent Van Gogh as Japonaiserie: Flowering Plum Tree, 1887), so students were asked to define a series of twenty-five images which had a particular resonance with them, and to compose images in any medium that captured the qualities they were influenced by.
The results ranged from a distinctly non-digital series of rubbings exploring the surface quality of Glasgow stonework, by someone better acquainted with a brick-built city, to a series of images of the Newbery Tower, a part of the GSA campus and Glasgow skyline about to disappear, a record of something whose loss is imminent, as a test of where it can be seen from and what opportunities this framing device or rule allows the series' author to explore. The task allowed an extended or sustained engagement with the subject, the possibility of narrative or of suspense and surprise, cutting and editing, a cycle capable of shuffling or a redux. In presenting and discussing the work, the opportunity arose to consider whether or not to reconfigure or reorder the series, to re-hang it in response to comment and to any emerging hierarchy within it. Although one of the most straightforward tasks, the multiple proved one with potential for varied interpretation and purpose.
Narrative vs non-narrative
The narrative task involved the plotting or mapping of a city or city fragment described at second hand in a text, fact or fiction. Having warmed up through the preceding challenges, students provided many highly imaginative approaches to both the choice of text and the means of mapping. Some borrowed the banal to map the extraordinary, such as a SWOT analysis to map Paris as described in Hemingway's A Moveable Feast, or a paper sculpture capable of being endlessly reconfigured to follow the waxing and waning of Macondo as described in One Hundred Years of Solitude by Gabriel García Márquez. One recurring theme the students came up against is that narratives are seldom completely linear, and indeed many of the plots lent themselves to the non-linear narrative systems commonly found in game play and web design.
Conclusions & Observations
Having originally designed the elective, and with it now in its third cycle, I freely admit that it offers the chance for me to learn as much from the students as they learn from me or from each other. If we had to produce a common reflection on the work, a coda might be added to the title: hybridizing techniques and ideas.
As the discussion has evolved so has the process, with the result that our research cycle has evolved from the original one – task / show & tell – to include three iterations: task / show & tell / reflect / represent / reflect, used for the student's own personal folio and practice.
As for the digital, we are limited in its use only by the extent to which we can imagine how to apply it intelligently and innovatively. If we can remember this, then the human and the digital need not be in opposition.
Postscript
While in Chania this summer I heard an anecdote that seems an appropriate way to
end. I hope Gunnar Parelius doesn’t mind me sharing it.
“At a meeting of the Nordic Group in Lund Peter Kjær told me how he finds
his way when in Lund (he is an external examiner in Lund). He tries to go in
a (slightly) different direction each time and when sufficiently confused he
looks up to find the cathedral (visible from most places) and then can head
straight to the hotel.”
It seems to sum up our ambiguous relationship between allowing ourselves to get lost and needing to orient ourselves again.
References
Barthes, R. (1983) Empire of Signs. New York: Hill & Wang.
Debord, G. 'Théorie de la dérive', Internationale Situationniste, no. 2.
Knabb, K. (ed.) (1981) Situationist International Anthology. Berkeley: Bureau of Public Secrets. http://www.bopsecrets.org/SI/index.htm
Deighton, Len. (1979) Basic French Cookery Course. London: Jonathan Cape.
Harmon, K. (2003) You Are Here: Personal Geographies and Other Maps of the Imagination. New York: Princeton Architectural Press.
Miller, H. (1957) Big Sur and the Oranges of Hieronymus Bosch. New York: New Directions.
Haddad, Jerrick. Mapping the City Project 1. http://vimeo.com/20057458, last accessed November 2011.
Fig. 1
Still from Glasgow City walk as retraced through Google
Street View.
Jerrick Haddad, January 2011. http://vimeo.com/20057458
Fig. 2
Drift prompted by “Theory of the Derive”, starting point Govan Underground station recorded on
single use camera.
Sam Kollmeyer and Sunwang Myuang, February 2011.
Fig. 3
Sequence for making Hot Sauce, stills from stop animation.
Jerrick Haddad and Daniele Sambo. Posted on Vimeo. http://vimeo.com/20610163
Fig. 4
Plum Blossom in Kameido Park, no. 30, Spring sequence, One Hundred Famous Views of Edo, 1857 (copied by Vincent Van Gogh as Japonaiserie: Flowering Plum Tree, 1887).
Hiroshige.
Fig. 5
View of Fudo Falls in Oji, no. 49, Summer sequence, One Hundred Famous Views of Edo, 1857-59.
Hiroshige.
Fig. 6
Kinryuzan Temple, Asakusa, no. 99, Winter sequence, One Hundred Famous Views of Edo, 1856-57.
Hiroshige.
Fig. 7
Mishima Pass in Kai Province, no. 33, Thirty-Six Views of Mount Fuji.
Hokusai.
Fig. 8
Honjo Tatekawa, the timber yard at Honjo, no. 37, Thirty-Six Views of Mount Fuji.
Hokusai.
Fig. 9
A sketch of the Mitsui shop in Suruga, Edo, no. 11, Thirty-Six Views of Mount Fuji.
Hokusai.
Fig. 10
Kabuki at Night, One Hundred Views of New Tokyo, Takujosha – the 'On the Table Group', 1928-32.
Fujimori Shizuo.
Fig. 11
Red Gate of Tokyo University in Early Summer, 1929, One Hundred Views of New Tokyo.
Fujimori Shizuo.
Fig. 12
Ginza, 1929, One Hundred
Views of New Tokyo.
Kawakami Sumio.
Fig. 13
Subway, 1931, One Hundred
Views of New Tokyo.
Maekawa Senpan.
Fig. 14
Studies of the Newbery Tower, Glasgow School of Art, from a limited series of twenty images.
Daniele Sambo, March 2011.
Luigi Foglia
Renata Valente
Dipartimento di IDEAS Industrial Design, Ambiente e Storia,
Seconda Università degli Studi di Napoli
Italy
Rethinking the Public Space
for Urban Intensive
Performance Places
The Disciplinary Approach
The interest of the Technology of Architecture in the world of Information Technology has traditionally developed through the study of control over the entire building process – programming, design, construction and management – following the changes brought about by the development of dedicated tools, both hardware and software, and favoring reflection on the performance outcomes induced from time to time rather than on the formal impact on architectural products.
Another aspect characterizing this scientific branch of architecture has always been the consideration of the complex issues that combine to inform the design process, and therefore of the simultaneous presence of objective aspects, such as the terms of the context that the project will influence, along with aspects related to subjective perception. On the basis of this tradition, the interpretation proposed in our research for a good use of IT is the study of a computerized decision-making process through which the human element is given great consideration from the point of view of fruition and perception. We have for some time chosen the field of the redevelopment of urban open spaces, seeking directions for the sustainable design of equipment dedicated to the users of such sites.
We believe it is possible to have an adaptive architecture in which the human being's presence is more decisive and influential, as in our research the effects of components on the human habitat determine the appropriate design decisions. Our contribution starts from considering how the evolution of the urban environment, characterized by constant and increasing disorder, gradually erodes the specific role of open spaces, which have historically been the focus of public and social life and now undergo marked processes of abandonment. We believe that, in dispersed as in consolidated cities, it is necessary to reinterpret these places, in both semantic and environmental terms, so they can set up and drive new assets for the evolution of urban public life, thanks to the opportunities offered by Information Technology. Today it is up to new agoras to trace and materialize the stratification of the complex system of networks and connections (energy, environmental, social, economic) that constitutes the epicenters of the main hubs of a renewed urban environment.
In this context the re-appropriation of open public spaces offers the chance of a performance reinterpretation based on the study of new human needs. For this objective the contribution of Information Technology is essential, to systematize and successfully relate the different aspects of each phase, both of analysis and design and, in general, of the entire building process.
Proposal for an Applied Research Method
The opportunity to develop these considerations is the research entitled "The systemic integration of renewable technologies in the built environment", within which the local unit of the Second University of Naples studies experiments in urban voids, applying by inductive methods the interpretations of the public space framework proposed by Environmental Design and working with computer systems.
Because social and technological changes affect the way public open space is currently used, and the resulting usage requirements become more complex, the group is studying a place in Aversa (Caserta, Italy) where, in line with the international scientific literature, the urban space is read through the superimposition of layers expressing physical data, materials and uses, as well as mappings of physical and meteorological conditions. We believe that contemporary technological means can assure a value-based responsive architecture, as demonstrated by the use of GIS-type computer applications, which we would later like to be able to provide value-based selection criteria for design specifications.
Fig. 1
Example of diffuse solar radiation mapping for Piazza Diana in Aversa (CE, Italy) made with software
Townscope by R. De Martino (from A. Bosco, R. De Martino, "Percezione e riconoscibilità degli spazi
aperti urbani. Approccio metodologico e strumenti”, in Proceedings of International Conference
S.A.V.E. Heritage IX International Forum Le Vie dei Mercanti, edited by C. Gambardella, La Scuola di
Pitagora editrice, Naples, 2011).
To find vocations and priorities, we are working on a computer graphic map that would report, through the GIS, the contextual conditions along with the current needs of the users of the open space, on the basis of analyses carried out according to a survey protocol we are implementing. The needs are checked in the light of contemporary social conditions and technological uses, referring to experiences already carried out on a national scale. For this purpose we study the transformative dynamics of places, developing the analysis of kinetic assets related to times of day and seasons, to identify flows and profiles and to imagine architectural design solutions that can transform continuously, through the management of information systems, as environmental conditions vary. In this sense the possibility of an intelligent and responsive architecture also revolutionizes construction systems, which from static become dynamic and changing, thanks to proposals for convertible micro-architecture solutions. These even question firmitas through new approaches that do not prepare ready-made solutions only apparently free to be modified.
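As a purely illustrative sketch of the layered reading described above, and not the group's actual GIS model, two toy raster layers can be normalised and weighted into a single score per cell; the layer names, grid values and weights below are invented.

```python
# Illustrative only: combining normalised raster layers of an open space into one score grid.
# Layer names, values and weights are assumptions, not the project's data model.
import numpy as np

def normalise(layer):
    """Rescale a layer to 0..1 so layers with different units can be compared."""
    span = layer.max() - layer.min()
    return (layer - layer.min()) / span if span else np.zeros_like(layer)

# Toy 4x4 grids standing in for GIS rasters covering the same square.
solar_radiation = np.array([[3, 5, 6, 2], [4, 7, 8, 3], [2, 6, 7, 2], [1, 3, 4, 1]], float)
pedestrian_flow = np.array([[0, 2, 5, 8], [1, 3, 6, 9], [0, 2, 4, 7], [0, 1, 2, 5]], float)

weights = {"solar": 0.4, "flow": 0.6}  # hypothetical priorities from the survey protocol
score = (weights["solar"] * normalise(solar_radiation)
         + weights["flow"] * normalise(pedestrian_flow))

# Cells with the highest combined score would be candidates for concentrating
# convertible equipment; re-running with hourly or seasonal layers would let the
# configuration follow the time-of-day and seasonal cycles described in the text.
print(np.round(score, 2))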
In addition, the transition from the conception of open spaces as passive energy systems – terminal elements of the energy network – towards active and highly complex systems – nodes with flows in and out – leads us to rethink the relationships that develop between space and energy in the direction of a "dematerialization" of components, also through the use of Information Technologies. We believe we have to create self-sufficient hubs, inter-connected or off-grid, which meet the system requirements identified, in favor of integrability criteria of use and function.
By respecting the needs classes identified, the user is considered the heart of the design and management processes, and we propose to him or her the experience of a hybrid space (hyper-fruition) through IT support that crosses the traditional building process (programming, design, construction, use), making it possible to manage data related to uses, permanences and the occupation of space once interventions are implemented.
IT makes it possible to provide, alongside information processes, the automation of architectural spaces in which the user's physical and emotional interactions with place are not denied; perception and emotion are not compromised but supported.
The outcome of our research will be the development of a cognitive tool (GIS) that collects data both from existing software programs and from two new tools built by the research group: a smart protocol to analyze urban open spaces and a database format being tested for the identification of appropriate project requirements. The research team finally proposes an additional tool to support design decisions in which, through the Preliminary Design Document of Sustainable Urban Open Spaces, a new format for the design brief relieves the single professional of costly investigations. In this sense the contribution of information technology is intended to aid and direct design decisions, rather than to model architectural artifacts through software.
This is a chance to develop a new language, from the perspective of maximum energy efficiency, to enhance the integrability of languages and uses, configuring interactive, highly inclusive areas, real augmented spaces, rich in perceptual stimuli and interactive capabilities, suited to becoming core elements of the new urban contemporary and creating new social identifications.
Intensive Autonomous Performance Spaces
These strategic choices allow the creation of spaces that we have defined as intensive autonomous performance spaces, with new emerging qualities and extensible potentials, closely related to alternative and innovative ways of use, characterized by four main aspects:
- Fullest possible integration of existing equipment in volumes and surfaces, enhanced by assimilating capabilities and raising the performing possibilities, using the opportunities offered by IT;
- Convertibility of the same physical equipment, designed and tested through the application of IT models;
- Dynamics of these spatial configurations, powered by the integration of technologies for the production of energy from renewable sources;
- Performativity, providing a direct benefit, even of information, and through this causing a new participatory relationship between users and location, giving value to the relational outcomes in terms of induced actions.
We are studying the possibility of an architectural production receptive to the sensations of the human being who will inhabit it, through the project for micro-architectures in urban open space, aimed at creating conditions of cultural, social and emotional aggregation, real sensitive spaces, as we are convinced of the necessary political role of the project for public spaces in the city. This belief was reflected, for example, in the recent contribution of Pedro Gadanho (2011), which reminds us:
"Long before the public monuments of art were subsumed by the emergence of street art, performance art was in many ways the first edge to investigate the political dimension of the road. And its teachings – from Wodiczko's homeless vehicles to the structures illegally invaded by Matta-Clark, from Lucy Orta's body cabins to the urban occupations of Trisha Brown – are sources of inspiration for a generation that believes in transitional urban actions aimed at communities rather than enduring monuments to corporate powers." "The real architects must become not only versatile producers of culture and everyday programmers […] of the city. They must also become true experts of the road."
And still on the performative dimension (Gadanho, 2007):
“Thus, the notion of performance must also retain a cultural dimension. Accepting architecture as cultural production, its performative dimension must also contribute to a critical role, that is, to architecture’s capacity to produce commentary regarding the ongoing
transformations of culture and society.”
Our proposal for urban open spaces with intensive autonomous performance is based precisely on the premise of using advanced information technology to protect architecture from becoming a consumable, self-complacent object, exploiting information technologies instead for the diagnosis of needs, for the development of the corresponding system and project requirements, and for their inclusion among the benefits offered to users. The idea is based on performative spaces where special conditions of exchange and social inclusion take place, generating meetings and discussions and highlighting aspirations, cultural attitudes and values emerging from social life.
Our aim is to define scientifically the requirements characterizing these intensive autonomous performance spaces, and to feed the results into the complex study tool for Sustainable Urban Open Spaces. This reflection has led us to study how to prepare a database with criteria that allow continuous implementation and sharing of information relating to contiguous urban places, also to facilitate the redesign of the connective tissue of the city, understood as a network of eco-technological corridors (Catani and Valente, 2008).
The Definition of Requirements
This new way of considering open urban space implies a complex framework of requirements, referring both to the management and fruition needs classes identified for open spaces in the publications already consulted (Dessì, 2008; Valente, 2010) and to new parameters that we are interested in investigating. In fact, there is no case study typology identified as appropriate to describe such spaces in today's design and construction practice. We have therefore decided to refer to "features" or "peculiarities" of paradigmatic examples that have been helpful in constructing the concept of intensive autonomous performance spaces.
Here we present the part of the research in which we make the effort to extract concepts for the topic of interest and to point out new incentives for the project. In search of both descriptive and prescriptive definitions we have prepared an informatics sheet, useful for screening the projects chosen each time for some of the aspects related to our goal, so as to extrapolate useful features and build a sort of identikit. The latter is composed of a broad framework describing the complex and hybrid requirements we are seeking. We worked, therefore, on the study of the outcomes induced by the presented projects, trying to figure out what and how many levels of consequences they could determine.
The iteration of these operations, choosing projects based either on traditional or on innovative computing technologies, delivers an intelligent tool that can be extended over time, in which thematic searches can help both in identifying project directions for similar conditions and in checking the sustainability requirements of other, already built solutions.
The mosaic of features identified, properly arranged into families, already lends itself to some considerations: we can reflect on the recurrence of certain features or on the frequent absence of others, combining these observations with the environmental conditions determining such evidence and reasoning to deduce the right consequences in terms of need or inappropriateness.
The contribution of IT to architectural design in this case is to aid the construction of the concept, outlining the needs resulting from similar sites, their uses and users, and suggesting appropriate meta-design strategies. Moreover, the study of the requirements of this particular type of urban scenario starts from some intuitions, from an attachment to cultural beliefs about performance architecture and from the need for new sensory scenarios, also influenced by the theories of Andrea Branzi (2006). The experiment of using information technology thus moves towards obtaining new types of requirements through the application of fuzzy logic to the GIS system, which may suggest unpredictable strategies for the project and provide families of unexpected requirements by processing all types of data received on the site in question.
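By way of illustration only, the membership functions, thresholds and requirement labels below are invented and are not the research group's rule base; they merely show how a fuzzy reading of two site measurements can return graded rather than yes/no requirement suggestions.

```python
# Invented example of a fuzzy reading of site data; not the project's actual rule base.
def tri(x, a, b, c):
    """Triangular membership: 0 at a and c, 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ramp_down(x, lo, hi):
    """1 below lo, 0 above hi, linear in between."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

# Toy measurements for one open space (values are placeholders).
site = {"noise_dBA": 68.0, "shade_ratio": 0.25}

noisy = tri(site["noise_dBA"], 55, 75, 90)          # degree to which the site is "noisy"
exposed = ramp_down(site["shade_ratio"], 0.2, 0.6)  # degree to which it is "sun-exposed"

# Simple fuzzy rules (min acts as AND), producing graded requirement suggestions.
requirements = {
    "acoustic buffering": noisy,
    "adaptive shading": exposed,
    "shaded quiet niche": min(noisy, exposed),
}
for name, degree in sorted(requirements.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {degree:.2f}")
```

The point of the sketch is only that requirements emerge with a degree of pertinence rather than as binary prescriptions, which is what allows the unexpected families of requirements mentioned above to surface.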
Our response to the fading importance of the human being is therefore the proposal of intensive autonomous performance spaces, which induce positive social and cultural processes through performativity, rethinking an architecture critically responsive to user needs. In this sense the metaphorical image that we advance as a reference is the aid that artificial intelligence can give in stimulating and catalyzing natural urban intelligence (Pogoreutz, 2006) and the spontaneous creativity existing in the contemporary city.
Study for an Identikit
This section proposes a study in progress aimed at finding the distinctive qualities able to identify independent intensive performing spaces. The identikit structure is based on the compilation of a reading card for specifically selected case studies. The purpose is the identification of the output elements (features) that define the distinctive qualities able to describe the functioning, the developmental aspects and the potential of a new concept of urban open spaces, providing the designer with an effective supporting tool. The study adopts a reading method primarily focused on the outcomes induced by completed interventions. This is obtained by considering, in a synergistic way, architecture, place and modality of use. The resulting processes are then analyzed and related to each other through a feedback reading method, focused on the effects that architecture produces on the use of public spaces.
The case studies analyzed by means of the Reading Cards are carefully selected according to the parameters specifically identified and listed below:
- Completed works;
- Components of the urban open space;
- Variable set-up.
It is important that the selected case studies relate to completed works. In this way the real outcomes induced on places and on diversified ways of use can be monitored, by referring not to predictions based on projects but to real data. It is also necessary that case studies are selected from the components of urban open space (Valente, 2010) and based on their level of interaction with the environmental systems and subsystems in the public space (RUE Regione Emilia Romagna, 2006). The basic research target is the development of a model for innovative urban open spaces, characterized by multi-functionality and adaptability to the variability of the anthropic and environmental context, as well as by flexibility in use. For this reason it is appropriate that the analyzed works are also selected according to their adaptability and flexibility qualities, highlighted by a variable set-up. A variable set-up frequently involves the choice of the small scale, including experimental research on space. Small-scale interventions, according to Richard Horden, founder of the Micro Architecture group in Munich, offer suggestions for the definition of space/architecture identified by an "experimental and technologically advanced quality, variable, shaped and open, adaptable and recyclable for uses and materials" (2008).
Analysis Card Structure
This section illustrates the reading card structure for a synthetic organization of the knowledge and data coming from the analysis. The reading card proposed here represents a model for expeditious, but not reductive, investigation, which is still being tested. The reading card structure (Fig. 2) has a flexible and open nature, extensible and easily modifiable on the basis of future levels of detail. To ensure efficient reading, the card is organized on a graphical basis that can be immediately interpreted. Data and knowledge are organized in a hierarchical model in which the specific aspects are considered, grouped by classes and categories.
The first phase of the card is the Passport, which aims to provide a tool for the immediate recognition and location of the intervention. Here we present the elements able to give a synthetic-descriptive picture of the intervention and the elements useful for a quick overview. The parameters taken into consideration are:
- Identifying code, name, icon;
- Image-logo;
- Designers, place, year.
Fig. 2
Reading Card structure, composition and operability.
In a second level of analysis, Typology, we refer to the characteristics that govern the types of intervention. Here we aim to analyze the specific function of the analyzed project in relation to the environmental system in which it is placed. We refer to the classification applied in the Urban Building Regulations (RUE Emilia Romagna, 2006) of the municipalities of the Italian region Emilia Romagna, attached below (Table 1). We also aim to identify the categories of the elements and components of environmental micro-landscape design (Valente, 2010). The parameters considered here are:
- Intended use;
- Subsystem link;
- Type of components.
Then, in the Schedule section, the temporal aspects of the intervention program and the diversified uses are analyzed and related. The permanence times (Valente, 2010) and the period of use are therefore considered. In brief, the parameters considered here are:
- Permanence;
- Time of use.
In the Outcomes section a reflection is presented based on an innovative interpretation that does not derive from the analysis of the object itself but from the analysis of the system and of the relationships that develop between architecture, site and users.
Table 1
Environmental subsystems of public space list, RUE Emilia Romagna 2006.
Table 2
Table of the needs, requirements and performance indicators for urban open space. (1) in Dessì,
2007; (2) in ITACA Protocol, 2004; (3) in Valente, 2010; (4) by the author.
At this scale the types of public space users following the intervention are considered as a first indicator of levels of use. The performed needs are then analyzed, based on the model offered by UNI 8289 as applied to urban open spaces. Among the seven needs classes proposed by the standard, it was decided to focus specific attention on the classes Aspect, Usability, Comfort and Integrability (Table 2). In a next step the uses of space before and after the intervention are analyzed and related. This reading mode,
based on the list of principal uses detected in the public space (RUE Emilia Romagna, 2006), aims at highlighting the influence of the intervention on the way a selected area is used. The parameters considered here are:
- Principal users;
- Performed needs;
- Previous principal uses of space;
- Subsequent principal uses of space.
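As a reading aid only, the hierarchy described above could be transcribed into a plain data structure along the following lines; the field names simply echo the card sections listed in the text, while the types and example values are assumptions and imply nothing about the group's actual database format.

```python
# Sketch of the reading card as a nested data structure. Field names follow the card
# sections described in the text; types and example values are assumptions.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Passport:
    code: str
    name: str
    designers: List[str]
    place: str
    year: int

@dataclass
class ReadingCard:
    passport: Passport
    typology: Dict[str, str]          # intended use, subsystem link, type of components
    schedule: Dict[str, str]          # permanence, time of use
    outcomes: Dict[str, List[str]]    # users, performed needs, uses before/after
    features: List[str] = field(default_factory=list)  # qualities feeding the identikit mosaic

card = ReadingCard(
    passport=Passport("SF-01", "Storefront for Art and Architecture",
                      ["Steven Holl", "Vito Acconci"], "New York", 1993),
    typology={"intended_use": "gallery frontage", "subsystem_link": "street edge",
              "type_of_components": "pivoting facade panels"},
    schedule={"permanence": "permanent", "time_of_use": "opening hours"},
    outcomes={"principal_users": ["visitors", "passers-by"],
              "performed_needs": ["usability", "aspect"],
              "previous_uses": ["closed frontage"],
              "following_uses": ["seating", "display"]},
    features=["user-driven convertibility", "kinaesthetic involvement"],
)
print(card.passport.name, "-", ", ".join(card.features))
```

Keeping the card this flat makes the frequency and recurrence of features, discussed below, a matter of simple counting across a collection of cards.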
Table 3
Taxonomy of the features found in the analyzed projects.
In conclusion, the reading card presents in its lower part the Features section. Features represent the synthetic profile of the specific qualities detected in the case studies analyzed. Each identified feature defines a specific potentiality, or an opportunity for evolution, of the structure of urban open spaces towards an innovative model characterized by multi-functionality and adaptability to the variability of the anthropic and environmental context, as well as by flexibility in use. The combination of features detected in each case study finally composes the mosaic, organized according to a flexible and extensible logic, which describes the identikit for independent intensive performing spaces. In this incremental layout the resulting features are organized into different thematic groups. The result is therefore a mosaic able to describe the specific evolutionary characteristics and to monitor the frequency and recurrence of data.
It is precisely the frequency and recurrence of features across diverse examples from design practice that can identify the highlights, or constants, that are repeated several times. A double interpretation of the phenomenon can follow from the repetition of these constants. On one side we can highlight the relative quality status achieved in current design practice; on the other, a low frequency of data in certain areas may indicate aspects (probably because they are innovative or problematic) to be further developed in current design practice.
Projects and Developmental Perspectives
In this section we report the analysis worked out through the reading cards, applied to two specific case studies: the Storefront for Art and Architecture and the GreenPix Media Wall (Fig. 3).
Working with the artist and designer Vito Acconci, Steven Holl designed for the Storefront for Art and Architecture, a nonprofit gallery on Kenmare Street in Lower Manhattan, a concrete-board facade with pivoting panels used as doors, windows, seating and shelves in endless combinations. In this project Steven Holl explored an approach to design in which walls, floors and ceilings function as permeable membranes.
GreenPix (Zero Energy Media Wall) is a groundbreaking project applying sustainable and digital media technology to the curtain wall of the Xicui entertainment complex in Beijing, near the site of the 2008 Olympics. Featuring a color LED display and a photovoltaic system integrated into a glass curtain wall, the building performs as a self-sufficient organic system, harvesting solar energy by day and using it to illuminate the screen after dark, mirroring a day's climatic cycle.
Although both analyzed projects show shared characteristics and belong to the same category of urban open space components, the outcome reading of the case studies reveals aspects of a different nature. The architecture of the Storefront gallery, in its representation of an analogical process, implies a direct involvement of users in the modifiability of its shape. The mechanical modifiability directed by the users, in satisfaction of their needs, involves a real physical and emotional engagement. The user, in fact, starts a deep sensory relationship with the building that involves all the senses. The non-autonomous sensitivity of the building is manifested in the adaptation of its mechanical structure by the users. It is basically the user who decides the change of shape, and the building supports it.
Fig. 3
Analysis of two selected projects using the Reading Cards.
In the GreenPix Media Wall, interaction occurs primarily through a visual flow of light and colours. The intelligent facade captures solar energy during the day and then reuses it to change its layout constantly. The level of modifiability shows itself in the two dimensions of the front plane, which interacts emotionally with users through a continuous, unidirectional flow of graphic information. The user does not interact directly with the building. The facade excites, communicates, tells of events, casts images and describes its level of use. The exchange occurs with the external environment and indirectly with the users inside the building. In this project, whose functioning rests on information technology, the involvement of the user is not kinaesthetic but visual. Consequently, the level of interaction can only be partial.
The resulting reading shows in this case that levels of interactivity are not directly related to the amount of technology in design and management processes. The key to understanding the phenomenon lies mainly in the mode of integration between architectural space and electronic content. To make space intelligent it is necessary to give the machine a body, or to provide spatial intelligence. It is important that the new concept of computerization replaces a classical-analytical logic with a logic of the senses: a concept of physical space, a space able to feel its own changes, to show contextual sensitivity and to react with appropriate behaviours.
What we are describing here is the current level of progress of an analytic and inclusive model which, though still being tested, shows a strongly developmental character. Given its organizational structure, the reading card proposed here can be used to support the elaboration of a computerized system for hierarchical searching, which can be a useful interface tool between the world of production and the world of the project. According to the characteristics analysed, this model describes an exportable process for a computerized reading of data by means of Information Technology. We believe it is important to outline a process based on complex models which can nevertheless take into consideration developmental logics linked to the unforeseen (fuzzy logics). This is why we think it is crucial to support the human component within the process described here, so that it can intervene and settle the organization of the computerized logics for the reading/design of intensive performing urban open spaces.
Acknowledgments
This paper refers to the Research Work PRIN08 entitled “The systemic integration of renewable technologies in the built environment”, Research Program of National Main Interest, National Coordinator G. Scudo. The Local Team Unit of Second University of Naples consists of Mariarosaria Arena,
Antonio Bosco, Raffaela De Martino, Luigi Foglia, Renata Valente and is led by Sergio Rinaldi.
Renata Valente has edited paragraphs from 1 to 4, Luigi Foglia has edited paragraphs from 5 to 7.
References
Bauman, Z., 2008. Modernità liquida. Bari: Laterza.
Branzi, A., 2006. Modernità debole e diffusa. Il mondo del progetto all’inizio del XXI secolo. Milano: Skira.
Catani, M., and Valente, R., 2008. Intorno alle autostrade urbane. Around Urban Highways. Firenze:
Alinea.
Dessì, V., 2007. Progettare il comfort urbano. Napoli: Sistemi Editoriali
Gadanho, P., 2011. Ritorno nelle strade ovvero l’ascesa della Performance architecture, Domus,
950, pp. III.
Gadanho, P., 2007. Architecture as Performance, Dédalo, 02.
Holl, S., 2000. Parallax, New York: Princeton Architectural Press.
Horden R., 2008. Micro Architecture: Lightweight, Mobile and Ecological Buildings for the Future.
London: Thames & Hudson.
Iacovoni, A. and Rapp, D., 2009. Playscape. Melfi: Libria.
Palumbo, M. L., 2001. Nuovi ventri. corpi elettronici e disordini architettonici. Roma: Testo&Immagine.
Pogoreutz, M., 2006. Urban Intelligence, in: Haydn, F. and Temel, R., eds. 2006. Temporary Urban Spaces: Concepts for the Use of City Spaces. Basel: Birkhäuser, pp. 75-80.
Regione Emilia Romagna, 2010. Regolamento Urbanistico Edilizio (RUE).
Sennett, R., 1992. The Conscience of the Eye: The Design and Social Life of Cities. New York: Norton.
Valente, R., 2010. Environmental Design. Napoli: Liguori.
(Re)thinking a Critically
Responsive Architecture
Christian Drevet
Ecole Nationale Supérieure d’ Architecture de St Etienne
France
Form
and Information
Contemporary architecture has undertaken a production of projects and realizations that re-question the traditional values of the discipline concerning the question of form in relation to space, time, material, scale, place, the artificial, language, geometry and composition, with new preoccupations such as complexity, plurality, movement, instability, fragmentation, peculiarity, immediacy, ubiquity...
Indeed, times have changed:
- Classical times were circular and turned towards a sacred past. Architecture, led by the metaphysical "humanities", obeyed motionless orders and predetermined figures. Form was then frozen forever.
- Modern times were oriented towards an ideal future. Architecture, seeking perfection, change, progress and fulfillment, obeyed dogmas of purity and universality. Form was then crystallized.
- Contemporary times unfold continuously in the present, in a short-lived immediacy. Architecture, without the symbols of the past or the utopias of the future, deforms, decomposes, deconstructs, transforms, surpasses and creates permanently, without finalizing or even capitalizing. Form remains open and suspended.
The deep change in this new "Gestalt" is doubtless the end of the "Great Histories", of the figures and the dogmas, and the passage from a transcendent working plane for architecture to an immanent one: the end of the sacred past and of the ideal future, which are very similar, in favour of the emerging present; a kind of existential pragmatism.
Informative Society
The loss of references and models, whether past or future, learned or popular, conceptual or traditional, opens the passage to "the indefinite". It opens onto the knowledge of "what arrives", as Paul Virilio says, that is, the current events of the present that feed action and creation.
So it opens onto indispensable information.
This radical but progressive evolution of individuals' behavior towards the world does not coincide, as one might believe, with the development of digital technology, but appears much earlier.
The informative society indeed appears concomitantly and interdependently with the second industrial revolution. It is the density and complexity generated by the spectacular increase in the volume of production of goods, the acceleration of material and immaterial exchanges and, finally, the beginning of economic globalization on the American model that produced the necessity of a new mode of management for the enormous quantities of data and information generated by the industrial and commercial society.
The social models inherited from the sacred past or the idealized future, offering a simplified and pure vision, seem outdated and no longer in phase with reality and everyday life. It is this coincidence of the hard and the soft, of the economic and the existential, which makes the advent of a society based on information irreversible.
It is thus the informative society that made possible and generated the digital era, and not the opposite.
However, the digital revolution definitively associated communication with information by becoming one of its major vectors, because it is only through its circulation that information acquires a real social impact.
Moreover, in French «computing» is called «informatique».
In the contemporary world, the couple information/communication, with its signs and its codes, is going to become a new human condition, as important as gravity.
We may affirm today that classical space and modern space have been surpassed by informational time-space.
Information has become an essential component of spatial disciplines such as architecture and the urban environment.
Information has many forms: knowledge, data, education, entertainment, advertising, gossip, blogs and so on...
Information is at the same time the genetic code, the context and the circumstances of space production.
Information is so important that it is in itself an esthetic challenge.
The abandonment of certainties and previous orders makes information indispensable for a vision of the world and for action.
Information and its first consequence, supra-information, simultaneously temporal and plural, produces:
- on the one hand, new complexity, instability and indecision;
- on the other hand, a great interaction of impulses and stimuli promoting a sort of existential reactivity which, associated with the loss of inhibition and disobedience to previous orders, could generate creativity.
It is important to notice that the appearance of the "informative society" coincides with its de-sedentarization under the influence of accelerating mobility. The informative society is thus, in a way, nomad-like, and confirms the Deleuzian shift from "striated space" (espace strié), like the urban grid, to "smooth space" (espace lisse), like the ocean. In "striated space" navigation is coordinated; in "smooth space", navigation is surfing.
Information and information exchange, according to Norbert Wiener, seem to be an alternative to the entropy of the social order, upsetting the ontological status of the human being, just as, according to Paul Virilio, information constitutes the only link between fragments.
The Form in Informative Society
In this new deal, and in the realm of architecture, the relation between form and information appears very simple: "The form is determined by the information, the data coming from the environment where they are", as Greg Lynn says. The space of conception is imagined as a field of forces and not as a neutral receptacle for architectural objects. Form is determined by information just as, in the living world, it is determined by genetic codes such as DNA. Perhaps that explains the undeniable analogy between the new architectural forms and biological ones.
Form then arises from the materialization of the field of forces, which are nothing other than information and data. It is thus this field which "in-forms" the architectural design and gives it its form. Moreover, etymologically, to inform means to give form.
The attention of the designer has shifted from the architectural object to information, and his interest now bears on the process of morphing rather than on purely formal or spatial qualities, which thus remain open and increase their capacity to produce new experiences for the users.
The new deal is: morphogenesis rather than morphology.
In any case, the outcome is the same: informational tools are used as morphogenetic devices without dealing with form directly, without the "skilled hand", without style, without figures, without typology, without aesthetic orders, without symbols. That is surely, in a way, the immanence of form.
The awareness of uncertainty, the loss of totality and the fact that information is momentary, in perpetual becoming, finally lead to what we call "no form", "formless" or informal. Informal must be understood as a form which has lost purity, simplicity, symbolism, sense and signification; a form which is not completed but becoming, a form one cannot recognize.
Thus, in a certain way, "informational times" lead to "informal spaces".
In fact, things are more complex and not always so direct; contemporary architecture presents a multiplicity of alternatives interpreting this relation between information and form. Synthesis being impossible, let us proceed by stakes and values, or "plateaus".
Writing and Differ(a)nce
The question of the preliminary information of the form, and of its consequent production, automatic or "machinic", from a driving code, leads many designers to resort to a kind of preliminary "writing" of the project that determines its morphing in advance, in an almost natural way. They draw on the philosophical reversal operated by Jacques Derrida with the concept of "differ(a)nce" between writing and saying, which establishes that the first precedes the second, contrary to all classical metaphysical thought.
On the question of “writing”, we can quote, for example:
- The "textual" matrix preceding the form for Zaha Hadid. The matrix is an organizing fabric.
The textual thought establishes new relations between the forms and their environment, which constitute a kind of landscape and geography. This approach comes along with an overlapping of fields of flows and of signs. The tramway terminal station in Strasbourg is a clear illustration.
- The "morphing vectors" which generate the "spacing", in analogy with Derrida's "writing" before the design, for Peter Eisenman.
The architectural form is envisaged as a moment in a flow, considered as a cut or a freeze-frame congealing an unstable geometry. Answering, in a way, to neo-nomadism, the form is inseparable from the field of forces which generates its geometrical mobility. For lack of being really mobile, the form is said to be "animated", as Greg Lynn calls it.
These co-present cuts can constitute a "track" of the trajectory and of the successive geometrical transformations of the shape. The track then plays with the question of writing and constitutes an abstract transparency, or a "phenomenal transparency", in which the meaning lies in the generation or the "structure of the shape". The research on new architectural dialectics in his houses illustrates this question perfectly, in particular Houses 2 and 3.
Fig. 1
Hoenheim Nord, Zaha Hadid.
- The pattern of the "lines" of the places and the construction of the "Micromegas" with these lines constitute the "writing" of the formal design for Daniel Libeskind: lines of motion, lines of view, lines of intention, lines of force, lines of desire.
The lines of the Micromegas restore the force of the invisible upon the visible. For example, in the Jewish Museum in Berlin, there is a straight line and a zigzag line. The straight line represents what was lost and the zigzag line the chaotic course of the Jewish people. In the same way that Egyptian hieroglyphs are made by the intersection of lines, as in the "key of life", Daniel Libeskind's architecture establishes almost directly a writing.
- We could speak in the same way about the lines of movement which give the form in Enric Miralles's designs, whose action of "combing flows" constitutes an essential component of his work.
Fig. 2
House III, Peter Eisenman.
Fig. 3
Micromegas, Daniel Libeskind.
Diagramisation
Diagrams are instruments invented by the sciences and used for the compression of information; they thus quite naturally find their place in the manufacture of the "informative form". Statistical boards or schematic images contain more information than pages of words. They also allow one to grasp the detail and the general at the same time on the same document and, in doing so, to forget the hierarchy which lies at the origin of any spirit of composition.
Diagrams have thus become an approach to design.
The diagram is not a metaphor or a paradigm but an abstract machine which is at once content and expression. The diagram avoids the return to typologies, slows down the apparition of signs and delays the fixation of the image. The diagram puts specificities and organizational systems in relation, connecting the micro and the macro.
In the register of “diagramming”, we can quote for example:
- The whole work of UN Studio, and especially their project "Rubber Mat" in Rotterdam, which presents a diagramization associating ground value, situation, built density, density of activity, increase of business and quality of view, crossed with four axes of programming: living room, work, fun, landscape. This diagram allows a mobile organization of the future district without trying to make architecture in an academic sense.
Fig. 4
Rubber Mat, UN Studio.
- The informative data of the Dutch Randstad, which produce the new urban horizons of the data town and its "datascape" for MVRDV, in a pragmatic sublimation connecting the "moral" and the "normal". It is impossible here not to mention the Silodam building, in which the diagram of the distribution of dwelling modes and the architectural façade are one and the same thing.
- The architecture of "relative neutrality" for OMA, which questions the programmatic information and transforms it into strategy rather than into definitive design or signature, as, for example, in the Seattle Central Library.
Surface
The support of information is the surface: the sheet, the picture, the matrix, the panel, the screen, the image, the tablet.
So, as Jean Nouvel declares, "Spatiality is not crucial any more; the tension between spaces and objects is recorded in the surfaces, in the interfaces". The thickness and depth of the Beaux-Arts are replaced by the surface, by multiple surfaces. That is postmodern superficiality.
The "hypersurface" accounts for the convergence between cyberspace and architecture, transformed into a surface of projection.
Surfaces no longer define space by enclosing it. They generate it as a series of layers which follow their inflections. Space is determined by the undulations of the various planes that compose the project.
To illustrate the notion of surface, we can evoke for example:
- The Cartier Foundation in Paris by Jean Nouvel, which oscillates between a three-dimensional body and a decomposition into surfaces and interfaces, re-questioning the relationship between inside and outside and producing a blurring evanescence between the sky, the building, the garden, the trees and the boulevard Raspail.
Bi-dimensionality leads to the "hyper-perspective", the opposite of the classical perspective with its single baseline, single point of view and motionless mono-vision. It gives the absolute perspective, the "airport effect".
- The Yokohama Terminal by Foreign Office Architects establishes a real interface between the port and the city, between the inhabitants of Yokohama and the foreign visitors, between the local and the global: an interface allowing one to pass from one system of codes and information to another, constituting a mechanism of soft integration, a tool of "déterritorialisation / reterritorialisation", a continuous form in a passing mille-feuille.
The surface of the ground folds on itself to produce and contain the progress of everyone through a public place, from the local inhabitant to the foreigner, from the stroller to the businessman, from the Peeping Tom to the exhibitionist, from the actor to the spectator.

Fig. 5
Silodam, MVRDV.

Fig. 6
Fondation Cartier, Jean Nouvel.

Fig. 7
Yokohama Terminal, FOA.
Ubiquity
Under the push of information and communication technologies, architecture becomes an interface between reality and unreality, between the potential and the actual, between the physical and the virtual, confirming the Deleuzian ubiquity between the couples Body/Nature and Brain/Information.
Toyo Ito likewise thinks that there is a virtual body, and its consciousness, which coexists with the real body and its sensibility. "The architecture of the wind", as he names it, consists of the meeting of these two bodies and allows one to pass from a 3D existence to an image.
The Sendai Mediatheque is one of these points of relation between the body of primitive man connected to nature and the consciousness of modern man who takes part in the electronic world. Between the "tubes", where all the flows cross (structure, light, ventilation, circulation, information), settle the new supports of human ubiquity: the "fields", which are interfaces between the natural and the artificial, the material and the immaterial.
This ubiquity is not stable, because our experience of the world is evolving under the effects of virtuality. The immateriality of pixels will doubtless modify our perception of the materiality of atoms, and why not our sensation of gravity. We may notice, for example, how disturbing the use of a travelator is, followed by the return to normal walking.
The Fall of Tectonic and Scale
Tectonics and structure no longer appear able to drive the project and to produce the form. Structure is in effect closely tied to the former typologies and finally appears as a closed, predetermined and hierarchical system of composition.
Fig. 8
Sendai Library, Toyo Ito.
The apparition of the form on the screen, resulting more or less directly from the treatment of coded information, requires no technical pre-sizing or structural "recipes". The same screen, with the technical evolution of construction, is capable of realizing almost anything afterwards. Constructive truth and high-tech henceforth belong to the past.
Given that tectonics and structure contributed greatly to the definition of the sizes and dimensions of the edifice, this also produces a loss of scale.
We may thus ask:
- What becomes of scale in the "smooth" Deleuzian world of mobility, when scale is, a priori, an instrument of the "striated" world?
- What does scale mean in a surpassed geometry of flux and movement, such as "draperies", "blobs" or "interlacings"?
- Is there a particular hope for scale in blurring limits and losing totality? Could a scale be "becoming"?
- Do formlessness and the morphogenetic posture use any scale at all?
We can also observe different postures in contemporary works, such as:
- Steven Holl or Zaha Hadid using fractal devices in place of human scale, in a sort of "multi-scale production".
- Enric Miralles using "rhythmic repetitions" of analogical forms such as fish, octopus or hair.
- Herzog & de Meuron using "self-similarities" instead of scale in order to singularize objects.
- Peter Eisenman exhibiting the idea of scale itself in each architectural experience.
Fig. 9
Amsterdam Offices, Steven Holl.
Architectural Patterns
Reality can be interpreted in terms of information and "cybernetics" which, according to theorists such as Norbert Wiener, regulate the relationship between object and subject. The informative processes, such as systems, input, feedback, entropy and circulation, are conceptualized through visual schemas or organizational and phenomenal "patterns". These patterns, which are considered the direct visual translation of the flows of information, curiously find their formal echoes in nature and in the living world, at nano, micro and macro dimensions alike.

Fig. 10
Informational Patterns.

Fig. 11
Natural Patterns.
In a world that has lost its figures, these "patterns" often become design patterns for architecture. They are used more as motifs or themes than as abstract models.
By analogy, we can think of the use of the arabesque by the Mongols: it served at once as an expression of their tortuous mentality, as a battle plan of feigned retreat and counterattack, as ornament on their carpets, and as the pattern for their famous double-curvature bow.
These design patterns can be used as a mental structure for the project, as ornament on the surfaces, or as texture for the envelope. They create another relationship with informative and digital architecture.
We may mention here as an example the Mercedes-Benz Museum designed by UN Studio, where the clover-like organization of the plan evokes an information pattern.
We may also mention the DZ Bank in Berlin by Frank Gehry, which uses baroque "draperies" as a formal device revealing a particular state of mind between object and subject.
We may also mention the Jeddah Airport by OMA, which uses patterns as motif and ornament.
Fig. 12
Mercedes-Benz Museum, UN Studio.
Fig. 13
DZ Bank Berlin, Frank Gehry.
Fig. 14
Jeddah Airport, OMA.
Complexity
The informative society appeared, as we have seen, in order to manage complexity, and its digital vectors of processing and communication have reinforced this capacity, allowing and producing formal answers so complex that man cannot even imagine or conceive them himself. He can just watch the forms being made.
The more unexpected and unpredictable information is, the more effective it is; the informative form is in wild quest of the unexpected and the unpredictable. As Zaha Hadid said, there are numerous possible geometries, so why limit oneself to only one?
Beyond this, several major concepts appear with information complexity and help designers to move beyond "straight" Euclidean geometry:
- The philosophical concept of the "fold", which supplied an alternative to fragmentation and chaos.
- Algorithmic and parametric calculation, which allows "inconceivable" forms to be produced, created through experiments and successive modifications (a minimal sketch follows this list).
- The recourse to "topological peculiarities" on the borders of geometry, such as, among others, the Moebius strip, the Klein bottle, Seifert surfaces, or Bézier curves.
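To make the parametric idea above concrete, here is a minimal, purely illustrative sketch in Python of a form generated from a handful of parameters: a Bézier curve evaluated by de Casteljau's algorithm. It is not drawn from any of the cited designers' tools; the function names, control points and sampling are assumptions made for the example.

# Minimal sketch: evaluating a Bezier curve by de Casteljau's algorithm.
# Control points and function names are illustrative only.

from typing import List, Tuple

Point = Tuple[float, float]

def lerp(a: Point, b: Point, t: float) -> Point:
    """Linear interpolation between two points."""
    return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)

def de_casteljau(controls: List[Point], t: float) -> Point:
    """Evaluate a Bezier curve of any degree at parameter t in [0, 1]."""
    pts = list(controls)
    while len(pts) > 1:
        pts = [lerp(pts[i], pts[i + 1], t) for i in range(len(pts) - 1)]
    return pts[0]

# Four control points define a cubic curve; moving any one of them
# re-generates the whole geometry.
controls = [(0.0, 0.0), (1.0, 2.0), (3.0, 3.0), (4.0, 0.0)]
curve = [de_casteljau(controls, i / 20) for i in range(21)]
print(curve[:3])

Moving any control point, or feeding the points from site data, re-generates the whole curve without the form ever being drawn by hand; this is the sense in which information, rather than the hand, produces the shape.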
Under these conjugated effects, the form can then appear as a natural "merging", such as defined in nanotechnology.
The possible illustrations are very numerous here:
- The relaxation park in Torrevieja by Toyo Ito, with its Bézier curves.
- The music faculty in Graz by UN Studio, based on the Klein bottle.
- The Littoral Park in Barcelona by FOA, with its parametric field production.
Performance
Informative tools accompany the process of socialization of information towards an open system, organizing the world on the basis of open knowledge and its activation in the physical world. They accompany the management of the complexity of the real.
This is the "performative design process", seeking a sort of meta-optimization as responsive as the natural world, using parameters, codes and algorithms. This "intelligent architecture design" explores especially the realm of use and material. It comes from new engineering fabrication. One of the most famous examples of a performative design process in engineering is perhaps the American F-117 stealth fighter, operating mainly at night. The only question we can ask of it is: why does it look so much like a bat, without any prior intention on the part of the engineers?
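As a purely illustrative sketch of what such a parameter-driven, performance-seeking loop might look like (the shading model, parameter and target below are invented for the example and are not taken from any project or tool cited here):

# Minimal sketch of a "performative" search: a single design parameter is
# adjusted step by step until a simulated performance measure meets a target.
# The shading model and all numbers are illustrative assumptions.

def daylight_factor(fin_depth_m: float) -> float:
    """Toy performance model: deeper shading fins admit less daylight."""
    return max(0.0, 5.0 - 4.0 * fin_depth_m)  # percent, purely fictional

def search_fin_depth(target_df: float, step: float = 0.01) -> float:
    """Increase fin depth until the daylight factor drops to the target."""
    depth = 0.0
    while daylight_factor(depth) > target_df and depth < 1.0:
        depth += step
    return round(depth, 3)

print(search_fin_depth(target_df=3.0))  # fin depth that meets the toy target

The designer sets the target and the parameter space; the loop, not the hand, finds the dimension that satisfies it.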
Nevertheless, we must note that the proper use of scientific tools has abandoned pre-determined approaches and turned towards inductive and experimental aptitudes.
We may mention here the trend towards morphogenetic processes, self-organization, participation and adaptation for constructing a more natural world. Technology appears to be a sort of second nature.
After that, performative architecture can become, in all seriousness, a real performance, as it were a real event for the town, as the Situationists and Bernard Tschumi claim.
At that moment, architecture will become information itself.
Post Modern Creativity
Simultaneously with this performative use, informational tools are diverted by architects from their scientifically determinate spirit and used beyond their limits, "by accident" and as games, according to deconstructivist principles.
In this way, technology, and especially the digital tool, develops what we call "serendipity": the creative intelligence of discovering by chance, without looking for them, new answers in contemporary space-time.
Serendipity appears more and more to be a human intellectual reaction in the face of complexity and imposed structure.
Informational tools are a language; they can therefore be used as a game, a "game of language", a game of words or, further, a game of grammar.
These games constitute, in a way, the art of writing or speaking information well, as if it were a rhetorical corpus with new figures such as, for example, accumulation, hybridization, assemblage, montage, collage, anamorphosis, disjunction, inversion, repetition, trope, randomization...
In this direction, we can quote for example:
- The hybridization and morphing of UN Studio, which open onto new creativities in answer to the flows of information and give birth, for example, to "Manimal", a "photoshopped" crossbreeding between a lion, a snake and a human being. The hybridization of two common things does not necessarily give a common thing, but can create an exceptional one.
- The random use of the magic square in the Serpentine Gallery Pavilion by Toyo Ito.
- The de-contextualization of objects of daily use and the alteration of scale, largely helped by the machine, which constitute a great part of the desire produced by Frank Gehry's works.
- Informative tools have certainly been a great help in continuing the work on "repetition and difference" for Peter Eisenman, especially in the Berlin Memorial.
New Materialities
In the same way that the postmodern world abandoned typologies and models in trying to create "other" forms, it has abandoned traditional and identified materials for innovative and unrecognizable ones, made in the composite and intelligent mode. The informative society and its Trojan horse, the computer, have in any case considerably changed our experience of materiality.
Fig. 15
Serpentine Gallery, Toyo Ito.
Fig. 16
Memorial, Peter Eisenman.
The zoom effect, on both the visual and the sensorial plane, has widely modified our perception, and the practice of the mouse has brought new sensibilities to the material, which can become what we desire, when we want.
The supremacy of the surface has often transformed the material into a new sort of ornament, raising new questions about authenticity. The new materialism is situated at the intersection between, on the one hand, the abstract, appealing to signs or codes of information, and, on the other hand, the "ultra-concrete", based on a sharp perception of physical phenomena and their properties.
Architecture still has very limited access to advanced materials and, most of the time, tries only to re-order the materials of modernity: steel, glass and concrete used in panels.
Fig. 17
Forum 2004, Herzog & de Meuron.
Let us not forget that the main material of Rem Koolhaas's "Junkspace" is placoplâtre, plasterboard.
On the question of material, it is doubtless necessary to quote the work of Herzog & de Meuron, who confuse form and material. The Suva Bank in Fribourg or the Ricola warehouse in Mulhouse show the search for "ambiguity", but at the same time offer the potential for new experiences of reading. The association of the visual and the conceptual, in the thought of Rémy Zaugg, tries to reconcile the sensitive and the intelligible.
Informative Ambivalences
Informational tools reveal one of the deepest ambivalences of digital culture, which is the universality of the object and the peculiarity of the subject. Indeed, while ensuring mastery and control of complexity, they open onto freedom and individual choices.
Informative tools ensure the dialectic connections between the predictable and the unexpected, standardization and peculiarity, the organic and the inorganic, the performative and the sensitive.
Informative tools thus have a great capacity to associate the opposing values of the postmodern age.
Architectural design is an act of creation which has always been driven by different backstage thought, for example the humanities, engineering or romanticism. Each time, this has, of course, different consequences for the manner of conception.
The present one, information-driven architecture, has considerable consequences for the cognitive strategy of architectural design.
References
Lynn, G. (1999). Animate Form. New York: Princeton Architectural Press.
Saggio, A. (2008). The Revolution in Architecture: Thoughts on a Paradigm Shift. Rome: Carocci.
Speaks, M. (2005). After Theory. Architectural Record, June issue. New York: McGraw-Hill.
Rahim, A. (2006). Catalytic Formations: Architecture and Digital Design. London: Taylor & Francis.
Lynn, G. (1998, 2004). Folds, Bodies and Blobs: Collected Essays. Brussels: La Lettre Volée.
Zellner, P. (1999). Hybrid Space. London: Thames and Hudson.
van Berkel, B., Bos, C. (1999). Move. Amsterdam: UN Studio and Goose Press.
AA.VV. (2004). Emergence: Morphogenetic Design Strategies. AD Profile 169.
Kolarevic, B. (2003, 2005). Architecture in the Digital Age: Design and Manufacturing. New York/London: Taylor and Francis.
Gibson, J.J. (1963). The useful dimensions of sensitivity. American Psychologist, 18, 1-5.
Picon, A. (2010). Culture numérique et architecture. Birkhäuser.
Virilio, P. (2002). Ce qui arrive. Fondation Cartier pour l'art contemporain.
Wiener, N. (1956). The New Landscape in Art and Science.
Konstantinos Grivas
Department of Architecture
Polytechnic School, University of Patras
Greece
Constellation Pattern:
A Spatial Concept for Describing
the ‘Extended Home’
Ubiquitous Technologies “at home”
Since 1991, when Mark Weiser introduced the term ubiquitous computing,1 there has been extensive research, experimentation and innovation on the interaction patterns as well as the material manifestation of pervasive technologies.2 Nevertheless, there is a considerable lack of research and reflection on the spatial qualities of such technologies, which should be a major area of reflection and research for architects. This is an especially critical issue when it comes to integrating pervasive technologies into the domestic realm.
The attempt to integrate ubiquitous technologies into such a psychologically and emotionally significant place as home should be based on sophisticated, yet natural to man, schemes for the spatial orchestration and the physical manifestation of the digital aspects of home-life. We tend to consider as domesticated those technologies that have managed to weave themselves seamlessly into the daily routine of homely life by acquiring a level of spiritual/emotional significance and intimacy. The physicality and spatiality of domestic technologies play a key role in the process of domestication, which means that architecture needs to respond to this call and develop ways to represent and produce domestic spaces which facilitate "natural" and intimate interactions mediated by ubiquitous technologies.
The research, whose basic outline is presented in this paper, is based on the
assumption that the traditional concept of the domestic (a physically defined territory of immediate personal control) is gradually giving way to the emerging concept of the ‘extended home’ (a conglomeration of home-fragments, coded spaces
and locations, mediated by technologies), and referring to contemporary domestic
spaces as ‘interfaces of intimacy’ aims to highlight this conceptual shift further. In
the ‘extended home’, ubiquitous technologies may enhance our intimate relationships with our fragmented domestic places by addressing the three main aspects
that define all domestic places: a) habits and domestic rituals (a means to reflect and
express personal identity, a form of bodily memory of spaces), b) intimate communication (use of space as a communication device, emotional “investment” of home),
and c) domestic history (manifestation of traces, history and development of identity, autobiographical memories). All of these depend on physical places and space
and the meanings humans invest them with. The phenomenological approach to
home, and especially the work of G. Bachelard,3 is a key reference. On the other side, the works of M. McCullough,4 D. Norman5 and others provide the basic references for the human-centred approach to technologies. The research hypothesis argues
that if we spatialize technologies in ways that are analogous to the natural ways by
which people –almost effortlessly- produce meaning when using, reading and creating their personal spaces, this may eventually enhance the spiritual side of home,
rather than its technological efficiency. Finally, the context of the research becomes
more complete by drawing references from social sciences (anthropology, psychology, studies on human behaviour and cognition) and creative practices like architecture, design and art.
Technologisation of the Domestic and the “extended home” Concept
Fig. 1
"Domestic" vs "intimate" diagram (K. Grivas, 2007).

The debate over the 'technologisation' of the domestic realm intensifies as technological mediation rapidly spreads into all aspects of homely life. Each day a plethora of new domestic technological inventions, ranging from tiny devices to worldwide services, promise to effect major or minor improvements on our domestic lives. Usually, any introduction of new technology into the domestic realm is faced
with scepticism since, generally, the act of 'dwelling' or 'residing' and the notion of technological progress are contradictory to each other. Actually, most of the technologies that we use today in our domestic life were initially developed for completely different purposes. The domestication of technologies6 is a slow process which involves mutual adaptations between technologies, users and their environment. First, technologies become an integral part of everyday life and are adapted to meet habitual practices. Secondly, the users and their environment change and adapt accordingly. Thirdly, these adaptations inform the production of the next generation of technologies and services. The domestication of ubiquitous technologies is still in its initial phase, but we may use the feedback from research laboratories working in this area to contemplate the subsequent stages.
William J. Mitchell, in his book ‘City of Bits’7 provides one possible vision of tomorrow’s inhabitation: “So ‘inhabitation’ will take on a new meaning - one that has less
to do with parking your bones in architecturally defined space and more with connecting your nervous system to nearby electronic organs. Your room and your home
will become part of you, and you will become part of them.” Psychology has established, long ago, that home is considered as an extension and reflection of our own
body and spirit8, but recent technological developments make it possible to envision
home as a sensorial organ literally connected to our bodies. It is no coincidence that
concepts of technological bodies, such as “post-humans” and “cybernetic organisms”
hold a privileged position in contemporary culture.9 What characterises both concepts of the body is the ideal of interminable connections to everywhere, and the disposing of any boundaries between our bodies and their environment. Our culture's
vision of the body is directly reflected in our vision of what we call home, and vice versa. Accordingly, the environment that these technological bodies merge
into becomes an ecosystem ruled by laws of connections, coding, and appropriate
interfacing.10
The architectural concepts of privacy, and thus of boundaries, that reigned over the domestic world for centuries are giving way to the idea of domestic space as interface. Domestic space suddenly becomes dissipated into all other places, and bodies. Architectural exploration that embraces this situation focuses on loose topologies, ephemerality, and virtuality, as a response to an increasing need for free association and expansion,11 but with that comes, also, the weakening of existing spatial models, behaviours, and places. Today's technologically saturated world is characterized by the vast
complication of man’s places. All traditional boundaries between here/there, interior/
exterior, body/machine, etc. seem to dissolve and everything extends. The importance is transferred to the infinite connections and interactions between all parts, and
all human places -domestic included- may be thought of as interfaces to anywhere,
anyone or anything else. Yet, as biological human beings, our territorial (and not only)
behaviour still depends on physical space. Physical space and matter are sovereign
to our senses. Philosophers12 and scientists have argued, since antiquity, that human
reasoning is based on our perception of the physical space and its objects. We have
an inherent ability to understand space and spatial models more quickly and more
easily than anything else. We navigate through space more naturally than through information and text. We are capable of internalising space, producing a kind of spatial
memory that enables us to recall spaces and events quickly with a lot of contextual
detail13. We also communicate through spatial behaviours and customs14. The way we
position our bodies in space in relation to others is a means of communication in itself. This great variety of intimate forms of spatial communication, with delicate and fine details, some of which we perceive only sub-consciously, is valuable and extremely refined, and our extended homes should benefit from it.
Since the traditional hierarchical order of containing, contingent, and homogenous physical boundaries is inadequate to define the complexity and multiplicity
of space of the ‘extended home’, architecture needs to devise an alternative spatial
model that describes it appropriately. The space of the extended home would be better described as a delicate, intricate, and highly individual structuring of 'coded spaces' equipped with physical 'signals', 'markers' or 'representative objects', 'dual or triple' spaces and 'constellations' of those, superimposed or woven into existing domestic
spaces that afford varied interpretations and connections, multiple levels of awareness, of interaction and revealing, and, therefore, various kinds and layers of intimacy.
In this context, it seems reasonable to suggest that “successful” domestic technologies
should shape themselves in ways that relate to the inherently human ability to read,
remember and spiritually connect with architectural space, creating a delicate web of
situated intimacies.
The production of domestic spaces that integrate ubiquitous technologies naturally and give form to the notion of the extended home-place, described above, may
emerge from a systematic study and exploration of the topologies of everyday intimate interactions, physical or digital. Designers may build on the concept of giving
presence to remote parts/periods of home by the coded reading of local/present
Fig. 2
Two-dimensional subjective mapping of one bedroom flat (K. Grivas, 2000).
Fig. 3
Vision of augmented bedroom with rotating bed (K. Grivas, 2001).
domestic spaces and their activities. This may be effected by simply augmenting the
latter with a new family of objects or space-fixtures that facilitate this coded reading
of other spaces/times, or it may involve the alteration of existing domestic spaces in
the level of internal organization, even the deformation of their spatial geometry as
a means to multiply the topological codes that the extended home should afford. In
all that, it is essential to stress the importance of placing the inhabitant himself as the
main creative force in the environment of home. Creativity and intimacy are inextricably linked. Technologists, designers and architects may begin to explore more the
openness of their proposals, products, and designs.
It appears that one meaningful strategy for creating intimate habitable environments supported by ubiquitous electronic technologies might be the outcome of
cross-fertilization between: a) a study of spatial languages – by default an object of architectural & design inquiry – and systems for locating and manifesting digitally mediated events in domestic spaces based on the inherent human abilities of internalizing
space through habits and memories, b) a post-optimal15 approach that values the performative –poetic at times– attributes of electronic media, as opposed to their utilitarian and efficiency related properties, and, c) an ecological16 vision which accepts that
planning, form and aesthetics of domestic spaces are emergent, neither preconceived
nor marketed, and constantly re-configurable – sometimes in a playful manner too –
by those for whom they cater.
Intimate Interactions and Spatial Orchestration of ubi-com Technologies
Intimacy is usually connected with physical togetherness and contingency. Today’s
technologies provide ways to overcome physical and geographical distance, at least
when information is considered. The need to share one’s personal space and daily life
with disparate people and places is neither a new nor a fictional scenario. For an increasing number of people, including myself, domestic space is a conglomeration of
several disparate locations of equal personal significance. Moreover, as family structures and everyday lifestyles change, with the increase of single households and the increase of mobility and nomadic life, the notion of the family home, a place owned and lived in by several generations, is drawing towards extinction, and so is our ability to develop a long-term intimate relationship with one particular place. The introduction
of ubiquitous computing technologies in our home environments presents an array of
possibilities that may counteract these phenomena, providing again
some kind of duration and continuity in the act of inhabiting. With the use of ambient
communication we may inhabit more than one home simultaneously, so that we can
merge our disparate and fragmented home-places creating a third hybrid space. We
may, also, use sensors and recording devices scattered around the space to keep track
of our domestic behaviour and history. If we use these recordings to trigger interactive media in our homes we may superimpose homely experiences of the past into
our present habitation, and even more.
When it comes to sharing our personal space, plain verbal or even visual communication cannot substitute for the rich experience of physical togetherness, for it generally lacks spatiality and sensitivity, and demands attention. In a physical house-share, communication between the inhabitants is multi-dimensional and quite varied depending
on the intimacy between them. Sharing one’s personal space can range from a simple
feeling of presence in the house to the ultimate sharing of one's body (lovers). Sometimes the sense of togetherness created by the mere presence of persons in the same house, without any other form of interaction between them, is very significant indeed. This fact has led to the development of technologies that provide a synchronous exchange of information between two remotely linked spaces, relating to changes effected in their ambience. This form of technologically mediated communication has been termed 'ambient communication', or 'pervasive awareness'. In general, such communication operates in the background of the inhabitants' activities and transfers contextual information, usually related to changes in people's environment. The potential benefits and usages, as well as the operational qualities and technical particularities, of 'ambient communication' for domestic places have so far been explored, in research and design terms, through several research projects and living laboratories.17
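As a minimal illustration of the idea of ambient communication described above, the following sketch maps a sensed change in one home onto a background signal in a remotely linked home. The rooms, sensor and lamp are hypothetical names invented for the example; no cited research project is being reproduced here.

# Minimal sketch of "ambient communication": a change sensed in one home is
# translated into a low-attention ambient signal in a remotely linked home.
# All names (rooms, occupancy source, lamp) are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class AmbientLamp:
    name: str
    brightness: float = 0.0  # 0.0 (off) .. 1.0 (full)

    def set_brightness(self, value: float) -> None:
        self.brightness = max(0.0, min(1.0, value))
        print(f"{self.name}: brightness -> {self.brightness:.2f}")

def on_presence_change(occupants: int, linked_lamp: AmbientLamp) -> None:
    """Map presence in home A onto a background light level in home B.

    The mapping is deliberately coarse: ambient communication conveys
    context ("someone is around"), not explicit messages.
    """
    linked_lamp.set_brightness(min(occupants, 3) / 3)

cottage_lamp = AmbientLamp("country cottage: hallway lamp")
on_presence_change(occupants=2, linked_lamp=cottage_lamp)   # someone is home
on_presence_change(occupants=0, linked_lamp=cottage_lamp)   # house is empty

The point of such a coarse mapping is that only context is conveyed, in the background, rather than an explicit message demanding attention.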
Fig. 4
Home constellations - graphic representation (K. Grivas, 2004).

Fig. 5
Bed - constellation of sensors (K. Grivas, 2005).

Fig. 6
Flat01_Notes of locations and devices, pencil sketch (K. Grivas, 2007).

Fig. 7
Flat01_Map of locations and devices (K. Grivas, 2007).

Nevertheless, what has not been equally researched is the spatial aspect of the activities that are shared, or rather the spatial and geographical relations between physical home-places, ubiquitous devices, and the digitally mediated events that are imported into the first. In simple words, in most cases there is no apparent spatial correlation between the home-space where an event originally takes place and the new location
where the same event is digitally transported, translated and physically introduced.
In order for the extended home to fulfil its basic requirements, inhabitants should be
able to perceive such technology mediated processes in a broader topological and
geographical system where each component has a specific location on some kind of
mental map. In this vision, the locations where the digital aspects of home become
manifested, acquire high significance, and so do the interrelations between those
locations. The inhabitants' memory, imagination, and consciousness of their own or others' living patterns, aided by the various combinations of location, signalled activity and physical objects, are the keys to an intimate map of an extended home that may occupy more than one physical place and several periods of time simultaneously. To do so, the inhabitants and the designers of home-spaces need to devise models and means of representation, of mapping these complex spatial relations. Finally, these tools for
representing the extended home may allow digital events to be exchanged and integrated into the domestic space in a way which is spatially coherent – which makes
sense in everyday life – so as to allow intimate interactions.
The “constellation pattern”
One approach to mapping the extended home would focus on the community of small-scale digital devices, sensors, actuators and processors of all kinds that may be incorporated into everyday objects and pieces of furniture, as well as into parts of the building envelope itself. Each of these devices has a specific location in relation to all the others, and can be classified by the inhabitant according to significance or affinity, using an ontological system. Viewed in a graphic representation, these locations form a kind of constellation pattern. This constellation pattern organises home-space at every level and scale, giving emphasis to specific locations and their connections, and forms a canvas onto which multiple layers of information may be attached. Moreover, the constellation pattern allows a differentiated representation of the home-space which is flexible on the one hand and, on the other, in accordance with the ways we perceive places as conglomerations of points and links between them. Thirdly, because of its abstracted nature, the constellation pattern affords several layers of personal coding.

Fig. 8
Flat01_Map of constellation (K. Grivas, 2007).

Fig. 9
Flat01_Abstracted constellation map (K. Grivas, 2007).
Any such constellation has to correspond to a schema that is meaningful and graspable for the inhabitant. For example, the constellation "windows" may organise into a common schematic representation all the digital cameras attached to the windows of my city home, as well as those of my country cottage, that monitor views to the exterior. Another constellation may be the exact positions of the pressure sensors tagged under the dining chairs. In the same manner, one may construct any number of such constellations, each in one's own unique way, reflecting one's own image of the place or places (a minimal data-structure sketch of this idea follows below). Ideally, if one is familiar enough with a home-space, every constellation should be mentally and bodily recallable (e.g. one should be able to recall, in three-dimensional space, where all the other "windows" are placed if given the position of one of them).
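As a minimal sketch of how such a constellation might be represented as a data structure (all class names, device labels and coordinates below are illustrative assumptions, not part of the research described here): a named, graspable grouping of located devices that can span several physical home-places and be read back per place.

# Minimal sketch of a "constellation": a named, meaningful grouping of located
# devices that can span several physical home-places.
# Class names, device names and coordinates are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Device:
    label: str                             # e.g. "kitchen window camera"
    place: str                             # which physical home-place it belongs to
    position: Tuple[float, float, float]   # location within that place (metres)

@dataclass
class Constellation:
    name: str                              # the graspable schema, e.g. "windows"
    devices: List[Device] = field(default_factory=list)

    def add(self, device: Device) -> None:
        self.devices.append(device)

    def by_place(self) -> Dict[str, List[Device]]:
        """Group the constellation's devices by the home-place they sit in."""
        grouped: Dict[str, List[Device]] = {}
        for d in self.devices:
            grouped.setdefault(d.place, []).append(d)
        return grouped

windows = Constellation("windows")
windows.add(Device("kitchen window camera", "city flat", (2.1, 0.4, 1.6)))
windows.add(Device("study window camera", "country cottage", (5.0, 3.2, 1.4)))
print({place: [d.label for d in devs] for place, devs in windows.by_place().items()})

A real system would of course attach far richer ontologies, histories and access rules to each constellation; the sketch only shows the minimal shape of the idea: a graspable name, a set of located devices, and the ability to read the grouping across home-places.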
Once we start elaborating on the constellation pattern, we may see that, depending on what each constellation represents, its properties may vary significantly. For instance, some constellations may have an almost permanent layout (devices embedded in walls), while others may be highly flexible and mobile (devices tagged on objects and furniture). Others may vary their layout in a regular and predefined way, while other constellations may show randomness. It is also expected that most constellations will change the number of locations attached to them during their lifespan, and some constellations may remain idle, or eventually "die" when they no longer make sense to the people living with them. An appropriate model of representation of the various constellations of one's home should be able to illustrate all the differences and particularities of each constellation in relation to the others. More than that, as the structure of constellations becomes increasingly complex and we start using hierarchical structures to classify them, it becomes clear that a meaningful representation, a map, should indicate points of reference at any class level, be it the specific and local constellation "my bed" or the general constellation "grandmother's house". Finally, although our spatial memory helps us to recognise or recall in real space any familiar set of locations with a certain accuracy (e.g. where the light switch is located with respect to the door-handle, and where we usually leave our slippers at night), it is important to examine the range of the discrepancies between the real distances among them and the perceived or imagined ones, and when these discrepancies increase or decrease. Accordingly, we may imagine that the means of representation should have suitable affordances.
In the end, the constellation patterns of a home produce a new, unique space connected with the extended home: the space of the representation itself, which is inextricably linked to the real spaces the extended home consists of, and constantly feeds back into them. This representation is a code, unique to each home-place and each inhabitant. This code is, on the one hand, a tool that helps us to superimpose and weave multiple levels of awareness and information into existing spaces, while at the same time it is capable of affording multiple kinds and levels of intimacy, since it can be revealed and shared at will.
References
1 Weiser, Mark, (1991), "The Computer for the 21st Century", Scientific American Special Issue on Communications, Computers, and Networks, September 1991.
2 Other relevant terms are: ubiquitous computing, ubiquitous/pervasive technologies, ambient technologies, ambient intelligence and the like.
3 Bachelard, Gaston, (1969), The Poetics of Space, Beacon Press, N.Y.
4 McCullough, Malcolm, (1996), Abstracting Craft: The Practiced Digital Hand, The MIT Press, Cambridge, Massachusetts.
5 Norman, Donald A., (1993), Things That Make Us Smart: Defending Human Attributes in the Age of the Machine, Perseus Books, Cambridge, Massachusetts.
6 Silverstone et al. (1992).
7 Mitchell, William J., (1996), City of Bits, The MIT Press, Cambridge, Massachusetts, p. 30.
8 Cooper Marcus, Clare, (1997), House as a Mirror of Self, Conari Press, Berkeley, California.
9 Haraway, Donna, (1991), Cyborg Manifesto, included in Simians, Cyborgs, and Women: The Reinvention of Nature, Free Association Books, London.
10 Ibid.
11 Lynn, Greg, (1998), Folds, Bodies and Blobs: Collected Essays, La Lettre Volée, Brussels.
12 Bergson, Henri, (1991), Matter and Memory, Zone Books, New York.
13 McCullough, Malcolm, (1996), Abstracting Craft: The Practiced Digital Hand, The MIT Press, Cambridge, Massachusetts.
14 Hall, Edward T., (1966), The Hidden Dimension, Anchor Books, New York.
15 Dunne, Anthony & Raby, Fiona, (2001), Design Noir: The Secret Lives of Objects, Birkhäuser, London.
16 'Ecological' as dynamically or collectively balanced, sustained and regulated.
17 Some examples are: a) Remote Home (T. Schneidler, M. Jonsson), and b) comHOME, both developed within the Interactive Institute, Stockholm, c) Adaptive House (M. C. Mozer, Georgia-Tech), d) bed communication (C. Dodge), and e) House_n, developed at MIT Media Lab, Massachusetts, USA.
Anastasia Karandinou
School of Architecture
University of Portsmouth
UK
Beyond the Binary
In this paper we will look into the traditional binary opposition of the material versus
the formal. We will look into how this binary opposition can be reversed and ultimately – following Derrida’s model – opened up by the introduction of a third element.
As the third element we will consider the performative, which describes the temporal, dynamic and hybrid nature of contemporary structures and processes. These notions offer a new understanding and context for the binary opposition and also a new
meaning for the terms form and matter.
In this context, the formal and the material are no longer considered as they used to be. New, fluid or transformable materials, such as interactive surfaces, materials that change colour and transparency, or foggy clouds, change the way in which we perceive the notions of form and matter.
New Materials – New Forms
For buildings such as Toyo Ito's Tower of Winds, it is not easy to make a clear distinction between what is the form of the building and what is the matter (Yukio Futagawa, ed., GA Architect: Toyo Ito, 1970-2001, Tokyo: A.D.A. Edita, 2001). The form is created
by the fluctuating colours and lights, which change in response to the level of ambient sound in that part of the city. The materiality of the building could be considered
as this very same element. Hence, what matters, and in some way dissolves the binary
opposition between form and matter is the responsiveness, the temporality, and ultimately the performance of the building. Ned Kahn’s designs1 open up the binary opposition in a slightly different way. The material elements swing with the wind, creating fluctuating patterns. The surface of the building, and the shapes and patterns it
creates, is formed by the wind currents that make the steel elements swing. Similarly,
Diller and Scofidio’s Blur project is of a performative nature. The main element that
constitutes both the form and the matter of the building is the fog – the evaporating water of changing density and opacity. The form is a fog-cloud and the matter is a
fog-cloud too (Diller and Scofidio, 2002).
Therefore, on one hand, form becomes something different to the form of the
past, since it is no longer static and easily definable. Similarly the notion of matter
changes too; the density, fluidity and movement of matter defines the form – or else
the form describes the nature and dynamics of the matter. With the emergence of
new materials, new digital media, complex methods of construction and the equivalent design techniques, the focus has shifted from the stable, solid, defined, to the dynamic, the process, the hybrid, and the performative.
Performative and Performance
The notion of performance is often used in architecture in order to describe the specifications and behaviour of the materials and structures. As architect and theorist Ali
Rahim notices, “[t]he performance of a technology […] often refers to its technical effectiveness in a specific evaluation or in a set of applications” (Rahim, 2005). The architect and academic Ali M. Malkawi, refers to the notion of performance in order to describe qualities of a building such as “thermal flows, lighting, acoustics, structures, etc”
(Malkawi, 2005). However, for many architects and theorists, performance is a notion
that has a broader and more complex meaning. Within these discussions, the term
performative is often used instead of the term performance, so as to make clear the
distinction between the technical specifications and the other kinds of performing.
Ali Rahim, for example, when referring to the architectural design process, describes
performativity as “[t]he material, organizational and cultural change that occurs as a
result of this perpetual feedback and two-way transfer of information”2 (Rahim, 2005).
So, the terms performance and performative in some cases are used in a similar
way, whereas in other cases their meaning is different. This depends also on the context of the discourse. For example, in the context of theatre or music, the term performance has a very specific meaning and refers to acting or playing a musical piece.
In architecture, the term performance can refer to the technical specifications of a
building (as noted above), but it can also refer to the visual and aesthetic effect of a
building; a building, for example, may perform by the way it reflects and emits light at
different times of the day, as if making a performance (Kolarevic and Malkawi, 2005).
The notion of performative is often used in architecture instead of the term ‘performance’ in order to describe a process, a gesture, an act, a tendency, other than only a
function or visual effect.
Reversing the Balance - Performative and Potential
The notion of the performative has a long history in cultural studies and has been
appeared extensively within architectural discussions over the last few decades. In architecture, the notion of the performative involves issues of process, changeability,
signification, event, function, program – in other words it can be interpreted as what
the building does. The attention of many theoretical discussions is not focused on the
form or material of the building as something static, but rather on the dynamic way in which it functions. This may involve the literal changing of the form over time, such as the change of its shape, luminosity and colour over the course of the day, for aesthetic, functional,
climatological reasons, or for reasons that have to do with its conceptual performing.
Changes in technology, materials, construction networks, as well as the theoretical and philosophical discourses that those have raised, have displaced the interest
from permanences to the performative. For some contemporary theorists such as Katie
Lloyd Thomas, Therese Tierney, and, Deleuze’s commentator, Brian Massumi, the current building processes, materials, and techniques dissolve the binary opposition of
form versus matter; and it could be argued, hence, that the interest is thus shifted from
the formal and material to the performative (Lloyd Thomas, 2006, Massumi, 1992).
In contemporary discourses, as Thomas argues, the nature of the division between form and matter has changed (Lloyd Thomas, 2006). The binary, following deconstruction's typical model, is often reversed, and materiality tends to take the primary role3 (Derrida, 1976). In contemporary discourses, the sensuous, the tactile, the material, the experiential are prioritized over the formal.
As Thomas argues, there is a distinction between the way materials were being
perceived through Derrida’s linguistic model and the way they are perceived by many
architects and theorists today. For Derrida the medium (the language, or the material in the case of a building) remained invisible. However, as Thomas claims, this is no
longer the case; materials are not perceived as ‘invisible’, like they were when “architectural discipline considered the building as a representational system”,4 but as active
and generative (Lloyd Thomas, 2006, Kennedy and Grunenberg, 2001).
She uses the example of language as a medium – referring to Derrida’s analysis of
this – in order to present this specific way in which materials are on certain occasions
considered. Hence, if material is considered as only a medium for the expression of
form, it remains invisible. However, as Thomas claims, in several contemporary examples of architecture materials are not used only as mediums for expressing a form, but
as fundamental elements that drive the design.
Hence, the binary opposition of form and matter changes, due to new technologies, media and new types and use of materials. Materials no longer merely support
the desired form, but often constitute form by themselves.5 Here, I refer not only to parametric design, but to all types of design that involve notions of performativity and responsiveness. So, we could argue that digital technologies are not the only
medium through which a design becomes responsive, performative and generative.
However, digital media and methods have evidently opened up and influenced our
way of thinking about these notions, processes and design methods.
Performative, Temporality and the ‘Chora’
Another way in which the binary opposition of form versus matter can be opened up
is through the consideration of the notion of time. Henri Bergson opens up this discourse by introducing the notion of time and of potential change (Bergson, 2007). As
he argues, time and change are parameters neglected within Plato's and the Renaissance's
reflection on form and matter. According to contemporary architect and theorist,
Stephen Walker:
Bergson took time seriously, exploring the possibility that matter might differ from
itself over time, and developing an understanding that exceeded the intellect: in order to get at what eludes scientific thought, he argued that ‘we must do violence to
the mind, go counter to the natural bent of the intellect. But that is just the function of
philosophy’ (Walker, 2007).
As Bergson argues:
An intelligence which aims at fabricating is an intelligence which never stops at
the actual form of things nor regards it as final, but, on the contrary, looks upon all
matter as if it were carvable at will […] In short, it makes us regard its matter as indifferent to its form6 (Bergson, 2007, Walker, 2007).
Potential change, therefore, breaks down the distinction between form and matter. Andrew Benjamin, within a similar field of thought, claims that the “concern with the history of any practice has to recognize that the status of the object, and thus its presence within differing fields of activity, is always negotiable” (Benjamin, 2007). Objects are not fixed and stable; “rather, they are always in a state of construction” (Benjamin, 2007).
Bergson’s account of time and potential change seems very relevant to Plato’s ‘chora’. Chora, translated from ancient Greek as space, place, territory, location, capital town of a region, or country, is introduced by Plato as a third entity that allows for the existence of the other two realms: that of the ideal reality and that of the everyday sensible reality (Coyne, 2010, Plato, 2008). For Brisson and Meyerstein, philosophers of science, chora is considered a ‘spatial medium’ (Brisson and Meyerstein, 1995). As
Coyne mentions, Brisson and Meyerstein define chora “not as space, […] but as ‘that in which’ and ‘that from which’ the sensible world is made.
So chora is not simply space, but something that is chronologically and logically
prior to it”7 (Coyne, 2010). Derrida, based upon Plato’s writings on chora, describes it
as “the spacing that is the condition for everything to take place, for everything to be
inscribed” (Kipnis and Leeser, 1997) and at the same time, as something beyond our
perception, as what “cannot be represented” (Kipnis and Leeser, 1997). Chora is a kind
of medium that allows things to take forms through it, but eventually remains unchanged itself; it “gives nothing” (Kipnis and Leeser, 1997). Hence, we could consider
Bergson’s introduction of time and duration as a possible interpretation, or rather as
a reflection on Plato’s chora; of the un-intelligible, and of what exists beyond the ideal
and physical worlds. In this context chora, similarly to Bergson’s time, is what allows
for the potential, and what exists prior and beyond the opposition of form and matter.
According to Thomas, within contemporary philosophical approaches, the distinction between form and matter has been extensively negotiated and altered; the world
has been considered “in terms of force, setting up equivalence between persons, objects, words, […] which all act on each other. In such a view the real and the virtual,
or the material and the idea, are part of a continuum of potentiality and actualization”
(Lloyd Thomas, 2006).
Whether or not form and matter are dealt with as two separate aspects, Thomas
considers the world as a system of substances which go through changes and which
are transformed by physical or cultural forces.
Nevertheless, even if we remain within the opposition of formal – material, we have to mention that both come as a result of multiple social, political and economic networks; the issue of process and dynamic evolution of things is also present within the consideration of these two (seemingly differentiated) issues: architectural schools, the training in specific institutions and practices, competitions, personal and communal cultural aspects determine the form of the built space too. Multiple networks and practices are involved in a direct or indirect way, not only in the materiality of the built space, but also in the form and concepts it carries.
Although this may seem to be a new way of looking at things, different from Descartes’ or Newton’s consideration of things as stable entities,8 we could argue that this field of thought draws upon Leibniz’s philosophy of space, time and things. For Leibniz, space is not an absolute preexisting entity but, as Coyne presents it, “emerges in a consideration of the relationship between things”9 (Coyne, 1999). For Leibniz, processes and events preexist the notion of time, and objects and the relations between them preexist space. Space, thus, is something perceived through and after objects, processes and the changing relationships between them (Russell, 1951).
Within a contemporary Deleuzian field of thought, the ‘substance’, the ‘piece of
wood’, which Brian Massumi uses as an example, is a result of a chain of cultural and
natural events, and has the potential for further transformations; so does its maker.
Massumi considers every matter or form (without making a distinction) as a result
of forces applied to it. The raw wood, the formed wood, the craftsman, the tools, all
are considered as active substances, results of physical or cultural powers – of natural
procedures, institutions, technical schools, etc. – which ‘overpower’ each other. Massumi points out that it is rather “this power differential that determines that we understand the wood as merely ‘content’ and the craftsman as ‘agent of expression’” (Lloyd Thomas, 2006) (Massumi, 1992)10. In these terms, the form is not separable from the substance; it is the actions, forces and changes that the substance has gone through that lead to what we experience. The object is the substance, which has undergone transformations and has been affected by forces of physical or cultural nature.11
As Massumi argues:
The encounter is between two substance/form complexes, one of which overpowers the other. The forces of one are captured by the forces of the other and are
subsumed by them, contained by them. ‘The value of something is the hierarchy of
forces which are expressed in it as a complex phenomenon’. One side of the encounter
has the value of a content, the other of an expression. But content and expression are
distinguished only functionally, as the overpowered and the overpowering. Content
is not a sign, and it is not a referent or signified. It is what the sign envelops, a whole
world of forces. Content is formed substance considered as a dominated force-field12
(Massumi, 1992).
Massumi does not use the term performative in this context. However, he reflects on
a similar field of thought: the process and potential that something entails. What Massumi considers as the substance of a thing is not its form or matter, but what makes
it what it is and what potentially makes it transform into something else. What characterizes something and what reveals or best describes its substance is not merely its
form and matter, but its potential.
Media Forming Concepts
Philosophical theories, like those presented above, are often inter-related with the scientific, technological and political conditions of their time. The relationship between substance and forces is possibly, thus, related to discoveries and inventions of the last few decades.
For example, electromagnetic waves turn abstract, ‘empty’ space, or air, into something filled with forces, waves and fields, carrying potentially readable and visible content and meaning. Hence the definition and content of notions such as matter, mass, volume and density are challenged. The field of forces creates a different perception of what is substance and what is not, what is perceived as potentially performative and what as possibly not. Similarly, contemporary digital media introduce
a closer link between architectural design and the notion of temporality and time. Digital media allow us to record and represent duration, and, therefore, events and situations occurring over time can be documented and studied. Moreover, animation and programming allow designers to think not only statically about space, but also dynamically through time, movement and interaction. Hence, we could argue that digital media not only change the methods and processes involved in designing and building, but also the theoretical and philosophical discourse around some fundamental notions of architecture such as matter and form.
Notes
1 Here I refer to Ned Kahn’s projects such as the Children’s museum in Pittsburgh, or the Kinetic
façade at the Brisbane Airport parking.
2 As Rahim claims: “Performativity always has the potential to produce an effect at any moment
in time. The mechanisms of performativity are nomadic and flexible instead of sedentary and
rigid. […] The performative subject is fragmented rather than unified, decentralised rather
than centered, virtual as well as actual.” p. 179.
3 As Derrida claims, the first stage for opening up a binary is to reverse it. The secondary notion
– out of the two – becomes the primary one, and this new discourse leads to the opening up
of the binary.
4 As examples, she mentions Lars Spuybroek’s arguments in relation to the contemporary approach to materials’ behaviour, p. 7 (in: Spuybroek, L. 2004. Nox: machining architecture, London, Thames & Hudson) and Sheila Kennedy’s “Material Presence: The return of the real” (Kennedy and Grunenberg, 2001).
5 e.g. the ‘Blur’ project.
6 In what follows, Bergson refers to the relation between his consideration of form and matter and the ‘homogeneous’ space that we referred to earlier: “But when we think of our
power over this matter, that is to say, of our faculty of decomposing and recomposing it as we
please, we project the whole of these possible decompositions and recompositions behind
real extension in the form of a homogeneous space, empty and indifferent, which is supposed
to underlie it. This space is therefore, pre-eminently, the plan of our possible action on things,
although, indeed, things have a natural tendency, as we shall explain further on, to enter into
a frame of this kind.” Bergson, H. 2007. Creative Evolution, New York, Random House. p. 173.
7 with reference to: Brisson, L. & Meyerstein, F. W. 1995. Inventing the universe: Plato’s Timaeus,
the big bang, and the problem of scientific knowledge, Albany, NY, State University of New York
Press.
8 See also: Broad, D. C. 1975. Leibniz: an introduction, Cambridge, Cambridge University Press, p. 56. Broad presents here Newton’s belief that space is ‘prior to objects’ and time is prior to events and processes, ideas to which Leibniz is opposed.
9 with reference also to: Leibniz, G. 1964. The relational theory of space and time. In: J.J.C. Smart (ed.) Problems of space and time. New York: Macmillan, pp. 89-98.
10 As Massumi claims, “Deleuze and Guattari distinguish between ‘substance of content’ and
‘matter of content’. A ‘substance’ is a formed matter (the thing understood as an object with
determinate qualities), and a ‘matter’ is a substance abstracted from its form, in other words
isolated from any particular encounter between content and expression (the thing as all the
forces it could embody in all the encounters it could have, either as content or expression).”
Massumi, B. 1992. A user’s guide to capitalism and schizophrenia, Cambridge, Mass.; London, The
MIT Press. (p.25) As Deleuze and Guattari argue: “Substance is formed matter, and matter is a
substance that is unformed either physically or semiotically.” Deleuze, G. & Guattari, F. 2004. A
thousand plateaus: capitalism and schizophrenia, London, Continuum. p. 156.
11 “Massumi provides us with an (Deleuzian materialist) alternative to the hylomorphic account of
the architectural material, which suggests that material is itself active and does not distinguish
between the physical forces (plane smoothing it) and immaterial forces (the building standard
that determined its fire treatment in a certain way) that produce it. Within this account the line
that is a string of code in a computer or an idea is no less material than a piece of wood or a
spoken sentence; each acts”. Lloyd Thomas, K. 2006. Material matters: architecture and material
practice, London, Routledge. p. 6.
12 with reference to: Deleuze, G. 1986. Nietzsche and Philosophy, London, Continuum. p. 7.
Bibliography
Benjamin, A. 2007. Plans to Matter. In: Thomas, K. L. (ed.) Material Matters. London: Routledge.
Bergson, H. 2007. Creative Evolution, New York, Random House.
Brisson, L. & Meyerstein, F. W. 1995. Inventing the universe: Plato’s Timaeus, the big bang, and the
problem of scientific knowledge, Albany, NY, State University of New York Press.
Broad, D. C. 1975. Leibniz: an introduction, Cambridge, Cambridge University Press.
Coyne, R. 1999. Technoromanticism, Cambridge, Mass.; London, The MIT Press.
Coyne, R. 2010. Other Spaces. Manuscript.
Deleuze, G. 1986. Nietzsche and Philosophy, London, Continuum.
Deleuze, G. & Guattari, F. 2004. A thousand plateaus: capitalism and schizophrenia, London,
Continuum.
Derrida, J. 1976. Of Grammatology, Baltimore, Johns Hopkins University Press.
Diller, E. & Scofidio, R. 2002. Blur: The Making of Nothing, New York, Harry N. Abrams.
Kennedy, S. & Grunenberg, C. 2001. KVA: Material Misuse, London, AA Publications.
Kipnis, J. & Leeser, T. 1997. Chora L. Works: Jacques Derrida and Peter Eisenman, New York, The
Monacelli Press.
Kolarevic, B. & Malkawi, A. (eds.) 2005. Performative architecture: beyond instrumentality, London;
New York: Spon Press.
Leibniz, G. 1964. The relational theory of space and time. In: J.J.C.Smart (ed.) Problems of space and
time. New York: Macmillan.
Lloyd Thomas, K. 2006. Material matters: architecture and material practice, London, Routledge.
Malkawi, A. M. 2005. Performance Simulation: Research and Tools. In: Kolarevic, B. & Malkawi, A. M. (eds.) Performative Architecture: Beyond Instrumentality. New York; London: Spon Press.
Massumi, B. 1992. A user’s guide to capitalism and schizophrenia, Cambridge, Mass.; London, The
MIT Press.
Plato 2008. Timaeus, Fairfield, IA, 1st World Library.
Rahim, A. 2005. Performativity: Beyond Efficiency and Optimization in Architecture. In: Kolarevic, B. & Malkawi, A. M. (eds.) Performative Architecture: Beyond Instrumentality. New York; London: Spon Press.
Russell, B. 1951. Philosophy of Leibniz, London, George Allen & Unwin Ltd.
Spuybroek, L. 2004. Nox: machining architecture, London, Thames & Hudson.
Walker, S. 2007. Gordon Matta-Clark: Matter, Materiality, Entropy, Alchemy. In: Thomas, K. L. (ed.)
Material Matters. London: Routledge.
Ada Kwiatkowska
Faculty of Architecture
Wroclaw University of Technology
Poland
Architectural Interfaces
of Hybrid Humans
Digital architecture announces the seemingly unlimited possibilities of expression of every imaginable form in its multidimensional complexity. Information technologies are the origins of new generations of interactive architectural forms. Thanks to electronic devices, human perception is extended. This could determine the future direction of mankind’s evolution from homo sapiens to hybrid sapiens. At the same time, there is another process of strengthening the artificial intelligence of spatial structures, which could drive the development of architectural objects from arches to bits.
These architectural visions correspond to some artists’ concepts, which focus on the interactive relations between the mutable human body and transformable spatial settings on different levels of architectural objects’ complexity. The visions are based on the active power of information and on the extension of the awareness of architectural structures (from info-media to interactive forms, from the metamorphic to the illusory, etc.) (Kwiatkowska, 2007). Following Stelarc’s diagnosis that “evolution ends when technology invades the body” (1995, p. 93), one can say that the evolution of architectural forms ends when technology invades architecture.
Digital architectural forms can be viewed as dynamic psycho-objects interacting with human psycho-bodies. Space becomes a medium transmitting bits of information and communicating with people. The concept of hybrid humans is expressed in sci-fi literature, films and video games; it seems that it is not the future but already the reality of the scientific laboratories of genetic engineering. In popular culture, such hybrid humans as Prometheus, Daedalus, Faust, Frankenstein or androids (Turney, 2001) allow people, on the one hand, to face their evolutionary fear consciously and, on the other, to look at the future of the human race with hope against hope (Fig. 1).
In artistic and architectural visions, it is possible to distinguish three main concepts of the human body and its spatial extensions: the bio-body with sensual kinetic spatial extensions, the hybrid body with mutable altered spatial extensions, and the I-body with sensory interactive spatial extensions. The idea of the autonomy and mobility of the human bio-body is connected with the concept of liberated architecture. The vision of a mutable and hybrid human body leads architectural forms into coding experiments and the simulation of structural metamorphoses in topological space; in effect, an evolving architecture comes into existence. The process of structuralizing space in relation to the I-body with electronic peripherals demands that architecture be interactive and intelligent in order to meet the users’ expectations (Brayer and Migayrou, 2001; Liu, 2007; Spiller, 2002; Teyssot, 2005; Zellner, 1999).
Architectural patterns of the digital age are based on the higher order and higher complexity of systems and on the active power of information driving the A.I. of spatial structures. The questions are whether liberating architectural forms and forcing the intellectual power of structures to interact with the evolving human body will extend human perception or become a trap for the human mind; and whether strengthening the A.I. of spatial structures will be accompanied by the development of humans’ intellectual abilities to understand the meaning of a human-space dialogue, or whether humans could be caught in the trap of being driven by artificial settings that are too intelligent and too complicated for human perception.
Fig. 1
A human being in the built-environment (drawing by author).
Bio-body with Sensual and Kinetic Spatial Extensions
Liberated architectural forms express the potential and possibilities of the dynamic movement and flow of spatial structures following the sensual and kinetic skills of the human bio-body (Fig. 2). Architectural forms are treated as bubbles or wrapped forms covering the human bodies, which can generate the movements of architectural structures simultaneously with the changing positions of the bodies or, just the opposite, the forms can influence the motion of human bodies, as in the art of Ernesto Neto (2005). Liberated forms result from the fascination of architects with the ideas of spatial mobility and flexibility, which oppose the static character of traditional architecture. They are liberated from the necessity of foundations and the limits of rigid geometry expressing only the fundamental forces of gravitation. They are inspired by the ways of bodies’ movements, which are translated into the spatial language of architectural structures in kinaesthetic space.
Liberated forms can regenerate and adapt themselves to the changing needs and preferences of the users, and to the challenges of the spatial settings of human activities. The forms are characterized by a lack of scale and a new kind of geometry, which are the expression of dynamic, variable inner-outer forces coming from the human body or its spatial surroundings. Liberated forms are becoming alternative skins in an ergonomic and aesthetic sense (e.g. Basic house, arch. M.R. de Azúa; Loading skin, NUC: Nomad Use Camaleonics, arch. S’A Arquitectos) (Smith and Topham, 2002; Brayer and Simonot, 2002).
Fig. 2
Bio-body with sensual extensions in kinaesthetic space (drawing by author).
The bio-body and its sensual kinetic spatial extensions are the origins of dynamic forces, which are engraved in materials (e.g. Mute Room SF, arch. Thom Faulders; Ether, arch. dECOi) and in concepts of structures (e.g. Portable House, arch. Philippe Gregoire, Claire Petetin; Xurret System, arch. ReD) (Spiller, 2008; Zellner, 1999; Brayer and Migayrou, 2001; Kolarevic and Klinger, 2008). Liberated forms are force-driven shapes meeting the resistance of matter. Their uniqueness and sensuality are determined by the physical features and hidden phenomena of matter.
Hybrid Body with Mutable and Altered Spatial Extensions
Evolving forms express the process of structuralizing space in relation to the hybrid human body with mutable and altered spatial extensions, as in the art of Matthew Barney (2005). This process is based on the metamorphic transformations and transmutations of architectural shapes, substances and structures in topological space (Kwiatkowska, 2003) (Fig. 3). To materialize the evolving forms it is necessary to introduce the nanotopian materials of genetic engineering into architecture. Evolving forms are information-driven shapes; that is, they are founded on generative algorithms, which play a role similar to that of DNA, the genetic code of organic matter. Evolving structures modeled on such algorithms consist of an originator (an initial form), a generator (a set of copies and mutations of the initial form), and the rules of ordering the forms in space (e.g. the fractal geometry of L-systems, Logo, Turtle graphics).
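To make the three components named above concrete, the following minimal sketch, offered purely for illustration, expands an L-system by rewriting an axiom (the originator) with a production rule (the generator); the specific axiom, rule and iteration count are illustrative assumptions, not values taken from any project cited in this paper.

```python
# Minimal, illustrative L-system expander (hypothetical values, not from any cited project):
# the "originator" is the axiom, the "generator" is the rewriting rule that produces
# copies and mutations of the initial form, and iteration orders the forms in space.

def expand_lsystem(axiom: str, rules: dict, iterations: int) -> str:
    """Rewrite the axiom repeatedly, replacing each symbol by its rule (if one exists)."""
    state = axiom
    for _ in range(iterations):
        state = "".join(rules.get(symbol, symbol) for symbol in state)
    return state

if __name__ == "__main__":
    axiom = "F"                      # originator: an initial form
    rules = {"F": "F+F-F-F+F"}       # generator: copies and mutations of the initial form
    # Read as turtle-graphics commands: F = draw forward, + / - = turn left / right.
    print(expand_lsystem(axiom, rules, 2))
```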
Fig. 3
Hybrid body with mutable extensions in evolving space (drawing by author).
Evolving forms are liberated from functional, structural and material standardization, because of the potential of the algorithms to adapt and transform shapes in response to the challenges of the surroundings (e.g. Forms of Energy, La Biennale di Venezia, arch. AMID; transPORTs2001, arch. Kas Oosterhuis) (Spiller, 2008; Oosterhuis, 1999). Hybrid architectural structures can change their appearance and volumes according to the users’ needs (e.g. the reduction or growth of the architectural volume in relation to the number of users at a certain time – Loop.Space, arch. BASE 4 – or to the users’ preferences – Variomatic house, arch. Kas Oosterhuis) (Liu, 2007; Smith and Topham, 2002). Data-flow causes the transformation of architectural structures in accordance with the direction of info-energy transmission (e.g. Saltwater Pavilion, arch. K. Oosterhuis) (Zellner, 1999).
The hybrid body and its mutable spatial extensions are the bases of the metamorphic alterations of architectural structures, which become organic forms that can grow and mutate like biological ones (e.g. Digitally Grown Botanic Tower, arch. Dennis Dollens; Orgone Reef, arch. Philip Beesley) (Dollens, 2005; Spiller, 2008). It means that the contradiction between natural and architectural structures will disappear in the near future, because both are characterized by the possibility of evolution through activating an internal algorithm of transformation and by the lack of control over the growing processes.
Fig. 4
I-body with sensory extensions in interactive space (drawing by author).
I-body with Sensory and Interactive Spatial Extensions
Interactive architectural forms express the potential of communication between humans and their spatial surroundings, based on electronic devices built into human bodies and the e-environment (Fig. 4). The I-body becomes a biotechnological organism that interacts with intelligent spatial settings thanks to electronic sensors, as in the artistic performances of Stelarc (1995). Space is defined as the digital peripherals of the human body. Basic patterns of interactive forms are founded on gaming strategies, which make possible mutual communication between the users as senders and architectural structures as receivers in an info-mechanistic language.
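Read in the most literal way, this sender-receiver pattern resembles a simple sensing-and-response loop. The sketch below is only an illustrative assumption of such a loop, with hypothetical occupancy readings and a toy response rule; it does not describe any of the projects cited here.

```python
# Purely illustrative sender/receiver loop (hypothetical values): users "send" occupancy
# readings, the structure "receives" them and adjusts a parameter of its form in response.

def structural_response(occupancy: int, base_volume: float = 100.0) -> float:
    """Toy rule: the enclosed volume grows with the number of detected users."""
    return base_volume * (1.0 + 0.1 * occupancy)

# A short stream of hypothetical occupancy readings sent by the users' sensors.
for occupancy in [2, 5, 9]:
    print(f"occupancy={occupancy} -> volume={structural_response(occupancy):.1f} m^3")
```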
Interactive architecture is characterized by strategies of selection, codification, personalized exchanges of information and a pluralism of forms. Interactive communication is wide open to fortuity, combination, permutation and variation of the information messages contained in the architectural structures, and it breaks the space-time continuity. Intelligent architecture mediates between the users’ activities and their spatial settings. Fusions of architectural forms with digital technologies allow the users to control all parameters of the structures.
The I-body with sensory and interactive spatial extensions introduces multiple points of view into architecture. It means that the object can be viewed by the user from many points of view at the same time and the user can interact simultaneously with many levels of the spatial structure. Interactive forms connected with digital technologies announce that virtual reality will be a supplementary component of architecture (e.g. Enteractive, Electroland; Virtual New York Stock Exchange, arch. Asymptote) (Electroland, 2007; Brayer and Migayrou, 2001), according to Marcos Novak’s diagnosis that “form follows the functions of fiction” (1995, p. 47).
Design Strategies of Technology-driven Architecture
In the process of creation, technology-driven architecture is more controlled by the architects than imagination-driven architecture, because of a more comprehensive view into space-time and the possibility of simulating seemingly infinite structural and formal variables. However, the question of a human being’s perception caught in the trap of too intelligent and too complicated spatial settings of life is still open.
The new digital technologies initiated at the end of the twentieth century have a great influence on design strategies. The most important advantages of architectural computer programs lie in the control and optimization of parameters of structural effectiveness, in the fusion of architectural, technological and environmental systems, in the coding and decoding of the architectural language, and in generating a variety of innovative design strategies.
There have also been misassumptions from the beginning of this process: one can point out the uncritical submission to the dictates of digital technologies and the lack of discussion about all the consequences of their use for human beings and the built environment. In a sense, 3D-CAD software unifies architectural forms and intensifies the diffusion of an international style all around the world, with no respect for the architectural heritage of different cultures. The procedures contained in architectural programs, especially those related to the structural and energy optimization of architectural forms, are treated as objective, but architects, as a matter of fact, cannot prove their correctness. Concepts of architectural structures are dependent on software reliability. Generally, the fascination with digital technologies causes the domination of rational and logical over irrational and inexplicable thinking.
A critical attitude to the advantages of the digital revolution is becoming a challenge in architecture today. It is important to develop ideas which are more open to intuitive, subjective, metaphysical or fuzzy ways of imagining and thinking, to simulate life processes, to oppose structural and formal unification, to apply different cultural codes and architectural patterns, and to express human beings’ dreams in architectural design.
Or maybe forcing the intellectual power of architectural structures really will extend human perception, and the architecture of the future will express the dreams of hybrid humans.
Bibliography
Barney, M., 2005. Art now: Matthew Barney. In: U. Grosenick, B. Riemschneider, eds. Art Now – Artists
of the Rise of the New Millennium. London, Taschen, pp. 28-31.
Brayer, M.A., Migayrou, F. eds., 2001. ArchiLab – Radical Experiments in Global Architecture. Orléans:
Thames & Hudson.
Brayer, M.A., Simonot, B. eds., 2002. ArchiLab’s Future House: Radical Experiments in Living Space.
Orléans: Thames & Hudson.
Dollens, D., 2005. Digital-Botanic Architecture: D-B-A. Santa Fe, New York, Barcelona: Lumen Books.
Electroland, Enteractive, available at <http://www.electroland.net/flash.php> [Accessed 21 May
2007].
Kolarevic, B., Klinger, K. eds., 2008. Manufacturing Material Effects: Rethinking Design and Making
in Architecture. New York, London: Routledge, Taylor & Francis Group.
Kwiatkowska, A., 2003. Trans-formation in the age of virtuality. In: R. Kronenburg, ed. Transportable
Environments 2. London, Spon Press, pp. 32-41.
Kwiatkowska, A., 2007. Simulation games with the architectural forms. In: Y.T. Leong, G.E.Lasker,
eds. Architecture, Engineering and Construction of Built Environments. Proceedings of the 19th International Conference on Systems Research, Informatics and Cybernetics, Baden-Baden. Tecumseh,
Ont.: The International Institute for Advanced Studies in Systems Research and Cybernetics, vol.2,
pp. 4-9.
Liu, Y.T. ed., 2007. Distinguishing Digital Architecture: 6th Far Eastern International Digital Architectural
Design Award. Basel, Boston, Berlin: Birkhäuser.
Neto, E., 2005. Art now: Ernesto Neto. In: U. Grosenick, B. Riemschneider, eds. Art Now – Artists of
the Rise of the New Millennium. London, Taschen, pp. 216-219.
Novak, M., 1995. Transmitting architecture: transTerraFirma/TidsvagNoll v2.0. Architects in Cyberspace, Architectural Design, no. 118, pp. 43-47.
Oosterhuis, K., 1999. Trans_Ports 2001. Hypersurface Architecture II, Architectural Design, no. 9-10,
pp. 87-89.
Smith, C., Topham, S., 2002. Xtreme Houses. Munich, London, New York: Prestel Verlag.
Spiller, N. ed., 2002. Cyber_Reader. London: Phaidon.
Spiller, N., 2008. Digital Architecture Now: A Global Survey of Emerging Talent. London: Thames &
Hudson Ltd.
Stelarc, 1995. Towards the post-human. Architects in Cyberspace, Architectural Design, no.118, pp.
91-96.
Teyssot, G., 2005. Hybrid architecture: An environment for the prosthetic body. Convergence, vol.11,
no.4, pp. 72-84.
Turney, J., 2001. Ślady Frankensteina: Nauka, genetyka i kultura masowa (orig. Frankenstein’s Footsteps: Science, Genetics and Popular Culture). Translated from English by M. Wisniewska. Warszawa: PIW.
Zellner, P., 1999. Hybrid Space: New Forms in Digital Architecture. London: Thames & Hudson.
Pau Pedragosa
Department of Theory and History of Architecture
Escola Tècnica Superior d’Arquitectura de Barcelona (ETSAB)
Universitat Politècnica de Catalunya (UPC)
Spain
Immaterial Culture
in the Age
of Information Technology
“One has to become cybernetic in order to continue being a humanist.”
(P. Sloterdijk)
Reactionary positions that limit themselves to protest and that forgo any further attempt at intellectual discourse are commonplace nowadays. The reason for this attitude is the awareness that classic humanism seems to have become more and more
exhausted; the humanist is humiliated and on the defensive. Nevertheless, the necessary and urgent vindication of humanism can only be achieved through technological
modernity.
In this paper I propose a philosophical reflection on the fundamental shift that
took place from a culture and an architecture founded on the values of humanism,
a culture based on the craft technique, which materialises the eternal ideas of the human being held since Antiquity, to another directed by the radically new values of
the information society of machines, a society based on the post-industrial or cybernetic technology of automatic production, diffusion and archiving of information. The
shift from one culture to the other—from one epoch to another—was made possible
through the transitional modern culture that was based on industrial technology. This
article is based on information theory and its cultural interpretation as developed by
Vilém Flusser, and on phenomenology and authors related to it like Hans Blumenberg.
I draw upon some of the key aspects of Vilém Flusser’s work to establish the chronological periods —craft-based, industrial and post-industrial periods— according to
the different historical techniques interpreted in relation to the corresponding concepts of matter and form —“form” being the root of the word “information”. We will
see how history is the process of increasing prioritisation of form over matter that
leads to the current culture of in-formation, characterised as an immaterial culture.
Immateriality refers to the appearance of a surprising type of object, without historical precedent, characterised by being an indifferent material support of highly relevant information. This situation can be characterised as the dematerialisation of the previous material culture, coherently defined by Heidegger as a culture of homelessness. In this sense, we can legitimately talk of immaterial or homeless architecture (or culture in general). This type of pure in-formation culture gives rise to the struggle between the human freedom that projects a form and the automatic programme that only produces redundancy, between artistic creation and technification, between freedom and necessity.
From phenomenological criticism and the concept of the life-world we can see that the most important problem that technological architecture presents is that it rejects the questions that relate to its meaning, to its physical and historical place, to the justification of its existence, because its ultimate justification is that it works. Technological architecture presents itself as something obvious in its automatism and redundancy; it does not provoke any questions, and it silences all questions except those of its better optimisation – questions such as whether it is or is not necessary, whether it is meaningful, whether it is worthy, etc. It is necessary, therefore, to create spaces where such questions can be asked.
It is necessary to reflect on the classic questions of mankind but translated and
transposed to the new technological context that we now inhabit and in which architecture is produced and built.
Theoretical Background
Matter and form
Aristotle defined the ability of human beings to mould their environment by virtue
of their strength and skill with the concept of téchne. The term “téchne”, translated
into Latin as “ars”, designated much more than that which is nowadays understood by
“technique”: it comprised all human skills employed in the creation of works or in the
shaping of reality, and it embraced both the artistic and the artificial, two terms that
are nowadays clearly differentiated (Blumenberg, H. 1999, p. 73). For the following two
thousand years after its formulation, the definition of the relation between nature and
technique given by Aristotle has been considered valid: technique (or art) is an imitation of nature. According to Aristotle, technique consisted in completing and imitating
nature (Aristotle. Física. Libro II. 8. 199a 15-17.) Aristotle says that a man who builds a
house does only what nature itself would do if it built houses (Aristotle. Física. 199a
12-15.) Nature and art, or technique, are equivalent in structure: the characteristics
of the artificial environment are the same as those of the natural environment. They
obey the same laws. How is this structure that is common both to nature and to human artifice characterised? The structure is composed of two basic elements: matter
and form.
The term “matter” is the result of the Latin translation of the Greek concept “húle”, which originally meant “wood”. The opposite concept is “form”, in Greek “morphé”. The two concepts are opposite but inseparable. The oppositional pairing of matter-form is synonymous with the conceptual pairing (also Greek) of potentiality (dúnamis)-actuality (enérgeia) (Aristotle. Física. Libro III, 200b 12-201b 15.), and both stress the contraposition between absence (the invisible, the hidden) and presence (the visible, the uncovered). The Greek concept of “truth” (alethéia) means “to un-cover” (Heidegger, M. 1993, p. 33), to make present that which is hidden, to bring to light or to bring forth, to give form (give presence) to matter (invisible), or to in-form the matter. Matter in a pure state does not exist; it always possesses some sort of form. In our environment we find materialised forms or formed matter, that is to say, we are surrounded by objects (form + matter) made of determined materials and with determined forms. The action of informing, of giving form to matter, is called téchne (Heidegger, M. 2000, p. 16.)
Novelty and repetition: the information theory
Culture is a type of information: the information that has been learned, archived and transmitted between human beings from generation to generation (Mosterín, J. 2009, p. 15). Material objects are carriers of information by virtue of their form. The information itself is immaterial, carried by the form of the material objects that transmit it. Various material supports can have the same form: different units of the same car model, different issues of the same edition of a book or various discs with the same music recorded (Mosterín, J. 2009, p. 18).
In the book The Mathematical Theory of Communication (1949), Claude Shannon and Warren Weaver developed a highly influential theory of information (Taylor, M.C. 2001, p. 107). In this theory, the concept “information” differs from its habitual use as “meaning”. In the strict sense, “information” is defined as the inverse of “probability” (Taylor, M.C. 2001, p. 109): the more probable something is (a material object or a message), the less information it transmits. Information is therefore identified with the
improbable: the surprise, the novelty, the difference. Information must be sufficiently
different (surprising) as to transmit something new, but not so different as to be completely unrecognisable. Information is found, therefore, between the insufficient difference (that which is totally undifferentiated, homogeneous, always the same) and
the excess of difference (that which is totally other, unique). Both the insufficiency and
the excess of difference result in the emission of noise. The information emerges from
the noise through the articulation of differences.
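Although the chapter does not spell out the formula, Shannon’s standard definition makes “the inverse of probability” precise: the information content (or surprisal) of a message is the negative logarithm of its probability, so the more probable (redundant) a message is, the less information it carries.

```latex
% Shannon's information content (surprisal) of a message x with probability p(x):
% the more probable the message, the less information it carries.
I(x) = \log_2 \frac{1}{p(x)} = -\log_2 p(x) \quad \text{(measured in bits)}
```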
When téchne, the technique, transforms an object, it converts it into another one,
it introduces a difference, a novelty with respect to its previous state. This change consists in the transition from how an object is to how it can be, in the realisation of a potentiality or a possibility of the object. This power to be different that we impose on the
objects through the technique is the new form that we give to them. A built, manufactured or produced object is an informed object. The sum of the informed objects,
that is to say, the sum of the new objects, is culture (Flusser, V. 2001, pp. 144, 145).
There are different ways of giving form to matter, of producing objects, according to
the different techniques that have occurred historically. In short, there are different
types of culture: craft-based, industrial and post-industrial culture.
Craft-Based, Industrial and Post-Industrial Techniques
Craft-based technique
Everything started with the hands. With these complex biological instruments, the
first human beings gave shape to the world around them. But human beings have the
capacity to go beyond their natural biological conditioning, and this resulted in the invention of tools as the extension of the hand. Tools were invented because they could
be used more quickly and effectively in nature compared to bare hands. Tools are the
extension and imitation of the hands. Craftsmen give form to matter using tools, for
example, wood in the form of a table. How is this gesture to be interpreted? From
Plato onwards and throughout the Middle Ages it is thought that forms are hidden
behind appearances and they can only be captured through theoretic sight. In fact,
“morphé” (form) is a term usually used by Greek philosophers as a synonym of “eidos”
or “idéa” (Blumenberg, H. 1999, p. 83.) One of these forms is the form of the table, the
perfect table (the idea of table). The craftsman “sees” this form, and the criterion used to assess the worth of a particular table is its degree of approximation to the perfect
form “table”.
According to Plato’s theory of ideas, when a craftsman builds a table, he copies
the idea of the table that has always existed: imitating nature means reproducing the
idea of table. In this manner, all possibility of novelty is denied: everything already exists beforehand. Nature cannot be enriched by the work of mankind (Blumenberg, H.
1999, p. 88). When the craftsman builds an object, he does so guided by the forms
that govern the totality of nature, to which the craftsman himself belongs. The craftsman does what nature would do rather than what he himself would like to do. The
ancient belief that the work of the human is the imitation of nature is based on the
understanding of nature as an “ideally preformed” totality, the best possible totality,
so it would be foolish to think of improving it or changing it: one can only desire that
which already exists (Blumenberg, H. 1999, p. 90.) The work is not a means towards
the self-realisation and self-knowledge of man. As we will see later on, when the conception of human artifice as imitation of nature is finally superseded in the modern
age, it must also be understood as a liberation.
In the Renaissance a revolution occurs (Flusser, V. 2001, pp. 146, 147. Flusser, V.
2002b, pp. 85, 86): in the workshops, craftsmen pass on their knowledge to their apprentices, for example, of how to make tables. The transmission of this knowledge
occurs through praxis: the apprentice learns to make tables, and the more tables he
makes, the better his technique becomes. The forms of the objects are no longer ideal
models to be copied and become models that can be progressively improved through
the work. This marked a radical change in the conception of the world that opened
the doors to modernity. Modernity affirms that forms are “flexible” models that can
be progressively improved, rather than eternal ideas that must be materialised. The
concept of theory changes: it is no longer just the contemplation of ideas (ancient and
medieval notion of knowledge) and its imitation in matter, but instead becomes the
experience and construction of forms. Ideas are no longer revealed to theoretic sight
or to faith, but are contrasted with experience or with scientific experiment. For example, the form “table” is improved by knowledge of materials; a table made of wood
is not the same as a table made of stone: the constructive properties of the material
condition the form. This knowledge of the materials is empirical, the fruit of experience and experiment. There is no longer a pure theory but rather an empirical science.
There is no perfect table, or perfect society or perfect human being. Everything must
be improved by work (Flusser, V. 2002b, p. 86.)
Therefore, the craftsman’s work improves continuously generation after generation, becoming more professional and specialised. When primary importance
is placed on the work (the struggle with materials), the work itself as a production
process is scientifically analysed in order to maximise its productive capacity, resulting in the machine (in 1911 Frederick Winslow Taylor published Principles of Scientific
Management where he describes how the work process can be divided into separate
movements that can be regulated with precision) (Taylor, M.C. 2001, p. 7). This is the
industrial revolution.
Industrial technique
The next important stage after the invention of tools (craft-based culture) was the irruption of machines (industrial culture). What is the relation between matter and form in industrial objects? The industrial revolution brings about the following situation: it is the machines that do the work (Flusser, V. 2002b, p. 86.) First the craftsman materialised an ideal form, then the form was improved through successive materialisations (apprenticeship in the workshop), and now the machine “prints” forms onto the objects. Industrial work is divided into two phases (Flusser, V. 2001, pp. 147-149. 2002b, p. 87): the first, which we can call soft, human or semi-artisanal, consists in the design of a form; the second is hard, mechanical or automatic, in which the form designed by a human being is reproduced mechanically by a machine.
In the first phase forms are designed according to the machine’s capabilities; simple and geometric forms, without mouldings, so that they can be reproduced mechanically. And so emerges the aesthetic of the machine and the criticism of ornamentation that defines the artistic and architectonic avant-garde of the period between
the wars. In the second phase, the industrial one, the machine repeats set movements, which have been scientifically calculated and programmed, to produce a large
quantity of identical objects. If the initial design is an original model, a prototype or a
standard, the objects produced by the machine are identical copies of the same prototype. The copies of the same model are interchangeable, but the models are unique
(Flusser, V. 2001, p. 148).
The most preeminent characteristic of the industrial society is the mass production of identical objects. Industrialisation produces a highly stereotyped and redundant mass culture. To a large extent this is the reason why modern philosophy in the
first half of the 20th century defined itself by its disassociation from this banality and
vulgarisation: the Husserlian epoché or the authenticity and extreme and original situations of Heideggerian anguish and tedium. The consequence is that when a culture
becomes more stereotyped, interest shifts from the copies (redundancies) to the originals (novelty). The value is in the original design, in the new form of the object, and
not in its indefinite identical copies. The originality and novelty of the form is diluted
and “wasted” in its repeatability. When redundancy dominates the cultural panorama,
it becomes manifest that what interests us about objects is not their materiality, the
material support, but the form that is expressed in these different supports. When we
buy a table we buy the information that it contains by virtue of its original form. The
consequence of this is a separation between the form of the model and its materialisation in multiple objects, a difference that will be accentuated in post-industrial
production.
Post-industrial technique
The last phase is the current situation, called post-industrial. Post-industrial society is
characterised by a different type of machines that are not mechanical anymore, but
electronic, and that produce a different type of objects. The first post-industrial machine and the first post-industrial object are the photographic camera and the photograph (Flusser, V. 2001, p. 151). A photograph is a reproducible object, but it is reproduced in a different way than an industrial object (a table, a car, a pen). In the case of
photography, the model or prototype is the negative and the copies are its different
developments. An industrial object (e.g. a car) is mass produced and once it has worn
out it loses its value and we substitute it with another identical one (another copy of
the same model) or, more often, with a different one (a copy of another model). Post-industrial objects like photographs are, on the other hand, reproducible, in the sense
that we can develop the same photograph in different material supports without it
suffering any meaningful change. The photograph of a newspaper is not consumed
in the same way in which we consume a pen; the newspaper has a short life of just
one day, but not because the paper becomes worn out, but because the information
is substituted by other information (Flusser, V. 2001, p. 151). The difference between
the industrial and the post-industrial object consists in a difference in the relation
between matter and form. We can use the analogy of a printing press. Industrial machines are like imperfect printing presses: printing presses because they print a form
into matter (the table form into wood), but imperfect because the form is profoundly
linked to the matter of the object and cannot be separated (the form of the table cannot be separated from the wood without losing the object). The photographic technique, on the other hand, is like a perfect printing press: we print (develop) the form
(the negative) superficially on any material (from a newspaper to a t-shirt) in such a
way that it seems to “float” on the surface of the material and, therefore, we can separate it, that is to say, the image’s material support is relatively unimportant (Flusser, V.
2001, p. 151, 152).
From this point of view we can interpret industrial society as a transition from
craft-based society to post-industrial society, a transition from a society of originals,
from handicraft, to a society of information or of repeatable forms that glide superficially without a support: the information society (Flusser, V. 2001, p. 151). A post-industrial object is not exactly an object (matter + form), but pure information (form without matter). Post-industrial “objects” are the digital images and texts that nowadays we find everywhere: TV, internet, etc. Any design made with a computer is a post-industrial object: it is pure information that we save on the computer or on the “cloud”,
that we print on paper or send via e-mail. In all these cases, the information is not altered by the change of the material support or hardware.
Rethinking the Human in the Technological Culture
Necessity and freedom
Now we can ask ourselves how we have come to arrive at this state of affairs. A possible answer might be: because objects are not good memories (Flusser, V. 2001, p. 157).
Objects (information linked to matter) are consumed until they are worn out and then
become “misinformed”, redundant. This creates two problems: 1. When the object becomes worn out, the material has to be thrown away, leading to the problem of object
accumulation and the need for recycling (giving it a new form, making it the support
for other information). 2. If they are not yet worn out, we keep the cultural objects in
museums (archives of historical memory), but they occupy a lot of space. On the other
hand, pure information is archived in the vast virtual warehouse of the internet, the
museum of the future, where it can be manipulated and transformed constantly. The
information is created, disseminated and stored in a virtual space that is nowhere
and everywhere at the same time.
The information is intangible (because it is immaterial) but accessible everywhere. The human being defines himself less and less by the culture and the material world of the humanist tradition, and more and more by the relations that he maintains with others, with whom he exchanges information. What is important is intersubjectivity, dialogue (Flusser, V. 2001, p. 159). Each one of us is a node in an information net in which we receive, manipulate and send information.
From the point of view of the information theory that we have outlined at the beginning we can try to assess the current situation (cfr. Flusser, V. 2001, pp. 153-157). If
we define repetition in the strict sense as something that must always happen, something that cannot be in any other way and, therefore, that prevents the appearance of
novelty as something unexpected, then, “repetition” is equivalent to “necessity”. As
we have seen, this is the situation in the ancient and medieval world shaped by craft
technique. If information is novelty, we can refer to it by the term “freedom”. Freedom
is possibility, that which permits a break in the necessary course of things by introducing something unexpected. Freedom includes the possibility of breaking the law,
the norm, the normality. And so, the information society supposes the triumph of freedom over necessity, the triumph over death (all information is archived and can be
reactivated), the triumph of form over matter, the triumph of the power to be over actual being. If, moreover, as we have said, information is available anywhere in the virtual space, this means that it is nowhere and everywhere. Isn’t this precisely the definition of u-topia (that which has no place, topos)? This is a possible interpretation albeit
a unilateral one (and obviously naive in its excessive optimism), as it defines human
nature exclusively as being free, and, moreover, it presupposes that only technology
allows the full realisation of freedom.
There is also the other side of the coin, the alternative interpretation according to
which the current situation does not result in the triumph of freedom, but in the
submission of the human being to the automatism of machines. The reverse of this
situation becomes clearly manifest when we answer the question: How do we produce information? Through software programmes, that is to say, through automatic
programmes. “Programme” and “automatism” are terms opposed to that of “freedom”
(Flusser, V. 2001, p. 153). When we design we do not do whatever we want, but only
what the program allows us to do. Automation can already be seen in our gestures:
everything is reduced to the movement of the fingers over the keyboard or the
“mouse”. If we imagine that we are observing a craftsman in his workshop and a worker seated in front of the computer, the contrast between their bodily gestures alone
might lead us to suspect a loss of freedom rather than greater freedom. The craftsman moves his entire body in the space of the workshop, as the material requires different gestures, different exertions; the computer only requires the movement of the
fingers. The danger is, therefore, that the programme that allows us to design such
“free” forms is at the same time programming our gestures, it automates them, as the
industrial assembly line had done previously (cfr. Flusser, V. 2002a, pp. 51-58).
With both interpretations we are faced with the paradox of freedom and necessity
that encloses the concept “technology”. On the one hand, technology frees us from nature (materiality, our body, what conditions us, definitiveness, death), but, on the other
hand, it submits us to another necessity, that of the automatism, the programme. On
which side do we place technology: on that of freedom (understanding it, therefore, as
opposed to nature, from which it frees us) or on the side of necessity (understanding it
as an automated programme)? Phenomenology produces a displacement of meanings
that allows us to understand the paradox of technology in another way.
Architecture: technology, art and the life-world
We normally associate nature with necessity and technology with freedom. Phenomenology brings about a shift of meanings that approximates the technological and
the natural and distinguishes them from the meaning of freedom. One of the key and
most fruitful concepts of phenomenology is that of the life-world (Lebenswelt) or, as Husserl also calls it, the natural world, terms with which he refers to the world of everyday
life. The life-world is both the world of nature and the artificial world of technology
that, to us, has already become second nature. This world is not defined by the types
of objects of which it is composed, but by our attitude towards them, by the relation
that we have with the things that surround us, whether they are artificial or natural
(Blumenberg, H. 1999, pp. 41, 55). The life-world or the natural world is the world
characterised by what is obvious and familiar (Blumenberg, H. 1999, p. 48), by what
is presupposed, that is to say, the contrary of information, of difference, the unexpected and the surprising. In this way, our natural attitude in the world of everyday life is
identified with an attitude that is technical, automated, thoughtless and unconscious.
If a technological attitude towards things is an attitude that moves in the field of what
is obvious, art, on the other hand, can be understood as the dismantlement of things
that are considered obvious.
The technical relation with the world is a characteristic of our time. An important reason for this situation is the immense quantity of knowledge accumulated throughout history, which we are able to access (the internet is the clearest example of
this availability of knowledge). When knowledge has accumulated to the point where
it is beyond the capacity of a person to acquire it all within their lifetime, then it is
transmitted in such a way that although it may not be fully assimilated, it will still be
functional, profitable and effective. That is to say, we have a technical or instrumental
knowledge that we can use but which we cannot question. When we use knowledge
as an instrument, its meaning becomes lost: we know what we are doing and how
to do it but not why we have to do it. The technification of our knowledge of things
therefore becomes an active ignorance (Blumenberg, H. 1999, p. 56).
A clear example of the technical attitude is offered by the computer, or any technological apparatus, considered as a black box: an object that we know how to use but whose workings and construction we do not really understand. We press a button and this triggers a series of effects. What is important is that the effect is available to be produced. Behind each computer there is a long history of human discoveries, although all of this remains hidden behind its ease of use; indeed, it would be a bad technical object if it allowed us to see the entire history behind it. The technological object rejects all questions related to its meaning; it presents itself in such a way that these questions do not flourish, especially those that refer to the justification of its existence (Blumenberg, H. 1999, pp. 59-60). Its ultimate justification is that it works.
In this way, the technological object presents itself as something obvious, that is to say, it does not provoke any questions; it silences all questions other than those of its further optimisation, such as whether it is or is not necessary, whether it is meaningful, whether it is worthwhile, etc. Artificial reality makes its home in the life-world, understood as the place of the obvious, the familiar and the redundant. But the very same artificial reality can distance itself from the life-world and create estrangement, difference and surprise. What is necessary, therefore, is to create spaces where questions can be asked: a space opened by artistic creativity that, obviously, in our times is realised through technological means.
Architecture, as an object that is simultaneously technological and artistic, is able to open one such space when it is not considered merely as something functional and efficient. Architecture can show, apart from its functionality, its meaning. To the extent that this can be achieved, we can go beyond the simple technification of things or the natural way in which we understand them.
Bibliography
Aristotle. 1994. Física. Madrid: Gredos.
Blumenberg, H. 1999. Las realidades en que vivimos. Barcelona: Ediciones Paidós.
Flusser, V. 2001. Una filosofía de la fotografía. Madrid: Síntesis.
Flusser, V. 2002a. Filosofía del diseño. Madrid: Síntesis.
Flusser, V. 2002b. Writings (Andreas Ströhl, ed.). Minneapolis: University of Minnesota Press.
Heidegger, M. 1977. Gesamtausgabe. Band 2. Sein und Zeit. Frankfurt am Main: Vittorio Klostermann GmbH.
Heidegger, M. 2000. Gesamtausgabe. Band 7. Vorträge und Aufsätze. “Die Frage nach der Technik“.
“Bauen Wohnen Denken“. Frankfurt am Main: Vittorio Klostermann GmbH.
Mosterín, J. 2009. La cultura humana. Madrid: Espasa.
Mulder, A. 2004. Understanding Media Theory. Rotterdam: V2_/Nai Publishers.
Taylor, M. C. 2001. The Moment of Complexity. Emerging Network Culture. Chicago: The University
of Chicago Press.
Yannis Zavoleas
School of Architecture
University of Patras
Greece
House-as-Machine:
the Influences of Technology
during early Modernism
By focusing on the concept of the house-as-machine, the machine's early adaptations to the house are examined in relation to the socio-cultural context of early modernism. A set of variations of the all-electric house developed in Europe and the US is presented, along with the cultural differences between these two geographic areas. Meanwhile, in response to the economic crisis of the Thirties, the house-as-machine was adapted into case studies on the minimum dwelling [Existenzminimum]. Rooms with a specific mechanical function, such as the kitchen and the bathroom, became symbols of economy and competence. Additionally, the machine became a reference for a more systematic understanding of architectural design. Ernst Neufert's book Architect's Data, first published in 1936, is a comprehensive guide to the integration of the machine's characteristics into architecture, in which the house is given a prominent place. Le Corbusier's Dom-ino House and Citrohan House are concise manifestations of the architect's related interest. Upon comparison, it is claimed that the designs are not always automatic translations of the machine into architectural space; rather, in such cases they carry contradictory qualities.
House-as-machine
In early modernism, the machine was appointed as a multifaceted reference for the definition of the modern house. The related research focused on the technological evolution of domestic appliances, on efficient and ergonomic design, on the development of new typologies for the house unit and the housing complex, on the standardization of construction through prefabrication and the building industry, and on new materials and highly sophisticated solutions and techniques on the construction site. These studies were framed by a rising interest in building development, including its market. In the present paper, the machine's influences on the house are organized according to an evolutionary schema. First, technology is viewed as a support to domesticity. Then, ergonomics and functionality are discussed in relation to the design of the house. Finally, the machine is examined as a model of organization in the modern house, with respect to architectural space and construction. This set of references to the machine represents the spectrum of metaphors between technology and architecture that has been established since modernism and persists to the present.
The support of domesticity by technological means was made possible by a variety of innovations in electrical appliances, whose aim was to modernize common household operations. One of the first technologically equipped houses was presented at the IV Triennial Exhibition of Monza, Italy, in 1930 (Fig. 1, 2). The Electric House [La Casa Elettrica], designed in Milan by Luigi Figini, Guido Frette, Adalberto Libera and Gino Pollini of Gruppo 7, together with Piero Bottoni, is an exemplary case of the machine-house, offering a variety of typologies with potential for mass production. Apart from the desire to incorporate a number of appliances in the household, the proposal demonstrates a new architectural view of technique, manufacturing and the efficient arrangement of space, expressive of modern life. Technology was present in the form of new building materials and techniques, as well as in contemporary inquiries into the problem of dwelling, mainly through the rearrangement of the plan. There are apparent differences between the Electric House and the traditional Italian house. The traditional ways of living were closely attached to the Mediterranean climate, and so it was difficult to adapt
the new technological achievements, the international standards and the modern aesthetics to what had existed before.

Fig. 1 & 2
The Electric House [La Casa Elettrica], designed in Milan in 1930 by Luigi Figini, Guido Frette, Adalberto Libera and Gino Pollini of Gruppo 7, with Piero Bottoni.

The Electric House proposes a dialectical relationship between new technologies and the special character of traditional life. Accordingly, special attention was given to uniting the interior with the exterior through wide
openings, to placing greenery in the interior and to positioning semi-open spaces within the building's volume. Overall, the Electric House demonstrates a mechanical logic responding in various ways to a set of contemporary challenges concerning space organization, advanced manufacturing and construction methods, high-tech equipment and a wholly new approach to aesthetics.
During the interwar period, companies such as the General Electric Company and the Westinghouse Electric & Manufacturing Company promoted a series of all-electric houses built across the USA, aiming to conquer the related market. In 1933, at the Chicago Century of Progress exhibition featuring fully electrified houses, the General Electric Company introduced the Talking Kitchen as follows (Fig. 3):
“‘There are no attendants in this kitchen,
but ... a voice from an unseen source
announces that this is the last word in
kitchen equipment. As if by magic the
door of the electric refrigerator opens
and the voice, coming apparently from
the refrigerator, relates how the refrigerator saves money for the owner. Then a
spotlight falls on the electric range, the
oven door lowers,’ and a voice explained
its operation, and so on through the rest
of the appliances. By eliminating the attendant, the company seemed to say
that the kitchen worked by itself” (1933,
as cited in Nye, 1990, pp. 357-358).
Fig. 3
The Talking Kitchen, designed by the General Electric Company in 1933.
Interest in the matter grew rapidly, and in 1935 the General Electric Company, in cooperation with the Federal Housing Administration, sponsored the Home of the Future architectural competition. The Westinghouse Electric & Manufacturing Company's dynamic response was to promote its first All-Electric Home in Mansfield, Ohio, in 1934 and later, in 1939, to publish a proposal for The Electric Home of the Future (Fig. 4) in Popular Mechanics magazine, asserting full automation of every function and maximized practicality in design. More ideas followed in subsequent years, and in 1949 Science Illustrated magazine presented an advanced version of the Electric House (Fig. 5) which, apart from automation (using equipment from Touch Plate, Square D and General Electric Company), was equipped with a central control system operating through a wireless network, with the prospect that a large part of the market would embrace these technological advancements within ten years (Science Illustrated, May 1949, pp. 66-69).
Taken together, the various companies' views on the house-as-machine are domestic expressions of the "technological sublime" (Nye, 1990, pp. 359-360). Ideally, domestic activities would be programmed so that the machine-house functioned as a sort of self-deciding mechanism controlling all aspects of everyday life. However, the proposed relationship between dweller and technology was not accepted by the public without resistance; rather, the consumer felt a loss of control in his own domain, a probable reason for its initial marketing failure. In retrospect, the venture had much more complex implications. The psychosocial significance of the house as a symbol of tradition and of customary habits, transferred essentially unchanged from one generation to the next, had to be replaced by a modern view of the house as the center of technological advancement. This required a long campaign, so that the resident/consumer's hesitation over the invasion of technology into the household was eventually alleviated.
Fig. 4
The Electric Home of the Future, proposed by the Westinghouse Electric & Manufacturing Company and published in Popular Mechanics magazine in 1939.
Fig. 5
A technologically advanced example of The Electric House, published in Science Illustrated magazine
in 1949, using equipment from Touch Plate, Square D. and General Electric Company.
As an example, in the article on The Electric Home of the Future, a three-dimensional artistic image depicts the living standards of tomorrow: a highly equipped kitchen adds to the domestic coziness, united with the extended living room, where the children play happily and the husband reads his newspaper while the wife/mother serves coffee. Advertising images of similar content gradually altered the consciousness of the consuming public, and technology came to be associated with intimacy, privacy, control, individuality and character. This friendly and technologically advanced domestic environment would be expressive of new values such as utility, luxury and sophisticated style, or a combination of them. From then on, the modern view of dwelling would be of a stylishly equipped household functioning tirelessly, with safety, comfort and economy, thereby providing more spare time for relaxation and wellness (Ross, 1996, p. 89).
Over the years, the industry and the market for domestic appliances have proved very successful at meeting these updated standards. Designing the machine-house of the future would be about the seamless integration of all sorts of appliances into a unified whole. Technology would serve every domestic activity, while its presence, refined, contemporary, modern, simple and friendly, suggested a sense of progress towards a better future. Technological evolution has continued since then, supported by research in engineering, industrial design, electronics and computer science, likewise with the objective of capturing the market. Seen from this perspective, the machine-house is better understood as a case study for a new lifestyle, in which technological innovation responds to daily needs, even artificial ones.
In architecture, the machine-house would imply a total redesign of the house as an "apparatus for living." For that purpose, it was first important to identify the everyday activities and to offer an updated understanding of them; then, to address these activities with regard to their spatial relevance and the relationships among them, in order to outline the new design principles. This new trend is reflected in the article The House that Works, published in Fortune magazine in 1935: "a building, whether it be a dwelling or a factory or a post office, is a tool. That is to say, the house is an instrument fabricated for a purpose" (Fortune 12, Oct. 1935, pp. 59-65, 94, as cited in Nye, 1990, p. 359). The view of the machine-house as an instrument refers to all of the functions assigned to the household, with a generalized intent of maximized efficiency. Such requirements highlight the importance attached to the relationship between a task and the time needed for its execution. The aim of optimal time management is typical of modern life. Richard Buckminster Fuller describes time as a new dimension that must be taken fully into account in the design of the house. Time-saving is accomplished by the segregation of functions "being individually solved" (Fuller, 1928), which translates into a dominant philosophy often requiring specialized design and technological support. Overall, the machine-house responds to the idea of a house that "performs" as effectively as possible, under the premise of maximum functionality.
In Response to the Economic Crisis
The Minimum Dwelling
The definition of functionality in the house was influenced by broader social factors, even by historical upheavals. For example, in the early thirties, functionality was
viewed as a possible response to the economic crisis. Architecture’s socially driven
mission was to meet the living needs of the less privileged layers of the population. Thus,
functionality would be translated into efficiency in space utilization. A growing interest in the Existenzminimum, the house of minimum size, would come to represent a new culture of everyday life in Europe (Heynen, 1999, p. 41). The Existenzminimum was part of a broader survey under the name Das Neue Frankfurt, first presented in 1926 and then published in 1929 for the 2nd CIAM, held in Frankfurt, whose theme was The Minimum Subsistence Dwelling [Die Wohnung für das Existenzminimum] (Heynen, 1999, pp. 43-8 and Mumford, 1999, pp. 33-4) (Fig. 6, 7). The study aimed to set minimum standards of quality and comfort for a house unit, then to incorporate this unit into housing complexes and so to make up entire neighbourhoods, all following the same principle. It took place during the Twenties in Central Europe and especially in Germany, where many housing estates known as Siedlungen were built. Hilde Heynen explains how, in the short period from 1927 to 1931, due to the economic crisis, the objectives of the design research on the minimum dwelling were in total harmony with the qualities of the machine (Heynen, 1999, pp. 48-63). Indeed, the dimensions of the constituents of a house unit and the relationships between them would have to assert efficiency. The same intention is evident in the functional definition of the rooms typical of domestic living, such as the living room and the bedroom, and also in the effort to minimize supportive areas such as the entrance hall and the corridors.
Fig. 6
Poster of the exhibition Die Wohnung für das Existenzminimum, held in Frankfurt in 1929.
Fig. 7
Design by Grete Schütte-Lihotzky and Wilhelm Schütte, standard worker's apartment house in Frankfurt (2nd CIAM, 1929).
In particular, the machine model is applied in full in rooms with advanced mechanical requirements, such as the kitchen and the bathroom. Because of their engineering and technological specifications, these rooms were viewed as symbols of the mechanical operations in the domestic environment, and they were often given a hi-tech aesthetic, as in the scientific laboratory. A typical example of this is The Frankfurt Kitchen [Die Frankfurter Küche] (Fig. 8), designed by Grete Schütte-Lihotzky in 1926 to be installed in every household. The project was included in the extended survey of Das Neue Frankfurt on the minimum dwelling. As Schütte-Lihotzky explains,
“we regarded the kitchen as a kind
of laboratory, which, because so
much time would be spent there,
nevertheless had to be ‘homey.’
The time required to carry out the
various functions was measured
using a stopwatch ... in order to
arrive at an optimum, ergonomic
organization of the space. ... The
cost savings resulting from the reduced size of the kitchen remained
significant, however, so that the
Frankfurt Kitchen offered the double advantage of lower construction costs and less work for the occupants. Only by arguing in these
terms, was it possible to persuade
the Frankfurt city council to agree
to the installation of the kitchens,
with all their sophisticated worksaving features. The result was
that, from 1926 to 1930, no municipal apartment could be built
without The Frankfurt Kitchen”
(Schütte-Lihotzky, n.d.).
Fig. 8
The Frankfurt Kitchen [Die Frankfurter Küche], designed by Grete Schütte-Lihotzky in 1926.
In the above description, the conditions of efficiency and performance are evaluated through a compromise combining optimal work in the shortest time at the lowest cost. Functionality is directly related to economy and is implemented along with technological support for domestic activities, size minimization and industrialized fabrication using standardized measurements. In general, the machine's defining characteristics would be fixed as a set of principles outlining the new ideal of dwelling, and would then be transferred to a more systematic design of the house.

A focus on systematic design grew along with the development of ergonomic standards and guides describing human activities, and it was extended to the design of other building types too. For example, Ernst Neufert's handbook Architect's Data provides an essential reference for the design and planning of a building project, as a comprehensive collection of data on requirements, criteria, considerations of function and site adaptation. Neufert's book, originally published in Berlin in
1936 (under the title Bauentwurfslehre), has been a prominent contribution to the survey of efficient space design. Neufert addresses the subject using detailed tables and schematic explanations, so that every decision can be taken in response to fully measurable data, clearly specified descriptions and methodologically controlled design operations. In the same logic, the book provides detailed listings of every space type, function and activity, aiming to cover any possible case. Specifically for the house, the schematic proposals address each room in relation to its function, such as the living room, the bedroom, the auxiliary rooms, the circulation areas and the engine room, and even outdoor areas, the garden, the pool, the parking lot and so on. Additionally, a series of alternatives is presented, such as sites with different characteristics, cabins, villas, apartment buildings and housing complexes. The variations are exhaustively analyzed and accurately drawn, with specific dimensions for every description, including construction solutions, details, amenities, interior layout and furniture design.
The author extends the ideal of efficient design by suggesting an all-encompassing systemic approach for the house, as well as for any other functional program and building type. Additionally, he offers solutions concerning the structural behaviour of the building, as well as sustainability, energy efficiency and even aesthetics. In fact, the chapters "The Eye", "Man" and "Colour" are devoted to visual perception (Fig. 9). A set of comparative criteria is given for studying the aesthetic effects of various sizes, shapes and colours (Neufert, 36th ed., 2004, pp. 37-43). The chapter "Proportions" is devoted to the geometric construction of the Golden Section, which, in subsequent editions of the book, is compared to Le Corbusier's Modulor.

Fig. 9
Ernst Neufert, "The Eye" and "Proportions: Modulor", in Architect's Data, 36th edition (in Greek), 2004.

In Neufert's book, architectural design is treated as a decision-making process, whereby the building's space and the final form arise through firmly rationalized steps related to predefined imperatives and problems, leading towards their best resolution. Systemic approaches in architecture are notably rooted in this part of the modern tradition, in which the machine model, with its symbolic meanings and connotations, is infused in every design decision.
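As a brief aside on the proportion invoked here, and as a sketch rather than a reproduction of Neufert's or Le Corbusier's own tables, the relation can be stated compactly in standard notation:

\[
% Assumption: textbook definition of the Golden Section; not taken from Neufert's tables.
\frac{a+b}{a} = \frac{a}{b} = \varphi, \qquad \varphi = \frac{1+\sqrt{5}}{2} \approx 1.618 .
\]

On the common account assumed here, the Modulor generates its measures by repeatedly dividing a reference height of about 183 cm by φ (yielding roughly 113, 70 and 43 cm), so that Neufert's geometric construction and Le Corbusier's scale rest on the same proportional relation.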
Le Corbusier’s Trajectory on the machine-house
Since early modernism, Le Corbusier developed laborious studies on the machine-house, finally leading to a complete redefinition of the house. Initially, he focused on space organization with regard to efficiency and ergonomics; soon, however, he sought an entirely new approach, in which the machine-house aimed at establishing the design principles of modern architecture. Le Corbusier's architectural vision, presented in his influential book Toward an Architecture in 1923, is based on the machine model. In manifesto style, the architect/author compares the values of modern architecture to those by which a machine is built. His view of the house as a Machine for Living In [Machine à Habiter] (Le Corbusier, 1923) was revolutionary in the sense that contemporary dwelling is defined as a problem needing a rational solution. For Le Corbusier, the house is a central case: "To study the house for the common man, for 'all and sundry,' is to recover human foundations: the human scale, the typical need, the typical function, the typical emotion. There you have it! That's crucial, that's everything" (Le Corbusier, 1923). After attempting to pose "right at last" the problem of dwelling (as it was still pending, he states), Le Corbusier sets the daily necessities as the basis for a rational resolution. Then, he establishes a list of "conveniencies" upon which he composes a Housing Manual (Le Corbusier, 1923). In this comprehensive work, Le Corbusier records in detail the requirements of dwelling, mentioning the significance of every room and element in the house, including the terrace as a place for sunbathing, the outdoor space, the garage for the car, the bicycle and the motorcycle, the maid's room and the bathroom. The list goes on with the walls, the exercise equipment and furniture such as tables, chairs, storage units and drawers, the gramophone and the ventilating panes. The author responds to the problem of dwelling with a set of standards based on the logic of practicality, functionality, economy, prediction, regulating lines and the standardization of construction (Le Corbusier, 1923). The machine contributes to the solution as a model that directs, enhances and improves both the design of the house and the experience of dwelling.
Two years earlier, in 1921, Le Corbusier had presented the Citrohan House (Fig. 10) as a typical solution for the mass-produced house based on economy. The name invokes the French automaker Citroën, and with it Le Corbusier compares the house to a car. The house-as-car concept introduces two further principles: first, Le Corbusier prefers the term "equipment" [outillage] to "furniture;" second, he asserts the importance of mass production, affecting price, availability and speed of fabrication (Le Corbusier, 1923 and Banham, 1960, pp. 221, 233). The spatial arrangement responds to efficiency and practicality of function. Special attention is given to natural
lighting in each room with regard to orientation and occupation, as well as to the maid's room and the bathroom. The solutions given reflect a long period of research and were based on the standardization of design, plan and construction, with the aim of keeping costs as low as possible.

Fig. 10
Le Corbusier, Citrohan House, 1921, and Dom-ino House, 1914.

The proposal brings together a combination of
adaptability of the general design to specific requirements, along with compatibility
of the construction elements and the possibility of fabrication and assembly on the
site. In 1914, Le Corbusier had already responded to these issues with the Dom-ino House, a framework structure that could be repeated while permitting great variety in the grouping
of the houses. The frame is made of elements of reinforced concrete. It supports the
walls (cavity walls with 20 cm voids enclosed by 3 cm skins made of cement sprayed
onto stretched sheet metal), the floor slabs and the strips of factory window frames, all to the same module. Most of the propositions of the Dom-ino House are present in the Citrohan House, which is highly innovative for its time in the way it readdresses the problem of dwelling under the premise of standardization.
With the Citrohan House, the machine model conveys a mechanical quality to the processes of design, as well as to construction and domestic living. First, new techniques are invented for the systematization of fabrication, along with the design of the prefabricated walls, the framing system and the foundations (Banham, 1960, p. 244). The structure exploits the potential of new materials and of reinforced concrete, as well as the advancements brought by building technology and industry, and so it proposes a new view of flexibility in regard to construction and spatial organization. Furthermore, the house meets the challenges of modern living as a carefully designed "utensil", subtle, compound and adjustable. The modern house's lifetime is defined by the period during which it satisfies its operational requirements; it may be repaired or even replaced, much as a machine, a car or a tool that no longer works, no longer fits or is no longer sufficient is thrown away (Banham, 1960, p. 241). The Citrohan House may be mass-produced as a system that is repeated both horizontally and vertically. It may also come in great variety,
responding to the individual’s requirements, social status and budget, from factory
housing to artists’ studios to villas and to mansions for the rich (Le Corbusier, 1923). As
Frederic Migayrou comments, “by linking the name Maison Citrohan to the field of automobile production, Le Corbusier had definitely caused a seminal break in the understanding of dwelling. The house became a true product, an object entirely organized
by a system of processes, finite, circumscribed and valued in terms of cost and return
on investment” (Migayrou, 2002, p. 18). The rationale of the machine model yields its
qualities to the produced object: the design of the house follows closely the emerging technological capabilities, meanwhile setting the aesthetic trends for the house
of the future. Le Corbusier’s extended survey on standardization includes the Village
of Pessac (1925). By taking advantage of the recent advancements in the industry of
prefabrication, new methods of standardization were applied for the production of architectural elements such as walls, floors, ceilings and beams. The whole village was
completed in less than a year at low cost, using reinforced concrete and prefabricated
components, also following preplanned methods of assembly.
In respect, Le Corbusier’s study on the machine-house would induce to an abandonment of the romantic idea of the traditional home that was expressive of the owner’s customs and personal style, promoting instead the values of the machine, along
with up-to-date requirements such as hygiene, comfort, utility and practicality. Soon,
similar issues would set the main guidelines for the design of the modern house, further establishing a whole new culture. Architecture’s main interest would be to supervise all of the individual parameters and the decisions along the processes of design
and construction. In this course, Le Corbusier went as far as to compare the machinehouse to a tool. Meanwhile, he treated architectural qualities and aesthetics as being
more permanent, reflecting the true values and the essential preconditions of life; in
so doing, he would ensure livingness and sustainability for the house and for architecture as well. Le Corbusier’s research trajectory manifests an evolutionary transition
towards a completely original conception about architecture, for which the machine
represents an integral model about space in regards to function, order of organization
and the modes of production.
Conclusion
In the course of modernism, the concept of the house-as-machine is reflected in a set of variations. Initially, it is identified with the plain support of domestic activities by technological means; then, it is extended to the ergonomic design of the house, in response to every possible design issue. Rooms with mechanical requirements favour a more integrated implementation of the machine, whereas rooms such as the living room and the bedroom are more flexible, as these may change over the house's lifetime. Studies on the house of that era would focus on a special kind of variability permitting transformations through standardization. The machine offered itself as a main reference for a radical revision of the design principles and a new philosophy of architecture. Compound structural models may offer more complex properties and may better respond to space's total behavior; in that case, the machine's qualities such as efficiency and performance may be combined with others such as openness, flexibility and adaptability.
References
Anon, The ‘Electric House’ at the IV Monza Triennale. In Domus 32, August 1930. Translated by G.
Ponti vol. I, Taschen, Cologne, 2006.
Anon., 1949. New Wiring Idea May Make the All-Electric House Come True. In Science Illustrated, May 1949, pp. 66-9. Available at: <http://blog.modernmechanix.com/2009/04/30/new-wiring-idea-may-make-the-all-electric-house-come-true> [Accessed 10 July 2010].
Banham, R, 1960. Theory and Design in the First Machine Age. Oxford: Architectural Press.
Buckminster Fuller, R, 1928. 4D Time Lock. 1972. Albuquerque, NM: Lama Foundation/Biotechnic
Press.
Heynen, H, 1999. Architecture and Modernity: a Critique. Cambridge MA: The MIT Press.
Le Corbusier, 1923. Vers une Architecture. Toward an Architecture. Translated by J. Goodman, 2007. Los Angeles: The Getty Research Institute.
Le Corbusier, 1967. Le Corbusier 1910-65. Boesiger, W. and Girsberger, H. eds, 1999. Basel, Boston and Berlin: Birkhäuser Publishers.
Migayrou, F, 2002. Particularities of the Minimum. In M.A.Brayer and B.Simonot, eds, 2002. Archilab’s
Futurehouse: Radical Experiments in Living Space. New York: Thames & Hudson.
Mumford, E, 1999. The CIAM Discourse on Urbanism 1928-1960. Cambridge MA: The MIT Press.
Neufert, E, 1936. Architect’s Data. 36th ed., 2004. Athens: Giourdas Publishing.
Nye, D, 1990. Electrifying America: Social Meanings of a New Technology. Cambridge: MIT Press.
Ross, K, 1996. Fast Cars, Clean Bodies. Decolonization and the Reordering of French Culture. Cambridge,
MA & London, England: October Books / The MIT Press.
Schütte-Lihotzky, M, 1980-90. Erinnerungen (Memories). Vienna, unpublished. Available at: <http://
www.mak.at/e/sammlung/studien/studiens_frakue_e.html> [Accessed 12 October 2010].
Claus Bech-Danielsen 1
Anne Beim 2
Charlotte Bundgaard 3
Ulrik Stylsvig Madsen 2
1 The Danish Building Research Institute (SBi)
2 RDAFA – School of Architecture
3 The School of Architecture Aarhus
Denmark
Tectonic Thinking
- Developing a Critical Strategy
for a Responsive
and Adaptive Architecture
This paper derives from a larger research project carried out across the three major Danish research institutions in the architectural field: The Royal Danish Academy of Fine Arts, Schools of Architecture, Design and Conservation; the Aarhus School of Architecture; and the Danish Building Research Institute, Aalborg University. The research project, funded by the Danish Research Council, focuses on tectonics, sustainability and building culture. Its overall purpose is to update tectonic theories in the light of current changes in our building culture and of the new challenges arising from the requirements of the transition to sustainability.
The research group is convinced that the architectonic quality of a building is closely related to its tectonic structure. The term tectonic is known from geology, where it describes an understanding of the original formation of mountains as well as current activities in the earth's crust, such as volcanic eruptions and earthquakes. From a tectonic perspective the planet is considered a living organism, through studies of the forces, relocations and movements that have occurred over time in the Earth's outer surface. From a sustainability perspective, there seems to be a direct parallel in the view of our planet described in James Lovelock's Gaia theory from the 1970s (Lovelock, 1979). This theory sees the Earth as a living system: it proposes that all organisms and their inorganic surroundings on Earth are closely integrated to form a single, self-regulating complex system, maintaining the conditions for life on the planet (Lovelock, 1979). Common to the two theoretical positions is that they both lead to procedural and open views of architecture.
In concrete reality it may be harder to spot the immediate relationships between sustainable development and tectonic quality in architecture. One of the greatest challenges from a sustainability perspective is growing energy consumption and global climate change, which disrupt Gaia's self-regulating system. In most European countries, the construction and operation of buildings are responsible for 40-50% of total energy consumption (European Commission, 2010), and the political objective of reducing energy consumption therefore places high demands on the built environment. This also includes Denmark, where the societal objectives aim for a reduction of CO2 emissions by 20% before the year 2020. Accordingly, the standards on insulation and energy consumption were tightened in the Danish Building Regulations in 2010. Further improvements will be adopted in 2015 and in 2020, and by then all constructions in Denmark are expected to be built according to the passive house standard, primarily through increased use of insulation. Energy requirements will thus have a major influence on shaping future construction and the development of tectonic qualities in architecture.
In the existing building stock, too, the requirements for energy reduction result in architectonic challenges. The annual share of new buildings in Denmark is only a little more than one per cent of the total building stock (Christensen and Nielsen, 1999), and if energy consumption is to be reduced seriously, it is crucial that consumption in the existing stock is reduced. Here as well, external insulation is seen as the primary solution, and the energy requirements will thus have major architectonic implications. It is therefore important to discuss how energy improvements can be integrated into architecture and implemented without jeopardizing the architectural heritage.
A second challenge is increasing industrialization and its impact on products and processes in our built environment. The development that has transformed traditional craftsmanship-based construction into industrialized construction with computer-controlled production processes calls for an updated theoretical basis and a new understanding of architectural practice. A reinterpretation of tectonics as a concept must necessarily respond to this.
To this end, the present research project builds on existing explorations of the tectonic aspects of architecture, and its primary task is to explore and discuss the existing theoretical positions in the light of current demands and cultural issues. These issues deal, among others, with sustainability (resource consumption) and our current building culture (technological development and cultural identity), and they will be illustrated through a series of related sub-projects. The research project's overall objective is to develop a strategy for a tectonic, sustainable building practice.
Tectonic Thinking – Research Question
The research project is thus based on the premise that the current environmental agenda entails a number of requirements and potentials to be uncovered and unfolded as active parameters in architectural development. The overall research question can be formulated as follows: Can tectonic thinking form the basis for new strategies for sustainable building practices and industries?
To answer this question it is necessary to understand to what extent and in what manner the paradigm of sustainability influences the tectonic. In this respect we turn to Marcus Vitruvius Pollio for a moment. He wrote De Architectura in the first century BC, and in this classical doctrine on architecture he stated that architectural quality occurs in buildings that combine firmitas (durability), utilitas (functionality) and venustas (beauty) (Rowland and Howe, 1999). Firmitas concerns the material and constructive aspects of architecture relevant to a building's physical durability. Utilitas concerns the functional issues of architecture, that is to say the building's capacity to respond to the demands and needs of users and the surrounding society at various times. Venustas concerns the beauty of architecture, which in Vitruvius' classical universe was about the building's ability to imitate (from the Greek 'mimesis') the cosmic order of nature.
In this light it becomes interesting to look at the tectonic, for example in Semper, to whom the contemporary understanding of the concept of tectonics is attributed (Beim, 2004). We will focus particularly on two aspects of Semper's concept of tectonics. Firstly, according to Semper, tectonics is the result of conscious artistic work (Semper, 1851). Secondly, according to Semper, tectonics primarily deals with the material design of constructions, while the functional content of architecture is not the main focus of the concept (Beim, 2004, p. 49).
Following the first point, it is worth noting that the design of tectonic aspects in architecture must involve an artistic idea. Accordingly, Hartoonian (1994, pp. 29-30) argues that tectonic work has a 'purpose', is based on an intention and has meaningful content. Tectonic design requires an artistic idea, which acts as an organizing and structuring principle: an overall principle that forms materials and constructions into coherent structures.
Following the second point, it is worth noting that tectonic work, in the light of Semper's definition, can be regarded as an artistic form, having only a secondary focus on the functional content of architecture. Tectonic work therefore addresses artistic design (venustas) based on architecture's physical structures and materials (firmitas), rather than focusing on architecture's functional content and its everyday use (utilitas). This fact is important when answering the described research question dealing with the influence of sustainability on tectonic work. Part of the sustainability
concerns are about resource savings, for example, reducing energy consumption and
CO2 emissions related to construction. These requirements directly affect the choice of
materials and constructions; materials must be evaluated from ‘cradle to cradle’ (McDonough and Braungart, 2002), and constructions must likewise be evaluated by their
ability to allow the building to be part of a greater resource cycle, such as the ability
to disassemble constructions at the end of use in order to have the materials recycled
and reused in new contexts. In the present research project, these circumstances raise the question of whether tectonic work can be approached so that resources are used more deliberately, and of how, taking into account the growing climate and environmental problems, we can develop a strong tectonic building culture.
Another element of the sustainability paradigm, however, concerns the social and societal structures, which raise more general questions about the norms and values upon which we base our cities and buildings. There are in this respect reasons to recall that architecture is not only built of materials: bricks, concrete, glass and steel. Architecture is based on ideas, the ideals and values that express visions and conceptions of 'the good life', that is, conditions that can be categorized under 'utilitas'.
These notions are affected drastically by the environmental problems, which manifest themselves as a 'non-intentional dark side' of the prosperity and consumption that have been the driving force behind the development of Western society since World War II. A similar dark side is reflected in architecture, in Koolhaas' words in the form of 'junk space' (Koolhaas, 2001).
Other societal and cultural ideals also have an impact on contemporary building culture. For instance, increased societal individualization leads to demands concerning architectural diversity, flexibility and adaptability (Madsen, 2008). The research project will focus on these issues and discuss their influence on the architectonic process and its quality. Thus, as part of the described research project, it will be discussed whether (and if so, how) the socially and humanistically oriented ideals, which among other things arise from sustainable development, influence tectonic work. Can tectonic thinking sustain a responsive and adaptive architecture that involves a more sensitive engagement with human values?
Tectonic Thinking - Hypothesis
In this paper, we will discuss the potentials embedded in tectonic thinking as the
underlying basis of architectural practice. By focusing on the tectonic aspects of the
building itself and the influence of a tectonic mindset on the connection between the
design process and the construction, a set of exceptional vistas arises. These can be
summarized in the following hypothesis:
Tectonic thinking – defined as a central attention to the nature of making and the application of building materials (construction), and to how this attention forms a creative force in building constructions, structural features and architectural design (construing) – can be used to identify and refine strategies for improving the contemporary building industry.
The Italian architectural theorist Marco Frascari describes the concepts of construction and construing in the essay The Tell-the-Tale Detail (Frascari, 1984). According to Frascari, the two concepts have to be united to give signification to architecture. The actual construction of the building structure thus needs to contain a narrative layer that makes possible a construing of the meaning embedded in the structure. This close link between the creation of concrete solutions and the creation of meaning forms the core of tectonic thinking. By understanding the potentials of the materials and construction methods used, and by transforming them into a design solution that reflects the logic of the construction, the appearance of a building and the process of its construction are united.
In her book Tectonic Visions in Architecture (Beim, 2004), the Danish architect and researcher Anne Beim states that when talking about the tectonics of a building, the focus is on the meaning embedded in the specific construction, as it is interpreted both by the architect and by the user. The connection of construction and construing is then a core element both in the design process and in the perception of the actual design result. The concept of tectonic thinking focuses on the potentials both in the genesis of the solution and in the appearance of the final building structure.
In the above-mentioned essay, Marco Frascari uses the work of the Italian architect
Carlo Scarpa to describe how the connection between the design of the construction
and the meaning embedded in the design solution can be used as a creative force in
the process of developing a given solution. In his work, Scarpa pursues an artistic idea
as the starting point of an investigation of the detailing of the structure. In the design
of the detail, Scarpa studies both the technical construction and the appearance of
the final solution by using different ways of sketching within the same drawing. The
aim of this process is to unite the artistic idea and the selected construction in one
narrative that articulates both the artistic idea and its connection to the logic of construction (Frascari, 1984). By focusing on the construction as a narrative expressing its own logic, both the process of designing and the final piece of work become more transparent. This elucidation of the structure and its creation makes it easier to
recognise and keep focus on the main problem at play. The transparency of the process then facilitates communication between the different parties involved in the design process (architects, engineers, contractors and craftsmen). In this way, tectonic
thinking can form the platform for strategies for refining and improving the contemporary building industry seen in the light of sustainability, by creating a clear focus in
the design process and a common language among the parties involved.
By focusing on tectonic thinking in the design of buildings, one forms a strategy for establishing a link between the intentions embedded in the design of the structure and the way these are understood by the user of the final building. Working on the tectonic aspects of the structure strengthens the appearance of its potentials. By offering the user of the building the possibility of conceiving the logic of the construction, the user becomes actively involved in understanding the potentials embedded in the structure. This knowledge can help the user both in the daily handling of the building and in a future reconstruction of the building structure. Tectonic thinking can, in this way, form the platform for strategies to ensure a sustainable use of resources throughout the lifespan of the building, by creating an understanding of the logic and the potentials of the construction among its users.
Tectonic Thinking – State of the Art in Contemporary Building Processes
The contemporary building industry has developed radically in terms of advanced industrialized manufacturing. In particular, digital technologies have provided new and different ways of fabrication over the past couple of decades. These make series of identical objects unnecessary: industrially manufactured components can now be customized to fit a particular architectural construction design. This evident change in manufacturing processes, and similarly in the associated design processes, has at first glance been defined as digital tectonics within the architectural realm.
According to Leach, Turnbull and Williams, who were among the earliest theorists to analyze this field thoroughly: "The term Digital Tectonics is being used […] to refer to a new paradigm of thinking in architectural culture […] computer technologies have infiltrated almost every aspect of architectural production, and is now being used to bring insights into the realm, even of the tectonic. In particular, they are allowing us to model – with increasing sophistication – the material properties of architectural components" (Leach, Turnbull and Williams, 2004). Neil Leach describes this movement as a 'structural turn' in architectural culture, which has launched a new spirit of collaboration between architecture and engineering and which seems likely to influence the production of buildings in the years to come.
One of the earliest, yet advanced and highly illustrative, examples underlining this movement is the Guggenheim Museum in Bilbao (1997), where specific computer programs, the manufacturing process and the use of titanium as a 'sustainable material solution' were developed in parallel (Van Bruggen 1998, p. 4). Although Leach has criticized this building as the culmination of the separation of structural concerns from aesthetic ones, it still represents the state of the art in terms of developing and applying new computer programs and manufacturing technologies to solve challenging construction problems.
The primary computer program used for this exceptional project was Catia (Van Bruggen 1998, p. 135).1 Catia, originally developed for the French aerospace industry, comprised different software packages, among others a modeller of architectural faces and volumes and a definer of the paths used by the milling machines in the construction process. Supposedly, the software did not cover all the needs of the designing architects, so they had to customize it and invent new applications for it as they went on. The layout process was accelerated by making use of the new computer program. Sculptural shapes could be computed, as could the structuring of the steel frame and the fitting of the great number of assorted panels, thereby offering a more time-saving and economical way of building. This way of processing worked both for construction based on high technology, such as numerically controlled manufacturing, and for traditional craft using full-scale templates to design shapes. As for the roof cladding material, titanium was chosen as a replacement for lead copper, which had been outlawed as a toxic material. The titanium offered the properties needed for a demanding building envelope like this, and it could be milled and cut into endless varieties of shapes. Titanium is a highly expensive material, yet the titanium used for the roofing is a third of a millimetre thick and there is a hundred-year guarantee against deterioration. Comparing the lifespan of titanium to that of stone cladding, which is vulnerable to deterioration due to city pollution, Frank Gehry has stated that the understanding of stability has to be reconsidered (Van Bruggen 1998, p. 141).
Fig. 1
The Guggenheim Museum in Bilbao. View of the curved façade covered with titanium.
Photo: Georg Rotne 1998 (KA-billeder).
The Guggenheim Museum in Bilbao represents a one-to-one prototype where each and every construction solution and building component was designed, developed and customized for this particular building. As such, the specific construction solutions may not have been easily applicable or transferable to subsequent building projects, yet the computer technology has been developed further into a widely known commercial product: Gehry Technologies, which offers support as well as various digital products.
Another highly interesting example is the architectural works and ideas of Kieran & Timberlake Architects, who have developed a theoretical framework, a sort of manifesto, which forms an investigation into the tendencies and advantages of the contemporary construction industry. With the book Refabricating Architecture: How Manufacturing Methodologies are Poised to Transform Building Construction (2004), they offered a different approach to the architectural design process fostered by computerized tools, emphasizing their communicative force as information management tools (Kieran and Timberlake, 2004, p. xii).

In that sense, they take the discussion of contemporary manufacturing in the building industry and the role of the architect further than initially proposed by Neil Leach and exemplified by Frank Gehry. They define the role of today's architect as similar to that of the master builder of the past: "An amalgam of material scientist, product engineer, process engineer, user, and client who creates architecture informed by commodity and art. By recognizing commodity as an equal partner to art, architecture is made accessible, affordable and sustainable as the most exclusive consumer products available today." (Kieran & Timberlake, 2004) Through projects such as Loblolly House and Cellophane House, KTA has shown how their ideas can be developed into experimental and attractive architecture that is not only meant for the few, but holds potential for mass production without losing its customizable dimension. As they argue: "[...] Modern humanism is communication not geometry!" (Kieran & Timberlake, 2004, p. xii).
Fig. 2
Loblolly House. The meeting between the aluminium scaffold and the wooden cartridges.
"The house is composed entirely of off-site fabricated elements and ready-made components, assembled from the platform up in less than six weeks. Specification is no longer conceived and structured about the sixteen divisions of the CSI that organizes thousands of parts that make up even a small house. Instead, the conception and detailing are formed about four new elements of architecture: the scaffold, the cartridge, the block and equipment. The aluminum scaffold system, coupled with an array of connectors, provide both the structural frame and the means to connect cartridges, blocks and equipment to that frame with only the aid of a wrench." Loblolly House was built in 2005 at Taylors Island, Maryland, and is privately owned by Stephen Kieran and his wife. (Kieran & Timberlake, 2008)
Photo: CINARK 2008.
Tectonic Thinking – The Making of Architecture
We already described Marco Frascari’s two concepts, construction and construed,
and we accentuated the simultaneity of construction and narration (Frascari, 1984).
Constructing relates to the physical act of building, of assembling building elements,
while construing is about creating meaning. According to Frascari both dimensions
have to be present in meaningful architecture, and thereby he underlines that architecture cannot simply be construction without also containing a narrative layer – neither can it be pure narration without also having a physical, material manifestation.
As to industrialized architecture, one could claim that until now it has mostly been
regarded as ‘construction’, emphasizing the building systems and ways of assembling
building components. With Frascari’s concepts in mind, industrialized architecture
must also contain a narrative and meaningful dimension.
We have to develop an architectural approach that may provide a basis for developing and strengthening a contemporary and future building practice.
Fig. 3
Cellophane House. The corner of the building showing the meeting between the aluminium construction and the facade elements.
The house is a five-story, off-site fabricated dwelling commissioned for the Museum of Modern Art’s exhibition Home Delivery: Fabricating the Modern Dwelling, on display July 20 through October 20, 2008. The concept was that a “building is, at root, nothing more than an assemblage of materials
forming an enclosure. We recognize that these materials came from somewhere, are held together
for a time by the techniques of construction, and will at some future time transition into another
state. While we tend to think of buildings as permanent, they are in fact only a resting state for
materials, a temporary equilibrium that is destined to be upset by the entropic forces that drive
the physical universe.” (Kieran & Timberlake, 2011)
Photo: CINARK 2008.
We build according to certain conditions in our present time – conditions concerning production methods, construction and materials, as well as ethics, meaning and values. Exactly this relationship between the work as such and the conditions behind
its coming into being is a crucial point. Industrial prefabrication must be taken at
face value and not just as a rationalisation of what craftsmen used to do when they
placed brick on brick. Industrialization has resulted in buildings which are composed
of ‘ready-mades’: factory-produced building components which are delivered and
assembled quickly and efficiently. This method of production forms a basis for the
development of an industrialized architecture which utilises and exposes the juxtaposition of building elements, and which thus consciously practices montage as an
architectural strategy (Bundgaard, 2011). As building components (as mentioned above) can even be unique and individualized, thanks to IT-based production methods, it is possible to create a much more heterogeneous architecture, one characterised by expression and character and open both to the demands of users and to demands that change over time. Montage architecture takes a significant leap away from what has traditionally been seen as the essence of industrialisation, i.e. standardisation, repetition and uniformity. This means that montage architecture is an expression of the conditions of production; it is a result of them and, at the same time, it expresses current tendencies in society towards individualization, heterogeneity and a world that is continually undergoing change.
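As a loose illustration of the point about unique, IT-based components (a generic sketch in Python, not the workflow of any office mentioned in this paper, and with invented dimensions): a simple parametric rule can generate a family of facade cartridges that are all individually sized yet remain directly producible, because the fabrication data is derived from the same model that generates the variation.

def cartridge_widths(total_length_mm, count, amplitude_mm=120.0):
    # Divide a facade length into `count` individually sized cartridge widths.
    # A linear taper is layered over an even division, so every width differs,
    # and the list is rescaled so the widths still sum to the facade length.
    base = total_length_mm / count
    raw = [base + amplitude_mm * (i / (count - 1) - 0.5) for i in range(count)]
    scale = total_length_mm / sum(raw)
    return [w * scale for w in raw]

# e.g. a 24 m facade divided into 16 unique cartridges, each ready for CNC cutting
for i, w in enumerate(cartridge_widths(24000, 16)):
    print(f"cartridge {i:02d}: {w:.1f} mm")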
As an approach, montage generates alternative contexts and ways of perceiving the built environment. Potentially, current principles and premises offer an opportunity for architectural experimentation and for developing new formal idioms, architectural hierarchies and expressions. The architectural discussion is reinserted into the industrial universe, and the consistent pursuit of montage and juxtaposition may trigger a particular formal potential, while, at the same time, the advanced methods of production provide an opportunity to work in a controlled and precise manner with materials, optimisation, et cetera.
One of the most important challenges of our time, the environmental crisis and the consequent demand for sustainability and resource consciousness, will undoubtedly characterise and shape future developments. Industrial production methods and the montage universe are very much capable of accepting the challenge to create sustainable and energy-efficient buildings. The development of industrialized building components, produced in advanced technological environments, will be able to meet the demand for the minimisation of resources, experiments with new materials, life-cycle thinking, the optimisation of energy consumption, etc. At the same time, the idea of the openness and variability of the architectural work makes it possible to think in terms of differentiated life spans and of the possibility of exchanging building elements, building components and materials (Bundgaard, 2011).
Some of the features of contemporary industrialized manufacturing are also comparable to characteristics of tectonic thinking – just as they are significantly connected to important aspects within the realm of sustainability. The attention to the use of resources (material), the methods of processing (fabrication) and the definition of systems (context) are important aspects within industrialization, tectonics and sustainability alike. The relationship between material, fabrication and context underlines the close dependency between the material and the way it is processed within a specific context under certain conditions. The three aspects have to be
continuously unfolded, investigated and brought together if we wish to develop an
adaptable and responsive architecture which is able to meet contemporary and future
demands.
Tectonic Thinking – A Model of Analysis
In working out a model of analysis for looking at tectonic aspects at close hand, there is no doubt that this model ought to be open and flexible, in order to engage different sorts of data and analytic approaches and to grasp the specific context and nature of the object of study at the centre of attention. This characteristic is important at all times, but it is all the more necessary when the complexity of the contemporary building industry calls for theoretical models as well as analytical tools that can match and unfold this complexity in ways that are comprehensible and possible to improve.
When analyzed, tectonics should be approached from different angles: firstly, due to the composite nature of the concept – in this context also defined as the holder of the interplay between construction and construing – and secondly, due to the vast and highly complex field of building construction (the edifice) and the building industry (the professional context). In that sense, the model of analysis is primarily based on qualitative questioning and methods, yet it makes sense to define specific levels for the analysis, which could reflect the scaling in architecture or the order of the building structure and the processes involved.
As for the levels of a tectonic analysis, the following three levels can be defined:
• product level – focusing on assembly of various elements or building components
• system level – focusing on integration of various systems
• building level – focusing on concepts for various building constructions/designs
When analyzing the product level of building components, it is not only a matter of examining single products or components as individual parts; it is similarly important to focus on how these building parts are put together or assembled, how they work and what they signify when they are built into a specific construction. In that sense, the very object – the building component, its material properties, the manufacturing processes involved and the principles linked to the building technology – is a target for analysis. Furthermore, the correlation and the synthesis between these various parts, and the meaning embedded in this, should be taken into consideration.
At system level, the focus can be on the integration of different ‘hardware’ systems – such as structural systems, construction systems or service systems – or ‘software’ systems – such as sustainability issues, aesthetic intentions or organization strategies – considered separately or in combination. In this case, systems are not necessarily limited
to those found in one singular building project; neither should they be regarded as
specific ‘closed entities’ or ‘systems agendas’ detached from a wider context. They must
be examined in relation to other (similar) systems or principles of thought and maybe
include elements from analyses at product level and/or building level.
Tectonic models for analysis at building level are manifold, yet certain fundamental elements should be included. These are linked to aspects of construction and construing at all levels and to how these are connected to a wider context. This may, as mentioned above, include elements of the other two levels of tectonic analysis but, maybe more importantly, it may include a societal dimension of cultural, historical, social, economic or ecological considerations. In that sense, the tectonic analysis at building level is not self-referential but inclusive, and it opens up for discussion.
It is important to note that this proposed model for analysis is to be developed further. The ambition is – through this research project and by testing the model in various ways – to see if it works satisfactorily in terms of addressing the nature, practice and findings linked to tectonics and, finally, to see if it enables us to discuss tectonics in relation to contemporary building practice and the construction industry.
Tectonic Thinking – Concluding Statements
In this paper we have discussed tectonic thinking as a potential strategy for the development of architectural practice. We have pointed out two main issues from which these demands for development arise: sustainability and contemporary methods of production.
In relation to sustainability, we have argued that the growing requirements for energy savings will have a major influence on the tectonic qualities of architecture, and we have suggested that the development of advanced technology, LCA methods and experiments with new materials are important ways to reduce resource consumption in construction. We have further argued that tectonic thinking holds the potential to ensure a sustainable use of resources throughout the lifespan of the building, by creating an understanding of the logic and the potentials of the construction among its users.
In relation to contemporary architectural methods, we have discussed the relation between the uniform and repetitive nature of ‘traditional mass production’ and the new demands for diversity and individuality. We have argued that computer-based methods make it possible to create a heterogeneous architecture that takes a significant leap away from what has traditionally been seen as the essence of industrialization, i.e. repetition and uniformity. We have then described the work of Kieran & Timberlake Architects as such an example: their approach allows a sustainable architecture which holds potential for mass production without losing diversity of expression.
Finally, we have presented our current ideas for a model of analysis. The model will be further developed in the future, and in doing so we will aim for an open and flexible model that can include different sorts of data and approaches. The model will also be developed to comprehend the specific context of the cases. The analysis will have focal points on different levels, reflecting the architectural scale as well as
the processes involved in construction. The three levels are: 1) Product level – focusing
on assembly of various elements or building components. 2) System level – focusing
on integration of various systems and 3) Building level – focusing on concepts for various building constructions/designs.
The proposed model for analysis will be developed further in the future work of
the research group. The model will be tested in various ways in order to see if it works
satisfactorily in terms of addressing the nature, practice and findings linked to tectonics. The ambition is that it will enable us to discuss tectonics in relation to contemporary building practice and the construction industry.
Note
1 CATIA works with polynomial equations instead of polygons, which makes it capable of defining any surface as an equation. This means that the computer can compute any point on the surface.
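To illustrate the point of this note: a surface described by polynomial (parametric) equations can be evaluated exactly at any parameter pair (u, v), whereas a polygon mesh only stores a finite set of vertices. The following minimal sketch in Python – a generic bicubic Bézier patch, offered as an illustration rather than CATIA’s actual internal representation, with an invented grid of control points – shows what such an exact evaluation looks like.

from math import comb

def bernstein(i, n, t):
    # Bernstein basis polynomial B_{i,n}(t)
    return comb(n, i) * (t ** i) * ((1 - t) ** (n - i))

def bezier_patch(control, u, v):
    # Evaluate a bicubic Bezier surface at parameters (u, v), both in [0, 1].
    # `control` is a 4 x 4 grid of (x, y, z) control points.
    point = [0.0, 0.0, 0.0]
    for i in range(4):
        for j in range(4):
            w = bernstein(i, 3, u) * bernstein(j, 3, v)
            for k in range(3):
                point[k] += w * control[i][j][k]
    return tuple(point)

# Invented control grid; any (u, v), e.g. (0.37, 0.81), yields an exact surface point.
control = [[(i, j, (i - 1.5) * (j - 1.5)) for j in range(4)] for i in range(4)]
print(bezier_patch(control, 0.37, 0.81))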
References
Beim, Anne, 2004. Tectonic Visions in Architecture. Copenhagen; The Royal Danish Academy of Fine
Arts, School of Architecture Publishers.
Bundgaard, Charlotte, 2011. Montagepositioner – en nytænkning af industrialiseret arkitektur. Aarhus;
Aarhus School of Architecture Publishers.
Christensen, J.C. & Nielsen, C., 1999. Trends i udviklingen af dansk byggeri. Aalborg; Aalborg
University.
European Commission, 2010. Energy Efficiency in Buildings [online]. Available at: <http://ec.europa.eu/energy/efficiency/buildings/buildings_en.htm>.
Frascari, Marco, 1984. “The Tell-the-Tale Detail”. In: VIA 7 The Building of Architecture. Architectural
Journal of the Graduate School of Fine Arts. University of Pennsylvania.
Hartoonian, Gevork, 1986. Poetics of Technology and the New Objectivity. In: JAE, Princeton Architectural Press.
Hartoonian, Gevork, 1994. Ontology of Construction: On Nihilism of Technology in Theories of Modern
Architecture. New York; Cambridge University Press.
Hartoonian, Gevork, 1997. Modernity and its other: a post-script to contemporary architecture, College Station; Texas A&M University Press.
Kieran, Stephen & Timberlake, James, 2004. Refabricating Architecture: How Manufacturing Methodologies are Poised to Transform Building Construction. New York; McGraw-Hill Companies Inc.
Kieran, Stephen & Timberlake, James, 2008. Loblolly House – Elements of a New Architecture. New York; Princeton Architectural Press.
Kieran, Stephen & Timberlake, James, 2011. Cellophane House. Philadelphia; Kieran Timberlake.
Koolhaas, Rem, 2001. Junk Space. In: Domus, January 2001.
Leach, N., Turnbull, D., Williams. C., ed., 2004. Digital Tectonics. West Sussex; John Wiley & Sons Ltd.
Lovelock, James, 1979. Gaia: A New Look at Life on Earth. Oxford; Oxford University Press.
Madsen, Ulrik S., 2008. Robust Arkitektur - arkitekturen som en aktiv del af organisationers Identitetsskabelsesproces. Copenhagen; The Royal Danish Academy of Fine Arts, School of Architecture
Publishers.
McDonough, W., Braungart, M., 2002. Cradle to Cradle: Remaking the Way We Make Things. New
York; North Point Press.
Rowland, I.D. & Howe, T.N., ed., 1999. Vitruvius. Ten Books on Architecture. Cambridge; Cambridge
University Press.
Semper, Gottfried, 1851. Die Vier Elemente der Baukunst. Ein Beitrag zur Vergleichenden Baukunde. Braunschweig; Friedrich Vieweg und Sohn.
Van Bruggen, Coosje, 1998. Frank O. Gehry Guggenheim Museum Bilbao. New York; Guggenheim
Museum Publications.
Stylianos Giamarelos
University of Athens
Greece
Have we ever been Postmodern?
The Essential Tension within
the Metamodern Condition
“The contemporary architectural style that has achieved hegemony within the
contemporary architectural avant-garde can be best understood as a research
program based upon the parametric paradigm. We propose to call this style:
“Parametricism”.
Parametricism is the great new style after Modernism. Postmodernism and Deconstructivism were transitional episodes that ushered in this new, long wave of
research and innovation.”
Zaha Hadid and Patrick Schumacher (2008)
“My question will always be: can interactive Architecture be beautiful? It is certainly
necessary and functional, but can it compete with historic architecture and be appreciated as good, relevant and beautiful? I believe that it can.”
Kas Oosterhuis (2009)
The call for papers of the conference defined its own frame for discussing our reflections on the human in the age of technology-driven architecture. This paper will start by relating the conference call to selected excerpts from recent texts by Kas Oosterhuis, Zaha Hadid and Patrick Schumacher. By doing so, it will eventually reframe the questions posed by the call for papers and offer a broader alternative perspective for elucidating them. This paper as a whole should therefore be considered a contribution to the methodology of our reflection on our present condition and to the terms that can drive our debates about it.
The selected excerpt from a recent manifesto by Zaha Hadid and Patrick Schumacher (2008) reveals their current self-understanding. Having recently articulated their own parametricism, they opt to present themselves as the direct descendants of modernism, while simultaneously keeping a clear distance from postmodernism and
deconstructivism. Meanwhile, Kas Oosterhuis (2009) was recently wondering whether interactive Architecture can also be beautiful – since it is already standing and
functioning as a construction. His question echoes Le Corbusier’s plea for an architecture that moves us. This emotive element draws the line between real architecture
and a mere construction that is able to hold itself together. On a similar note, Oosterhuis (2011) seems almost reluctant to call his most recent work ‘architecture’. In his
latest book, he opts for a title that reveals his concerns by echoing Le Corbusier once
again: Towards a New Kind of Building. The deepest concerns of both architects rest
on undoubtedly classical grounds, though. In fact, they reach way back to a Vitruvian
origin.
However, upon reading the call for papers, one cannot help but notice that it focuses entirely on developments that have taken place over the last decade, while the abovementioned architects express feelings that develop within our modern predicament and reach much further back in time. Stemming from some of the pioneering figures of a recent data-driven practice, these feelings need to be elucidated
and clearly articulated if we are to gain a deeper understanding and a critical awareness of our contemporary condition, which I intend to call ‘metamodern’. Focusing on the last decade would eventually mean ignoring the long roots of the phenomena that characterise our present condition and the latest developments in the field of technology-driven architecture. It would mean taking for granted what is most crucially at stake: is it at all possible for us to reconcile the value-driven humanistic perspective with a technological perspective that relies on the physical and the factual?
Can the data-starving IT really offer the key to answering the distinctively modern
problem of the fact-value dichotomy? The call for papers undoubtedly strives for a crucial balance between accepting IT and technology-driven architecture while maintaining the criticisms that have been addressed to it regarding its acknowledged humanistic shortcomings. However, it is highly doubtful that this desired balance can be attained by focusing on the last ten years only. Although few would deny its often ingenious implementation of the latest developments in IT, the recent
guise of architecture as a data-driven practice is essentially intertwined with a complex web of relations with wider historical and theoretical movements that have yet
to be clearly, let alone fully, addressed. If that is the case, then it is clear that the place
of the human in technology-driven architecture cannot be truly reflected upon and
comprehended when considered as a matter of the last decade.
Hadid and Schumacher’s remarks serve to pose a thought-provoking challenge.
Any serious attempt to reflect upon our present condition should start by exploring
the implications of their abovementioned distantiation from postmodernism and deconstructivism. Why would a pioneer of technology-driven architectural practice denounce postmodernism, when IT is in fact at the heart of the postmodern condition?
Combined with the Wittgensteinian notion of language games and an emphasis on
particular stories and narratives instead of universal metanarratives, IT is there right
from the start, being already heralded as the driving force of a radical change in the
production of knowledge in Jean-François Lyotard’s (1979) widely discussed attempt
at epistemology, which was to define the postmodern condition. How should we understand Hadid’s and Schumacher’s statement of apparent distantiation, then, when
IT is clearly at the heart of their work, too? A fruitful way is to read it as a theoretical
challenge: If we are to understand our present condition, a whole generation later, it
may be more useful to wonder whether we, the architects, have ever been postmodern in the way that Lyotard proposed, and if a term like that still applies in elucidating
our present condition. This second leg of the question calls for taking a stand on the wider cultural issues and philosophical problems that have arisen during the roughly three decades that have elapsed since postmodernism first entered the intellectual fray. Posing such a question in a general manner, especially when it concerns terms like modernity and postmodernism, which have historically proven rather loose in themselves, certainly risks oversimplification and a certain inability to acknowledge the often admirable argumentative architecture of the diverse approaches to these wide cultural subjects. Covering the various theoretical facets that these terms have presently come to incorporate in a few pages means that we will most probably not be able to satisfy the usual scholarly standards of analysis. However, this is the form in which these intellectual currents usually enter the public domain and shape the dominant ideological trends. Thus, we seem obliged to venture onto this mostly insecure and unstable general ground in the hope that we are not unfair or inaccurate to the individual theoretical approaches that are presently
labelled as ‘postmodern’. Only after attempting to define the contours of a contemporary intellectual uneasiness can we return to the special case of architecture, if we are to shed a refreshing light upon our contemporary problems. Only after having attempted to face the question of postmodernism in its more general form can we recognise the way in which it makes sense when posed in the architectural domain.
I.
The question “Have we ever been postmodern?” indeed alludes to Bruno Latour’s
(1991) infamous suggestion that we have never been modern. According to him, we
are essentially amodern, as even our most cherished ‘pure’ practices of modernity, like the development of modern science, are always mediated hybrid forms, internally contaminated with the residue of heterogeneous constructions that should have been suppressed or imaginarily exorcised, had the modern spirit prevailed.
Thus, the Great Divide between man and nature, between modern and traditional culture is a kind of intellectual bias, nothing more than an imaginary construction that
was never attained on the practical level. There is in fact no discontinuity, chasm or
asymmetry between the moderns and the ancients. Man is still intertwined with Nature; Science is still not clearly distinguished from Politics. Latour seems to approach
modernism primarily as a mode of thinking, as the adoption of a certain viewpoint
or worldview, a kind of ideological prejudice that filters our perception of reality and
obstructs us from realising the essential interconnections between what appears as
the orderly divided areas of knowledge. In that sense, he wants to be amodern – not
modern at all. Hybridisation is the norm and this is the case even for the domains of
knowledge that are considered to retain an almost absolute degree of purity.
Yet, according to Fredric Jameson (1991), it was precisely a kind of hybridisation visible in the world of everyday life – the common presence of automobiles and trains moving against the backdrop of a mostly neoclassical or baroque urban environment, for instance – that originally gave rise both to a dynamic sense of modernity and its special relation to history, and to the idea of progress in the first place. The total dominance of the simulacrum that followed, in the age coinciding with the third stage of capitalism, accompanied by the loss of historical awareness and of the future itself, is in fact what distinguishes the modern from the postmodern. What is extremely interesting in Jameson’s approach is that he spots the clearest conscious expressions of the modern/postmodern ethos in architecture and the built environment. Jameson relates postmodernism to a special kind of space and a special kind of human consciousness, a hyper-space ideally perceived by a hyper-consciousness that exceeds the modern (and Cartesian) conceptions of space and the human subject. His latest work, which invokes a “spatial” kind of dialectic (Jameson, 2009) and in fact follows the 1991 plea for a cognitive mapping of our place within the postmodern condition – a plea partly inspired by Kevin Lynch’s (1960) seminal work on the image of the city – is not to be ignored, either. It seems we have not ceased to
be in need of this kind of cognitive mapping and we should probably show the intellectual rigour required for carrying out this task.
I am reluctant to agree with Latour that we should abolish the idea of the Great
Divide between the modern condition and the condition before that. I can follow him
in rejecting the idea of any kind of discontinuity or of a change that can be simultaneously sudden and radical in the historical course of human activity. This unavoidably continuous course doesn’t exclude the possibility of reaching a point in history
from the viewpoint of which it seems that there is no way to turn back, though. When
we have attained that viewpoint, it is certain that we can re-interpret the past as a
forerunner to the present condition or trace selected crucial characteristics of other
periods that shaped the modern notions in a decisive way. If that is the case, then
modernity is indeed a problem for us and the Great Divide not only reappears, but
it immediately takes the form of a problematic periodisation, since we seem capable of pinpointing it at any point in history wherever we can trace the genesis of the ‘modern notion of X’ – from the early modern 15th century (Toulmin, 1990) or, like Nietzsche (1872) and Bernard Williams (2002), even further back to the age of Socrates and Thucydides. This should not be interpreted as a claim that the moderns’ often nostalgic
depictions of the past are unqualifiedly correct (and this is the point where one could
meet Latour again), but as an indication that once certain questions have been raised
in the way of modernity (read: Enlightenment), we can no longer be satisfied with historical or philosophical accounts which do not face or provide any answer to them. In
that sense, modernity is intellectually irreversible, although it can still “be reversed in
historical fact, given a substantial enough political power or natural catastrophe” (Williams, 2002, p. 254).
If modernity itself is always posed to us as a problem, as a challenge for developing our theoretical, intellectual and practical tools for coping with a world that is in a
rapid process of modernisation, then the term “postmodern” itself is representative of
a rather awkward condition in our attempt to cope with this major problem that modernity represents for us.1 The prefix “post-” indicates succession in time, rather than
heralding a situation that is radically different from the modern one. It is not certain
if postmodernism could ever lead us to exiting, overcoming or opposing modernity
in a polemical way, as some of its guises seemed to imply. By developing as a kind of
countermovement, postmodernity was always bound to the essence of that against
which it was rebelling. By radicalising the critical methodology of modernity while offering deconstructive genealogies of the fundamental modern ideals, postmodernity
could only end up in an inextricable entanglement in modernity. In fact, even Lyotard
(1988) himself in his later writings preferred to talk about ‘rewriting modernity’ as a
term more suitable than ‘postmodernism’, which was treated as redundant by its main
inventor from that point on. If we are indeed to abandon the notion of postmodernism, this may be due to an inherent contradiction. The widely heralded postmodern
motto “anything goes”, as first posed by the likes of Paul Feyerabend, should first and foremost be read as a call for the absolute freedom of a creativity that is not bound by the chains of any grand narrative; at the same time, it is an ideal that can never be realised: if it were realised, it would immediately have to
be rejected!2 We can never continue to hold an ironical kind of distance to the sum
of our web of most basic beliefs, even if we have managed an almost impossible feat
of gradually replacing each and every one of our beliefs with another. What’s more,
postmodernism doesn’t seem able to provide the suitable kind of motives for such an
unbounded creativity. Why should anyone really care to invent endlessly new vocabularies and redescriptions of language games, rather than stick with the ones already at hand, when at the end of the day none of them is proven to be superior to another or even to support a claim to truth? By putting the emphasis on difference, postmodernism has led itself into a short-circuit. By strongly opposing modernity’s inherent hegemonising tendencies that could lead to totalitarianism, the postmodern condition
secured a space for the oppressed and minority voices, only for them to retain their
current status quo and develop in a juxtaposed seclusion from other voices. Yet, this
emancipatory dimension of postmodernism is paradoxically associated with the end
of utopian thinking and the rejection of the grand narratives. The separate groups and
voices are left as they stand in their polyphony without a clear vision for their future.
This loss of the future is particularly evident in times of crisis like ours, this third era of modernity, when the decisively modern and, at its core, ethical question “What should we do?” returns with renewed ferocity. The question is put forth in the sense of “Where should we be going?”. Postmodernism cannot answer it, because it opposes any kind of commitment to a minimum of convictions that can produce propositions which are not afraid to – even tentatively – claim truth status. Yet, it is exactly this virtue of commitment to normative or regulative beliefs, which postmodernism lacks, that really matters when guiding our action on the horizon of a desired common future. Therefore, although postmodernism and its recent guise of postcolonial
studies may still seem significantly prolific in areas like comparative literature, it has
already started showing its limits.
The feeling that postmodernism has most probably run its course is gradually
becoming common and shared around the globe. Nicolas Bourriaud (2009) has recently proposed the notion of the altermodern, the exemplary hero of which is the
figure of the radicant artist. The radicant is happily participating in the multicultural
flow of people around the world and is open to communicating with them, while at
the same time retaining his roots in his homeland culture (through ‘mobile’ practices
that can be as ordinary and everyday as cooking, for example). More recently, the exhibition titled “Postmodernism. Style and Subversion 1970-1990” at the Victoria and
Albert Museum triggered a slew of articles that more or less herald the death of postmodernism.3 I am not convinced by the prospects they offer, though. In a way quite
similar to that of Bourriaud, Edward Docx (2011) is pleading for the dawn of the age
of authenticism now that we can very well understand that postmodernism is dead.
Even if one could set aside doubts about the current relevance of this return to a term of Heideggerian discourse – a discourse which developed at the climax of modernity and was later explored by postmodern thinkers – his view of authenticism is rather limited, too. The
only ground for his argument is a reappreciation of the authentic in the global marketplace. This notion seems too feeble to sustain our present predicament. Although
one could agree with Hari Kunzru (2011) that the 9-11 attacks were also a blow to
postmodernism as an intellectual current, his notion that postmodernism was ‘essentially pre-digital’ has to be avoided. The internet is definitely not to blame for the
murder of postmodernism, since – as we have already mentioned – it was a vital component of the postmodern condition from the moment of its inception in 1979. Hari
Kunzru’s argument rests on statistics regarding the bibliographical occurrence of the
terms ‘postmodernism’ and ‘internet’ since 1975, provided by Google Ngrams. However, the causal explanation he attempts is quite different from, and should not be confused with, mere statistical correlation – it requires much more than that, i.e. decent
theorising.
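To make the statistical point concrete: two series that merely trend in opposite directions over the same years will show a strong (negative) correlation whether or not any causal link exists between them. A minimal sketch with invented frequencies – not Kunzru’s actual Ngram counts – in Python (3.10+ for statistics.correlation):

import statistics

# Invented frequencies at five-year intervals, 1975-2010: one term declining, one rising.
postmodernism = [15, 13, 11, 9, 7, 5, 3, 1]
internet = [0, 1, 2, 4, 6, 9, 12, 15]

r = statistics.correlation(postmodernism, internet)
print(round(r, 2))  # close to -1.0: a strong correlation, yet it licenses no causal story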
II.
At the end of the day, although the question of postmodernism remains open, it seems that our current situation is inescapable and, in that sense, we can never
not be modern; nor can we ever be premodern again.4 We can still try to elucidate our
unique modernity, though. Could we finally face modernity as a whole, not in the narrowly polemical direction that was often followed by postmodernism? We are now
able to see the whole modern phenomenon in the light of the shortcomings of postmodernism itself. This is a perspective that was not available to any previous generation of thinkers and writers. Now, more than ever, it seems that our task is to reflect
upon the modern phenomenon as a whole in a way that manages to include the
dead ends of postmodern thinking, with the added fuel of the motivation provided
by our presently apparent need to move forward. We are certainly on the grounds of a
hermeneutical kind of metamodernity.
To elucidate this last term, one could draw an analogy with the way in which Alfred Tarski (1944) used a richer meta-language (that contained the truth predicate)
in order to provide an extensional definition of truth for an object-language. We can
most conveniently detect the distinction between object-language and meta-language in the case of translation, in sentences like «“Schnee ist weiß” is true (in German) if and only if snow is white». In this example, German is the object-language and English is
the meta-language that allows us to speak of the satisfaction conditions of truth in
German. What would that meta-language be in the case of modernity, though?
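For readers unfamiliar with Tarski’s device, the underlying schema can be stated compactly. The following is a standard textbook rendering of the T-schema sketched in LaTeX – a paraphrase, not a quotation from Tarski (1944) – where S is any sentence of the object-language L and p is its translation into the meta-language:

\[
  \text{“} S \text{” is true in } L \;\Longleftrightarrow\; p
\]

Instantiated for the example above: “Schnee ist weiß” is true in German if and only if snow is white.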
Without ignoring any widely admitted differences between the modern and the
postmodern condition, the thinker of the metamodern condition is called to work on
their deeper existential ground, their common root or source. The liminal question
of metamodernity would then be: Could we possibly go beyond the one and only
correct interpretation, but also beyond the consummative post-modern equalisation
(or rather neutralisation or elimination) of every interpretation within the closed sum
of an unsociable multiplicity? This would probably involve moving away from a fixed
truth, as well as from the infinite interior of self-referentiality, the imaginary non-existence of any truth whatsoever. At a time when the major questions have considerably lost their dramatic weight, nobody is looking for answers. Yet, as time goes by
and postmodernism increasingly runs its course, it becomes all the more evident that its formerly euphoric and playful nihilism leads to a dramatic uneasiness, a hitherto unforeseen numbness. What seemed to be the liberation from the artificial identity of every kind of ideology (that is, the metaphysical and secular idealisms) has turned out to be a step into the vacuum. There was nothing at all outside
the endless interior we struggled to overcome. Being metamodern means having the
intellectual rigour to re-pose the major questions in a way that can help them regain
their weight.
Thirty years ago, the prefix “meta-” would immediately trigger the intellectual
alarm of an early postmodern thought: Weren’t we supposed to get rid of all this overarching second-level kind of thinking in the first place? Could we just as easily return
to placing our trust in metanarratives like we did in pre-war modern times? We should not treat the term ‘metamodernity’ in its strongest epistemological and metaphysical connotations, though. In the sense that we are avoiding here, metamodernity would also herald a kind of triumphant exodus from, or surpassing of, modernity. This
is definitely not the case here. We should rather understand and employ the term in
the same way we currently understand metaphilosophy: even if we cannot exit our
philosophical way of posing problems and thinking about them, we can still reflect
upon our philosophising and its method, just as well as we reflect upon any other
kind of practical activity. In that sense, our current metamodern condition doesn’t
mean we have escaped modernity. It implies a re-enactment of our reflection upon it
with the added experience gained by postmodern thinking.
III.
After all this general and wider discussion on our current condition and its connections with the ones that preceded it, it is now time to return to the architectural domain and its own specificities through the questions we posed at the beginning of
this paper. As we have already witnessed, Jameson heralds the built environment
and the architectural domain itself as an exemplary manifestation of postmodernism
that contributes to our understanding of the phenomenon as a whole. What could be
said about architecture and this notion of metamodernity we just sketched? In which
sense should we pose the question ‘Have we ever been postmodern?’ to the architectural domain?
I think there is indeed a strong sense in which our architecture has never been
postmodern. Yes, we have definitely been postmodern in the sense defined by Charles
Jencks and his architectural language of double-coding, etc. In fact, following the publication of his influential work on postmodernism, we, the architects, consciously became Jencksian postmodernists for a rather short period of time. However, I don’t
think we have ever been Lyotardian postmodernists – at least, not in a conscious way.
I would like to propose that MVRDV’s work – in VPRO, for instance – with its explicit
emphasis on IT and the construction of stories already represents a kind of unconscious Lyotardian postmodernism. Lyotard’s (1979) “Report on Knowledge” focused on
the production and evaluation of knowledge in the age of digital diffusion of information. MVRDV’s work unconsciously follows a Lyotardian program, in the sense that they
base their architectural endeavour on a conception of the flow of information and its
consequences for architectural practice. The introductory text for their work on VPRO
attempts the definition of the architectural object in contemporary terms. Their intentions are already clear from the opening sentence: “Architecture is an interface. Not
just now: it always has been” (MVRDV, 1997, p. 3). Their approach rejects an artistic approach to architecture, moving its emphasis on the double role of information in architecture as interface. Traditional architecture is a source of information that can be
collected in a statistical database, whereas the element of interaction supplies the defining characteristic of the IT age. “Form is […] dependent on […] the handling of information about reality. A part of the ‘unexpressible’ aspect of architecture has come
down on the side of its statistical treatment” (MVRDV, 1997, p. 4). The VPRO book constantly reminds the reader that this is not the presentation of a building, but that of
a story. Just like architecture, the book is defining our contact with the reality (of the
built), by handling information. “[I]n our own times the handling of information is becoming a basic code for acquiring knowledge about reality” (MVRDV, 1997, p. 9). This
almost Lyotardian remark does not lead to any kind of significant formal eclecticism
from the collected database of past architectural styles. Thanks to the computer, architecture as interface forms a horizontal relationship with reality. Gone is the hierarchical practice of past architecture that rested upon a creator who maintained a vertical
categorical relationship with reality. The computer introduces the horizontal dimension
to this relation without totally abolishing the primacy of the creator. The conception
of architecture as interface rejects any kind of relation to an objective understanding
of the world. The transformation of interface-architecture itself transforms in turn our
very reality. The history of architecture can be then re-interpreted from the viewpoint
of interaction and information.5 A more precise definition of architecture would then
be that of an “interface of computerized reality [in a] direct relationship between information and form” (MVRDV, 1997, p. 16). The distance from a Jencksian, as well as the
resemblance to a Lyotardian, kind of architectural postmodernism is apparent.
But what if we were to aim self-consciously for a genuinely Lyotardian postmodernism in architecture? MVRDV’s approach seems to imply that we can move further
than proposing IT and digital technologies as a meta-language that can handle and
incorporate diverse design logics from ancient and traditional systems of harmony to
recent green design, as seems to be the norm today, which works with a clearly Jencksian logic at its core. Could our re-visiting of Lyotard’s texts prove to be a source of
self-conscious architectural inspiration with IT positively at its core? If that is the case,
then we, the architects of the metamodern condition, can realise that we have never
been self-consciously postmodern. On the other hand, if metamodernity means recasting postmodernism in the terms of modernity and vice versa, with an emphasis
on the return to a common future, it is still unclear whether IT and technology-driven
architecture can provide a satisfying reply. A Socratic reading of the modern question
par excellence: “Where should we be going?” turns modernity into a deeply ethical
problem, the problem of autonomous commitment to laws that we ourselves pose
to ourselves, according to the Kantian ethical project. The aftermath of the Hegelian
criticism has left us in doubt about the a priori necessity of a feat that was in principle historical, therefore contingent. In the years that followed, modernity was somehow condemned to constantly re-iterate this debate between Kant and Hegel without
proving able to overcome it (Pippin, 1999). It is this kind of deeply ambivalent feeling
towards our own autonomy that undoubtedly drives Oosterhuis to the need to appeal
to a classical conception of architecture, exactly at the point when his work is at the
forefront of technology-driven architecture.
There is certainly an essential tension here. It no longer seems plausible to believe
we should be rid of the technical rationality that is intertwined with recent developments in the field of IT. We should instead treat it as revealing yet again the multifarious nature of our discipline and as an urgent need to deepen our understanding of
the special characteristics of our contemporary condition and its relation to modernity. Having at our disposal both the results of modernity and the conclusions of postmodernity as an expression of the guilty conscience of the former, we are now in the
position to treat modernity ‘homeopathically’ as both our greatest achievement and
our greatest tragedy at the same time.6 It would then be time to return to the complex
web of relations that contextualises the development of a critically responsive architecture of the last decade in the depth of its relation with modernity, thus advancing
our all too human understanding of technology-driven architecture. There are two
ways in which one can be lost: (a) when he has no map of the territory and (b) when
he has a map of the territory but cannot define his position on that map. I hope that
this paper provided at least a rough sketch of that map or a way of positioning oneself
on a rather sketchy map.
Notes
1 It should be noted that postmodernism is just one of the available answers to that problem,
which usually involves thinkers like Jean-François Lyotard and Fredric Jameson, to name just
a few. Other proposed answers include a desire for a partial return to essentially premodern
ideals (offered by the likes of Hannah Arendt or Alasdair MacIntyre) or a self-aware defence
of ‘the unfinished project’ of modernity (offered by the likes of Stephen Toulmin and Jürgen
Habermas).
2 Neil Gaiman offers an ample demonstration of this wish turned into an unbearable curse in his
“Calliope” story from the Sandman saga. When writer Richard Madoc holds Calliope, the Muse,
captive in his attic, Morpheus takes his toll on him by granting his original wish for inspiration
to the extreme degree. Madoc suffers from a literal brainstorm of his endless inspiration and
ideas for stories he cannot even jot down – let alone write.
3 Cf. Docx (2011) and Kunzru (2011), for instance.
4 This could be the case not only due to a present lack of the relevant metaphysical context,
but also due to our much different practices of social life and institutions that support it. Cf.
Williams (2002) and Dworkin (2011) respectively.
5 Cf. MVRDV, 1997, pp. 11-3.
6 This is the term Fredric Jameson himself uses to describe his own method. It is inspired by
young Karl Marx’s work on early German history.
References
Bourriaud, N., 2009. The Radicant. New York: Lukas & Sternberg.
Docx, E., 2011. Postmodernism is Dead, Prospect Magazine, [online] Available at: <http://www.
prospectmagazine.co.uk/2011/07/postmodernism-is-dead-va-exhibition-age-of-authenticism/>
[Accessed: 4 November 2011].
Dworkin, R., 2011. Justice for Hedgehogs. Cambridge, Mass.: The Belknap Press of Harvard University
Press.
Hadid, Z. & Schumacher, P., 2008. Parametricist Manifesto. In: Out there: Architecture beyond Building.
vol. 5, Manifestos, 11th International Architecture Exhibition La Biennale di Venezia. Venice: Marsilio,
pp. 60-3.
Jameson, F., 1991. Postmodernism, or the Cultural Logic of Late Capitalism. Durham: Duke University
Press.
Jameson, F., 2009. Valences of the Dialectic. New York: Verso.
Kunzru, H., 2011. Postmodernism: From the Cutting Edge to the Museum, The Guardian, [online] 17
September. Available at: <http://www.guardian.co.uk/artanddesign/2011/sep/15/postmodernism-cutting-edge-to-museum> [Accessed: 4 November 2011].
Latour, B., 1991. We Have Never Been Modern. Translated from French by C. Porter, 1993. Cambridge,
Mass.: Harvard University Press.
Lynch, K., 1960. The Image of the City. Cambridge, Mass.: MIT Press.
Lyotard, J.-F., 1979. The Postmodern Condition: A Report on Knowledge. Translated from French by G. Bennington and B. Massumi, 1984. Minneapolis: University of Minnesota Press.
Lyotard, J.-F., 1988. Réécrire la modernité. Les Cahiers de Philosophie, 5, pp. 193-203.
MVRDV, 1997. MVRDV at VPRO. Barcelona: Actar.
Nietzsche, F., 1872. The Birth of Tragedy. Translated from German by R. Speirs. In: R. Geuss & R. Speirs,
eds. 1999. The Birth of Tragedy and Other Writings. New York: Cambridge University Press, pp. 1-116.
Oosterhuis, K., 2009. iA Bookzine, [blog] 10 March. Available at: <http://www.bk.tudelft.nl/live/
pagina.jsp?id=771c1b4c-39ee-41e9-a197-990b3ace02b5&lang=en> [Accessed: 4 November 2011].
Oosterhuis, K., 2011. Towards a New Kind of Building: A Designer’s Guide for Nonstandard Architecture.
Rotterdam: NAi Publishers.
Pippin, R. B., 1999. Modernism as a Philosophical Problem: On the Dissatisfactions of European High
Culture. 2nd ed. Oxford: Blackwell.
Tarski, A., 1944. The Semantic Conception of Truth and the Foundations of Semantics. Philosophy
and Phenomenological Research, 4, pp. 341-375.
Toulmin, S., 1990. Cosmopolis: The Hidden Agenda of Modernity. Chicago: University of Chicago Press.
Williams, B., 2002. Truth and Truthfulness: An Essay in Genealogy. Princeton, NJ: Princeton University
Press.
Ana Maria Hariton
“Spiru Haret” University
Bucharest
Romania
Reading-Rewriting-Learning from
and Building on an Exemplary Precedent:
Bucharest’s Blind Alleys
Reflections on Contemporary Architecture
When confronted with the contemporary forms and preoccupations of architecture, one cannot help asking oneself a series of very simple questions:
Where is it from?
We are witnessing the emergence of a new International Style at an unprecedented
scale and degree of uniformization. Similar designs are built around the world, claiming to relate in mysterious ways to “local traditions” or schools of thought, when, in
fact, it is almost impossible to identify the locations of buildings situated in very different cultural areas. This lack of identity concerns not only the “conspicuous flashy items
of display”1 represented by many public buildings, but also usual building programs
like housing. One of the characteristics of contemporary architecture seems to be the
disappearance of the multitudinous traditional ways of making and creativity that are
absorbed into a dominant way of making and thinking2.
Almost anything can be built – but should we build anything?
The spectacular continuous curvilinear forms created by cutting edge technologies in
building construction are appropriate for iconic objects and significant public buildings. But are they becoming for housing? The question is related not only to costs but
also to our way of living. Is it possible to erase trough architectural forms any trace of
familiarity in our everyday life?
We respect nature. Should we respect people?
Sustainability seems to be the main ethical concern in contemporary approaches, and it is seen in itself as a guarantee of good architecture. In the field of domestic architecture we should ask ourselves how much we are willing to give up in terms of quality of life. The quest for energy efficiency leads to bizarre changes in the way we inhabit our homes. While living in a hermetic box might be (and already is) an acceptable choice in a gigantic polluted city, the construction of (passive and active) houses with no opening windows situated in breathtakingly beautiful and unpolluted landscapes is somewhat puzzling.3
Does contemporary architecture have any social goals?
The disappearance of place, while architectural objects are transformed into gigantic sculptures, seems to characterize contemporary architecture. “Not function but the designer’s highly personal preference dictates the look of the architecture. With that architecture has been brought one step closer to the visual art”4 – and one step away from its complex nature. A self-referential and impoverished architecture, centered on image and technology, dominates much of contemporary production.
Modernism also rejected the traditional structure of the city, creating discontinuity in traditional urban patterns. This disruptive effect at the level of urban form was, however, compensated by its social aims. Are the perfect forms and sustainability of
contemporary architecture enough? How should we address the problem of a socially
sustainable architecture?
A human architecture, one that tries to dissolve the negative aspects of technology and massification, seems to be the answer to many of the questions formulated above.
The meaning and interpretation of “human” is however very different and fluctuates
according to the dominant cultural paradigm.
The modernist approach started from the “generic human” of the hygienist vision5, considering people as being alike and having the same basic needs: space, light,
nature. This type of humanistic approach led to some very inhuman results: monotony, loss of a sense of continuity, refusal of context.
Today’s digital architecture (parametric, topological, etc.) emphasizes a “beyond the human” approach, generating forms according to nature’s rules. Architects break and construct codes envisioning “a later stage, more like the stage of organic life in which technology becomes an autonomous species”.6 Relating to nature’s forms and structures was a recurrent theme of twentieth-century architecture, starting with Art Nouveau and its organic images, continuing after the late fifties with Japanese Metabolism – based on the principles of organic growth – and ending with Santiago Calatrava’s
skeletal structures. The biological approach is valid as long as it is human centered,
resorting to our empathic understanding of nature. Biology includes the human, but
the human is actually defined particularly by the fact that it is more than biology.
Culture, as the essential human creation, should therefore constitute the relevant
aspect of a human architecture.
Our quests should be directed not only towards an amazing world of forms but towards culturally significant forms.
Fig. 1
Typology and characteristics.
Building in Bucharest – Typical Problems
Compared to other cities in Europe, Bucharest’s urban evolution is somewhat atypical. Since the nineteenth century the architecture of the city has undergone a process of rapid modernization, reflecting the evolution of European building styles. The image of the city was dominated by Eclecticism until 1930, when large-scale modernist buildings were erected along the new central boulevard. The Modernist Movement had a particularly strong influence in Bucharest, infusing the tastes of different social layers – from the high bourgeoisie to the more modest lower middle classes. Associated with progress and urbanity, the new style was employed to build the vast majority of housing between 1930 and 1940. At the urban level, compared to the archetypes of Modernism, interventions in Bucharest were adequate and (with the exception of the main Boulevard) did not create a radical change in the scale of the urban fabric.
The huge-scale dormitory districts of the socialist period, though affecting the general image of the city, left its historic area intact. With the exception of the major urban destruction in the southern part, caused by Ceausescu’s gigantic “House of the People” and its axis, Bucharest’s central perimeter retained its main morphological characteristics.
Urban sprawl followed in the 1990s, bringing the emergence of gated communities and developer-promoted suburban architecture. Gentrification never took place
Fig. 2
Intrarea Ursuletului (alley) – view.
Fig. 3
Intrarea Ursuletului – elements of morpho-typological analysis.
Fig. 4
Intrarea Ursuletului – elements of morpho-typological analysis.
in Bucharest. Severe building regulations and heritage preservation rules restricted
the interventions in the first ring. The central area situated within the second circulation ring thus became of utmost importance. A new pattern of building behavior emerged: agglutination of adjacent small plots, maximal use of land, and heights and building dimensions completely out of scale – justified by the building masses and heights on the nearest boulevard but completely inadequate to the immediate neighborhood.
This type of aggressive building led to the emergence of a series of questions awaiting an “architectural answer”:
- Is there a more adequate way of building that respects the scale of the inner city while ensuring an economically feasible density?
- Are there any historical precedents for this type of intervention?
- Is there a persistence of spatial models in Bucharest’s housing?
Fig. 5
Intrarea Pictor Verona – general view.
Fig. 6
Intrarea Pictor Verona – elements of morpho-typological analysis.
Fig. 7
Intrarea Pictor Verona – elements of morpho-typological analysis.
Fig. 8
Intrarea I.L. Caragiale – general view.
Fig. 9
Intrarea I.L. Caragiale – elements of morpho-typological analysis.
Fig. 10
Intrarea I.L. Caragiale – elements of morpho-typological analysis.
Stages and Results of the Design Studio Work
The aim was to take housing design to a further level (not by amplifying the number of buildings or the scale of the program, but through the study of complex urban sites) and to offer an adequate local response both to the alarming situation in the urban evolution of the city and to the ubiquitous generic in contemporary architecture.
In the attempt to create a local yet contemporary architecture, studio work focused on questions of typical spaces, at both building and urban scale. The study of urban morphology, considered not only in terms of the built form but as a reflection of the narrative of a particular place, was an essential part of the process.
Choosing the model
A morphological analysis of the urban patterns in the central area of Bucharest highlights the frequent presence of blind alleys. Their presence is even more noticeable when walking, due to their stylistic characteristics. Built on deep plots during a relatively short period (1933-1940), they show a remarkable harmony of facades while avoiding uniformity. Embodying a local form of modernism, and being built by relatively unknown architects, they were considered unrepresentative, and therefore none of them is included on the List of Historic Monuments. In fact, their features reflect the process of negotiation between the domestic and the foreign, the way in which the foreign, represented by the Modernist Movement, is rephrased within the new context. They represent the main characteristics of late Romanian modernism: moderate and unradical at both the formal and the social level. Less known than the main Boulevard, the great official buildings or the iconic urban villas of the thirties, the blind alleys integrate new stylistic forms in a city then dominated by Eclecticism. Generating a new spatial pattern that translates the typical sequential pathway connecting the external and interior spaces of the nineteenth-century urban houses,7 they create a nontraditional but "continuous" use of space, making radical formal change acceptable and enjoyable.
The study started from the analysis of the existing models, determining their spatial and social characteristics. Built in a variety of alveolar or square shapes (1), all of them are characterized by a series of intermediate spaces between the public space of the street and the private spaces of the apartments (the semi-public space of the alley, the symbolic garden, the decorated entrance door/hall). Their building height usually varies between 1 and 5 floors, but constantly respects the scale of the existing neighborhood. The buildings comprise 1 to 3 apartments per floor, a spatial configuration considered optimal in terms of the inhabitants' comfort.
Their spatial conformation determines a series of social characteristics: the space is lived as a personal one, facilitating communication and encounters between neighbors; the possibility of visual survey and the withdrawal from public circulation make them safe places for children, creating an adequate living environment even in the noisiest areas of the city.
Student work
The analysis of a series of characteristic examples of blind alleys started with morpho-typological and stylistic elements. During the study, the stylistic elements were eliminated, the capacity of assimilating new architectural forms being one of the main reasons for choosing the examples.
Each student group was assigned an urban site situated within the second circulation ring in the western part of the central area of Bucharest. One of the more complicated sites offered perhaps the most interesting and adequate responses.
Solving the difficulties raised by irregular plots in a low-rise area was challenging and led to a series of very different designs, all characterized by spatial coherence. From the interpretation of the same historical model emerged a variety of solutions, ranging from subdivisions of space prioritizing intimacy, through fluid continuous spaces, to volumes characterized by Cartesian regularity and order. Spatial continuity and coherence, building scale and height were considered essential determinants by every participant, while questions of stylistic expression remained secondary. The conformation of the apartments, with a dominance of smaller flats, reflected changes in family structure.
The contemporary interpretation of a socio-spatial model of urban living proved
to be beneficial, shifting the focus from the architectural object to its spatial and historic relation to the city.
Fig. 11
The site – aerial view, site plan and pictures of the adjacent buildings.
Fig. 12
Student project – Luiza Balaceanu.
Fig. 13
Student project – Oana Constantin.
Fig. 14
Student project – Ciprian Sivu Daponte.
Fig. 15
Student project – Izabela Davisca.
Conclusions
In a globalised world, cultural identity is one of the main topics of contemporary architectural discourse. Fascinated by the new seamless, fluid forms, many architects tend to forget, deny or minimize the historic continuity of the city.
The analysis and research of historic urban forms is not directed at striking out contemporary architectural expression, but tries to inscribe it in a logical socio-spatial continuum. This type of research is even more relevant when it is not based merely on stylistic attributes and interprets examples that have themselves successfully introduced changes into the local architecture. In seeking the expression of our cultural identity, our goal is not to banish the new forms related to contemporary technology but to eliminate the forms that cannot be integrated into the continuous and significant evolution of place.
Today, when technology is an essential part of our everyday lives, no one can contest its importance in architecture. Computer programs facilitate the move between design generation, structural analysis and design proposal; they play an essential part in the study of energy efficiency and generate complex shadow analyses. However, their most influential and visible area is the creation of a smooth architecture tied intrinsically to a broader cultural and design discourse.
The "contemporary digital architects find their legitimization in their exploitation of the latest technological advances, new digital means of composition and production, and the corresponding aesthetics of complex curvilinear surfaces. As a manifestation of new information driven processes that are transforming cultures, societies and economics on a global scale, they are seen as a logical and inevitable product of the digital zeitgeist."8 Opposing a new paradigm to the reductionist tendencies of modern architecture,9 digital architects fail to see that the claimed complexity is present only at the level of forms and technologies, transforming architecture into an empty permutation of perfect shapes. The progress of CNC techniques allows the fabrication of unprecedented forms, generating a new tectonics. However, the major issue in the architecture of digital forms is not buildability but the lack of cultural continuity. Making tabula rasa of the preceding architectural evolution, digital architects limit built space to form, creating impressive science-fiction environments.
We are witnessing a convergence of global interests that is transforming our cities into gigantic and impersonal showcases. Almost everywhere, municipalities try to forge brand new identities, building megalomaniac architectural productions that ignore and destroy the identity, memory and significance of place.
In an era of instant circulation of images and ideas, it is essential to understand that built space is not limited to form. The space of architecture is the inhabited space, occupied by the activities and reminiscences of human collectivities.10 By trading our perhaps not so glamorous but meaningful local way of building for the ubiquitous new forms, we lose more than obsolete buildings, willingly giving up our cultural identity. Therefore we must consider the alternatives proposed to the architect by history and narrative through the study of urban patterns.
"For architecture today the opportunities lay where the sources of differences are located. One source is the narrative that is specific though not necessarily unique, to a particular place and its culture. When we consider the nature of specific narratives whether canonical and royal or subversive and vulgate, what we find is the memory of
times and places that have fragmented and recorded into the past. The potentialities
of architecture today reside in the permutation of such mnemonic fragments that can
actually pronounce and amplify the difference”.11
Without eliminating contemporary forms, but trying to relate them to local differences, design based on the research of historical models is one of the main sources of continuity and spatial coherence. The study of urban patterns, not as empty forms but as reflections of our culture that are generated by, and themselves generate, patterns of living, becomes - through the acts of reading and rewriting - the source of a spatially and culturally related, human architecture, creating a vibrant palimpsest opposed to the tabula rasa of globalization.
Notes
1 Gordon Mathews, Cultural Identity in the Age of Globalization – Implications on Architecture, in The Domestic and the Foreign in Architecture, 010 Publishers, Rotterdam, 2007.
2 Arif Dirlik, Architecture of Global Modernity, Colonialism and Places, in The Domestic and the Foreign in Architecture, 010 Publishers, Rotterdam, 2007.
3 Joinery is one of the major sources of thermal loss. Still, breathing fresh air through ventilation systems and being unable to hear the noises of nature when living in the countryside is a hardly acceptable price for energy efficiency.
4 Kas Oosterhuis, The Synthetic Dimension, in Architecture Goes Wild, 010 Publishers, Rotterdam, 2002, p. 209.
5 As defined by Françoise Choay in L'urbanisme, utopies et réalités: Une anthologie, Paris, Seuil, coll. « Points », 1965.
6 Kas Oosterhuis, Architecture Goes Wild, 010 Publishers, Rotterdam, 2002, p. 140.
7 Continuous urban fronts were uncharacteristic of Bucharest (with the exception of the boulevards); therefore this spatial sequence, consisting of an ornate gate, a garden alley, a flight of stairs and a more or less elaborate glass porch ("marquise"), can be seen even in the most modest eclectic house.
8 Branko Kolarevic, Architecture in the Digital Age: Design and Manufacturing, Spon Press, 2003, p. 6.
9 See Lev Manovich's article Abstraction and Complexity (2004), http://manovich.net/articles/.
10 Luc Noppen, Architecture, forme urbaine et identité collective, Éditions du Septentrion, Sillery (Québec), 1995.
11 Sang Lee, Architecture Remixed, in The Domestic and the Foreign in Architecture, 010 Publishers, Rotterdam, 2007, p. 242.
Bibliography
Baumeister, Ruth / Lee, Sang (eds.), The Domestic and the Foreign in Architecture, 010 Publishers, Rotterdam, 2007.
Choay, Françoise, L'urbanisme, utopies et réalités: Une anthologie, Paris, Seuil, coll. « Points », 1965.
Gheorghiu, Petru, Un model de locuire bucurestean, Editura UAIM, Bucuresti, 2007.
Kolarevic, Branko, Architecture in the Digital Age: Design and Manufacturing, Spon Press, 2003.
Larkham, Peter, Understanding Urban Form?, in Urban Design, Winter 2005, Issue 93.
Machedon, Luminita / Scoffham, Ernie, Romanian Modernism: The Architecture of Bucharest 1920-1940, MIT Press, 1999.
Manovich, Lev, Abstraction and Complexity (2004), http://manovich.net/articles/.
Noppen, Luc (sous la direction de), Architecture, forme urbaine et identité collective, Éditions du Septentrion, Sillery (Québec), 1995.
Oosterhuis, Kas, Architecture Goes Wild, 010 Publishers, Rotterdam, 2002.
Pelletier, Louise / Pérez-Gómez, Alberto (eds.), Architecture, Ethics and Technology, McGill-Queen's University Press, 1994.
Beril Özmen Mayer
Eastern Mediterranean University
Northern Cyprus
Digital Design Thinking
and Moving Images Syndrome
Today’s Psychology – Media Age
In today's media age, people's psychology and personalities are affected in different ways by interaction with online technologies; individualism has led to a pacification of face-to-face communication, a shortcoming that could not have been predicted alongside the technical development. In the 1960s, everyday life was predicted to depend more on robots than on computers. If we forecast the next generation of advances today, we might instead expect an extreme reliance on ICT and media technologies, as well as on cyberspace and cyber-bodies. The digital domain and its technologies bring a complex continuum of human-machine fusions, so that new forms of symbolic communication occur with the 'techno-philiac body' declared in 1989, which points to a screen to stare at, to cyberspace and to cyberbodies (Featherstone and Burrows, 2000). The visual dimension thus tempts us away from the necessities of the other dimensions. The media remain faithful to the extension of the 'global village', situating our experiences as happenings that support the virtual, unreal world. Meanwhile, the nature of the design process has undergone dramatic changes with the emergent use of computer-aided design software, the information and communication technology tools developed and used in the architectural profession.
The moving images of the media arts began to be integrated into this very marketing picture. Intensified by postmodern thought, the image has emerged as the popular spatial experience. Now and then this is exaggerated, with a negative effect, as an eyesore that transforms the spatial entities of architecture into a form of image. Architecture has become an image-creating occupation in the contemporary world, and it has become part of the media itself. Nowadays, large numbers of cities advertise themselves to a larger audience through the image of their architecture, such as the Sydney Opera House or the various Guggenheim Museums; such architecture seems designed not for the real users, the immediate inhabitants of the city, but to serve as a city's signature within the global image. This kind of image-creating mechanism forces a struggle to bring designated objects into being just for the sake of the new and the better, not in the context of factual data.
The star cult of architecture, intended to change clichés, works rather against human nature; it makes the product more superficial, unreal and abstracted from the real world, working and serving only itself as art that satisfies the aesthetic senses, without considering the opposing view of the role of art within social, cultural and political notions in this highly competitive sphere. This phenomenon raises the question of how far this image architecture will influence and shake the human being and its social entities, as the only functional art, until the image invades realities.
Furthermore, our perception of tragic events such as wars and disasters has become unreal, experienced like a trailer through moving images on any screen: on TV, on the internet or in the movie theatre. Interestingly, architecture, a profession of constructing buildings for real needs, has started to serve otherwise. The real needs of the architectural programme are represented by unrealistic, moving images of buildings, detached from the context of a neighbourhood, a community or a town, and they become a watching experience for anyone. The outcomes of such design actions become art for art's sake, as in the famous controversial dictum.
Changes in the Learning Attitude
Although perception has not really changed with the new technologies, the representation of design has been modified under the different conditions of digital design thinking. Media articulate conceptual schemes into external, visible artefacts; such communication channels bring about alterations in transforming the internal mind into the external world (Wang and Tsai, 2011). The shift in the role of computers from tool to medium points to the 'social dimension of learning', which can be observed in the extensive use of the net and ICT facilities. At this interface of collaboration with the digital world, 'students see themselves as integral part of the interface between design knowledge and the affordances of new technologies' (Spicer and Huang, 2001).
Whereas the mechanism of accumulating knowledge is based on the human mind, which has similar standards and limits across the globe, teaching practices may exhibit various approaches: student-oriented approaches, mixtures of informal and formal learning, and different experiments at state, private or foundation universities, where seeing the student as a type of customer may affect education at various levels. The architecture student's behaviour, from the level of awareness to that of competence, should therefore be investigated through the digital media. As ICT extends all over the world, a new type of self-learning creates a leap in learning through social networking acting as a private course provider. In the last decades, the concept of lifelong learning has been established by the need to develop social capital in the developed countries. Consequently, the new era of learning brings a range of possibilities with screen-based technologies, in any place, in any way and at any age.
Recent pedagogical developments construct a dialogue between teaching and learning, connecting the two sides of the same coin and setting off a new emphasis on learning tuned to the individual capacities and needs of learners. Learners are no longer seen as passive recipients of knowledge and skills, but as active participants in the process. In digital education, 'pedagogy before technology' has become the common catchphrase, suggesting that new technologies be located within proven practices and models of teaching rather than created on totally new ground. Educational researchers have clarified that tools such as 'papyrus or paper', 'chalk or print' and 'overhead projection or TV' cannot make a great difference unless the innovation is assimilated into pedagogical practice, without altering fundamental truths about how people learn. A paradigm shift should nevertheless be admitted, owing to the multiple impacts on the nature of learning. Traditional pedagogical thought offers various approaches such as 'Constructivism' (Jonassen, 1999), 'Social Constructivism' (Vygotsky, 1986), 'Activity Theory' (Engestrom, 1999) and 'Experiential Learning' (Kolb, 1984), whereas new approaches for the digital age have developed as e-learning and networked-learning tendencies: 'Instructional Design' (Gagne, 2004), 'Collaborative Computing' (McConnell, 2000) and others (Beetham and Sharpe, 2007).
Digital Thinking in the Studio
As a generational sieve, architectural education has changed a great deal in the last century. The 1940s first caught the notion of the digital, followed by the 1950s, which triggered some experiments; the first VLE then started in the 1990s in the States. At the same time, there were other influences in the profession, described below as the "genius of the era":
"It was engineering in the early modernists, social advocacy in the 1960s, energy in the 1970s, post-modern historicism and literacy theory in 1980s and computers in 1990s." (Kelbaugh, 2004)
Architectural education has developed a manner that allows ICT and media presentations to play a key role for teachers and students in design studies, and even to publish their work virtually. This has created a new type of education and of professional expression. The digital use within the realm of architectural education will be discussed and examined here with the aim of assessing to what extent digital technology and the use of IT tools have influenced design thinking and the workflow of the design product.
Architects and educators state that these innovative technologies and computing devices have brought alternative and supportive interactions to both the design phases and project representation within professional education. However, as in other IT-based professional practices, the initial implementation of computer technology in design and in representation in the virtual environment causes a limited interactive behaviour between human beings and computers, considering the very powerful background of conventional graphic systems in the design professions. This discourages an active attitude, particularly if the software program unconsciously takes the lead from the designer, who no longer plays the active role but becomes the complementary part in the design delivery process, owing to the expertise and new mindset needed for this new representational image language.
The significance of the subject lies in how to place ICT in architectural education so as to develop new tactics for it in context. A key move would be to transfer the world of images into reality, as conventional drawing does, as a communicative tool that records valuable experience with an emphasis on human-centred architecture, and to search for different methodologies of conveying design ideas in line with these new attempts.
Methodologies of communication in architectural design have been defined in terms of ecological accountability, picturing this complex and contingent relationship with the environment in a wider perspective, through an analogy between 'a river and the banks it is moulded in' and computer-aided architectural design systems. The weakness of this novelty is that it may not fully acknowledge the environmental complexities of site and context through the design process. In the contemporary agenda of architecture, everyday practices have been changed by a transdisciplinary shift 'towards image and signs in the development of notational systems'. This crucial move to the 'world of images and signs' through new CAD tools has brought a struggle with other discursive media, in which architecture reveals a deficiency in transforming reality in educational practice. The transition from the analogue to the digital, which is still at an early stage, has caused these brand-new tools to be used with the same conventional perception of a kind of drawing board. The dilemma of this kind of attitude is that presentations in this new digital image world lack the crucial relationship with the immediate environment and context (Murray, 2011).
As a result, one can point out that, because of the high-speed acceptance of digital innovations, our mental schemata are not yet ready to adapt to the new mindset of the new technology and to its still unknown capacity to transfer design ideas into images in the best interest of design. This ambiguity may therefore lead to misuse of the capacity of the new notational structure of representation. It takes time to digest the new and to search for ways to present and describe design issues as competently as before.
In this digital revolution in architectural education, it is crucial to detect which conditions or situations make the difference in the new generation's learning attitudes and habits. What kind of social and cultural change goes in parallel with this digital revolution, specifically in architectural education? Is there any contextual change in the methods of the architectural design studio? Perhaps we have a very long way to go to answer these questions, yet we can already perceive some of the positive and negative circumstances. In questioning the use of these innovations while retaining the traditional communication techniques of architecture, with their own practical skills such as sketching, scaling and drawing, the question of how architectural presentations are to be implemented with the new technologies has remained unanswered.
The knowledge that is forgotten or lost in the process of digitization is 'practical skills, know-how, deeply embedded in the context of use, and other tacit knowledge associated with the habits of practice' (Dreyfus and Dreyfus, 1986).
As a general attitude, design students attempt to produce eye-catching images, animations and three-dimensional models when creating design ideas, using various software to represent design thoughts and proposals; this resembles prima-donna architecture, apparently owing to the psychology of the digital age. The product itself also becomes an abstraction, because the media of education have turned virtual. The realization can be different in the architecture studio, since education is still a cognitive process of the human brain and is more abstracted. When we compare studio happenings with real-life experience, architectural representation shows this abstraction in yet another manner in the design process, owing to the difference between architectural practice and education.
Designing virtual environments deals with the real world by mirroring realistic images reflected from people's perception. However, if the design is not given as a real-world problem, the result is the mere creation of an artefact and its accidental images, represented from imagination (Wang and Tsai, 2011). Accordingly, digitalizing the architectural design studio may in some circumstances exhibit a naïve and shallow interpretation of architectural knowledge, through images created by the software rather than by an architect or designer. It can be said, however, that the use of ICT and CAD tools has the potential to create better design and to present genuine design ideas successfully. This attitude should be encouraged, but also properly supported, so that such unrealistic representational imagery does not turn into a threat to good design production.
Recent educational practices in academia, in the mode of contemporary tutoring, may result in provocative products that chase the competitive psychology of the digital age. The moving images of the global world leave an impression on today's individual learners, who are screened and scanned by informational graphics everywhere. In the end, it is difficult to perceive how much of this educational practice reaches the audience as participants and practitioners of the act of designing in such fast-moving moments. This psychology is mirrored in architectural education: being an observer of many things leads to a kind of behaviour that may be seen in any presentation, animation, PowerPoint or video, which leaves us with fewer words and more staring at images; we prefer neither to talk nor to ask questions, but to text messages to others on iPads and iPhones (SMS, tweeting, Facebooking) and even to record on another medium, such as photos, videos and other signals, over those images.
Architecture candidates should understand where to position themselves in their profession. As Teymur pointed out, the difference between the concepts of "architecture" and a "building" is an interesting way to think about and distinguish the process from the product. This conflict can thus be theorized and idealized within architectural design in a more abstract way; the architect becomes someone in between a star architect and a lay person (Teymur, 2007). Fallacies in our contemporary architectural culture have been conceived around a number of obligations, such as being a solo artist, an inventor or an extremist. Most interestingly, the triumph of the 'bigger and higher' over the small, of 'urbanism over architecture' and of 'globality over locality' has been well described, along with the 'forgotten middle' users in the fast, mediated world. Media architecture in particular keeps this pace of the world with its untrue truths, because of its addiction to newness and science rather than to slow, local, site-specific, human-centric, humanitarian building, while at the same time it should be naked to all the senses (Kelbaugh, 2004). Overall, the education of architects should convey such a professional attitude, and institutions should consider and emphasize the importance of this necessity and of the responsibility to act on this ethics.
In that sense, re-humanizing architectural education is necessary in the technologically driven world of architecture. The architectural product should be represented through images with competent use of the software. A thorough analysis of the use of computer-aided design should be made, one aware of the learner's growth in design understanding towards competency, particularly at university. The role of the architecture student in the design process should thus be more than that of a script writer or scenario creator.
Final Remarks in the Technological Drive
Consequently, experience shows that IT representation has revealed a shortcoming that the author calls the 'moving images syndrome', based on presentations that catch the eye at first sight and avoid deeper design critique. On the one hand, the animated language of such presentations is a strong and dominant tool that pushes the imagination of designers to the far edges; on the other hand, this computer-aided action may not correspond to an expert design solution, or it may hide deficiencies behind the representational media. This is a growing phenomenon that needs to be addressed in order to understand the new design process and the information-communication-technology mindset, and to encourage the spread of designerly presentation patterns in architectural education. The recent generation of architectural students thus becomes an audience, and the architectural product becomes a screen or stage performance. In this case, the software is appointed as the maker of the architecture. Furthermore, computer technology has contributed to the creation of a kind of "Hollywood scenery" within the educational context, which seems a very elitist approach to design, the arts and the profession, and one that cannot reflect social responsibility.
The method and the tool are different in the digital; however, the problem we are seeking to solve remains the same. The main issue here is which digital changes will influence design education and how, and to what extent, level or position they will affect and alter the mood of design. If architectural representation becomes only an image, the quality of design in education and of products in practice decreases, becoming more superficial, even artificial, meaningless and drained of context. We then have to search for a kind of studio education that relies on the solution of an architectural problem, not one consisting only of context and images. Learning and teaching modes should be tested according to the underlying psychology of using these tools in different ways in different places.
From the viewpoint of teaching, it is important to recognize that the new conception of the moving images syndrome could be handled as 'image and context compatibility' in jury presentations at school, and not taken for granted along with these escape points. In practice, the presentation of technology, fabrication, manufacture and mass production sells especially boutique design as building dreams; in competitions it reaches from the concept and preliminary design to the design development phase, from imagination to reality. These questions remain open in the discussion of the moving images syndrome. Design itself needs to be connected with the problems of life in the built environment, in houses, neighbourhoods, urban quarters and the world's ecology. The design of architectural objects should no longer be abstracted from its duty and its roots. It serves human beings and a technology created by humans, which may be defined as artificial; however, it cannot be content to serve the world of images, even if we have been living in that matrix for a long time. Technologically driven architecture is a brand-new struggle; it should be humanized as much as possible.
References
Beetham, H. and R. Sharpe, (Eds.) (2007) Rethinking Pedagogy for a Digital Age: Designing and Delivering E-learning, Routledge, New York.
Dreyfus, H.L. and Dreyfus, S.E. (1986) Mind Over Machine: The Power of Human Intuition and Expertise
in the Age of the Machine, Oxford: Basil Blackwell.
Featherstone M. and R. Burrows, (Eds.) (2000) Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, Sage Publication, India, pp. 1-18.
Kelbaugh, D. (2004) Seven Fallacies in Architectural Culture, Journal of Architectural Education,
September, V. 58/1: 66-68.
Murray, S. (2011) Identification of the Burgeoning Field: Eniatype, Design Ecologies, Volume 1: pp.
13–29, Intellect Limited.
Spicer, D. E. and Huang J. (2001). Gurus and Godfathers: Learning Design in the Networked Age.
Education Communication and Information, Volume 1: 3, GSD, Harvard, USA: 325-358.
Teymur, N. (2007) Vitruvius in the Studio: What is Missing? In Design studio pedagogy: Horizon for
the Future, in Salama, A. M., N. Wilkinson, (Eds.) Gateshead, UK, 91-110, Urban International Press.
Wang, X. & J.J.-H. Tsai (Eds.) (2011) Collaborative Design in Virtual Environments, ISCA 48: 131–140,
springerlink.com.
Charalampos Politakis
Manchester Institute for Research and Innovation
in Art and Design (MIRIAD)
Manchester School of Architecture (MSA)
Manchester Metropolitan University
UK
Skeletal Apotheosis
of the Human Body
The human reception of architecture is interpreted through the human body. The form of the human body has therefore been used as a model and metaphor for architecture since antiquity. This research is based on the relation between the human body and architectural structures, and especially on how the human body has been the inspiration for the exterior form of architectural structures.
According to Santiago Calatrava (2002, p. 91), 'Another topic that is also very important in architecture is anatomy and the idea of reading in the human body structures, or appreciating in the human body, a sense of architecture. Whatever we do, the magnitude or the dimension of a thing is always related to our bodies. Architecture, in a very natural way, is purely related to humans, because it is done for --and by-- people. This makes anatomy a very powerful source of inspiration.' This paper explores Santiago Calatrava's approach to architectural design and the combination of the 'mechanical' and the 'human' aspect in his work. The case that this paper examines is the Planetarium of Valencia's City of Science (1996). The building reveals a reference to the shape of the human eye, mimicking the mechanical kinesis of the eyelid. The disembodied human eye is structured with a skeletal synthesis adapted to architectural and mechanical practices, with a personal, expressionistic approach to its 'realistic' structure. Calatrava's skeletal synthesis of the eye, with its adaptive kinesis, reveals a surrealistic approach to the eye's structure, an approach that presents a part of the human body as a colossal remnant of human existence. Calatrava reveals the inner architecture of the human eye and transforms it into the external shape of the building. Furthermore, as far as the colour of the structure is concerned, it might be said that the extensive use of white (mimicking the white of the human skeleton) that dominates the building reveals and underlines the skeletal architectural structure. With this practice, Calatrava creates a paradox of the eye's structure, creating an infinite chain of structures within the structure; inhabiting an inner layer of the human body that has always been secluded from our perspective, transforming the inner structure and revealing it as the outer structure of the building and of the human body itself.
The problem that Calatrava's work explores is that whenever a creator tries to imitate, project or approach the human body through its physical shape and function, this notion inevitably generates philosophical, sociological and anthropological questions and metaphors about perhaps the most familiar and intimate object: our own body. This notion of mimesis of the human body appears as a single architectural structure in Calatrava's work (even if it appears as a surrealistic, expressionistic or experimental artistic approach to the human body). Furthermore, it indicates a narrative exploration similar to a narrative ancient Greek metope, or perhaps, nowadays, the fashionable projections or the use of screens on the outer surfaces of buildings. Through technological advancement, the projection of the mechanization of the human body through cyberculture, robotics and computer-based practices has many manifestations. An interesting aspect of this specific work by Calatrava is the absence of gender and identity through the use of an organ common to all humans, which makes it a global mark. As Juhani Pallasmaa mentions (1996, p. 45), 'It is similarly inconceivable that we could think of purely cerebral architecture that would not be a projection of the human body and its movement through space. The art of architecture is also engaged with metaphysical and existential questions concerning man's being in the world.'
Christopher Hight, in his book 'Architectural Principles in the Age of Cybernetics' (2008), gives an explanation of the relationship between building and body.
Fig. 1
Feature0184_03x. [image online] Available at: <http://www.archnewsnow.com/features/images/
Feature0184_03x.jpg> [Accessed 10 October 2011].
Fig. 2
Feature0184_02x. [image online] Available at: <http://www.archnewsnow.com/features/images/
Feature0184_02x.jpg> [Accessed 10 October 2011].
According to Hight (2008, pp. 21-22), 'The relationship between building and body provides the ground (topos) that sets humanity apart in an architectonically closed and symbolically ordered space from the world of nature.' With that, Hight refers to the antithesis between human structures (which create topos) and nature. Hight also mentions that for phenomenologists and post-structuralists 'the body is the natural model for architecture, which became a model for nature, which was, in turn, a model for the body...' (2008, p. 37). The use of the human body as an architectural structure or building therefore creates metaphors about the meaning of such a practice and about what it might refer to.
In their book 'Metaphors We Live By', for instance, Lakoff and Johnson state that '…metaphor is pervasive in everyday life, not just in language but in thought and action' (1980, p. 3). Going further, against the tradition of objectivism in Western culture, the authors state: 'We see metaphor as essential to human understanding and as a mechanism for creating new meaning and new realities in our lives' (1980, pp. 195-196). Metaphors are an important tool, according to Lakoff and Johnson, in order 'to comprehend partially what cannot be comprehended totally: our feelings, aesthetic experiences, moral practices and spiritual awareness' (1980, p. 192). An anthropomorphic building (its exterior shape, at first, as a visual object and a public structure, and then its interior) manifests a monumental mirror of a new reality.
At this point the researcher would like to add that the initial idea of the research came about in 2009 as a result of the artistic and architectural application of anthropomorphic structures using 3D design and game-engine software. Initially, the idea for this research had its roots in ancient Greek mythology and in the tradition on Crete whereby Mount Juktas, located near Heraklion, was believed to be the grave of the king of the gods, Zeus (Rice, 1998, p. 212). The observation of nature and the parallel influence of the myth generated the desire to create a 3D anthropomorphic building based on the exterior structure of the human head. In the interactive installation entitled 'Emerging Face' the user explores a 3D environment defined by three main places:
Fig. 3
01_THE_ArxanseXT2 [image online] Available at: <http://www.archanes.gr/wow/photos.asp?id=2>
[Accessed 10 October 2011].
Fig. 4
Politakis, C., 2009, Emerging Face. [image online] Available at: <http://www.architizer.com/en_us/
projects/pictures/emerging-face-2009/24110/206887/> [Accessed 10 October 2011].
Fig. 5
Politakis, C., 2009, Emerging Face. [image online] Available at: <http://www.architizer.com/en_us/
projects/pictures/emerging-face-2009/24110/206888/> [Accessed 10 October 2011].
Fig. 6
Politakis, C., 2009, Emerging Face. [image online] Available at: <http://www.architizer.com/en_us/
projects/pictures/emerging-face-2009/24110/206897/> [Accessed 10 October 2011].
the city, nature (symbolized by the mountains) and the human head as a building between these two places.
In this dialogue between sculpture and architecture, questions arise about the 'use' of bodily structures and of mechanized or kinetic bodily-influenced structures in architecture. From an architectural perspective, what is the meaning of such a bodily representation, and how does such an architectural practice embody meaning in the relation between man and the mechanized world? Does this practice differ from the use of information-technology-driven architecture?
In 'Timaeus', Plato's description of the human body is a cartographical approach with an extended use of metaphors derived from the observation of nature. Also common in 'Timaeus' is a metaphysical approach, used in order to give an explanation of the cosmos. Plato's description of the human body locates features such as the head, the senses, the liver, the bones, etc. An important feature of his doctrine is the description of the human head. Plato describes it as the most important feature of the human body, the divine part of us, which imitates the spherical shape of the universe and has a back and a front side. As Plato mentions in 'Timaeus', 'first, then, the gods, imitating
the spherical shape of the universe, enclosed the two divine courses in a spherical body,
that, namely, which we now term the head, being the most divine part of us and the lord
of all that is in us: to this the gods, when they put together the body, gave all the other
members to be servants, considering that it partook of every sort of motion. In order then
that it might not tumble about among the high and deep places of the earth, but might be
able to get over the one and out of the other, they provided the body to be its vehicle and
means of locomotion; which consequently had length and was furnished with four limbs
extended and flexible; these god contrived to be instruments of locomotion with which it
might take hold and find support, and so be able to pass through all places, carrying on
high the dwelling-place of the most sacred and divine part of us. Such was the origin of
legs and hands, which for this reason were attached to every man; and the gods, deeming
the front part of man to be more honourable and fit to command than the hinder part,
made us to move mostly in a forward direction. Wherefore man must have his front part
unlike and distinguished from the rest of his body’ (Plato, Timaeus).
This partly metaphysical explanation and description of the human body also gives the basic philosophical hypothesis of Plato that the body is the medium of the soul; the body is therefore created in such a way as to satisfy the prospects of the soul: 'and so in the vessel of the head, they first of all put a face in which they inserted organs to minister in all things to the providence of the soul, and they appointed this part, which has authority, to be by nature the part which is in front' (Plato, Timaeus). An interesting example is Plato's description in 'Timaeus' of the peptic system and of the processes of inspiration and expiration; his descriptions are always given through a metaphoric paradigm based on the observation of the functions of nature. Specifically: 'Now after the superior powers had created all these natures to be food for us who are of the inferior nature, they cut various channels through the body as through a garden, that it might be watered as from a running stream'. As for inspiration and expiration, Plato mentions: 'This process, as we affirm, the name-giver named inspiration and expiration. And all this movement, active as well as passive, takes place in order that the body, being watered and cooled, may receive nourishment and life; for when the respiration is going in and out, and the fire, which is fast bound within, follows it, and ever and anon moving to and fro, enters through the belly and reaches the meat and drink, it dissolves them, and dividing them into small portions and guiding them through the passages where it goes, pumps them as from a fountain into the channels of the veins, and makes the stream of the veins flow through the body as through a conduit' (Plato, Timaeus). These metaphoric descriptions, related to similar functions in nature, reveal a 'mechanical' functioning of the human body.
In contrast to Plato, Descartes observed: 'I am really distinct from my body, and I can exist without it'. This hypothetical assumption by Descartes, mentioned in his Sixth Meditation, explores an unexplored perspective of the soul. The dualistic dichotomy between the corporeal materiality of the human body and the incorporeal soul/mind suggests and separates their presence and being in space and dimension; if there is such a separation, then the mind exists, or could exist, in separate spaces or dimensions; so on the one hand there is a common locative presence for body and mind, and on the other hand there is also another presence of mind at a different level or location. The body, then, is just the corporeal carrier, 'the thing', of mind and soul; it is therefore a medium, an extension of mind at a materialistic level.
A further fact that could be mentioned concerning the dualism in the existence of the 'ego' is based on Descartes's observation and his distinction between the divisible body and the indivisible mind. According to Descartes, 'there is a great difference between the mind and the body, inasmuch as the body is by its very nature always divisible, while the mind is utterly indivisible'. Descartes here clearly outlines the characteristics of the corporeal body, explaining that if any part of the body, for example a leg or an arm, is cut off, this would have no effect on the mind but only on the shape of the body; because there is no effect on the mind, 'nothing has thereby been taken away from the mind', an argument that separates the corporeal body from the incorporeal mind. Descartes, considering the question of 'what I was', proposes his thoughts with a description of the body: 'Well, the first thought to come to mind was that I had a face, hands, arms and the whole mechanical structure of limbs which can be seen in a corpse, and which I called the body'. Descartes clearly defines the mechanistic and materialistic form of human bodies (and of animals) by considering the body also 'as a kind of machine equipped with and made up by bones, nerves, muscles, veins, blood and skin perform all the same movements as it now does in those cases where movement is not under control of the will or, consequently, of the mind.' And he continues: 'by a body I understand whatever has a determinable shape and a definable location and can occupy a space in such a way as to exclude any other body'.
After the metaphoric description of the human body in relation to its functions and the cosmological observation of nature by Plato, and the mechanized description of the human body by Descartes, the researcher has sought to bring these two philosophical theories to bear on architectural buildings and architectural theory. How are these two philosophical approaches connected to architecture? When a building has an anthropomorphic shape and mechanisms have been added to its structure that create movement, and not merely a sense of movement, what metaphors can be generated by that practice? With regard to anthropomorphic architecture and the human body, the duality is not the separation of mind and body; rather, it is the dual existence of the human body and its man-made, disembodied, fragmented representation in the structure of anthropomorphic architecture.
The mechanized architectural representation of the human body in Calatrava's work and the 'mechanized' IT-driven architectural design raise a philosophical and theoretical question: are we at a time when we are 'adapting' Le Corbusier's theory to a different mechanization of architectural practice and inhabitation, involving IT technologies and our bodies, and thereby creating a living, 'digital', unconventional and conventional architecture?
In their article 'The Mechanical vs. Divine Body: The Rise of Modern Design Theory in Europe', Alexander Tzonis et al. (1975) examined through paradigms the 'conceptual systems of architecture in France between 1650 and 1800'. In the article they refer to several metaphorical observations and statements made regarding the connection between the human body and architecture and the use of buildings as instruments; as they mention, 'The first buildings conceived as machines were those which were compared to enlarged instruments'.
At this stage Le Corbusier's architectural theory can contribute to the exploration of the hypothesis of considering the human body as a building. Le Corbusier stated that 'A house is a machine for living in'. Taking Le Corbusier's idea and combining it with Plato's and Descartes's philosophical theories, it can be said that:
IF a House/Building = Machine → THEN a House/Building = Corpus
Thus: a House/Building = Body

As a result, from a philosophical perspective:
Because a Building = Body → the examination of the hypothesis of 'the human body as a building' has a philosophical background that has to be taken further into account.
IF the Building = Body = Machine → THEN the anthropomorphic shape of buildings can be analysed from a philosophical perspective.
In conclusion, both practices (the 'digital' unconventional and the conventional architecture), through the paradigms previously examined, generate metaphors and therefore new realities, which, according to Lakoff and Johnson, are notions of mechanization not only of architectural structures and practice but also of the human body itself.
Bibliography
Benedikt, M.L., 1991, Cyberspace: First Steps, MIT Press.
Calatrava, S., 2002, Santiago Calatrava: Conversations with Students - the M.I.T. Lectures, Princeton
Architectural Press.
Cottingham, J., ed., 1996, Descartes Meditations on First Philosophy, Cambridge University Press.
Drake, S., 2008, A Well-Composed Body - Anthropomorphism in Architecture, VDM Verlag Dr Müller.
Frascari, M., 1991, Monsters of Architecture: Anthropomorphism in Architectural Theory, Rowman &
Littlefield Publishers, INC.
Hight, C., 2008, Architectural principles in the age of cybernetics, Routledge.
Hutton, S., & Hedley, D., ed., 2008, Platonism at the Origins of Modernity: Studies on Platonism and
Early Modern Philosophy, Springer-Verlag New York Inc.
Lakoff, G., & Johnson, M., 1980, Metaphors We Live By, The University of Chicago Press.
Le Corbusier, 1985, Towards a New Architecture, Dover Publications.
Lethaby, W., 1994, Architecture Mysticism and Myth, SOLOS Press.
Pallasmaa, J., 1996, The Eyes of the Skin: Architecture and the Senses, Academy Editions.
Politakis, C., 2009, Emerging Face, Available at: <http://www.msa.mmu.ac.uk/continuity/index.
php/2010/01/25/emerging-face/> [Accessed 10 September 2011].
Rice, M., 1998, The Power of the Bull, Routledge.
Sharp, D., 1996, Architectural Monographs No 46: Santiago Calatrava, Academy Editions.
The Internet Classics Archive 2009, Timaeus by Plato, Available at: <http://classics.mit.edu/Plato/
timaeus.html> [Accessed 24 October 2010].
Tischhaser, A., & von Moos, S., ed., 1998, Calatrava - Public Buildings, Birkhäuser.
Tzonis, A., et al., 1975, The Mechanical vs. Divine Body. The rise of Modern Design Theory in Europe,
[online] Available at: < http://tzonis.com/dks/dks/publications/online%20publications/1975-JAEthe%20mechanical%20vs%20the%20divine.htm> [Accessed 24 October 2010].
Zardini, M., 1996, Santiago Calatrava Secret Sketchbook, The Monacelli Press.
Lucilla Zanolari Bottelli
Politecnico di Milano
Italy
Wall-e
This paper argues from an open question that leads to a statement. Where is interior architecture going in a technology-driven architectural trend? Technology is part of our history and a continuous development of human knowledge into practice; designers and schools of architecture should therefore consider the human and technology as a sustainable system, not letting one overwhelm the other. The balance between human ecology and technological systems, towards a better future and a higher quality of life, is possible in the respect of Nature. Utopia would be to consider this balance unachievable and to prefer one aspect over the other.
Technology is needed today to face problems connected with the environment and socio-cultural issues. We can no longer conceive of a lifestyle without a personal computer, a cellular phone or simple electrical wiring. We cannot imagine a world without planes, high-speed trains or cars, when mobility is not a future to come but a present to take.
On the other hand, getting used to technology induces us to sit down while the machine does the rest. One of the biggest challenges of the digital era is the rise of brain activity over physical activity. Thanks to advanced media communication, distances have narrowed into verbal connections, allowing people on opposite sides of the world to speak and look at each other at the same time while remaining seated in their homes, offices or cafés.
Most of the time the space of the office coincides with the home, generating a continuous cycle in the daily schedule. Through the web we might even have the groceries brought directly home, or chat with family and friends, and never leave the same environment. This apparent seclusion might then change the picture and turn technology into something negative. Indeed the fault is ours, because we are addicted to the interactive screen device: the less we actively interact with the outside world, the more we are attached to innovative technological devices.
This paper does not presume to offer a recipe for controlling the desire for technology-driven interiors, nor to give a good reason to choose one lifestyle instead of another. The paper proposes a positive way of thinking based on the teachings of Carlo De Carli, Maria Zambrano, Gregory Bateson, Leonardo da Vinci and Henry Riviére.
Where is interior architecture going in a technology-driven architectural trend? Let's begin from the end by introducing the story of Wall-e, the American computer-animated science fiction film produced by Pixar Animation Studios and directed by Andrew Stanton in 2008.
Wall-e is a robot. His duty is to collect waste on Earth, turning it into skyscraper blocks. It is the year 2700 and humankind is cruising the universe in search of natural soil to colonize, because Earth looks like a wasteland and life on it has become impossible. The spaceship is a clean, aseptic and shiny scenario, while a computer masters the daily schedule. The inhabitants lie on movable, technologically advanced couches with an interactive screen in front of their faces. This posture has turned man into a blob-shaped individual: physical motion has been substituted by IT devices. Emotions and feelings seem frozen at a superficial level until Wall-e finds a way to enter the spacecraft…
Like one of us, Wall-e has curiosities, gets surprised and excited, learns, and understands the meaning of beauty. He wakes up in the morning, has a sunny recharging breakfast through the solar panels on his metal body and goes to work. His job is to compact waste into cubes and store them one on top of the other, building walls of waste cubes that turn into skyscraper towers. Indeed, his name stands for Waste Allocation Load Lifter – Earth. Like any respectable worker he takes a break during the working day. At lunchtime he collects peculiar objects that make him wonder. His home is a container that shelters him and his domestic objects. But this is not the only peculiar aspect of his personality. Wall-e likes to look at the stars and to listen to music. His uniqueness goes even further: he senses the importance of a touch as something beyond holding hands, a non-verbal communication full of significance.
On the other hand, humans on the spacecraft seldom touch each other, being too busy looking at the screen. The screen is their only contact with the environment, and it allows them to relate to one another only through a verbal relationship. The moving device that transports them around the spacecraft is programmed to follow fixed trails and directions. The computer master controls the daily light, switching the artificial sunlight on and off, and even changes the color of people's clothing or provides them with food and beverages at set times. Humans pay little attention to their surroundings, since these are merely the background to their screens.
Fig. 1
Aurora. Chania 29/8/2011.
The comparison between Wall-e and the future human is not the point: we are not discussing who is the human and who is the machine. Wall-e is a cartoon character, and we are not turning into robots because of technology or IT-driven architecture. The question concerns the possibility of the setting: if one day we have spacecraft villages like those in the Pixar movie or in Star Trek, will we behave differently from how we do now? Will physical contact become obsolete? Will interiors reflect an automated system at the expense of human relations?
Human behavior generates spatial relations. As designers and architects, we were all taught in school that distance matters when human actions unfold in space. Dimensions are therefore important both for the distribution of rooms and for the scale of furniture. Movement and gestures build the used space of the domestic environment, reflecting the individual's culture, background and heritage. Meeting rooms have tables that differ in size and design from those in living rooms, since the relationship between colleagues is supposed to be more detached, standing a few steps further away than the relationship with friends and relatives. Different relationships require different interpersonal distances and particular spatial situations. Architecture cannot escape these schemes, which should form the basis of manuals and regulations. Behavior, as the synthesis of culture, society and history, is related to a person's origin, and it demands that these aspects be systematically structured and translated into place. Consequently, a gesture is a synthesis of information: social, cultural, functional, emotional, individual. The space needed for a gesture to be performed in relation to others refers to primary body language.
Carlo De Carli (1910-1991) defined spazio primario as the combination of the two, which is the subject of architecture. During the exhibition celebrating the centenary of his birth, Silvana Annichiarico, director of the Triennale Design Museum in Milan, said that De Carli's writings “are testament to the groundlessness of any separation
between exterior and interior, or between big and small, do not draw attention so much to
a space or to an item as such, but to the “process that led to the creation” of such space or
item and to their mutual relationships, in which multiple factors are at stake, sometimes
with conflicting interests, calling for a solution that must solve as well as transcend them.
Defined as the “space of first inner tensions” as well as the “space of gesture” or “relational space”, the primary space is born as soon as the self opens up to the others and to the
world, in a gesture of embrace and human solidarity. This is not just about the physical
atmosphere we are all surrounded by and breathe, this is about giving sense to or “making
sense” of such embrace and thus of the place in which it does or could occur. At first, the
primary space has no physical properties or any shape or any other formal description, it
lies entirely in the “preciousness” of the human being in a tight relationship between architecture and ethics, and between architecture and nature, that goes beyond mere functional usefulness in the attempt to understand and translate its meaning into a built project
to such an extent as to “pale” into representations laden with existential moments. As De
Carli wrote, when it is born it is already imbued with life, with all of one’s lived experience.”
Wall-e's interest in the meaning of holding hands and the pleasure of dancing or sharing a smile is related to situation and space. Innovative technology does not interfere with human relations, but when applied to space it might change the inhabitants' gestures and therefore their behavior. Changing does not imply substituting them: architecture is the reflection of primary spaces in which technology can be one aspect and perform as an attribute. Keeping this in mind, humans can participate actively in technology-driven architectures through the relations that occur in them, through them and by them.
Fig. 2
Human relations. Chania 31/8/2011.
De Carli used to say that the teacher-student relationship builds the classroom; architecture thus shapes space, creating the environment for learning. Technology-driven architecture has to deal with e-learning, in which the extension of speech is filtered by an electronic device. The question then turns to the space of human relations and how it has developed over time. It seems that even school can enter the domestic sphere, and inhabiting tends toward a hybridism of functions to which people easily become accustomed.
The contemporary trend of interrelating with the outside without moving will produce high-quality domestic spaces that are more and more focused on a worldwide communication platform. Interrelation is the most human aspect of inhabiting. For this reason, even if everything can be performed by a technological white box that turns into a forest landscape or the house of our dreams in a matter of seconds, we will still need a real window with a real view, to open it and smell the outside.
Virtual reality cannot compete with the real dimension, although it can induce similar feelings, touching the empathic sphere of the human.
It is true that everything can be designed as a setting, and that it may satisfy all our needs and desires listed in an infinite catalogue of pre-established possibilities. Yet man is a curious animal, and the Ulysses that each of us carries in his soul will always lead him toward the unknown and the unexperienced. We will always look for what lies beyond the wall. Technology-driven architecture is the result of this aspect of the human.
Our domestic world might turn into a private stage controlled by technology, as in the movie The Truman Show. But the question is: can we live inside an installation or a setting where the inhabitant has limited freedom and everything is largely decided in advance? Temporarily yes, precisely because it is a short-term location. In order to put down roots, the inhabitant needs to exchange with his space, to interfere with it, so as to leave a mark on his world.
If technology allows the staging of architecture, it also helps to achieve sustainable constructions.
The threat we perceive today in technology-driven architecture is in fact the consequence of bad use in the past. We should not forget the application of alternative energies to new constructions and renovated buildings, and how green buildings are technologically driven.
Even if we can speak of good and bad architecture, technology stands largely above this concept, since it is its use that makes it good or bad for the environment. Architects and designers should always face this responsibility, since by environment we mean not only Nature but also the primary space with its relations. The architectural project should consider the tensions that occur between the subjects in space. Seeing them in advance is what makes a sensitive designer and a good project. There is a link between the relations defined by De Carli and the tensions within space: an invisible presence that no technology is able to detect.
The Spanish essayist and philosopher María Zambrano (1904-1991), in her book De la aurora, wrote about the invisible threads of relations and tensions occurring in a place. These tensions, which Zambrano described by analyzing Las Meninas, painted by Diego Velázquez in 1656, are always there. The connections and the strength of the spatial relations reveal themselves through light during the aurora. It is a matter of seconds, yet it is the essence of space. As an appearance, they come to light. The aurora illuminates the relations as well as the observer who witnesses the moment. The apparent distinction between the philosophy of art and architecture in fact conceals an important common aim: the reading of the relations between the actors on stage, in the picture, as well as between the objects in space. Considering these tensions is part of the creative design process and makes it possible to foresee the architecture in becoming. In Zambrano's philosophy this characteristic is what may define beauty in the painting, as something expressing harmony and balance even through the contradiction of the parts. Beauty is therefore always present, and its illumination depends on the observer's sensibility and on the architecture.
Technology helps to control most of the decisions made in the design process, yet the aurora belongs to experience and sensibility. Both aspects, technology-driven architecture and a sensitive design process, together provide high-quality architecture. There should be no conflict between them, but rather collaboration.
Hence, while space gains its value through gestures and traces, experience runs through the senses. Vision, smell, touch, taste and sound manifest the qualities of space and allow the individual to recognize and memorize the place. Identity and recognition reveal architecture. Therefore the essence of space, the primary space, lies in the relations that occur in that specific space over a given time.
But what happens when space is simultaneous? When technological means allow
a continuous transformation of the spatial variables?
Contemporary installations tend to be more and more interactive. Active and passive experiences apply to both visitors and space. Sound, touch, vision, and sometimes even smell and taste, are design variables in most of the creative events presented in worldwide expos on inhabiting and furniture. Beyond the strong impact these experiments have on the public, the real question is: will this become the house of the future? Are spatial tensions becoming more and more dynamic? Can we exercise a sensibility that welcomes technological means in order to ensure good inhabiting? So far these architectures are ephemeral or temporary, in the sense that they do not last long enough to qualify as home. Maybe in the near future it will be different, and responsive architecture will be a way of living and not merely a creative event.
Indeed, technology-driven architecture already allows flexible inhabiting, since through media communication functions converge into the domestic space. Fire alarm detection or domotic systems, for example, structure the building as an organism that responds to certain stimuli.
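As a purely illustrative aside, not drawn from the original text, the stimulus-response behavior of such domotic systems can be pictured as a simple sense-decide-act loop; the sensor names and thresholds below are hypothetical.

```python
# Minimal sketch of a domotic "organism": sensors produce stimuli,
# rules map stimuli to responses. Names and thresholds are hypothetical.

def respond(stimuli):
    """Map a dict of sensor readings to a list of actions."""
    actions = []
    if stimuli.get("smoke", 0.0) > 0.2:          # fire alarm detection
        actions += ["sound_alarm", "unlock_exits"]
    if stimuli.get("presence", False) and stimuli.get("lux", 1000) < 150:
        actions.append("raise_lighting")          # someone present in a dark room
    if stimuli.get("indoor_temp", 21.0) > 27.0:
        actions.append("open_ventilation")
    return actions

# One reading taken at one instant of the building's life:
print(respond({"smoke": 0.05, "presence": True, "lux": 90, "indoor_temp": 28.5}))
# -> ['raise_lighting', 'open_ventilation']
```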
Contemporary technological means ensure a value-based responsive architecture in two ways: one related to the integration of functions and spatial distribution, the other through the systems that run the building and make it a performing construction.
The picture of extremely responsive interiors is not just a cartoon fantasy. The space of inhabiting changes over time, not only as generations go by but also as the inhabitant grows up and turns from an individual into a family. Home is a space of belonging, which the human considers a small root of identity. Martin Heidegger (1889-1976) called this feeling of being here and now Dasein, defining it as the human condition. This leads to a dual relationship between inhabitant and space: each transforms the other. As a consequence, the contact produces a place that the human holds in his memory and takes as part of his story.
Places and spaces are quite different in terms of human perception. Marc Augé (b. 1935) addressed place as a space with a higher quality of life than other spaces can provide, since the former belongs to people while the latter lack identity. Architecture aims to transform spaces into places, yet the inhabitant is the secret ingredient. Architecture without the inhabitant is useless, failing to accomplish what it is meant for.
Technology-driven architecture actually does even more for the human than previous architecture did. The level of interaction between the inhabitant and technological space is higher, owing to what technology demands of and offers to the inhabitant, both as function and as service. Technology-driven architecture provides spatial relations that traditional buildings could not support. Basic human needs are always the same and they are assured, though machines, wi-fi systems, domotic systems and other devices affect the dimensions and distribution of domestic spaces.
Compact electronic devices take up less and less room in interiors, and allow smaller spaces to be perfectly inhabited. Indeed, the decreasing size of domestic surfaces in cities is also due to market prices, which rise considerably and consistently over time. If in the Modern Movement compacting was an effort in the name of the machine vivant, today it is even supported by economic means. This aspect, though, does not change the feeling of the place at all: home remains the human's private rooting place. Even if pictures are stored in small boxes (hard disks) that sometimes substitute shelves of books, the domestic space holds our belongings, memories and history. Even if albums are available online from anywhere in the world, home remains a real three-dimensional place in which relations occur and keep it alive. Being here and now – the Dasein – is sharing the contingent world. Humans need a real hug, a walk together, a shared silence in order to feel the same reality. Thus innovative technological interiors still contain our story and belongings. Even if domestic spaces change over time, receptive architecture is still a social container, and it reflects people's need to be technologically up to date. The trend is undoubtedly to isolate the individual in the physical sphere while opening his mind to a wide world of external, untouchable possibilities. The digital era is separating direct social behavior from an indirect relationship made of visual effects and script. Technological interiors may be green for the environment; but what about the ecology of mind?
Similarly to De Carli's primary space, Gregory Bateson (1904-1980) referred to space as a relational system, assigning architecture the task of structuring the ecology of mind. Culture is a mutually interdependent world in which individual relationships shape socially shared meanings, while these collective meanings simultaneously inform the individuals' understanding of their own actions. According to the definition of ecology as the scientific study of the relations that living organisms have with each other and with their natural environment, architecture builds an ecological system in which to welcome human relations. This systemic approach recalls Zambrano's analysis of the tensions between the subjects in place and De Carli's primary space. Ecological space refers to a variety of opportunities that make the place alive and proactive. The demand for interactive devices and high-performing architecture makes space more and more receptive, adding a new variable to the ecology of mind: architecture itself as part of the system. The tensions between the subjects in place gain another coordinate relative to the technological offer provided. This time the invisible thread is a potential use, a service that nevertheless directs the subjects toward certain behaviors instead of others. From this point of view, technology is architecture as well.
The attitude of looking at things from a different perspective is part of Bateson's teaching. In 2010 his youngest daughter, Nora Bateson, released a documentary about her father, An Ecology of Mind. At the opening she said: "I am inviting you to do the thing he did best, which is to look at a thing from another angle, anything at all, be it an earth worm, a number sequence, a tree, a formal definition of addiction." The ecology of mind is a view toward a sustainable world, one that can be built and rebuilt according to contingent human necessities. Thus, from the technological point of view, we will discover a world of new possibilities that architecture can offer.
Technology-driven architectures are a developing present reality and a picture of contemporary society. In an ecological approach, we should consider how local identity (the interior root) and global belonging (the outside world) balance each other against the need for nature and for the community as a whole. In other words, we should lead technology toward a systemic approach that respects sustainability. In this way architecture may also make a difference in world economy and environmental issues, toward a building society that respects itself and the generations to come.
The discussion of Nature in relation to Science leads to the writings of Leonardo da Vinci (1452-1519). In his books he wrote that Beauty lies in Nature, within the form/function/space relationship. As an exercise of balance between techne and behavioral responses (natural and human), the project is an act of love toward the future inhabitant.
If form may be considered above technique, then it drives the architect toward conceiving an organism, which sustains itself through technical form and comes alive through inhabiting.
Through Leonardo's eyes, innovative technology-driven architecture is a matter of dedication toward life. The result is a mechanism, a project, a place that can be lived as a primary experience. Getting accustomed to new inventions and technological spaces is also part of the system, since development and enterprise are endless as long as the human remains a thinking being.
Technology is pursued for the human and relates to the beauty of Nature as an attribute of human experience. In Leonardo's conception there is no techne without the human and no beauty without Nature, but life is a combination of all of them. The ecology of mind refers to this kind of relationship, in which the parts come to understand one another after discovering each other. Technology is not there to understand but to provide answers to human needs. Therefore, as an act of love, adaptive architecture may accept a more influential and decisive presence of the human being.
In the movie Wall-e, advanced technology holds the inhabitants as prisoners, unable to make decisions for themselves or to oppose their will to a frozen environment. But the real issue is how to keep the will, the desire, the curiosity to be an active part of this world. As long as the virtual Ulysses hiding in each of us keeps travelling and looking for new meanings and new possibilities, technology will not take over. No matter how innovative our homes become, or how much automatic systems substitute actions and customs, knowledge and sharing will always make the difference. Why? Because of feelings.
Man will always push against his limits and never satisfy his thirst for knowledge. Emotions are the propulsion to realize his dreams, to answer his questions and to find his role in the universe. The transcendent quality of the human mind comes into practice through progress. Technology leads to the new frontier of the unknown, but without relationships there is no place, and without a dialogue between the parts (interior/exterior, object/user, space/inhabitant) there is no story.
Architecture, technology-driven or not, refers to this story and enables people to fill their contingent carnet. By considering the memory of the place at any time – past, present, future – architects and designers will respect the human being. How this is possible is a matter of codifying and reading the place.
Fig. 3
History of the place. Chania 30/8/2011.
Fig. 4
Interior roots. Athens 2/9/2011.
Today's cities tend to transform themselves through futuristic masterplans that give them a metropolitan look, most of the time leaving the history of the place behind. The consequence is a high-performing urbanization that has lost its roots. It might be a new starting point, but what about the interior roots that a place keeps inside? What about the belonging and identity of the people who recognize themselves in that place?
In order to conceive architectures that allow continuity with local roots, technology should be able to run together with the sensibility of the place, and should not be used to override it. This is possible only in the course of the project. As a project variable, and for a sustainable future, advanced innovative technology must protect architecture from becoming consumable, self-complacent and merely formally attractive.
Teaching has an important role in this issue. The argument touches every discipline at every level, since what occurs in space is a synthesis of human decisions about economy, environment, politics, society, industry and so on. Architecture is a subject that considers every implication, yet it needs help in order to proceed coherently. Technology-driven architecture is a group working process in which each part can be highlighted as part of the story and have a voice in it.
As long as these directions survive in our conscience, space will always be human-related, no matter how much advanced innovative technology might be involved in it. The architect Henri Rivière (1965-2010) said, "Architecture is only made within the context of human adventure". The adventure may be giving a first kiss, tasting an unusual flavor, learning how to dance, speaking through a drawing, applying for a job… in brief, experiencing in person. Technology-driven architecture participates in this as well.
Where is interior architecture heading in a technology-driven architectural trend? It remains to be discovered. Prepare yourselves for the adventure.
References
Augé, M., 1992. Non-Lieux: Introduction à une anthropologie de la surmodernité. Ed. Dominique Rolland. Milano: Elèuthera.
Bateson, G., 1972. Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology. University of Chicago Press.
De Carli, C., 1982. Architettura: spazio primario. Milano: Hoepli.
Heidegger, M., 1951. "Bauen Wohnen Denken". In Vorträge und Aufsätze (1936-53). Ed. Gianni Vattimo. Milano: Mursia.
Moatti et Rivière, 2009. Moatti et Rivière: La promesse de l'image. Images En Manoeuvres.
Zambrano, M., 1986. De la aurora. Trans. Elena Laurenzi, 2000. Genova: Marietti.
(Re)scripting and fabricating a Critically Responsive Architecture
Jan Slyk
Warsaw University of Technology
Faculty of Architecture
Poland
Interactive Shadow Play
As Lisa Kennedy remarked: “So sci-fi is really a candy store for someone with imagination. Real science fiction always has a platform of truth. The best is really about the preternatural, and the worst is the kind where there’s no science to the fiction (…)”.
Undertaking the work of creating the life scenery of the year 2054, Steven Spielberg gathered huge intellectual capital. He invited fifteen experts to the Shutters hotel in Santa Monica. Stewart Brand and Douglas Coupland represented a humanistic approach grounded in their writing experience; Jaron Lanier, one of the pathfinders of virtual reality research, shared his experience of cyberspace exploration; Neil Gershenfeld, head of the Center for Bits and Atoms at MIT, introduced the newest achievements and products of high technology. Most interestingly, a substantial part of the think-tank's work concerned spatial issues: Peter Calthorpe presented the urban outlook, and William Mitchell the architectural one.
The debaters inquired into how technology would influence changes in the world around us. Concepts and hypotheses developed during the discussion were used throughout the work on the movie. It is difficult to guess whether the milieu presented in Minority Report describes our future. In any case, it seems a plausible version of the days to come precisely because it does not escalate into extreme ideas. It respects human habits and sentiments. It allows the watchman in the cyber jail to enjoy a kitsch lamp, and Lamar Burgess, the head of PreCrime, to live in a traditional Victorian-style house. Cyber-transferred elements of space are mixed with conventionally formed ones, aiming to expand and functionally improve the environment.
The GAP store in which Anderton buys clothes is organized according to the ingrained standard of shopping. The client has to enter, touch the garments on the hangers, try them on. The difference lies in the discreetly situated cyber upgrades. The store recognizes the client, remembers his likes and dislikes, unobtrusively suggests solutions and asks questions about impressions. All this happens thanks to sensors and signal emitters connected to an interactive system. The scenery of Minority Report can be described as a composite environment embracing real elements that support the human senses, as well as cyberspace components that raise the efficiency of perception. Interestingly, many of the elements shown in the movie have since been realized.
The technical equipment of the environment in which we live has a significant influence on architecture. The fulfillment of human needs demands rational action that takes advantage of current capabilities at minimum cost. The physical building arises as the result of a negotiation between aspirations and limitations. Among the latter we increasingly notice a reduction of accessible resources, natural and spatial, and a higher density of tasks that reduces the human capacity to operate in temporal, physical and emotional terms.
The information revolution creates a chance to overcome barriers on several levels. It allows more effective solutions to be obtained for previously known scenarios (optimizing material and energy expenditure, organizing space in a rational way). It provides global coordination (compatibility with the newest knowledge, collision-free delivery, cooperation). It opens the way to a physical enlargement of the area of architectural activity (through exploration of augmented and virtual space). It changes the interpretation of time and spatiotemporal phenomena (thanks to the parallelism of processes, interactivity and continuous updating).
Starting from observations in the field of psychology, Antonino Saggio creates an image of the contemporary architectural environment consisting of two spheres: the objective (materiality) and the subjective (space). The first is based on a Cartesian set of notions and measurements; the other requires consideration of the individual features of the context.
The intensification of the information structure, a result of widespread computerization, strongly influences interpretation; it is capable of creating conditions of perception that bypass material means.
During the creation of architectural works we transform physical components of the world. The essence of the creative intervention, as well as the certainty of communication with the receiver, still remains the form of shaping space, or in fact information about that form. The continuously changing environment of the global network of connections marginalizes real things and highlights the significance of relations. Precise tools make it possible to capture in a package of bits an almost complete definition of the architectural object, including not only its physiognomic characteristics but also a description of its conditions of use, its history and its range of variability.
When, in the pre-Christmas season, I was walking through a shopping mall hall, my eye was caught by a group of people performing very odd body gestures. They were behaving in an organized manner and, at the same time, in a way completely detached from the context. When I approached, I saw a computer screen and a control device above it.
Multi-sensory access to virtual reality is a challenge to which contemporary technologies have not yet fully responded. Still, thanks among other things to game consoles, the distance between the computer and its user has been considerably shortened. The Power Glove made by Nintendo enables you to see your own hand inside the space of the game. The Wii controller, which contains a responsive triaxial position sensor, strengthened the potential connections linking the algorithm executed by the computer with the user's behavior. Grasped in the hand, it was transformed into a tennis racket or a golf club; fastened to an ankle, it allowed you to score a goal against Barcelona in the scenery of Camp Nou.
The people who intrigued me, jumping in front of the screen, were using a Kinect controller, even simpler and more versatile. An IR camera scanned the chosen area and dedicated software reconstructed the arrangement of objects in space. Thanks to an efficient shape-recognition procedure, people's gestures and the relations between participants of the game became information understood by the computer. Because the system reacted in real time to behavior (voice, movement, contact with elements of the projection), the participants of the game sank into the virtual world, losing contact with the real environment.
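To make the idea of gestures "becoming information understood by the computer" concrete, the sketch below (not from the original text) reduces a depth frame to a rough participant position. The grid size and range threshold are hypothetical, and a real installation would read frames from a Kinect driver rather than the random array used here as a stand-in.

```python
# Sketch: turn a depth frame into simple position information.
# A random array stands in for a real Kinect frame; sizes and ranges are hypothetical.
import numpy as np

def locate_participant(depth, near=800, far=2500):
    """Return the (row, col) centre of pixels within the interaction range, or None."""
    mask = (depth > near) & (depth < far)        # keep only bodies in range (mm)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()              # centroid = rough body position

frame = np.random.randint(500, 4000, size=(240, 320))   # stand-in depth frame
print(locate_participant(frame))
```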
By developing computer devices and improving mechanisms of interaction, we change ourselves. This is perfectly visible when comparing behaviors. Mature members of society do not trust sense amplifiers. They treat contact with computers, numerically controlled machines and, particularly, interactive devices as a challenge, a task requiring a procedure compatible with the manual. Subconsciously they divide the process into components and assign functions and hierarchy to each stage. Children, just the opposite, embrace the wholeness of the new environment, learning through exploration. How often we observe the irritation of a preschooler vexed because an electronic toy "does not do" something. By default, a five-year-old treats the information infrastructure as an organism capable of anything, self-learning and reactive.
Fig. 1
Wooden skin of the pavilion under construction.
Gilles Deleuze explains changes in the perception of the world by comparing a society of discipline with a society of control. The first refers to the generation formed by the industrial era: linear and direct. Interpersonal relations and interaction with the environment are described by a set of rules. A member of the discipline society participates in processes by accepting or rejecting roles, observing rules and taboos. He begins and finishes stages of activity (education, work, leisure). He uses machines which he fully controls until they break down or are sabotaged.
The society of control is our present day. Here, contacts have a non-linear character: parallel and relational. Principles and behavior patterns are replaced by passwords and codes providing access to particular contacts and functions. One's position in the global network of connections depends on fulfilling certain conditions. The man of control does not use machines dedicated to specific tasks; he uses computers and the Internet. Contact with them does not consist in management but in interactive control.
Architectural communication has changed in the age of information technology. The materiality of space, as well as human responsiveness, is now exposed to the impact of digital messages and representations. The mixture of physical and virtual components constructs an augmented three-dimensional context. Multiple interpretation has come to architecture in the sense in which it exists in music (Mitchell, 1998).
In place of a definite building image, the new architecture offers numerous variations dependent on parameters. Environmental constraints as well as user demands influence the spatial solution in real time. Interactivity catalyzes architectural creation and perception (Saggio, 2010).
Since interaction became a component of the message, architects started to develop creation techniques embracing feedback features. Consequently, architectural education needs to address interactivity, as it complements the contemporary professional workshop. However, creating interactivity differs from the traditional shaping of space, as it means forming a process rather than a permanent structure. It demands programming and interdisciplinary knowledge.
Fig. 2
1st phase accomplished: skin of the box waiting for the interior.
The ASKtheBOX pavilion described below came into being as a learning environment built by students to experiment with interactivity. The implemented methods tend to integrate all the components needed (space, material, process, program). However, the infrastructure and tools are deliberately limited to allow full understanding and participation. During two weeks of intensive workshop the entire installation was designed and constructed from scratch. As an interactive "instrument" it needed tuning, which took place before the exhibition.
Background
Architecture for Society of Knowledge (ASK) is a new form of master's studies in English at the Faculty of Architecture, Warsaw University of Technology. The concept of the studies profile, prepared in 2009 by professor Stefan Wrona's team, received financial support from European Union resources. Since September 2010 a group of 24 students has been trained in classes developed particularly for this program. The ASK profile was built around problems related to the development of information technologies, their public and social results, and the consequences affecting architectural activity. During the studies several problems are discussed, in particular: digitally supported creation, algorithmic and parametric design, digital prototyping and fabrication, simulation, optimization, the construction of integrated environments for knowledge acquisition, building information systems, mobility, and robotic structures. The application of new techniques and technologies takes place in a didactic atmosphere rooted in the tradition of the Warsaw School of Architecture. From this arises the form of the classes and the method of selecting subjects, creating a bridge between the valuable achievements of the past and the chances uncovered by the future.
Fig. 3
Digital model sliced for fabrication.
Fig. 4
Assembling digitally-cut styrofoam elements.
Experimental projects constitute an important component of the ASK concept. During the studies they interweave with traditional design classes. They allow attention to focus on the research method. They require the definition of an observational environment (laboratory), the performance of tests and experiments, and the objectification of observations. Design techniques, the spatial result, simulation of the final target state and functional remarks together create the background conditions for the architectural experiment, which is the subject of the students' semester work.
The pavilion was built at the Faculty of Architecture, Warsaw University of Technology, in the classroom named after Professor Stefan Bryla. It was open to visitors from 19 April 2011. It was developed as a result of the experimental project course, part of the design studio conducted by Jan Slyk for second-semester ASK students.
The ASKtheBOX pavilion performs the experiment by separating a place within space: a place whose state and behavior can be modeled and transformed. Constant elements of the equipment determine the conditions of usage. A minimalistic infrastructure collects and generates the signals. A control program coordinates the sequence of events in time and space.
In order to provide functionality enabling manipulation without programming intervention, the pavilion was equipped with a "control panel" giving access to its internal regulation mechanism. ASKtheBOX is not only a design of space but a project of a process in which the author, or even the user, is able to define an individual environment of interaction.
Space
A wooden panel box (2.5 x 2.5 x 3.75 m) was built at the beginning as the limit of the space and the support for any internal structure. The essential demand for the interior was to stimulate users to undertake various actions (move, sit, stand, dance, touch elements, etc.). Taking series of images, we researched how people explore empty space. Multilayered still frames gave us grounds to bore the cave that became the idea for the interactive space.
A three-dimensional model came into being through several alternative procedures: traditional sculpting in clay plus 3D scanning, shaping voxel space with haptic devices (SenSable 3D Phantoms), and 3D computer modeling based on ergonomic measurements. Once it was finished, a 1:10 scale model was prototyped and the fabrication process started. The complex curved shape needed layering and division. A computer-controlled thermal cutting machine cut out over 500 elements, which were labeled and mounted.
Fig. 5
Cardboard laser-cut model preparation.
Fig. 6
Pavilion model in 1:10.
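The layering and division described above can be illustrated by a toy slicing routine (not the team's actual toolchain): an implicit solid is sampled at successive heights and each layer is labeled for cutting. The cave-like shape and the layer thickness below are invented for the example.

```python
# Toy slicer: sample an implicit solid at successive heights and label each layer.
# The ellipsoidal "cave" and the 40 mm layer step are invented for the example.
import numpy as np

def inside(x, y, z):
    """Hypothetical solid roughly fitting the 2.5 x 2.5 x 3.75 m box."""
    return (x / 1.25) ** 2 + (y / 1.25) ** 2 + (z / 1.875) ** 2 <= 1.0

def slice_layers(step=0.04, cells=50):
    xs = np.linspace(-1.25, 1.25, cells)
    layers = []
    for k, z in enumerate(np.arange(-1.875, 1.875, step)):
        X, Y = np.meshgrid(xs, xs)
        mask = inside(X, Y, z)                   # 2D cutting pattern for this layer
        if mask.any():
            layers.append((f"L{k:03d}", mask))   # label + pattern, as on the real elements
    return layers

print(len(slice_layers()), "layers to cut and label")
```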
Interaction
ASKtheBOX plays (creates) music that corresponds to human behaviors inside it. We decided to keep the installation as simple as possible; therefore no motion detectors or other sensors were installed. The pavilion applies a projection mechanism, as a camera obscura does. A strong light source casts the user's shadow onto a translucent back wall. The image is then grabbed on the opposite side by a simple USB camera. The sequence of images transmitted to the computer managing the system provides coded information about movements inside the box. The music played as output is composed in real time.
People visiting the pavilion start by getting familiar with its functionality and investigating possibilities. After that they make more conscious choices, calibrating their own sound environment and trying to move within it, composing music...
The basic procedure of output production depends on predefined acoustic loops. Samples prepared to be played simultaneously are stored in the memory of the system. The moment the shadow analysis detects a signal, a particular loop is switched on and lasts as long as the user holds the position. In this mode the ASKtheBOX program works like a musical sequencer: it synchronizes loops and controls delays and ease-in and ease-out effects.
The pavilion also generates its own music. The program contains a module producing MIDI sound. Parameters taken from shape recognition are transcoded into MIDI control data (frequency, time, modulation, etc.). There is as yet no functionality generating global structures such as harmony or form, but the system is open to absorbing composing algorithms such as David Cope's ALICE.
The first procedure (sequencer) allows the rules to be strictly controlled but limits the variety of output. The second (generator) gives back unlimited sound diversity but demands highly complicated managing mechanisms.
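A minimal sketch of the two modes follows; it is not the installation's actual Processing code, and the sector names, loop files and pitch mapping are hypothetical.

```python
# Sketch of the two ASKtheBOX output modes.
# Sector names, loop assignments and the pitch mapping are hypothetical.

LOOP_FOR_SECTOR = {"left": "bass.wav", "centre": "pad.wav", "right": "beat.wav"}

def sequencer(active_sectors, playing):
    """Mode 1: switch predefined loops on/off according to occupied sectors."""
    wanted = {LOOP_FOR_SECTOR[s] for s in active_sectors}
    to_start, to_stop = wanted - playing, playing - wanted
    return wanted, to_start, to_stop            # the audio host would act on these

def generator(darkness, height):
    """Mode 2: transcode shadow parameters into a MIDI-like note event."""
    pitch = 36 + int(height * 48)               # vertical position -> pitch (36..84)
    velocity = int(40 + darkness * 87)          # darker shadow -> louder note
    return {"pitch": pitch, "velocity": velocity, "duration": 0.25}

playing, start, stop = sequencer({"left", "right"}, {"pad.wav"})
print(start, stop)                              # loops to start / stop this frame
print(generator(darkness=0.6, height=0.5))
```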
The strong correlation between space and music is not accidental here, and it is not only the output format that benefits from musical methodology. Long before computing became a crucial technique for organizing processes, music explored algorithmic composition using analog or semi-analog tools. Signals of this direction can be found in Guido d'Arezzo's medieval Micrologus and in Mozart's Musikalisches Würfelspiel (where the composer sets the rules and throwing dice contributes the needed entropy).
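The dice-game principle can be shown in a few lines; this is a generic sketch, not Mozart's actual measure tables, and the measure ids and the eight-position length are invented.

```python
# Sketch of a Würfelspiel-style composer: the rules fix a table of pre-composed
# measures, the dice supply the entropy. Measure ids and piece length are invented.
import random

# For each position in the piece, eleven candidate measures (dice sums 2..12).
TABLE = [[f"m{pos}-{roll}" for roll in range(2, 13)] for pos in range(8)]

def roll_piece(rng=random):
    piece = []
    for position in range(8):
        dice = rng.randint(1, 6) + rng.randint(1, 6)   # two dice, sum 2..12
        piece.append(TABLE[position][dice - 2])
    return piece

print(roll_piece())   # e.g. ['m0-7', 'm1-4', ...] - one of many possible pieces
```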
For architecture, digital representation is not a matter of choice but a natural consequence of the common use of CAD. Each design project prepared with computers is realized in a virtual environment. We can ignore this fact or try to exploit it: in a basic sense, as a cheap research testbed which, thanks to suitable tools, simulates the effects of the future building; in a wider view, as an individual environment of expression. By creating objects which have a digital representation, we gain the chance to publish not only their aesthetics but also their functionality. Thanks to the Internet and virtual-space access devices, users can "inhabit buildings" remotely, in parallel time, once they are completed or even during construction.
Fig. 7
ASKtheBOX sensitivity: function controls (red) and behavior switching spots (green).
Fig. 8
Playing with ASKtheBOX – recognizing behaviors and composing music.
In digital representation, the architectural work gains an aspect of execution. Depending on the means used, it generates signals addressed to several senses, or complex simulations; among the latter, fluid VR visualizations capable of creating impressions close to natural ones, and automatically fabricated prototypes. Thanks to the functionalities of digital media, the method of notating the architectural work is getting closer to the specificity of a musical score. The composition of space, just like a musical composition, defines the boundaries of freedom determined by the author. Electronic realization ensures a multiplicity of representations and generates space for interpretation. By opening access to tools for simulating the perception of space, even buildings that have not existed in the real world for years obtain new performances, which constitute extensions of the object or its parallels.
The society of control, thanks to digital representation, can change the character of interaction with the spatial work. It will be able to create and use objects not only in a linear sequence of experiences. Through telepresence, remote participation and advanced means of communication it will gain the skill to "steer" architecture, that is, to interactively modify its forms and usage scenarios. It is also possible that the relationship between architect and client will change. The place of industrial specialization will be taken by an integration of competences. The prosumer will gain the ability to independently create (control) buildings. The architect will take on the role of coordinator and manager in a process of digital creation and fabrication.
At this point I just want to bring up the example of John Cage's Imaginary Landscapes from the fifties. They belong to the movement called aleatorism, which gave much more freedom to the performer than had been permissible before. The prepared score defines only the components; the real form appears during the concert. Blocks combine into a global image on the basis of the performer's intuition, the audience's reactions, and a particular moment in time.
I have the feeling that aleatorism influences architectural creation now. The digital structure, more flexible than traditional tectonic substance, allows architecture to be controlled rather than constructed as a static design.
Process
The essence of design, as is typical for IT-infused architecture, is not the project but the process (Oosterhuis, 2007). ASKtheBOX depends on the program that runs the installation. Each component (physical shape, spatial environment, projection mechanism, computer infrastructure) takes part in a cyclic process of interaction. The way the installation works is defined within a system procedure that is individually designed and coded.
A processing environment was implemented to combine the different aspects of the installation. Images coming to the computer are simplified and interpreted. A matrix of 36x24 pixels allows the movement to be followed. The grey contrast factor (darkness) contributes information about the presence of the user and the distance from the screen. Particular image sectors are defined as active; the amount of grey detected in a sector turns on the acoustic component. "Analog" music comes from an outside application (Logic) as synchronized loops; "digital" music is generated within the processing program (as MIDI).
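The image-analysis step described above can be sketched roughly as follows; this is an illustration rather than the original program, and the sector rectangles and threshold are hypothetical.

```python
# Sketch of the shadow analysis: downsample the camera frame to a 36x24 grid,
# measure darkness per cell, and report which predefined sectors are active.
# Sector rectangles and the 0.5 threshold are hypothetical.
import numpy as np

GRID_W, GRID_H = 36, 24
SECTORS = {"loopA": (0, 8, 0, 24), "loopB": (8, 20, 0, 24), "midi": (20, 36, 0, 12)}

def downsample(frame):
    """Average an arbitrary grayscale frame (0..255) onto the 36x24 matrix."""
    h, w = frame.shape
    cells = frame[: h // GRID_H * GRID_H, : w // GRID_W * GRID_W]
    cells = cells.reshape(GRID_H, h // GRID_H, GRID_W, w // GRID_W).mean(axis=(1, 3))
    return 1.0 - cells / 255.0                   # darkness: 1 = full shadow

def active_sectors(darkness, threshold=0.5):
    active = {}
    for name, (x0, x1, y0, y1) in SECTORS.items():
        active[name] = darkness[y0:y1, x0:x1].mean() > threshold
    return active

frame = np.random.randint(0, 256, size=(480, 640)).astype(float)  # stand-in camera frame
print(active_sectors(downsample(frame)))
```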
Fig. 9
Interior of the pavilion: light projection and translucent responsive screen.
Fig. 10
Installation elements: pavilion, external sensor, controlling unit with user interface that allows managing interactivity.
The user interface prepared for ASKtheBOX allows interaction scenarios to be modeled outside the programming core. Active sectors of the screen are presented as "drag and stretch" mouse-sensitive objects. The music content depends on a configuration sheet. Even an unprepared user, encountering the installation for the first time, can tune its behaviors and change the interaction rules.
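One way to picture such a configuration sheet (purely illustrative; the keys, coordinates and file names are invented) is a small declarative mapping from draggable screen rectangles to sound content, which the detection code can read without touching the programming core.

```python
# Illustrative configuration sheet: each active sector is a draggable rectangle
# mapped to sound content. Keys, coordinates and file names are invented.
CONFIG = {
    "sectors": [
        {"name": "bass", "rect": [0.00, 0.0, 0.30, 1.0], "mode": "loop", "file": "bass.wav"},
        {"name": "pad",  "rect": [0.30, 0.0, 0.65, 1.0], "mode": "loop", "file": "pad.wav"},
        {"name": "solo", "rect": [0.65, 0.0, 1.00, 0.5], "mode": "midi", "channel": 1},
    ],
    "threshold": 0.5,          # darkness needed to activate a sector
}

def stretch(sector, dx, dy):
    """Emulate the 'drag and stretch' UI: grow a sector's rectangle."""
    x0, y0, x1, y1 = sector["rect"]
    sector["rect"] = [x0, y0, min(1.0, x1 + dx), min(1.0, y1 + dy)]
    return sector

print(stretch(CONFIG["sectors"][2], 0.0, 0.3)["rect"])   # -> [0.65, 0.0, 1.0, 0.8]
```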
Conclusions
What did we learn experimenting with ASKtheBOX?
First, we explored a new medium of architecture: a medium that Bill Mitchell described as an augmentation of space that changes the rules.
According to the typology given by Lev Manovich (2001), the new medium is obviously digitally represented, modular and automatic. What struck us much more is that it is also variable and transcoding. After the insertion of information genes, space becomes undefined. It is a population of possible physiognomies rather than a finished form.
The medium itself translates signals. Location in space, image and music data are identical at the basic level. By transcoding information we are now able to switch between different languages of artistic expression.
The object of design has changed. As Kas Oosterhuis underlined, we do not design objects anymore.
Through the evolution of our interests we focused on subjects which are in most cases processes: taking shape, functioning, falling apart, recycling… To design the process, the ability to program is needed. As a matter of fact, this new competence determines participation in the process from both sides, the author's and the user's.
As twentieth-century music has already shown, the role of the artist changes. Passive perception does not satisfy the prosumer foreseen by Alvin Toffler (1989). Man in the age of information customizes the environment, starting from a smartphone screen and ending in a concert hall. Architecture is more and more exposed to individualization. As it absorbs information infrastructure, control becomes fragmented.
Forming the final shape of architectural objects proceeds in an interactive process. The role of the architect is to coordinate the process and to establish the rules.
A so-called user interface is needed to allow interactivity to be managed without entering the scripting level of the structure.
The information environment changes the sense of time, and so it affects the perception of space. The analytical process of building solutions adequate to conditions, which Saggio mentions as a foundation of twentieth-century methodology, no longer guarantees the validity of an inference. The multithreaded and unstable context of digital reality becomes a matter of choice and loses its objective meaning.
Interactivity has become today a dominant feature of the environment, and interaction the main technique of exploration. We are not satisfied with the frozen part of perception; nor are we able to embrace the moving whole. So we download successive portions of information in a contextual series of decisions. We develop, verify, update and change the picture of the phenomena constituting the subject of cognition. Thanks to hypertext, the use of network access windows and, with time, also telepresence, we move inside the discontinuous digital environment.
Thanks to computers, building structures can react immediately, and the designer obtains control over the interaction scenario (algorithm), which he shapes much as he shapes the space. The perception of traditional architecture was based on the combination of sensory impressions and interpretations of the context (real and intellectual). The perception of interactive architecture requires consideration of the dynamics of ambient conditions.
References
Manovich, L., 2001. The Language of New Media. MIT Press.
Mitchell, W. J., 1998. "Antitectonics: The Poetics of Virtuality". In: J. Beckmann, ed., The Virtual Dimension: Architecture, Representation, and Crash Culture. New York.
Oosterhuis, K. and Lenard, I., 2007. "Mission Statement ONL". In: C. Spiridonidis and M. Voyatzaki, eds., Teaching and Experimenting with Architectural Design. Thessaloniki.
Saggio, A., 2010. The IT Revolution in Architecture: Thoughts on a Paradigm Shift. New York.
Toffler, A., 1989. The Third Wave. Bantam Books.
Alexandros Kallegias
University of Patras
Greece
Machinic Formations
/ Designing the System
A revolution is currently under way in design-based disciplines such as industrial design, mechanical engineering, and many others, caused by the introduction of digital
fabrication technologies to facilitate prototyping processes. Research projects are being conducted in multidisciplinary fields as well as across different scales. They extend
from the production of human tissues through the printing of stem cells, rapid prototyping of bone prosthetic parts, food printing through the use of edible ink, 3D printing of textiles, furniture and other objects, to the development of contour crafting
technologies to print concrete buildings. Digital fabrication is rapidly becoming part
of many processes from conception to materialization of physical artifacts at any scale.
In architecture, CNC fabrication equipment has given designers unprecedented
means for executing formally challenging projects directly from the computer. Yet the
impact of digital production in architecture goes far beyond the mere production of
complex geometries. As CNC technologies become increasingly available outside the
larger manufacturing industries and information travels at a planetary scale, the physical production of design ideas is becoming distributed and localized. The growing
number of fabrication shops, together with an emergent and growing DIY community
dedicated to the construction of personal CNC equipment, demonstrates that the development of smaller, cheaper, faster and more versatile machines (desktop manufacturing, Fab@home, RepRap and others) is happening at enormous speed. Indeed, it
indicates that the fabrication of designed goods could soon happen in local fab centers or directly in people’s homes.
The possibility of creating mobile fabrication centers suggests yet another dimension: the deployment of fabrication equipment onto a specific site, the possibility of building in remote locations with no infrastructure, and the chance to serve communities that belong to social layers still disconnected from technological advances. While traditionally these challenges have been addressed with standardized,
prefabricated solutions, mobile CNC equipment enables architects to think beyond
the constraints of these production modes. It is now possible to conceive customized
prototypical architectures, which can be adapted to differentiation in various inputs
and outputs, distributed across global networks and built in different parts of the
world.
In parallel, the ongoing shift towards customization of computational design
methods through the development of scripts and algorithms in the academic and research environments is impacting architectural practices, enabling the surpassing of
traditional CAD tools and causing a fundamental shift in the architectural design process. Architecture is liberating itself from limitations imposed by software and is beginning to operate in a territory where it can control both its own digital design tools and
the corresponding technological solutions.
Just as pre-packaged CAD platforms are being updated or replaced by
customized scripting tools (open source software), it can be predicted that CAM environments and CNC machines will undergo a similar shift and be supplanted by more
open hardware solutions. The rapidly growing number of digital design practices and
their increasingly challenging production demands indicate that widely available and
sophisticated production facilities may soon be in high demand. Architects should
formulate a critical position on the status and characteristics of digital fabrication
methods that are currently in use and investigate the potential of producing highly
mobile, highly flexible, and customized fabrication apparatuses.
Machine-Building Relation | Role of the Machines
The way in which machines have been perceived in the architectural discipline throughout history has constantly changed with advances in technology, from medieval castles perceived as defense mechanisms and the poetic imaginations of the Renaissance period, to modern achievements in the age of fast-paced technological development. If we consider the notion of the machine in the mid-20th century, we can distinguish two approaches to the perception of the machine as part of architecture. The first approach is pioneered by Le Corbusier's pragmatic and realistic understanding of a house as a machine for living1, while the second approach can
be exemplified by Archigram’s utopian view through projects like Living Pod (David
Greene) or Walking City (Ron Herron)2.
Robotics in Contemporary Architectural Discourse
The initial concept of the machine has been developed into a more elaborate approach as a result of the ongoing technological revolution in digital fabrication processes during the 21st century, currently under way in a number of design-driven disciplines. As such, architects are being given the tools for the production of complex projects directly from the computer through CNC fabrication equipment. This has led to an increase in the use of numerous machinic systems in construction processes over the last decade. Growing research on applied robotics and robotic processes in both academia and architectural practice continues the research pioneered by the likes of Nicholas Negroponte and the Architecture Machine Group at MIT (SEEK)3 (Fig. 1), Cedric Price (Fun Palace)4, and John Frazer (An Evolutionary Architecture)5. These projects also dealt with questions of the interaction between machines and their environment, and of how people can affect such systems.
Fig. 1
Industrial Robots and Digital Materiality
More recent developments incorporate architectural processes involving the use of industrial robots in fabrication, developed and applied across the globe in research institutes (Gramazio & Kohler (Fig. 2), MIT, the Stuttgart Institute for Computational Design) and by numerous architectural practices (SPAN, Greg Lynn Form, RoboFold (Fig. 3), Supermanoeuvre). However, in all of these cases the fabrication process itself can be described as linear, with the robot confined to its productive mode only; it thereby acts merely as a sophisticated tool for carrying out the architect's ideas, without producing a real interaction with the environment that could influence the design process.
Fig. 2
Fig. 3
Open Source and Custom Robotic Tools
Parallel to the use of existing industrial machines, open-source robotic platforms, such as the Arduino-controlled robot named SERB, or other custom-built or 'hacked/modified' versions of existing machines, are finding their way into the process of digital fabrication. The method of 'hacking into' robotic devices enhances their performance in terms of sensory capabilities and the ability to send feedback to the user or the system which controls them. Robots can become aware of their surroundings and communicate with other robots, in the same way as in nature's swarm systems (e.g. swarms of birds, ants, fish). In other words, robots can become, and are becoming, an important part of the development of the discipline of architecture.
Robot | The Body, the Mediator, the End Effector
The conclusion which can be drawn from the above discussion is that, according to the ways robots are used and envisioned within the architectural discourse, they can be divided into three main categories:
1. The end effector - the machine is used as a fabrication tool which executes programmed commands with the goal of producing highly sophisticated products.
2. The mediator - possessing a certain degree of intelligence and sensory abilities, the machine provides feedback and acts as an interface between the user/mainframe and the architectural object.
3. The body - the machine is the architectural product itself, whether the building is embodied as a machine and functions as one, or the machine/robot is an inseparable part of the building’s tissue.
As mentioned in the beginning, the deployment of fabrication equipment onto a specific site also provides the possibility of having the machines/robots present on the construction site even after the construction process is finished. This allows us to extend the process of fabrication from an initial ‘birth-giving’ process to the whole life-cycle of the building and allows for a state of constant (re)production, a state in which machines create a unique symbiotic relationship with the architecture. As an outcome, the concept of constant construction gives us architecture in a state of ongoing evolution, re-adjusting to changing requirements.
As shown in the diagram (Fig. 8), the standard mode of fabrication is a linear one, in which the machine executes preprogrammed commands and has no further connection to the building. On the other hand, having the machine as a part of the architectural project opens up the possibility of a machinic-driven evolutionary process which would span the whole life-cycle of the building.
Roboteque | Participation and Man-Machine Collaboration
Roboteque is a team-based design research project completed at the Architectural Association School of Architecture Design Research Lab (DRL) during 2009-2011 and awarded a Master of Architecture and Urbanism degree. In the light of the above discussion, Roboteque’s intention is to challenge the linearity of the architectural production process and the fixity which dominates the world of built architecture, through the use of machinic systems which would reconfigure the higher system they inhabit.
The challenge of reconfiguration presents a crucial step in the Roboteque research: what activates the changes in the system? In this respect, the possibility of bringing man and machine together in the process of production/reconfiguration becomes important, since man can be seen as one of the key actuators of the process of reconfiguration. Through his actions, man can trigger the process as well as influence its course.
This notion of collaboration has been approached in the past by architects such as Yona Friedman, who suggested that the user would interact with the system through a Flatwriter keyboard.6 This system, however, was dependent on a supporting infrastructure network and dealt with a set of predefined elements.
In another example, the collaborative project of Cedric Price and John Frazer (Fig. 4) concerned a system that had some rudimentary self-awareness and knowledge. The communication between man and their project’s system took place at the level of the interface, where the machine would take the already mentioned ‘mediator’ role and collaborate in the design stage. On the other hand, the project SEEK by Nicholas Negroponte and the Architecture Machine Group presented, in a way, an experiment in what could be compared with a real-time responsive construction process, where the robotic arm would respond to changes in the environment made by the gerbils who would reshuffle the placed blocks.
Fig. 4
Fig. 5
Fig. 6
Fig. 7
As the ability to transform and to reconfigure space constitutes one of the basic ingredients of this project, Roboteque seeks to enable reformation in response to both environmental changes and time-based shifts in human activities. The challenge is to infuse architectonic space with literal transformation capacities, in order to close the gap between the stable physicality of space and the constant flux of human activities existing in it. As such, in contrast to the previously mentioned projects, the ambition is to implement a fully adaptive construction logic that follows a human/surroundings-responsive design process, in order to allow for a structure that can be reconfigured throughout its lifecycle.
Taking this goal into consideration, the research has focused on an unfixed architecture; an architecture made out of parts which can be attached and reattached to form different structures. This notion of detachable components directs our study towards construction methodologies that use no mortar, connections or scaffolding (Fig. 5).
Within the discourse of architecture, there are various precedents dealing with this type of construction. These projects can be described as structures that take into consideration gravitational forces and follow a load-sharing logic to achieve structural balance. Looking at historical paradigms in architecture, the successful application of this building method becomes evident; from
examples found in very early settlements such as Mycenae in ancient Greece, and in India, Cambodia and Ireland, to civilizations such as the Aztecs, the Incas and the Maya. A later instance of this construction style can be found in the south of Italy, where dwelling units were built without any cement or mortar, thus avoiding taxation and allowing the units to be disassembled and reassembled in a different location. The availability of calcareous stone in the area allowed the formation of structures by simple superimposition
of equal stones on top of each other. Stability was achieved thanks to the geometry
of the house which was always a combination of cylinders and cones; it didn’t require
any advanced knowledge apart from an empirical understanding of balancing aggregates. Similarly, Gothic constructions with the use of flying buttresses and ribbed
vaults follow the main principle of decomposing forces into a system which remains
in equilibrium. Nevertheless, in Gothic architecture one can see a complex combination of the existing technologies of that time, while today’s existing technologies
can achieve complexity through the use of an even more straightforward structural
methodology.
Therefore, with re-configurability as the main goal of this project, the research concentrates on the logic of stacking material in order to achieve stability, whereby the design is focused on pure compression structures, which offer the most fitting design possibilities for our system.
Roboteque | 3D Cellular Automata
Since Roboteque’s design system aims to transcend the construction style of the past and to have a creative system that can adapt to real-time alterations, the research goes beyond empirical knowledge of aggregate formations and makes use of computational tools which allow complete control of digital design parameters characterized by continuous change.
Searching for a computational logic which would be able to perform complex calculations from a set of simple rules, the Cellular Automaton (CA) was selected as a suitable system, due to its ability to produce elaborate structures given very specific instructions/rules to follow. Thus, by using simple rules in a Processing script, it was possible to reproduce and study specific CA rules, such as rule 30, rule 90 (Fig. 6) and rule 110.7
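To make the rule-based logic concrete, the following is a minimal sketch of an elementary cellular automaton of the kind studied here. The project itself used a Processing script; this Python version is only an illustrative analogue, and the grid width and number of generations are arbitrary.

```python
# Minimal sketch of an elementary (1-D, two-state) cellular automaton.
# Rules 30, 90 and 110 mentioned in the text can be reproduced by changing `rule`.

def step(cells, rule):
    """Advance one generation: each cell reads its 3-cell neighbourhood."""
    n = len(cells)
    nxt = [0] * n
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right  # value 0..7
        nxt[i] = (rule >> neighbourhood) & 1                 # look up the rule bit
    return nxt

def run(rule=30, width=65, generations=32):
    cells = [0] * width
    cells[width // 2] = 1                                    # single seed cell
    for _ in range(generations):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

if __name__ == "__main__":
    run(rule=30)   # try rule=90 or rule=110 for the other patterns discussed
```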
While earlier studies in CA tended to create patterns that could be used as the project’s design grammar, it soon became clear that these patterns should have additional properties. The first additional property that the digital setup should incorporate is the third dimension: the rules must be applicable to a 3D grid of cells, where every cell represents an aggregate’s position. The second quality of the code should be to enable dynamic intervention between a pattern’s intervals. As such, at any point of a pattern’s formation the user should be able to interfere by introducing his/her choices, resulting in an altered and more user-related end form.
After recreating some of Wolfram’s CA algorithms, a series of different rules was implemented so as to generate patterns for the unit’s structure. A special class of CA known as totalistic CA was used to make a set of three-dimensional digital models, since totalistic CA allows for more elaborate rules pertaining to neighborhood relations by checking the sum of the values of the adjacent cells (Fig. 7).
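A totalistic update of this kind can be sketched as follows. This is not the project’s actual code: the 26-cell Moore neighbourhood, the lattice size and the birth/survival thresholds are illustrative assumptions, chosen only to show how a 3D cell decides its next state from the sum of its neighbours.

```python
# Sketch of a binary 3-D totalistic CA: each cell's next state depends only on
# the SUM of its 26 Moore neighbours. Thresholds and grid size are assumptions.

import itertools

SIZE = 16                                            # assumed cubic lattice size
OFFSETS = [d for d in itertools.product((-1, 0, 1), repeat=3) if d != (0, 0, 0)]

def totalistic_step(grid, birth={5}, survive={4, 5}):
    """One generation; `grid` maps (x, y, z) -> 0/1, missing cells count as 0."""
    nxt = {}
    for cell in itertools.product(range(SIZE), repeat=3):
        x, y, z = cell
        total = sum(grid.get(((x + dx) % SIZE, (y + dy) % SIZE, (z + dz) % SIZE), 0)
                    for dx, dy, dz in OFFSETS)       # sum of neighbour states
        alive = grid.get(cell, 0)
        nxt[cell] = 1 if total in (survive if alive else birth) else 0
    return nxt

# Usage: seed one aggregate position in the middle of the grid and iterate.
grid = {(8, 8, 8): 1}
for _ in range(10):
    grid = totalistic_step(grid)
```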
Roboteque | Intervening in the Process
The next step in the digital computation setup was to have different sets of rules interacting in the same code, while maintaining the option of intervening in that process. In this way, even after the user’s initial choices have taken effect, the shape can still be modified and adapted to the user’s new preferences. In the examples shown (Fig. 8), one can see the very subtle changes that take place as the user interferes with the formation process.
In the following computational models, the change becomes more apparent and more specific as the intervention is precisely targeted. In the two cases on the right, the user interferes with the formation process either by adding extra seeds to the system or by adding seeds and removing existing elements from the form (Fig. 9).
Fig. 8
Fig. 9
The investigations presented aim to provide knowledge of form-finding and pattern-making methods that make use of distinct structural elements. These experiments also help to reach the most suitable geometrical shape for the project’s material system. After deciding that the simple orthogonal box is the most fitting shape, because it allows for both multi-directional aggregation and repositioning, the next question is what the appropriate proportions of the system’s aggregate should be. Eventually, the problem becomes a case of choosing between a cube-shaped element and an orthogonal box with one of its sides elongated. The brick-type or jenga-type component provides more possibilities in terms of design patterning (Fig. 10). The main reason is that these proportions allow for better overlapping of individual components. In other words, when the length of this kind of element is about twice its width, it can be laid in bond within a structure, increasing its stability and strength.
In addition to the proportion of the structural element, another crucial factor has been the scale of the component. The variation in scale relates to the size of the working force that handles the component. In the case of a human worker, for efficient layering the brick-type aggregate should be small enough to be picked up by hand. A larger working force, such as a crane, can handle brick-type boxes twenty times bigger than a regular brick (i.e. containers). In the case of the project’s working machinic system, the optimal dimensions lie between these two extremes.
Fig. 10
Roboteque | Machinic Intelligence
Considering the requirements of the on-site fabrication process, as well as the requirements of this proposed system, it has become clear that most of the existing on-site fabrication techniques possess a number of limitations. 3-axis machines used for contour crafting and large-scale 3D printing, which can be considered material-depositing machines, are limited to purely vertical deposition of the material and by their fabrication speed.
On the other hand, the 6-axis industrial robotic arm is a high-precision machine which, thanks to its number of operational axes, has the ability to place or remove material in ways that other existing machines cannot. Since Roboteque’s system demands a tool which is able to deposit material in different sequences, and because of the delicacy and demand for high precision in a formative process based on dynamic stability, this type of machine has been the main focus of the further research.
Fig. 11
As an initial experiment, the research team built a physical model with the purpose of gaining a complete overall understanding of what is needed to make the system function successfully with respect to reconfiguration and man-machine interaction. This model consists of an Arduino-controlled robotic gripper and elbow, in conjunction with a CNC machine, which together form a 5-axis system. Using this system, it was possible to conduct a series of different tests utilizing the CNC machine. The gripper itself has a rotational axis in the head part and an elbow which allows 90 degrees of rotation. Attaching the gripper to the existing 3-axis system enabled the control of 5 axes in total (Fig. 11). This system made it possible to fully automate a process in which, until then, cubes had been placed by hand rather than assembled automatically. However, with basically two separate machines, the gripper + elbow and the CNC machine, working as a single system, it was not possible to control both parts of the system from the same source. This meant that the CNC was controlled through its original software, while the gripper was controlled through Arduino.
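In the project the two controllers remained separate, as described above. Purely as an illustration of how such a hybrid 5-axis setup might be driven from a single script, the sketch below assumes a CNC controller that accepts G-code over a serial port and a gripper firmware that understands simple text commands; the port names and the command vocabulary are hypothetical.

```python
# Hypothetical coordination sketch for a CNC + Arduino gripper system.
# Not the setup actually used in the project; serial ports, baud rates and the
# "GRIP"/"RELEASE" command vocabulary are illustrative assumptions.

import time
import serial  # pyserial

cnc = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)    # CNC controller (assumed)
gripper = serial.Serial("/dev/ttyACM0", 9600, timeout=2)  # Arduino gripper (assumed)

def send_gcode(line):
    """Send one G-code line to the CNC and wait for its acknowledgement."""
    cnc.write((line + "\n").encode())
    return cnc.readline()          # e.g. b"ok" on a GRBL-style controller

def send_gripper(cmd):
    """Send a text command to the Arduino-controlled gripper/elbow."""
    gripper.write((cmd + "\n").encode())
    time.sleep(0.5)                # crude pause; a real setup would read feedback

def place_cube(x, y, z):
    """Pick a cube from a fixed supply point and deposit it at (x, y, z)."""
    send_gcode("G0 X0 Y0 Z50")     # move above the supply point
    send_gripper("RELEASE")
    send_gcode("G0 Z5")            # descend to the cube
    send_gripper("GRIP")
    send_gcode("G0 Z50")
    send_gcode(f"G0 X{x} Y{y} Z{z + 20}")   # travel above the target position
    send_gcode(f"G0 Z{z}")
    send_gripper("RELEASE")        # deposit the cube
    send_gcode("G0 Z50")

place_cube(100, 40, 0)             # stack two cubes as a small test
place_cube(100, 40, 25)
```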
As a further development of this project, the research has been directed towards introducing an additional level of intelligence into the machinic system. The aim is to extend and refine Roboteque’s proposal by improving the performance of the formation machines with responsive technologies (sensors / control systems / actuators). Thus, in an effort to produce a system that reflects the technological and cultural conditions of our time, an investigation has been carried out in order to gain a better understanding of the technical part of our system.
Initially, the explorations targeted the possibilities of ‘hacking into’ a 5-axis toy robotic arm. A light sensor and a webcam were attached to the arm, and this gave the inspiration to take the next step and gain better control over the movement of the arm (Fig. 12). By using Arduino, the robotic arm was liberated from the constraints of using batteries and the default remote controller. In addition, the Arduino board enabled the control of different sensors and also allowed for a more interactive process. In this way, initial knowledge and understanding of the robotic arm’s mechanical system began to build up. However, in order to obtain more accurate control and achieve the intended results, Roboteque’s research moved beyond the limitations of this particular machine.
Fig. 12
In the following experiment (Fig. 13), the possibility of interaction/responsiveness between the machine (the system) and the environment/signal was tested. Here, the same tools as in the previous experiments were used, with the addition of the newly acquired gripper, which gave us control over two additional axes. The wrist was programmed so that its actions are controlled by the light sensor: according to the light values read by the sensor, the wrist rotates to an angle of 15˚, 30˚ or 45˚. The differences in the light values are a consequence of the distance between the sensor/arm and the light source. This experiment helped to test the dialogue between environmental conditions and the final result. Two lines/layers were created in order to explore the effect the first layer would have on the light values and, in that way, how it would influence the shape of the second wall or the second layer of the same wall (Fig. 14).
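The mapping from light reading to wrist angle can be summarised in a few lines. The experiment ran on an Arduino; the sketch below only restates the logic in Python, and the two threshold values are assumptions, not the calibrated values used in the test.

```python
# Sketch of the light-to-wrist-angle mapping described above.
# Thresholds are illustrative; the original logic ran on the Arduino itself.

def wrist_angle(light_value, low=300, high=700):
    """Map a raw light-sensor reading (0-1023) to one of three wrist angles."""
    if light_value < low:      # dim reading: sensor/arm far from the light source
        return 15
    elif light_value < high:   # intermediate distance
        return 30
    else:                      # bright reading: close to the light source
        return 45

# Example readings taken as the arm moves towards the light source.
for reading in (120, 480, 950):
    print(reading, "->", wrist_angle(reading), "degrees")
```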
Fig. 13
Fig. 14
The concept of the loop incorporates the notion of feedback mechanisms. In our system, two attempts were made to obtain information about the result of the formation process. In the first attempt, the aim was to overcome the delay of the feedback in the system. A digital camera was set up to monitor the formation process and provide the system with real-time information. The input from the camera was filtered in the software Processing in order to track motion and colour. In this way, it became possible to read the boundaries of the actual construction and to understand where a change had taken place. That information was then streamed directly back to the modeling software. Here, although the whole procedure was faster, the level of precision was insufficient.
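The camera feedback loop was implemented in Processing; the following is a rough Python/OpenCV analogue of the same idea (threshold the frame, take the largest contour as the built aggregate, and report its bounding box when it changes). The camera index, the colour thresholds and the use of OpenCV 4.x calls are assumptions of this sketch, not a record of the project’s code.

```python
# Rough analogue of the camera-based feedback filter: detect where the built
# construction ends and report changes so they can be streamed back to the model.

import cv2  # assumes OpenCV 4.x

cap = cv2.VideoCapture(0)                       # assumed camera index
previous_box = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 0, 180), (180, 60, 255))   # assumed: pale cubes
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        box = cv2.boundingRect(largest)         # (x, y, w, h) of the construction
        if box != previous_box:
            print("construction boundary changed:", box)   # feed back to the model
            previous_box = box
    if cv2.waitKey(30) == 27:                   # press Esc to stop monitoring
        break

cap.release()
```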
The second attempt used a laser scanner, the DAVID laser scanning device. The device produces a green laser beam that scans the physical environment and then generates a three-dimensional digital version of it on screen. This three-dimensional model was then analyzed and used to assess the formation steps of the system. With this method, the precision of the outcome came at the cost of the duration of the process (Fig. 15).
After the completion of the machinic experiments, the validity of the project’s initial premise became evident. Indeed, one can move beyond existing top-down design techniques that are governed by a central overview of the system, with its subsystems specified and foreseeable. One can now argue for bottom-up architectural methods that can be executed with the use of existing technology.
Roboteque | The Proposal
The architecture of smooth transitions and reconfiguration, which is the premise of Roboteque, is also achieved at the level of the functionality of the spaces being created. Reconfigurable architecture at the physical level is closely linked to different events at the programmatic level. The architecture responds to the fast flow of events on two levels, as a collaboration between the requirements of the human being and the responsive optimization system. The requirements of the human being include the pavilion’s temporality, spatial size, spatial composition, spatial characteristics (such as lighting and the amount of opening), and the requirement for change. The responsive optimization system operates on the climatic conditions (environmental data), the flow of people through the site (human behavior), intervention in the construction process (human action), and the location of the pavilions (negotiation with the neighbors). These two sets of data are gathered by the central brain and then transferred to the robots implementing the construction reconfiguration during the fabrication process. The phenomenon of responsiveness occurs with two major triggers in the Roboteque proposal: the human and the environment.
The main interaction between the human and the proposal takes place through the online applet, in which the user goes through specific steps that result in the formation of a pavilion. The central brain of the system gathers all the data fed into the website, evaluates it, and produces an optimized end-result that is constructed on site by the robots. This process can be termed system-optimized reconfiguration. On the other hand, the human can also interact with the pavilions on site, whereby one can inform the robot of the formal changes he/she would like to achieve. The robot receives these signals with its sensor and recognition device and begins to implement the changes.
The interaction between the environment and the proposal occurs as the result of the environment acting as a trigger on the reconfiguration process. Climatic data relating to the site is collected on a daily basis and fed into the central brain of the system. As the central brain gets the user choices and evaluates them, it also receives the climatic data sequence, which affects the final pavilion output in terms of its enclosure/opening level and sectional thickness. The data collected from the surrounding environment gives the project its contextual characteristics.
As such, the pavilion formation comes into being as a result of the interaction of the user with the central brain through the online applet, the interaction of the environmental data with the central brain, and the interaction of the user with the robot on a local level (Fig. 16).
Fig. 15
Fig. 16
Fig. 17
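The data flow just described (user choices from the online applet plus daily climatic data, merged by the central brain into instructions for the robots) can be illustrated schematically. All field names, thresholds and rules below are hypothetical; the sketch only shows the shape of the process, not the project’s actual software.

```python
# Schematic sketch of the "central brain" merging user and environmental data.
# Every field, threshold and rule here is an illustrative assumption.

from dataclasses import dataclass

@dataclass
class UserRequest:            # gathered through the online applet (assumed fields)
    footprint_m2: float
    opening_level: float      # 0 = fully enclosed, 1 = fully open
    duration_days: int

@dataclass
class ClimateData:            # collected daily from the site (assumed fields)
    avg_temperature_c: float
    rainfall_mm: float

@dataclass
class PavilionSpec:           # the instructions handed to the construction robots
    footprint_m2: float
    opening_level: float
    wall_thickness_courses: int

def central_brain(request: UserRequest, climate: ClimateData) -> PavilionSpec:
    """Blend user preferences with environmental data into a buildable spec."""
    opening = request.opening_level
    if climate.rainfall_mm > 10:              # wet day: reduce the requested openness
        opening *= 0.5
    thickness = 2 if climate.avg_temperature_c > 15 else 3
    return PavilionSpec(request.footprint_m2, round(opening, 2), thickness)

spec = central_brain(UserRequest(25.0, 0.8, 7), ClimateData(12.0, 14.0))
print(spec)   # this spec would then drive the on-site robots
```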
Roboteque proposes an “open social playground” in one of London’s tourist
hotspots. Discrete structural elements are constantly positioned and repositioned in
a range of different configurations to form a series of social pavilions in the area of
Southbank next to the London Eye, the Dali Museum and the London Aquarium. The
aim is to intensify the level of activity in that area and accentuate the capabilities of
spatial modification by responding to the people who are visiting the site.
Being one of the most attractive locations in London, the site is surrounded by various types of important institutions, including tourist attractions and cultural, educational and governmental buildings. The site is located next to a very significant transportation hub, Waterloo Station. Since it is positioned next to the River Thames, the site also benefits from water transportation (Fig. 17). As a result of being at the center of significant landmarks that draw tourists’ attention, and benefiting from the nearby transportation nodes, the Roboteque site will be subject to a constant flux of pedestrian flows that persist throughout the day. The dynamic flows created by humans are an important ingredient of the Roboteque research proposal.
The prototypical aspect of Roboteque lies in its assembly processes, whereby the
potential user and the robot interact in the generation of specific units which are built
with a unique dynamic assembly procedure. As such, the way Roboteque goes about
implementing this scenario is by allowing people to influence the spatial formation of
the “playground” through the operation of a Wi-Fi interface applet. More specifically, a person expresses his/her spatial preferences through software that is linked with the fabrication process (Fig. 18). The given choices are first filtered by a central “brain” that computes what can or cannot be realized according to the design’s primary system configuration. When all the choices are entered and accepted, the fabrication begins: a group of robots on rails is set in motion to carry out the building of the particular units within their reach. The entire site can be constructed as the rails are deployed to cover the maximum possible area using the minimum number of tracks. Then, within a specific timeline, the constructions on site can be transformed.
The transformation of the architecture is achieved by using a stacking logic that always respects the structural and material balance of the entire structure. In order to achieve equilibrium, the built models follow specific formation patterns. By changing the orientation of the elements and fine-tuning the overall structure for stability, the system is equipped with a family of different formation patterns that correspond to certain spatial conditions, intended for the user to choose from. The amount of material employed for the entire system is kept at a uniform level by shifting it around the construction site during the transformational life cycle of the structure.
Fig. 18
the whole system of Roboteque comes together a