New Ideas in Psychology 27 (2009) 312–325
doi:10.1016/j.newideapsych.2008.10.001
How could phenomenal consciousness be involved in
mental function?
Bruno Forti*
Dipartimento di Salute Mentale, Unità Locale Socio Sanitaria N. 1 Belluno – Regione Veneto, Ospedale San Martino,
Viale Europa, 22, 32100 Belluno, Italy
Article history: Available online 22 November 2008

PsycINFO classification: 2380

Keywords: Consciousness; Differentiation; Functionalism; Gestalt theory

Abstract

In a phylogenetic perspective, the phenomenal and the functional aspects of consciousness cannot be separated because consciousness, as a phenomenal experience, must be causally effective. The hypothesis I propose is that the fundamental property of consciousness consists of a self-organizing process: the differentiation of a content. The differentiation of a content occurs on the basis of the relations internal to a representational whole, which behaves like a field and tends towards a condition of equilibrium. This hypothesis can, to some extent, be considered an extension of Gestalt visual perceptual theory. Unlike neurocomputational processes, which are non-conscious and extrinsic to the representation, conscious processes are intrinsic to the representational whole. Consciousness, as an intrinsically self-organizing process interwoven with its phenomenal aspects, can be more than epiphenomenal and it can be involved in mental function. The paper then discusses the implications of this hypothesis for subjectivity and the explanatory gap.

© 2008 Elsevier Ltd. All rights reserved.
1. Introduction
We are undoubtedly conditioned by the need to make effective predictions about the world around
us. This need is the basis of a scientific approach to the outside world. In order to make effective predictions about the world within us as well, namely the mind, we have built what we could call Neurocomputational (NC) models (Churchland, 1989, 1995; Churchland & Sejnowski, 1992). NC models do
not involve consciousness in their functioning (Overgaard, 2006) and they are based on serial or
parallel information processing programmes designed to deal with a predictable universe.
Even our approach to consciousness has been conditioned by the search for a NC consciousness, and
it has led us to see consciousness very differently from what it really is. A divide has emerged between
the functional and the phenomenal aspects of consciousness (Marcel, 1988), a dichotomy between
feeling (that characterizes the mind according to phenomenology) and doing, i.e. playing an appropriate causal role (that characterizes the mind according to psychology) (Chalmers, 1996). The result,
on the one hand, is that functional hypotheses have little to do with phenomenal reality. Many
functional properties, for example consciousness as awareness (Humphrey, 1983), meta-cognition or
higher-order-thought (Gennaro, 2005; Johnson-Laird, 1988; Rosenthal, 1995, 2000; Weiskrantz, 1997),
dominant focus (Dennett & Kinsbourne, 1992), access (Block, 1995), simulation of behaviour and
perception (Hesslow, 2002), binding or integration (Baars, 1988, 2002; Dennett, 2005; John, 2002;
Naccache, 2005; Schacter, 1993; Tononi, 2005), do not assist us in discriminating between what is
conscious and what is not (Chalmers, 1996; Marcel, 1988). On the other hand, phenomenal
consciousness, when its existence has not been outright denied (Dennett, 1991; Rorty, 1979), has
sometimes been considered as epiphenomenal (Chalmers, 1996; Jackendoff, 1987; Kinsbourne, 2000),
qualitative (Clark, 1992), ineffable (Dennett, 1988), irreducible because of its first person ontology
(Nagel, 1974; Searle, 2004; Toribio, 1993), beyond our cognitive capacities (McGinn, 1991), if not
immaterial (Bechtel, 1988). All these characteristics make it very difficult to attribute a functional role
to phenomenal consciousness.
The phenomenal question and the functional question are actually two sides of the same coin. Since the
theory of evolution is hardly compatible with an epiphenomenal consciousness (Lindahl, 1997; Pauen,
Staudacher, & Walter, 2006; Searle, 1992), phenomenal experience as such must have a causal status
(Marcel, 1988). But epiphenomenalism poses another constraint, as phenomenal experience caused by
NC processes cannot have a causal status (Jackendoff, 1987). Whatever NC processes we may imagine as
giving rise to consciousness, the result would be epiphenomenal. Indeed, any solution we try to give to
the problem – which we may define as the double translation problem – leads us back to NC processes,
which do not require a conscious nature to have a causal effect on other NC processes. From this point
of view, consciousness may be compared to a computer screen. Why does a processor project an
internally codified image on a screen if, in order to process it further, the processor has to translate it
back to the previous format? The image on the screen helps the human user to interact with the
machine, but it has no functional meaning for the actual processor operations (Bechtel, 1988).
The need to attribute a function to phenomenal experience thus poses a very difficult challenge to
any theory of consciousness (Morsella, 2005). A plausible approach may be to consider consciousness
as a specific self-organizing process which is causally effective on the NC system but not caused by NC
processes.¹ Several authors have considered consciousness as a process (Edelman, 2003; Goerner &
Combs, 1998; James, 1890; Manzotti, 2006; Shanon, 2001; Varela & Maturana, 1980). However, the
main problem is to define the nature of the relationship between conscious process and phenomenal
aspect of consciousness. In order to avoid epiphenomenalism, the conscious self-organizing process
should be interwoven with the phenomenal aspect of consciousness and it should not involve agents
external to consciousness (Ellis, 1999).

¹ In this perspective, we could be tempted to eliminate NC processes entirely. But NC processes have a well-recognised role in mental function. The problem is rather, on the one hand, to give a functional role to consciousness and, on the other hand, to bridge the gap between our knowledge and mental functioning, and NC processes are unlikely to bridge such a gap.
Furthermore, if we want to ascribe an adaptive role to conscious processes, they should differentiate
themselves from NC processes. The difference between conscious processes and NC processes cannot
be based solely on the involvement of certain brain areas (Crick & Koch, 1995; Raichle, 1998; Sergent &
Dehaene, 2004; Tong, 2003; Weiskrantz, 1997), but it must necessarily go beyond that and pertain to
the very nature of the processes (Bickhard, 2005). This difference, which does not imply a mind-body
dualism, will now be examined at the functional level of the dualism between conscious processes and
NC processes. A functional dualism does not necessarily imply that the two processes work in parallel,
that they are opposed to each other or that the conscious mind autonomously produces thoughts and
behaviours. It rather means that conscious processes should play a specific causal role on NC processes
and that they should play a non-marginal role in mental function. In order to do so, conscious processes might somehow be complementary to NC processes and integrate with them in mental processes
(Morsella, 2003). In brief, a theory of consciousness would be plausible if it met the following
requirements:
1. Consciousness is the expression of self-organizing processes interwoven with its phenomenal
nature.
2. Conscious processes are different from NC processes and they are not causally reducible to them.
3. Conscious processes play a causal role on NC processes and they have a non-marginal role in
mental function.
Despite being very well-known, the Gestalt approach has rarely been applied to consciousness
(Lehar, 2003; Searle, 1993, 2004). Yet, it may be the model which best meets the requirements above.
Early perceptual organization, which leads to the formation of the visual object, is the result of the
dynamic self-distribution of the processes triggered by sensory input and it is an essential requirement
for further recognition or interpretation processes (Kanizsa, 1991). According to Kanizsa (1985), ‘‘the
formation of a visual object as an entity separate from other objects must occur before the object can be
identified’’. Other approaches have traditionally attributed self-organizing processes to the mind
(Gibson, 1950, 1979; Kitchener, 1986). Piaget conceives knowledge as the product of self-organizing
activity that leads to a continuous reorganization of the existing structure. However, Piaget, like other
authors, does not clearly distinguish between processes which occur in the presence of consciousness
and processes which occur in its absence.
In recent years there have been several attempts to explain the fundamental laws of perception at
a NC level (Desolneux, Moisan, & Morel, 2003; Sigman, Cecchi, Gilbert, & Magnasco, 2001; Sokolov,
1997; Westheimer, 1999). However, none of them accounts for the global aspects of the perceptual field
(Lehar, 2003). According to Elder and Goldberg (2002), "the general problem of computing the complete bounding contour of an object of arbitrary shape in a complex natural image remains essentially unresolved".
However, in this case, the issue is different. Can the Gestalt theory be considered a theory of
consciousness? The Gestalt approach provides an explanation of the functioning of certain conscious
processes, but it does not explain what consciousness is. Consciousness seems to be placed at the same
time at a lower (raw feels, qualia) and at a higher (subjectivity, self-consciousness) level. Moreover, the
Gestalt theory only examines a small part of mental processes, in particular the early perceptual
processes (Herrmann & Bosch, 2001), while consciousness accompanies most mental processes.
However, the main problem is that it is unclear if Gestalt processes of organization of the visual
field already occur in the presence of consciousness, or if, on the contrary, they are necessary for
the existence of consciousness. In other words, there are no criteria defining the relationship
between organization of the visual field and consciousness. For instance, can we consciously
experience a myriad of sensations or does being conscious imply a minimum level of structuring of
the field (Kanizsa, 1980)? If we could equate early perceptual processes (Kanizsa, 1985) to conscious
processes – different from conventional NC processes and not caused by them – and recognition
and interpretation processes to NC processes, then conscious processes would be interwoven with
phenomenal perception and they would play a causal role on NC processes. In other words, visual
phenomenal experience as such would have a causal status (Marcel, 1988); hence, it would not be
epiphenomenal.
In order to broaden our perspective from the field of visual perception to the field of consciousness,
let us start by asking what phenomenal experience should be identified with. With the whole field of
consciousness? With the focus of consciousness (Clark, 2002; O’Regan & Noe, 2001)? Phenomenal
experience can hardly be identified with the whole field of consciousness, since some field components
are privileged from a phenomenal point of view. The components are not all equally important and
salient. The field also includes vague, peripheral, fringe components which we normally overlook when
we want to analyze consciousness (Brown, 2000; Gurwitsch, 1964; James, 1890; Mangan, 1993; Seth,
Baars, & Edelman, 2005). On the other hand, our phenomenal experience includes something more
than mere focus. An object is not phenomenally such without the background to which it belongs. Let
us draw a black triangle on a white sheet of paper. Even if we cut the triangular shape from the white
paper to which it belongs, we will never be able to perceive it without an appropriate background. The
perception of that triangle will be conditioned anyway by the brightness, the colour and the size of the
background and, most of all, by the way that background models the shape of the triangle.
Furthermore, what is the relationship between phenomenal experience and conscious representation? Mental processes operate on representations (Gardner, 1987), and a correspondence in terms of
information structure between what is represented at a conscious level and at a NC level (Chalmers,
1996; Velmans, 2002) is essential to enable a functional interface between conscious processes and NC
processes. Conscious Representation (CR) is distinguished from Non-Conscious Representation (NCR)
on the basis of characteristics such as subjectivity, first person ontology (Searle, 2004), intentionality,
quale nature and other representational qualities (Gennaro, 2005; Seager, 2002). However, despite
such differences, CR is commonly considered as a NCR, a processing unit which is defined by its
extrinsic causality relations (Fodor, 1975), something inert which is processed externally.2
On the other hand, can the functional difference between NC processes and conscious processes be
constituted exactly by a difference between the properties of what is represented at a NC level and the
properties of what is represented at a conscious level? This is not only a format difference (Velmans,
2002). The hypothesis I put forward is that, while at a NC level processes are extrinsic to the representation, at a conscious level they are intrinsic to what is represented, which, as a representational
field, acquires self-organizing properties.
2. Consciousness is a differentiation process
Light is an indispensable element of vision (Gibson, 1950), and the switching on of an inner light is
often used as a metaphor for consciousness (Baars, 1997a; Gregory, 1987; Jaynes, 1976; Shanon, 2001).
However, light is indispensable only because it eliminates absolute darkness, as blindness can come
both from absolute darkness and from absolute light. What is essential is to create a differentiation of the field of consciousness, something that occurs when we see a light object – illuminated by the "light" of consciousness – on a dark background but also, in the same way, when we see a dark object on a light
background. This differentiation tends to take exactly the form of the foreground–background relationship. The bigger the difference between the object-foreground and its background is, the more vivid our conscious experience will
be. When there are several contents, the most differentiated ones are phenomenally salient. Different
contents can be grouped in a Gestalt on the basis of some sort of internal homogeneity and of a contrast
with the rest of the field. This is the process of differentiation of a figure, foreground, content, on the
basis of their relationship with a background. Of these three, I will use the term content because it is
more general and is therefore more suitable to identify phenomenal experience and what is
consciously perceived. In this sense, the content is something internal to what is represented. We do
not perceive what we represent. This meaning might sound counter-intuitive in a NC perspective, in
which the content coincides with what is represented (Velmans, 2002). In fact, in this way I am
attributing to the content its original meaning, i.e. a content is such only in relation to a container,
which in this case is represented by the background.
This meaning is coherent with the phenomenal nature of conscious contents. All our sensations
occur on the background of something. Of course, the process of differentiation is not limited to mere
visual perception. It also refers to indistinct forms, e.g. when pain in a knee emerges from the background of a painless body. It may occur between different modalities, e.g. when we hear a sound
coming from the visual space to our right. It may occur on the background of memory, e.g. when we
hear a sudden noise or we feel a sense of relief from strong pain. At the same time, the idea of qualia, of
raw feels (Clark, 1992) as basic properties of consciousness seems questionable (O’Regan & Noe, 2001).
There is no sensation except within a conscious perceptual experience (Hardin, 1992). Some think that
there can be such a thing as a consciousness inhabited solely by qualia, while a quale invariably belongs
to a content-background complex, the definition of which is never based on the quale alone. Qualia are
first and foremost the expression of the differentiation of a field. We see something yellow; none of us
experiences a homogeneously yellow field of consciousness.
The differentiation of a content can be considered the fundamental property of consciousness, so
that we can state that there is no consciousness without differentiation. NC processes, too, are based on differentiation – e.g. between the presence and the absence of a stimulus, or among a range of stimuli. However, this is an extrinsic differentiation, i.e. a discrimination between two or more states (Edelman & Tononi, 2000) which involves computational operations on substantially
inert representations. Discrimination is an act of judgement (Zeman, 2001) which does not necessarily
take the form of a figure-background relationship. Conscious differentiation rather entails the generation of knowledge (Shanon, 2001) – intrinsic to the conscious state – which does not derive from
a comparison with pre-existing knowledge of reference, but which occurs through the dynamic
foreground–background relationship.
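To make the contrast concrete, the following minimal sketch (in Python, with hypothetical names; it is an illustration, not the author's model) shows what extrinsic differentiation amounts to in NC terms: a discrimination obtained by comparing an inert stimulus representation against pre-existing reference knowledge. On the hypothesis defended here, conscious differentiation is precisely what such a procedure does not capture.

# Illustrative sketch only: "extrinsic" differentiation in the NC sense,
# i.e. discrimination by comparison with stored reference patterns.
# All names are hypothetical; this is not a model of conscious differentiation.
import math

REFERENCES = {                        # pre-existing knowledge of reference
    "vertical_bar": [1.0, 0.0],
    "horizontal_bar": [0.0, 1.0],
}

def discriminate(stimulus, threshold=0.5):
    """Return the best-matching reference label, or None if nothing matches.

    The representation is inert: it is only operated upon from outside,
    by an explicit comparison rule.
    """
    best_label, best_score = None, -math.inf
    for label, reference in REFERENCES.items():
        # similarity as a dot product between stimulus and stored pattern
        score = sum(s * r for s, r in zip(stimulus, reference))
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

print(discriminate([0.9, 0.1]))   # -> "vertical_bar"
print(discriminate([0.1, 0.2]))   # -> None (no state discriminated)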
The basic difference between differentiation of a content and foreground–background organization, which is considered one of the main characteristics of consciousness (Chalmers, 1996; James,
1890; Mangan, 1993; Searle, 1992; Seth, Izhikevich, Reeke, & Edelman, 2006; Shallice, 1988), is that
in the foreground–background organization the visibility of the object is attributed to an agent
external to the image – attention or consciousness itself – which selects it against the background
(James, 1890; LaBerge, 2001; Lambie & Marcel, 2002; Russell, 2005). The foreground–background
organization is considered as something static which is affected by some sort of process. However,
the notion of selectivity does not account for the close interdependence of foreground and background, for which an object is such in relation to the background which delimits it. According to the
differentiation hypothesis, the visibility of an object is attributed to the background itself, whose function is to "make us see" the object which differentiates itself from the background.³ In other
words, conscious contents are not differentiated by an external device, but they autonomously and
actively differentiate themselves from a background through a process intrinsic to the field. It should be specified that terms such as differentiation, dynamic foreground–background relationship and representational field are defined as process properties at a descriptive level. It is beyond the scope of this paper to propose a model of these process properties, e.g. by hypothesizing what physical processes could be involved. I will merely argue that differentiation processes cannot be accounted for by NC process models.

³ If the function of the background is to "make us see", it should not "be seen" despite being part of the phenomenal field. The contradiction is only apparent. Consider ambiguous figures, in which one of the two figures belongs to the phenomenal field but cannot "be seen" when it acts as the background of the other. One may object that we also see the background. But the background we see is perceived – as an "object" – thanks to another background. We see the background of the landscape we observe through the window thanks to the window frame, and we see the frame thanks to the wall which surrounds it (Russell, 2005; Searle, 1993).
In the simplest case, which I call primary differentiation, the content is the expression of a homogeneity which is internal to the content itself and of a contrast with the background. Both factors are
required for such a differentiation to take place. A primary differentiation occurs when one content
differentiates itself from a background in a field which is not further differentiated. A secondary
differentiation occurs when several contents differentiate themselves. However, in this case too, there
is a tendency to form a Gestalt (Searle, 2004), i.e. to differentiate a main, somehow privileged content,
which differentiates itself more than the other contents. In secondary differentiation the main content
organizes the other field contents and is at the same time organized by them. Secondary differentiation
can emerge, as a sub-differentiation, from the inhomogeneity internal to an already differentiated
content (for instance when we see the details of an object), but also, as a meta-differentiation, from the
union of several homogeneous contents (for instance when Gestalts are formed thanks to similarity or
proximity factors). The main contents can thus be external or internal to other contents: even a detail
can organize an object.
Gestalt processes can be considered as processes of differentiation of a content. The objects attract
and look for each other (Kanizsa, 1991). Gestalt laws essentially concern unifying factors. However, in
order for the parts to group together, they must at the same time differentiate themselves as a whole.
A Gestalt is inevitably the most differentiated content too. Otherwise, we consider consciousness as
already acquired.⁴ Differentiation itself can contribute to grouping. For instance, in the case of proximity, we must take into consideration that the larger extent of space delimiting the contiguous objects also contributes to their grouping together.
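As a purely illustrative aside on the role of delimiting space in proximity grouping, the short sketch below (Python, hypothetical names, not drawn from the paper) clusters one-dimensional elements wherever the gap between neighbours is small relative to the larger gaps: it is the comparatively wide empty stretch between clusters, not absolute closeness alone, that makes the elements read as groups.

# Illustrative sketch only: proximity grouping in which the delimiting gaps,
# not just absolute closeness, decide what groups together.
def group_by_proximity(positions, ratio=2.0):
    """Split sorted 1-D positions into groups at gaps that are at least
    `ratio` times larger than the median gap (the delimiting space)."""
    xs = sorted(positions)
    gaps = [b - a for a, b in zip(xs, xs[1:])]
    if not gaps:
        return [xs]
    median_gap = sorted(gaps)[len(gaps) // 2]
    groups, current = [], [xs[0]]
    for gap, x in zip(gaps, xs[1:]):
        if gap >= ratio * median_gap:   # a comparatively large gap delimits a new group
            groups.append(current)
            current = []
        current.append(x)
    groups.append(current)
    return groups

# Same spacing inside each cluster; the wide stretch between 3.0 and 9.0
# is what makes them appear as two groups.
print(group_by_proximity([1.0, 2.0, 3.0, 9.0, 10.0, 11.0]))
# -> [[1.0, 2.0, 3.0], [9.0, 10.0, 11.0]]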
The regulatory principle of dynamic processes in the field of consciousness is the tendency towards
equilibrium of the process of differentiation. Conscious content represents the condition of equilibrium
of the field, which coincides with the maximum possible differentiation in a given field. The condition
of equilibrium in a process of differentiation may be considered similar to what the Gestalt tradition has called equilibrium, good form, simplicity or regularity (Boselie & Wouterlood, 1989; Hatfield & Epstein, 1985;
Hochberg & McAlister, 1953; Kanizsa & Luccio, 1987; Koffka, 1935; Köhler, 1920, 1947). Hence, this
hypothesis agrees with the central idea of the Gestalt theory, i.e. that a self-organizing process tending
towards a condition of equilibrium takes place in the perceptual field. The differences with the Gestalt
model, which will be further analyzed in this paper, may be summarized as follows:
1. The differentiation of a content is the fundamental process of consciousness and it is required for
visual perception and other forms of phenomenal experience, starting from the simplest ones. The
condition of equilibrium of the field of consciousness coincides with the maximum possible level
of differentiation in a given field.
2. The functional role of consciousness is not limited to early perceptual processes (so that there is an
output from conscious early perceptual processes to NC cognitive processes), but it includes all
mental processes occurring in the presence of consciousness.
3. Together with the global aspect, which depends on the way the whole field is organized, we must
consider the existence of basic properties which can be ascribed to the isolable components of that
field.
4. The parts of a conscious field are not only juxtaposed to one another, but they can also be
superimposed over one another.
5. Subjectivity is a particular type of figure-background relationship, in which the background
(subject Self) coincides with the point of view from which we see the external world and ourselves
(object Self).
3. Input and output of conscious differentiation processes
While the traditional view of consciousness is centred on the phenomenon, on the conscious
experience – or, on the contrary, on processes occurring in the absence of phenomenal consciousness –
in this hypothesis the core, as it were, of the idea of consciousness is the process together with the
phenomenal aspect. In the first hypothesis, the main question is "what causes consciousness?", while in the second hypothesis the main question is "how are conscious processes related to the phenomenal aspect?". In its traditional formulation, i.e. presupposing a NC causality (Crick & Koch, 2003; Zeman, 2001), the question "what causes consciousness?" inevitably leads to an epiphenomenal consciousness
and to the impossibility of differentiating conscious processes from NC processes, as illustrated in Fig. 1
(derived from Jackendoff, 1987).
[Fig. 1 near here. Caption: Relationship between consciousness and NC processes. Labels: Conscious experience 1, Conscious experience 2, Conscious processes?; Brain state 1, Brain state 2, NC processes.]

⁴ The difference between this theory and the Gestalt perspective regarding the relationship between perceptual field organization and consciousness is evident in Wertheimer's claim that "when an object appears upon a homogeneous field there must be stimulus differentiation (inhomogeneity) in order that the object may be perceived. A perfectly homogeneous field appears as a total field opposing subdivision, disintegration, etc. To effect a segregation within this field requires relatively strong differentiation between the object and its background" (Wertheimer, 1923). In the Gestalt perspective, differentiation must be strong enough to oppose a perfectly homogeneous perceptual field. However, a perfectly homogeneous field is not the white sheet of paper devoid of the classic Gestalt figures, which is in its turn perceived on the background of something else. Hence, we must wonder whether a perfectly homogeneous field may be perceived, whether it is compatible with consciousness, or whether, on the contrary, the internal differentiation of a field is an essential requirement for it to be perceived. In other words, the Gestalt theory implicitly assumes that the visual space in which the Gestalt processes take place is perceptible. When we analyze consciousness, we have to wonder in which conditions this space may be perceived.
As regards the question "how are conscious processes related to the phenomenal aspect?", the
content – or phenomenal consciousness – is neither the phenomenal effect of NC processes, nor
something which enables conscious processes to take place. On the contrary, the content is the result of
conscious differentiation processes occurring in the presence of a representational field.
Representational field → (differentiation process) → Conscious content
Rather than with the mere content, consciousness should be identified with the whole process, since
the content only represents the condition of equilibrium of the field of consciousness. But how is the
differentiation process related to NC processes? The functional role of consciousness presupposes not
only a causal effectiveness of conscious processes on NC processes, but, in a broader perspective, the
existence of input and output between conscious processes and NC processes (Kendler, 2005).
What do conscious processes organize? Sensory input, as in Gestalt perception, but also
something else. Differentiation is the process which organizes what is brought to consciousness both
by sensory afferences and by internal NC processes. While the Gestalt theory only considers the output
from early perceptual processes to NC processes, conscious processes also involve an input from NC
processes. This occurs in perception itself. For instance, attentive NC processes can increase the
contrast in some field areas and reduce it in others. This does not mean that they take the place of
conscious processes (Baars, 1997b), but that in this way they modify the information material which
will be subjected to the self-organizing processes of the field of consciousness.
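As a rough, hedged illustration of this input relation (Python, hypothetical names; nothing here is the author's model), attentive NC processing can be pictured as a preprocessing step that rescales the contrast of selected regions of a toy representational whole before any further organization. The conscious self-organizing step itself is deliberately left out, since the paper offers no computational model of it and argues that none would suffice.

# Illustrative sketch only: NC attentional processes as a preprocessing stage
# that modifies the material offered to the field of consciousness.
def modulate_contrast(field, regions, gain):
    """Rescale values in the given regions around the field mean.

    field   : list of floats (a toy one-dimensional representational whole)
    regions : list of (start, end) index ranges receiving attention
    gain    : >1 increases local contrast, <1 reduces it
    """
    mean = sum(field) / len(field)
    out = list(field)
    for start, end in regions:
        for i in range(start, end):
            out[i] = mean + gain * (out[i] - mean)
    return out

raw_material = [0.4, 0.5, 0.9, 0.8, 0.5, 0.4]   # afferences plus internal NC material
attended = modulate_contrast(raw_material, [(2, 4)], gain=1.5)
print(attended)   # contrast raised in the attended region only
# The conscious differentiation of a content from this modified material is,
# on the paper's hypothesis, not reducible to a further NC procedure,
# so no such step is modelled here.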
Unlike in the Gestalt field, let us assume that the components of a field of consciousness are not only
juxtaposed to one another, but that they can also be superimposed over one another. The conscious
superimposition can occur between different modalities, as when we smell the scent of a rose while listening to what the person in front of us is saying, or when we appreciate the taste of chocolate (Seager,
2002). Admitting the superimposition of conscious images may help us to explain the peculiar richness
of our phenomenal experiences and the extent and complexity of the functional role played by
consciousness.
On the other hand, the content, which we identify with phenomenal consciousness, is not only the focus of consciousness: first of all, the content involves, in a dynamic sense, the background, but also a more or less homogeneous internal surface, the details "contained" in it, and other field elements which contribute to delimiting the focus or which turn the attention away, etc. A content is perceived
on the basis of the relations internal to what is represented, and it always has a relational nature. Far
from being monadic or non-relational properties, so-called qualia are actually defined in relation to
something else: in the warm/cold, bright/dark, pleasure/pain dichotomies, or in relation to a range of
characteristics, as in the case of colours (Hardin, 1992). A spot is related to the background from which
it differentiates itself. Furthermore, as a homogeneous surface, the spot is the expression of an internal
relationality (Brown, 2000). Other expressions of relational aspects, external and internal to the
content, are simple figures such as an "H", a "T", a wheel with spokes and a question mark. A straight line also entails the symmetrical subdivision of a space and the organization of a set of contiguous and perfectly aligned points. These relational aspects seem to be the primary expression – not mediated by higher processes – of phenomenal experience as such. They are not the expression of a static NCR having the same configuration, except as topological relationships.
To summarize, on the one hand, the input to consciousness corresponds to a representational
whole, i.e. to a non-fully organized set of afferences and internally generated representations. On the
other hand, in phenomenal experience, all components of the conscious field seem to interact with
each other in a condition of equilibrium, on the basis of their mutual relationships. In order for the
differentiation process to take place, we have to assume that the components of what is represented
acquire interactive properties, which cannot be attributed to afferences or to a conventional NC activity.
Only by acquiring interactive properties can the representational whole behave like a representational
field which self-organizes on the basis of existing internal relationships, thus giving rise to phenomenal
experience and reaching a condition of equilibrium. Hence, the differentiation processes can be defined
as conscious because they occur in the presence of the interactive properties which are responsible for
the representational field (Fig. 2).

[Fig. 2 near here. Caption: Input and output of conscious differentiation processes. Labels: afferences; NC processes; non-(fully) organized representational whole; interactive properties; representational field; conscious processes (differentiation); conscious content; differentiation of a R from a representational whole.]
The ability to organize complex and constantly changing relationships existing in the represented
reality makes it necessary to consider a global aspect, which depends on how the whole field organizes
itself following a foreground–background pattern, as well as the existence of basic properties which
can be attributed to the isolable components of that field. These components interact on the basis of
relationships which may hypothetically belong to the homogeneity-contrast and proximity-distance
dimensions. However, it will not be possible to further analyze this important aspect in this short
paper. We can only assume that, in a certain global configuration, i.e. in a foreground–background
structuring of the field, these properties are necessary and sufficient for the existence of phenomenal
consciousness. In these conditions, the content, as the condition of equilibrium of the field, is not the
starting point, but the final result of conscious differentiation processes. It is the result of processes
which are causally effective on the NC system (Fig. 2). Such a perspective is compatible with a non-epiphenomenal consciousness.
Of course, the ensuing question is "what does consciousness cause?", i.e. "what is the causal effect of the content on NC processes?". A content is the expression of how all the components of a representational field interact and organize themselves in delimiting the content itself. We "see", as it were,
all the factors involved in the process. This characteristic of phenomenal consciousness has a precise
meaning in a functional perspective. Phenomenal consciousness does not merely correspond to
a representation, but it is the expression of a process of differentiation which leads, in NC terms, to the
definition of the relationship between a representation and a representational whole (Fig. 2). In brief,
consciousness results from the ability of a representational whole to internally generate a representation, i.e. to differentiate the representation from the whole itself.
The functional meaning of this process may be related to the fact that the relationship between the
representation and the representational field is new, as it is invariably the expression of a self-organizing process of a non-(fully) organized representational whole. It is recognized that processes which
occur in the presence of consciousness are flexible and able to deal with novelty and that complex
learning requires the presence of consciousness to an extent that is inversely proportional to the level
of automation of the new procedures (Baars, 2002; Goerner & Combs, 1998; Gregory, 1988; Oatley,
1988; Pani, 1996; Pribram & Meade, 1999; Raichle, 1998; Schneider & Shiffrin, 1977; Sieb, 2004; Zeman,
2003). A NCR can organize itself on the basis of a set of rules (Dalenoort, 1990), i.e. when all the possible
interactions of the representational whole are predictable. For instance, a representational whole could
internally differentiate a representation if it were provided with the right program. On the
contrary, a basic property of consciously processed representational wholes is that they are not fully
organized, primarily in the comparison between the expected image and a constantly changing
external reality. This characteristic is related to one of the most important, yet often overlooked
properties of consciousness, i.e. its constant changefulness (James, 1890). If an activated representational whole were fully organized, it would not be subjected to conscious self-organizing processes. It
would not become conscious because it would not need to. Instead, it would be subjected to automatic,
non-conscious processes.
4. Subjectivity and consciousness
To hypothesize that differentiation is the basic property of consciousness does not seem to fill the
explanatory gap (Levine, 1983), the so-called hard problem (Chalmers, 1996), related to the first person
ontology of consciousness. This is also the limit of the Gestalt approach. First person ontology means
that consciousness exists only as the experience of a human or animal subject, and in this sense it exists
only from a first person point of view. In this respect, conscious states differ from nearly all the rest of
the Universe – mountains, molecules and tectonic plates – which have an objective mode of existence
(Searle, 2004). Hence, first person ontology of mental phenomena is irreducible to third person
ontology of other phenomena. It is certainly true that all our conscious experiences as human beings
are subjective. But it is true because we cannot avoid "introducing" images about the self in the field of
consciousness. Somatic experiences probably represent our first conscious experiences (Rosenfield,
1993) and they are present even when we think about the meaning of the universe.
But subjectivity is not a basic characteristic of consciousness. It is true that when we watch a picture
we cannot avoid perceiving something of ourselves, yet these sensations only play a marginal role in
the overall visual experience. When we turn our attention to the outside world, our conscious experiences are not characterized by introspective awareness (Seager, 2002). If we hypothetically eliminated the subjective component of consciousness, the phenomenal problem of vision – about why
a yellow triangle appears as such and is not just a configuration eliciting a response – would still
remain unsolved. The fact that a yellow triangle appears to us cannot be the only element accounting
for its appearance and for its phenomenal ontology. On the other hand, adding elements of the self to
an activated representation (Churchland, 2002; Damasio, 1994; Johnson-Laird, 1988) does not add
anything special to that configuration. Moreover, subjectivity has to do not only with what is represented, but also with how the aspects of the self are represented. The self can be both the subject and
the object of the experience.
What is conscious subjectivity then? According to Chalmers (1996), the sense of the self can be an
non-specific element, "a kind of background hum that is somehow fundamental to our consciousness". An aspect of subjectivity, of what we consider our point of view (Hofstadter & Dennett, 1981; Sieb, 2004), is to represent a kind of background of our conscious experiences. This idea is counter-intuitive and phenomenally sound at the same time. Based on our common sense, we do not think of ourselves as part of our
conscious images. Yet, at the same time, we can somehow perceive ourselves while we perceive
ourselves and the world (Hume in Damasio, 1998; Flanagan, 1991; Selby-Bigge, 1978). The idea of the
point of view probably stems from the comparison of several points of view and it is conditioned by the
fact that, as seen from the outside, we are a point in the world in which we live and which we observe
from our perspective. But in our subjective experience, it is the world which is contained in ourselves, in
what we represent of ourselves also at a physical level. In this sense, the way we see the world through
subjectivity is essentially similar to the way we see the landscape through a window in our house.
Subjectivity is primarily a background surrounding a scene, which enables the scene to be seen and, at
the same time, to organize itself through secondary differentiation.
Author's personal copy
B. Forti / New Ideas in Psychology 27 (2009) 312–325
321
Of course a wall is neither a subject nor a point of view. This is not only because the wall is surrounded in its turn by the background of the representation of the self, which characterizes all our
perceptions, but also for other reasons. Firstly, because the representation of the self is far more
complex than a mere visual background. Secondly, the representation of the self is an invariable
element, a constant, as opposed to the much greater variability of the representations of the outside
world, which is the basis of our sense of personal continuity (Gallagher, 2000). These invariable
elements are our own containers of the world. Thirdly, subjectivity implies that the self is represented
not only as the background, but also inside that background, usually in relation to something else.
Subjectivity as a point of view means that we as the background perceive some aspects of ourselves in
relation to an object or to an external situation, e.g. when we want to go out on a date or when we feel
immersed in nature. Otherwise, the Self acts as the background of some of its own aspects, like when
we feel pain or when we are satisfied with ourselves. In other words, subjectivity as a background is
something which makes us perceive something else, be it the external reality or some aspects of the
Self. The latter differentiate themselves as an object Self perceived on the background of the subject
Self. However, being at the same time part of the subject which they "resemble", these aspects differentiate themselves clearly from the objects of the outside world. These peculiar intraphenomenal relations partly account for the fact that, because of subjectivity, we see the world from a "point of view" discriminating between what belongs to us and what does not, and at the same time we can
hardly perceive ourselves entirely (Lambie & Marcel, 2002).
Therefore, subjectivity is a particular type of background-content relationship through which we
see things from a certain point of view. Consequently, it may be considered as the expression of
a differentiation process. The laws regulating this relationship do not differ from the ones regulating
the relationships of proximity or similarity in a Gestalt field or the relationships between an object and
the space surrounding it. However, the latter relationships can be seen as belonging to a third person
ontology, because they can be explained in terms of what happens in the visual field and because their
phenomenal ontology only marginally involves subjectivity. Hence, the first person ontology which
characterizes subjectivity is just a particular case of third person ontology, i.e. of the process of differentiation which leads to the formation of a conscious content.
This view of subjectivity helps us to provisionally outline the relationship between Gestalt organization and consciousness: Gestalt organization of the visual field could be the simplest form of
consciousness. Simpler forms of consciousness do not exist. So-called qualia and raw feels can be
consciously experienced only if they belong – as more or less distinct forms – to a content-background
organization. More complex forms of consciousness could be, like subjectivity, more complex forms of
the content-background organization. Of course, even the simplest visual organization we may
perceive – a figure against a background – is not really a primary consciousness. We could say that
the Gestalt theory tacitly assumes that the background surrounding the sheet of paper does not significantly affect
the Gestalt laws occurring within it.
5. Conclusions
Even if we consider the first person ontology which characterizes subjectivity as a particular case of
third person ontology, a problem remains unsolved. A scientist studying bats can get comprehensive
information as a third person, but he does not know what it is like to be a bat (Nagel, 1974), he does not
have access to the phenomena occurring in the consciousness of the bat. Searle (2004) holds that, just
as we stand in a relation to certain entities, namely our experiences of colours, the bat stands in a relation to certain entities, namely its experiences of what it is like to be a bat. A complete third person description of the world leaves
such entities out, so the description is incomplete. Similarly, the scientist Mary knows everything about
the neurophysiology of colour vision (Jackson, 1986), but she has never directly experienced it. Most
people do not know what it is like to be affected by schizophrenia, dementia, or central blindness. After
all, none of us truly knows what it is like to be someone different from ourselves, with different
experiences from ours.
Yet, this problem coincides with the one pertaining to the description of any physical process and it
is in no way specific to conscious experience. We cannot say that, in this respect, the first person experience of an adult woman or of a bat is different from a hurricane, a mountain, a molecule, a myocardial cell, an
electromagnetic field or the whole rest of the universe. This does not exclude that "being" any of these phenomena is different from describing them. Knowing everything about a certain kind of electromagnetic radiation and about light is not equivalent to emitting light radiation or heat. No process can ontologically be reduced to its description. Moreover, "being" any of these phenomena is different from "being" a phenomenon of the same kind, if the processes involved are complex enough to differentiate themselves from similar processes of the same kind. No process with a certain degree of complexity can ontologically be reduced to a similar process, not even if they both imply feeling or thinking something. Despite the subjective aspects, having a conscious experience is "simply" the expression of a certain
physical process, and in this sense it is not different from other processes involving a certain degree of
complexity.
Since our conscious experiences are characterized by the fact that we feel something, know our mental states and have insight into others' mental states (Brune & Brune-Cohrs, 2006), we may think that
a scientific theory of consciousness would represent a major step forward. The truly misleading
element is not the subjectivity of our mental states, but the correspondence between phenomenal
properties and pre-scientific knowledge. But no scientific knowledge can transform phenomenal
properties into something else. To have insight into others’ mental states is not the same as to feel what
others feel. At the same time, scientific knowledge of conscious experience is not the same as feeling what others feel; it is not the same as having the subjective experience of a conscious organism.
Another relevant problem is the irreducibility of consciousness to NC processes. What does that
imply in the light of the hypothesis put forward in this paper, according to which being conscious
corresponds to differentiating a content? The differentiation of a content, as a fundamental process of
consciousness, occurs, as it were, at a macroscopic level, which we should all be able to verify in our
own experience. Our conscious experience is the expression of how all the components of a representational field interact and organize themselves to differentiate a content. Consequently, differentiation is not something to which consciousness may be reduced; it is only an attempt to grasp its
central aspect. Yet, the process of differentiation cannot be reduced to NC processes in its turn.
As stated above, because of the double translation problem, any conscious format caused exclusively
by NC processes has no functional meaning. This does not mean that epiphenomenalism, despite not
being a very appealing solution, should be completely ruled out in an evolutionary perspective
(Pockett, 2004). Epiphenomenalism would imply that some NC processes, organized in a certain way,
produce not only NC states, but also conscious states as a by-product (Morsella, 2005). This is what is
implicitly held in most current theories of consciousness (Zeman, 2001). However, such an event
appears improbable. It is hard to imagine that NC processes, even at a certain degree of complexity,
could not occur without producing a phenomenal "effect".
A minimum requirement for a process to be defined as conscious is its phenomenal specificity. A
tissue or a lung cannot reproduce itself in the way the whole living organism does. However, they both belong
to the emerging whole and they can be defined as biological entities because, from a structural and
functional point of view, they are specific components of a living organism. Starting from a certain
level, each component of a living organism is characterized by properties which do not guarantee
reproductivity by themselves, but which contribute, together with others, to the reproductivity of
a living organism. In this way there is a shift from a physical-chemical field to a biological field. Instead,
NC processes do not seem to have any specificity with respect to phenomenal experience. Not even the
most complex levels of the NC organization have similar properties to the ones which generate
consciousness. Therefore, a process which seems to have nothing to do with phenomenal consciousness is very unlikely to suddenly and magically generate consciousness. Even a NC ‘‘programme’’
determining the self-organization of the representational whole and enabling the differentiation of
a content has nothing to do with phenomenal consciousness. Such a program not only appears hardly
feasible, but it would also simulate the interactions between the components of the representational
whole. Indeed, a NC organization does not have the characteristics of a field and, since its processes are
syntactically governed (O’Brien & Opie, 1999; Searle, 1980), it does not depend on the relations internal
to the represented reality. Consequently, it would not have the phenomenal properties which our brain
shows in certain circumstances.
Hence, we have to follow another direction. Given a representational whole whose informational
structure is organized in a foreground–background pattern, we should try to understand (1) which
properties enable this representational whole to behave like a self-organizing field and (2) which field
components are responsible for such properties, i.e. which components can be functionally isolated. Of
course, these properties should be such as to generate, at a macro level, a phenomenal experience. These
are complicated matters which cannot be investigated in this short paper.
Another aspect worth considering is the contribution of consciousness to mental function. The
ability of conscious processes to define the relationship between a representation and a representational field starting from a non-fully organized representational whole might be a key element. The
possibility of a superimposition of several images broadens the functional role of consciousness well
beyond early perceptual processes. If the mind derives from the integration of conscious processes and
NC processes, consciousness acquires different meanings depending on the relations existing between
the conscious sphere and the NC sphere. Consciousness has certainly undergone a deep change in the
course of evolution, and the prerogatives of consciousness have certainly modified the role of NC
processes, even before the appearance of cognitive processes. These aspects should therefore be
further analyzed in order to achieve a better understanding of the functional role of consciousness.
References
Baars, B. J. (1988). A cognitive theory of consciousness. New York: Cambridge University Press.
Baars, B. J. (1997a). In the theatre of consciousness. Journal of Consciousness Studies, 4, 292–309.
Baars, B. J. (1997b). Some essential differences between consciousness and attention, perception, and working memory.
Consciousness and Cognition, 6, 363–371.
Baars, B. J. (2002). The conscious access hypothesis: origins and recent evidence. Trends in Cognitive Sciences, 6, 47–52.
Bechtel, W. (1988). Philosophy of mind. An overview for cognitive science. Hillsdale, NJ: Lawrence Erlbaum.
Bickhard, M. H. (2005). Consciousness and reflective consciousness. Philosophical Psychology, 18, 205–218.
Block, N. (1995). On a confusion about a function of consciousness. Behavioral and Brain Sciences, 18, 227–287.
Boselie, F., & Wouterlood, D. (1989). The minimum principle and visual pattern completion. Psychological Research, 51, 93–101.
Brown, S. R. (2000). Tip-of-the-tongue phenomena: an introductory phenomenological analysis. Consciousness and Cognition, 9,
516–537.
Brune, M., & Brune-Cohrs, U. (2006). Theory of mind-evolution, ontogeny, brain mechanisms and psychopathology. Neuroscience and Biobehavioral Reviews, 30, 437–455.
Chalmers, D. J. (1996). The conscious mind. Oxford: Oxford University Press.
Churchland, P. M. (1989). A neurocomputational perspective. The nature of mind and the structure of science. Cambridge, MA: MIT
Press.
Churchland, P. M. (1995). The engine of reason, the seat of the soul. Cambridge, MA: MIT Press.
Churchland, P. S. (2002). Self-representation in nervous systems. Science, 296, 308–310.
Churchland, P. S., & Sejnowski, T. J. (1992). The computational brain. Cambridge, MA: MIT Press.
Clark, A. (1992). Sensory qualities. Oxford: Clarendon Press.
Clark, A. (2002). Is seeing all it seems? Action, reason and the grand illusion. Journal of Consciousness Studies, 9, 181–202.
Crick, F., & Koch, C. (1995). Are we aware of neural activity in primary visual cortex? Nature, 375, 121–123.
Crick, F., & Koch, C. (2003). A framework for consciousness. Nature Neuroscience, 6, 119–126.
Dalenoort, G. J. (1990). Towards a general theory of representation. Psychological Research, 52, 229–237.
Damasio, A. R. (1994). Descartes’ error. New York: Putnam.
Damasio, A. R. (1998). Investigating the biology of consciousness. Philosophical Transactions of the Royal Society of London. Series
B, Biological Sciences, 353, 1879–1882.
Dennett, D. C. (2005). Sweet dreams. Philosophical obstacles to a science of consciousness. Cambridge: The MIT Press.
Dennett, D. C. (1988). Quining qualia. In A. J. Marcel, & E. Bisiach (Eds.), Consciousness in contemporary science (pp. 42–77).
Oxford: Clarendon Press.
Dennett, D. C. (1991). Consciousness explained. Boston: Little, Brown and Company.
Dennett, D. C., & Kinsbourne, M. (1992). Time and the observer: the where and when of consciousness in the brain. Behavioral
and Brain Sciences, 15, 183–247.
Desolneux, A., Moisan, L., & Morel, J. M. (2003). Computational gestalts and perception thresholds. Journal of Physiology, Paris, 97,
311–324.
Edelman, G. M. (2003). Naturalizing consciousness: a theoretical framework. Proceedings of the National Academy of Sciences, 100,
5520–5524.
Edelman, G. M., & Tononi, G. (2000). A universe of consciousness. New York: Basic Books.
Elder, J. H., & Goldberg, R. M. (2002). Ecological statistics of gestalt laws for the perceptual organization of contours. Journal of
Vision, 2, 324–353.
Ellis, R. D. (1999). Dynamical systems as an approach to consciousness: emotion, self-organization, and the mind-body problem.
New Ideas in Psychology, 17, 237–250.
Flanagan, O. J. (1991). The science of the mind. Cambridge: MIT Press.
Fodor, J. A. (1975). The language of thought. New York: Crowell.
Gallagher, S. (2000). Philosophical conceptions of the self: implications for cognitive science. Trends in Cognitive Sciences, 4, 14–21.
Gardner, H. (1987). The mind’s new science. New York: Basic Books.
Gennaro, R. J. (2005). The HOT theory of consciousness. Journal of Consciousness Studies, 12, 3–21.
Gibson, J. J. (1950). The perception of the visual world. Boston: Houghton Mifflin.
Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.
Goerner, S., & Combs, A. (1998). Consciousness as a self-organizing process: an ecological perspective. Bio Systems, 46, 123–127.
Gregory, R. L. (1987). The Oxford companion to the mind. Oxford: Oxford University Press.
Gregory, R. L. (1988). Consciousness in science and philosophy: conscience and con-science. In A. J. Marcel, & E. Bisiach (Eds.),
Consciousness in contemporary science (pp. 257–272). Oxford: Clarendon Press.
Gurwitsch, A. (1964). The field of consciousness. Pittsburgh: Duquesne University Press.
Hardin, C. (1992). Physiology, phenomenology, and Spinoza’s true colors. In A. Beckerman, H. Flohr, & J. Kim (Eds.), Emergence or
reduction? Prospects from nonreductive physicalism (pp. 201–210). Berlin/New York: De Gruyter.
Hatfield, G., & Epstein, W. (1985). The status of the minimum principle in the theoretical analysis of visual perception.
Psychological Bulletin, 97, 155–186.
Herrmann, C. S., & Bosch, V. (2001). Gestalt perception modulates early visual processing. Neuroreport, 12, 901–904.
Hesslow, G. (2002). Conscious thought as simulation of behaviour and perception. Trends in Cognitive Sciences, 6, 242–247.
Hochberg, J., & McAlister, E. (1953). A quantitative approach to figural ‘‘goodness’’. Journal of Experimental Psychology, 46, 361–
364.
Hofstadter, D. R., & Dennett, D. C. (1981). The mind’s I: Fantasies and reflections on self and soul. New York: Basic Books.
Humphrey, N. K. (1983). Consciousness regained. Oxford: Oxford University Press.
Jackendoff, R. (1987). Consciousness and the computational mind. Cambridge, MA: MIT Press.
Jackson, F. (1986). What Mary didn’t know. Journal of Philosophy, 83, 291–295.
James, W. (1890). The principles of psychology. New York: Holt.
Jaynes, J. (1976). The origin of consciousness in the breakdown of the bicameral mind. Boston: Houghton Mifflin.
Jeffery, K. J., & Reid, I. C. (1997). Modifiable neuronal connections: an overview for psychiatrists. American Journal of Psychiatry,
154, 156–164.
John, E. R. (2002). The neurophysics of consciousness. Brain Research Reviews, 39, 1–28.
Johnson-Laird, P. N. (1988). The computer and the mind: An introduction to cognitive science. Cambridge, MA: Harvard University
Press.
Kanizsa, G. (1980). Grammatica del vedere. Bologna: Il Mulino.
Kanizsa, G. (1985). Seeing and thinking. Acta Psychologica, 59, 23–33.
Kanizsa, G. (1991). Vedere e pensare. Bologna: Il Mulino.
Kanizsa, G., & Luccio, R. (1987). Formation and categorization of visual objects: Höffding’s never refuted but always forgotten
argument. Gestalt Theory, 9, 111–127.
Kendler, K. S. (2005). Towards a philosophical structure for psychiatry. American Journal of Psychiatry, 162, 433–440.
Kinsbourne, M. (2000). How is consciousness expressed in the cerebral activation manifold? Brain & Mind, 2, 265–274.
Kitchener, R. (1986). Piaget’s theory of knowledge. New Haven: Yale University Press.
Koffka, K. (1935). Principles of Gestalt psychology. New York: Harcourt, Brace & World.
Köhler, W. (1920). Die physischen Gestalten in Ruhe und im stationären Zustand. Braunschweig: Vieweg.
Köhler, W. (1947). Gestalt psychology. New York: Liveright Publishing Corporation.
LaBerge, D. (2001). Attention, consciousness, and electrical wave activity within the cortical column. International Journal of
Psychophysiology, 43, 5–24.
Lambie, J. A., & Marcel, A. J. (2002). Consciousness and the varieties of emotion experience: a theoretical framework. Psychological Review, 109, 219–259.
Lehar, S. (2003). Gestalt isomorphism and the primacy of subjective conscious experience: a Gestalt Bubble model. Behavioral
and Brain Sciences, 26, 375–408.
Levine, J. (1983). Materialism and qualia: the explanatory gap. Pacific Philosophical Quarterly, 64, 354–361.
Lindahl, B. I. (1997). Consciousness and biological evolution. Journal of Theoretical Biology, 187, 613–629.
Mangan, B. (1993). Taking phenomenology seriously: the ‘‘fringe’’ and its implications for cognitive research. Consciousness and
Cognition, 2, 89–108.
Manzotti, R. (2006). A process oriented view of conscious perception. Journal of Consciousness Studies, 13, 7–41.
Marcel, A. J. (1988). Phenomenal experience and functionalism. In A. J. Marcel, & E. Bisiach (Eds.), Consciousness in contemporary
science (pp. 121–158). Oxford: Clarendon Press.
McGinn, C. (1991). The problem of consciousness: Essays towards a resolution. Oxford: Blackwell.
Morsella, E. (2003). The function of phenomenal states: is there progress for the ‘‘softer problem’’ of consciousness? Psychological Reports, 93, 435–440.
Morsella, E. (2005). The function of phenomenal states: supramodular interaction theory. Psychological Review, 112, 1000–1021.
Naccache, L. (2005). Visual phenomenal consciousness: a neurological guided tour. Progress in Brain Research, 150, 185–195.
Nagel, T. (1974). What is it like to be a bat? Philosophical Review, 83, 435–450.
Oatley, K. (1988). On changing one’s mind: a possible function of consciousness. In A. J. Marcel, & E. Bisiach (Eds.), Consciousness
in contemporary science (pp. 369–389). Oxford: Clarendon Press.
O’Brien, G., & Opie, J. (1999). A connectionist theory of phenomenal experience. Behavioral and Brain Sciences, 22, 127–196.
O’Regan, J. K., & Noë, A. (2001). A sensorimotor account of vision and visual consciousness. Behavioral and Brain Sciences, 24,
939–1031.
Overgaard, M. (2006). Consciousness studies: the view from psychology. British Journal of Psychology, 97, 425–438.
Pani, J. R. (1996). Mental imagery as the adaptationist views it. Consciousness and Cognition, 5, 288–326.
Pauen, M., Staudacher, A., & Walter, S. (2006). Epiphenomenalism: dead end or way out? Journal of Consciousness Studies, 13,
7–19.
Pockett, S. (2004). Does consciousness cause behavior? Journal of Consciousness Studies, 11, 23–40.
Pribram, K. H., & Meade, S. D. (1999). Conscious awareness: processing in the synaptodendritic web. New Ideas in Psychology, 17,
205–214.
Raichle, M. E. (1998). The neural correlates of consciousness: an analysis of cognitive skill learning. Philosophical Transactions of
the Royal Society of London. Series B, Biological Sciences, 353, 1889–1901.
Rorty, R. (1979). Philosophy and the mirror of nature. Princeton: Princeton University Press.
Rosenfield, I. (1993). The strange, familiar and forgotten: An anatomy of consciousness. New York: Vintage.
Rosenthal, D. M. (1995). A theory of consciousness. In N. Block, O. Flanagan, & G. Guzeldere (Eds.), The nature of consciousness
(pp. 729–754). Cambridge, MA: MIT Press.
Rosenthal, D. M. (2000). Consciousness, content, and metacognitive judgments. Consciousness and Cognition, 9, 203–214.
Russell, J. A. (2005). Emotion in human consciousness is built on core affect. Journal of Consciousness Studies, 12, 26–42.
Schacter, D. L. (1993). Neuropsychological evidence for a consciousness system. In A. Goldman (Ed.), Readings in philosophy and
cognitive science (pp. 415–444). Cambridge, MA: MIT Press.
Schneider, W. E., & Shiffrin, R. M. (1977). Controlled and automatic human information processing: I. Detection, search, and
attention. Psychological Review, 84, 1–66.
Seager, W. (2002). Emotional introspection. Consciousness and Cognition, 11, 666–687.
Searle, J. R. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3, 417–457.
Searle, J. R. (1992). The rediscovery of the mind. Cambridge, MA: MIT Press.
Searle, J. R. (1993). The problem of consciousness. Ciba Foundation Symposium, 174, 61–69.
Searle, J. R. (2004). Mind. A brief introduction. Oxford: Oxford University Press.
Selby-Bigge, L. A. (1978). David Hume: A treatise of human nature. Oxford: Oxford University Press.
Sergent, C., & Dehaene, S. (2004). Neural processes underlying conscious perception: experimental findings and global neural
workspace framework. Journal of Physiology, Paris, 98, 374–384.
Seth, A. K., Baars, B. J., & Edelman, D. B. (2005). Criteria for consciousness in humans and other mammals. Consciousness and
Cognition, 14, 119–139.
Seth, A. K., Izhikevich, E., Reeke, G. N., & Edelman, G. M. (2006). Theories and measures of consciousness: an extended
framework. Proceedings of the National Academy of Sciences, 103, 10799–10804.
Shallice, T. (1988). Information-processing models of consciousness: possibilities and problems. In A. J. Marcel, & E. Bisiach (Eds.),
Consciousness in contemporary science (pp. 305–333). Oxford: Clarendon Press.
Shanon, B. (2001). Against the spotlight model of consciousness. New Ideas in Psychology, 19, 77–84.
Sieb, R. A. (2004). The emergence of consciousness. Medical Hypotheses, 63, 900–904.
Sigman, M., Cecchi, G. A., Gilbert, C. D., & Magnasco, M. O. (2001). On a common circle: natural scenes and Gestalt rules.
Proceedings of the National Academy of Sciences, 98, 1935–1940.
Sokolov, E. N. (1997). The problem of gestalt in neurobiology. Neuroscience and Behavioral Physiology, 27, 323–332.
Tong, F. (2003). Primary visual cortex and visual awareness. Nature Reviews Neuroscience, 4, 219–229.
Tononi, G. (2005). Consciousness, information integration, and the brain. Progress in Brain Research, 150, 109–126.
Toribio, J. (1993). Why there still has to be a theory of consciousness. Consciousness and Cognition, 2, 28–47.
Varela, F., & Maturana, H. (1980). Cognition and autopoiesis. Dordrecht: D. Reidel.
Velmans, M. (2002). How could conscious experience affect brains? Journal of Consciousness Studies, 9, 3–29.
Weiskrantz, L. (1997). Consciousness lost and found. A neuropsychological exploration. Oxford: Oxford University Press.
Wertheimer, M. (1923). Untersuchungen zur Lehre von der Gestalt. Psychologische Forschung, 4, 301–350. [Translated as Laws of
organization in perceptual forms. In W. D. Ellis (Ed.), A source book of Gestalt psychology (pp. 71–88). New York: Harcourt
Brace, 1938].
Westheimer, G. (1999). Gestalt theory reconfigured: Max Wertheimer’s anticipation of recent developments in visual neuroscience. Perception, 28, 5–15.
Zeman, A. (2001). Consciousness. Brain, 124, 1263–1289.
Zeman, A. (2003). Consciousness: A user’s guide. New Haven: Yale University Press.