Research Definition
Research is "creative and systematic work
undertaken to increase the stock of
knowledge".[1] It involves the collection,
organization and analysis of evidence to
increase understanding of a topic,
characterized by a particular attentiveness to accounting for and controlling sources of bias and error. A research
project may be an expansion on past work in
the field. To test the validity of instruments,
procedures, or experiments, research may
replicate elements of prior projects or the
project as a whole.
The primary purposes of basic research (as
opposed to applied research) are
documentation, discovery, interpretation,
and the research and development (R&D) of
methods and systems for the advancement of
human knowledge. Approaches to research
depend on epistemologies, which vary
considerably both within and between
humanities and sciences. There are several
forms of research: scientific, humanities,
artistic, economic, social, business,
marketing, practitioner research, life,
technological, etc. The scientific study of
research practices is known as metaresearch.
Creativity
Creativity is a phenomenon whereby
something new and valuable is formed. The
created item may be intangible (such as an
idea, a scientific theory, a musical
composition, or a joke) or a physical object
(such as an invention, a printed literary
work, or a painting).
Scholarly interest in creativity is found in a
number of disciplines, primarily psychology,
business studies, and cognitive science.
However, it can also be found in education,
the humanities (philosophy, the arts) and
theology, social sciences (sociology,
linguistics, economics), engineering,
technology and mathematics. These
disciplines cover the relations between
creativity and general intelligence,
personality type, mental and neural
processes, mental health, artificial
intelligence; the potential for fostering
creativity through education, training,
leadership and organizational practices;[1]
the factors that determine how creativity is
evaluated and perceived;[2] the fostering of
creativity for national economic benefit; and
the application of creative resources to
improve the effectiveness of teaching and
learning.
Ideas
In common usage and in philosophy, ideas are the results of thought.[1] In philosophy, ideas can also be mental representational images of some object.
Many philosophers have considered
ideas to be a fundamental ontological
category of being. The capacity to create
and understand the meaning of ideas is
considered to be an essential and
defining feature of human beings. In a
popular sense, an idea arises in a
reflexive, spontaneous manner, even
without thinking or serious reflection,
for example, when we talk about the
idea of a person or a place. A new or an
original idea can often lead to
innovation.
History
The argument over the underlying nature of ideas is opened by Plato, whose exposition of his theory of forms--which recurs and accumulates over the course of his many dialogues--appropriates the Greek word for things that are "seen" (εἶδος) and gives it a new sense, highlighting those elements of perception which are encountered without material or objective reference available to the eyes (ἰδέα).
As this argument is
disseminated the word "idea"
begins to take on connotations
that would be more familiarly
associated with the term today.
In the fifth book of his
Republic, Plato defines
philosophy as the love of this
formal (as opposed to visual)
way of seeing.
Plato advances the theory that perceived but immaterial objects of awareness constitute a realm of deathless forms or ideas from which the material world emanates. Aristotle
challenges Plato in this area,
positing that the phenomenal
world of ideas arises as mental
composites of remembered
observations. Though it is
anachronistic to apply these
terms to thinkers from antiquity,
it clarifies the argument
between Plato and Aristotle if
we call Plato an idealist thinker
and Aristotle an empiricist
thinker.
This antagonism between
empiricism and idealism
generally characterizes the
dynamism of the argument over
the theory of ideas up to the
present. This schism in theory
has never been resolved to the
satisfaction of thinkers from
both sides of the disagreement
and is represented today in the
split between analytic and
continental schools of
philosophy. Persistent
contradictions between classical
physics and quantum mechanics
may be pointed to as a rough
analogy for the gap between the
two schools of thought.
Philosophy
Plato
Main article: Theory of Forms
Plato in Ancient Greece was
one of the earliest philosophers
to provide a detailed discussion
of ideas and of the thinking
process (in Plato's Greek the word idea carries a rather different sense from our modern English term). Plato argued in
dialogues such as the Phaedo,
Symposium, Republic, and
Timaeus that there is a realm of
ideas or forms (eidē), which
exist independently of anyone
who may have thoughts on
these ideas, and it is the ideas
which distinguish mere opinion
from knowledge, for unlike
material things which are
transient and liable to contrary
properties, ideas are unchanging
and nothing but just what they
are. Consequently, Plato seems
to assert forcefully that material
things can only be the objects of
opinion; real knowledge can
only be had of unchanging
ideas. Furthermore, ideas for
Plato appear to serve as
universals; consider the
following passage from the
Republic:
"We both assert that there are,"
I said, "and distinguish in
speech, many fair things, many
good things, and so on for each
kind of thing."
"Yes, so we do."
"And we also assert that there is
a fair itself, a good itself, and so
on for all things that we set
down as many. Now, again, we
refer to them as one idea of
each as though the idea were
one; and we address it as that
which really is."
"That's so."
"And, moreover, we say that the
former are seen, but not
intellected, while the ideas are
intellected but not seen."
— Plato, Bk. VI 507b-c
René Descartes
Descartes often wrote of the
meaning of the idea as an image
or representation, often but not
necessarily "in the mind",
which was well known in the
vernacular. Despite Descartes'
invention of the non-Platonic
use of the term, he at first followed this vernacular use. In
his Meditations on First
Philosophy he says, "Some of
my thoughts are like images of
things, and it is to these alone
that the name 'idea' properly
belongs." He sometimes
maintained that ideas were
innate[3] and uses of the term
idea diverge from the original
primary scholastic use. He
provides multiple nonequivalent definitions of the
term, uses it to refer to as many
as six distinct kinds of entities,
and divides ideas inconsistently
into various genetic
categories.[4] For him
knowledge took the form of
ideas and philosophical
investigation is devoted to the
consideration of these entities.
John Locke
John Locke's use of idea stands
in striking contrast to Plato's.[5]
In his Introduction to An Essay
Concerning Human
Understanding, Locke defines
idea as "that term which, I
think, serves best to stand for
whatsoever is the object of the
understanding when a man
thinks, I have used it to express
whatever is meant by phantasm,
notion, species, or whatever it is
which the mind can be
employed about in thinking;
And I could not avoid
frequently using it."[6] He said
he regarded the contribution
offered in his essay as necessary
to examine our own abilities
and discern what objects our
understandings were, or were
not, fitted to deal with. In this
style of ideal conception other
outstanding figures followed in
his footsteps — Hume and Kant
in the 18th century, Arthur
Schopenhauer in the 19th
century, and Bertrand Russell,
Ludwig Wittgenstein, and Karl
Popper in the 20th century.
Locke always believed in good sense — not pushing things to extremes and taking fully into account the plain facts of the matter. He prioritized common-sense ideas that struck him as "good-tempered, moderate, and down-to-earth."
As John Locke studied humans
in his work “An Essay
Concerning Human
Understanding”, he continually
referenced Descartes for ideas
as he asked this fundamental
question: “When we are
concerned with something
about which we have no certain
knowledge, what rules or
standards should guide how
confident we allow ourselves to
be that our opinions are right?”
[7]
Put in another way, he
inquired into how humans
might verify their ideas, and
considered the distinctions
between different types of
ideas. Locke found that an idea
“can simply mean some sort of
brute experience.”[8] He shows
that there are “No innate
principles in the mind.”[9] Thus,
he concludes that “our ideas are
all experienced in nature.”[10]
An experience can either be a
sensation or a reflection:
“consider whether there are any
innate ideas in the mind before
any are brought in by the
impression from sensation or
reflection.” [7] Therefore, an
idea was an experience in which
the human mind apprehended
something.
In a Lockean view, there are
really two types of ideas:
complex and simple. Simple
ideas are the building blocks for
more complex ideas, and
“While the mind is wholly
passive in the reception of
simple ideas, it is very active in
the building of complex
ideas…”[11] Complex ideas,
therefore, can either be modes,
substances, or relations.
Modes combine simpler ideas in
order to convey new
information. For instance,
David Banach [12] gives the
example of beauty as a mode.
He points to combinations of
color and form as qualities
constitutive of this mode.
Substances, however, are
distinct from modes. Substances
convey the underlying formal
unity of certain objects, such as
dogs, cats, or tables. Relations
represent the relationship
between two or more ideas that
contain analogous elements to
one another without the
implication of underlying
formal unity. A painting and a piece of music, for example, can both be called 'art' without
belonging to the same
substance. They are related as
forms of art (the term 'art' in this
illustration would be a 'mode of
relations'). In this way, Locke
concluded that the formal
ambiguity around ideas he
initially sought to clarify had
been resolved.
David Hume
Hume differs from Locke by
limiting idea to the more or less
vague mental reconstructions of
perceptions, the perceptual
process being described as an
"impression."[13][14] Hume
shared with Locke the basic
empiricist premise that it is only
from life experiences (whether
their own or others') that
humans' knowledge of the
existence of anything outside of themselves can be ultimately derived, and that they shall carry on doing what they are prompted to do by their emotional drives of varying kinds. In choosing the means to those ends, they shall follow their accustomed associations of ideas. Hume contended and defended the notion that "reason alone is merely the 'slave of the passions'."[15][16]
Immanuel Kant
"Modern Book Printing" from
the Walk of Ideas
Immanuel Kant defines ideas by
distinguishing them from
concepts. Concepts arise by the
compositing of experience into
abstract categorial
representations of presumed or
encountered empirical objects
whereas the origin of ideas, for
Kant, is a priori to experience.
Regulative ideas, for example,
are ideals that one must tend
towards, but by definition may
not be completely realized as
objects of empirical experience.
Liberty, according to Kant, is an
idea whereas "tree" (as an
abstraction covering all species
of trees) is a concept. The
autonomy of the rational and
universal subject is opposed to
the determinism of the
empirical subject.[17] Kant felt
that it is precisely in knowing
its limits that philosophy exists.
The business of philosophy he
thought was not to give rules,
but to analyze the private judgement of good common sense.
Rudolf Steiner
Whereas Kant declares limits to
knowledge ("we can never
know the thing in itself"), in his
epistemological work, Rudolf
Steiner sees ideas as "objects of
experience" which the mind
apprehends, much as the eye
apprehends light. In Goethean
Science (1883), he declares,
"Thinking ... is no more and no
less an organ of perception than
the eye or ear. Just as the eye
perceives colors and the ear
sounds, so thinking perceives
ideas." He holds this to be the
premise upon which Goethe
made his natural-scientific
observations.
Wilhelm Wundt
Wundt widens the term from
Kant's usage to include
conscious representation of
some object or process of the
external world. In so doing, he
includes not only ideas of
memory and imagination, but
also perceptual processes,
whereas other psychologists
confine the term to the first two
groups.[13] One of Wundt's main
concerns was to investigate
conscious processes in their
own context by experiment and
introspection. He regarded both
of these as exact methods,
interrelated in that
experimentation created optimal
conditions for introspection.
Where the experimental method
failed, he turned to other
objectively valuable aids,
specifically to those products of
cultural communal life which
lead one to infer particular
mental motives. Outstanding
among these are speech, myth,
and social custom. Wundt designated the basic mental activity apperception — a
unifying function which should
be understood as an activity of
the will. Many aspects of his
empirical physiological
psychology are used today. One
is his principles of mutually
enhanced contrasts and of
assimilation and dissimilation (i.e. in color and form perception) and his advocacy of objective methods of expression and of recording results, especially in language. Another
is the principle of heterogony of
ends — that multiply motivated
acts lead to unintended side
effects which in turn become
motives for new actions.[18]
Charles Sanders Peirce
C. S. Peirce published the first
full statement of pragmatism in
his important works "How to
Make Our Ideas Clear" (1878)
and "The Fixation of Belief"
(1877).[19] In "How to Make
Our Ideas Clear" he proposed
that a clear idea (in his study he
uses concept and idea as
synonymic) is defined as one,
when it is apprehended such as
it will be recognized wherever it
is met, and no other will be
mistaken for it. If it fails of this
clearness, it is said to be
obscure. He argued that to
understand an idea clearly we
should ask ourselves what
difference its application would
make to our evaluation of a
proposed solution to the
problem at hand. Pragmatism (a
term he appropriated for use in
this context), he defended, was
a method for ascertaining the
meaning of terms (as a theory of
meaning). The originality of his ideas lies in their rejection of the view of knowledge as impersonal facts, which had been accepted by scientists for some 250 years.
Peirce contended that we
acquire knowledge as
participants, not as spectators.
He felt "the real", sooner or
later, is composed of
information that has been
acquired through ideas and
knowledge and ordered by the
application of logical reasoning.
The rational distinction of the
empirical object is not prior to
its perception by a
knowledgeable subject, in other
words. He also published many
papers on logic in relation to
ideas.
G. F. Stout and J. M. Baldwin
G. F. Stout and J. M. Baldwin,
in the Dictionary of Philosophy
and Psychology, define the idea
as "the reproduction with a
more or less adequate image, of
an object not actually present to
the senses." [20] They point out
that an idea and a perception are
by various authorities
contrasted in various ways.
"Difference in degree of
intensity", "comparative
absence of bodily movement on
the part of the subject",
"comparative dependence on
mental activity", are suggested
by psychologists as
characteristic of an idea as
compared with a perception.[13]
It should be observed that an
idea, in the narrower and
generally accepted sense of a
mental reproduction, is
frequently composite. That is, as in the case of the idea of a chair, a great many objects, differing materially in detail, all call up a single idea. When a man, for
example, has obtained an idea
of chairs in general by
comparison with which he can
say "This is a chair, that is a
stool", he has what is known as
an "abstract idea" distinct from
the reproduction in his mind of
any particular chair (see
abstraction). Furthermore, a
complex idea may not have any
corresponding physical object,
though its particular constituent
elements may severally be the
reproductions of actual
perceptions. Thus the idea of a
centaur is a complex mental
picture composed of the ideas
of man and horse, that of a
mermaid of a woman and a
fish.[13]
Walter Benjamin
"Ideas are to objects [of
perception] as constellations are
to stars,"[21] writes Walter
Benjamin in the introduction to
his The Origin of German
Tragic Drama. "The set of
concepts which assist in the
representation of an idea lend it
actuality as such a
configuration. For phenomena
are not incorporated into ideas.
They are not contained in them.
Ideas are, rather, their objective
virtual arrangement, their
objective interpretation."
As George Steiner summarizes, Benjamin advances that an idea is "that moment in the substance and being of a word in which this word has become, and performs, as a symbol."[21]
In this way techne--art and
technology--may be
represented, ideally, as
"discrete, fully autonomous
objects...[thus entering] into
fusion without losing their
identity."[21]
In anthropology and the social sciences
Diffusion studies explore the
spread of ideas from culture to
culture. Some anthropological
theories hold that all cultures
imitate ideas from one or a few
original cultures, the Adam of
the Bible, or several cultural
circles that overlap.
Evolutionary diffusion theory
holds that cultures are
influenced by one another but
that similar ideas can be
developed in isolation.
In the mid-20th century, social
scientists began to study how
and why ideas spread from one
person or culture to another.
Everett Rogers pioneered diffusion of innovations studies, using research to identify the factors in adoption and the profiles of adopters of ideas. In 1976, in
his book The Selfish Gene,
Richard Dawkins suggested
applying biological
evolutionary theories to the
spread of ideas. He coined the
term meme to describe an
abstract unit of selection,
equivalent to the gene in
evolutionary biology.
Ideas & Intellectual Property
Main articles: Intellectual property and Idea-expression divide
Relationship between ideas and patents
On susceptibility to exclusive property
It has been pretended by some, (and in England
especially,) that inventors have a natural and
exclusive right to their inventions, and not
merely for their own lives, but inheritable to
their heirs. But while it is a moot question
whether the origin of any kind of property is
derived from nature at all, it would be singular
to admit a natural and even an hereditary right
to inventors. It is agreed by those who have
seriously considered the subject, that no
individual has, of natural right, a separate
property in an acre of land, for instance.
By a universal law, indeed, whatever, whether
fixed or movable, belongs to all men equally
and in common, is the property for the moment
of him who occupies it, but when he
relinquishes the occupation, the property goes
with it. Stable ownership is the gift of social
law, and is given late in the progress of society.
It would be curious then, if an idea, the fugitive
fermentation of an individual brain, could, of
natural right, be claimed in exclusive and stable
property.
If nature has made any one thing less
susceptible than all others of exclusive property,
it is the action of the thinking power called an
idea, which an individual may exclusively
possess as long as he keeps it to himself; but the
moment it is divulged, it forces itself into the
possession of every one, and the receiver cannot
dispossess himself of it. Its peculiar character,
too, is that no one possesses the less, because
every other possesses the whole of it. He who
receives an idea from me, receives instruction
himself without lessening mine; as he who
lights his taper at mine, receives light without
darkening me.
That ideas should freely spread from one to
another over the globe, for the moral and mutual
instruction of man, and improvement of his
condition, seems to have been peculiarly and
benevolently designed by nature, when she
made them, like fire, expansible over all space,
without lessening their density in any point, and
like the air in which we breathe, move, and have
our physical being, incapable of confinement or
exclusive appropriation. Inventions then cannot,
in nature, be a subject of property.
Society may give an exclusive right to the
profits arising from them, as an encouragement
to men to pursue ideas which may produce
utility, but this may or may not be done,
according to the will and convenience of the
society, without claim or complaint from
anybody. Accordingly, it is a fact, as far as I am
informed, that England was, until we copied her,
the only country on earth which ever, by a
general law, gave a legal right to the exclusive
use of an idea. In some other countries it is
sometimes done, in a great case, and by a
special and personal act but, generally speaking,
other nations have thought that these
monopolies produce more embarrassment than
advantage to society.[22]
— Thomas Jefferson, letter to Isaac McPherson,
13 August 1813
Patent law regulates various aspects related to
the functional manifestation of inventions based
on new ideas or incremental improvements to
existing ones. Thus, patents have a direct
relationship to ideas.
Relationship between ideas and copyrights
In some cases, authors can be granted limited
legal monopolies on the manner in which
certain works are expressed. This is known
colloquially as copyright, although the term
intellectual property is used mistakenly in place
of copyright. Copyright law regulating the
aforementioned monopolies generally does not
cover the actual ideas. The law does not bestow
the legal status of property upon ideas per se.
Instead, laws purport to regulate events related
to the usage, copying, production, sale and other
forms of exploitation of the fundamental
expression of a work, that may or may not carry
ideas. Copyright law is fundamentally different
from patent law in this respect: patents do grant
monopolies on ideas (more on this below).
A copyright is meant to regulate some aspects of
the usage of expressions of a work, not an idea.
Thus, copyrights have a negative relationship to
ideas.
Work means a tangible medium of expression. It
may be an original or derivative work of art, be
it literary, dramatic, musical recitation, artistic,
related to sound recording, etc. In (at least)
countries adhering to the Berne Convention,
copyright automatically starts covering the work
upon the original creation and fixation thereof,
without any extra steps. While creation usually
involves an idea, the idea in itself does not
suffice for the purposes of claiming
copyright.[23][24][25][26][27]
Relationship of ideas to confidentiality
agreements
Confidentiality and nondisclosure agreements
are legal instruments that assist corporations and
individuals in keeping ideas from escaping to
the general public. Generally, these instruments
are covered by contract law.
Idealism
In philosophy, the term idealism
identifies and describes metaphysical
perspectives which assert that reality is
indistinguishable and inseparable from
perception and understanding; that reality
is a mental construct closely connected to
ideas.[1] Idealist perspectives are in two
categories: subjective idealism, which
proposes that a material object exists
only to the extent that a human being
perceives the object; and objective
idealism, which proposes the existence of
an objective consciousness that exists
prior to and independently of human
consciousness, thus the existence of the
object is independent of human
perception.
The philosopher George Berkeley said
that the essence of an object is to be
perceived. By contrast, Immanuel Kant
said that idealism "does not concern the
existence of things", but that "our modes
of representation" of things such as space
and time are not "determinations that
belong to things in themselves", but are
essential features of the human mind.[2]
In the philosophy of "transcendental
idealism" Kant proposes that the objects
of experience rely upon their existence in the human mind that perceives the objects, and that the nature of the thing-in-itself is external to human experience,
and cannot be conceived without the
application of categories, which give
structure to the human experience of
reality.
Epistemologically, idealism is
accompanied by philosophical skepticism
about the possibility of knowing the
existence of any thing that is independent
of the human mind. Ontologically,
idealism asserts that the existence of
things depends upon the human mind;[3]
thus, ontological idealism rejects the
perspectives of physicalism and dualism,
because neither perspective gives
ontological priority to the human mind.
In contrast to materialism, idealism
asserts the primacy of consciousness as
the origin and prerequisite of
phenomena. Idealism holds that
consciousness (the mind) is the origin of
the material world.[4]
Indian and Greek philosophers proposed
the earliest arguments that the world of
experience is grounded in the mind's
perception of the physical world. Hindu
idealism and Greek neoplatonism gave
panentheistic arguments for the existence
of an all-pervading consciousness as the
true nature, as the true grounding of
reality.[5] In contrast, the Yogācāra
school, which arose within Mahayana
Buddhism in India in the 4th century
AD,[6] based its "mind-only" idealism to
a greater extent on phenomenological
analyses of personal experience. This
turn toward the subjective anticipated
empiricists such as George Berkeley,
who revived idealism in 18th-century
Europe by employing skeptical
arguments against materialism.
Beginning with Kant, German idealists
such as Georg Wilhelm Friedrich Hegel,
Johann Gottlieb Fichte, Friedrich
Wilhelm Joseph Schelling, and Arthur
Schopenhauer dominated 19th-century
philosophy. This tradition, which
emphasized the mental or "ideal"
character of all phenomena, gave birth to
idealistic and subjectivist schools ranging
from British idealism to phenomenalism
to existentialism.
Idealism as a philosophy came under
heavy attack in the West at the turn of the
20th century. The most influential critics
of both epistemological and ontological
idealism were G. E. Moore and Bertrand
Russell,[7] but its critics also included the
new realists. According to Stanford
Encyclopedia of Philosophy, the attacks
by Moore and Russell were so influential
that even more than 100 years later "any
acknowledgment of idealistic tendencies
is viewed in the English-speaking world
with reservation". However, many
aspects and paradigms of idealism did
still have a large influence on subsequent
philosophy.[8] Phenomenology, an
influential strain of philosophy since the
beginning of the 20th century, also draws
on the lessons of idealism. In his Being
and Time, Martin Heidegger famously
states:
If the term idealism amounts to the
recognition that being can never be
explained through beings, but, on the
contrary, always is the transcendental in
its relation to any beings, then the only
right possibility of philosophical
problematics lies with idealism. In that
case, Aristotle was no less an idealist
than Kant. If idealism means a reduction
of all beings to a subject or a
consciousness, distinguished by staying
undetermined in its own being, and
ultimately is characterised negatively as
non-thingly, then this idealism is no less methodically naive than the most coarse-grained realism.
Definitions
Idealism is a term with several related
meanings. It comes via Latin idea from the
Ancient Greek idea (ἰδέα) from idein (ἰδεῖν),
meaning 'to see'. The term entered the
English language by 1743.[10][11] It was first
used in the abstract metaphysical sense
"belief that reality is made up only of ideas"
by Christian Wolff in 1747.[8] The term re-entered the English language in this abstract sense by 1796.[12]
In ordinary language, as when speaking of
Woodrow Wilson's political idealism, it
generally suggests the priority of ideals,
principles, values, and goals over concrete
realities. Idealists are understood to
represent the world as it might or should be,
unlike pragmatists, who focus on the world
as it presently is. In the arts, similarly,
idealism affirms imagination and attempts to
realize a mental conception of beauty, a
standard of perfection, juxtaposed to
aesthetic naturalism and realism.[13][14] The
term idealism is also sometimes used in a
sociological sense, which emphasizes how
human ideas—especially beliefs and
values—shape society.[15]
Any philosophy that assigns crucial
importance to the ideal or spiritual realm in
its account of human existence may be
termed "idealist". Metaphysical idealism is
an ontological doctrine that holds that reality
itself is incorporeal or experiential at its
core. Beyond this, idealists disagree on
which aspects of the mental are more basic.
Platonic idealism affirms that abstractions
are more basic to reality than the things we
perceive, while subjective idealists and
phenomenalists tend to privilege sensory
experience over abstract reasoning.
Epistemological idealism is the view that
reality can only be known through ideas,
that only psychological experience can be
apprehended by the mind.[3][16][17]
Subjective idealists like George Berkeley
are anti-realists in terms of a mindindependent world. However, not all
idealists restrict the real or the knowable to
our immediate subjective experience.
Objective idealists make claims about a
transempirical world, but simply deny that
this world is essentially divorced from or
ontologically prior to the mental. Thus, Plato
and Gottfried Leibniz affirm an objective
and knowable reality transcending our
subjective awareness—a rejection of
epistemological idealism—but propose that
this reality is grounded in ideal entities, a
form of metaphysical idealism. Nor do all
metaphysical idealists agree on the nature of
the ideal; for Plato, the fundamental entities
were non-mental abstract forms, while for
Leibniz they were proto-mental and concrete
monads.[18]
As a rule, transcendental idealists like Kant
affirm idealism's epistemic side without
committing themselves to whether reality is
ultimately mental; objective idealists like
Plato affirm reality's metaphysical basis in
the mental or abstract without restricting
their epistemology to ordinary experience;
and subjective idealists like Berkeley affirm
both metaphysical and epistemological
idealism.[19]
Classical idealism
Pre-Socratic philosophy
Idealism as a form of metaphysical monism
holds that consciousness, not matter, is the
ground of all being. It is monist because it
holds that there is only one type of thing in
the universe and idealist because it holds the one thing to be consciousness.
Anaxagoras (480 BC) taught that "all
things" were created by Nous ("Mind"). He
held that Mind held the cosmos together and
gave human beings a connection to the
cosmos or a pathway to the divine.
Platonism and neoplatonism
Plato's theory of forms or "ideas" describes
ideal forms (for example the platonic solids
in geometry or abstracts like Goodness and
Justice), as universals existing
independently of any particular instance.[20]
Arne Grøn calls this doctrine "the classic
example of a metaphysical idealism as a
transcendent idealism",[21] while Simone
Klein calls Plato "the earliest representative
of metaphysical objective idealism".
Nevertheless, Plato holds that matter is real,
though transitory and imperfect, and is
perceived by our body and its senses and
given existence by the eternal ideas that are
perceived directly by our rational soul. Plato
was therefore a metaphysical and
epistemological dualist, an outlook that
modern idealism has striven to avoid:[22]
Plato's thought cannot therefore be counted
as idealist in the modern sense.
With the neoplatonist Plotinus, wrote
Nathaniel Alfred Boll, "there even appears,
probably for the first time in Western
philosophy, idealism that had long been
current in the East even at that time, for it
taught... that the soul has made the world by
stepping from eternity into time...".[23][24]
Similarly, in regard to passages from the
Enneads, "The only space or place of the
world is the soul" and "Time must not be
assumed to exist outside the soul".[25]
Ludwig Noiré wrote: "For the first time in
Western philosophy we find idealism proper
in Plotinus".[5] However, Plotinus does not
address whether we know external
objects,[26] unlike Schopenhauer and other
modern philosophers.
Christian philosophy
Christian theologians have held idealist
views,[27] often based on neoplatonism,
despite the influence of Aristotelian
scholasticism from the 12th century onward.
However, there is certainly a sense in which
the scholastics retained the idealism that
came via St. Augustine right back to
Plato.[28]
Later western theistic idealism such as that
of Hermann Lotze offers a theory of the
"world ground" in which all things find their
unity: it has been widely accepted by
Protestant theologians.[29] Several modern
religious movements, for example the
organizations within the New Thought
Movement and the Unity Church, may be
said to have a particularly idealist
orientation. The theology of Christian
Science includes a form of idealism: it
teaches that all that truly exists is God and
God's ideas; that the world as it appears to
the senses is a distortion of the underlying
spiritual reality, a distortion that may be
corrected (both conceptually and in terms of
human experience) through a reorientation
(spiritualization) of thought.[30]
Chinese philosophy
Wang Yangming, a Ming Chinese neo-Confucian philosopher, official,
educationist, calligraphist and general, held
that objects do not exist entirely apart from
the mind because the mind shapes them. It is
not the world that shapes the mind but the
mind that gives reason to the world, so the
mind alone is the source of all reason,
having an inner light, an innate moral
goodness and understanding of what is
good.
Idealism in Vedic and
Buddhist thought
The sage Yajnavalkya (possibly 8th century
BCE) is one of the earliest exponents of
idealism, and is a major figure in the
Brihadaranyaka Upanishad.
There are currents of idealism throughout
Indian philosophy, ancient and modern.
Hindu idealism often takes the form of
monism or non-dualism, espousing the view
that a unitary consciousness is the essence or
meaning of the phenomenal reality and
plurality.
Buddhist idealism on the other hand is more
epistemic and is not a metaphysical monism,
which Buddhists consider eternalistic and
hence not the Middle Way between
extremes espoused by the Buddha.
The oldest reference to Idealism in Vedic
texts is in Purusha Sukta of the Rig Veda.
This Sukta espouses panentheism by
presenting cosmic being Purusha as both
pervading all universe and yet being
transcendent to it.[31] Absolute idealism can
be seen in Chāndogya Upaniṣad, where
things of the objective world like the five
elements and the subjective world such as
will, hope, memory etc. are seen to be
emanations from the Self.[32]
Indian philosophy
Idealist notions have been propounded by
the Vedanta schools of thought, which use
the Vedas, especially the Upanishads as
their key texts. Idealism was opposed by
dualists Samkhya, the atomists Vaisheshika,
the logicians Nyaya, the linguists Mimamsa
and the materialists Cārvāka. There are
various sub schools of Vedanta, like Advaita
Vedanta (non-dual), Vishishtadvaita and
Bhedabheda Vedanta (difference and non-difference).
The schools of Vedanta all attempt to
explain the nature and relationship of
Brahman (universal soul or Self) and Atman
(individual self), which they see as the
central topic of the Vedas. One of the
earliest attempts at this was Bādarāyaņa's
Brahma Sutras, which is canonical for all
Vedanta sub-schools. Advaita Vedanta is a
major sub school of Vedanta which holds a
non-dual Idealistic metaphysics. According
to Advaita thinkers like Adi Shankara (788–
820) and his contemporary Maṇḍana Miśra,
Brahman, the single unitary consciousness
or absolute awareness, appears as the
diversity of the world because of maya or
illusion, and hence perception of plurality is
mithya, error. The world and all beings or
souls in it have no separate existence from
Brahman, universal consciousness, and the
seemingly independent soul (jiva)[33] is
identical to Brahman. These doctrines are
represented in verses such as brahma satyam
jagan mithya; jīvo brahmaiva na aparah
(Brahman is alone True, and this world of
plurality is an error; the individual self is not
different from Brahman). Other forms of
Vedanta like the Vishishtadvaita of
Ramanuja and the Bhedabheda of Bhāskara
are not as radical in their non-dualism,
accepting that there is a certain difference
between individual souls and Brahman.
The Dvaita school of Vedanta of Madhvacharya
maintains the opposing view that the world
is real and eternal. It also argues that real
Atman fully depends on the reflection of
independent Brahman.
The Tantric tradition of Kashmir Shaivism
has also been categorized by scholars as a
form of Idealism.[34] The key thinker of this
tradition is the Kashmirian Abhinavagupta
(975–1025 CE).
Modern Vedic Idealism was defended by the
influential Indian philosopher Sarvepalli
Radhakrishnan in his 1932 An Idealist View
of Life and other works, which espouse
Advaita Vedanta. The essence of Hindu
Idealism is captured by such modern writers
as Sri Nisargadatta Maharaj, Sri Aurobindo,
P. R. Sarkar, and Sohail Inayatullah.
Buddhist philosophy
Buddhist views which can be said to be
similar to Idealism appear in Mahayana
Buddhist texts such as the Samdhinirmocana
sutra, Laṅkāvatāra Sūtra, Dashabhumika
sutra, etc.[35] These were later expanded
upon by Indian Buddhist philosophers of the
influential Yogacara school, like
Vasubandhu, Asaṅga, Dharmakīrti, and
Śāntarakṣita. Yogacara thought was also
promoted in China by Chinese philosophers
and translators like Xuanzang.
There is a modern scholarly disagreement
about whether Yogacara Buddhism can be
said to be a form of idealism. As Saam
Trivedi notes: "on one side of the debate,
writers such as Jay Garfield, Jeffrey
Hopkins, Paul Williams, and others maintain
the idealism label, while on the other side,
Stefan Anacker, Dan Lusthaus, Richard
King, Thomas Kochumuttom, Alex
Wayman, Janice Dean Willis, and others
have argued that Yogacara is not
idealist."[36] The central point of issue is
what Buddhist philosophers like
Vasubandhu who used the term vijñaptimatra ("representation-only" or "cognitiononly") and formulated arguments to refute
external objects actually meant to say.
Vasubandhu's works include a refutation of external objects or externality itself and argue that the true nature of reality is beyond subject-object distinctions.[36] He views ordinary conscious experience as deluded in its perceptions of an external world separate from itself and instead argues that all there is is vijñapti (representation or conceptualization).[36] Hence Vasubandhu
begins his Vimsatika with the verse: All this
is consciousness-only, because of the
appearance of non-existent objects, just as
someone with an optical disorder may see
non-existent nets of hair.[36]
Likewise, the Buddhist philosopher
Dharmakirti's view of the apparent existence
of external objects is summed up by him in
the Pramānaṿārttika ('Commentary on Logic
and Epistemology'): Cognition experiences
itself, and nothing else whatsoever. Even the
particular objects of perception, are by
nature just consciousness itself.[37]
While some writers like Jay Garfield hold
that Vasubandhu is a metaphysical idealist,
others see him as closer to an epistemic
idealist like Kant who holds that our
knowledge of the world is simply
knowledge of our own concepts and
perceptions of a transcendental world. Sean
Butler, upholding that Yogacara is a form of idealism, albeit its own unique type, notes the similarity of Kant's categories and Yogacara's vāsanās, both of which are simply phenomenal tools with which the mind interprets the noumenal realm.[38] Unlike Kant, however, who holds that the noumenon or thing-in-itself is unknowable to us, Vasubandhu holds that ultimate reality is knowable, but only through non-conceptual yogic perception of a highly trained meditative mind.[36]
Writers like Dan Lusthaus who hold that
Yogacara is not a metaphysical idealism
point out, for example, that Yogācāra
thinkers did not focus on consciousness to
assert it as ontologically real, but simply to
analyze how our experiences, and thus our suffering, are created. As Lusthaus notes: "no
Indian Yogācāra text ever claims that the
world is created by mind. What they do
claim is that we mistake our projected
interpretations of the world for the world
itself, i.e. we take our own mental
constructions to be the world."[39] Lusthaus
notes that there are similarities to Western
epistemic idealists like Kant and Husserl,
enough so that Yogacara can be seen as a
form of epistemological idealism. However
he also notes key differences like the
concepts of karma and nirvana.[39] Saam
Trivedi meanwhile notes the similarities
between epistemic idealism and Yogacara,
but adds that Yogacara Buddhism is in a
sense its own theory.[36]
Similarly, Thomas Kochumuttom sees
Yogacara as "an explanation of experience,
rather than a system of ontology" and Stefan
Anacker sees Vasubandhu's philosophy as a
form of psychology and as a mainly
therapeutic enterprise.[40][41]
Subjective idealism
Subjective idealism (also known as
immaterialism) describes a relationship between
experience and the world in which objects are
no more than collections or bundles of sense
data in the perceiver. Proponents include
Berkeley, Bishop of Cloyne, an Anglo-Irish
philosopher who advanced a theory he called
"immaterialism," later referred to as "subjective
idealism", contending that individuals can only
know sensations and ideas of objects directly,
not abstractions such as "matter", and that ideas
also depend upon being perceived for their very
existence - esse est percipi; "to be is to be
perceived".
Arthur Collier[42] published similar assertions
though there seems to have been no influence
between the two contemporary writers. The only
knowable reality is the represented image of an
external object. Matter as a cause of that image,
is unthinkable and therefore nothing to us. An
external world as absolute matter unrelated to an
observer does not exist as far as we are
concerned. The universe cannot exist as it
appears if there is no perceiving mind. Collier
was influenced by An Essay Towards the
Theory of the Ideal or Intelligible World by
Cambridge Platonist John Norris (1701).
Paul Brunton, a British philosopher, mystic,
traveler, and guru, taught a type of idealism
called "mentalism," similar to that of Bishop
Berkeley, proposing a master world-image,
projected or manifested by a world-mind, and an
infinite number of individual minds
participating. A tree does not cease to exist if
nobody sees it because the world-mind is
projecting the idea of the tree to all minds.[43]
Epistemological idealism is a subjectivist
position in epistemology that holds that what
one knows about an object exists only in one's
mind. Proponents include Brand Blanshard.
A. A. Luce[44] and John Foster are other
subjectivists.[45] Luce, in Sense without Matter
(1954), attempts to bring Berkeley up to date by
modernizing his vocabulary and putting the
issues he faced in modern terms, and treats the
Biblical account of matter and the psychology
of perception and nature. Foster's The Case for
Idealism argues that the physical world is the
logical creation of natural, non-logical
constraints on human sense-experience. Foster's
latest defense of his views (phenomenalistic
idealism) is in his book A World for Us: The
Case for Phenomenalistic Idealism.
Critiques of subjective idealism include Bertrand Russell's popular 1912 book The Problems of Philosophy; its critics also include the Australian philosopher David Stove,[46] Alan Musgrave,[47] and John Searle.[48]
Transcendental idealism
Main article: Transcendental idealism
Transcendental idealism, founded by Immanuel
Kant in the eighteenth century, maintains that
the mind shapes the world we perceive into the
form of space-and-time.
... if I remove the thinking subject, the whole
material world must at once vanish because it is
nothing but a phenomenal appearance in the
sensibility of ourselves as a subject, and a
manner or species of representation.
— Critique of Pure Reason A383
The second edition (1787) of the Critique contained a Refutation
of Idealism to distinguish his transcendental
idealism from Descartes's Sceptical Idealism
and Berkeley's anti-realist strain of Subjective
Idealism. The section Paralogisms of Pure
Reason is an implicit critique of Descartes's
idealism. Kant says that it is not possible to infer
the 'I' as an object (Descartes' cogito ergo sum)
purely from "the spontaneity of thought". Kant
focused on ideas drawn from British
philosophers such as Locke, Berkeley and
Hume but distinguished his transcendental or
critical idealism from previous varieties:
The dictum of all genuine idealists, from the
Eleatic school to Bishop Berkeley, is contained
in this formula: "All knowledge through the
senses and experience is nothing but sheer
illusion, and only in the ideas of the pure
understanding and reason is there truth." The
principle that throughout dominates and
determines my [transcendental] idealism is, on
the contrary: "All knowledge of things merely
from pure understanding or pure reason is
nothing but sheer illusion, and only in
experience is there truth."
— Prolegomena, 374
Kant distinguished between things as they
appear to an observer and things in themselves,
"that is, things considered without regard to
whether and how they may be given to us".[49]
We cannot approach the noumenon, the "thing
in Itself" (German: Ding an sich) without our
own mental world. He added that the mind is not a blank slate (tabula rasa) but rather comes equipped with categories for organising our sense impressions.
In the first volume of his Parerga and
Paralipomena, Schopenhauer wrote his "Sketch
of a History of the Doctrine of the Ideal and the
Real". He defined the ideal as being mental
pictures that constitute subjective knowledge.
The ideal, for him, is what can be attributed to
our own minds. The images in our head are
what comprise the ideal. Schopenhauer
emphasized that we are restricted to our own
consciousness. The world that appears is only a
representation or mental picture of objects. We
directly and immediately know only
representations. All objects that are external to
the mind are known indirectly through the
mediation of our mind. He offered a history of
the concept of the "ideal" as "ideational" or
"existing in the mind as an image".
[T]rue philosophy must at all costs be idealistic;
indeed, it must be so merely to be honest. For
nothing is more certain than that no one ever
came out of himself in order to identify himself
immediately with things different from him; but
everything of which he has certain, sure, and
therefore immediate knowledge, lies within his
consciousness. Beyond this consciousness,
therefore, there can be no immediate certainty ...
There can never be an existence that is objective
absolutely and in itself; such an existence,
indeed, is positively inconceivable. For the
objective, as such, always and essentially has its
existence in the consciousness of a subject; it is
therefore the subject's representation, and
consequently is conditioned by the subject, and
moreover by the subject's forms of
representation, which belong to the subject and
not to the object.
— The World as Will and Representation, Vol.
II, Ch. 1
Charles Bernard Renouvier was the first
Frenchman after Nicolas Malebranche to
formulate a complete idealistic system, and had
a vast influence on the development of French
thought. His system is based on Immanuel
Kant's, as his chosen term "néo-criticisme"
indicates; but it is a transformation rather than a
continuation of Kantianism.
Friedrich Nietzsche argued that Kant commits
an agnostic tautology and does not offer a
satisfactory answer as to the source of a
philosophical right to such-or-other
metaphysical claims; he ridicules his pride in
tackling "the most difficult thing that could ever
be undertaken on behalf of metaphysics."[50] The
famous "thing-in-itself" was called a product of
philosophical habit, which seeks to introduce a
grammatical subject: because wherever there is
cognition, there must be a thing that is cognized
and allegedly it must be added to ontology as a
being (whereas, to Nietzsche, only the world as
ever changing appearances can be assumed).[51]
Yet he attacks the idealism of Schopenhauer and
Descartes with an argument similar to Kant's
critique of the latter (see above).[52]
Objective idealism
Main article: Objective idealism
Objective idealism asserts that the reality of
experiencing combines and transcends the
realities of the object experienced and of the
mind of the observer.[53] Proponents include
Thomas Hill Green, Josiah Royce, Benedetto
Croce and Charles Sanders Peirce.
Absolute idealism
Main article: Absolute idealism
Schelling (1775–1854) claimed that Fichte's
"I" needs the Not-I, because there is no subject
without object, and vice versa. So there is no
difference between the subjective and the
objective, that is, the ideal and the real. This is
Schelling's "absolute identity": the ideas or
mental images in the mind are identical to the
extended objects which are external to the mind.
Absolute idealism is G. W. F. Hegel's account
of how existence is comprehensible as an all-inclusive whole. Hegel called his philosophy
"absolute" idealism in contrast to the "subjective
idealism" of Berkeley and the "transcendental
idealism" of Kant and Fichte,[54] which were not
based on a critique of the finite and a dialectical
philosophy of history as Hegel's idealism was.
The exercise of reason and intellect enables the
philosopher to know ultimate historical reality,
the phenomenological constitution of self-determination, the dialectical development of
self-awareness and personality in the realm of
History.
In his Science of Logic (1812–1814) Hegel
argues that finite qualities are not fully "real"
because they depend on other finite qualities to
determine them. Qualitative infinity, on the
other hand, would be more self-determining and
hence more fully real. Similarly finite natural
things are less "real"—because they are less
self-determining—than spiritual things like
morally responsible people, ethical communities
and God. So any doctrine, such as materialism,
that asserts that finite qualities or natural objects
are fully real is mistaken.[55]
Hegel certainly intends to preserve what he
takes to be true of German idealism, in
particular Kant's insistence that ethical reason
can and does go beyond finite inclinations.[56]
For Hegel there must be some identity of
thought and being for the "subject" (any human
observer) to be able to know any observed
"object" (any external entity, possibly even
another human) at all. Under Hegel's concept of
"subject-object identity," subject and object both
have Spirit (Hegel's ersatz, redefined,
nonsupernatural "God") as their conceptual (not
metaphysical) inner reality—and in that sense
are identical. But until Spirit's "self-realization"
occurs and Spirit graduates from Spirit to
Absolute Spirit status, subject (a human mind)
mistakenly thinks every "object" it observes is
something "alien," meaning something separate
or apart from "subject." In Hegel's words, "The
object is revealed to it [to "subject"] by [as]
something alien, and it does not recognize
itself."[57] Self-realization occurs when Hegel
(part of Spirit's nonsupernatural Mind, which is
the collective mind of all humans) arrives on the
scene and realizes that every "object" is himself,
because both subject and object are essentially
Spirit. When self-realization occurs and Spirit
becomes Absolute Spirit, the "finite" (man,
human) becomes the "infinite" ("God," divine),
replacing the imaginary or "picture-thinking"
supernatural God of theism: man becomes
God.[58] Robert C. Tucker puts it this way: "Hegelianism . .
. is a religion of self-worship whose
fundamental theme is given in Hegel's image of
the man who aspires to be God himself, who
demands 'something more, namely infinity.'"
The picture Hegel presents is "a picture of a
self-glorifying humanity striving compulsively,
and at the end successfully, to rise to
divinity."[59]
Kierkegaard criticized Hegel's idealist
philosophy in several of his works, particularly
his claim to a comprehensive system that could
explain the whole of reality. Where Hegel
argues that an ultimate understanding of the
logical structure of the world is an
understanding of the logical structure of God's
mind, Kierkegaard asserts that for God reality
can be a system but it cannot be so for any
human individual because both reality and
humans are incomplete and all philosophical
systems imply completeness. For Kierkegaard, a logical system is possible but an existential system is not, whereas for Hegel "What is rational is actual; and what is actual is rational".[60] Hegel's absolute
idealism blurs the distinction between existence
and thought: our mortal nature places limits on
our understanding of reality;
So-called systems have often been characterized
and challenged in the assertion that they
abrogate the distinction between good and evil,
and destroy freedom. Perhaps one would
express oneself quite as definitely, if one said
that every such system fantastically dissipates
the concept existence. ... Being an individual
man is a thing that has been abolished, and
every speculative philosopher confuses himself
with humanity at large; whereby he becomes
something infinitely great, and at the same time
nothing at all.[61]
A major concern of Hegel's Phenomenology of
Spirit (1807) and of the philosophy of Spirit that
he lays out in his Encyclopedia of the
Philosophical Sciences (1817–1830) is the
interrelation between individual humans, which
he conceives in terms of "mutual recognition."
However, what Climacus (Kierkegaard's pseudonymous author) means by the aforementioned statement is that Hegel, in the Philosophy of Right, believed the best solution
was to surrender one's individuality to the
customs of the State, identifying right and
wrong in view of the prevailing bourgeois
morality. Individual human will ought, at the
State's highest level of development, to properly
coincide with the will of the State. Climacus
rejects Hegel's suppression of individuality by
pointing out it is impossible to create a valid set
of rules or system in any society which can
adequately describe existence for any one
individual. Submitting one's will to the State
denies personal freedom, choice, and
responsibility.
In addition, Hegel does believe we can know the
structure of God's mind, or ultimate reality.
Hegel agrees with Kierkegaard that both reality
and humans are incomplete, inasmuch as we are
in time, and reality develops through time. But
the relation between time and eternity is outside
time and this is the "logical structure" that Hegel
thinks we can know. Kierkegaard disputes this
assertion, because it eliminates the clear
distinction between ontology and epistemology.
Existence and thought are not identical and one
cannot possibly think existence. Thought is
always a form of abstraction, and thus not only
is pure existence impossible to think, but all
forms in existence are unthinkable; thought
depends on language, which merely abstracts
from experience, thus separating us from lived
experience and the living essence of all beings.
In addition, because we are finite beings, we
cannot possibly know or understand anything
that is universal or infinite such as God, so we
cannot know God exists, since that which
transcends time simultaneously transcends
human understanding.
F. H. Bradley saw reality as a monistic whole
apprehended through "feeling", a state in which
there is no distinction between the perception
and the thing perceived. Like Berkeley, Bradley
thought that nothing can be known to exist
unless it is known by a mind.
We perceive, on reflection, that to be real, or
even barely to exist, must be to fall within
sentience ... Find any piece of existence, take up
anything that any one could possibly call a fact,
or could in any sense assert to have being, and
then judge if it does not consist in sentient
experience. Try to discover any sense in which
you can still continue to speak of it, when all
perception and feeling have been removed; or
point out any fragment of its matter, any aspect
of its being, which is not derived from and is not
still relative to this source. When the experiment
is made strictly, I can myself conceive of
nothing else than the experienced.
— F. H. Bradley, Appearance and Reality,
chapter 14
Bradley was the apparent target of G.E. Moore's
radical rejection of idealism. Moore claimed
that Bradley did not understand the statement
that something is real. We know for certain,
through common sense and prephilosophical
beliefs, that some things are real, whether they
are objects of thought or not, according to
Moore. The 1903 article The Refutation of
Idealism is one of the first demonstrations of
Moore's commitment to analysis. He examines
each of the three terms in the Berkeleian
aphorism esse est percipi, "to be is to be
perceived", finding that it must mean that the
object and the subject are necessarily connected
so that "yellow" and "the sensation of yellow"
are identical - "to be yellow" is "to be
experienced as yellow". But it also seems there
is a difference between "yellow" and "the
sensation of yellow" and "that esse is held to be
percipi, solely because what is experienced is
held to be identical with the experience of it".
Though far from a complete refutation, this was
the first strong statement by analytic philosophy
against its idealist predecessors, or at any rate
against the type of idealism represented by
Berkeley.
Actual idealism
Actual idealism is a form of idealism developed by Giovanni Gentile that grew into a "grounded" idealism contrasting with Kant and Hegel. The idea applies a version of Occam's razor: the simpler explanation is to be preferred. Actual idealism is the idea that reality is the ongoing act of thinking, or in Italian "pensiero pensante".[62] Any action done by humans is classified as human thought because the action was done due to predisposed thought. Gentile further believed that thoughts are the only things that truly exist, since reality is defined through the act of thinking. This idea was derived from
Gentile's paper, "The Theory of Mind As Pure
Act".[63]
Since thoughts are actions, any conjectured idea
can be enacted. This idea not only affects the
individual's life, but everyone around them,
which in turn affects the state since the people
are the state.[64] Therefore, thoughts of each
person are subsumed within the state. The state
is a composition of many minds that come
together to change the country for better or
worse.
Gentile theorizes that thoughts can only be conjectured within the bounds of known reality; abstract thinking does not exist.[63] Thoughts cannot be formed outside our known reality because we ourselves are the reality that halts us from thinking externally. In accordance with "The Act of Thought of Pure Thought", our actions comprise our thoughts, our thoughts create perception, perceptions define reality, and thus we think within our created reality.
The present act of thought is reality, but the past is not reality; it is history. The reason is that the past can be rewritten through present knowledge and the present perspective on the event. The reality that is currently constructed can be completely changed through language (e.g. bias in omission, source, or tone).[64] The unreliability of recorded reality can skew the original concept and make a remark about the past unreliable. Actual idealism is regarded as a liberal and tolerant doctrine, since it acknowledges that every being pictures reality, and the ideas hatched within it, differently, even though reality itself is a figment of thought.
Even though the core concept of the theory is known for its simplicity, its application is regarded as extremely ambiguous. Over the years, philosophers have interpreted it in numerous different ways:[65] Holmes took it as a metaphysics of the thinking act; Betti as a form of hermeneutics; Harris as a metaphysics of democracy; Fogu as a modernist philosophy of history.
Giovanni Gentile was a key supporter of fascism, regarded by many as the "philosopher of fascism". Gentile's philosophy was, for many who supported and embraced fascism, the key to understanding it. They believed that if the a priori synthesis of subject and object is true, there is no difference between the individuals in society; they are all one, which means that they have equal rights, roles, and jobs. In the fascist state, submission is given to one leader because individuals act as one body. In Gentile's view, far more can be accomplished when individuals are organized under a corporate body than as a collection of autonomous individuals.[64]
Pluralistic idealism
Pluralistic idealism such as that of Gottfried
Leibniz[66][67] takes the view that there are many
individual minds that together underlie the
existence of the observed world and make
possible the existence of the physical
universe.[68] Unlike absolute idealism, pluralistic
idealism does not assume the existence of a
single ultimate mental reality or "Absolute".
Leibniz's form of idealism, known as panpsychism, views "monads" as the true atoms of the universe and as entities having perception. The monads are "substantial forms of being": elemental, individual, subject to their own laws, non-interacting, each reflecting the entire universe. Monads are centers of force; force is substance, while space, matter and motion are phenomenal, and their form and existence depend on the simple and immaterial monads. There is a pre-established harmony, instituted by God, the central monad, between the world in the minds of the monads and the external world of objects. Leibniz's cosmology
embraced traditional Christian theism. The
English psychologist and philosopher James
Ward, inspired by Leibniz, also defended a
form of pluralistic idealism.[69] According to
Ward the universe is composed of "psychic
monads" of different levels, interacting for
mutual self-betterment.[70]
Personalism is the view that the minds that
underlie reality are the minds of persons.
Borden Parker Bowne, a philosopher at Boston
University, a founder and popularizer of
personal idealism, presented it as a substantive
reality of persons, the only reality, as known
directly in self-consciousness. Reality is a
society of interacting persons dependent on the
Supreme Person of God. Other proponents
include George Holmes Howison[71] and
J. M. E. McTaggart.[72]
Howison's personal idealism[73] was also called
"California Personalism" by others to
distinguish it from the "Boston Personalism"
which was of Bowne. Howison maintained that
both impersonal, monistic idealism and
materialism run contrary to the experience of
moral freedom. To deny freedom to pursue
truth, beauty, and "benignant love" is to
undermine every profound human venture,
including science, morality, and philosophy.
Personalistic idealists Borden Parker Bowne and
Edgar S. Brightman and realistic (in some
senses of the term, though he remained
influenced by neoplatonism) personal theist
Saint Thomas Aquinas address a core issue,
namely that of dependence upon an infinite
personal God.[74]
Howison, in his book The Limits of Evolution
and Other Essays Illustrating the Metaphysical
Theory of Personal Idealism, created a
democratic notion of personal idealism that
extended all the way to God, who was no more
the ultimate monarch but the ultimate democrat
in eternal relation to other eternal persons. J. M.
E. McTaggart's idealist atheism and Thomas
Davidson's apeirotheism resemble Howison's
personal idealism.[75]
J. M. E. McTaggart argued that minds alone
exist and only relate to each other through love.
Space, time and material objects are unreal. In
The Unreality of Time he argued that time is an
illusion because it is impossible to produce a
coherent account of a sequence of events. The
Nature of Existence (1927) contained his
arguments that space, time, and matter cannot
possibly be real. In his Studies in Hegelian Cosmology (Cambridge, 1901, p. 196) he declared that metaphysics is not relevant to
social and political action. McTaggart "thought
that Hegel was wrong in supposing that
metaphysics could show that the state is more
than a means to the good of the individuals who
compose it".[76] For McTaggart "philosophy can
give us very little, if any, guidance in action...
Why should a Hegelian citizen be surprised that
his belief as to the organic nature of the
Absolute does not help him in deciding how to
vote? Would a Hegelian engineer be reasonable
in expecting that his belief that all matter is
spirit should help him in planning a bridge?"[77]
Thomas Davidson taught a philosophy called
"apeirotheism", a "form of pluralistic
idealism...coupled with a stern ethical
rigorism"[78] which he defined as "a theory of
Gods infinite in number." The theory was
indebted to Aristotle's pluralism and his
concepts of Soul, the rational, living aspect of a
living substance which cannot exist apart from
the body because it is not a substance but an
essence, and nous, rational thought, reflection
and understanding. Although a perennial source
of controversy, Aristotle arguably views the
latter as both eternal and immaterial in nature,
as exemplified in his theology of unmoved
movers.[79] Identifying Aristotle's God with
rational thought, Davidson argued, contrary to
Aristotle, that just as the soul cannot exist apart
from the body, God cannot exist apart from the
world.[80]
Idealist notions took a strong hold among
physicists of the early 20th century confronted
with the paradoxes of quantum physics and the
theory of relativity. In The Grammar of Science,
Preface to the 2nd Edition, 1900, Karl Pearson
wrote, "There are many signs that a sound
idealism is surely replacing, as a basis for
natural philosophy, the crude materialism of the
older physicists." This book influenced
Einstein's regard for the importance of the
observer in scientific measurements.[81] In § 5 of
that book, Pearson asserted that "...science is in
reality a classification and analysis of the
contents of the mind..." Also, "...the field of
science is much more consciousness than an
external world."
Arthur Eddington, a British astrophysicist of the
early 20th century, wrote in his book The
Nature of the Physical World that "The stuff of
the world is mind-stuff":
The mind-stuff of the world is, of course,
something more general than our individual
conscious minds... The mind-stuff is not spread
in space and time; these are part of the cyclic
scheme ultimately derived out of it... It is
necessary to keep reminding ourselves that all
knowledge of our environment from which the
world of physics is constructed, has entered in
the form of messages transmitted along the
nerves to the seat of consciousness...
Consciousness is not sharply defined, but fades
into subconsciousness; and beyond that we must
postulate something indefinite but yet
continuous with our mental nature... It is
difficult for the matter-of-fact physicist to
accept the view that the substratum of
everything is of mental character. But no one
can deny that mind is the first and most direct
thing in our experience, and all else
is remote inference.[82]
The 20th-century British scientist Sir James Jeans wrote that "the Universe begins to look more like a great thought than like a great machine".
Ian Barbour in his book Issues in Science and Religion (1966), p. 133, cites Arthur Eddington's The Nature of the Physical World (1928) as a text that argues the Heisenberg uncertainty principle provides a scientific basis for "the defense of the idea of human freedom", and his Science and the Unseen World (1929) as support for philosophical idealism, "the thesis that reality is basically mental".
Sir James Jeans wrote: "The stream of
knowledge is heading towards a non-mechanical
reality; the Universe begins to look more like a
great thought than like a great machine. Mind
no longer appears to be an accidental intruder
into the realm of matter... we ought rather hail it
as the creator and governor of the realm of
matter."[83]
Jeans, in an interview published in The
Observer (London), when asked the question:
"Do you believe that life on this planet is the
result of some sort of accident, or do you
believe that it is a part of some great scheme?"
replied:
I incline to the idealistic theory that
consciousness is fundamental, and that the
material universe is derivative from
consciousness, not consciousness from the
material universe... In general the universe
seems to me to be nearer to a great thought than
to a great machine. It may well be, it seems to
me, that each individual consciousness ought to
be compared to a brain-cell in a universal mind.
Addressing the British Association in 1934,
Jeans said:
What remains is in any case very different from
the full-blooded matter and the forbidding
materialism of the Victorian scientist. His
objective and material universe is proved to
consist of little more than constructs of our own
minds. To this extent, then, modern physics has
moved in the direction of philosophic idealism.
Mind and matter, if not proved to be of similar
nature, are at least found to be ingredients of
one single system. There is no longer room for
the kind of dualism which has haunted
philosophy since the days of Descartes.[84]
In The Universe Around Us, Jeans writes:
Finite picture whose dimensions are a certain
amount of space and a certain amount of time;
the protons and electrons are the streaks of paint
which define the picture against its space-time
background. Traveling as far back in time as we
can, brings us not to the creation of the picture,
but to its edge; the creation of the picture lies as
much outside the picture as the artist is outside
his canvas. On this view, discussing the creation
of the universe in terms of time and space is like
trying to discover the artist and the action of
painting, by going to the edge of the canvas.
This brings us very near to those philosophical
systems which regard the universe as a thought
in the mind of its Creator, thereby reducing all
discussion of material creation to futility.[85]
The chemist Ernest Lester Smith wrote a book
Intelligence Came First (1975) in which he
claimed that consciousness is a fact of nature
and that the cosmos is grounded in and pervaded
by mind and intelligence.[86]
Bernard d'Espagnat, a French theoretical
physicist best known for his work on the nature
of reality, wrote a paper titled The Quantum
Theory and Reality. According to the paper:
The doctrine that the world is made up of
objects whose existence is independent of
human consciousness turns out to be in conflict
with quantum mechanics and with facts
established by experiment.[87]
In a Guardian article entitled "Quantum
Weirdness: What We Call 'Reality' is Just a
State of Mind",[88] d'Espagnat wrote:
What quantum mechanics tells us, I believe, is
surprising to say the least. It tells us that the
basic components of objects – the particles,
electrons, quarks etc. – cannot be thought of as
'self-existent'.
He further writes that his research in quantum
physics has led him to conclude that an
"ultimate reality" exists, which is not embedded
in space or time.[89]
Brainstorming
Brainstorming is a group creativity technique
by which efforts are made to find a conclusion
for a specific problem by gathering a list of
ideas spontaneously contributed by its
members.[citation needed]
In other words, brainstorming is a situation
where a group of people meet to generate new
ideas and solutions around a specific domain of
interest by removing inhibitions. People are able
to think more freely and they suggest as many
spontaneous new ideas as possible. All the ideas
are noted down without criticism and after the
brainstorming session the ideas are
evaluated.[citation needed]
The term was popularized by Alex Faickney
Osborn in the classic work Applied Imagination
(1953).[1]
Once a new product has passed
through the screening process, the next step is to
conduct a business analysis. Business analysis is
a basic assessment of a product's compatibility
in the marketplace and its potential profitability.
Both the size of the market and competing
products are often studied at this point. The
most important question relates to market
demand: How will a product affect a firm's
sales, costs, and profits? If a product survives
the first three steps, it is developed into a
prototype that should reveal the intangible
attributes it possesses as perceived by the
consumer.
Osborn's
method
Flowchart for conducting a
brainstorming session.
Two principles
Osborn claimed that two principles
contribute to "ideative efficacy":
1. Defer judgment;
2. Reach for quantity.[8]
Four rules
Following these two principles were his four general rules of brainstorming, established with the intention to:[citation needed]
• reduce social inhibitions among group members;
• stimulate idea generation;
• increase overall creativity of the group.
These four rules were:
1. Go for quantity: This rule is a way
of enhancing divergent production,
aiming at facilitation of problem
solution through the maxim quantity
breeds quality. The assumption is that
the greater the number of ideas
generated the bigger the chance of
producing a radical and effective
solution.[citation needed]
2. Withhold criticism: In
brainstorming, criticism of ideas
generated should be put 'on hold'.
Instead, participants should focus on
extending or adding to ideas,
reserving criticism for a later 'critical
stage' of the process. By suspending
judgment, participants will feel free to
generate unusual ideas.[citation needed]
3. Welcome wild ideas: To get a good
long list of suggestions, wild ideas are
encouraged. They can be generated by
looking from new perspectives and
suspending assumptions. These new
ways of thinking might give better
solutions.[citation needed]
4. Combine and improve ideas: As suggested by the slogan "1+1=3", combining ideas is believed to stimulate the building of ideas by a process of association.[8]
Applications
Osborn notes that brainstorming should
address a specific question; he held that
sessions addressing multiple questions
were inefficient.[citation needed]
Further, the problem must require the
generation of ideas rather than judgment;
he uses examples such as generating
possible names for a product as proper
brainstorming material, whereas
analytical judgments such as whether or
not to marry do not have any need for
brainstorming.[8]
Groups
Osborn envisioned groups of around 12
participants, including both experts and
novices. Participants are encouraged to
provide wild and unexpected answers.
Ideas receive no criticism or discussion.
The group simply provides ideas that might lead to a solution and applies no analytical judgment as to their feasibility.
The judgments are reserved for a later
date.[citation needed]
Variations
Nominal group technique
Main article: Nominal group technique
Participants are asked to write their ideas
anonymously. Then the facilitator
collects the ideas and the group votes on
each idea. The vote can be as simple as a
show of hands in favor of a given idea.
This process is called distillation.[citation
needed]
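As a rough illustration of the distillation step described above, here is a minimal Python sketch (not taken from any cited source); the sample ideas and vote counts are invented placeholders.

# Minimal sketch of the nominal group technique's "distillation" step:
# ideas are collected anonymously, then ranked by a simple vote count.
def distill(ideas_with_votes, top_n=3):
    """Return the top_n ideas ordered by number of votes."""
    ranked = sorted(ideas_with_votes.items(), key=lambda kv: kv[1], reverse=True)
    return [idea for idea, votes in ranked[:top_n]]

if __name__ == "__main__":
    # Each idea is recorded without attribution; votes are a show of hands.
    votes = {"larger battery": 5, "brighter screen": 2, "lower price": 7, "new color": 1}
    print(distill(votes))   # ['lower price', 'larger battery', 'brighter screen']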
After distillation, the top-ranked ideas
may be sent back to the group or to
subgroups for further brainstorming. For
example, one group may work on the
color required in a product. Another
group may work on the size, and so forth.
Each group will come back to the whole
group for ranking the listed ideas.
Sometimes ideas that were previously
dropped may be brought forward again
once the group has re-evaluated the
ideas.[citation needed]
It is important that the facilitator is
trained in this process before attempting
to facilitate this technique. The group
should be primed and encouraged to
embrace the process. Like all team
efforts, it may take a few practice
sessions to train the team in the method
before tackling the important ideas.[citation
needed]
Group passing technique
Each person in a circular group writes
down one idea, and then passes the piece
of paper to the next person, who adds
some thoughts. This continues until
everybody gets his or her original piece
of paper back. By this time, it is likely
that the group will have extensively
elaborated on each idea.[citation needed]
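To make the passing mechanics concrete, the following is a minimal Python sketch, assuming a fixed circle of participants; the names, seed ideas, and the add_thought callback are invented placeholders.

# Minimal sketch of the group passing technique: each participant starts a
# sheet with one idea, and the sheets rotate around the circle so everyone
# adds a thought to every sheet before it returns to its originator.
def group_passing(participants, seed_ideas, add_thought):
    sheets = {p: [idea] for p, idea in zip(participants, seed_ideas)}
    n = len(participants)
    for round_no in range(1, n):                        # n-1 passes bring each sheet home
        for i, person in enumerate(participants):
            owner = participants[(i - round_no) % n]    # whose sheet this person holds now
            sheets[owner].append(add_thought(person, sheets[owner]))
    return sheets

if __name__ == "__main__":
    people = ["Ana", "Ben", "Chloe"]
    seeds = ["reusable packaging", "home delivery", "loyalty points"]
    result = group_passing(people, seeds,
                           lambda person, sheet: f"{person}: builds on '{sheet[-1]}'")
    for owner, sheet in result.items():
        print(owner, sheet)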
The group may also create an "idea
book" and post a distribution list or
routing slip to the front of the book. On
the first page is a description of the
problem. The first person to receive the
book lists his or her ideas and then routes
the book to the next person on the
distribution list. The second person can
log new ideas or add to the ideas of the
previous person. This continues until the
distribution list is exhausted. A follow-up
"read out" meeting is then held to discuss
the ideas logged in the book. This
technique takes longer, but it allows
individuals time to think deeply about the
problem.[citation needed]
Team idea mapping method
This method of brainstorming works by
the method of association. It may
improve collaboration and increase the
quantity of ideas, and is designed so that
all attendees participate and no ideas are
rejected.[citation needed]
The process begins with a well-defined
topic. Each participant brainstorms
individually, then all the ideas are
merged onto one large idea map. During
this consolidation phase, participants
may discover a common understanding
of the issues as they share the meanings
behind their ideas. During this sharing,
new ideas may arise by the association,
and they are added to the map as well.
Once all the ideas are captured, the group
can prioritize and/or take action.[9]
Directed brainstorming
Directed brainstorming is a variation of
electronic brainstorming (described
below). It can be done manually or with
computers. Directed brainstorming works
when the solution space (that is, the set
of criteria for evaluating a good idea) is
known prior to the session. If known,
those criteria can be used to constrain the
ideation process intentionally.[citation needed]
In directed brainstorming, each
participant is given one sheet of paper (or
electronic form) and told the
brainstorming question. They are asked
to produce one response and stop, then
all of the papers (or forms) are randomly
swapped among the participants. The
participants are asked to look at the idea
they received and to create a new idea
that improves on that idea based on the
initial criteria. The forms are then
swapped again and respondents are asked
to improve upon the ideas, and the
process is repeated for three or more
rounds.[citation needed]
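A minimal Python sketch of these swap-and-improve rounds might look like the following; the improve callback, sample ideas, and the criterion string are invented placeholders rather than part of any published procedure.

# Minimal sketch of directed brainstorming: each participant writes one
# response, the forms are shuffled among participants, and everyone improves
# the idea they received against the known evaluation criteria.
import random

def directed_brainstorming(initial_ideas, improve, rounds=3):
    forms = list(initial_ideas)
    for _ in range(rounds):
        random.shuffle(forms)                         # swap forms among participants
        forms = [improve(idea) for idea in forms]     # each person refines the idea received
    return forms

if __name__ == "__main__":
    ideas = ["cheaper shipping", "faster checkout", "clearer returns policy"]
    criterion = "must be deliverable within one quarter"
    print(directed_brainstorming(ideas, lambda i: f"{i} ({criterion})"))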
In the laboratory, directed brainstorming
has been found to almost triple the
productivity of groups over electronic
brainstorming.[10]
Guided brainstorming
A guided brainstorming session is time
set aside to brainstorm either individually
or as a collective group about a particular
subject under the constraints of
perspective and time. This type of
brainstorming removes all cause for
conflict and constrains conversations
while stimulating critical and creative
thinking in an engaging, balanced
Participants are asked to adopt different
mindsets for a pre-defined period of time while contributing their ideas to a central mind map drawn by a pre-appointed
scribe. Having examined the subject from multiple perspectives, participants can more readily see the simple solutions that collectively create greater growth. Actions are then assigned individually.
Following a guided brainstorming session, participants emerge with ideas ranked for further brainstorming or research, a record of the questions that remain unanswered, and a prioritized, assigned, actionable list that leaves everyone with a clear understanding of what needs to happen next and a shared view of the group's combined future focus and larger goals.[citation needed]
Individual brainstorming
Individual brainstorming is the use of
brainstorming in solitary situations. It
typically includes such techniques as free
writing, free speaking, word association,
and drawing a mind map, which is a
visual note taking technique in which
people diagram their thoughts. Individual
brainstorming is a useful method in
creative writing and has been shown to
be superior to traditional group
brainstorming.[11][12]
Question brainstorming
This process involves brainstorming the
questions, rather than trying to come up
with immediate answers and short-term
solutions. Theoretically, this technique
should not inhibit participation as there is
no need to provide solutions. The
answers to the questions form the
framework for constructing future action
plans. Once the list of questions is set, it
may be necessary to prioritize them to
reach the best solution in an orderly
way.[13]
"Questorming" is another term for this
mode of inquiry.[14]
Methods to
improve
brainstorming
sessions
Groups can improve the effectiveness
and quality of their brainstorming
sessions in a number of ways.[15]
• Avoid face-to-face groups: Using face-to-face groups can increase production blocking, evaluation apprehension, social matching and social loafing.
• Stick to the rules: Brainstorming rules should be followed, and feedback should be given to members who violate these rules. Violations of brainstorming rules tend to lead to mediocre ideas.
• Pay attention to everyone's ideas: People tend to pay more attention to their own ideas; however, brainstorming requires exposure to the ideas of others. A method to encourage members to pay attention to others' ideas is to make them list the ideas out or ask them to repeat others' ideas.
• Include both individual and group approaches: One method that helps members integrate their ideas into the group is brainwriting, in which members write their ideas on a piece of paper and then pass it along to others, who add their own ideas.
• Take breaks: Allow silence during group discussions so that members have time to think things through.
• Do not rush: Allow plenty of time for members to complete the task. Although working under pressure tends to lead to more solutions initially, the quality is usually lower than if more time is spent on the task.
• Stay persistent: Members should stay focused and persist at the task even when productivity is low.
• Facilitate the session: A skilled discussion leader should lead and coordinate the brainstorming sessions. This leader can motivate members, correct mistakes, and provide a clear standard of work. They can also keep track of all the ideas and make sure that these ideas are available to everyone.
Alternatives to
brainstorming
If brainstorming does not work for a
group, some alternatives are available:[15]
• Buzzgroups: Larger groups can form subgroups that come up with ideas when the larger group is stumped. Afterwards, these subgroups come back together and discuss their ideas as a whole group.
• Bug list: Group members write down all the little problems or irritations concerning the issue they are working on, and then the group discusses solutions for each of these "bugs".
• Stepladder technique: A method where new members state their ideas before listening to the group's position.
• Synectics: A leader guides the group and discusses their goals, wishes, and frustrations using analogies, metaphors, and fantasy.
• TRIZ: This method is primarily used in science and engineering, and involves following a specific sequence of problem analysis, resource review, goal setting, and review of prior approaches to the problem.
Electronic
brainstorming
See also: Brainstorming software,
Electronic meeting system, and
Computer supported brainstorming
Although brainstorming can take
place online through commonly available
technologies such as email or interactive
web sites, there have also been many
efforts to develop customized computer
software that can either replace or
enhance one or more manual elements of
the brainstorming process.[citation needed]
Early efforts, such as GroupSystems at
University of Arizona[16] or Software
Aided Meeting Management (SAMM)
system at the University of Minnesota,[17]
took advantage of then-new computer
networking technology, which was
installed in rooms dedicated to computer
supported meetings. When using these
electronic meeting systems (EMS, as
they came to be called), group members
simultaneously and independently
entered ideas into a computer terminal.
The software collected (or "pooled") the
ideas into a list, which could be
displayed on a central projection screen
(anonymized if desired). Other elements
of these EMSs could support additional
activities such as categorization of ideas,
elimination of duplicates, assessment and
discussion of prioritized or controversial
ideas. Later EMSs capitalized on
advances in computer networking and
internet protocols to support
asynchronous brainstorming sessions
over extended periods of time and in
multiple locations.
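As a rough sketch of the pooling behaviour described above (not the actual GroupSystems or SAMM code), the following Python fragment anonymizes submissions and drops duplicates; the author names and ideas are invented placeholders.

# Minimal sketch of the idea "pooling" an electronic meeting system performs:
# submissions arrive tagged with their author, and the pooled list shown on
# the shared screen strips the names and removes duplicates.
def pool_ideas(submissions):
    """submissions: list of (author, idea). Returns a de-duplicated, anonymized list."""
    seen, pooled = set(), []
    for _author, idea in submissions:
        key = idea.strip().lower()
        if key not in seen:              # eliminate duplicate ideas
            seen.add(key)
            pooled.append(idea.strip())
    return pooled

if __name__ == "__main__":
    submitted = [("ana", "Open a web store"), ("ben", "open a web store"), ("chloe", "Add a mobile app")]
    for line in pool_ideas(submitted):
        print(line)                      # anonymized list for the central projection screen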
Introduced along with the EMS by
Nunamaker and colleagues at University
of Arizona[16] was electronic
brainstorming (EBS). By utilizing
customized computer software for groups
(group decision support systems or
groupware), EBS can replace face-to-
face brainstorming.[18] An example of groupware is GroupSystems, software developed at the University of Arizona.[16] After an idea or discussion topic has
been posted on GroupSystems, it is
displayed on each group member's
computer. As group members
simultaneously type their comments on
separate computers, those comments are
anonymously pooled and made available
to all group members for evaluation and
further elaboration.[16]
Compared to face-to-face brainstorming, not only does EBS enhance efficiency by eliminating travelling and turn-taking during group discussions, it also excludes several psychological constraints associated with face-to-face meetings. As identified by Gallupe and colleagues,[18] both production blocking (reduced idea generation due to turn-taking and forgetting ideas in face-to-face brainstorming)[19] and evaluation apprehension (a general concern experienced by individuals for how others in their presence are evaluating them) are reduced in EBS.[20] These
positive psychological effects increase
with group size.[21] A perceived
advantage of EBS is that all ideas can be
archived electronically in their original
form, and then retrieved later for further
thought and discussion. EBS also enables
much larger groups to brainstorm on a
topic than would normally be productive
in a traditional brainstorming session.[18]
Computer supported brainstorming may
overcome some of the challenges faced
by traditional brainstorming methods.
For example, ideas might be "pooled"
automatically, so that individuals do not
need to wait to take a turn, as in verbal
brainstorming. Some software programs
show all ideas as they are generated (via
chat room or e-mail). The display of
ideas may cognitively stimulate
brainstormers, as their attention is kept
on the flow of ideas being generated
without the potential distraction of social
cues such as facial expressions and
verbal language.[21] EBS techniques have
been shown to produce more ideas and
help individuals focus their attention on
the ideas of others better than a
brainwriting technique (participants write
individual written notes in silence and
then subsequently communicate them
to the group).[21] The production of more ideas has been linked to the fact that paying attention to others' ideas leads to non-redundancy, as brainstormers try to avoid replicating or repeating another participant's comment or idea. Conversely, the production gain associated with EBS was smaller in situations where EBS group members focused so much on generating ideas that they ignored ideas expressed by others. The production gain associated with GroupSystems users' attentiveness to ideas expressed by others has been documented by Dugosh and colleagues.[22] EBS group members who were instructed to attend to ideas generated by others outperformed, in terms of creativity, those who were not.
According to a meta-analysis comparing
EBS to face-to-face brainstorming
conducted by DeRosa and colleagues,[23]
EBS has been found to enhance both the
production of non-redundant ideas and
the quality of ideas produced. Despite the
advantages demonstrated by EBS groups,
EBS group members reported less
satisfaction with the brainstorming
process compared to face-to-face
brainstorming group members.
Some web-based brainstorming
techniques allow contributors to post
their comments anonymously through the
use of avatars. This technique also allows
users to log on over an extended time
period, typically one or two weeks, to
allow participants some "soak time"
before posting their ideas and feedback.
This technique has been used particularly
in the field of new product development,
but can be applied in any number of
areas requiring collection and evaluation
of ideas.[24]
Some limitations of EBS include the fact that it can flood people with more ideas at one time than they can attend to, and people may also compare their
performance to others by analyzing how
many ideas each individual produces
(social matching).[citation needed]
Incentives
Some research indicates that incentives
can augment creative processes.
Participants were divided into three
conditions. In Condition I, a flat fee was
paid to all participants. In Condition II, participants were awarded points for
every unique idea of their own, and
subjects were paid for the points that they
earned. In Condition III, subjects were
paid based on the impact that their idea
had on the group; this was measured by
counting the number of group ideas
derived from the specific subject's ideas.
Condition III outperformed Condition II,
and Condition II outperformed Condition
I at a statistically significant level for
most measures. The results demonstrated
that participants were willing to work far
longer to achieve unique results in the
expectation of compensation.[25]
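For illustration only, the three payment schemes described above could be computed roughly as follows; the flat fee, per-point rate, and sample counts are invented placeholders, not values from the cited study.

# Minimal sketch of the three incentive conditions described above.
def payout(condition, unique_ideas=0, derived_group_ideas=0,
           flat_fee=10.0, rate_per_point=0.5):
    if condition == 1:                       # Condition I: flat fee for everyone
        return flat_fee
    if condition == 2:                       # Condition II: a point per unique idea
        return unique_ideas * rate_per_point
    if condition == 3:                       # Condition III: points for impact, i.e. group
        return derived_group_ideas * rate_per_point   # ideas derived from the subject's ideas
    raise ValueError("condition must be 1, 2, or 3")

if __name__ == "__main__":
    print(payout(1), payout(2, unique_ideas=8), payout(3, derived_group_ideas=12))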
Challenges to
effective group
brainstorming
A good deal of research refutes Osborn's
claim that group brainstorming could
generate more ideas than individuals
working alone.[12] For example, in a
review of 22 studies of group
brainstorming, Michael Diehl and
Wolfgang Stroebe found that,
overwhelmingly, groups brainstorming
together produce fewer ideas than
individuals working separately.[26]
However, this conclusion was brought into question by a subsequent review of 50 studies by Scott G. Isaksen, which showed that misunderstanding of the tool, weak application of the methods (including lack of facilitation), and the artificiality of the problems and groups undermined most such studies and the validity of their conclusions.[27]
Several factors can contribute to a loss of
effectiveness in group brainstorming.
• Production blocking: Because only one participant may give an idea at any one time, other participants might forget the idea they were going to contribute or not share it because they see it as no longer important or relevant.[28] Further, if we view brainstorming as a cognitive process in which "a participant generates ideas (generation process) and stores them in short-term memory (memorization process) and then eventually extracts some of them from its short-term memory to express them (output process)", then blocking is an even more critical challenge because it may also inhibit a person's train of thought in generating their own ideas and remembering them.[29] Group members can be given notepads to write their ideas on, and the meeting can organize who will speak next. However, this brainstorming technique does not perform as well as individuals using the nominal group technique.
• Collaborative fixation: Exchanging ideas in a group may reduce the number of domains that a group explores for additional ideas. Members may also conform their ideas to those of other members, decreasing the novelty or variety of ideas, even though the overall number of ideas might not decrease.[30]
• Evaluation apprehension: Evaluation apprehension was determined to occur only in instances of personal evaluation. If the assumption of collective assessment were in place, real-time judgment of ideas, ostensibly an induction of evaluation apprehension, failed to induce significant variance.[12][31] Furthermore, when an authority figure watches the group members brainstorm, effectiveness drops because members worry their ideas may be viewed negatively. Individuals with high social anxiety are particularly unproductive brainstormers and report feeling more nervous, anxious, and worried than group members who are less anxiety-prone.[32]
• Free-riding: Individuals may feel that their ideas are less valuable when combined with the ideas of the group at large. Indeed, Diehl and Stroebe demonstrated that even when individuals worked alone, they produced fewer ideas if told that their output would be judged in a group with others than if told that their output would be judged individually. However, experimentation revealed free-riding to be only a marginal contributor to productivity loss; the type of session (i.e., real vs. nominal group) contributed much more.[12]
• Personality characteristics: Extroverts have been shown to outperform introverts in computer-mediated groups. Extroverts also generated more unique and diverse ideas than introverts when additional methods were used to stimulate idea generation, such as completing a small related task before brainstorming, or being given a list of the classic rules of brainstorming.[33]
• Social matching: One phenomenon of group brainstorming is that participants tend to alter their rate of productivity to match others in the group. This can lead to participants generating fewer ideas in a group setting than they would individually, because they will decrease their own contributions if they perceive themselves to be more productive than the group average. On the other hand, the same phenomenon can also increase an individual's rate of production to meet the group average.[26][34]
• Illusion of group productivity: Members tend to overestimate their group's productivity and so work less. Members of the group can only guess at the quantity and quality of their group's product and their personal contributions to the process, but there is no standard to determine how well it is performing. A combination of processes explains why members incorrectly overestimate their productivity:
1. Group members may intuitively mistake others' ideas for their own, so when they think about their own performance they cognitively claim a few ideas that others actually suggested.[35]
2. Group members compare themselves to others who generate relatively few ideas, reassuring them that they are among the high performers.[36]
3. Group brainstorming may "feel" more successful because participants rarely experience failure in a communal process. When individuals try to think creatively alone, they repeatedly find that they are unable to come up with a new idea. In a group setting, people are less likely to experience this failure in their search for new ideas because others' ideas are being discussed.
Creativity techniques
Creativity techniques are methods that
encourage creative actions, whether in
the arts or sciences. They focus on a
variety of aspects of creativity, including
techniques for idea generation and
divergent thinking, methods of reframing problems, changes in the
affective environment and so on. They
can be used as part of problem solving,
artistic expression, or therapy.
Some techniques require groups of two
or more people while other techniques
can be accomplished alone. These
methods include word games, written
exercises and different types of
improvisation, or algorithms for
approaching problems. Aleatory
techniques exploiting randomness are
also common.
Aleatory techniques
Main article: Aleatoricism
Aleatoricism is the incorporation of chance
(random elements) into the process of
creation, especially the creation of art or
media. Aleatoricism is commonly found in
music, art, and literature, particularly in
poetry. In film, Andy Voda made a movie in
1979 called Chance Chants, which he
produced by a flip of a coin or a roll of dice.
In music, John Cage, an avant-garde
musician, composed music by using the I
Ching to determine the position of musical
notes,[1] superimposing star maps on blank
sheet music, by rolling dice and preparing
open-ended scores that depended on the
spontaneous decisions of the performers.
Other ways of practicing randomness
include coin tossing, picking something out
of a hat, or selecting random words from a
dictionary.
The aleatory approach is also demonstrated
in the case of the process called provocation,
which was initially introduced by Edward de
Bono as an aid to research.[2] This method,
which Richard Restak said was also
employed by Anthony Burgess, aims to
achieve novel ideas in writing by directing a
plot with creative connections through
random words picked from a reference
book.[3] Restak explained that the two
hundred billion interconnected neural cells
in the brain are capable of an abundance of
possibilities for long-range connections and
creative interactions using random and
unrelated words.[3]
In short, aleatoricism is a way to introduce
new thoughts or ideas into a creative
process.
Improvisation
Main article: Improvisation
Improvisation is a creative process which
can be spoken, written, or composed without
prior preparation.[4] Improvisation, also
called extemporization, can lead to the
discovery of new ways to act, new patterns
of thought and practices, or new structures.
Improvisation is used in the creation of
music, theater, and other various forms.
Many artists also use improvisational
techniques to help their creative flow.
The following are two significant domains
that use improvisation:
• Improvisational theater is a form of theater in which actors use improvisational acting techniques to perform spontaneously. Many improvisational ("improv") techniques are taught in standard drama classes. The basic skills of listening, clarity, confidence, and performing instinctively and spontaneously are considered important skills for actors to develop.[5]
• Free improvisation is real-time
composition. Musicians of all kinds
improvise ("improv") music; such
improvised music is not limited to a
particular genre. Two contemporary
musicians that use free improvisation are
Anthony Braxton and Cecil Taylor.
In problem solving
Main article: Creative problem solving
In problem-solving contexts, the random-word creativity technique is perhaps the
simplest method. A person confronted with
a problem is presented with a randomly
generated word, in the hopes of a solution
arising from any associations between the
word and the problem. A random image,
sound, or article can be used instead of a
random word as a kind of creativity goad or
provocation.[6][7]
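A minimal Python sketch of the random-word provocation might look like this; the word list and problem statement are invented placeholders rather than material from de Bono.

# Minimal sketch of the random-word provocation: draw a random word and pair
# it with the problem statement as a prompt for free association.
import random

def random_word_prompt(problem, words):
    word = random.choice(words)
    return f"Problem: {problem}\nProvocation word: {word}\nList associations between the two."

if __name__ == "__main__":
    words = ["anchor", "orchestra", "honey", "bridge", "compass"]
    print(random_word_prompt("reduce checkout abandonment", words))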
There are many problem-solving tools and
methodologies to support creativity:
• TRIZ (a theory from which tools such as ARIZ and the TRIZ contradiction matrix are derived)
• Creative Problem Solving Process (CPS) (a complex strategy, also known as the Osborn-Parnes process)
• Lateral thinking process, of Edward de Bono
• Six Thinking Hats, of Edward de Bono
• Ordinal Priority Approach
• Herrmann Brain Dominance Instrument – right brain / left brain
• Brainstorming and Brainwriting
• Think outside the box
• Business war games, for the resolution of competitive problems
• SWOT analysis
• The USIT method of convergent creativity
• Thought experiment
• Five Ws
In project
management
For project management purposes, group
creativity techniques are creativity
techniques used by a team in the course of
executing a project. Some relevant
techniques are brainstorming, the nominal
group technique, the Delphi technique,
idea/mind mapping, the affinity diagram,
and multicriteria decision analysis.[8] These
techniques are referenced in the Guide to the
Project Management Body of Knowledge.[9]
Group creativity techniques can be used in a
sequence; for example:[9]
1. Gather requirements using idea/mind
mapping
2. Continue generating ideas by
brainstorming
3. Construct an affinity diagram based
on the generated ideas
4. Identify the most important ideas by
applying the nominal group technique
5. Obtain several rounds of independent
feedback using the Delphi technique
Affecting factors
Distraction
Multiple studies have confirmed that
distraction actually increases creative
cognition.[10] One such study done by
Jonathan Schooler found that non-demanding distractions improve
performance on a classic creativity task
called the UUT (Unusual Uses Task) in
which the subject must come up with as many uses as possible for a common object.
The results confirmed that decision-related
neural processes occur during moments of
unconscious thought while a person engages
in a non-demanding task. The research
showed that while distracted a subject isn’t
maintaining one thought for a particularly
long time, which in turn allows different
ideas to float in and out of one’s
consciousness—this sort of associative
process leads to creative incubation.[11]
Ambient noise is another variable that is
conducive to distraction. It has been proven
that a moderate level of noise actually
heightens creativity.[12] Professor Ravi
Mehta conducted a study to research the
degree of distraction induced by various
noise levels and their effect on creativity.
The series of experiments show that a
moderate level of ambient noise (70 dB)
produces just enough distraction to induce
processing disfluency, which leads to
abstract cognition. These higher construal
levels caused by moderate levels of noise
consequently enhance creativity.[12]
Walking
In 2014, a study found that walking
increased creativity.[13]
Sleep and relaxation
Some advocate enhancing creativity by
taking advantage of hypnagogia, the
transition from wakefulness to sleep, using
techniques such as lucid dreaming. One
technique used by Salvador Dalí was to drift
off to sleep in an armchair with a set of keys
in his hand; when he fell completely asleep,
the keys would fall and wake him up,
allowing him to recall his mind's
subconscious imaginings.[14] Thomas Edison
used the same technique, with ball
bearings.[15]
Meditation
A study[16] from 2014 conducted by
researchers in China and the US, including
the psychologist Michael Posner, found that performing a short 30-minute meditation session each day, for seven days, was
sufficient to improve verbal and visual
creativity, as measured by the Torrance
Tests of Creative Thinking, due to the
positive effects of meditation on emotional
regulation. The same researchers[17] also
showed in 2015 that short term meditation
training could also improve insight-based
problem solving (the type commonly
associated with an "Ah-ha", or "eureka" type
moment of realization) as measured by the
Remote Associates Test.
Invention
An invention is a unique or novel device,
method, composition, idea or process. An
invention may be an improvement upon a
machine, product, or process for increasing
efficiency or lowering cost. It may also be an
entirely new concept. If an idea is unique
enough either as a stand-alone invention or as a
significant improvement over the work of
others, it can be patented. A patent, if granted,
gives the inventor a proprietary interest in the
patent over a specific period of time, which can
be licensed for financial gain.
An inventor creates or discovers an invention.
The word inventor comes from the Latin verb
invenire, invent-, to find.[1][2] Although
inventing is closely associated with science and
engineering, inventors are not necessarily
engineers or scientists.[3] Due to advances in
artificial intelligence, the term "inventor" no
longer exclusively applies to an occupation (see
human computers).[4]
Some inventions can be patented. The system of
patents was established to encourage inventors
by granting limited-term, limited monopoly on
inventions determined to be sufficiently novel,
non-obvious, and useful. A patent legally
protects the intellectual property rights of the
inventor and legally recognizes that a claimed
invention is actually an invention. The rules and
requirements for patenting an invention vary by
country and the process of obtaining a patent is
often expensive.
Another meaning of invention is cultural
invention, which is an innovative set of useful
social behaviours adopted by people and passed
on to others.[5] The Institute for Social
Inventions collected many such ideas in
magazines and books.[6] Invention is also an
important component of artistic and design
creativity. Inventions often extend the
boundaries of human knowledge, experience or
capability.
Types
Inventions are of three kinds: scientific-technological (including medicine),
sociopolitical (including economics and
law), and humanistic, or cultural.
Scientific-technological inventions include
railroads, aviation, vaccination,
hybridization, antibiotics, astronautics,
holography, the atomic bomb, computing,
the Internet, and the smartphone.
Sociopolitical inventions comprise new
laws, institutions, and procedures that
change modes of social behavior and
establish new forms of human interaction
and organization. Examples include the
British Parliament, the US Constitution, the
Manchester (UK) General Union of Trades,
the Boy Scouts, the Red Cross, the Olympic
Games, the United Nations, the European
Union, and the Universal Declaration of
Human Rights, as well as movements such
as socialism, Zionism, suffragism, feminism,
and animal-rights veganism.
Humanistic inventions encompass culture in
its entirety and are as transformative and
important as any in the sciences, although
people tend to take them for granted. In the
domain of linguistics, for example, many
alphabets have been inventions, as are all
neologisms (Shakespeare invented about
1,700 words). Literary inventions include
the epic, tragedy, comedy, the novel, the
sonnet, the Renaissance, neoclassicism,
Romanticism, Symbolism, Aestheticism,
Socialist Realism, Surrealism,
postmodernism, and (according to Freud)
psychoanalysis. Among the inventions of
artists and musicians are oil painting,
printmaking, photography, cinema, musical
tonality, atonality, jazz, rock, opera, and the
symphony orchestra. Philosophers have
invented logic (several times), dialectics,
idealism, materialism, utopia, anarchism,
semiotics, phenomenology, behaviorism,
positivism, pragmatism, and deconstruction.
Religious thinkers are responsible for such
inventions as monotheism, pantheism,
Methodism, Mormonism, iconoclasm,
puritanism, deism, secularism, ecumenism,
and the Baháʼí Faith. Some of these
disciplines, genres, and trends may seem to
have existed eternally or to have emerged
spontaneously of their own accord, but most
of them have had inventors.[7]
Process
Practical means
Alessandro Volta with the first electrical
battery. Volta is recognized as an influential
inventor.
Ideas for an invention may be developed on
paper or on a computer, by writing or
drawing, by trial and error, by making
models, by experimenting, by testing and/or
by making the invention in its whole form.
Brainstorming also can spark new ideas for
an invention. Collaborative creative
processes are frequently used by engineers,
designers, architects and scientists. Co-inventors are frequently named on patents.
In addition, many inventors keep records of
their working process - notebooks, photos,
etc., including Leonardo da Vinci, Galileo
Galilei, Evangelista Torricelli, Thomas
Jefferson and Albert Einstein.[8][9][10][11]
In the process of developing an invention,
the initial idea may change. The invention
may become simpler, more practical, it may
expand, or it may even morph into
something totally different. Working on one
invention can lead to others too.[12]
History shows that turning the concept of an
invention into a working device is not
always swift or direct. Inventions may also
become more useful after time passes and
other changes occur. For example, the
parachute became more useful once
powered flight was a reality.[13]
Conceptual means
Thomas Edison with phonograph. Edison
was one of the most prolific inventors in
history, holding 1,093 U.S. patents in his
name.
Invention is often a creative process. An
open and curious mind allows an inventor to
see beyond what is known. Seeing a new
possibility, connection or relationship can
spark an invention. Inventive thinking
frequently involves combining concepts or
elements from different realms that would
not normally be put together. Sometimes
inventors disregard the boundaries between
distinctly separate territories or fields.[citation
needed]
Several concepts may be considered
when thinking about invention.
Play
Play may lead to invention. Childhood
curiosity, experimentation, and imagination
can develop one's play instinct. Inventors
feel the need to play with things that interest
them, and to explore, and this internal drive
brings about novel creations.[14][15]
Sometimes inventions and ideas may seem
to arise spontaneously while daydreaming,
especially when the mind is free from its
usual concerns.[16] For example, both J. K.
Rowling (the creator of Harry Potter)[17] and
Frank Hornby (the inventor of Meccano)[18]
first had their ideas while on train journeys.
In contrast, the successful aerospace
engineer Max Munk advocated "aimful
thinking".[19]
Re-envisioning
To invent is to see anew. Inventors often
envision a new idea, seeing it in their mind's
eye. New ideas can arise when the conscious
mind turns away from the subject or
problem when the inventor's focus is on
something else, or while relaxing or
sleeping. A novel idea may come in a
flash—a Eureka! moment. For example,
after years of working to figure out the
general theory of relativity, the solution
came to Einstein suddenly in a dream "like a
giant die making an indelible impress, a
huge map of the universe outlined itself in
one clear vision".[20] Inventions can also be
accidental, such as in the case of
polytetrafluoroethylene (Teflon).
Insight
Insight can also be a vital element of
invention. Such inventive insight may begin
with questions, doubt or a hunch. It may
begin by recognizing that something unusual
or accidental may be useful or that it could
open a new avenue for exploration. For
example, the odd metallic color of plastic
made by accidentally adding a thousand
times too much catalyst led scientists to
explore its metal-like properties, inventing
electrically conductive plastic and light
emitting plastic, an invention that won the
Nobel Prize in 2000 and has led to
innovative lighting, display screens,
wallpaper and much more (see conductive
polymer, and organic light-emitting diode or
OLED).[21]
Exploration
A rare 1884 photo showing the experimental
recording of voice patterns by a
photographic process at the Alexander
Graham Bell Laboratory in Washington,
D.C. Many of their experimental designs
ended in failure.
Eric M. C. Tigerstedt (1887–1925) was
known as a pioneer of sound-on-film
technology. Tigerstedt in 1915.
Invention is often an exploratory process
with an uncertain or unknown outcome.
There are failures as well as successes.
Inspiration can start the process, but no
matter how complete the initial idea,
inventions typically must be developed.
Improvement
Inventors may, for example, try to improve
something by making it more effective,
healthier, faster, more efficient, easier to
use, serve more purposes, longer lasting,
cheaper, more ecologically friendly, or
aesthetically different, lighter weight, more
ergonomic, structurally different, with new
light or color properties, etc.
Implementation
Western Arabic numerals - an example of
non-material inventions.
Railways — probably the most important
invention in land transport. (Railway station
in Bratislava, Slovakia)
In economic theory, inventions are one of
the chief examples of "positive
externalities", a beneficial side effect that
falls on those outside a transaction or
activity. One of the central concepts of
economics is that externalities should be
internalized—unless some of the benefits of
this positive externality can be captured by
the parties, the parties are under-rewarded
for their inventions, and systematic under-rewarding leads to under-investment in
activities that lead to inventions. The patent
system captures those positive externalities
for the inventor or other patent owner so that
the economy as a whole invests an optimum
amount of resources in the invention
process.
Patent model
A patent model was a
handmade miniature model no
larger than 12" by 12" by 12"
(approximately 30 cm by 30 cm
by 30 cm) that showed how an
invention works. It was one of
the most interesting early
features of the United States patent system.
Since some early inventors had
little technological or legal
training, it was difficult for
them to submit formal patent
applications, which required the
novel features of an invention to
be described in a written
application and a number of
diagrams.
History
In the US, patent models were
required from 1790 to 1880.[2]
The United States Congress
abolished the legal requirement
for them in 1870, but the U.S.
Patent Office (USPTO) kept the
requirement until 1880.[3]
On July 31, 1790 inventor
Samuel Hopkins of Pittsford,
Vermont became the first
person to be issued a patent in
the United States. His patented
invention was an improvement
in the "making of Pot Ash by a
new apparatus & process."
The earliest patent laws
required that a working model
of each invention be produced
in miniature.
Some inventors still willingly
submitted models at the turn of
the twentieth century. In some
cases, an inventor may still
want to present a "working
model" as an evidence to prove
actual reduction to practice in
an interference proceeding. In
some jurisdictions patent
models remained an aid to demonstrating the operation of the
invention. In applications
involving genetics, samples of
genetic material or DNA
sequences may be required.
United States
Patent
Office's
collection of
models
Cases of patent models on view
at the U.S. Patent Office in
1861
The United States Patent Office
used to publicly display the
models of approved patents.[4]
This collection of models
suffered two major fires: one in 1836 and another in 1877. The
1877 fire destroyed 75,000
patent models.[5]
In 1908, the Patent Office
donated just over 1,000 patent
models to the United States
National Museum.[6] The
remaining models were packed
and moved several times before
Congress chose to dissolve the
collection in 1926. The
Smithsonian Institution was
allowed to choose first from the
remaining models; accessions
from the Patent Office now
form part of the collection of
over 10,000 patent models at
the National Museum of
American History.[6]
Depiction of the 1877 fire at the
U.S. Patent Office, which
destroyed 75,000 patent models
Many models were sold off by
the patent office in 1925 and
were purchased by Sir Henry
Wellcome, the founder of the
Burroughs-Wellcome Company
(now part of GlaxoSmithKline).
Although he intended to
establish a patent model
museum, the stock market crash
of 1929 damaged his fortune;
the models were left in storage.
After his death, the collection
went through a number of
ownership changes; a large
portion of the collection—along
with $1,000,000—was donated
to the nonprofit United States
Patent Model Foundation by
Cliff Peterson. Rather than
being put into a museum, these
models were slowly sold off by
the foundation. Much legal
wrangling, purchasing, and reselling ensued.[7] A
comparatively small number of
models (4,000) were the
property of the Rothschild
Patent Museum until 2015,
when they were transferred to
Hagley Museum and Library,
forming a part of the museum's
collection of patent models.
With over 5,000 models, the
Hagley's is the largest private
collection, and second in size
only to the Smithsonian's.