Objects as Organization: Computer Science and the Art of Memory
Samuel DiBella
The ancients of Greece and Rome, owing to the difficulty of distributing written texts at the time, developed a complex system of spatial mnemonic techniques that was used and refined up to and through the medieval era and the Renaissance. Today, those same techniques are still in use in specialized areas and among those interested in their history, although they are no longer part of the core of a liberal arts education. This paper intends to explore the void left by the forgotten art of memory, and what seems to have taken its place, focusing on the field of computer science. Overall, I will point out the connections that the modern object-oriented paradigm in computer science has to the techniques used in the art of memory to ease active recall. I will not attempt to prove whether this overlap is due to actual inherited methods or simply a coincidence, as that would be beyond the scope of this paper. However, I will make use of the cognitive linguistic term, the conceptual metaphor, to point out what characteristics of the two disciplines might explain their overlap.
Overall, the art of memory can be broken down into two aspects: the objects that refer to the word or thing to be remembered, and the creation of loci, or mental spaces, for storing those objects. Even as further techniques were added, such as visual alphabets and complicated interlocking systems like the Brunian wheel, the core system of loci and objects remained generally the same and was simply iterated upon. Objects should be, as the classic memory text, the Ad Herennium, describes, “likenesses as striking as possible; if we set up images that are not many or vague, but doing something; if we assign to them exceptional beauty or singular ugliness; if we dress some of them with crowns or purple cloaks, for example, so that the likeness may be more distinct to us ... The things we easily remember when they are real we likewise remember without difficulty when they are figments, if they have been carefully delineated” (Caplan). The loci, on the other hand, must follow a certain understood order, either by spatial placement or by causal relation, so that it is easy to mentally walk from one locus to another and find the objects within them. For this, there are specific rules: the loci should be neither too bright nor too dim, not crowded, and not defined by repetitive features. More complicated
architectures, such as those used by Giordano Bruno, can make use of more esoteric rules, by
making the loci themselves reflect some aspect of the information they contain, or by nesting loci
within one another. For example, a book could be broken down into separate rooms, each
representing a single chapter, and within the room there could be subdivided areas for the
different concepts in the chapter.
Having given a brief explanation of the core of medieval mnemonic thought, I will now shift to computer science. Before I begin speaking about computer code, I would like to make a short disclaimer and explanation: computer processing should not be confused with or equated to human cognition. A computer's memory system functions as essentially a system of placeholders that can effortlessly be wiped clean, unlike the human mind. My main argument is not that computers' 'thought' is the same as human thought, but rather that the art of memory and computer programming both require the human brain to work through similar processes, the key difference being that the art of memory requires purely mental constructs, while computer code acts as a mental aid for the programmer, allowing them to better recall the functions and interactions of the code.
Computer science is about the creation and execution of efficient algorithms, or recipes.
Because computers are, at their simplest level, a series of transistors, computer programming makes use of a series of language abstractions, from binary code, to machine and assembly languages, to high-level programming languages that communicate well-formed commands to a
computer. Each step up in the language hierarchy obscures more and more of the actual workings
of the computer, to allow for easier human comprehension. Sometimes lines of code can look
almost exactly like actual sentences in a natural language. However, the way that these languages
work is highly defined, and discrete, with little of the expressive power that a natural language
would have. As Shaw describes in her essay, “programming languages are used for
communication among people as well as for controlling machines. This role is particularly
important in long-lived programs, because a program is in many ways the most practical medium
for expressing the structure imposed by the designer-and for maintaining the accuracy of this
documentation over time. Thus, even though most programming languages technically have the
same expressive power, differences among languages can significantly affect their practical
utility” (Shaw 15). A low-level language can technically do more with a computer's architecture, but high-level languages are more often used for their accessibility. Still, at the
most basic levels, computer code is about the storage of numbers within spaces. On that simple
process, of storing and retrieving values, the entirety of computer science is based.
However, there are many ways, provided a programmer knows enough to implement them, for data to be stored and manipulated. The simplest form is the variable. Typically, a variable has to be named, and the type of data within it (an integer, a string of characters, an object, and so on) has to be declared, before the variable can be used to store information.
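As a brief, hypothetical illustration (in Java, a language chosen here only for familiarity; none of the sources cited in this paper specify one), declaring a few variables looks something like this:

    public class VariableDemo {
        public static void main(String[] args) {
            // Each variable is a named, typed storage location that must be
            // declared before anything can be placed in it.
            int graduationYear = 2015;        // an integer
            String studentName = "Ada";       // a string of characters
            double tuitionBalance = 1250.50;  // a decimal number
            System.out.println(studentName + ", " + graduationYear + ", " + tuitionBalance);
        }
    }
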
Already the link to loci should be clear. Beyond the variable, computer science has a variety of data structures, 'contracts' that a programmer makes about the way that information is stored, which can allow for complicated interactions between data. Many of these rely on commonly understood metaphors to describe how the data structure works. The programmer uses these restrictions to guarantee that whatever data their program looks at will have been acquired only in a specific way. As an example, a queue is a list of elements, oriented front to back, and only the element at the front of the line can be accessed. Fundamentally, a computer's processor is shared by means of a queue that allocates time to different programs. Each program 'waits in line' until it comes to the front of the processor queue, at which point it is allocated processing time.
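To make the queue's 'contract' concrete, here is a minimal sketch in Java; the program names queued up below are my own illustrative choices, and the standard ArrayDeque class simply supplies a ready-made queue:

    import java.util.ArrayDeque;
    import java.util.Queue;

    public class QueueDemo {
        public static void main(String[] args) {
            // Programs 'wait in line' for processing time.
            Queue<String> processorQueue = new ArrayDeque<>();
            processorQueue.add("text editor");
            processorQueue.add("web browser");
            processorQueue.add("music player");

            // Only the element at the front of the line can be accessed.
            while (!processorQueue.isEmpty()) {
                String program = processorQueue.remove();
                System.out.println("Allocating time to: " + program);
            }
        }
    }
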
Another important aspect of code is its readability. Readable code is easy to comprehend and written in a structured manner. As Elshoff and Marcotty put it, “A readable program always seems to exhibit a common set of properties...The logical structure of the program is constructed of single-entry single-exit flow of control units. Variable names are mnemonic and references to them localized. The program's physical layout makes the salient features of the algorithm that is implemented stand out” (Elshoff 513). The strange thing about readable code is that its performance is usually exactly the same as that of unreadable code; the computer parses both just the same. Readable code is created for humans. The layout of the code needs to follow an internal order, in a way that is consistent and easily understood by whoever is reading the code. There are many ways to accomplish this goal. Among the most important methods are the appropriate creation of variable names and the use of comments. For variables, the names should be as short as possible while remaining consistent with their function within the surrounding code, to allow for better recall and comprehension (Binkley).
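As a small, invented illustration of this point (the function and its names are hypothetical, not drawn from the cited studies), compare an opaque name with a mnemonic one:

    public class NamingDemo {
        // Opaque: the reader must hold the meaning of 'a' and 'b' in memory.
        static int f(int a, int b) {
            return a * b;
        }

        // Mnemonic: the names recall their function in the surrounding code.
        static int totalCredits(int courses, int creditsPerCourse) {
            return courses * creditsPerCourse;
        }

        public static void main(String[] args) {
            System.out.println(f(4, 3));
            System.out.println(totalCredits(4, 3));
        }
    }

Both methods compute the same thing; only the second tells the reader what that thing is.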
Comments are lines of code that are marked by the programmer so that the computer does not try to execute them. Programmers can then use that line or block of text to write explanatory text in their natural language to express the workings of the related areas of code. Comments not only provide instructions, but they can also be used to visually organize a program: “Comments should be used to make the source text of a program understandable. Block comments should be placed at the beginning of a program to describe the program's purpose, external interface, and how it works. The program should be divided into major sections, paragraphs, separated by blank lines or page boundaries. Block comments should also be used to describe the functions performed by the paragraphs” (Elshoff 513). Much in the way that a speech must follow a particular order, a program walks through the steps of its algorithms, and comments provide visual and textual references for the reader.
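A short, invented example may help show the layout Elshoff and Marcotty describe; the program itself and its section divisions are my own illustration:

    /*
     * GradeReport: computes and prints the average of a student's grades.
     * External interface: run with no arguments; the result is printed
     * to standard output.
     */
    public class GradeReport {
        public static void main(String[] args) {
            /* Section 1: gather the grades to be averaged. */
            int[] grades = {88, 92, 75};

            /* Section 2: compute the average. */
            int sum = 0;
            for (int grade : grades) {
                sum += grade;
            }
            double average = (double) sum / grades.length;

            /* Section 3: report the result. */
            System.out.println("Average grade: " + average);
        }
    }

The block comment at the top states the program's purpose, and the blank lines and section comments divide the program into the 'paragraphs' that the quotation recommends.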
At this point, I would like to provide a general idea of the possible genealogical
connection between the art of memory and computer science. Computer science, as a discipline,
draws much of its foundation from mathematics and logic. In The Art of Memory, Yates
postulates that several early scientists were aware of, enamored with, and inspired by the art of
memory, from Francis Bacon to Leibniz: “Leibniz formed a project known as the
'characteristica'. Lists were drawn up of all the essential notions of thought, and to these notions
were to be assigned symbols or 'characters'. If, as has been suggested, Leibniz's 'characteristica'
as a whole comes straight out of the memory tradition, it would follow that the search for 'images
for things', when transferred to mathematical symbolism, resulted in the discovery of new and better mathematical or logico-mathematical notations, making possible new types of
calculation,” (384-6). If Yates is correct, computer science could claim the art of memory as a
distant ancestor, through the art's contribution to notation and conceptual thought.
However, there seems to be a much closer connection here than that. In many
ways, the practice and use of computer code seems to follow precepts very similar to those of the
art of memory, although reconfigured to allow for multiple humans making use of the same
system, which is stored within a computer, rather than a single mind. The storage of information
within specifically defined structures, organized in such a way as to be accessed in order, is present. The individual aesthetic choice of how the code will appear to its user, given that some forms are easier to use, is there as well. However, I would like to explain in a bit more
detail how the art of memory and computer code function in similar ways.
To begin, I will have to explain a term that I believe can be used to understand the core of
the interrelation of ancient mnemonics and computer science, the conceptual metaphor, as it is
described by the linguist George Lakoff. As he puts it, a conceptual metaphor is a way of mapping, or creating references, from a source domain to a target domain. By doing this, we
create a reference for understanding one concept in terms of another. The common example that
Lakoff gives is the idea that “Argument is War”. As a result of the conceptual metaphor between
these two concepts, we gain new ways to describe certain phenomena in the abstract concept of
argumentation in terms of the visceral terminology of war, leading to phrases like “Your claims are indefensible.” For Lakoff, “The essence of metaphor is understanding and
experiencing one kind of thing in terms of another. It is not that arguments are a subspecies of
war. Arguments and wars are different kinds of things- verbal discourse and armed conflict- and
the actions performed are different kinds of actions. But ARGUMENT is partially structured,
understood, performed, and talked about in terms of WAR [Lakoff's emphasis]” (Lakoff 4-5).
The architectural loci method of memory could easily be seen as an example of a personally created conceptual metaphor, wherein a locus and object are tied to a person or thing to be
remembered.
This is the point at which computer science's functional concerns can provide further examples of overlapping methodology. As I mentioned before, abstraction of computer functions and workings is understood to be an essential component of computer organization and its use
by humans. In the way that code is written, and the way that the hierarchy of programming
language works, information that is not necessary is hidden away from the user: “An abstraction
is a simplified description, or specification, of a system that emphasizes some of the system's
details or properties while suppressing others. A good abstraction is one that emphasizes details
that are significant to the reader or user and suppresses details that are, at least for the moment,
immaterial or diversionary,” (Shaw 10). All languages make use of abstraction to some degree or
another.
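A small, hypothetical sketch can illustrate Shaw's point; the class and its method names are my own invention, chosen to echo this paper's theme:

    import java.util.ArrayList;
    import java.util.List;

    /*
     * The user of this class sees only 'remember' and 'recall'; how the
     * items are actually stored is a suppressed detail that could change
     * without the user ever noticing.
     */
    public class MemoryPalace {
        private final List<String> rooms = new ArrayList<>();

        public void remember(String item) {
            rooms.add(item);
        }

        public String recall(int position) {
            return rooms.get(position);
        }
    }

The private list is the detail that is, for the moment, immaterial to the reader; the two public methods are the simplified description.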
Currently, one of the most popular programming language paradigms, or 'linguistic'
styles, is object-oriented programming. These languages are structured around the creation and manipulation of types of objects. These objects are essentially containers with
specific variables within them, and specific functions can be called on or by these objects. As an
example, a programmer might create a Student class, which contains a name variable and a
graduation year variable. From this class, or template, the programmer can create objects of the
type Student, and store references to them. In this way, you could create a list of Students, each
with their own variables. This is, in essence, the simplest implementation of object-oriented programming: class files that provide code for individual objects. These objects can be stored in
whatever manner or order the programmer would like, and they can have any kind of internal
logic to them, from simplistic to arcane.
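Following the Student example just described, a minimal sketch in Java might look like this (the exact field names and the surrounding demonstration scaffolding are my own assumptions):

    import java.util.ArrayList;
    import java.util.List;

    // The class file: a template from which Student objects are created.
    class Student {
        String name;
        int graduationYear;

        Student(String name, int graduationYear) {
            this.name = name;
            this.graduationYear = graduationYear;
        }
    }

    public class StudentDemo {
        public static void main(String[] args) {
            // A list of Students, each holding its own variables.
            List<Student> students = new ArrayList<>();
            students.add(new Student("Ada", 2016));
            students.add(new Student("Alan", 2017));

            for (Student s : students) {
                System.out.println(s.name + " graduates in " + s.graduationYear);
            }
        }
    }
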
The standard case of assigning classes to concrete entities has its limits, however. If too realistic or specific a model is created, extra and unnecessary calculation can result. As
Alexander Repenning describes in his case study of 'antiobjects',
“Objects can deceive us. They can lure us into a false sense of understanding. The
metaphor of objects can go too far by making us try to create objects that are too
much inspired by the real world. This is a serious problem as a resulting system
may be significantly more complex than it would have to be, or worse, will not work at all...When implemented as antiobjects, [objects] that we assumed to be complex turn out to be simple and have little computation. Objects we assumed to
be passive, such as non-functional background objects, turn out to host most of the
computation,” (Repenning).
Here Repenning is suggesting that objects, like walls or floors, can be made to do computational
work in a way that is actually more intuitive than 'realistic' representations of the work that
objects do. For example, a single 'room' object, with information about all the objects it contains,
can actually do a more efficient job of moving those objects around within the room, than if each
object had to calculate its own path. Alternative computer science paradigms, or other views of
object-oriented programming, take advantage of other non-literal ways of creating data
structures, much in the way that Giordano Bruno recommended the blurring of passive objects
and loci with the active mnemonist, as he traverses his mental landscape.
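To give a rough sense of what an antiobject might look like in code, here is a small, hypothetical sketch, loosely inspired by Repenning's diffusion examples but with names and details entirely my own: the 'room', a grid of background cells, hosts the computation by spreading a value outward from a target, and the agent simply steps toward the neighboring cell with the highest value.

    public class RoomDemo {
        static final int SIZE = 5;
        static double[][] scent = new double[SIZE][SIZE];

        // The 'passive' background does the work: each floor cell repeatedly
        // averages its neighbors, spreading the target's value across the room.
        static void diffuse(int targetRow, int targetCol) {
            for (int step = 0; step < 50; step++) {
                double[][] next = new double[SIZE][SIZE];
                for (int r = 0; r < SIZE; r++) {
                    for (int c = 0; c < SIZE; c++) {
                        double sum = 0;
                        int count = 0;
                        if (r > 0)        { sum += scent[r - 1][c]; count++; }
                        if (r < SIZE - 1) { sum += scent[r + 1][c]; count++; }
                        if (c > 0)        { sum += scent[r][c - 1]; count++; }
                        if (c < SIZE - 1) { sum += scent[r][c + 1]; count++; }
                        next[r][c] = sum / count;
                    }
                }
                next[targetRow][targetCol] = 1.0; // the target keeps full strength
                scent = next;
            }
        }

        // Returns the value at (r, c), or -1 for cells outside the room.
        static double at(int r, int c) {
            if (r < 0 || r >= SIZE || c < 0 || c >= SIZE) return -1;
            return scent[r][c];
        }

        public static void main(String[] args) {
            diffuse(4, 4);
            // The agent's own logic stays trivial: follow the gradient uphill.
            int row = 0, col = 0;
            for (int moves = 0; moves < 10 && !(row == 4 && col == 4); moves++) {
                int[][] neighbors = {{row + 1, col}, {row, col + 1}, {row - 1, col}, {row, col - 1}};
                int bestRow = row, bestCol = col;
                for (int[] n : neighbors) {
                    if (at(n[0], n[1]) > at(bestRow, bestCol)) {
                        bestRow = n[0];
                        bestCol = n[1];
                    }
                }
                row = bestRow;
                col = bestCol;
                System.out.println("Agent moves to (" + row + ", " + col + ")");
            }
        }
    }

The agent never plans a path; the room's cells, the apparently passive background, carry most of the computation.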
A mnemonist has created a mental mapping between objects and things to recall, taking
advantage of the ease with which human memory can bring to mind visual objects, and then
connecting those objects to abstract or more complicated ideas. Computer science, being
fundamentally about data storage and movement, maps the physical reality of the computer's
state to the programmer's or user's abstracted ideas of how the program is functioning. I would
argue that computer science and the art of memory make use of the same style of cognitive
metaphor, as Lakoff describes them. The mnemonist has their mental architecture with its rooms,
sub-rooms and divisions, whose appearances are carefully chosen to be evocative and
representative of their contents, just as a programmer chooses brief but punchy names for their data variables and carefully documents their files, so that the code is typographically appealing. Both have created a system tuned for careful and purposeful storage and recall. The risks for both are clear. A sloppy mnemonic system will simply fail to provide consistent active recall. For the programmer, variables are attuned to a particular data type, so incorrect storage will result in errors or crashes. Sloppy code may perform perfectly well, but it will crumble upon any further modification, as the programmer will be uncertain as to how the program's various parts interlock with one another.
In conclusion, while I do not have conclusive evidence to explain why there is such a confluence of mental methods and techniques between these two disciplines, which are separated by hundreds of years of human development, the connections certainly seem clear, and I believe they would merit further research.
Bibliography
Bannon L. Forgetting as a feature, not a bug: the duality of memory and implications for
ubiquitous computing. Codesign [serial online]. March 2006;2(1):3-15. Available from:
Academic Search Complete, Ipswich, MA. Accessed April 21, 2014.
Binkley D, Lawrie D, Maex S, Morrell C. Identifier length and limited programmer memory.
Science Of Computer Programming [serial online]. May 2009;74(7):430-445. Available
from: Academic Search Complete, Ipswich, MA. Accessed April 21, 2014.
Elshoff, James L., and Michael Marcotty. "Improving Computer Program Readability to Aid Modification." Communications of the ACM 25.8 (1982): 512-21. Print.
Lakoff, George, and Mark Johnson. Metaphors We Live by. Chicago: U of Chicago, 1980. Print.
Laitinen K. Estimating understandability of software documents. Software Engineering Notes
[serial online]. July 1996;21(4):81. Available from: Publisher Provided Full Text
Searching File, Ipswich, MA. Accessed May 6, 2014.
"Rhetorica Ad Herennium Passages on Memory." Rhetorica Ad Herennium Passages on
Memory. Trans. Harry Caplan. University of Texas, n.d. Web. 06 May 2014.
Shaw, M. "Abstraction Techniques in Modern Programming Languages." IEEE Software 1.4
(1984): 10-26. Print.
Yates, Frances Amelia. The Art of Memory. Australia: Penguin, 1966. Print.