Valbonne Consulting
Research & Recruitment for Cloud, Automotive, Telco & IoT
An incomplete list of classic CompSci papers every Software Architect should read
We all have our favourite classic comp-sci paper, usually one we find so fascinating that we want to
share it with everyone.
Not all may agree that it’s a good idea to read the original papers; some might prefer a modern textbook
exposition. Nevertheless, a closer look at our past can be helpful when trying to understand the
future, and provides us with a more polished understanding.
Below is a list of “classic” papers that have shaped computing history, along with a few that will
arguably become classics (e.g. the Bitcoin paper). Most were perceived as radical at the time, but
turned out to influence terminology and became pillars of computer science.
Before we start, let me point you to a 2-page paper, “Hanson 1999: Efficient Reading of Papers in
Science and Technology”, which shows you how to assess whether a particular text is worth your
time – without having to read it all!
Source: http://valbonne-consulting.com/papers/classic/Hanson_99Efficient_Reading_of_Papers_in_Science_and_Technology.pdf
***
Turing 1936 – On computable numbers with an application to
the Entscheidungsproblem
Turing’s 1936 classic, “On computable numbers, with an application to the
Entscheidungsproblem” [36 pages], is unarguably the paper that began the field of computer
science as we understand it today. Here we have the first description of a universal computing
device, the Turing machine, which eventually led to the idea of the universal stored-program digital
computer. The paper even seems to describe, in what is arguably the first conceptual
programming language, a form of continuation passing style, in the form of the “skeleton tables”
Turing used to abbreviate his Turing machine designs.
Source: http://valbonne-consulting.com/papers/classic/Turing_36on_computable_numbers_with_an_application_to_the_entscheidungsproblem.pdf
1 http://valbonne-consulting.com/pioneers/264-alan-turing-1912-1954
2 https://en.wikipedia.org/wiki/Continuation_passing_style
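The machinery Turing describes can be made concrete with a toy simulator. The sketch below is purely illustrative: the transition-table encoding, the `run_turing_machine` helper and the bit-flipping example machine are our own inventions, not notation from the paper.

```python
# A minimal Turing machine simulator (illustrative, not Turing's notation).
# The transition table maps (state, symbol) -> (symbol_to_write, move, next_state).
def run_turing_machine(table, tape, state="start", halt="halt", max_steps=10_000):
    cells = dict(enumerate(tape))   # sparse tape; unwritten cells read as blank "_"
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, "_")
        write, move, state = table[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example machine: walk right over the input, inverting 0s and 1s, halt on blank.
FLIP = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
```

A universal machine, Turing's key insight, is then simply a machine whose transition table interprets a description of another machine found on its own tape.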
Shannon 1948 – A Mathematical Theory of Communication
“A Mathematical Theory of Communication” [55 pages], by Claude Shannon in 1948, was one of
the founding works of the field of information theory, laying out the basic elements of
communication: information source, transmitter, channel, receiver and destination.
Source: http://valbonne-consulting.com/papers/classic/Shannon_1948A_Mathematical_Theory_of_Communication.pdf
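The paper’s central quantity, the entropy of a source, is easy to compute directly. A minimal sketch (the function name `entropy` is our own; the formula H = −Σ p·log₂ p is Shannon’s):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit per toss; a heavily biased coin carries far less.
fair = entropy([0.5, 0.5])      # 1.0 bit
biased = entropy([0.9, 0.1])    # ~0.47 bits
```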
Simon 1962 – Architecture of Complexity
Simon, in his 1962 paper “The Architecture of Complexity” [32 pages], was the first to analyze the
architecture of complexity and to propose a preferential attachment mechanism to explain power
law distributions. This is not strictly compsci but more philosophical. You might be interested in his
later works, which expand on the topic.
Source: http://valbonne-consulting.com/papers/classic/Simon_62-Architecture_of_Complexity.pdf
Ritchie & Thompson 1974 – The UNIX Time-Sharing System
“The UNIX Time-Sharing System” [11 pages], by Dennis Ritchie & Ken Thompson, is one of the
best-written papers ever. The elegance of thought and economy of description set a standard we
should all aspire to.
Source: http://valbonne-consulting.com/papers/classic/Ritchie_Thompson_74The_UNIX_Time_Sharing_System.pdf
Lampson & Sturgis 1976 – Crash Recovery in a Distributed
System
Lampson & Sturgis’s 1976 paper, “Crash Recovery in a Distributed System” [28 pages], is a
beautiful piece of writing and logic on stable storage and layered abstractions.
Source: http://valbonne-consulting.com/papers/classic/Lampson_Sturgis_76Crash_Recovery_in_a_Distributed_System.pdf
Backus 1978 – Can Programming be liberated from the von
Neumann Style?
Often seen as an apology for having invented the FORTRAN language, Backus’ 1978 Turing
Award lecture [29 pages] was one of the most influential and now most-often cited papers
advocating the functional programming paradigm. Backus coined the term “word-at-a-time
programming” to capture the essence of imperative languages, showed how such languages were
3 http://understandingsociety.blogspot.fr/2013/01/simon-on-complexity.html
4 http://valbonne-consulting.com/pioneers/316-dennis-ritchie-1941-2011
5 http://valbonne-consulting.com/pioneers/317-ken-thompson-1943
6 http://valbonne-consulting.com/pioneers/268-john-backus-1924-2007
inextricably tied to the von Neumann machine, and gave a convincing argument why such
languages were not going to meet the demands of modern software development. That this
argument was being made by the person who is given the most credit for designing FORTRAN and
who also had significant influence on the development of ALGOL lent substantial weight to the
functional thesis. The exposure given to Backus’ paper was one of the best things that could have
happened to the field of functional programming, which at the time was certainly not considered
mainstream.
Source: http://valbonne-consulting.com/papers/classic/Backus_97Can_Programming_be_liberated_from_the_Von_Neumann_Style-Turingaward_Lecture.pdf
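Backus’s distinction can be illustrated in any modern language. The sketch below is only an analogy in Python, not Backus’s FP notation: the first version mutates one word of state at a time, the second builds the program by composing whole-value operators.

```python
from functools import reduce
from operator import add

# "Word-at-a-time" imperative style: an explicit loop mutating an accumulator.
def sum_of_squares_imperative(xs):
    total = 0
    for x in xs:
        total += x * x
    return total

# Function-level style in the spirit of Backus's FP: compose whole-value
# operators (map, reduce) with no named intermediate state.
def sum_of_squares_functional(xs):
    return reduce(add, map(lambda x: x * x, xs), 0)
```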
Baran 1962 – On Distributed Communications Networks
Paul Baran’s 1962 paper, “On Distributed Communications Networks” [41 pages], is the paper
from which the rumor started that the Internet was created by the military to withstand nuclear war.
This is totally false. Even though this RAND work was based on that premise, the ARPANET and
the Internet stemmed from the MIT work of Licklider, Kleinrock and Roberts, and had no relation
to Baran’s work.
Source: http://valbonne-consulting.com/papers/classic/Baran_62On_Distributed_Communication_Networks.pdf
Brian Foote 1999 – Big Ball of Mud
Brian Foote and Joseph Yoder wrote an often-quoted paper about the undocumented “Big Ball of
Mud” [41 pages] architectural design pattern. A Big Ball of Mud is a haphazardly structured,
sprawling, sloppy, duct-tape-and-baling-wire, spaghetti-code jungle. These systems show
unmistakable signs of unregulated growth, and repeated, expedient repair. Information is shared
promiscuously among distant elements of the system, often to the point where nearly all the
important information becomes global or duplicated. The overall structure of the system may never
have been well defined. If it was, it may have eroded beyond recognition. Programmers with a
shred of architectural sensibility shun these quagmires. Only those who are unconcerned about
architecture, and, perhaps, are comfortable with the inertia of the day-to-day chore of patching the
holes in these failing dikes, are content to work on such systems.
Source: http://valbonne-consulting.com/papers/classic/Brian_Foote_99-Big_Ball_of_Mud.pdf
Brooks 1975 – The Mythical Man-Month
“The Mythical Man-Month” [212 pages], from 1975, on software engineering and project
management, popularized Brooks’s Law: “Adding manpower to a late software project makes it
later”, and was the first book to advocate prototyping. It is as relevant today as it was 30 years ago.
Source: http://valbonne-consulting.com/papers/classic/Brooks_75-The_Mythical_Manmonth.pdf
7 http://valbonne-consulting.com/pioneers/289-john-von-neumann-1903-1957
8 http://valbonne-consulting.com/pioneers/290-fred-brooks-1931
David Chaum 1981 – Untraceable electronic mail return
addresses and digital pseudonyms
The 1981 paper by David Chaum, “Untraceable Electronic Mail, Return Addresses, and Digital
Pseudonyms” [8 pages], laid the groundwork for the field of anonymous communications research.
The ideas described are the technical roots of the Cypherpunk movement that began in the late
1980s. Chaum’s later work built on these ideas to allow users to obtain digital currency from a
bank and spend it in a manner that is untraceable by the bank or any other party.
Source: http://valbonne-consulting.com/papers/classic/Chaum_81Untraceable_electronic_mail_return_addresses_and_digital_pseudonyms.pdf
Cook 1971 – The Complexity of Theorem Proving Procedures
In his “The Complexity of Theorem Proving Procedures” [7 pages], Cook formalized the notions
of polynomial-time reduction (a.k.a. Cook reduction) and NP-completeness, and proved the
existence of an NP-complete problem by showing that the Boolean satisfiability problem (usually
known as SAT) is NP-complete. The theorem was proven independently by Leonid Levin in the
Soviet Union, and has thus been given the name the Cook-Levin theorem. The paper also
formulated the most famous open problem in computer science, the P vs. NP problem. Informally,
the “P vs. NP” question asks whether every problem whose answers can be efficiently verified for
correctness can also be solved with an efficient algorithm.
Source: http://valbonne-consulting.com/papers/classic/Cook_71The_Complexity_of_Theorem_Proving_Procedures.pdf
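The asymmetry at the heart of P vs. NP is easy to demonstrate for SAT: checking a candidate assignment is linear work, while the only known general way to find one is exponential search. A small sketch (the clause encoding and function names are our own, not from the paper):

```python
from itertools import product

def check(assignment, formula):
    """Verification is easy: one pass over the clauses.
    A clause is a list of (variable, polarity) literals; the formula is a
    conjunction of clauses, e.g. (x1 or not x2) and (x2 or x3)."""
    return all(any(assignment[v] == pol for v, pol in clause) for clause in formula)

def solve(formula):
    """Search is (as far as we know) hard: brute force over 2^n assignments."""
    variables = sorted({v for clause in formula for v, _ in clause})
    for bits in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        if check(assignment, formula):
            return assignment
    return None
```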
David Garlan 1994 – An Introduction to Software Architecture
David Garlan’s 1994 paper, “An Introduction to Software Architecture” [42 pages], written with
Mary Shaw, was one of the first to provide an introduction to the (back then) emerging field.
Source: http://valbonne-consulting.com/papers/classic/David_Garlan_94An_Introduction_to_Software_Architecture.pdf
David Parnas 1971 – Information Distribution Aspects of
Design Methodology
David Parnas’s 1971 paper, “Information Distribution Aspects of Design Methodology” [6 pages],
was fundamental to modular software design and information hiding.
Source: http://valbonne-consulting.com/papers/classic/David_Parnas_71Information_Distribution_Aspects_of_Design_Methodology.pdf
9 https://en.wikipedia.org/wiki/Cypherpunk
10 http://valbonne-consulting.com/pioneers/278-stephen-cook-leonoid-levin
Valbonne Consulting
Research & Recruitment for Cloud, Automotive, Telco & IoT
Dingledine and Mathewson 2006 – Anonymity Loves Company:
Usability and the Network Effect
In their 2006 paper, “Anonymity Loves Company: Usability and the Network Effect” [12 pages],
Dingledine and Mathewson focus on the network effects of usability on privacy and security. This
paper has heavily influenced the design of the Tor onion router.
Source: http://valbonne-consulting.com/papers/classic/Dingledine_Mathewson_06Anonymity_Loves_Company-Usability_and_the_Network_Effect.pdf
Edsger Dijkstra 1968 – GO TO Statement Considered Harmful
The 1968 paper, “GO TO Statement Considered Harmful” [4 pages], by Edsger Dijkstra,
criticized the excessive use of the GOTO statement in the programming languages of the day and
advocated structured programming instead.
Source: http://valbonne-consulting.com/papers/classic/Edgar_Dijkstra_68GOTO_Statement_Considered_Harmful.pdf
Felleisen and Wright 1992 – A Syntactic Approach to Type
Soundness
“A Syntactic Approach to Type Soundness” [49 pages], presented in 1992 by Felleisen & Wright,
introduced a syntactic technique for proving type soundness for Hindley/Milner-style polymorphic
type systems. The keys to their approach are an adaptation of subject reduction theorems from
combinatory logic to programming languages, and the use of rewriting techniques for the
specification of the language semantics. The paper has heavily influenced programming language
design in C#, Java and C++.
Source: http://valbonne-consulting.com/papers/classic/Felleisen_and_Wright_92A_Syntactic_Approach_to_Type_Soundness.pdf
Fox and Brewer 1999 – Harvest, Yield, and Scalable Tolerant
Systems
“Harvest, Yield, and Scalable Tolerant Systems” [5 pages], by Fox & Brewer, is one of a series of
their papers that introduced the CAP theorem, also known as Brewer’s theorem.
Source: http://valbonne-consulting.com/papers/classic/FOX_Brewer_99Harvest_Yield_and_Scalable_Tolerant_Systems.pdf
Frederick Brooks 1987 – No Silver Bullet: Essence and
Accidents of Software Engineering
“No Silver Bullet — Essence and Accidents of Software Engineering” [16 pages], is a widely
11 https://www.torproject.org/docs/documentation.html.en
12 http://valbonne-consulting.com/pioneers/279-edsger-w-dijkstra-1930-2002
13 https://en.wikipedia.org/wiki/Cap_theorem
discussed paper on software engineering written by Fred Brooks in 1986. Brooks argues that
“there is no single development, in either technology or management technique, which by itself
promises even one order of magnitude [tenfold] improvement within a decade in productivity, in
reliability, in simplicity.” He also states that “we cannot expect ever to see two-fold gains every two
years” in software development, as there are in hardware development (Moore’s law).
Source: http://valbonne-consulting.com/papers/classic/Frederick_Brooks_87No_Silver_Bullet_Essence_and_Accidents_of_Software_Engineering.pdf
Fredkin and Toffoli 1982 – Conservative Logic
The 1982 paper, “Conservative Logic” [26 pages], presents a comprehensive model of computation
which explicitly reflects a number of fundamental principles of physics, such as the reversibility of
the dynamical laws and the conservation of certain additive quantities (among which energy plays a
distinguished role). Because it more closely mirrors physics than traditional models of computation,
conservative logic is in a better position to provide indications concerning the realization of
high-performance computing systems, i.e., of systems that make very efficient use of the
“computing resources” actually offered by nature. In particular, conservative logic shows that it is
ideally possible to build sequential circuits with zero internal power dissipation.
Source: http://valbonne-consulting.com/papers/classic/Fredkin_and_ToffoliConservative_Logic.pdf
C.A.R. Hoare 1978 – Communicating Sequential Processes
C. A. R. Hoare, in 1978, first introduced the idea of “Communicating Sequential Processes” (CSP)
[260 pages], a formal language for describing patterns of interaction in concurrent systems. CSP is a
member of the family of mathematical theories of concurrency known as process algebras, or
process calculi, based on message passing via channels. CSP was highly influential in the design of
the occam programming language, and also influenced the design of programming languages such
as Limbo and Go.
Source: http://valbonne-consulting.com/papers/classic/Hoare_04Communicating_Sequential_Processes.pdf
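The channel idea at the core of CSP can be loosely approximated with threads and a blocking queue. This is only a sketch (real CSP channels are synchronous rendezvous points; Python's `queue.Queue` merely blocks when full or empty):

```python
import threading
import queue

def producer(channel, items):
    for item in items:
        channel.put(item)       # "send": blocks while the channel is full
    channel.put(None)           # sentinel marking end of stream

def consumer(channel, results):
    while True:
        item = channel.get()    # "receive": blocks until a message arrives
        if item is None:
            break
        results.append(item * 2)

def run_pipeline(items):
    channel, results = queue.Queue(maxsize=1), []
    threads = [threading.Thread(target=producer, args=(channel, items)),
               threading.Thread(target=consumer, args=(channel, results))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

The two processes share no variables; all interaction flows through the channel, which is the discipline that occam and, later, Go's goroutines and channels adopted.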
IEEE 1471/2000 – A recommended practice for architectural
description of software intensive systems
Not a paper per se but a very complete technical standard, the “IEEE Recommended Practice for
Architectural Description of Software-Intensive Systems” [23 pages], describes the fundamental
organization of a system, embodied in its components, their relationships to each other and to the
environment, and the principles governing its design and evolution.
Source: http://valbonne-consulting.com/papers/classic/IEEE1471-2000A_recommended_practice_for_architectural_description_of_software_intensive_systems.pdf
14 http://valbonne-consulting.com/pioneers/290-fred-brooks-1931
15 http://valbonne-consulting.com/pioneers/287-tony-hoare-1934
Kendall et al 1994 – A Note on Distributed Computing
The 1994 classic, “A Note on Distributed Computing” [14 pages], argues that objects that interact in
a distributed system need to be dealt with in ways that are intrinsically different from objects that
interact in a single address space. These differences are required because distributed systems require
that the programmer be aware of latency, have a different model of memory access, and take into
account issues of concurrency and partial failure. The paper looks at a number of distributed
systems that have attempted to paper over the distinction between local and remote objects, and
shows that such systems fail to support basic requirements of robustness and reliability. These
failures have been masked in the past by the small size of the distributed systems that have been
built. In the enterprise-wide distributed systems foreseen in the near future, however, such masking
will be impossible. It concludes by discussing what is required of both systems-level and application-level
programmers and designers if one is to take distribution seriously.
Source: http://valbonne-consulting.com/papers/classic/Kendall_94A_Note_on_Distributed_Computing.pdf
Ken Thompson 1984 – Reflections on Trusting Trust
“Reflections on Trusting Trust” [8 pages], by Ken Thompson, is an old and respected paper from
1984 about compilers that teaches us a lot about security architecture. Thompson showed how he
could insert a backdoor into the compiler so that even if your code is safe, after being compiled it
will be back-doored. While his paper is about compilers, the concept is trust. How far
can you trust anything? How far can what you trust, in turn, trust anything further down the line? If
you write your own programs, then you can be reasonably sure they have no backdoor. Do you also
write your own compiler? How about the operating system? The motherboard? The CPU? There’s
no end to trust. No matter how paranoid you are, eventually you have to take a leap of faith.
Source: http://valbonne-consulting.com/papers/classic/Ken_Thompson_84Reflections_on_Trusting_Trust.pdf
Lamport 1978 – Time, Clocks and the ordering of Events in a
Distributed System
Leslie Lamport is probably best known for the LaTeX typesetting macro package and its
accompanying book, but arguably his most important contributions are in the domain of distributed
systems, among them “Time, Clocks, and the Ordering of Events in a Distributed System” [8 pages].
The paper suggests thinking about distributed systems in terms of state machines, an idea that most
people who read the paper seem not to notice. Almost any system can be described as a state
machine, so an easy way to implement a system is to describe it as a state machine and use a
general algorithm for implementing state machines.
Source: http://valbonne-consulting.com/papers/classic/Lamport_78_Time_Clocks_and_the_ordering_of_Events_in_a_Distributed_System.pdf
16 http://valbonne-consulting.com/pioneers/317-ken-thompson-1943
17 http://valbonne-consulting.com/pioneers/297-leslie-lamport-1941
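The paper's logical clocks are tiny to implement. A minimal sketch (the class and method names are our own): each process increments a counter on every event, stamps outgoing messages, and fast-forwards past any larger timestamp it receives, so timestamps respect the happened-before ordering.

```python
class Process:
    """One process's Lamport logical clock."""

    def __init__(self):
        self.clock = 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        self.clock += 1
        return self.clock            # timestamp carried on the message

    def receive(self, timestamp):
        # Jump past the sender's clock so the receive is ordered after the send.
        self.clock = max(self.clock, timestamp) + 1
        return self.clock
```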
Lamport 1982 – The Byzantine Generals Problem
“The Byzantine Generals Problem” [20 pages] describes an agreement protocol built around an
imaginary General who makes a decision to attack or retreat, and who must communicate his
decision to his lieutenants. Lamport framed the paper around a story problem after observing what
he felt was an inordinate amount of attention received by Dijkstra’s Dining Philosophers problem.
A given number of the actors are traitors (possibly including the General). Traitors cannot be relied
upon to properly communicate orders; worse yet, they may actively alter messages in an attempt to
subvert the process.
Source: http://valbonne-consulting.com/papers/classic/Lamport_82The_Byzantine_Generals_Problem.pdf
Barbara Liskov 1972 – Design Methodology for Reliable
Software Systems
The 1972 paper by Barbara Liskov, “A Design Methodology for Reliable Software Systems” [10
pages], argues that an entire system should be partitioned: there is no global state, each partition
owns some part of the state and exposes operations, and the only way of interacting with that state
is by calling operations on the modules.
Source: http://valbonne-consulting.com/papers/classic/Liskov_72Design_Methodology_for_Reliable_Software_Systems.pdf
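That discipline maps directly onto what we now call encapsulation. A minimal sketch (the `Inventory` partition is an invented example): the partition owns its slice of state, and other modules can touch it only through the operations it exposes.

```python
class Inventory:
    """A partition in Liskov's sense: it owns its state and exposes operations."""

    def __init__(self):
        self._stock = {}                 # owned state, not visible globally

    def add(self, item, count):
        self._stock[item] = self._stock.get(item, 0) + count

    def remove(self, item, count):
        if self._stock.get(item, 0) < count:
            raise ValueError("insufficient stock")
        self._stock[item] -= count

    def level(self, item):
        return self._stock.get(item, 0)
```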
Martin Fowler 2003 – Who Needs an Architect?
Martin Fowler proposes in “Who Needs an Architect?” [3 pages] that software is not limited by
physics, like buildings are. It is limited by imagination, by design, by organization. In short, it is
limited by properties of people, not by properties of the world. “We have met the enemy, and he is
us.” As Fowler mentions in his article, once you have designed a building it is very difficult to
change the basement; on the other hand, if you have layered your software well and applied the
relevant design patterns, you can still make shifts to your foundations without impacting the entire
software stack! A must-read, not just for software architects.
Source: http://valbonne-consulting.com/papers/classic/Martin_Fowler_2003Who_Needs_an_Architect.pdf
Merkle 1979 – Security, Authentication and Public Key
Systems
Merkle’s 1979 “Security, Authentication and Public Key Systems” [193 pages] is a must-read for
anyone working in the security field. Merkle co-invented the Merkle–Hellman knapsack
cryptosystem, invented cryptographic hashing (now called the Merkle–Damgård construction based
on a pair of articles published 10 years later that established the security of the scheme), and
invented Merkle trees. While at Xerox PARC, Merkle designed the Khufu and Khafre block
ciphers, and the Snefru hash function.
Source: http://valbonne-consulting.com/papers/classic/Merkle_1979Security_Authentication_and_Public_Key_Systems.pdf
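A Merkle tree commits to a list of items with a single hash: leaves are hashed, then adjacent pairs are hashed together until one root remains. The sketch below assumes SHA-256 and the common duplicate-last-node convention for odd levels; both are later conveniences, not choices from Merkle's text.

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Reduce the hashed leaves pairwise until a single 32-byte root remains."""
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # odd level: duplicate the last node
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

Changing any leaf changes the root, and membership of one leaf can be proven with only O(log n) sibling hashes, which is why the structure reappears in Git, Bitcoin and certificate transparency.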
Moseley and Marks 2006 – Out of the Tar Pit
The 2006 paper “Out of the Tar Pit” [66 pages], by Moseley and Marks, proposes the Functional
Relational Programming (FRP) approach to address out-of-control complexity in software. You can
witness many of its ideas actually implemented (at least partly) in the Clojure language and some of
its libraries.
Source: http://valbonne-consulting.com/papers/classic/Moseley_and_Marks_06Out_of_the_Tarpit.pdf
Naur and Randell 1968 – Software Engineering Report of a
conference sponsored by the NATO Science Committee
The term “software engineering” was first used in 1968 as the title of the world’s first conference on
software engineering, sponsored and facilitated by NATO. The conference was attended by
international experts on software who agreed on defining best practices for software grounded in the
application of engineering. The result of the conference is a report [136 pages] that defines how
software should be developed, i.e., the foundations of software engineering. The original report is
publicly available. (Naur and Randell, 1968)
Source: http://valbonne-consulting.com/papers/classic/Naur_and_Randell_68Software_Engineering_Report_of_a_conference_sponsored_by_the_NATO_Science_Committee.pdf
Niklaus Wirth 1971 – Program Development by Stepwise
Refinement
Niklaus Wirth’s 1971 paper, “Program Development by Stepwise Refinement” [13 pages], about
the teaching of programming, is considered a classic text in software engineering. Niklaus Wirth is
the inventor of the programming language Pascal. In this paper Wirth shows how to decompose a
task into subtasks in a top-down fashion. Incidentally, it was Wirth who, as editor, gave Dijkstra's
letter the heading "Go To Statement Considered Harmful", which introduced the phrase "considered
harmful" into computing.
Source: http://valbonne-consulting.com/papers/classic/Niklaus_Wirth_71Program_Development_by_Stepwise_Refinement.pdf
Eric S. Raymond 1998 – The Cathedral and the Bazaar
Raymond’s 1998 book, “The Cathedral and the Bazaar” [35 pages], is considered the Bible of the
Open Source movement. It attempts to explain what open source is and to define its motives and
mechanics.
18 http://valbonne-consulting.com/pioneers/312-peter-naur-1928
Source: http://valbonne-consulting.com/papers/classic/Raymond_98-Cathedral_Bazaar.pdf
Robert Martin 1996 – The Open Closed Principle
In his 1996 article, “The Open Closed Principle” [14 pages], Robert Martin elaborates on a
principle whose name Bertrand Meyer is generally credited with originating: in OOP, software
entities (classes, modules, functions, etc.) should be open for extension, but closed for modification.
Source: http://valbonne-consulting.com/papers/classic/Robert_Martin_96The_Open_Closed_Principle.pdf
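The principle is easiest to see in code. In this invented sketch, `report` is closed for modification: supporting a new output format means adding a subclass, never editing existing code.

```python
import json
from abc import ABC, abstractmethod

class Formatter(ABC):
    @abstractmethod
    def render(self, rows): ...

class CsvFormatter(Formatter):
    def render(self, rows):
        return "\n".join(",".join(map(str, row)) for row in rows)

class JsonFormatter(Formatter):          # extension: no existing code changed
    def render(self, rows):
        return json.dumps(rows)

def report(rows, formatter):
    # Closed for modification: works with any present or future Formatter.
    return formatter.render(rows)
```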
Satoshi Nakamoto 2008 – Bitcoin: A Peer to Peer Electronic
Cash System
Satoshi Nakamoto makes the case for Bitcoin in his 2008 paper, “Bitcoin: A Peer-to-Peer
Electronic Cash System” [9 pages]. Satoshi Nakamoto, the pseudonym of an individual or group
behind the creation of Bitcoin, took many ideas from the Cypherpunk movement (see above, David
Chaum, “Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms”). Bitcoin solves
the Byzantine Generals Problem described by Leslie Lamport by adding a random cost to sending
messages, which slows down the rate of message passing to ensure only one camp will be able to
broadcast at a time. The cost it adds is called “proof of work”, and is based on repeatedly computing
a cryptographic hash until the result meets a difficulty target. Bitcoin promises the separation of
banking and state, and is controlled by neither states, banks, nor corporations: it is an international
currency governed by mathematics. But the paper Satoshi published is more than just a description
of an e-currency. Money is just the first application. The Bitcoin blockchain is essentially a network
layer, similar to TCP/IP, that allows you to do transactions between a sender and a recipient while
supporting higher layers built on top of it. The scenario of John paying Sarah 1 bitcoin is executed
via a stack-based protocol reminiscent of Reverse Polish Notation (RPN). This makes the protocol
a highly versatile framework for a variety of transactions that are completely unrelated to
e-currency and built on top of the blockchain protocol. Already we see a host of other ideas coming
forward, such as trust models, escrow or timelocks, and many more, leveraging the protocol.
Source: http://valbonne-consulting.com/papers/classic/Satoshi_Nakamoto_08Bitcoin_A_Peer_to_Peer_Electronic_Cash_System.pdf
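Proof of work itself fits in a few lines: keep hashing the message with an incrementing nonce until the digest clears a difficulty threshold. A toy sketch only (real Bitcoin compares the double-SHA-256 of a block header against a 256-bit target; leading hex zeros stand in for that here):

```python
import hashlib

def proof_of_work(message: bytes, difficulty: int = 2):
    """Find a nonce whose SHA-256(message + nonce) starts with `difficulty`
    zero hex digits. Producing it takes ~16**difficulty hashes on average;
    verifying it takes exactly one."""
    nonce = 0
    while True:
        digest = hashlib.sha256(message + str(nonce).encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1
```

The asymmetry between producing and verifying the nonce is exactly the "random cost" that rate-limits broadcasters in Nakamoto's consensus scheme.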
Paul Graham 2002 – The Roots of Lisp
In his 2002 paper “The Roots of Lisp” [13 pages], Paul Graham (who proposed the idea of
Bayesian anti-spam filtering) provides a refreshing analysis of McCarthy’s classic “Recursive
Functions of Symbolic Expressions and Their Computation by Machine (Part I)”.
Source: http://valbonne-consulting.com/papers/classic/Graham_02-The_Roots_of_Lisp.pdf
19 http://valbonne-consulting.com/pioneers/298-satoshi-nakamoto
20 http://storj.io/
21 http://www.paulgraham.com/rootsoflisp.html
22 http://www-formal.stanford.edu/jmc/recursive.html
Steele and Sussman 1978 – The Art of the Interpreter
Steele and Sussman’s 1978 paper, “The Art of the Interpreter” [75 pages], examines the effects of
various language design decisions on the programming styles available to a user of the language,
with particular emphasis on the ability to incrementally construct modular systems. This influential
paper/essay starts with a description of the history of the Lisp programming language, then
continues by describing a series of language features and how they are motivated by the need to
support modular programming. What made it a classic is the fact that each of these features is
described through a small interpreter implementing it, a so-called metacircular evaluator. The
elegant manner in which this is done is a wonderful demonstration of Steele and Sussman’s
master-hackerdom — more than two decades later, the essay is still a very worthwhile read.
Source: http://valbonne-consulting.com/papers/classic/Steele_and_Sussman_78The_Art_of_the_Interpreter.pdf
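The flavor of those metacircular evaluators can be conveyed in a few lines of Python (the essay's interpreters are in Lisp; this tuple encoding and the `evaluate` function are our own simplification):

```python
import operator

GLOBAL_ENV = {"+": operator.add, "*": operator.mul}

def evaluate(expr, env=GLOBAL_ENV):
    """Tiny evaluator: numbers, symbols, ("lambda", params, body), and calls."""
    if isinstance(expr, (int, float)):
        return expr                      # literals evaluate to themselves
    if isinstance(expr, str):
        return env[expr]                 # symbol: look it up in the environment
    op, *args = expr
    if op == "lambda":                   # ("lambda", ["x"], body) -> closure
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)               # application: evaluate operator, then args
    return fn(*(evaluate(a, env) for a in args))
```

Lexical scoping falls out of the `{**env, ...}` environment extension in the lambda case, and exactly this kind of design decision is what the essay explores, one small interpreter at a time.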
Strachey 1967 – Fundamental Concepts in Programming
Languages
“Fundamental Concepts in Programming Languages” [39 pages], is an influential set of lecture
notes written by Christopher Strachey for the International Summer School in Computer
Programming at Copenhagen in August, 1967. It introduced much programming language
terminology still in use today, including R-values, L-values, parametric polymorphism, and ad hoc
polymorphism.
Source: http://valbonne-consulting.com/papers/classic/Strachey_67Fundamental_Concepts_in_Programming_Languages.pdf
Turner 2004 – Total Functional Programming
Turner’s 2004 paper, “Total Functional Programming” [18 pages], develops the idea that functional
programming should make programming more closely related to mathematics. A program in a
functional language such as Haskell or Miranda consists of equations which are both computation
rules and a basis for simple algebraic reasoning about the functions and data structures they define.
The existing model of functional programming, although elegant and powerful, is compromised to a
greater extent than is commonly recognized by the presence of partial functions. Turner considers a
simple discipline of total functional programming designed to exclude the possibility of
non-termination. Among other things this requires a type distinction between data, which is finite,
and codata, which is potentially infinite.
Source: http://valbonne-consulting.com/papers/classic/Turner_04Total_Functional_Programming.pdf
Saltzer Reed & Clark – End to End Arguments in System
Design
“End-to-End Arguments in System Design” [10 pages], by Saltzer, Reed and Clark, states that the
functionality of a network system should be pushed as close as possible to the endpoint, i.e. the
application that uses it, instead of putting much functionality into the lower levels of the system. If
functions are placed at lower levels of the network, all applications that use the network pay for
those functions, even if they don’t need them. While this paper describes with good arguments the
basics of the network systems used today, it also has its critics. It is interesting to read from the first
source why and how these principles were invented and became widely adopted.
Source: http://valbonne-consulting.com/papers/classic/Saltzer_Reed_and_ClarkEnd_to_End_Arguments_in_System_Design.pdf
***
Did we forget your favorite paper? Please let us know. Thanks!
23 http://maestro.ee.unsw.edu.au/~timm/pubs/02icc/published.pdf