A Digital Timeline
A History of Digital Technology
Beginnings to 1900
Compiled by Skip Schiel
(added April 24, 2002)
(revised December 12, 2009)
An attempt at charting the trajectory of digital technology, with special attention to graphical applications. Comments solicited, corrections gladly considered, links and images most graciously desired. (Special note: those attributed as inventors or creators were more often than not joined by many others, some named, some not. And dates are often only approximations.)
3000 BCE Abacus
The name Abacus derives from the
Greek word abax, meaning table or
board covered with dust. The origins
of the Abacus are buried deep in the
history of mankind. It is known that
in its 'modern' form it appeared in
China in the 13th century AD.
Logarithms, "Napier’s
bones," multiplication
1550-1617 tables on a stick
John Napier
Nearing the end of his life, John
Napier, who is generally considered
the inventor of logarithms, developed
an ingenious arithmetic trick— not as
remarkable as logarithms, but very
useful all the same. His invention was
a method for performing arithmetic
operations by the manipulation of
rods, called “bones” because they
were often constituted from bones
and printed with integers. Napier’s
rods essentially rendered the complex
processes of multiplication and
division into the comparatively
simple tasks of addition and
subtraction.
—Alexandros Diploudis
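To make the principle concrete, here is a minimal sketch in Python (names are illustrative): each rod supplies the single-digit multiples of the multiplicand, so the user only has to add shifted partial products.

```python
# A toy sketch of the idea behind Napier's rods: each rod lists the
# single-digit multiples of a number, so multiplication reduces to
# looking up partial products and adding them with the right shifts.
def napier_multiply(a: int, b: int) -> int:
    total = 0
    for place, digit in enumerate(reversed(str(b))):
        partial = a * int(digit)          # read one rod: a times one digit
        total += partial * 10 ** place    # shift and add, as the user would
    return total

print(napier_multiply(46785, 96))   # 4491360, same as 46785 * 96
```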
1592-1635
A machine for adding, subtracting, multiplying and dividing
Wilhelm Schickard
Schickard wrote that he had built a
machine that "...immediately
computes the given numbers
automatically; adds, subtracts,
multiplies, and divides".
Unfortunately, no original copies of
Schickard's machine exist, but
working models have been
constructed from his notes.
—Bebop BYTES Back
(An Unconventional Guide to
Computers)
1644
Pascaline (a mechanical calculator)
Blaise Pascal
1650
Slide Rule
Edmund Gunter and William Oughtred
1679
The differential calculus & a machine to multiply
Gottfried Wilhelm Leibniz
A mechanism to add & subtract with
8 figures and carrying of 10's, 100's,
and 1000's etc.
The first Slide Rule appeared in 1650
and was the result of a joint effort of
two Englishmen, Edmund Gunter and
the Reverend William Oughtred. This
slide rule based on Napier's
logarithms was to become the first
analog computer (of the modern
ages) since multiplication and
subtraction were figured out by
physical distance. This invention was
dormant until 1850 when a French
Artillery officer Amedee Mannheim
added the movable double sided
cursor, which gave it its appearance
as we know it today.
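The principle is worth spelling out: the scales are logarithmic, so laying two physical lengths end to end adds logarithms, which multiplies the underlying numbers. A short Python check of the idea:

```python
import math

# Sliding one logarithmic scale along another adds lengths; since each
# length is log10 of a number, adding the lengths multiplies the numbers.
a, b = 3.0, 7.0
length = math.log10(a) + math.log10(b)   # the combined physical distance
print(10 ** length)                      # ~21.0 (within floating-point error)
```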
He improved the Pascaline by
creating a machine that could also
multiply. Like its predecessor,
Leibniz's mechanical multiplier
worked by a system of gears and
dials.
Joseph Marie Jacquard's inspiration
of 1804 revolutionized patterned
textile weaving. For the first time,
fabrics with big, fancy designs could
be woven automatically by one man
working without assistants...
1804
Power loom with an automatic card reader
Joseph Marie Jacquard
This was the earliest use of punched
cards programmed to control a
manufacturing process. Although he
created his mechanism to aid the local
silk industry, it was soon applied to
cotton, wool, and linen weaving. It
appeared in the United States about
1825 or 1826.
—Steven E. Schoenherr
1820
Arithmometer (mass-produced mechanical calculator)
Thomas de Colmar
The honor of first establishing the
manufacture of calculating machines
as an industry goes to Charles Xavier
Thomas of Colmar, France, or
Thomas de Colmar, as he is more
commonly known. Like others,
Thomas used the stepped cylinder
invented by Leibniz as his digital-value actuator.
—George C. Chase
A mechanical digital computer which,
viewed with the benefit of a century
and a half's hindsight, anticipated
virtually every aspect of present-day
computers.
1822
Difference & analytic engines
Charles Babbage
His subsequent invention, the analytic
engine, inspired by Jacquard’s
punched cards, used a store, a mill,
and an output device (automated typesetter).
— John Walker
A biography of Charles Babbage
(Thanks to Jane Matthews)
1830 Telegraph
Samuel F.B. Morse & Joseph
Henry
1839
Photography
Talbot, Niépce, & Daguerre
Electrical signals encode information,
dots & dashes, to form letters and
words.
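As a toy illustration of that encoding (a few letters of the later International Morse alphabet, which postdates Morse's original American code), turning a word into signals is a simple table lookup:

```python
# A toy telegraphic encoder: letters map to dot-and-dash patterns.
# (Small excerpt of the International Morse alphabet.)
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

def encode(word: str) -> str:
    return " ".join(MORSE[letter] for letter in word.upper())

print(encode("sos"))   # ... --- ...
```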
Silver salts, converted to free silver by
light and chemicals, co-discovered by
William Henry Fox Talbot, Joseph
Nicéphore Niépce, & Louis Jacques
Mandé Daguerre
1843
Programs & subroutines for
the Analytic Engine
Ada Augusta Byron, aka Lady
Lovelace
She suggested to Babbage writing a
plan for how the Engine might
calculate Bernoulli numbers. This
plan is now regarded as the first
"computer program." A software
language developed by the U.S.
Department of Defense was named
"Ada" in her honor in 1979.
—Dr. Betty Toole
1854
The Calculus of Logic: algebra from logic, truth tables
George Boole
In a work lately published I have
exhibited the application of a new and
peculiar form of Mathematics to the
expression of the operations of the
mind in reasoning...
The part of the system to which I shall
confine my observations is that which
treats of categorical propositions...
—George Boole Cambridge and Dublin
Mathematical Journal, Vol. III (1848),
pp. 183-98
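A minimal sketch of Boole's insight in Python: over the truth values 0 and 1, AND behaves like multiplication and OR like capped addition, which is exactly what lets reasoning be treated as algebra.

```python
# Boole's insight in miniature: logical operations obey algebraic laws
# over the truth values 0 and 1.
for p in (0, 1):
    for q in (0, 1):
        print(f"p={p} q={q}: AND={p * q} OR={min(p + q, 1)} NOT p={1 - p}")
```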
1866
Typewriter
Christopher Latham Sholes, Carlos Glidden, and others
While developing a machine for
numbering book pages, they were
inspired to build a machine that could
print words as well as numbers.
1876
Telephone
Alexander Graham Bell
In Boston, Massachusetts, Alexander
Graham Bell invented the telephone.
Thomas Watson fashioned the device
itself; a crude thing made of a
wooden stand, a funnel, a cup of acid,
and some copper wire. But these
simple parts and the equally simple
first telephone call —"Mr. Watson,
come here, I want you!" — belie a
complicated past.
—Tom Farley
1877
Phonograph
Thomas Edison
The device consisted of a cylindrical
drum wrapped in tinfoil and mounted
on a threaded axle. A mouthpiece
attached to a diaphragm was
connected to a stylus that etched
vibrational patterns from a sound
source on the rotating foil. For
playback the mouthpiece was
replaced with a "reproducer" that
used a more sensitive diaphragm.
Edison recited "Mary Had a Little
Lamb" into the mouthpiece for the
first demonstration.
—Geoffrey Rubinstein
1890
Punch card reader & tabulating machine
Herman Hollerith at MIT
A punch-card tabulation machine system that revolutionized statistical computation, used during the 1890 US census
1895
Cinema
Auguste and Louis Lumière & Thomas Edison

1895
Radio
Guglielmo Marconi
Lumière's portable, suitcase-sized
cinematographe served as a camera,
film processing unit, and projector all
in one. He could shoot footage in the
morning, process it in the afternoon,
and then project it to an audience that
evening. His first film was the arrival of the express train at La Ciotat. Other
subjects included workers leaving the
factory gates, a child being fed by his
parents, people enjoying a picnic
along a river.
Radio—signaling and audio
communication using
electromagnetic radiation—was first
employed as a "wireless telegraph",
for point-to-point links where regular
telegraph lines were unreliable or
impractical. Next developed was
radio's ability to broadcast messages
simultaneously to multiple locations,
at first using the dots-and-dashes of
telegraphic code, and later in full
audio.
Sound cinema
Thomas Edison
"Edison invented
the motion
pictures as a
supplement to his
phonograph, in
the belief that
sound plus a
moving picture
would provide
better
entertainment
than sound alone.
But in a short time
the movies proved
to be good enough
entertainment
without sound. It
has been said that
although the
motion picture
and the
phonograph were
intended to be
partners, they
grew up
separately. And it
might be added
that the motion
picture held the
phonograph in
such low esteem
that for years it
would not speak.
Throughout the
long history of
efforts to add
sound, the success
of the silent movie
was the great
obstacle to
commercialization
of talking
pictures."
—Edward W.
Kellog ,June 1955,
Journal of the
SMPTE
Could there exist, at least in principle,
a definite method or process by which
it could be decided whether any given
mathematical assertion was
provable?
1936
The Turing Machine
Alan Turing
To answer such a question needed a
definition of 'method' which would be
not only precise but compelling. This
is what Turing supplied. He analysed
what could be achieved by a person
performing a methodical process, and
seizing on the idea of something done
'mechanically', expressed the analysis
in terms of a theoretical machine able
to perform certain precisely defined
elementary operations on symbols on
paper tape. He presented convincing
arguments that the scope of such a
machine was sufficient to encompass
everything that would count as a
'definite method.' Daringly he
included an argument based on the
transitions between 'states of mind' of
a human being performing a mental
process.
— Andrew Hodges
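To make "precisely defined elementary operations on symbols on paper tape" concrete, here is a minimal Turing machine simulator in Python. The rule table is an invented toy example (it appends a 1 to a unary string), not one of Turing's own.

```python
# A minimal Turing machine: a rule table maps (state, symbol) to
# (symbol to write, head move, next state) over an unbounded tape.
def run(rules, tape, state="start", pos=0, max_steps=1000):
    cells = dict(enumerate(tape))              # sparse two-way-infinite tape
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")           # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Toy rule table (an illustration, not Turing's example): append a 1.
rules = {
    ("start", "1"): ("1", "R", "start"),   # scan right over the 1s
    ("start", "_"): ("1", "R", "halt"),    # write one more 1, then stop
}
print(run(rules, "111"))   # 1111
```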
The Atanasoff-Berry Computer was
the world's first electronic digital
computer. It was built by John
Vincent Atanasoff and Clifford Berry
at Iowa State University during 1937-42. It incorporated several major
innovations in computing including
the use of binary arithmetic,
regenerative memory, parallel
processing, and separation of
memory and computing functions.
1937
Digital computer
John Vincent Atanasoff & Clifford Berry at Iowa State University
—Department of Computer Science,
Iowa State University
Enigma is used to scramble all of
Germany's most top-secret
communications. It is the most
advanced cipher ever designed and
was, until now, thought unbreakable.
1940
Breaking a German code,
the Enigma
Alan Turing
Enigma M3
In 1936 Turing published a mathematical paper entitled On Computable Numbers in which he
introduced the theory of so-called
Universal Turing Machines,
mechanical devices capable of being
configured in order to tackle any
mathematical problem imaginable.
Turing used this ingenious concept to
create precisely configurable large
machines called "bombes" capable of
applying the enormous amount of
mathematical effort required to break
the enigma code by brute force.
1941
Television
Television came into being based on
the inventions and discoveries of
many men and scientists. The 'first'
generation of television sets were not
entirely electronic. The display (TV
screen) had a small motor with a
spinning disc and a neon lamp, which
worked together to give a blurry
reddish-orange picture about half the
size of a business card!
—www.tvhistory.tv/pre-1935.htm
1941
Digital computer (Z3)
Konrad Zuse
Konrad Zuse created the first fully automatic, program-controlled, freely programmable computer working in binary floating-point arithmetic. The Z3 was finished in 1941.
—Professor Dr. Friedrich L. Bauer
1943
Entirely electronic computer
(COLOSSUS)
Max Newman & Tommy Flowers
Colossus reduced the time to break
Lorenz messages from weeks to
hours. It was just in time for the
deciphering of messages which gave
vital information to Eisenhower and
Montgomery prior to D-Day. These
deciphered Lorenz messages showed
that Hitler had swallowed the
deception campaigns, the phantom
army in the South of England, the
phantom convoys moving east along
the channel; that Hitler was
convinced that the attacks were
coming across the Pas de Calais and
that he was keeping Panzer divisions
in Belgium. After D-day the French
resistance and the British and
American Air Forces bombed and
strafed all the telephone and
teleprinter land lines in Northern
France, forced the Germans to use
radio communications and suddenly
the volume of intercepted messages
went up enormously.
—Tony Sale
1944
Stored program, sort and
merge operations
John Louis von Neumann
Von Neumann's interest in computers
differed from that of his peers by his
quickly perceiving the application of
computers to applied mathematics for
specific problems, rather than their
mere application to the development
of tables. During the war, von
Neumann's expertise in
hydrodynamics, ballistics,
meteorology, game theory, and
statistics, was put to good use in
several projects. This work led him to
consider the use of mechanical devices
for computation, and although the
stories about von Neumann imply
that his first computer encounter was
with the ENIAC, in fact it was with
Howard Aiken's Harvard Mark I
(ASCC) calculator.
—J. A. N. Lee
1944
Relay-based computer
(MARK 1)
Howard Aiken at Harvard-IBM
The Mark I was constructed out of
switches, relays, rotating shafts, and
clutches, and was described as
sounding like a "roomful of ladies
knitting." The machine contained
more than 750,000 components, was
50 feet long, 8 feet tall, and weighed
approximately 5 tons!
—Bebop BYTES Back
(An Unconventional Guide to
Computers)
The world's first electronic digital
computer was developed by Army
Ordnance to compute World War II
ballistic firing tables.
1946
ENIAC (electronic
numerical integrator and
computer)
John W. Mauchly and J. P.
Eckert, Jr. at University of
Pennsylvania
By today's standards for electronic
computers the ENIAC was a
grotesque monster. Its thirty separate
units, plus power supply and forced-air cooling, weighed over thirty tons.
Its 19,000 vacuum tubes, 1,500
relays, and hundreds of thousands of
resistors, capacitors, and inductors
consumed almost 200 kilowatts of
electrical power.
But ENIAC was the prototype from
which most other modern computers
evolved. It embodied almost all the
components and concepts of today's
high- speed, electronic digital
computers. Its designers conceived
what has now become standard
circuitry such as the gate (logical
"and" element), buffer (logical "or"
element) and used a modified Eccles-Jordan flip-flop as a logical, high-speed storage-and-control device. The
machine's counters and
accumulators, with more
sophisticated innovations, were made
up of combinations of these basic
elements.
—Martin H. Weik
1948
Transistor
Bardeen, Shockley, & Brattain
William Shockley and Walter
Brattain had both been working with
semiconductors since the early 1930’s,
and in 1939, Shockley had an idea, to
use a piece of copper screen in a piece
of semi-conducting material.
Although that particular experiment
failed, in 1940 Russell Ohl accidentally discovered the silicon p-n junction at Bell Labs.
—Shelley A. Steiner
1951
Business computer (UNIVAC 1)
John W. Mauchly and J. P. Eckert, Jr. at University of Pennsylvania
The first UNIVAC computer was
delivered to the Census Bureau in
June 1951. Unlike the ENIAC, the
UNIVAC processed each digit serially.
But its much higher design speed
permitted it to add two ten-digit
numbers at a rate of almost 100,000
additions per second. Internally, the
UNIVAC operated at a clock
frequency of 2.25 MHz, which was no
mean feat for vacuum tube circuits.
The UNIVAC also employed mercury
delay-line memories. Delay lines did
not allow the computer to access
immediately any item of data held in its
memory, but given the reliability
problems of the alternative Cathode
Ray Tube (CRT) technology, this was
a good technical choice.
—University of Pennsylvania Library
1953
Transistorized computer
Tom Watson at IBM
Tom Watson, Jr., led IBM to
introduce the model 604 computer, its
first with transistors, that became the
basis of the model 608 of 1957, the
first solid-state computer for the
commercial market. Transistors were
expensive at first, costing $8 vs. $.75 for a vacuum tube.
—Steven E. Schoenherr
1955
TRADIC—a fully
transistorized computer
Bell Labs
TRADIC stands for TRAnsistor DIgital Computer, and as the name
suggests this was the first machine to
use all transistors and diodes and no
vacuum tubes. It was built by Bell
Labs for the U.S. Air Force, which was
interested in the lightweight nature of
such a computer for airborne use. The
machine consisted of 700 point-contact transistors and 10,000
germanium diodes. During two years
of continuous operation only 17 of
these devices failed, a vastly lower
failure rate than vacuum tube
machines of the time.
— Tom Howe
1958
Integrated circuit
Jack Kilby at Texas Instruments
It was a relatively simple device that
Jack Kilby showed to a handful of co-workers gathered in TI's
semiconductor lab more than 40
years ago -- only a transistor and
other components on a slice of
germanium. Little did this group of
onlookers know, but Kilby's invention,
7/16-by-1/16-inches in size and called
an integrated circuit, was about to
revolutionize the electronics industry.
—Texas Instruments
1959
Modem
Bell Labs
The first development efforts on digital modems appear to have stemmed from the need to transmit data for North American air defense during the 1950s.
As a graduate student in electrical
engineering at UC Berkeley after
World War II Doug Engelbart began
to imagine ways in which all sorts of
information could be displayed on the
screens of cathode ray tubes like the
ones he had used as a radar
technician during the war, and he
dreamed of "flying" through a variety
of information spaces.
—MouseSite
1963
Mouse
Doug Engelbart at Stanford Research Institute
1967
Hypertext editing system (HTML)
Andy van Dam & Tim Berners-Lee
1968
Random Access Memory
(RAM)
Robert Dennard
The idea behind HTML was a modest
one. When Tim Berners-Lee was
putting together his first elementary
browsing and authoring system for
the Web, he created a quick little
hypertext language that would serve
his purposes. He imagined dozens, or
even hundreds, of hypertext formats
in the future, and smart clients that
could easily negotiate and translate
documents from servers across the
Net. It would be a system similar to
Claris XTND on the Macintosh, but
would work on any platform and
browser.
—Jeffrey Veen
At that time, RAM was a known and
used concept: memory reserved for
writing to and reading from in a
temporary fashion, to be erased every
time the computer is turned off.
However, in the mid-1960s RAM
required an elaborate system of wires
and magnets that negated in practice
RAM's theoretical efficiency.
Dennard's revolutionary achievement was to reduce RAM to a memory cell on a single transistor. His key insight was that it should be possible to store binary data as a positive or negative charge on a capacitor. After several months of experimenting, Dennard had reduced his RAM cell to a single field-effect transistor and a data line that both wrote and read the charge in a small capacitor. The ultimate effect of Dennard's invention was that a single chip could hold 16 million RAM cells.
—The Lemelson-MIT Program's Invention Dimension

Mini-computer
Ken Olsen at Digital Equipment Corporation

1969
Internet
Department of Defense
The DEC PDP-8 computer, introduced on March 22, 1965, is generally recognized as the most important small computer of the 1960s. It was the least expensive
parallel general purpose computer on
the market, the first computer sold on
a retail basis, and the first parallel
general purpose digital computer sold
in a table-top configuration.
—Douglas W. Jones
The global Internet's progenitor was
the Advanced Research Projects
Agency Network (ARPANET) of the
U.S. Department of Defense. This is
an important fact to remember...
—Michael Hauben
1971
Unix
Bell Labs

Floppy disk
IBM

Microprocessor
Gilbert P. Hyatt & Ted Hoff at Intel
The Creation of the UNIX* Operating
System
After three decades of use, the UNIX*
computer operating system from Bell
Labs is still regarded as one of the
most powerful, versatile, and flexible
operating systems (OS) in the
computer world. Its popularity is due
to many factors, including its ability
to run a wide variety of machines,
from micros to supercomputers, and
its portability -- all of which led to its
adoption by many manufacturers.
Like another legendary creature
whose name also ends in 'x,' UNIX
rose from the ashes of a multi-organizational effort in the early
1960s to develop a dependable
timesharing operating system.
—www.bell-labs.com/history/unix/
Floppy disk drives were originally
introduced commercially as a read-only device to hold microcode and
diagnostics for large IBM mainframe
computer systems in the early 1970s.
—Accurite Technologies Inc
In 1969, a Japanese firm called
Busicom contacted Intel about
developing custom chips for its new
desktop-printing calculator. Hoff
thought there was a better, simpler
way to develop the technology than
what the Japanese were initially
looking for. Rather than build 12
customized calculator chips, each
with a single specific function, Hoff
proposed that Intel develop a more
universal CPU chip [central processing unit] that could also run the calculator. The idea of a CPU on a
chip had been around since the early
1960s but had not been feasible then.
But Fairchild and Rockwell had both
done some preliminary work in the
area and Hoff thought he could make
it work.
—Linda Stranahan
1974
Graphical user interface
Xerox
1975
Altair personal computer
Ed Roberts at Micro Instrumentation Telemetry Systems (MITS)

Programming language—Beginner's All-purpose Symbolic Instruction Code (BASIC)
A commercial version by Bill Gates & Paul Allen
The history of graphical user
interfaces (GUIs) goes back to the
1970s. Project Smalltalk was
established at Xerox Palo Alto
Research Center (Parc) which
attempted to look into the future. The
idea was to assume that in the future
computing power would be abundant
and inexpensive. How could the best
use be made of the power available?
Two influential developments
resulted: object-oriented
programming and the graphical user
interface.
—Alistair D. N. Edwards
Altairs were originally "Hobbyist"
computers and have their roots in
kits. They helped define the "personal"
in Personal Computers. These
machines were part of an open
architecture concept that later made
the PC successful. The S-100 bus
allowed Altairs to be expanded and
created opportunities for other
companies to form.
—William Thomas Sanderson
Bill Gates: "We realized things were
starting to happen. Just because, we
had the vision for a long time of
where this chip could go, what it
could mean….. that didn't mean the
industry was going to wait for us
while I stayed and finished my degree
at Harvard."
Paul Allen: "So, I called up Ed and
[said: we have] this basic
[interpreter] and... it's not that far
from being done, and we would like to
come out and show it to you."
Bill Gates: "So we created this basic
interpreter. Paul took the paper tape
and flew out. In fact, the night before
he got some sleep while I double-checked everything to make sure that
we had it all right."
1976
Word processor (Electric Pencil)
Michael Schrayer

Apple computers
Steven Jobs & Steven Wozniak

1978
Network intercommunication—Transmission Control Protocol/Internet Protocol (TCP/IP)
At that time, in the CP/M world, the Electric Pencil was the word processor of the day. I took care to
contact Dave Schrayer, author of
Electric Pencil and asked if I could use
the same "dot" commands for printer
formatting. This way, Electric Pencil
users would already know the
commands if they decided to go to
EasyWriter. Or go with Electric Pencil
if they had to work in CP/M.
—Webcrunchers International
Wozniak had been dabbling in
computer-design for some time when,
in 1976, he designed what would
become the Apple I. Jobs, who had an
eye for the future, insisted that he and
Wozniak try to sell the machine, and
on April 1, 1976, Apple Computer was
born.
—Glen Sanford
As time passed many enhancements
were made to the existing protocol but
by 1973 it was clear that [the first
network] was unable to handle the
volume of traffic passing through it...
The TCP/IP and gateway architecture
was proposed in 1974. This protocol
was to be independent of the
underlying network and computer
hardware as well as having universal
connectivity throughout the network.
This would enable any kind of
platform to participate in the
network. In 1981 a series of requests
for comment was
issued, standardising the TCP/IP
version 4 for the Arpanet.
—PeteDotCom
Spreadsheet program
(VISICALC)
Dan Bricklin & Bob
Frankston
Laser printer
Xerox
The idea for the electronic
spreadsheet came to me while I was a
student at the Harvard Business
School, working on my MBA degree,
in the spring of 1978. Sitting in
Aldrich Hall, room 108, I would
daydream. "Imagine if my calculator
had a ball in its back, like a mouse..."
(I had seen a mouse previously, I
think in a demonstration at a
conference by Doug Engelbart, and
maybe the Alto). And "..imagine if I
had a heads-up display, like in a
fighter plane, where I could see the
virtual image hanging in the air in
front of me. I could just move my
mouse/keyboard calculator around,
punch in a few numbers, circle them
to get a sum, do some calculations,
and answer '10% will be fine!'" (10%
was always the answer in those days
when we couldn't do very complicated
calculations...)
—Dan Bricklin
The original laser printer was
developed at the Xerox Palo Alto
Research Center. Xerox engineer Gary Starkweather adapted Xerox copier technology, adding a laser
beam to it to come up with the laser
printer.
—Mary Bellis
1979
Atari microcomputer
Steve Mayer and Ron Milner

Unix User Network (Usenet)
Tom Truscott, Jim Ellis, & Steve Bellovin

Mouse with computer—Xerox Star
Atari is most known for its
innovations in video game
technology. But a wealth of computer
products and technologies were
pioneered by Atari. In 1979 Atari Inc.
showcased its first computer product
at the Winter Consumer Electronics
show.
From that point on Atari created
innovative 8 bit computers which
were manufactured and supported up
until 1992!
Usenet came into being in late 1979,
shortly after the release of V7
UNIX with UUCP. Two Duke
University grad students in North
Carolina,
Tom Truscott and Jim Ellis, thought
of hooking computers together to
exchange information with the UNIX
community. Steve Bellovin, a grad
student at the University of North
Carolina, put together the first
version of the news software using
shell scripts and installed it on
the first two sites: "unc" and "duke."
—Mark Moraes
Star was designed as an office
automation system. The idea was that
professionals in a business or
organization would have
workstations on their desks and
would use them to produce, retrieve,
distribute, and organize
documentation, presentations,
memos, and reports. All of the
workstations in an organization
would be connected via Ethernet and
would share access to file servers,
printers, etc.
—Jeff Johnson and Teresa L. Roberts
et al
1980
WordPerfect
Satellite Software & Corel

1981
IBM PC with DOS & Intel

1981
Portable computer—Osborne 1
Adam Osborne
WordPerfect originated in the days
when top-of-the-line printers were
daisywheel impact devices requiring
manual intervention to change fonts,
and when on-screen displays were
restricted to a single monospaced
font. Particularly flexible dot-matrix
printers included half a dozen fonts.
—Rod Smith
In the early part of 1980, IBM decided
to create a microcomputer (up to this
date, IBM produced only mini and
mainframes). They didn't really know what they wanted and they didn't
think for one second that producing
microcomputers was a profitable
business (who would have thought!)!
—OldComputers.com
Introduced at the West Coast
Computer Faire in 1981, the Osborne 1
was the brain child of Adam Osborne,
a computer columnist, writer, and
engineer. It was co-developed with Lee Felsenstein, who designed it.
The goal was a truly integrated
computer that could go wherever the
user wanted to. The machine was
shipped as a full package including all
the hardware and software a user
could need including: 64K RAM, Z-80
CPU, 5" CRT, two floppy drives,
keyboard, serial ports, CP/M
operating system, WordStar,
SuperCalc, and two versions of
BASIC: CBASIC and MBASIC. The
machine also had the ability to
connect with scientific equipment via
a built-in IEEE-488 interface, and
could run an optional external
monitor via the built-in port. Not only
was the machine complete, it was
cheap - $1795.
— Justin Mayrand
1982
Norton Utilities

1982
Adobe
John Warnock & Charles Geschke
Once upon a time there were lots of
disk-repair utilities for the Mac.
Symantec made Norton Utilities,
Central Point made MacTools, and
Fifth Generation made Public
Utilities. MacTools and Public
Utilities could scan disks during idle time. MacTools had TrashBack, the
best way to undelete files I've ever
seen, and could boot itself from a
RAM disk. What did Norton have? It
had a bunch of components you don't
find in the current release, including a
Directory Assistance utility that
improved Open/Save dialogs, a
backup utility, and a utility for
duplicating floppy disks. The last item
is as obsolete as Fast Find, but I think
many users would enjoy having the
first two as part of the current Norton
Utilities package.
—Michael Tsai
One of the brilliant engineers working
at Xerox was John Warnock. He
developed a language called
"Interpress" that could be used to
control Xerox laser printers. He and
his boss, Charles M. 'Chuck' Geschke,
tried for two years to convince Xerox
to turn Interpress into a commercial
product. When this failed, they
decided to leave Xerox and try it on
their own.
John Warnock and Chuck Geschke
named their company Adobe, after a
little creek that ran behind the house
of Warnock in Los Altos, California.
You sometimes see it mentioned in
wine guides on maps of Napa Valley
where some of the finest Californian
wines are made.
— L. Leurs
1982
Compaq
Rod Canion, Jim Harris and Bill Murto, founders
The 'sewing machine' was the very first Compaq computer. When this
machine came out, there were no
clones. An IBM compatible had the
three magic letters on the case.
Period. Part of the reason was that
IBM had published the source code for
their BIOS (basic input/output
system) so that they could claim that
anyone who brought out their own
BIOS had infringed on IBM's
copyrights and would have to stop.
—Paul Braun
1983
Microsoft Word
Bill Gates et al.

War of the Words: Microsoft Word versus WordPerfect
Dec. 1982: Satellite Software International ships WordPerfect for DOS for $500.
Apr. 1983: Microsoft introduces Multi-Tool Word for DOS.
Nov. 1983: WordPerfect 3.0 for DOS ships at $500. Microsoft releases Microsoft Word 1.0 for $375.
—Marquette University
1983
Named for one of its designer's daughters, the Lisa was supposed to be the Next Big
Thing. It was the first personal
computer to use a Graphical User
Interface. Aimed mainly at large
businesses, Apple said the Lisa would
increase productivity by making
computers easier to work with.
—Glen Sanford
Graphical interface with
computer— Lisa
Apple
Released with much fanfare in
January of 1984, the Macintosh was
the first affordable computer to
include a Graphical User Interface. It
was built around the new Motorola
68000 chip, which was significantly
faster than previous processors,
running at 8 MHz. The Mac came in a
small beige case with a black and
white monitor built in. It came with a
keyboard and mouse, and had a
floppy drive that took 400k 3.5"
disk—the first personal computer to
do so. It originally sold for $2,495.
— Glen Sanford
1984
Macintosh computer
Apple
1984
Domain name system
Paul Mockapetris
com, net, org, gov, mil
The purpose of the domain name
system is to allow any computer on
the Internet to figure out what IP
address (for example, 216.112.23.10)
corresponds with a particular
computer hostname (for example,
"www.ahref.com"), and also what
hostname, if any, corresponds with
an IP (Internet Protocol) address. Your
computer needs to know remote
computers' IP addresses to figure out
how and where to send things like
email messages and requests for web
pages.
—ep Productions, Inc
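A minimal sketch of those two lookups using Python's standard-library resolver. The hostname and address are the article's own examples and may no longer resolve; the calls raise a socket error when a lookup fails.

```python
import socket

# Forward lookup: hostname -> IP address
print(socket.gethostbyname("www.ahref.com"))

# Reverse lookup: IP address -> hostname (if a reverse record exists)
print(socket.gethostbyaddr("216.112.23.10")[0])
```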
1984
AppleWorks
Apple

1987
AppleWorks & Claris
Rupert Lissner
In 1984, the same year that the
Macintosh was introduced, Apple
Computer released the first program
called AppleWorks. It was a strange
time for consumers: the Macintosh
was newer, had fancy fonts and
styles, had a wonderfully clear
display, but all the software that was
available for it was a simple word
processor called "MacWrite" and a
paint program called "MacPaint." On
the other hand, AppleWorks made the
old Apple II more capable than the
Mac, since it combined a word
processor, a database, and a
spreadsheet, and it let you create in
any of those "modules" and move the
information into either of the others.
It was, in other words, an integrated
program.
—Gareth Jones
After seeing the Office System on the
Lisa computer, Lissner conceived the
idea of a single program that would
put word processing, database, and
spreadsheet capabilities together, and
run on an Apple II. It was originally
called "Apple Pie", and he began work
on it in 1982. Lissner took two years
to complete his program, and did it
entirely in assembly language to
achieve better speed. He wrote
versions of the program to work on
both the Apple II and Apple III
computers, making use of the same
filetypes and data structures. Apple
Pie files created on an Apple II could be used on an Apple III, and vice versa.
—Steven Weyhrich
Excel was originally written for the
512K Apple Macintosh in 1984-1985.
Excel was one of the first spreadsheets
to use a graphical interface with pull
down menus and a point and click
capability using a mouse pointing
device. [This] was easier for most
people to use than the command line
interface of PC-DOS spreadsheet
products.
—D. J. Power
1985
Excel
Microsoft

1985
PostScript
Adobe (Chuck Geschke and John Warnock)
To appreciate PostScript, you have to
know how the market worked before
it became available. In those days, if
you needed typesetting equipment,
you went to Acme Typesetters, and
they would sell you an Acme system
with an Acme output device. Then you
would follow at least two weeks of
training to learn how to use the
system. The Acme system would be
incompatible with equipment from
any other manufacturer. In most
cases, it would even be difficult or
impossible to exchange data with
other systems.
If you owned a personal computer,
you could hook it up to a dot-matrix
printer that would output low-quality bitmap characters. Graphics could be
done but the quality was only
acceptable to the nerds that bought
computers in those days.
—L. Leurs
1985
PageMaker
Paul Brainerd & Aldus

1985
Windows 1.0
Microsoft
Aldus PageMaker is released for the
Macintosh in July and desktop
publishing is born. Because of
advances in printing technology and
the Macintosh WYSIWYG (what you
see is what you get) operating system,
publishers can now arrange text into
columns and headlines and move
their text around the page. Users can
also easily incorporate graphics into
their page. Soon the days of X-Acto
knives and hot wax were gone forever
as publishers began to create their
pages on screen and print. This is also
very cost effective for professional
printers who no longer needed
expensive typesetting, drawing and
page layout equipment.
—Melissa Creech
Microsoft first began development of
the Interface Manager (subsequently
renamed Microsoft Windows) in
September 1981. Although the first
prototypes used Multiplan- and Word-like menus at the bottom of the screen,
the interface was changed in 1982 to
use pull-down menus and dialogs, as
used on the Xerox Star. Microsoft
finally announced Windows in
November 1983, with pressure from
just-released VisiOn and impending
TopView. Windows promised an
easy-to-use graphical interface,
device-independent graphics and
multitasking support. The
development was delayed several
times, however, and Windows 1.0 hit the store shelves in November
1985. The selection of applications
was sparse, however, and Windows
sales were modest.
—pcbiography.net
1985
Laptop computer
Toshiba
1986
MacPlus
Apple
Amiga 1000
Jay Miner
My love affair with the T1100+ began
in the early Summer of 2000. While
perusing the offerings of an annual
street wide garage sale in my
neighbourhood, I spotted what
appeared to be an old word processor
for sale for $25. I looked it over. The
owner pointed out rather flatly that it
ran DOS and was fully functional. His
spouse was much more enthusiastic
about my investigations, adding how
useful it had been.
—www.cyberus.ca/~pgillil/toshiba.html
Announced in January 1986, the Mac
Plus was the answer to complaints
that the original Mac was not
expandable. It doubled the ROM of
the 512k from 64k to 128k, and
increased the RAM to 1 MB
(expandable to 4 MB). It was the first
Mac to include a SCSI port, allowing
for a variety of external peripherals,
and was the first Mac to use the now
familiar platinum case color
(although it initially shipped in beige).
The Mac Plus originally sold for
$2600, and was sold to educational
markets as the Mac ED.
—Glen Sanford
The designer of the Amiga 1000 was
Jay Miner, who created the Atari 800
many years before. He wanted to
make the most powerful computer ever, so he joined a little California company called Amiga. He used the
principle of the three coprocessors
(again) to help the main processor.
— oldcomputers.com
Liquid Crystal Display (LCD)
Toshiba

1986 or 1989?
America Online (AOL)
A liquid crystal display (LCD) test
cell
Today, LCDs are everywhere we look,
but they didn't sprout up overnight. It
took a long time to get from the
discovery of liquid crystals to the
multitude of LCD applications we
now enjoy. Liquid crystals were first
discovered in 1888, by Austrian
botanist Friedrich Reinitzer. Reinitzer
observed that when he melted a
curious cholesterol-like substance
(cholesteryl benzoate), it first became
a cloudy liquid and then cleared up as
its temperature rose. Upon cooling,
the liquid turned blue before finally
crystallizing. Eighty years passed
before RCA made the first
experimental LCD in 1968. Since then,
LCD manufacturers have steadily
developed ingenious variations and
improvements on the technology,
taking the LCD to amazing levels of
technical complexity. And there is
every indication that we will continue
to enjoy new LCD developments in the
future!
—Marshall Brain
The Internet bulletin-board system
Quantum Computer Services acquires
a new name, America Online (AOL),
and focuses on recruiting a diverse,
broad-based subscribership. From
1989 to 1998, AOL grows from its
roots as an insignificant start-up with
barely 100,000 members, to an
industry leader with more than 14
million members.
— The Moschovitis Group
1987
Computer virus
Brain
Illustrator
Adobe
XPress
Quark
The "Brain" virus is probably the
earliest MS-DOS virus. At one time it
was the most widespread of PC viral
programs.
Brain is a boot sector infector,
somewhat longer than some of the
more recent BSIs. Brain occupies
three sectors itself, and, as is usual
with BSIs, repositions the normal
boot sector in order to "mimic" the
boot process. As the boot sector is only
a single sector, Brain, in infecting a
disk, reserves two additional sectors
on the disk for the remainder of itself,
plus a third for the original boot
sector.
—Robert M. Slade
Adobe® Illustrator® 10 software
defines the future of vector graphics
with groundbreaking creative options
and powerful tools for efficiently
publishing artwork on the Web, in
print, everywhere. Produce superb
Web graphics using symbols and
innovative slicing options. Explore
creative ideas with live distortion
tools. Publish in record time with
dynamic data-driven graphics and
other productivity features.
—Adobe
Software engineer Tim Gill founded
Quark in 1981, producing the first
word processor for the Apple II
computer. Gill named the company
Quark after the subatomic particle
proposed as a building block for all
matter—an appropriate metaphor for
the role that QuarkXPress would soon
come to play in the electronic
publishing industry.
—Quark
Cat
Canon
1988
Worm
In 1987 Canon USA Inc. released a
new computer named the Canon Cat.
This computer was targeted at low-level clerical workers such as
secretaries. After six months on the
market and with 20,000 units sold,
Canon discontinued the Cat. The Cat
featured an innovative text based user
interface that did not rely upon a
mouse, icons, or graphics. The key
person behind the Cat was Mr. Jef
Raskin, an eclectic gadgeteer, who
began the design of the Cat during his
work on the first Macintosh project at
Apple Computer in 1979.
—David T. Craig
On the evening of November 2, 1988,
a self-replicating program was
released upon the Internet (1) This
program (a worm) invaded VAX and
Sun-3 computers running versions of
Berkeley UNIX, and used their
resources to attack still more
computers (2). Within the space of
hours this program had spread across
the U.S., infecting hundreds or
thousands of computers and making
many of them unusable due to the
burden of its activity. This paper
provides a chronology for the
outbreak and presents a detailed
description of the internals of the
worm, based on a C version produced
by decompiling.
—Donn Seeley
In service for nearly 10 years,
Disinfectant was probably the most
popular Macintosh anti-viral
program of all time. It was free, it
was so perfectly programmed that it
caused no extension conflicts, and it
was updated promptly every time a
new virus was discovered.
Disinfectant was an application and a
companion INIT, providing both on-demand and on-access or
background scanning. John Norstad
retired Disinfectant on 6 May, 1998.
—John Norstad
Anti-virus software
Graphics supercomputers
Apollo, Ardent, Stellar, Cray
CRAY-1 SuperComputer
A broad term for one of the fastest
computers currently available. Such
computers are typically used for
number crunching including scientific
simulations, (animated) graphics,
analysis of geological data (e.g. in
petrochemical prospecting),
structural analysis, computational
fluid dynamics, physics, chemistry,
electronic design, nuclear energy
research and meteorology. Perhaps
the best known supercomputer
manufacturer is Cray Research.
—Free On-line Dictionary of
Computing
1989
Portable Macintosh
Apple

Office
Microsoft

Touch-sensitive pad/touchpad
A twist on integrated software began
with the introduction of Microsoft
Office: a single box containing
versions of Microsoft's word
processing, spreadsheet, and
presentation programs, along with a
few alterations that let them work
together in an integrated way. Like
integrated programs, such "suites"
are very popular. Other software
suites have been offered by Lotus,
Corel, and Sun.
—Gareth Jones
Touchpads are relative motion
devices. That is, there is no
isomorphism from the screen to the
touchpad. Instead, relative motion of
the user's fingers causes relative
motion of the cursor. The buttons
below or above the pad serve as
mouse standard buttons. You can also
click by tapping your finger on the
touchpad, and drag with a tap followed by a continuous pointing
motion (a click-and-a-half). Some
touchpads also have "hotspots":
locations on the touchpad that
indicate user intentions other than
pointing. For example, on certain
touchpads, moving your finger along
the right edge of the touch pad will
control the scrollbar and scroll the
window that has the focus vertically.
Moving the finger on the bottom of
the touchpad often scrolls in
horizontal direction. Some touchpads
can emulate multiple mouse buttons
by either tapping in a special corner
of the pad, or by tapping with two or
more fingers.
—en.wikipedia.org/wiki/Touchpad
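A minimal sketch of relative-motion pointing as described above (names are illustrative, not any particular driver's API): the cursor moves by the finger's displacement scaled by a gain, with no fixed mapping from pad coordinates to screen coordinates.

```python
# Relative motion: only the change in finger position matters, so the
# same pad region can move the cursor anywhere on the screen.
cursor = [400, 300]                     # current cursor position in pixels

def on_finger_move(dx: float, dy: float, gain: float = 2.0) -> None:
    cursor[0] += round(dx * gain)
    cursor[1] += round(dy * gain)

on_finger_move(10, -5)
print(cursor)                           # [420, 290]
```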
Multimedia platform specifications
Object Management
Group, including
Microsoft, IBM, AT&T and
others
1990
Photoshop
Adobe
The Object Management Group
(OMG) is an open membership, not-for-profit consortium that produces
and maintains computer industry
specifications for interoperable
enterprise applications. Our
membership includes virtually every
large company in the computer
industry, and hundreds of smaller
ones. Most of the companies that
shape enterprise and Internet
computing today are represented on
our Board of Directors.
—www.omg.org/
The story of one of the original "killer
apps" begins in Ann Arbor, Michigan
(USA) with a college professor named
Glenn Knoll. Glenn was a photo
enthusiast who maintained a
darkroom in the family basement. He
was also a technology aficionado
intrigued by the emergence of the
personal computer. His two sons,
Thomas and John, inherited their
father's inquisitive nature. And the
vision for future greatness began with
their exposure to Glenn's basement
darkroom and with the Apple II Plus
that he brought home for research
projects.
—Derrick Story
1992
Macromedia

1993
Personal Digital Assistant
Apple
In November of 1996, Macromedia
was getting tired of hearing about
our product when they worked with
Disney to use Macromedia's
Shockwave product. So Macromedia
approached us about working
together. We had been running
FutureWave for 4 years with a total
investment of $500,000 and the idea
of having access to the resources of a
larger company to help us get
FutureSplash established in a market
that was full of competitors and
growing slowly seemed like a good
one. So in December of 1996, we sold
FutureWave Software to Macromedia
and FutureSplash Animator became
Macromedia Flash 1.0.
— Jonathan Gay
In 1993, Apple Computer Inc.
introduced the world to the first PDA,
the Newton®. They were dubbed
PDAs (personal digital assistants) by
John Sculley, former chairman of
Apple Computer Inc, and were sold as
the ultimate information appliance.
Sculley predicted PDAs would become
ubiquitous tools that would hold
telephone numbers, keep your
calendar, store notes, plus send and
receive data wirelessly. However, the Newton was not able to deliver all of those features at the time it was released.
For the next three years, PDA sales
dwindled, and were almost off the
charts.
Then, in March 1996, Palm™, Inc.
delivered the industry's first truly
compelling handheld computer, the
PalmPilot. A robust yet small go-anywhere device that helped people
manage and organize their personal
and professional lives by providing
instant, anytime access to schedules,
important phone numbers, to-do lists
and other key information. This new
type of information management was
met with tremendous acceptance.
Mobile, busy people embraced the
small and powerful Palm™
handhelds.
—
www.handango.com/PDAHistory.jsp
?siteId=1
QuickTake 100 camera
1994
Apple
Zip disk and drive
1995
Iomega
Apple sees the camera being used for
business, education and "memories".
It is fully automatic, with a built-in
flash. A window at the rear of the
camera is surrounded by four buttons
which control the flash, picture
resolution, self-timer, and delete
functions. The camera can store up to
32 images at a resolution of 320 x 240
pixels - each a quarter of a 13 inch
monitor screenful - or eight 640 x 480
pixel images - each a full 13 inch
monitor screenful - for up to a year in
its internal flash memory. The
resolution can be changed on a shot-by-shot basis if required.
—John Henshall
inventors.about.com/library/inventors/bldigitalcamera.htm
In March 1995, Iomega launched the
low-cost Iomega Zip 100MB drive for
the consumer and small business
market. It was an instant success that
revolutionized the storage industry,
becoming one of the fastest-selling
and most successful peripherals in the
history of computing. Today, Iomega
has sold more than 55 million Zip
drives and 350 million Zip disks.
Java
Sun Microsystems
To demonstrate what they saw as a possible future in digital devices, the Green Team locked themselves away in an anonymous office on Sand Hill Road in Menlo Park, cut all regular communications with Sun, and worked around the clock for 18 months. In the summer of 1992, they emerged with a working demo, an interactive, handheld home-entertainment device controller with an animated touchscreen user interface.
In the demo, the now familiar Java technology mascot, Duke, was shown waving and doing cartwheels on the screen. The device was called *7 ("StarSeven"), named after an "answer your phone from any extension" feature of the phone system in the Green Team office. Duke was actually a representation of the *7's "agent", a software entity that did tasks on behalf of the user.
—Jon Byous
Flat screen
Sony
The basic idea of a plasma screen is to illuminate tiny colored fluorescent lights to form an image. Each pixel is made up of three fluorescent lights -- a red light, a green light and a blue light. The plasma display varies the intensities of the different lights to produce a full range of colors.
Internet Explorer
Microsoft
In the early 90s—the dawn of history as far as the World Wide Web is concerned—relatively few users were communicating across this global network. They used an assortment of shareware and other software for the Microsoft Windows® operating system. In 1995, Microsoft hosted an Internet Strategy Day and announced its commitment to adding Internet capabilities to all its products. In fulfillment of that announcement, Microsoft Internet Explorer arrived as both a graphical Web browser and the name for a set of technologies.
—www.microsoft.com/windows
1996
Internet's 25th anniversary
Tim Berners-Lee
40 million people connected to the Internet, more than $1 billion in commerce per year, rapidly growing Internet companies like Netscape
1997
Athlon processor
Advanced Micro Devices
This processor competes successfully with Pentium chips.

Giant Magneto-Resistive heads
IBM
A new technology used in IBM's Deskstar 16 GP, a 16.8 GB drive, bringing down the cost of storage to 25 cents per megabyte.

Pentium II processor
Intel
A 7.5 million transistor processor incorporates MMX technology, which is designed specifically to process video, audio, and graphics data efficiently.
1998
DVD-RAM drive
5.2 GB rewriteable capacity on a double-sided cartridge, enough to hold a full-length 2 hr movie (not to be confused with DVD-ROM).

2001
iPod
Apple
iPod is not based on a new concept. Companies before Apple released hard drive based music players, but none had the charm and elegance of the Apple implementation. Unlike the competitors, the iPod used a high speed FireWire interface to transfer files on and off of it, and it used a tiny hard drive that made the device a quarter of the size of comparable products.
—Saad
2005
Wearable computer
A person's computer should be worn, much as eyeglasses or clothing are worn, and interact with the user based on the context of the situation. With heads-up displays, unobtrusive input devices, personal wireless local area networks, and a host of other context sensing and communication tools, the wearable computer can act as an intelligent assistant, whether it be through a Remembrance Agent, augmented reality, or intellectual collectives.
—Wearable Computing, MIT
2006
20th anniversary of the MacPlus
Things haven't changed as much as the hype would have it. I think that years from now, when the details have been washed away by the acid rains of time, four major commercial events will stand out in the history of personal computers: the advent of the microprocessor, which drove prices of computers down to the point where individuals could buy them and led to the first flowering of the present computer revolution; the ascendancy of the software industry and the shift from "users will program them" to "users will run software packages"; the Mac interface and its followers, which brought the benefits of computers to a far broader audience and fundamentally changed the way we use computers of all sizes and software of all kinds; and (to tread on dangerous ground since the event is relatively recent) the blossoming of the Internet. To sum up the history: cheap hardware, application software, human interface, & internet.
—Jef Raskin