Evolution of Computer Systems
Matthew N. O. Sadiku
Department of Electrical Engineering
Temple University
Philadelphia, PA 19122
Clarence N. Obiozor
Department of Electrical Engineering
University of North Florida
Jacksonville, FL 32224
Abstract
Although bits and pieces of the historical background on computer systems can be found in monographs and encyclopedias, a brief account that a beginner can quickly digest is hard to come by. This article presents such a short account. From the abacus to ENIAC and from ENIAC to BISDN, the paper covers the significant advances associated with computers. It is hoped that the paper will be useful to beginners in the field and to interested non-experts.
Early Developments
The need for counting, computing, and processing data has been with man from the beginning. The most significant early computing tool was the abacus, a wooden rack holding parallel rods on which beads are strung. This simple device was used for addition and subtraction. The Scottish scholar John Napier (1550-1617) invented the logarithm, and in 1621 William Oughtred invented both the rectilinear and circular slide rules. These were analog computers, which have been replaced in modern times by pocket calculators.
A significant advance in the evolution of computing
systems was the invention of a mechanical adding
machine in 1642 by the French scientist Blaise Pascal
(1623-1662). Having observed Pascal's machine in Paris, the German mathematician Gottfried Wilhelm von Leibniz (1646-1716) designed a better one in 1671 [1]. While Pascal's machine could only add and subtract, Leibniz's device could also multiply, divide, and extract square roots. In 1820, Thomas of Colmar (Charles Xavier Thomas) produced the first commercially available mechanical calculator. This desktop calculator could add, subtract, multiply, and divide. It was followed by a succession of advanced and improved mechanical calculators [2].
While Thomas of Colmar was working on the
mechanical calculator, Charles Babbage (1792-1871) at
Cambridge, England, was developing the first digital
computer. By 1822 he had built an automatic mechanical calculator called the "difference engine." In 1833 he began to work on a general-purpose, programmable, automatic mechanical digital computer called the "analytical engine." Unfortunately, Babbage's analytical engine was never completed because its design required fabrication precision beyond what was feasible at that time.
A major step forward in the evolution of computer systems was the invention of punched cards, which were first used during the U.S. census of 1890 by Herman Hollerith and James Powers while working for the U.S. Census Bureau. With punched cards, calculating machines became fully automatic. In 1896, Hollerith formed the Tabulating Machine Company, which manufactured punched card machines. After Hollerith's retirement in 1913, Thomas J. Watson, Sr., became president of the company, which became International Business Machines Corporation in 1924. This company was later to play a significant role in the evolution of computer systems.
Modern Digital Systems
Although the punched card machine was well established and reliable by the late 1930s, several research groups worked hard to build an automatic digital computer. An IBM team of four workers led by Howard Hathaway Aiken, a physicist and mathematician at Harvard University, began work on a fully automatic calculator in 1939. The calculator, commonly called the International Business Machines Automatic Sequence Controlled Calculator or Harvard Mark I, was completed in August 1944.
This was the first information-processing machine. An electromechanical computer, it had 760,000 wheels, 500 miles of wire, and a panel 51 ft long and 8 ft high. Input data were entered through punched cards, and output was by punched card or electric typewriter. Aiken's machine was similar in principle to Babbage's analytical engine, although Aiken did not know about Babbage's work when he started his research.
The first all-digital electronic computer made its appearance during World War II. In the United States, there was a desperate need for computers that could quickly compute firing tables for the variety of new weapons used by the U.S. Army. In 1942, electrical engineer J. Presper Eckert and physicist John W. Mauchly, at the Moore School of Electrical Engineering, University of Pennsylvania, Philadelphia, met the need and developed ENIAC (Electronic Numerical Integrator and Calculator). ENIAC went into operation in 1946. It was the first all-purpose digital electronic computer. It used vacuum tubes instead of relays as its logic elements and, because of this, was more than 1,000 times faster than its electromechanical predecessors. However, ENIAC was of unprecedented size and complexity.
In 1950, ENIAC was succeeded by EDVAC
(Electronic Discrete Variable Automatic Computer), a
stored-program computer. In 1947, Eckert and Mauchly
established their own company, Eckert-Mauchly
Computer Corporation, to manufacture computers
commercially. In 1951, the company produced the
UNIVAC I (Universal Automatic Computer) for the U.S.
Census Bureau.
Although this first commercial computer was produced for the Census Bureau, it was used extensively by the scientific community. UNIVAC I achieved the greatest fame among the early digital computers because it correctly predicted the outcome of the 1952 presidential election, projecting Dwight Eisenhower's victory over Adlai Stevenson 45 minutes after the polls closed.
Generations of Computers
The first generation of computers (1950-1959) used vacuum tubes as their logic elements and ring-shaped ferrite cores as memories. During this period computers were bulky, unreliable, and expensive. These computers include ENIAC, EDVAC, UNIVAC I, UNIVAC II, and the IBM 702 and 650. The introduction of semiconductor digital elements marked the beginning of the second computer generation (1959-1969), which brought reduced size and cost together with increased speed and reliability. Magnetic tape became the principal external storage medium. IBM produced the transistorized 7090 system in 1959 and later the 7094, which dominated the scientific computer market during the period 1960-1964. Some of the popular second-generation computers were the IBM 7000 and 1400 series, UNIVAC III, RCA 301 and 501, Honeywell 400 and 800, and NCR 315.
The second-generation computers were succeeded by the third computer generation (1969-1977), which used integrated circuits. The era of microelectronics started with the invention of the integrated circuit (IC) in 1958. With the introduction of integrated circuits, it became possible to place hundreds of circuit elements on a tiny silicon chip. Important members of the third generation include the IBM 360 and 370, UNIVAC 1108, RCA 3301, GE 645, Honeywell 200 series, and the DEC PDP-8.
The fourth-generation computers became available in the 1980s, when very large-scale integration (VLSI), in which thousands of transistors and other circuit elements are placed on a single chip, became increasingly common. VLSI technology greatly increased circuit density. While the first-, second-, and third-generation computers used ferrite cores as memory units, the fourth-generation computers used semiconductor devices fabricated by VLSI technology as ultrahigh-speed memory units. The drop in cost associated with this size-reduction trend led to the introduction of personal computers for use in offices, schools, and homes. Several companies, such as IBM, Apple Computer, and Radio Shack, began to produce and market personal computers with enormous success.
The race is now on to build the next or "fifth" generation of computers: machines that exhibit artificial intelligence. These new generations of computers will involve robotics and computer networks.
Computer Networks
Originally, networks were used to connect only mainframe computers. But with the proliferation of inexpensive computer systems and advances in software, the need to network personal computers and other computer peripherals became apparent. Computer networking has developed at three levels: the local area network (LAN), which interconnects computers located within a relatively small area such as a college campus; the metropolitan area network (MAN), representing LAN technologies optimized for a metropolitan area such as a city; and the wide area network (WAN), providing communication services over several kilometers, across the nation, or around the globe [3].
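The three levels differ mainly in geographic span. As a rough illustration of this classification (a sketch added here, not part of the original paper), the following Python fragment assigns a network class from the approximate distance a network spans; the threshold values are assumptions chosen for illustration, not standardized figures.

    # Illustrative only: classify a network by approximate geographic span.
    # The distance thresholds are assumptions, not standardized values.
    def classify_network(span_km: float) -> str:
        if span_km <= 5:         # e.g., a building or a college campus
            return "LAN"
        elif span_km <= 50:      # e.g., a city
            return "MAN"
        else:                    # national or global distances
            return "WAN"

    for span in (0.5, 20, 4000):
        print(f"{span:>7} km -> {classify_network(span)}")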
The idea of computer networking started in the 1960s, when time-sharing services first became available to the public [4]. Early pioneers were General Electric (GE), Xerox, AT&T, IBM, government agencies, research laboratories, and universities. ARPANET was built in 1969 by the Advanced Research Projects Agency (ARPA), an arm of the U.S. Department of Defense. It was a public network connecting several major universities and research institutions. The ARPANET eventually grew into a U.S. backbone network leading to the current Internet. Its success led the primary contractor, Bolt Beranek and Newman, to form a commercial network company, Telenet, in 1972.
Metropolitan area networks (MANs) are an outgrowth of LANs. The MAN effort started in 1982, with the objectives of providing for the interconnection of LANs, bulk data transfer, digitized voice, and video. The fiber distributed data interface (FDDI), proposed by the American National Standards Institute (ANSI), is the most popular MAN. It is a token ring network with optical fiber as its transmission medium.
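To make the token ring idea concrete (a sketch added here, not part of the original paper), the following Python fragment simulates the essential discipline of a ring such as FDDI: a single token circulates from station to station, and a station may transmit only while it holds the token. The station names and frame contents are illustrative assumptions.

    # Minimal token ring sketch: one token circulates; only its holder transmits.
    stations = ["A", "B", "C", "D"]              # ring order (illustrative)
    pending = {"B": "frame-1", "D": "frame-2"}   # frames waiting to be sent

    token_at = 0                                 # index of the token holder
    for _ in range(2 * len(stations)):           # let the token make two circuits
        station = stations[token_at]
        if station in pending:                   # holder may transmit one frame
            print(f"{station} transmits {pending.pop(station)}")
        token_at = (token_at + 1) % len(stations)  # pass the token downstream

Because only the token holder may transmit, collisions cannot occur, which is the property that made the token ring attractive for deterministic media such as optical fiber.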
In the late 1970s, the concept of ISDN was born. The ISDN is regarded as an all-purpose digital network in that it provides integrated access supporting a wide variety of applications in a flexible and cost-effective manner. In actual practice, the implementation of ISDN has been slow, although the concept has been tried in many nations. The real excitement of ISDN comes when one considers the capabilities of broadband ISDN (BISDN).
Robotics and Artificial Intelligence
A robot is a reprogrammable, multifunctional manipulator designed to perform functions ordinarily ascribed to human beings [5]. The key word is reprogrammable, because it refers to a built-in computer control system; this is what distinguishes robots from numerically controlled systems, which cannot adapt to new tasks. The "robot age" began in 1954, when George C. Devol, who is regarded as the "father of the robot," patented the first manipulator with a playback memory.
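As a toy illustration of what a playback memory buys a reprogrammable manipulator (a sketch added here, not from the original paper; the joint names and angle values are invented for the example), the following Python fragment records a sequence of joint positions in a teach phase and replays them on demand. Reprogramming such a robot amounts to recording a new sequence rather than rebuilding hardware.

    # Toy playback-memory manipulator: record joint poses, then replay them.
    # Joint names and angle values are illustrative assumptions.
    class PlaybackArm:
        def __init__(self):
            self.memory = []                 # recorded sequence of poses

        def teach(self, pose):
            """Record one pose (a dict of joint angles in degrees)."""
            self.memory.append(dict(pose))

        def play(self):
            """Replay the recorded sequence; here each move is printed."""
            for step, pose in enumerate(self.memory, start=1):
                print(f"step {step}: move to {pose}")

    arm = PlaybackArm()
    arm.teach({"base": 0, "elbow": 45})      # teach phase: operator guides the arm
    arm.teach({"base": 90, "elbow": 10})
    arm.play()                               # playback phase: repeat the task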
By the mid-1960s, the race to create intelligent robots with the greatest accuracy and speed led to the formation of research centers and laboratories in the new field of robotics and its allied field of artificial intelligence. Researchers aimed to integrate perceptual and problem-solving capabilities into one system, using computers as controllers or brains, TV cameras for vision, and touch sensors for robot grippers. In 1967, General Electric produced a four-legged vehicle. In 1969 and 1970, researchers at the Stanford Research Institute (SRI) produced a mobile robot, known as Shakey, which had some vision capability. In 1974, Vicarm Inc. marketed a robot that used a minicomputer for a controller. By 1978, there were about 2,500 industrial robots in the United States. The year 1980 witnessed the establishment of the largest university robotics laboratory, at Carnegie Mellon University. In the same year, the University of Rhode Island demonstrated a prototype robotic vision system.
Conclusion
Early computers were electromechanical at best;
they were limited in speed, reliability, and flexibility.
Modern digital computers are fast and reliable. Computer
systems will continue to find increasing application in
every aspect of human activity. As we approach the next
century, the most important areas related to computers
will be networking and artificial intelligence.
References
1. M. R. Williams, A History of Computing Technology. Englewood Cliffs, NJ: Prentice Hall, 1985, p. 139.
2. A. Ralston (ed.), Encyclopedia of Computer Science and Engineering, 2nd ed. New York: Van Nostrand Reinhold Co., 1982, pp. 532-554.
3. M. N. O. Sadiku and M. Ilyas, Simulation of Local Area Networks. Boca Raton, FL: CRC Press, 1995, pp. 1-2.
4. J. S. Quarterman, The Matrix. Bedford, MA: Digital Press, 1990, pp. 137-174.
5. R. D. Klafter et al., Robotic Engineering: An Integrated Approach. Englewood Cliffs, NJ: Prentice Hall, 1989, p. 10.