
HoS summary

Introduction
The computer has a relatively short history, which the authors deem to start in the 1940s.
Few other technologies have changed their scale, dominant applications, and users so often and so
fundamentally.
In the 1940s, computers were used by a few hundred people worldwide to carry out complex
numerical calculations. They were built as one-of-a-kind pieces of custom lab equipment, each
costing the equivalent of several million present-day dollars.
The computer began as a highly specialized technology and has moved toward universality and
ubiquity. We think of this as a progression toward practical universality, in contrast to the theoretical
universality often claimed for computers as embodiments of Turing machines.
Maps, filing cabinets, video tape players, typewriters, paper memos, and slide rules are rarely used now, as their functions have been replaced by software running on personal computers, smartphones, and networks. We conceptualize this convergence of tasks on a single platform as a dissolving of those technologies into software.
The first transformations began in the 1950s, as computers were remade for use as scientific
supercomputers capable of feats of number crunching, data processing devices able to automate the
work of hundreds of clerks, and real-time control systems used to coordinate air defense.
Our story starts in the 1940s with programmable electronic computers and not, like more traditional
overview histories, with mechanical calculators or Charles Babbage.
What was the first computer? Any answer depends on one’s definition of computer. In the 1940s the
question would not even have made sense, because computer usually meant a person employed to
carry out complex calculations.
We start in 1945 with the first operation of a machine called ENIAC at the University of Pennsylvania.
ENIAC is usually called something like the “first electronic, general-purpose, programmable
computer.” Electronic distinguishes it from electromechanical computers whose logic units worked
thousands of times more slowly. General purpose and programmable separated ENIAC from special
purpose electronic machines whose sequence of operations was built into the hardware and so
could not be reprogrammed to carry out fundamentally different tasks.
The ENIAC project introduced the vocabulary of programs and programming and the automation of
higher-level control functions such as branches and looping. It was publicized around the world,
stimulating interest in electronic computation.
Chapter 1
ENIAC (Electronic Numerical Integrator and Computer) first ran in late 1945 and was publicly unveiled in February 1946 at the University of Pennsylvania, where it had been designed and built starting in 1943. The machine was demonstrated to reporters and dignitaries by its now-famous team of six female operators. It was not reassembled at the Ballistic Research Laboratory in nearby Maryland until 1947, where it enjoyed a long and productive career until being retired in 1955.
Adele Goldstine wrote ENIAC's technical manual and was married to Herman Goldstine, the Army's liaison to the project.
ENIAC’s place in computer history rests on more than being the first device to merit check marks for
electronic and programmable on a comparison sheet of early machines. It fixed public impressions of
what a computer looked like and what it could do.
Howard Aiken and Grace Murray Hopper were associated not with ENIAC but with the Harvard Mark I; Hopper was among that machine's first programmers, while ENIAC's first programmers were its six women operators.
Since the 1980s, a program so compelling that users acquire an entire computer system to run it has
been called a killer application. This term is particularly appropriate to ENIAC, as the aim was literally
to kill more efficiently.
Altogether, ENIAC was not so much a single computer as a kit of forty modules from which a
different computer was constructed for each problem.
Eckert and Mauchly are the names usually attached to building ENIAC, but dozens of people worked to create the machine.
Who was von Neumann? John von Neumann was a mathematician at the Institute for Advanced Study who joined the ENIAC group as a consultant and wrote the 1945 "First Draft of a Report on the EDVAC," which circulated widely and shaped subsequent machine designs.
EDVAC (Electronic Discrete Variable Automatic Computer) was centered on a new kind of memory proposed by Eckert: the mercury delay line.
We prefer to separate the enormous influence of EDVAC into three clusters of ideas (or paradigms).
The first of these, the EDVAC hardware paradigm, specified an all-electronic machine with a large high-speed memory using binary number storage.
The second was the von Neumann architecture paradigm: in the First Draft report, von Neumann described EDVAC's switching circuits as neurons and broke its structure into "organs," including a large delay-line memory for programs and data ("memory" being the only biological term that stuck).
The third cluster of ideas was a system of instruction codes: the modern code paradigm.
Timesharing systems
Timesharing, which emerged in the 1960s and remained popular through the 1970s and 1980s, was a computing paradigm that allowed multiple users to access and utilize a single computer system concurrently. The main goal was to optimize the use of expensive computer resources and provide a cost-effective solution for users who could not afford their own mainframe or minicomputer.
In a timesharing system, each user would connect to the central computer using a terminal, which
could be a simple keyboard and screen or a more advanced device with graphical capabilities. The
terminals were often connected to the central computer via telephone lines using modems or
through dedicated networks.
The central computer would then divide its processing power and resources (such as memory and
storage) among the connected users, allocating small time slices to each user in a rapid and cyclical
manner. This gave the illusion of each user having their own dedicated system, as the switching
between users happened so fast that users generally wouldn't notice any significant delay.
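To make the time-slicing idea concrete, here is a minimal round-robin scheduling sketch in Python. It is purely illustrative and not modeled on any historical timesharing system; the user names, the quantum parameter, and the work units are invented for the example.

    from collections import deque

    # Minimal round-robin sketch: each "job" stands in for a connected terminal
    # user who still needs some number of time units of CPU work.
    def round_robin(jobs, quantum=2):
        """jobs: dict of user -> remaining work units. Runs until all finish."""
        queue = deque(jobs.items())
        timeline = []
        while queue:
            user, remaining = queue.popleft()
            used = min(quantum, remaining)        # run this user for one slice
            timeline.append((user, used))
            remaining -= used
            if remaining > 0:                     # not finished: back of the queue
                queue.append((user, remaining))
        return timeline

    if __name__ == "__main__":
        # Three hypothetical users sharing one machine; with slices this small,
        # each appears to have the computer to themselves.
        for user, used in round_robin({"alice": 5, "bob": 3, "carol": 4}):
            print(f"{user} runs for {used} time unit(s)")

Cycling through the queue in fixed slices is the essence of the illusion described above: no single user waits long enough to notice the others.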
The timesharing model was eventually supplanted by the rise of personal computers in the 1980s
and 1990s, which provided users with their own dedicated computing resources at a much lower
cost. However, the concept of timesharing has made a comeback in recent years with the advent of
cloud computing, which allows users to share computing resources in a similar manner, albeit on a
much larger scale and with far more advanced technology.
ARPANET
The rise of ARPANET can be traced back to the late 1960s and is considered a crucial milestone in the development of the internet. ARPANET was the first wide-area packet-switching network and, from 1983, the first network to run the TCP/IP protocol suite; both packet switching and TCP/IP are fundamental to the modern internet.
The origins of ARPANET lie with the United States Department of Defense's Advanced Research Projects Agency (ARPA), now known as DARPA. The primary goal of ARPANET was to facilitate communication and resource sharing among ARPA-funded research institutions. Contrary to the popular story that the network was designed to survive a nuclear attack, the project was justified chiefly by the need for remote logins and the sharing of scarce, expensive computing resources.
ARPANET carried its first message in October 1969 and by the end of that year connected four institutions: UCLA, the Stanford Research Institute, UC Santa Barbara, and the University of Utah. The network used Interface Message Processors (IMPs), early packet-switching devices, to relay messages between connected computers.
Throughout the 1970s and 1980s, ARPANET continued to expand, connecting more universities,
research centers, and government agencies. The development of new protocols and technologies,
such as TCP/IP, helped improve the network's functionality and interoperability.
In 1983, the ARPANET officially adopted TCP/IP, which ultimately led to the creation of the internet
as we know it today. As more networks adopted TCP/IP and became interconnected, the term
"internet" emerged as a way to describe the vast, interconnected network of networks.
ARPANET was decommissioned in 1990, as its role had been largely supplanted by other networks,
including NSFNET, which was funded by the National Science Foundation. Nevertheless, ARPANET's
pioneering technologies and its role in the development of the modern internet have left a lasting
legacy in the field of computer networking.
GPS
The origin of GPS (Global Positioning System) navigation devices can be traced back to the
development of satellite-based navigation systems by the United States Department of Defense. The
concept of a satellite navigation system was first proposed by American scientists and engineers in
the 1960s as a way to provide accurate positioning information for military and civilian applications.
The GPS system was initially developed for military use to improve navigation, communication, and
targeting accuracy. However, its potential for civilian applications was soon recognized, and efforts
were made to make the technology available for public use.
In 1973, the United States Department of Defense combined several satellite navigation projects to
create the GPS program. The first GPS satellite was launched in 1978. Throughout the 1980s and
1990s, more GPS satellites were launched, and the system became fully operational in 1995 with a
constellation of 24 satellites.
The early GPS receivers were large and expensive, limiting their adoption to specialized applications,
such as aviation and maritime navigation. However, as technology advanced, GPS receivers became
smaller and more affordable, leading to the development of dedicated GPS navigation devices for
consumer use.
Companies like Garmin, Magellan, and TomTom started producing portable GPS navigation devices
in the late 1990s and early 2000s. These devices became popular for in-car navigation, providing
users with turn-by-turn directions and real-time traffic updates.
Over time, the integration of GPS technology into smartphones has led to a decline in the demand
for standalone GPS navigation devices. However, dedicated GPS devices still find use in specialized
applications, such as hiking, aviation, and marine navigation, where specific features and ruggedness
are important.
In conclusion, the origin of GPS navigation devices lies in the development of satellite-based
navigation systems by the United States Department of Defense in the 1960s and 1970s. The
technology has since evolved and become more accessible, leading to the widespread adoption of
GPS navigation devices for civilian applications.
SHARE
SHARE, founded in 1955, is one of the oldest computer user groups in existence. It was established
by a group of IBM 704 computer users from various organizations, including aerospace companies,
research institutions, and government agencies, with the aim of facilitating collaboration and
knowledge sharing among its members.
The main objectives of the SHARE group were:
1. Collaboration and resource sharing: The primary goal of SHARE was to encourage
collaboration and resource sharing among its members. This included sharing software,
technical expertise, and best practices for using and maintaining IBM computer systems. By
pooling resources and knowledge, the group sought to help its members make better use of
their computing investments.
2. Influence on IBM's product development: SHARE provided a platform for its members to
communicate with IBM and influence the development of its products. The group worked
closely with IBM, providing feedback on existing products, as well as suggesting
improvements and new features. This collaborative relationship helped shape the evolution
of IBM's hardware and software offerings.
3. Training and education: SHARE organized regular meetings, conferences, and workshops
where members could learn about new technologies, share their experiences, and discuss
solutions to common problems. These events provided valuable training and education
opportunities for members, helping them to stay up-to-date with the latest developments in
the field of computing.
4. Development of standards and best practices: SHARE played an important role in the
development of standards and best practices for the use of IBM computers. The group's
members worked together to develop guidelines for programming, systems management,
and other aspects of computing. These efforts helped to establish a common set of practices
across the industry and contributed to the professionalization of the field.
Over the years, SHARE has expanded its focus beyond IBM computers to encompass a broader range
of computing platforms and technologies. Today, SHARE continues to be an active organization,
providing its members with opportunities for networking, education, and collaboration in the field of
enterprise IT.
NLS
In December 1968, Douglas Engelbart of the Stanford Research Institute (SRI) demonstrated the use of timesharing systems for online collaboration through his team's oN-Line System (NLS) at the Fall Joint Computer Conference in San Francisco. The presentation, later known as the "Mother of All Demos," showcased groundbreaking innovations, including the mouse, online text editing, outline processing, linking between files, and collaborative document editing. Despite the power of Engelbart's ideas, NLS had limitations such as a complex command structure, a five-key chord keyboard, and the high hardware cost of video displays. It nevertheless set the stage for research and development in computer-supported cooperative work (CSCW) in the 1980s and 1990s.
THREE TRADITIONS WHICH STARTED THE IDEA OF COMPUTING
1. Administration
The administration tradition aimed to optimize office efficiency through the use of
technological innovations such as typewriters, calculators, and sorting machines. The
efficiency movement that began in 1911 in Amsterdam helped streamline and modernize
office processes. This movement laid the foundation for the widespread adoption of office
automation technologies, which eventually led to the development of modern computers
and software for business applications. As a result, today's offices rely heavily on computer
systems for tasks like data processing, communication, and document management,
reflecting the long-lasting impact of the administration tradition.
2. Process Control
The process control tradition focused on automating and controlling large machinery in industrial settings such as factories and plants (Shell, for example). Because of the scale and complexity of these machines, it was crucial to find ways to control them remotely, as it was dangerous or inefficient for humans to work in close proximity to them. This need for automation led to the
development of computer systems that could monitor and regulate industrial processes,
ensuring safe and efficient operation. The process control tradition has had a significant
impact on modern manufacturing and industrial operations, as automation, robotics, and
computer-based control systems are now common features in factories around the world.
3. Science and Engineering
The science and engineering tradition played a vital role in the advancement of computing
technology. Before the 1950s, the term "computers" referred to people who performed
complex calculations for various scientific and engineering applications. However, with the
invention of electronic computers, these tasks could be performed much faster and more
accurately by machines. Early computers were used for a wide range of applications,
including weather prediction, mechanical calculations, and economic forecasting. The
science and engineering tradition has helped drive innovations in computing hardware and
software, leading to the development of powerful tools and technologies that continue to
transform research and development across various disciplines.
In conclusion, the three traditions of administration, process control, and science and engineering
have collectively contributed to the development and widespread adoption of computing
technology. While each tradition had its unique focus and objectives, they all shared a common goal:
to leverage technology for greater efficiency, precision, and problem-solving capabilities. Today's
ubiquitous computing landscape is a testament to the enduring impact of these three traditions.
Sounds
In the 1950s and 1960s, sounds played a significant role in the operation and use of computers. Early computers lacked the screens and visual interfaces we are familiar with today, so sounds served as important indicators for engineers and operators to understand the machine's status and progress.
Some of the primary functions of sounds in early computing included:
1. Announcing program termination: Computers of the 1950s and 1960s often used sounds to
signal the completion of a task or program. This auditory cue helped operators know when a
particular computation was finished, allowing them to proceed to the next step or address
any issues that might have arisen during the process.
2. Auditory monitoring: Sounds served as a simple and effective way to monitor the
computer's operation continuously. As long as the machine produced the expected sounds,
operators could be confident that it was functioning correctly. This form of auditory
monitoring was particularly useful in the absence of visual interfaces, like screens or
graphical displays.
3. Specific navigation within a program: In some cases, sounds were used to facilitate
navigation within a program or to indicate specific events or statuses. For example, different
tones or sequences of tones could be used to signal the completion of a subroutine or the
occurrence of an error.
4. Part of the instruction: In certain instances, sounds were intentionally incorporated into
computer programs as a form of instruction or user feedback. This could include producing
specific sounds in response to user input or using sounds to convey information about the
program's state or progress.
As computing technology evolved, engineers began to incorporate special speakers into new
computer designs to produce sounds more deliberately. This development was a response to the
growing recognition of the value of auditory feedback in computer operation, particularly in an era
where visual interfaces were limited or non-existent.
In conclusion, sounds played a crucial role in the use of computers in the 1950s and 1960s, serving as
indicators of machine operation, program status, and user feedback. The reliance on auditory cues in
early computing highlights the creative ways engineers and operators adapted to the limitations of
technology at the time and laid the foundation for the more sophisticated audiovisual interfaces we
enjoy today.
Computers and education
Discuss two ways in which education and computing stimulated each other mutually in the 1960s.
In the 1960s, education and computing began to influence and stimulate each other mutually in
various ways, leading to a more intertwined relationship between the two fields. Two notable ways
in which this occurred were the inclusion of computer learning in the curriculum and the use of
computers in the classroom.
1. Learning about computers in the curriculum: As computers started to play a more significant
role in society during the 1960s, there was a growing recognition of the importance of
computer literacy. Schools and universities began to incorporate computer science and
programming courses into their curriculums, teaching students the foundational principles
of computing, such as algorithms, data structures, and programming languages. This early
exposure to computing concepts helped foster a generation of computer-literate individuals
who would go on to advance the field further, both as computer scientists and as
professionals in various other domains where computing played a role. In this way,
education stimulated the growth and development of computing by nurturing a skilled
workforce with a strong understanding of computing principles.
2. Use of computers in the classroom: The 1960s also saw the development and implementation of computers for classroom use. One such system was PLATO, an early computer-based education system that allowed students to interact with course material through text, graphics, and (in later versions) touch-sensitive screens. The system provided individualized instruction and allowed students to progress at their own pace. Teachers could also use PLATO to monitor student progress and tailor their instruction accordingly.
The mutual stimulation between education and computing in the 1960s led to a symbiotic
relationship that benefited both fields. The inclusion of computer learning in the curriculum
prepared students for careers in the rapidly growing technology sector, while the use of computers
in the classroom, such as with the PLATO system, revolutionized the way students learned and
interacted with educational content.
Computer
Why was the word "computer" not used to denote the machines in the early 1950s? Explain how this
changed and mention names of machines illustrating that change.
In the early 1950s, the word "computer" was not commonly used to refer to machines because,
historically, the term "computer" referred to a human who performed calculations or computations.
These human computers were often employed in scientific, engineering, or military contexts to carry
out complex mathematical tasks. With the advent of electronic computing machines, the meaning of
the term gradually shifted to include these new devices.
One of the first electronic computing machines that contributed to this shift in terminology was the
ENIAC, developed in the mid-1940s at the University of Pennsylvania. ENIAC was designed to
perform high-speed arithmetic operations, and its success demonstrated the potential for electronic
devices to replace human computers in various fields.
Machine names from the period illustrate the shift: ENIAC spelled out "Electronic Numerical Integrator and Computer," UNIVAC stood for "UNIVersal Automatic Computer," and IBM called its early machines "calculators" (the Selective Sequence Electronic Calculator of 1948, for example) before labeling the 701 an "Electronic Data Processing Machine." As electronic computing machines became more sophisticated and widespread throughout the 1950s and beyond, the term "computer" increasingly came to denote these machines instead of human calculators. This shift in terminology reflects the rapid evolution of computing technology and its growing importance in various aspects of modern life.
Movie industry
In what ways did computing change the movie industry between 1980 and 2000?
Between 1980 and 2000, computing technology not only revolutionized the technical aspects of filmmaking and distribution but also influenced the themes and narratives of movies themselves. As computers and digital technology became more prevalent in society, they began to feature prominently in the stories and settings of films, shaping the way movies depicted the world and our relationship with technology, and often raising thought-provoking questions about the consequences and ethical implications of an increasingly digital world.
As computers and the internet became more widespread, movies such as "Blade Runner" (1982) and "The Matrix" (1999) often featured dystopian futures in which advanced technology, artificial intelligence, and virtual reality played central roles. These movies explored themes related to the potential dangers and ethical implications of technology, as well as the nature of reality and human identity in a digital world.
Computer-generated imagery (CGI): The development of CGI allowed filmmakers to create realistic
visual effects that were previously impossible or too costly to achieve using traditional methods. The
1982 film "Tron" was one of the first movies to make extensive use of CGI, and since then, CGI has
become an integral part of modern filmmaking. Films like "Jurassic Park" (1993) and "Toy Story"
(1995) further showcased the potential of CGI, paving the way for a new era of visually stunning and
innovative movies.
These developments in computing technology between 1980 and 2000 fundamentally changed the
movie industry, opening up new creative possibilities and reshaping the way films were produced,
edited, and distributed. The impact of these innovations can still be felt today, as the industry
continues to evolve and embrace new digital technologies.
Science
Can ERA be seen as influential in computing in the 1950s?
Yes. Engineering Research Associates (ERA), founded in St. Paul, Minnesota in 1946 by veterans of the Navy's wartime code-breaking effort, was influential in 1950s computing. One of its most notable achievements was the ERA 1101, also known as the Atlas, completed in 1950. The ERA 1101 was an early stored-program computer and one of the first machines to use magnetic drum memory, a slower but far cheaper alternative to the electronic memories used in most computers of the time. The machine was designed for the United States Navy to perform code-breaking tasks and was later commercialized as the UNIVAC 1101.
In 1952, ERA was acquired by Remington Rand. The ERA team, under the new company, continued to contribute to the advancement of computer technology. For instance, the ERA 1103, developed in 1953, was an improved version of the 1101, and its later 1103A model was among the earliest commercial computers to use magnetic core memory, a more efficient and reliable memory technology that became the standard for computers in the following decades.
Many ERA employees went on to have significant careers in the computing industry. Notably, ERA co-founder William Norris founded Control Data Corporation (CDC) in 1957. CDC became a major player in the computer industry and developed several innovative and influential computers, including the CDC 6600, designed by fellow ERA alumnus Seymour Cray, which was the world's fastest computer in the mid-1960s.
In summary, the Engineering Research Associates was influential in computing during the 1950s,
both through its development of early computer technology, like the ERA 1101 and ERA 1103, and
through the subsequent contributions of its employees who went on to make significant impacts on
the computing industry.
Everywhere and nowhere
Was the history of computing made irrelevant with the introduction of the smartphone?
No, the history of computing has not been made irrelevant by the introduction of the smartphone; if anything, the smartphone demonstrates that history's continuing relevance.
The introduction of the smartphone has undoubtedly revolutionized the way we interact with
technology and access information. However, to argue that the history of computing has been made
irrelevant by the smartphone would be a significant oversimplification. The smartphone is, in fact, a
product of the rich and diverse history of computing, building upon the foundations established by
earlier innovations and breakthroughs.
The book "A New History of Modern Computing" highlights key milestones in the development of
computing, including the invention of early calculating machines, the creation of the first
programmable computers, the rise of personal computers, and the development of the internet.
Each of these milestones has contributed to the evolution of computing technology, shaping the
devices and systems we use today, including smartphones.
Smartphones are essentially compact, portable computers with the ability to make phone calls, send
messages, and access the internet. Their development was made possible by advances in hardware
and software, as well as breakthroughs in wireless communication and battery technology. These
innovations can be traced back to earlier developments in the history of computing, such as the
invention of the microprocessor, the creation of graphical user interfaces, and the establishment of
the World Wide Web.
Moreover, the history of computing is not solely defined by the devices themselves, but also by the
people, ideas, and social changes that have shaped the field over time. The pioneers of computing,
such as Alan Turing, Grace Hopper, and Tim Berners-Lee, have had a lasting impact on the way we
approach problem-solving, programming, and information sharing. Their work, along with that of
countless other innovators, remains relevant as we continue to develop new technologies and face
new challenges in the world of computing.
In conclusion, the introduction of the smartphone has not made the history of computing irrelevant.
Rather, it highlights the importance of understanding the context and evolution of technology, as
each innovation builds upon the foundation laid by previous generations. The history of computing
provides valuable insights and lessons that continue to inform the development of new
technologies, including smartphones and beyond.
Inventing the computer
Describe the clusters of ideas that determined the influence of EDVAC, according to Haigh and
Ceruzzi.
The EDVAC was an early electronic computer design that emerged from the Moore School group of J. Presper Eckert and John Mauchly and was described in John von Neumann's 1945 First Draft report; its design had a profound impact on subsequent computer architecture. Its most novel feature was the stored-program concept, in which both instructions and data are held in the same memory. This idea allowed computers to be reprogrammed more easily and efficiently and laid the foundation for modern computer programming. Haigh and Ceruzzi separate EDVAC's enormous influence into three clusters of ideas:
1. The EDVAC hardware paradigm – an all-electronic machine with a large, high-speed memory using binary number storage.
2. The von Neumann architecture paradigm – the structure of the machine was broken down into parts referred to as "organs," of which "memory" was the only term that stuck; storage and arithmetic were binary, using bits to encode information.
3. The modern code paradigm – programs were held in memory as sequences of instructions, each pairing an operation code with parameters or an address, and the flow of instructions mirrored the way human computers worked through a calculation (see the sketch after this list).
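To illustrate the stored-program concept and the modern code paradigm, here is a toy fetch-decode-execute loop in Python. The instruction set (LOAD, ADD, STORE, HALT) is invented for this sketch and is not EDVAC's actual order code; the point is only that instructions and data occupy the same memory and that each instruction pairs an operation code with an address.

    # Toy stored-program machine: instructions and data share one memory.
    # The opcodes below are invented for illustration, not EDVAC's order code.
    LOAD, ADD, STORE, HALT = "LOAD", "ADD", "STORE", "HALT"

    memory = [
        (LOAD, 5),      # 0: acc <- memory[5]
        (ADD, 6),       # 1: acc <- acc + memory[6]
        (STORE, 7),     # 2: memory[7] <- acc
        (HALT, None),   # 3: stop
        None,           # 4: unused
        2,              # 5: data
        3,              # 6: data
        0,              # 7: result is written here
    ]

    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, addr = memory[pc]           # fetch the next instruction from memory
        pc += 1
        if op == LOAD:
            acc = memory[addr]
        elif op == ADD:
            acc += memory[addr]
        elif op == STORE:
            memory[addr] = acc
        elif op == HALT:
            break

    print(memory[7])   # prints 5: the program altered the memory it shares with its data

Because the program lives in the same memory as its data, it can be loaded or replaced like any other data, which is what made reprogramming so much easier than rewiring.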
Historical mistakes
Discuss two kinds of mistakes one can make while doing history, and give for each of these kinds of mistakes an example from the history of computing.
Mistake 1: Not knowing your sources
One common mistake historians can make is not thoroughly evaluating their sources. In the field of
history, primary and secondary sources are crucial for understanding events, trends, and ideas.
Failure to consider the reliability, accuracy, and potential biases of sources can lead to
misinterpretations or the perpetuation of myths.
Example:
A historian, while writing about the development of the World Wide Web, might rely on an
inaccurate source that claims Tim Berners-Lee invented the internet. In reality, Tim Berners-Lee
invented the World Wide Web, a system that operates over the internet, while the internet itself
was a result of various researchers' efforts, including ARPANET, which was developed by the U.S.
Department of Defense in the 1960s. By not verifying the accuracy of their sources, the historian
would be misattributing the invention of the internet and perpetuating a common misconception.
Mistake 2: Anachronism
Another common mistake is the presence of anachronisms, which involve placing events, objects, or
ideas in the wrong historical context. This can occur when historians inaccurately attribute modern
concepts or technologies to earlier time periods or fail to account for the historical context in which
events took place.
Example: A historian discussing the history of modern computing might claim that the use of social
media platforms like Facebook and Twitter played a significant role in the spread of personal
computers during the 1980s and 1990s. This would be an anachronism because Facebook and
Twitter were not launched until 2004 and 2006, respectively. Social media's impact on personal
computer adoption should be analyzed within the context of the 2000s and beyond, not in the
earlier decades of personal computer history.
Mistake 3: Defining your subject too narrowly
Example: A historian writing a new history of modern computing might focus exclusively on hardware advancements like processors, memory, and storage devices, while neglecting the development of software, programming languages, and the growth of the open-source movement. By defining the subject of modern computing history too narrowly, the historian would be overlooking the crucial role that software innovations and collaborative development have played in shaping the computing landscape, leading to an incomplete understanding of the field's evolution.
In conclusion, it is essential for historians to know their sources, to avoid anachronisms, and to define their subject with care when studying and interpreting the past. Failing to do so can result in an inaccurate or incomplete understanding of historical events and the context in which they occurred.
Science
To what extent did the Cray-1 supercomputer influence the history of modern computing?
The Cray-1 supercomputer, developed by Seymour Cray and released in 1976, had a significant
impact on the history of modern computing. It was one of the most powerful computers of its time
and set a new benchmark for high-performance computing. The Cray-1 influenced the field of
supercomputing in several ways:
Performance: At the time of its release, the Cray-1 was the fastest computer in the world, capable of
processing 160 million floating-point operations per second (MFLOPS). It achieved this performance
through innovative design features such as vector processing, pipelining, and a high-speed memory
system. The Cray-1's impressive performance opened up new possibilities in computational science
and engineering, enabling researchers to tackle more complex and data-intensive problems.
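As a loose analogy for the vector-processing programming model (written in Python with NumPy, and not reflecting the Cray-1's actual hardware or instruction set), the sketch below contrasts an element-by-element scalar loop with the same computation expressed once over whole arrays.

    import numpy as np

    n = 100_000
    a = np.random.rand(n)
    b = np.random.rand(n)

    # Scalar style: one multiply-add per loop iteration.
    c_scalar = np.empty(n)
    for i in range(n):
        c_scalar[i] = a[i] * b[i] + 1.0

    # Vector style: the same multiply-add expressed over the whole arrays at once,
    # loosely analogous to a vector machine applying one instruction to many operands.
    c_vector = a * b + 1.0

    assert np.allclose(c_scalar, c_vector)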
Commercial success: The Cray-1 was not only a technical achievement but also a commercial
success. It was adopted by numerous research institutions, government agencies, and private
companies, and its sales helped establish Cray Research as a dominant player in the supercomputing
market. The success of the Cray-1 demonstrated the demand for high-performance computing and
helped pave the way for the development of future generations of supercomputers.
In summary, the Cray-1 supercomputer had a significant impact on the history of modern computing
by setting new performance benchmarks, introducing innovative design features, and driving the
development of high-performance computing technologies. Its legacy can be seen in the evolution of
supercomputers and the continued pursuit of higher computational performance in modern
computing systems.
Inventing the computer
How do Haigh and Ceruzzi explain their choice to start the history of modern computing with ENIAC?
While they acknowledge that there were several computing machines and projects that predate the
ENIAC (Electronic Numerical Integrator and Computer), they choose to start the history of modern
computing with ENIAC due to several factors:
1. General-purpose design: Unlike some earlier computing machines, the ENIAC was designed
to be a general-purpose computer, capable of solving a wide range of numerical problems.
This flexibility and versatility set the stage for the development of subsequent computers
that could be programmed to perform various tasks, which is a defining characteristic of
modern computing.
2. Influence on subsequent computer development: The design and construction of the ENIAC
had a profound impact on the development of subsequent computers. The project brought
together a team of engineers and mathematicians, including J. Presper Eckert and John
Mauchly, who would go on to make significant contributions to the field of computing.
Furthermore, the ideas and techniques developed during the ENIAC project influenced the
design of later computers like the EDVAC, which in turn shaped the development of modern
computer architecture.
3. Publicity and awareness: The ENIAC was unveiled to the public in 1946, and it received
widespread media coverage, capturing the imagination of the public and the scientific
community. This publicity helped raise awareness of the potential of electronic computers
and stimulated interest and investment in computer research, contributing to the rapid
development of the field in the following years.
By focusing on the ENIAC, Haigh and Ceruzzi emphasize the machine's groundbreaking performance,
general-purpose design, influence on subsequent computer development, and its role in raising
public awareness of electronic computing. These factors make the ENIAC a logical starting point for
the history of modern computing, even though there were earlier computing devices and projects
that contributed to the development of the field.
Communications platform
In what sense did the early MIT electronic mail system contribute to the rise of computers as a
communications platform?
The early electronic mail system developed at MIT was part of its Compatible Time-Sharing System (CTSS) and played a significant role in the rise of computers as a communications platform. Before this innovation, computers were primarily seen as tools for processing data and performing calculations. However, the development of the MAIL command showcased the potential of computers for facilitating communication and collaboration. This early email system contributed to the rise of computers as a communications platform in several ways:
Proof of concept for electronic messaging: CTSS MAIL demonstrated that it was feasible to use computers for sending and receiving messages electronically. The introduction of electronic mail on the CTSS platform enabled users to communicate with each other more easily; some people even accessed the system remotely over phone lines from other institutions, fostering collaboration and information exchange. This innovation helped shape the perception of computers as interactive tools for communication and collaboration, rather than just standalone machines for processing data.
Influence on subsequent email systems and online communication: The early electronic mail system at MIT laid the groundwork for the development of more sophisticated email systems and other online communication tools. These developments eventually led to the widespread adoption of email as a primary means of communication in academia, business, and personal life.
Contribution to networked computing: The idea of using computers for communication, as demonstrated by the CTSS MAIL system, encouraged research into computer networks, such as ARPANET, which later evolved into the modern internet. The realization that computers could be powerful tools for communication and collaboration ultimately transformed the way people communicate, share information, and work together.
In summary, the early MIT electronic mail system was an essential milestone in demonstrating the
potential of computers as a communications platform. It provided a proof of concept for electronic
messaging, fostered the perception of computers as interactive tools, influenced the development of
subsequent email systems and online communication tools, and contributed to the rise of
networked computing.
Communications platform
In what sense do Haigh and Ceruzzi characterize ARPANET as a precursor to Internet?
Thomas Haigh and Paul Ceruzzi, as historians of computing, characterize the ARPANET as a precursor
to the Internet because it laid the groundwork for the development of the global network we know
today. ARPANET was a pioneering computer network developed by ARPA in the late 1960s and
1970s. The ARPANET was designed to facilitate communication and resource sharing among
research institutions and government agencies.
Haigh and Ceruzzi highlight several aspects of the ARPANET that contributed to its status as a
precursor to the Internet:
1. Packet switching: One of the key innovations of the ARPANET was the development and implementation of packet-switching technology. This method of breaking data into smaller packets, transmitting them individually, and reassembling them at the destination allowed for more efficient and robust communication over a network. Packet switching became a fundamental technology for modern computer networks, including the Internet (a toy illustration of the idea follows this list).
2. Network protocols: Experience with the ARPANET led to the development and, in 1983, the adoption of the Transmission Control Protocol/Internet Protocol (TCP/IP), which became the standard for network communication and remains the foundation of the Internet today. The development of these protocols was essential for the growth and interoperability of different computer networks.
3. Growth and interconnection of networks: The ARPANET began as a small network
connecting a few research institutions, but it rapidly expanded, connecting more
universities, government agencies, and private organizations. As the network grew, it
became necessary to interconnect ARPANET with other networks, giving rise to the concept
of an "internet" – a network of networks. This idea of connecting different networks
eventually led to the development of the modern Internet.
4. Development of key applications and technologies: The ARPANET provided a platform for
the development of various applications and technologies that would later become
fundamental to the Internet. For example, electronic mail (email) was first developed on the
ARPANET and subsequently evolved into one of the most widely used applications on the
Internet. Similarly, file transfer protocols, remote access, and other networking technologies
have their roots in the ARPANET era.
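As a toy illustration of the packet-switching idea referred to in point 1 (and not of the ARPANET's actual IMP protocols), the Python sketch below splits a message into numbered packets, shuffles them to simulate independent routing, and reassembles them in order at the destination. The function names and packet size are invented for the example.

    import random

    def to_packets(message: str, size: int = 8):
        """Split a message into (sequence_number, chunk) packets."""
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets):
        """Reorder packets by sequence number and join their payloads."""
        return "".join(chunk for _, chunk in sorted(packets))

    message = "Packets may arrive out of order over different routes."
    packets = to_packets(message)
    random.shuffle(packets)            # simulate packets taking different paths
    assert reassemble(packets) == message
    print(reassemble(packets))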
In summary, Haigh and Ceruzzi characterize the ARPANET as a precursor to the Internet because it
introduced key innovations such as packet switching, network protocols, and the concept of
interconnected networks. Additionally, it provided a platform for the development of essential
applications and technologies that are now integral to the modern Internet. The ARPANET's
pioneering work laid the groundwork for the evolution of the global network that we know as the
Internet today.
Inventing
Discuss two of the driving forces behind the construction of electronic computers in the 1940s.
1. The need for efficient code-breaking and cryptography during World War II: World War II played a crucial role in accelerating the development of electronic computers. Code-breaking and cryptography were essential for intelligence gathering, as the ability to decipher encrypted enemy communications could provide a strategic advantage. The British, for instance, were particularly interested in breaking German encryption systems such as Enigma and the Lorenz teleprinter cipher.
This need for efficient code-breaking led to the construction of the Colossus, designed by Tommy Flowers and his team at the Post Office Research Station for the Government Code and Cypher School at Bletchley Park. Colossus was an early programmable electronic machine, built specifically to speed up the decryption of the Lorenz-encrypted messages used by the German high command; unlike ENIAC, it was a special-purpose device that could not be reprogrammed for fundamentally different tasks. Even so, its success demonstrated the potential of electronic machines to solve complex problems rapidly, paving the way for further research and development in the field of computing.
2. The demand for efficient scientific and engineering calculations: The 1940s also saw a
growing need for efficient scientific and engineering calculations to solve complex problems
in various fields such as physics, mathematics, and engineering. These calculations, which
were previously performed by human "computers," were time-consuming and prone to
errors.
One of the earliest electronic computers designed to address this need was the Electronic Numerical
Integrator and Computer (ENIAC), developed by John W. Mauchly and J. Presper Eckert at the
University of Pennsylvania. The ENIAC was initially intended for artillery trajectory calculations for
the U.S. Army, but its applications quickly expanded to other scientific and engineering domains,
including atomic energy research and weather prediction. The ENIAC's success demonstrated the
potential of electronic computers to perform complex calculations at unprecedented speeds,
significantly advancing research and development in various fields.
In summary, the need for efficient code-breaking during World War II and the demand for rapid
scientific and engineering calculations were two major driving forces behind the construction of
electronic computers in the 1940s. These early developments laid the groundwork for the modern
computing landscape, transforming the way we process and analyze information today.