
History of Computers: A Timeline of Key Events & Inventions

1822: Charles Babbage conceives the Difference Engine, a mechanical machine for automatically computing mathematical tables.
1837: Charles Babbage proposes the Analytical Engine, a general-purpose programmable mechanical computer whose design includes an arithmetic logic unit (ALU), memory, and conditional branching; it is considered the precursor to modern computers.
1843: Ada Lovelace publishes her notes on Babbage's Analytical Engine, including what is often regarded as the first computer program; she is widely considered the world's first computer programmer.
1854: George Boole publishes "An Investigation of the Laws of Thought," laying the foundation for the Boolean logic used in digital circuits.
1890: Herman Hollerith's punched-card tabulating machine, a precursor to modern data processing, is used to tabulate the U.S. Census.
1936: Alan Turing publishes "On Computable Numbers," introducing the theoretical "Turing machine" and the concept of a universal computing machine, laying the foundation for modern computer science.
1941: Konrad Zuse completes the Z3, the world's first working programmable, fully automatic digital computer.
1942: John Atanasoff and Clifford Berry complete the Atanasoff-Berry Computer (ABC), the first electronic digital computing device built in the U.S.
1943: Colossus, the first programmable electronic digital computer, is built in the UK to break encrypted German messages during World War II.
1945: ENIAC (Electronic Numerical Integrator and Computer), the first general-purpose electronic digital computer, is completed at the University of Pennsylvania and publicly unveiled in 1946.
1947: William Shockley, John Bardeen, and Walter Brattain invent the transistor at Bell Labs.
1949: The Electronic Delay Storage Automatic Calculator (EDSAC), one of the first practical stored-program computers, runs its first program at Cambridge.
1951: The UNIVAC I (Universal Automatic Computer), designed by Mauchly and Eckert and the first commercially produced computer in the U.S., is delivered to the U.S. Census Bureau.
1953: The IBM 701, IBM's first commercially available scientific computer, is introduced.
1954: The IBM 650, the first mass-produced computer, begins shipping.
1958-59: Jack Kilby (1958) and Robert Noyce (1959) independently invent the integrated circuit, leading to the miniaturization of electronic components.
1959: IBM introduces the 1401, one of the most commercially successful early transistorized computers.
1964: IBM announces the System/360, a compatible family of mainframe computers built around a 32-bit architecture.
1968: Douglas Engelbart publicly demonstrates the first computer mouse, invented in 1964 as part of his work on interactive computing and human-computer interaction.
1969: Multics, an influential time-sharing operating system, becomes operational on the GE 645 mainframe.
1969: ARPANET, the precursor to the internet, is
established, connecting computers at four U.S. research
institutions.
1971: Intel releases the first commercially available microprocessor, the Intel 4004, marking the beginning of the microcomputer era.
1971: IBM introduces the first floppy disk.
1972: The Magnavox Odyssey, the first home video game console, is released.
1976: Steve Jobs and Steve Wozniak found Apple
Computer, Inc., launching the Apple I, one of the first
personal computers.
1977: The Apple II, one of the first successful mass-produced personal computers, is released.
1981: IBM introduces the IBM Personal Computer (IBM PC), which quickly becomes the industry standard for personal computing.
1984: Apple introduces the Macintosh, popularizing the graphical user interface.
1985: Microsoft releases Windows 1.0.
1989: Tim Berners-Lee proposes the World Wide Web, laying the groundwork for the modern internet and the proliferation of online information.
1991: Linus Torvalds releases the Linux kernel, a free
and open-source operating system kernel that powers
many modern computing systems.
1993: Intel releases the Pentium microprocessor, boosting graphics and multimedia performance on PCs.
1995: The Pentium Pro is released; Intel's MMX multimedia extensions follow in the Pentium MMX in 1997.
1998: Google is founded by Larry Page and Sergey Brin,
revolutionizing internet search and becoming one of the
world's largest technology companies.
2000: The first USB flash drives go on sale.
2001: Apple releases the first iPod.
2007: Apple releases the iPhone, revolutionizing the
smartphone industry and paving the way for mobile
computing as we know it today.
2010s: Cloud computing services such as Amazon Web Services (launched 2006), Google Cloud Platform, and Microsoft Azure reshape how computing resources are provisioned and managed.
2016: IBM makes a 5-qubit quantum processor publicly accessible over the cloud, showcasing the potential of quantum computing.
2017: Companies including IBM, Google, and Microsoft announce steady progress toward larger quantum processors, though practical quantum computers that outperform classical machines on useful problems remain a research goal. Meanwhile, artificial intelligence and machine learning technologies continue to advance, impacting many industries and everyday life.