
A History of Software
It is anyone’s guess how far computer technology will advance. And as software applications have
become vital to virtually every aspect of modern life, it is anyone’s guess how fully integrated
technology will become in modern living. Technology has evolved rapidly in recent decades, building
upon the innovations of earlier eras with greater speed than at any other time in history.
And speed is a primary reason for this acceleration. Since the Industrial Revolution, humankind has sought ways to
become ever more efficient in all realms of life from production at the factory to cooking meals at
home. Today the mediating force between technology and humans is software.
Through coded instructions organized into programming languages, software allows individualized
access to the complicated interaction of input and output technology, memory, and processing—that
is, individualized control over the hardware components. Electronic data management proved useful in
its earliest commercial applications such as employee payroll and airline reservations, but even the
earliest pioneers in software development could never have predicted the personal computer and its
full range of applications in the home. Software, though ubiquitous now, emerged only gradually
in the computer industry, and it took individual contributions from a number of brilliant minds
for it to evolve into the products and services we take for granted today.
1._________________________________
Mechanical methods of computation were forever changed by the advent of electronics, and electronic
computation was forever changed by the versatility that software provided. The early problem in
electronic computation was in distinguishing among distinct numerical quantities. This problem led to
the development of an electronic pulse technique, or the simple distinction between off and on, or 0
and 1: binary code (Glass 1998). Before programs began to be written to make the most of electronic
computing power, the computer industry was dominated by engineers developing hardware. The
history of the computer industry extends at least as far back as Edwin Howard Armstrong, who in the
early twentieth century improved radio transmission with a receiver called the “three-electrode valve
(or triode)…[that] was to be the seed of modern electronics, computers and the Internet” (Evans
2004).
In the first couple of decades of the twentieth century, three distinct entrepreneurial forces combined to
form the behemoth electronics and computer company that has operated for nearly a century:
International Business Machines or, simply, IBM. First, the critically important punch-card tabulating
machine company of Herman Hollerith was absorbed by Charles Flint’s Computing-Tabulating-Recording Company (CTR) in 1911. Second, Thomas Watson, the man who would eventually
transform CTR into IBM, cut his teeth at John Patterson’s National Cash Register Company. Patterson
was an intense and distinguished salesperson who used rallying slogans, an emphasis on sales and
service, and technological innovation to create “America’s first national sales force” (Evans 2004). And
third, Watson’s unique ability to unite the divided CTR combined with his sales and marketing
experience helped him transform the company through a focus on engineering and technology, such
that by the time of the New Deal, IBM was in the position to lead the nation in mechanical computation
products.
From the 1930s to the 1950s, punch cards became the driving force of corporate America as they
were used in virtually every office accounting machine. Software pioneer Raymond Houghton recalls
the “punched little rectangular holes in [decks of] cards” that were read by computing machines
without operating systems well into the 1960s (Glass 1998). Cards were notated with programming
languages such as IBM’s FORTRAN (FORmula TRANslation) and the U.S. Department of Defense’s
COBOL (COmmon Business Oriented Language), and they were combined with coded instruction sets such as
compilers and assemblers to make computing more efficient. Compilers “automated the process
of selecting and reusing code to create programs” while assemblers were “program[s] that translated
between a more recognizable assembly notation and machine code” (Yost 2005). These slow,
sometimes unreliable language-programming methods were basically analogue precursors to digital
software programming techniques.
With pressure from emerging competitors in the field, Thomas Watson steered his company into
electronics and hired new engineers en masse to develop IBM’s own mainframe calculating machines.
The early massive mainframe computation machines were composed of tons of steel and glass,
hundreds of thousands of parts, and thousands of vacuum tubes and clunky relays. Early electronic
relays, such as those on the University of Pennsylvania’s Pentagon-sponsored Electronic Numerical
Integrator and Computer (ENIAC), had to be manually plugged and unplugged individually according
to the specific computations being programmed. Watson’s son, Thomas Watson, Jr., later observed ENIAC in
action and was not convinced such immense, unreliable machines could ever be useful in business
applications. Yet he would eventually lead IBM into a dominating position in the computer industry as
mainframes evolved and software became a codependent yet distinct field.
At the time, however, ENIAC’s designers got the attention of Prudential Insurance (a major client of
punch-card technology that was finding it increasingly difficult to store millions of programming and
archived cards) and the U.S. Census Bureau when they proposed digital computation with magnetic
tape technology for storage. According to Paul E. Ceruzzi of the Smithsonian, this step was the critical
one leading to the development of programming “as something both separate from and as important as
hardware design” (Evans 2004). In the 1950s, computer hardware technology improved with the
development of magnetic-core memory, transistorized circuits instead of vacuum tubes, and random-access storage. Software programs were needed to manage complex needs like those of the
growing airline industry, which handled the massive “flow of bookings, cancellations, seat
assignments, availability of seats, [and] connecting flights” among other such “complications.” The
need for computer software was becoming painfully evident (Evans 2004).
2.___________________________________
As early as 1939, scientists such as William Shockley theorized that diminutive semiconductors would
replace vacuum tubes. Indeed, all of modern electronics is based on Shockley’s ideas.
Semiconductors can handle electronic pulses at the rate of billions of times per second, instead of the
10,000-times-per-second speed of the clunky and precarious vacuum tubes. Fairchild Semiconductor
entered the market to compete with Shockley Semiconductor, and soon Fairchild became known for
an innovation in semiconductors that is now familiar around the world: the use of silicon.
Silicon, “a commonplace mineral that constitutes 90 percent of the earth’s surface,” was first used by
Fairchild for U.S. Air Force rockets in transistors that needed to withstand intense heat. Additional
elements were combined with silicon on flattened transistors to create the first integrated circuits
capable of handling multiple devices and increasingly complex software applications. “Silicon Valley”
was born as innumerable high-tech companies emerged on the scene, congregating at the southern
end of California's San Francisco Bay area. Perhaps most notably, Integrated Electronics, or Intel, was
founded and new advances in memory chips and microprocessors allowed computers to handle
software light years more complex than the single mathematical computations of the original
mainframes (Evans 2004).
3.________________________________________
Microsoft's MS-DOS was directly modeled on a now lesser-known operating system called CP/M that
was developed by University of Washington graduate Gary Kildall’s Digital Research (DRI). Kildall’s
work was essential to Bill Gates and Microsoft (which was originally founded to sell the Beginners’ All-
Purpose Symbolic Instruction Code (BASIC) programming language interpreter for hobbyists to write
their own programs), but so were the early personal computer developments of Apple and its
subsequent graphical user interface (GUI) that preceded Windows. It is Kildall’s work, nevertheless,
that truly shaped Microsoft and much about modern computing. Evans theorizes that had Kildall had
his way, the personal computer industry would have had access to multitasking windows-style
platforms much sooner and the entire industry would be much more advanced today. Still, Kildall is
credited with the ideas that were “the genesis of the whole third-party software industry” (2004).
Gary Kildall’s style of programming helped drive the transition from mechanical computing to digital
computing. Kildall developed open language programming years before IBM’s PC, and a number of
months before Apple. In short, before microcomputers even existed, Kildall authored a programming
language “for a microcomputer operating system and the first floppy disk operating system” (Evans
2004). Intel’s microprocessors were already running everything from microwaves to watches, but
Kildall imagined them in home computers running software that would drive networks and wouldn’t be
bogged down by hardware compatibility issues. His Programming Language for Microcomputers
(PL/M) evolved into the Control Program for Microcomputers (CP/M), which contained the first PC
prompt, wherein Kildall could open and store files in directories--work that is now done seemingly
automatically as users click and drag files through virtual space on the computer desktop.
Next, Kildall’s basic input/output system (BIOS) could be easily changed by programmers to adapt to
their specific hardware. Kildall’s software advancements were easily adapted into clone systems,
though Kildall had largely retained licensing rights to his software through encoded copyright and
encryption techniques. One operating system, however, Tim Paterson’s DOS, or the Quick ’n’ Dirty
Operating System (QDOS), was developed for Rod Brock’s Seattle Computer Products. QDOS,
according to Evans, “was yet another one of the rip-offs of the CP/M design” that would not have
necessarily mattered had IBM’s business arrangements not aligned with those of Bill Gates. Spurred
by the success of Steve Jobs and Steve Wozniak’s Apple products from the late 1970s and 1980s,
IBM entered the field of microcomputers. Bill Gates seized the opportunity of Kildall’s delayed CP/M-86 (which was being designed for the faster Intel chip IBM had decided upon) and purchased Paterson’s
operating system in order to strike a deal (2004).
The trouble was, Kildall had already made arrangements with IBM, and he thought he had successfully
negotiated a share of the market for CP/M upon the release of IBM’s new personal computer in 1981. But
the final price point of CP/M was six times that of Microsoft’s PC-DOS, effectively pricing CP/M out of
the market. Kildall had been betrayed. Ironically, only Kildall knew the limitations of CP/M and PC-DOS. His plans for multitasking operating software would have revolutionized the industry at that
time, but the IBM-Microsoft partnership dominated the American market and they evolved at their own
pace. Meanwhile, Kildall kept his operation afloat with his European offices, which embraced the
multitasking capacities of his MP/M OS.
While Kildall went on to innovate in areas from CD-ROMs to computer networking, DRI combined the
graphic display technology of Atari with the expertise of former Microsoft programmer Kay Nishi and
cloned the single-tasking MS-DOS with their DR-DOS. Upon entering the market, DR-DOS not only
drove down Microsoft’s price point, but also fixed a number of MS-DOS bugs. This move helped lead
to Novell’s acquisition of DRI in 1991 for $120 million. Gates missed the opportunity to acquire DRI for
$10 million a few years earlier but, oddly enough, his investment in the ideas of Steve Jobs in 1997
helped Apple enter successful new fields of digital innovation such as the iPod and music downloading
software, a field that, of course, Microsoft soon entered. Perhaps most importantly, Microsoft proved
the power of owning the operating system. After years of working with IBM as the provider of the
software for their hardware, Microsoft surpassed IBM (Evans 2004).
Come up with a title for each section.
Are the following statements true, false, or not given according to the information in the text?
1. Software is seen everywhere these days, and when it appeared in the past, it came very quickly
out of nowhere.
2. When the New Deal legislation was passed, Watson, Flint, and Patterson’s new company was
ready to become a national leader in the nascent computing industry.
3. Punch cards were the most popular method of accounting in the US from the 1930s to the
1950s.
4. Compilers and assemblers were analogue versions of the digital programmes to come.
5. Software was necessary for the airline industry to help people cancel flights.
6. Fairchild was the first person to use silicon in his semiconductors.
7. Evans believes that computers are less advanced today because Kildall was not able to
persuade people to agree with him.
8. Having worked for IBM for years, Microsoft finally took over the company.
Match the underlined words with their definitions
1. slow working
2. when something makes progress slower
3. an enormous creature or entity
4. inspiring people to follow you
5. very small
6. very big
7. motivated
8. seen everywhere
9. usable in lots of different situations
10. begin learning how to do something
11. survive despite something
12. something that came before something else
13. fragile, delicately balanced, or just about to fall
14. appearing from nowhere
Find 2 types of relative clauses in the text. How are they different?
Find one without a pronoun. When can we leave out the pronoun?