The Great Computer Graveyard in the Sky

These days a computer has a typical usable life of just one or two years before it is either cascaded onto someone who does not quite require the most modern machine, or, as is more typical in modern industry, retired to the great computer graveyard in the sky, which is well looked after by some of the great, but departed, innovators of the computer, such as:

Gary Kildall. Who, with CP/M, was the innovator and technical genius behind one of the first operating systems for microprocessor systems (and who developed much of the initial standard for the CD-ROM interface, and produced the first successful open-system architecture). If not for a blunder in the arrangements for a meeting with IBM, CP/M might have become the standard operating system for the PC, rather than MS-DOS. Novell eventually bought his company, Digital Research, in 1991, and his products eventually disappeared under the weight of Microsoft Windows. He died in 1994, at the age of 52, after falling while drunk and hitting his head. Gary, unlike many others in the computer business, was always more interested in technical specifications than in financial statements and balance sheets.

John Eckert. During World War II, John, at the University of Pennsylvania, built the world's first large electronic computer. It contained over 19,000 valves and was called ENIAC (Electronic Numerical Integrator and Computer). It was so successful that it ran for over 11 years before it was switched off. He would be totally amazed by modern computers, especially by the way that it is now possible to integrate millions of digital devices onto a single piece of silicon smaller than a thumbprint. In his day, you could actually hold a single digital device in your hand, and if it was working it would burn your hand.

John von Neumann. Who, with EDVAC, presented the architecture of the future: the stored-program concept, where program instructions and data occupy the same memory. To von Neumann, invisible communications over an infrared link would seem more like magic than technology.

Herman Hollerith. Who, at the end of the 19th century, devised a machine that accepted punch cards with information on them. These cards allowed an electrical current to pass where a hole was present (a 'true'), and did not conduct a current where a hole was not present (a 'false'). This was one of the first uses of binary information, which represents data as a collection of one of two states (such as true or false, or 0 or 1). The company that Herman initiated went on to become IBM. He would be amazed by current transfer rates for data storage. To him, a few hundred bytes a second would seem fast, so transfer rates of many hundreds of millions of bytes every second, all from invisible magnetic fields stored on a metal disk, would be astonishing. Also imagine the number of punch cards that would be required to load many of our modern programs.

William Shockley. Who, along with others at Bell Labs, invented the electronic transistor, which allowed computers to migrate from machines that sat on reinforced concrete floors, occupied whole floors of a building and needed special electrical generators to power them, to ones which could be fitted onto a pin-head.

Grace Hopper. Grace overcame one of the major problems in software development: how to write programs which could be easily written by humans, and easily converted into a form which a computer could understand.
In the early fifties, work had begun on assemblers, which used simple text representations of the binary operations that the computer understood (such as ADD A, B to add two numbers). The assembler would convert these into a binary form (a toy sketch of this translation is given at the end of this section). This aided programmers, as they no longer had to continually look up the binary equivalent of the command they required, and it also made programs easier to read. The great advance occurred around 1956, when Grace Hopper (1906-1992) started to develop compilers for the UNIVAC computer. These graceful programs converted a language which was readable by humans into a form that a computer could understand. This work would lead to the development of the COBOL programming language (which has survived to the present day, although it is still blamed for many of the Year 2000 problems).

-- William J. Buchanan, 1 July 2001

"We build systems like the Wright brothers built airplanes - build the whole thing, push it off the cliff, let it crash, and start over again"

Software researcher on software development, 1968

The first large-scale computer system contained over 19,000 valves and was called ENIAC (Electronic Numerical Integrator and Computer). It was so successful that it ran for over 11 years before it was switched off (not many modern-day computers will run for more than a few years before they are considered unusable). By today's standards, though, it was a lumbering dinosaur, and by the time it was dismantled it weighed over 30 tons and spread itself over 1,500 square feet. Amazingly, it also consumed over 25 kW of electrical power (equivalent to the power of over 400 60 W light bulbs), yet could perform over 100,000 calculations per second (which, even by today's standards, is reasonable). Unfortunately, it was unreliable, and would work only for a few hours, on average, before an electronic valve needed to be replaced. Faultfinding, though, was much easier in those days, as a valve that was not working would not glow, and would be cold to the touch.
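As an aside, the kind of translation an early assembler performed can be illustrated with a small, modern sketch. The following Python fragment is purely illustrative: the two-register instruction set, the opcodes and the 8-bit word layout are invented for the example and bear no relation to any real machine of the period.

    # Toy assembler: translate mnemonics such as "ADD A, B" into binary words.
    # The opcodes, registers and word layout are invented for illustration only.
    OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}
    REGISTERS = {"A": 0b00, "B": 0b01}

    def assemble(line):
        """Convert one line, e.g. 'ADD A, B', into an 8-bit binary word."""
        mnemonic, operands = line.split(maxsplit=1)
        dst, src = (REGISTERS[r.strip()] for r in operands.split(","))
        # Pack the word: 4-bit opcode, 2-bit destination, 2-bit source.
        return (OPCODES[mnemonic] << 4) | (dst << 2) | src

    for line in ["LOAD A, B", "ADD A, B", "STORE A, B"]:
        print(f"{line:<12} -> {assemble(line):08b}")

Running this prints each mnemonic alongside the binary word that the programmer would otherwise have had to look up by hand, which is exactly the drudgery that assemblers, and later Grace Hopper's compilers, removed.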