Thursday, June 3, 2010

HISTORY OF THE COMPUTER INDUSTRY

The computer industry started with John Presper Eckert and John W. Mauchly, who designed two of the earliest electronic computers, ENIAC and EDVAC, at the University of Pennsylvania during World War II. In 1946 they left to start the Electronic Control Company, the first computer manufacturer, and their Univac (Universal Automatic Computer) was the first commercially successful computer. Other pioneering commercial ventures included two British machines: the Ferranti Mark I, based on Manchester University’s Mark I prototype; and LEO, the Lyons Electronic Office, developed by the Lyons tea shop company from Cambridge University’s EDSAC computer. Indeed, the first commercially built Ferranti Mark I was installed at Manchester University in February 1951, a month before the first Univac was delivered to the United States Census Bureau. However, it was the Univac that proved there was a market for computers, and that encouraged other companies to begin manufacturing them.
The computer represented a new way of doing things, but most of the things that needed doing were already being done using electromechanical devices. At IBM, the computer was mainly seen as a faster way of tabulating punched cards, which had been the basis of data processing since 1890. IBM was thus able to convert its domination of the data processing business into a corresponding domination of the computer industry. In his autobiography, Tom Watson Jr., IBM’s chief executive from 1956 to 1971, pointed out that only IBM had the “large field force of salesmen, repairmen, and servicemen” who understood how to install automated bookkeeping systems. “The invention [of the computer] was important,” he wrote, “but the knowledge of how to put a great big system online, how to make it run, and solve problems was probably four times as important.”
The industry started to change dramatically when silicon chips became available in quantity. The microprocessor, or “computer on a chip”, developed by Intel in 1971, made computer power a mass-market commodity. Computers had been huge, complicated machines that only large companies, governments, and a few universities could afford, and they were often kept behind glass walls where they could be seen but not touched. (Many firms had visitors’ galleries for people who had never seen a computer.) Microprocessors made computers available in ordinary homes and offices. When Eckert and Mauchly started, they struggled to win orders for their first six Univacs. By comparison, sales of personal computers passed 130 million a year in 2001.
Small, cheap, programmable microprocessors also made it relatively simple for small companies to build computers. Between 1975 and 1985, hundreds of firms entered the business. Some started in garages (such as Apple Computer, Inc.), university computer departments (such as Sun Microsystems, Inc.), and college dormitories (such as Dell). Only a handful became successful global corporations: most died. While it was comparatively easy to design a personal computer, other aspects of the business—manufacturing, advertising, telephone support, maintenance, and so on—were beyond most of the hobbyists and enthusiasts involved.
New computer manufacturers discovered that software was another major problem. Users who bought a cheap computer required cheap software as well, and—unlike large companies using minicomputers and mainframes—were not willing or equipped to write it themselves. Customers therefore tended to buy the computers for which the most software was available, while software houses preferred to write programs for the best-selling computers. This created a “virtuous circle” for the few manufacturers who came to dominate the market, and a vicious circle for the rest.
The market was particularly unkind to small European manufacturers: they were rarely able to compete with American rivals, whose larger home market provided greater economies of scale. Dozens of small firms entered the British microcomputer market in the late 1970s and early 1980s, including Acorn, Amstrad, Apricot, Camputers, Dragon Data, Enterprise, Grundy, Jupiter Cantab, Memotech, Oric, Positron, Sinclair Research (the creation of Sir Clive Sinclair), and Torch. Most struggled to attract software, and few survived.
The market needed a standard, and IBM, the industry’s dominant supplier, was best placed to create it. The company did that when it launched its first personal computer, the IBM PC, in 1981. Since then, “PC-compatibles”, or “clones” of the PC, have gradually taken over more and more of the market, displacing proprietary designs such as the Atari ST, Commodore Amiga, and Apple Macintosh.
However, the personal computer market developed differently from the older minicomputer and mainframe markets, because IBM departed from its usual approach of creating the hardware and software itself. Instead, it went to outside suppliers for parts. Most importantly, it went to Intel for the 8088 microprocessor and to Microsoft for the MS-DOS disk operating system and the BASIC programming language. Intel and Microsoft retained the ability to supply these parts (and their successors) to IBM’s rivals, creating an intensely competitive and relatively open market, while making immense profits.
