The technology underlying the Internet was created by Vinton Cerf in early 1973 as part of a project headed by Robert Kahn and conducted by the Advanced Research Projects Agency, part of the United States Department of Defense. Thereafter, Cerf led many efforts to build, scale, and standardize the Internet. In 1984 the technology and the network were turned over to the private sector and to government scientific agencies for further development, and growth has continued exponentially ever since. Service-provider companies that make “gateways” to the Internet available to home and business users entered the market in ever-increasing numbers. By 1999, 205 countries and territories had at least one connection to the Internet, and by early 2000 the network encompassed around 100 million users.

The Internet and its technology continue to have a profound effect in promoting the sharing of information, making rapid transactions among businesses possible, and supporting global collaboration among individuals and organizations. The development of the World Wide Web is fuelling the rapid introduction of new business tools and activities that may soon lead to annual business transactions on the Internet worth hundreds of billions of pounds. The potential of web-based commerce is immense: techniques that allow safe transactions over the Net (for payment and funds transfers), the construction of faster, more secure networks, and the development of efficient search techniques make the Internet an ideal trading medium.
Future concerns focus on a number of areas, including the efficiency of search engines—even the most efficient of them cover less than a sixth of all publicly available pages—as well as privacy, security, and Internet piracy. By its very nature, the Internet does not cope well with traffic that requires a dedicated link between two points (such as voice), as end-to-end delay cannot readily be controlled. Several protocols that allow greater predictability are being developed to guarantee an assured quality of service. The ability to integrate applications is of increasing importance: common data formats allow e-business applications to cooperate, and services such as Internet phones that are easy to install are being refined and deployed.
In addition to these extra features, the core of the Internet—the network hardware that connects everyone together—is undergoing an overhaul that will enable it to cope with ever-increasing traffic loads. The “Internet2” project has been under way for several years and is building faster links and bigger switches that will power the Internet for years to come.
METHODS OF CONNECTING
There are four ways to connect to the public Internet.
Host access is usually carried out via dial-up telephone lines and modems, combined with internet software on a personal computer, and allows the connecting computer to function fully as an internet host.
Network access is similar to host access, but is done via a leased line or an “always-on” link such as Digital Subscriber Line (DSL) or EtherLoop. In this case, all the attached computers are made into internet hosts.
Terminal access is usually carried out via dial-up telephone lines and modems combined with terminal emulation software on a personal computer; it allows interaction with another computer that is an internet host.
Gateway access is similar to terminal access, but is provided via proprietary online services that give the user the ability to exchange e-mail with the Internet.
SERVICES OF THE INTERNET
Internets support thousands of different kinds of operational and experimental services. A few of the most popular include the following:
E-mail (electronic mail) allows a message to be sent from one person to another, or to many others, via computer. The Internet has its own e-mail standards, which have also become the means of interconnecting most of the world's e-mail systems. Internet e-mail addresses usually have a form such as “editor@encarta.microsoft.com”, where “editor” is the e-mail account name and “encarta.microsoft.com” is the domain identity of the computer hosting the account. E-mail can also be used to create collaborative groups through the use of special e-mail accounts called “reflectors” or “exploders” that automatically redistribute mail sent to the address.
The World Wide Web allows the seamless creation and use of elegant point-and-click hypermedia presentations, linked across the Internet in a way that creates a vast open knowledge repository, through which users can easily browse.
Gopher is a system that allows the creation and use of directories of files held on computers on the Internet, and builds links across the Internet in a manner that allows users to browse through the files.
FTP (File Transfer Protocol) is a set of conventions allowing easy transfer of files between host computers. It has long been one of the biggest uses of the Internet, especially for software distribution, and many public distribution sites exist (see the sketch after this list).
Usenet allows automatic global distribution of news messages among thousands of user groups, called newsgroups.
Telnet is the system that allows a user to “log in” to a remote computer, and make use of it.
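To make the FTP entry above concrete, here is a minimal sketch using Python's standard ftplib module. The server name “ftp.example.com” and the directory and file names are hypothetical placeholders, not sites named in this article; any public server that accepts anonymous logins would behave the same way.

```python
from ftplib import FTP

# Connect to a public FTP server and log in anonymously.
# "ftp.example.com" and the paths below are hypothetical placeholders.
with FTP("ftp.example.com") as ftp:
    ftp.login()               # login() with no arguments uses anonymous access
    ftp.cwd("/pub/software")  # change to a public distribution directory
    print(ftp.nlst())         # list the files on offer

    # Retrieve a file in binary mode and save it locally.
    with open("README.txt", "wb") as out:
        ftp.retrbinary("RETR README.txt", out.write)
```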
Internet
INTRODUCTION
Internet, a collection of computer networks that operate to common standards and enable the computers and the programs they run to communicate directly. There are many small-scale, controlled-access “enterprise internets”, but the term is usually applied to the global, publicly accessible network, called simply the Internet or Net. By the end of 2002, more than 100,000 networks and around 120 million users were connected via the Internet.
Internet connection is usually accomplished using international standards collectively called TCP/IP (Transmission Control Protocol/Internet Protocol), which are issued by an organization called the Internet Engineering Task Force, combined with a network registration process, and with the aid of public providers of Internet access services, known as Internet Service Providers or ISPs.
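As a rough illustration of what connecting over these standards involves, the sketch below uses Python's standard socket module to open a TCP connection (the TCP half of TCP/IP) to a web server and exchange a minimal request over it. The host “example.com” is an illustrative assumption, not taken from the article; any reachable server listening on port 80 would serve.

```python
import socket

# Open a TCP connection to port 80 of a web server.
# "example.com" is an illustrative host; the timeout keeps the sketch from hanging.
with socket.create_connection(("example.com", 80), timeout=10) as conn:
    # Send a minimal HTTP request over the raw TCP stream.
    conn.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")

    # Read whatever the server sends back until it closes the connection.
    response = b""
    while chunk := conn.recv(4096):
        response += chunk

print(response.decode("latin-1"))
```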
Each connected computer—called an Internet host—is provided with a unique Internet Protocol (IP) address—198.105.232.1, for example. For obvious reasons, the IP address has become known as the “dot address” of a computer. Although very simple and effective for network operation, dot addresses are not very user-friendly. Hence the introduction of the Domain Name System (DNS), which allows meaningful or memorable names to be assigned to the numbers. DNS allows Internet hosts to be organized around domain names: for example, “microsoft.com” is a domain assigned to the Microsoft Corporation, with the suffix “com” signifying a commercial organization. “ftp.microsoft.com” is an Internet host within that domain. Each host within a domain still has an IP or dot address, which is used by the network elements to deliver information. From the user's point of view, though, the familiar domain name is translated (or “resolved”) by DNS into the IP address that the network actually uses.
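A minimal sketch of that resolution step, again using Python's standard socket module: gethostbyname asks DNS to translate a name into its dot address, and gethostbyaddr attempts the reverse. The host name is borrowed from the article's own example; the addresses returned will vary, and the lookups assume a working network connection.

```python
import socket

# Forward lookup: resolve a domain name to its dot address,
# exactly the translation DNS performs on the user's behalf.
address = socket.gethostbyname("ftp.microsoft.com")
print(f"ftp.microsoft.com resolves to {address}")

# Reverse lookup: ask DNS which name is registered for an address.
# Not every address has a reverse (PTR) record, so this can fail.
try:
    name, aliases, addresses = socket.gethostbyaddr(address)
    print(f"{address} maps back to {name}")
except socket.herror:
    print(f"No reverse DNS entry for {address}")
```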
The suffix .com is called a generic top-level domain name, and before 2001 there were just three of these (.com, .net, and .org), with .edu and .gov restricted to educational institutions and government agencies respectively. As a result of the rapid growth in Internet use, seven new top-level domain names have been prepared for use, some by specific sectors (.aero, .coop, and .museum) and some for general use (.biz, .info, .pro, and .name).
Internets are constructed using virtually any kind of electronic transmission medium, such as optical-fibre or copper-wire telephone lines, or radio or microwave channels. They can also connect almost any kind of computer or operating system; and they are operated in such a way as to be “self-aware” of their capabilities.
The great scale and universality of the public Internet results in its use to connect many other kinds of computer networks and services—including online information and shopping services—via systems called gateways. As a result of all these features, internets are an ideal means of building a very robust universal information infrastructure throughout the world. The rapid growth of online shops, information services, and electronic business applications is testament to the inherent flexibility of the Net.
THE GLOBAL NATURE OF THE COMPUTER INDUSTRY
Although its heart is in California’s Silicon Valley, the computer industry is a global enterprise. Intel’s Pentium processor, for example, was designed in the United States, but a particular chip might be manufactured in Ireland from a Japanese semiconductor wafer, packaged in its protective housing in Malaysia, inserted into a printed circuit board in Taiwan, and assembled into a product that is sold in England by a German manufacturer. Many of the parts used in personal computers are now manufactured at factories across Asia, with production particularly concentrated in Japan, South Korea, Hong Kong S.A.R., and Taiwan. Many industry suppliers expect mainland China and India to become large markets for computers in the future, and to develop large computer-manufacturing and software industries.
The software industry is also global. However, because programs have not yet been “componentized” (split into reusable modules that can be created separately), it is not as diverse as the computer hardware industry. All the best-selling operating systems and most of the leading applications programs have been written in the United States, then converted for use elsewhere. There are exceptions, such as accounting packages that meet local needs, but these tend not to be exportable. Nonetheless, the large pool of computer science graduates and the relatively low wages in countries such as India and China have started to create custom-programming industries outside the United States and Europe. Further globalization can be expected, thanks to the Internet’s tendency to make national boundaries invisible, and its ability to deliver software at little or no cost, without the need for packaging or printed manuals.
In the 1950s and 1960s large companies used relatively small numbers of computers to automate internal processes such as payroll and stock control, and computers are still performing these mundane tasks. However, the industry’s emphasis has shifted towards personal use, for “productivity applications” (word processing, desktop publishing), for communications (e-mail, instant messaging), and for entertainment (games, music, digital photography, video). Hundreds of millions of people now use personal computers in their daily lives, both in their workplaces and in their homes, and—thanks to the growing popularity of notebook PCs and electronic organizers—often in between the two. Few industries have changed so much in such a short time, and the pace of change shows no signs of slowing.
OPEN SYSTEMS
Intel’s microprocessors and Microsoft’s MS-DOS and Windows programs have become de facto standards because their sales dominate the personal computer market. However, they are still owned by Intel and Microsoft, which gives these companies an advantage. There is an alternative, the open-systems approach, in which standards are specified independently of particular manufacturers by industry committees. Suppliers compete by trying to produce the best implementation of an agreed standard. In theory, this should lead to better products at lower prices. It is certainly less risky for manufacturers to support an agreed standard than for each company to develop a different proprietary system: probably only one or two such systems would be successful, and the rest would fail. For this reason, computer industry suppliers invest a great deal of time and energy in efforts to agree standards. Companies will even “give away” their technologies to recognized industry bodies to encourage their adoption as standards.
In the mid-1980s several European companies, encouraged by the European Commission (see European Union: European Commission), began to agree open-systems standards for computers, and they were soon joined by the leading American and Japanese suppliers. These standards were based on a portable (not machine-specific) operating specification called POSIX, which was ultimately derived from AT&T’s UNIX operating system, and on Open Systems Interconnection (OSI) networking, as ratified by the International Organization for Standardization (ISO). Following a Commission directive, many governments mandated public bodies to buy open-systems computers whenever possible, and the United States government and some others took similar actions. Leading manufacturers such as IBM, Digital Equipment, and the largest British computer company at the time, ICL, changed their proprietary operating systems and networks to comply with the required standards.
However, the open-systems approach was not successful in capturing the mass market, and the effort exposed several problems with it. Because the specifications had to satisfy many participants in the standards-making process, they often proved to be complex and expensive to implement. Some standards provided too many options, or allowed different interpretations, so that even when different firms implemented the same standards, their systems remained incompatible. The standards-setting process often proved to be too slow for the fast-moving computer market.
As a result, computer industry suppliers now participate in numerous shifting alliances formed around various technologies. These alliances constantly generate publicity in an attempt to sway the market, producing an often spurious impression that the computer industry is continually at war with itself.
Since 1994 the Internet has had an increasing influence on the development of the computer industry. Most suppliers have now adopted Internet standards that have been developed in universities and research institutes over the past 25 years. In particular, the Internet’s method of connecting computers, which is called TCP/IP (Transmission Control Protocol/Internet Protocol), has largely displaced commercial alternatives produced by companies such as IBM and Novell, Inc., as well as the open-systems standard, OSI.
In the beginning, the computer industry was dominated by large, proprietary, centralized systems. Later, mass market economies of scale made cheap personal computer technologies dominant. Today, the Internet is connecting both large and small systems together in a more balanced network that encourages approaches such as distributed computing, peer-to-peer file sharing, and the use of open standards to provide Web-based services.
HISTORY OF THE COMPUTER INDUSTRY
The computer industry started with J. Presper Eckert and John W. Mauchly, who designed two of the earliest electronic computers, ENIAC and EDVAC, at the University of Pennsylvania during World War II. In 1946 they left to start the Electronic Control Company, the first computer manufacturer, and their Univac (Universal Automatic Computer) was the first commercially successful computer. Other pioneering commercial ventures included two British machines: the Ferranti Mark I, based on Manchester University’s Mark I prototype; and LEO, the Lyons Electronic Office, developed by the Lyons tea shop company from Cambridge University’s EDSAC computer. Indeed, the first commercially built Ferranti Mark I was installed at Manchester University in February 1951, a month before the first Univac was delivered to the United States Census Bureau. However, it was the Univac that proved there was a market for computers, and that encouraged other companies to begin manufacturing them.
The computer represented a new way of doing things, but most of the things that needed doing were already being done using electromechanical devices. At IBM, the computer was mainly seen as a faster way of tabulating punched cards, which had been the basis of data processing since 1890. IBM was thus able to convert its domination of the data processing business into a corresponding domination of the computer industry. In his autobiography, Tom Watson Jr., IBM’s chief executive from 1956 to 1971, pointed out that only IBM had the “large field force of salesmen, repairmen, and servicemen” who understood how to install automated bookkeeping systems. “The invention [of the computer] was important,” he wrote, “but the knowledge of how to put a great big system online, how to make it run, and solve problems was probably four times as important.”
The industry started to change dramatically when silicon chips became available in quantity. The microprocessor, or “computer on a chip”, developed by Intel in 1971, made computer power a mass-market commodity. Computers had been huge, complicated machines that only large companies, governments, and a few universities could afford, and they were often kept behind glass walls where they could be seen but not touched. (Many firms had visitors’ galleries for people who had never seen a computer.) Microprocessors made computers available in ordinary homes and offices. When Eckert and Mauchly started, they struggled to win orders for their first six Univacs. By comparison, sales of personal computers passed 130 million a year in 2001.
Small, cheap, programmable microprocessors also made it relatively simple for small companies to build computers. Between 1975 and 1985, hundreds of firms entered the business. Some started in garages (such as Apple Computer, Inc.), university computer departments (such as Sun Microsystems, Inc.), and college dormitories (such as Dell). Only a handful became successful global corporations: most died. While it was comparatively easy to design a personal computer, other aspects of the business—manufacturing, advertising, telephone support, maintenance, and so on—were beyond most of the hobbyists and enthusiasts involved.
New computer manufacturers also discovered that software was another major problem. Users who bought a cheap computer required cheap software as well, and—unlike large companies using minicomputers and mainframes—were not willing or equipped to write it themselves. Customers therefore tended to buy the computers for which most software was available, while software houses preferred to write programs for the best-selling computers. This created a “virtuous circle” for a few manufacturers who came to dominate the market, but a vicious circle for the rest.
The market was particularly unkind to small European manufacturers: they were rarely able to compete with American rivals, whose larger home market provided greater economies of scale. Dozens of small firms entered the British microcomputer market in the late 1970s and early 1980s, including Acorn, Amstrad, Apricot, Camputers, Dragon Data, Enterprise, Grundy, Jupiter Cantab, Memotech, Oric, Positron, Sinclair Research (the creation of Sir Clive Sinclair), and Torch. Most struggled to attract software, and few survived.
The market needed a standard, and IBM, the industry’s dominant supplier, was best placed to create it. The company did that when it launched its first personal computer, the IBM PC, in 1981. Since then, “PC-compatibles”, or “clones” of the PC, have gradually taken over more and more of the market, displacing proprietary designs such as the Atari ST, Commodore Amiga, and Apple Macintosh.
However, the personal computer market has become different from the older minicomputer and mainframe markets, because IBM did not take its usual approach of creating the PC’s hardware and software itself. Instead, it went to outside suppliers for parts. Most importantly, it went to Intel for the 8088 microprocessor and to Microsoft for the MS-DOS disk operating system and BASIC programming language. Intel and Microsoft retained the ability to supply these parts (and their successors) to IBM’s rivals, creating an intensely competitive and relatively open market, while making immense profits.