Thursday, June 3, 2010

HISTORY AND FUTURE

The Internet technology was created by Vinton Cerf in early 1973 as part of a project headed by Robert Kahn and conducted by the Advanced Research Projects Agency, part of the United States Department of Defense. Thereafter, Cerf led many efforts to build, scale, and standardize the Internet. In 1984 the technology and the network were turned over to the private sector and to government scientific agencies for further development. The growth has continued exponentially, and service-provider companies that make “gateways” to the Internet available to home and business users have entered the market in ever-increasing numbers. By early 2000, access was available in over 200 countries and encompassed around 100 million users. The Internet and its technology continue to have a profound effect in promoting the sharing of information, making possible rapid transactions among businesses, and supporting global collaboration among individuals and organizations. In 1999, 205 countries and territories in the world had at least one connection to the Internet. The development of the World Wide Web is fuelling the rapid introduction of new business tools and activities, and is expected to lead to annual business transactions on the Internet worth hundreds of billions of pounds. The potential of web-based commerce is immense. Techniques that allow safe transactions over the Net (for payment and funds transfers), the construction of faster, more secure networks, and the development of efficient search techniques make the Internet an ideal trading medium.
Future concerns are focused on a number of areas, including the efficiency of search engines—even the most efficient of them cover less than a sixth of all publicly available pages—as well as privacy, security, and Internet piracy. By its very nature, the Internet does not cope well with traffic that requires a dedicated link between two points (such as voice), because end-to-end delay cannot readily be controlled. Several protocols that allow greater predictability are being developed to guarantee an assured quality of service. The ability to integrate applications is also of increasing importance: common data formats allow e-business applications to cooperate, and services such as easy-to-install Internet phones are being refined and deployed.
In addition to these extra features, the core of the Internet—the network hardware that connects everyone together—is undergoing an overhaul that will enable it to cope with ever-increasing traffic loads. The “Internet 2” project has been under way for several years now and is building faster links and bigger switches that will power the Internet for years to come.

METHODS OF CONNECTING

There are four ways to connect to the public Internet.
Host access is usually carried out via dial-up telephone lines and modems, combined with internet software on a personal computer, and allows the user's computer to function fully as an internet host.
Network access is similar to host access, but is done via a leased line or an “always-on” link such as Digital Subscriber Line (DSL) or Etherloop. In this case, all the attached computers are made into internet hosts.
Terminal access is usually carried out via dial-up telephone lines and modems combined with terminal emulation software on a personal computer; it allows interaction with another computer that is an internet host.
Gateway access is similar to terminal access, but is provided via online or similar proprietary services that give the user the ability to exchange e-mail with the Internet.

SERVICES OF THE INTERNET

Internets support thousands of different kinds of operational and experimental services. A few of the most popular include the following:
E-mail (electronic mail) allows a message to be sent from one person to another, or to many others, via computer. The Internet has its own e-mail standards, which have also become the means of interconnecting most of the world's e-mail systems. Internet e-mail addresses usually have a form such as “editor@encarta.microsoft.com”, where “editor” is the e-mail account name, and “encarta.microsoft.com” is the domain identity of the computer hosting the account. E-mail can also be used to create collaborative groups through the use of special e-mail accounts called “reflectors” or “exploders” that automatically redistribute mail sent to the address.
The World Wide Web allows the seamless creation and use of elegant point-and-click hypermedia presentations, linked across the Internet in a way that creates a vast open knowledge repository, through which users can easily browse.
Gopher is a system that allows the creation and use of directories of files held on computers on the Internet, and builds links across the Internet in a manner that allows users to browse through the files.
FTP (File Transfer Protocol) is a set of conventions allowing easy transfer of files between host computers. This remains the biggest use of the Internet, especially for software distribution, and many public distribution sites now exist.
Usenet allows automatic global distribution of news messages among thousands of user groups, called newsgroups.
Telnet is the system that allows a user to “log in” to a remote computer, and make use of it.

Internet


INTRODUCTION
Internet, a collection of computer networks that operate to common standards and enable the computers and the programs they run to communicate directly. There are many small-scale, controlled-access “enterprise internets”, but the term is usually applied to the global, publicly accessible network, called simply the Internet or Net. By the end of 2002, more than 100,000 networks and around 120 million users were connected via the Internet.
Internet connection is usually accomplished using international standards collectively called TCP/IP (Transmission Control Protocol/Internet Protocol), which are issued by an organization called the Internet Engineering Task Force, combined with a network registration process, and with the aid of public providers of Internet access services, known as Internet Service Providers or ISPs.
Each connected computer—called an Internet host—is provided with a unique Internet Protocol (IP) address—198.105.232.1, for example. For obvious reasons, the IP address has become known as the “dot address” of a computer. Although very simple and effective for network operation, dot addresses are not very user-friendly. Hence the introduction of the Domain Name System (DNS), which allows meaningful or memorable names to be assigned to the numbers. DNS allows Internet hosts to be organized around domain names: for example, “microsoft.com” is a domain assigned to the Microsoft Corporation, with the suffix “com” signifying a commercial organization. “ftp.microsoft.com” is an Internet host within that domain. Each host within the domain still has an IP or dot address, which is used by the network elements to deliver information. From a user's point of view, though, the familiar domain name is translated (or “resolved”) by DNS into the underlying IP address.
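As a rough illustration of that resolution step, the short Python sketch below asks the operating system's DNS resolver to translate a domain name into its dot address, using the standard socket module. The host name is simply the example from the paragraph above; it may not resolve on every network, and any reachable domain name will do.

import socket

def resolve(hostname):
    # Ask DNS to translate a domain name into its dot (IP) address.
    return socket.gethostbyname(hostname)

if __name__ == "__main__":
    name = "ftp.microsoft.com"  # illustrative host name from the example above
    print(name, "resolves to", resolve(name))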
The suffix .com is called a generic top-level domain name, and before 2001 there were just three of these (.com, .net, and .org), with .edu and .gov restricted to educational institutions and government agencies respectively. As a result of the rapid growth in Internet use, seven new top-level domain names have been prepared for use, some by specific sectors (.aero, .coop, and .museum) and some for general use (.biz, .info, .pro, and .name).
Internets are constructed using virtually any kind of electronic transmission medium, such as optical-fibre or copper-wire telephone lines, or radio or microwave channels. They can also connect almost any kind of computer or operating system; and they are operated in such a way as to be “self-aware” of their capabilities.
The great scale and universality of the public Internet results in its use to connect many other kinds of computer networks and services—including online information and shopping services—via systems called gateways. As a result of all these features, internets are an ideal means of building a very robust universal information infrastructure throughout the world. The rapid growth of online shops, information services, and electronic business applications is testament to the inherent flexibility of the Net.

THE GLOBAL NATURE OF THE COMPUTER INDUSTRY

Although its heart is in California’s Silicon Valley, the computer industry is a global enterprise. Intel’s Pentium processor, for example, was designed in the United States, but a particular example might be manufactured in Ireland from a Japanese semiconductor wafer, packaged in its protective housing in Malaysia, inserted into a printed circuit board in Taiwan, and assembled into a product that is sold in England by a German manufacturer. Many of the parts used in personal computers are now manufactured at factories across Asia, with production particularly concentrated in Japan, South Korea, Hong Kong S.A.R., and Taiwan. Many industry suppliers expect mainland China and India to become large markets for computers in the future, and to develop large computer-manufacturing and software industries.
The software industry is also global. However, because programs have not yet been “componentized” (split into reusable modules that can be created separately), it is not as diverse as the computer hardware industry. All the best-selling operating systems and most of the leading applications programs have been written in the United States, then converted for use elsewhere. There are exceptions, such as accounting packages that meet local needs, but these tend not to be exportable. Nonetheless, the large pool of computer science graduates and the relatively low wages in countries such as India and China have started to create custom-programming industries outside the United States and Europe. Further globalization can be expected, thanks to the Internet’s tendency to make national boundaries invisible, and its ability to deliver software at little or no cost, without the need for packaging or printed manuals.
In the 1950s and 1960s large companies used relatively small numbers of computers to automate internal processes such as payroll and stock control, and computers are still performing these mundane tasks. However, the industry’s emphasis has shifted towards personal use, for “productivity applications” (word processing, desktop publishing), for communications (e-mail, instant messaging), and for entertainment (games, music, digital photography, video). Hundreds of millions of people now use personal computers in their daily lives, both in their workplaces and in their homes, and—thanks to the growing popularity of notebook PCs and electronic organizers—often in between the two. Few industries have changed so much in such a short time, and the pace of change shows no signs of slowing.

OPEN SYSTEMS

Intel’s microprocessors and Microsoft’s MS-DOS and Windows programs have become de facto standards because their sales dominate the personal computer market. However, they are still owned by Intel and Microsoft, which gives these companies an advantage. There is an alternative, the open-systems approach, in which standards are specified independently of particular manufacturers by industry committees. Suppliers compete by trying to produce the best implementation of an agreed standard. In theory, this should lead to better products at lower prices. It is certainly less risky for manufacturers to support an agreed standard than for each company to develop a different proprietary system: probably only one or two such systems would be successful, and the rest would fail. For this reason, computer industry suppliers invest a great deal of time and energy in efforts to agree standards. Companies will even “give away” their technologies to recognized industry bodies to encourage their adoption as standards.
In the mid-1980s several European companies, encouraged by the European Commission (see European Union: European Commission), began to agree open-systems standards for computers, and they were soon joined by the leading American and Japanese suppliers. These standards were based on a portable (not machine-specific) operating-system interface specification called POSIX, which was ultimately derived from AT&T’s UNIX operating system, and on Open Systems Interconnection (OSI) networking, as ratified by the International Organization for Standardization (ISO). Following a Commission directive, many governments mandated public bodies to buy open-systems computers whenever possible, and the United States government and some others took similar actions. Leading manufacturers such as IBM, Digital Equipment, and the largest British computer company at the time, ICL, changed their proprietary operating systems and networks to comply with the required standards.
However, the open-systems approach was not successful in capturing the mass market, and the effort exposed several problems with it. Because the specifications had to satisfy many participants in the standards-making process, they often proved to be complex and expensive to implement. Some standards provided too many options, or allowed different interpretations, so that even when different firms implemented the same standards, their systems remained incompatible. The standards-setting process often proved to be too slow for the fast-moving computer market.
As a result, computer industry suppliers now participate in numerous shifting alliances formed around various technologies. These alliances constantly generate publicity in an attempt to sway the market, producing an often spurious impression that the computer industry is continually at war with itself.
Since 1994 the Internet has had an increasing influence on the development of the computer industry. Most suppliers have now adopted Internet standards that have been developed in universities and research institutes over the past 25 years. In particular, the Internet’s method of connecting computers, which is called TCP/IP (Transmission Control Protocol/Internet Protocol), has largely displaced commercial alternatives produced by companies such as IBM and Novell, Inc., as well as the open-systems standard, OSI.
In the beginning, the computer industry was dominated by large, proprietary, centralized systems. Later, mass market economies of scale made cheap personal computer technologies dominant. Today, the Internet is connecting both large and small systems together in a more balanced network that encourages approaches such as distributed computing, peer-to-peer file sharing, and the use of open standards to provide Web-based services.

HISTORY OF THE COMPUTER INDUSTRY

The computer industry started with John Presper Eckert and John W. Mauchly, who designed two of the earliest electronic computers, ENIAC and EDVAC, at the University of Pennsylvania during World War II. In 1946 they left to start the Electronic Control Company, the first computer manufacturer, and their Univac (Universal Automatic Computer) was the first commercially successful computer. Other pioneering commercial ventures included two British machines: the Ferranti Mark I, based on Manchester University’s Mark I prototype; and LEO, the Lyons Electronic Office, developed by the Lyons tea shop company from Cambridge University’s EDSAC computer. Indeed, the first commercially built Mark I was installed at Manchester University in February 1951, a month before the first Univac was delivered to the United States Census Bureau. However, it was the Univac that proved there was a market for computers, and that encouraged other companies to begin manufacturing them.
The computer represented a new way of doing things, but most of the things that needed doing were already being done using electromechanical devices. At IBM, the computer was mainly seen as a faster way of tabulating punched cards, which had been the basis of data processing since 1890. IBM was thus able to convert its domination of the data processing business into a corresponding domination of the computer industry. In his autobiography, Tom Watson Jr., IBM’s chief executive from 1956 to 1971, pointed out that only IBM had the “large field force of salesmen, repairmen, and servicemen” who understood how to install automated bookkeeping systems. “The invention [of the computer] was important,” he wrote, “but the knowledge of how to put a great big system online, how to make it run, and solve problems was probably four times as important.”
The industry started to change dramatically when silicon chips became available in quantity. The microprocessor, or “computer on a chip”, developed by Intel in 1971, made computer power a mass-market commodity. Computers had been huge, complicated machines that only large companies, governments, and a few universities could afford, and they were often kept behind glass walls where they could be seen but not touched. (Many firms had visitors’ galleries for people who had never seen a computer.) Microprocessors made computers available in ordinary homes and offices. When Eckert and Mauchly started, they struggled to win orders for their first six Univacs. By comparison, sales of personal computers passed 130 million a year in 2001.
Small, cheap, programmable microprocessors also made it relatively simple for small companies to build computers. Between 1975 and 1985, hundreds of firms entered the business. Some started in garages (such as Apple Computer, Inc.), university computer departments (such as Sun Microsystems, Inc.), and college dormitories (such as Dell). Only a handful became successful global corporations: most died. While it was comparatively easy to design a personal computer, other aspects of the business—manufacturing, advertising, telephone support, maintenance, and so on—were beyond most of the hobbyists and enthusiasts involved.
New computer manufacturers also discovered that software was another major problem. Users who bought a cheap computer required cheap software as well, and—unlike large companies using minicomputers and mainframes—were not willing or equipped to write it themselves. Customers therefore tended to buy the computers for which most software was available, while software houses preferred to write programs for the best-selling computers. This created a “virtuous circle” for a few manufacturers who came to dominate the market, but a vicious circle for the rest.
The market was particularly unkind to small European manufacturers: they were rarely able to compete with American rivals, whose larger home market provided greater economies of scale. Dozens of small firms entered the British microcomputer market in the late 1970s and early 1980s, including Acorn, Amstrad, Apricot, Camputers, Dragon Data, Enterprise, Grundy, Jupiter Cantab, Memotech, Oric, Positron, Sinclair Research (the creation of Sir Clive Sinclair), and Torch. Most struggled to attract software, and few survived.
The market needed a standard, and IBM, the industry’s dominant supplier, was best placed to create it. The company did that when it launched its first personal computer, the IBM PC, in 1981. Since then, “PC-compatibles”, or “clones” of the PC, have gradually taken over more and more of the market, displacing proprietary designs such as the Atari ST, Commodore Amiga, and Apple Macintosh.
However, the personal computer market has become different from the older minicomputer and mainframe markets, because IBM did not take its usual approach of creating the PC’s hardware and software itself. Instead, it went to outside suppliers for parts. Most importantly, it went to Intel for the 8088 microprocessor and to Microsoft for the MS-DOS disk operating system and BASIC programming language. Intel and Microsoft retained the ability to supply these parts (and their successors) to IBM’s rivals, creating an intensely competitive and relatively open market, while making immense profits.

Computer Industry

INTRODUCTION

Computer Industry, wide range of activities based on the manufacture and use of computers to satisfy business or personal needs. This involves the design and construction of hardware (computers plus peripherals such as printers), the specification and coding of applications programs (software), and the running or execution of programs, often by trained operators. Computers now range from pocket-size electronic organizers to powerful business systems—for example, airline reservation systems—that fill large halls, and hundreds of thousands of programs are available to run on them.
The computer industry is based on the electronics industry, which manufactures integrated circuits (silicon chips) and other parts. It overlaps with other industries such as consumer electronics, office systems, and telecommunications, so it is difficult to put a value on it separately. An analysis by the Massachusetts-based International Data Corporation put overall worldwide spending on computer products and services at about US$700 billion in 1996. The industry has been growing at 12 to 16 per cent per year by value, but because computer prices are falling, the growth in unit sales—and the number of users—has sometimes been greater than this suggests.
The American company International Business Machines Corporation (IBM) has dominated the computer industry since the end of the 1950s, and although it lost market share in the 1990s, it is still by far the largest supplier. It is involved in every aspect of the business, from chip manufacture and software development to managing and providing financing for other companies’ computer facilities. In 2000 IBM’s annual turnover reached US$88.4 billion. However, smaller firms such as Intel and Microsoft Corporation have recently become more influential, because of their contributions to the fast-growing personal computer (PC) sector.

SHADING

One of the most demanding aspects of computer graphics is to produce realistic shading. The processing required to achieve this is complex and involves calculating how an image should look when it is viewed under a variety of lighting conditions. There are several ways of producing an image with shading, each of which trades off the complexity of the software against the realism of the image produced. The basic form of shading, known as flat shading, assigns a single colour to each of the polygons that make up an object and then adds effects to represent the position and intensity of a light source.
There are a number of more sophisticated forms of shading that can be used to produce more lifelike effects. One of the well-established techniques is Gouraud shading, invented by Henri Gouraud in 1971, which uses colour approximations to produce smoothly shaded images without requiring excessive time or processing power. A more sophisticated alternative is Phong shading, named after its creator Bui Tuong Phong, which takes a little more processing time and effort but which produces a more lifelike image. More sophisticated, and hence costly, shading techniques are available for the representation of bumpy or wrinkled surfaces, multiple reflections, and transparency.
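As a hedged illustration of the difference between these approaches, the Python sketch below computes a simple Lambertian (diffuse) lighting term for a single triangle: flat shading uses one normal, and therefore one intensity, for the whole polygon, while Gouraud shading computes an intensity at each vertex and interpolates between them. The light direction and normals are invented for the example; a real renderer would add specular highlights (as Phong shading does) and interpolate per pixel.

import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def lambert(normal, light_dir):
    # Diffuse intensity: the cosine of the angle between the surface normal
    # and the light direction, clamped so back-facing surfaces stay dark.
    return max(0.0, dot(normalize(normal), normalize(light_dir)))

light = (0.0, 0.0, 1.0)

# Flat shading: one normal, hence one intensity, for the whole polygon.
face_normal = (0.0, 0.3, 1.0)
flat_intensity = lambert(face_normal, light)

# Gouraud shading: an intensity at each vertex, later interpolated across the face.
vertex_normals = [(0.0, 0.0, 1.0), (0.5, 0.0, 1.0), (0.0, 0.5, 1.0)]
vertex_intensities = [lambert(n, light) for n in vertex_normals]

def interpolate(weights, values):
    # Blend the vertex intensities using barycentric weights inside the triangle.
    return sum(w * v for w, v in zip(weights, values))

centre_intensity = interpolate((1/3, 1/3, 1/3), vertex_intensities)
print(flat_intensity, vertex_intensities, centre_intensity)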
The processing power required to produce computer graphics continues to grow, and the technologies for displaying the images (such as plasma and liquid-crystal displays) are also evolving. It is little more than 40 years since the first computer graphics appeared, but they have developed at such a rate that lifelike images can now be routinely included in computer games, commercial software packages, and feature films.

THREE-DIMENSIONAL IMAGES

With the growing sophistication of computer equipment, the focus of computer graphics in the 1980s moved from the rendition of two-dimensional (2D) images to three-dimensional (3D) ones. The demands of a more complex image rendition meant that a lot more information now had to be stored on the computer. As well as rendering simple flat surfaces, three-dimensional objects require information on shading and other more subtle effects.
The basic concept that underpins all modern computer graphics is the three-dimensional polygon (a geometrical plane figure with three or more straight sides). Images are based on collections of polygons and this means that computer graphics software has to store the coordinates that define each of the polygons that comprise the 3D representation. The points, lines, and surfaces that define the polygons may be drawn or derived from measurements (for example, of a moving object or person).
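A rough sketch of such a representation is given below: a shared list of vertex coordinates plus, for each polygon, the indices of the vertices that define it. The cube data and the Mesh structure are invented purely for illustration of how graphics software might store this information.

from dataclasses import dataclass, field

@dataclass
class Mesh:
    vertices: list = field(default_factory=list)   # (x, y, z) coordinates
    polygons: list = field(default_factory=list)   # tuples of vertex indices

cube = Mesh(
    vertices=[
        (0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),   # back face corners
        (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1),   # front face corners
    ],
    polygons=[
        (0, 1, 2, 3), (4, 5, 6, 7),   # back and front faces
        (0, 1, 5, 4), (3, 2, 6, 7),   # bottom and top faces
        (1, 2, 6, 5), (0, 3, 7, 4),   # right and left faces
    ],
)

# Each polygon can be expanded back into its corner coordinates when needed.
for face in cube.polygons:
    corners = [cube.vertices[i] for i in face]
    print(face, corners)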
The software that has been developed to render modern computer graphics does much more than simply store geometric information. The realistic images that people now expect to see require a lot more than the assembly of a massive collection of polygons into recognizable shapes. They also require techniques for shading, texturing, and rasterization of the images. The last of these, rasterization, involves the conversion of a vector-defined image into a series of pixels that can be rendered as a 3D image on a video display. Texturing is concerned with how surfaces appear once they have been shaded, and depends both on the shading method used and on how the image is interpreted during shading.
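The sketch below is a much-simplified, illustrative example of rasterization: a vector-defined line segment is converted into the discrete pixel coordinates that approximate it on a raster grid. Production renderers rasterize whole polygons and use incremental algorithms such as Bresenham's; this naive version simply steps along the longer axis.

def rasterize_line(x0, y0, x1, y1):
    # Return the pixel (grid) coordinates that approximate the segment.
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    pixels = []
    for i in range(steps + 1):
        t = i / steps
        x = round(x0 + t * (x1 - x0))
        y = round(y0 + t * (y1 - y0))
        if (x, y) not in pixels:
            pixels.append((x, y))
    return pixels

print(rasterize_line(0, 0, 7, 3))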

VECTOR AND RASTER GRAPHICS

The physical device used with the Sketchpad program to produce graphic images was the cathode ray tube (CRT), an essential element of the television set at that time. In order to produce images on the CRT screen that served as the computer’s monitor, there had to be some way of generating two-dimensional graphics from the computer’s output. The two basic approaches to this are vector and raster graphics. With the former, precise geometric data, layout, and style are stored, including the coordinate positions of points, the connections between points (to form lines or paths), the colour, and the thickness of the shapes. Most vector graphic systems also have a library of standard shapes such as circles and rectangles.
With raster graphics, a grid of picture elements, each with its own colour and brightness, is projected onto the screen. This grid is composed of a number of horizontal and vertical lines: a standard PC display has 768 of the former and 1,024 of the latter. At the intersection of each horizontal and vertical line is a point known as a pixel (or picture element), and it is the combination of all the pixels on the screen that defines the image. It is possible to adjust the image resolution by having more lines (for higher resolution) or fewer (for lower resolution).
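As a small worked example (the three-bytes-per-pixel figure is an assumption for illustration), the snippet below counts the pixels in the 1,024 x 768 grid described above and the memory needed to hold one full-colour frame of it.

width, height = 1024, 768
bytes_per_pixel = 3                      # one byte each for red, green, and blue
pixels = width * height                  # 786,432 picture elements
frame_bytes = pixels * bytes_per_pixel   # 2,359,296 bytes, roughly 2.25 MB
print(pixels, frame_bytes, round(frame_bytes / 2**20, 2))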

THE DEVELOPMENT OF COMPUTER GRAPHICS

Computer graphics and animation, now familiar through film and video games, are also widely used in science and industry for computer-aided design and the graphical representation of data. In films, highly sophisticated computer graphics are widely used in the creation of special effects. One of the earliest instances of this is the 1982 Disney film Tron, in which computer-generated imagery (CGI) was extensively used. A full 15 minutes of the film consists of moving images generated entirely by computer. In addition to this, there are over 200 scenes in the film that utilize computer-generated backgrounds. Since then, CGI has become such an integral part of the film world that several major movies have been based entirely on computer graphics.
Before the 1960s there was no such thing as computer graphics. At that time, computers operated in batch mode and there was no interaction between the user and the computer, other than the initial submission of a job and the final collection of the results. This situation changed radically in 1962 with the development of a software program called Sketchpad by Ivan Sutherland. This was the first instance of a user interface based on graphics and it paved the way for computers to be used as aids in the production of drawings and images.

Computer Graphics


INTRODUCTION
Computer Graphics, the generation of graphs, tables, and static or moving images by computer systems, usually for display on a monitor screen. The complexity of the graphic material depends on the software program being used and on the amount of computer memory available. It is now quite common for synthetically produced computer graphic images to alter or to be integrated with visual and spatial information sampled from the real world through techniques such as motion capture.

COMPUTER GAMES


While video-game systems are used solely for gaming, games are only one of the many uses for computers. In computer games, players can use a keyboard to type in commands or a mouse to move a cursor around the screen, or sometimes both. Many computer games also allow the use of a joystick or game controller.
Computer games were first developed in the mid-1970s, when computer scientists started to create text adventure games to be played over networks of computers at universities and research institutions. These games challenged players to reach a goal or perform a certain task, such as finding a magical jewel. At each stop along the way, the games described a situation and required the player to respond by typing in an action. Each action introduced a new situation to which the player had to react. One of the earliest such games was called Zork.
Beginning in the late 1970s, Zork and similar games were adapted for use on personal computers, which were just gaining popularity. As technology improved, programmers began to incorporate graphics into adventure games. Because relatively few people owned home computers, however, the market for computer games grew slowly until the mid-1980s. Then, more dynamic games such as Choplifter, a helicopter-adventure game produced by Broderbund, helped fuel rising sales of computers. In 1982 Microsoft Corporation released Flight Simulator, which allows players to mimic the experience of flying an aeroplane.
As the power of personal computers increased in the 1980s, more sophisticated games were developed. Some of the companies that produced the most popular games were Sierra On-Line, Electronic Arts, and Strategic Simulations, Inc. (SSI). A series of so-called Sim games, produced by Maxis, enabled players to create and manage cities (SimCity), biological systems (SimEarth), and other organizational structures. In the process, players learned about the relationships between the elements of the system. For example, in SimCity a player might increase the tax rate to raise money only to find that people move out of the city, thus decreasing the number of taxpayers and possibly offsetting the increase in revenue. An educational mystery game called Where in the World Is Carmen Sandiego?, by Broderbund, was introduced in the 1980s and aimed at children. The game tests players’ reasoning ability and general knowledge by requiring them to track down an elusive master criminal by compiling clues found around the world.
Computer games continued to gain popularity in the 1990s, with the introduction of more powerful and versatile personal computers and the growing use of computers in schools and homes. With the development of CD-ROM technology, games also integrated more graphics, sounds, and videos, making them more engaging for consumers. The most successful games of the 1990s included Doom (by Id Software) and Myst (by Broderbund). Doom is a violent action game in which the player is a marine charged with fighting evil creatures. In Myst, a player wanders through a fictional island world, attempting to solve puzzles and figure out clues. The game’s appeal comes from the process of exploration and discovery.
Many of the most recent games, often referred to as interactive films, employ live actors and some of the techniques of film-making. With the growth of the Internet in the mid-1990s, multiplayer gaming also became popular. In games played over the Internet—such as Ultima Online by Electronic Arts—dozens, hundreds, or even thousands of people can play a game at the same time. Players wander through a fictional world meeting not only computer-generated characters but also characters controlled by other players. By the end of the 1990s the future for new computer games seemed limitless.

VIDEO GAMES

Video-game consoles, small handheld game devices, and coin-operated arcade games are special computers built exclusively for playing games. To control the games, players can use joysticks, trackballs, buttons, steering wheels (for car-racing games), light guns, or specially designed controllers that include a joystick, direction pad, and several buttons or triggers. Goggles and other kinds of virtual reality headgear can provide three-dimensional effects in specialized games. These games attempt to give the player the experience of actually being in a jungle, the cockpit of an aeroplane, or another setting or situation.
The first video games, which consisted of little more than a few electronic circuits in a simplified computer, appeared in the early 1970s as coin-operated cabinet games in pubs and arcades. In 1972 the Atari Company introduced a game called Pong, based on table tennis. In Pong, a ball and paddles are represented by lights on the screen; the ball is set in motion and by blocking it with the paddles, players knock it back and forth across the screen until someone misses. Pong soon became the first successful commercial video game. Arcade games have remained popular ever since.
Also in 1972 the Magnavox Company introduced a home video-game machine called the Odyssey system. It used similar ball-and-paddle games in cartridge form, playable on a machine connected to a television. In 1977 Atari announced the release of its own home video-game machine, the Atari 2600. Many of the games played on the Atari system had originally been introduced as arcade games. The most famous included Space Invaders and Asteroids. In Space Invaders, players have to shoot down ranks of aliens as they march down the screen, while in Asteroids, users are required to destroy asteroids before they crash into their ship. The longer the player survives, the more difficult both games become. After Atari’s success with home versions of such games, other companies began to compete for shares of the fast-growing home video-game market. Major competitors included Coleco with its ColecoVision system, and Mattel with Intellivision. Some companies, particularly Activision, gained success solely by producing games for other companies’ video-game systems.
After several years of enormous growth, the home video-game business collapsed in 1983. The large number of games on offer confused consumers, and many video-game users were increasingly disappointed with the games they purchased. They soon stopped buying games altogether. Failing to see the danger the industry faced, the leading companies continued to spend a great deal of money on product development and advertising. Eventually, these companies ran out of money and left the video-game business.
Despite this decline, the arcade segment of the industry continued to thrive. Pac-Man, which appeared in arcades in 1980, was one of the major sensations of the time. In this game, players manoeuvre a button-shaped character with a large mouth around a maze full of tiny dots. The goal is to gobble up all the dots without being touched by one of four pursuing ghosts. Another popular game was Frogger, in which players try to guide a frog safely across a series of obstacles, including a busy road and a river.
In the mid-1980s, Nintendo, a Japanese company, introduced the Nintendo Entertainment System (NES). The NES started a new boom in home video games, due primarily to two game series: Super Mario Bros. and The Legend of Zelda. These and other games offered more advanced graphics and animation than earlier home video-game systems had, re-igniting the interest of game players. Once again, other companies joined the growing home video-game market. One of the most successful was Sega, also based in Japan. In the early 1990s, the rival video-game machines were Nintendo’s Super NES and Sega’s Genesis. These systems had impressive capabilities to produce realistic graphics, sound, and animation.
Throughout the 1990s Nintendo and Sega competed for dominance of the American home video-game market, and in 1995 another Japanese company, Sony, emerged as a strong competitor. Sega and Sony introduced new CD-ROM systems in 1995, the Sega Saturn and the Sony PlayStation. A year later, Nintendo met this challenge with the cartridge-based Nintendo 64 system, which had even greater processing power than its competitors, meaning that faster and more complex games could be created. In 1998 Sega withdrew the Saturn system from the US market because of low sales.
Microsoft Corporation joined the games console market in November 2001 with the launch of the Xbox in the United States, followed by Japan and Europe by March 2002. Xbox is the first console to feature a hard drive and also has a DVD drive, CD-quality sound, and high-quality graphics; its critics claim that it is not a true gaming machine but a cut-down personal computer. In March 2003 Xbox Live—a subscription-based service that enables owners of the console to play games online through a broadband connection—was launched. Nintendo’s GameCube, a competitively priced alternative for younger game players, was introduced at around the same time as the Xbox. Both hoped to compete with Sony’s PlayStation 2, which hit the market in 2000 and cornered games sales in the early part of the century. In the spring of 2005 Sony unveiled its latest device, the PlayStation 3 (PS3), due to be released in early 2006, while Nintendo was preparing to reveal its Revolution machine. Keeping pace, Microsoft launched the Xbox 360 in the autumn of 2005. This new generation of devices promises realistic video as part of the gaming experience, among other innovations.

Electronic Games

INTRODUCTION
Electronic Games, software programs played for entertainment, challenge, or educational purposes. Electronic games are full of color, sound, realistic movement, and visual effects, and some even employ human actors. There are two broad classes of electronic games: video games, which are designed for specific video-game systems, handheld devices, and coin-operated arcade consoles; and computer games, which are played on personal computers.
Categories of electronic games include strategy games, sports games, adventure and exploration games, solitaire and multiplayer card games, puzzle games, fast-action arcade games, flying simulations, and versions of classic board games. Software programs that employ game-play elements to teach reading, writing, problem-solving, and other basic skills are commonly referred to as edutainment.
Electronic games put to use a variety of skills. Many games, such as Tetris and Pac-Man, serve as tests of hand-eye coordination. In these games the challenge is to play for as long as possible while the game gets faster or more complex. Other games, such as Super Mario Bros., are more sophisticated. They employ hand-eye coordination by challenging the player to react quickly to action on the screen, but they also test judgment and perseverance, sometimes by presenting puzzles that players must solve to move forward in the game. Strategy games ask players to make more complicated decisions that can influence the long-term course of the game. Electronic games can pit players against each other on the same terminal, across a local network, or via the Internet. Most games that require an opponent can also be played alone, however, with the computer taking on the role of the opponent.

AUTOMATION AND SOCIETY

Automation has made a major contribution towards increases in both free time and real wages enjoyed by most workers in industrialized nations. Automation has greatly increased production and lowered costs, thereby making cars, refrigerators, televisions, telephones, and other goods available to more people. It has allowed production and wages to increase, and at the same time the working week has decreased in most Western countries from 60 to 40 hours.

A. Employment
Not all the results of automation have been positive, however. Some commentators argue that automation has caused overproduction and waste, that it has created alienation among workers, and that it generates unemployment. Of these issues, the relationship between automation and unemployment has received the most attention. Employers and some economists argue that automation has little if any effect on unemployment—that workers are displaced rather than dismissed and are usually employed in another position within the same company or in the same position at another company that has not automated.
Some claim that automation generates more jobs than it displaces. They point out that although some laborers may become unemployed, the industry producing the automated machinery generates more jobs than were eliminated. The computer industry is often cited to illustrate this claim. Business executives would agree that although the computer has replaced many workers, the industry itself has generated more jobs in the manufacturing, sales, and maintenance of computers than the device has eliminated.
On the other hand, some labor leaders and economists argue that automation causes unemployment and, if left unchecked, will breed a vast army of unemployed that could disrupt the entire economy. They contend that growth in government-generated jobs and in service industries has absorbed those who became unemployed because of automation, and that as soon as these areas become saturated or the programmes are reduced, the true relationship between automation and unemployment will become known.

B. Automation and the Individual
Many researchers have described the effect that Detroit automation has on the individual worker as one of alienation. Excessive absenteeism, poor workmanship, and problems of alcoholism, drug addiction, and sabotage of the production lines are well-documented symptoms of this alienation. Many studies have been made since the 1930s, and all conclude that much of the alienation is due to the workers' feelings of being controlled by the machine (because workers must keep pace with the assembly line), boredom caused by repetitious work, and the unchallenging nature of work that requires only a minimum of skill.
The number of workers in more automated industries, especially those using continuous flow processes, tends to be small, and the capital investment in equipment per worker is high. The most dramatic difference between these industries and those using Detroit automation is the reduction in the number of semi-skilled workers. It would appear then that automation has little use for unskilled or semi-skilled workers, their skills being the most easily replaced by automated devices. The labor force needed in an automated plant consists primarily of such skilled workers as maintenance engineers, electricians, and toolmakers, all of whom are necessary to keep the automated machinery in good operating order.

Automation Industry

Many industries are highly automated or use automation technology in some part of their operation. In communications and especially in the telephone industry, dialing, transmission, and billing are all done automatically. Railways are also controlled by automatic signaling devices, which have sensors that detect carriages passing a particular point. In this way the movement and location of trains can be monitored.
Not all industries require the same degree of automation. Agriculture, sales, and some service industries are difficult to automate. The agriculture industry may become more mechanized, especially in the processing and packaging of foods; in many service industries, however, such as supermarkets, the checkout counter may be automated while the shelves or supply bins must still be stocked by hand. Similarly, doctors may consult a computer to assist in diagnosis, but they must make the final decision and prescribe treatment.
The concept of automation is evolving rapidly, partly because the applications of automation techniques vary both within a plant or industry and also between industries. The oil and chemical industries, for example, have developed the continuous-flow method of production, owing to the nature of the raw materials used. In a refinery, crude oil enters at one point and flows continuously through pipes in cracking, distillation, and reaction devices as it is being processed into such products as petrol and fuel oil. An array of automatic-control devices governed by microprocessors and coordinated by a central computer is used to control valves, heaters, and other equipment, thereby regulating both the flow and reaction rates.
In the steel, beverage, and canned food industries, on the other hand, some of the products are produced in batches. For example, a steel furnace is charged (loaded with the ingredients), brought up to heat, and a batch of steel ingots produced. In this phase very little automation is evident. These ingots, however, may then be processed automatically into sheet or structural shapes by being squeezed through a series of rollers until the desired shape is achieved.
The car and other consumer product industries use the mass production techniques of step-by-step manufacture and assembly. This technique approximates the continuous-flow concept but involves transfer machines; thus, from the point of view of the auto industry, transfer machines are essential to the definition of automation.
Each of these industries uses automated machines in all or part of its manufacturing processes. As a result, each industry has a concept of automation that fits its particular production needs. More examples can be found in almost every phase of commerce. The widespread use of automation and its influence on daily life provides the basis for the concern expressed by many about the influence of automation on society and the individual.

COMPUTER USE

The advent of the computer has greatly facilitated the use of feedback loops in manufacturing processes. Computers and feedback loops have promoted the development of numerically controlled machines (the motions of which are controlled by punched paper or magnetic tapes) and machining centers (machine tools that can perform several different machining operations).
More recently, the introduction of microprocessor and computer combinations has made possible the development of computer-aided design and computer-aided manufacture (CAD and CAM) technology. When using these systems a designer draws a part and indicates its dimensions with the aid of a mouse, light pen, or other input device. After the sketch has been completed to the satisfaction of the designer, the computer automatically generates the instructions that direct a machining centre to machine the part.
Another development that has further increased the use of automation is that of flexible manufacturing systems (FMS). FMS extends automation to companies in which small production runs do not make full automation economically feasible. A computer is used to monitor and govern the entire operation of the factory, from scheduling each step of production to keeping track of parts inventories and tool use.
Automation has also had an influence on areas of the economy other than manufacturing. Small computers are used in systems called word processors, which are rapidly becoming a standard part of the modern office. This technology combines a small computer with a cathode-ray display screen, a typewriter keyboard, and a printer. It is used to edit text, to type form letters tailored to the recipient, and to manipulate mailing lists and other data. The system is capable of performing many other tasks that increase office productivity.

Feedback

Essential to all automatic-control mechanisms is the feedback principle, which enables a designer to endow a machine with the capacity for self-correction. A feedback loop is a mechanical, pneumatic, or electronic device that senses or measures a physical quantity such as position, temperature, size, or speed, compares it with a pre-established standard, and takes whatever pre-programmed action is necessary to maintain the measured quantity within the limits of the acceptable standard.

The feedback principle has been used for centuries. An outstanding early example is the fly ball governor, invented in 1788 by the Scottish engineer James Watt to control the speed of the steam engine. In this device a pair of weighted balls is suspended from arms attached to a spindle, which is connected by gears to the output shaft of the engine. At the top of the spindle the arms are linked by a lever with a valve that regulates the steam input. As the engine speeds up beyond the desired rate, causing the spindle to rotate faster, the fly balls are driven upwards by the centrifugal effect. The action of the fly balls partly closes the input valve, reducing the amount of steam delivered to the engine. The common household thermostat is another example of a feedback device.
In manufacturing and production, feedback loops require that acceptable limits or tolerances be established for the process to be performed; that these physical characteristics be measured and compared with the set of limits; and, finally, that the feedback system be capable of correcting the process so that the measured items comply with the standard. Through feedback devices, machines can start, stop, speed up, slow down, count, inspect, test, compare, and measure. These operations are commonly applied to a wide variety of production operations that can include milling, boring, bottling, and refining.
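The Python sketch below is a minimal, illustrative rendering of that measure-compare-correct cycle, in the style of a household thermostat. The set point, tolerance, and temperature changes are invented numbers for the example, not a model of any real control system.

def feedback_step(measured, set_point, tolerance):
    # Compare the measurement with the acceptable limits and choose an action.
    if measured < set_point - tolerance:
        return "increase"   # e.g. open the steam valve or switch the heater on
    if measured > set_point + tolerance:
        return "decrease"   # e.g. close the valve or switch the heater off
    return "hold"           # within limits: no correction needed

# A crude simulation of a thermostat holding a room near 20 degrees.
temperature = 16.0
for _ in range(10):
    action = feedback_step(temperature, set_point=20.0, tolerance=0.5)
    if action == "increase":
        temperature += 1.0      # heating raises the temperature
    elif action == "decrease":
        temperature -= 1.0      # cooling or heat loss lowers it
    print(round(temperature, 1), action)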

ELEMENTS OF AUTOMATION

Automated manufacture arose out of the intimate relationship of such economic forces and technical innovations as the division of labor, power transfer and the mechanization of the factory, and the development of transfer machines and feedback systems as explained below.
The division of labor (that is, the reduction of a manufacturing or service process into its smallest independent steps) developed in the latter half of the 18th century and was first discussed by the British economist Adam Smith in his book An Inquiry into the Nature and Causes of the Wealth of Nations (1776). In manufacturing, the division of labor results in increased production and a reduction in the level of skills required of workers.
Mechanization was the next step necessary in the development of automation. The simplification of work made possible by the division of labor also made it possible to design and build machines that duplicated the motions of the worker. As the technology of power transfer evolved, these specialized machines were motorized and their production efficiency was improved. The development of power technology also gave rise to the factory system of production, because all workers and machines had to be located near the power source.
The transfer machine is a device used to move a work piece from one specialized machine tool to another, in such a manner as to properly position the work piece for the next machining operation. Industrial robots, originally designed only to perform simple tasks in environments dangerous to human workers, are now extremely dexterous and are being used to transfer, manipulate, and position both light and heavy work pieces, thus performing all the functions of a transfer machine. In actual practice, a number of separate machines are integrated into what may be thought of as one large machine.
In the 1920s the auto industry combined these concepts into an integrated system of production. The goal of this assembly-line system was to make cars available to people who previously could not afford them. This method of production was adopted by most car manufacturers and rapidly became known as Detroit automation. Despite more recent advances, it is this system of production that most people think of as automation.

Automation

INTRODUCTION
Automation, system of manufacture designed to extend the capacity of machines to perform certain tasks formerly done by humans, and to control sequences of operations without human intervention. The term automation has also been used to describe nonmanufacturing systems in which programmed or automatic devices can operate independently or nearly independently of human control. In the fields of communications, aviation, and astronautics, for example, such devices as automatic telephone switching equipment, automatic pilots, and automated guidance and control systems are used to perform various operations much faster or better than could be accomplished by humans.

Future Trends of E-Publishing

Many industry commentators believe that the Internet in general and the World Wide Web in particular will come to play a central role in electronic publishing, although speed of access at present is too slow to allow multimedia applications to run acceptably on it. Digital Versatile Disc Read-Only Memory (DVD-ROM) is likely to be a successor to CD-ROM, its main advantage being increased storage capacity for space-intensive multimedia elements such as video and audio files. However, DVD-ROM development costs are likely to be even higher than those for CD-ROM, and there is no compelling reason to believe that DVD-ROM will do better in the marketplace than CD-ROM. A small number of manufacturers are now producing simple set-top boxes that allow consumers to use their televisions for access to the Internet, but the convergence between broadcasting, personal computers, and the telephone that has been predicted for some while has yet to come about.
Although it has been suggested by both critics and advocates of electronic publishing that it might lead to the demise of the printed book, this seems unlikely, at least in the near future. At present there are no significant advantages to reading a novel or magazine in electronic format, and these types of publication are less expensive to produce in traditional print format. The market for hand-held devices that make electronic text more portable has never really developed, probably because books already offer readers portability and ease of use at a relatively low cost. It seems most likely that both electronic and print publications will continue to exist side by side well into the next millennium.

MARKETS FOR E-PUBLISHING

Online services offering databases of specialist business, scientific, and legal information on subscription made up the early professional market for electronic publication. Such databases are still heavily used, but are now mostly available on CD-ROM or on the World Wide Web. Business news services are now also part of the professional market for electronically published information, with subscribers receiving only those news stories relevant to their interests.
The consumer market for electronic publications is coming to be dominated by the World Wide Web. Many newspapers now have Web sites where news can be updated hourly or more frequently, although these sites are not yet very profitable, since their advertising revenue is not nearly as substantial as revenue generated from traditional print advertising. Consumer reference products such as encyclopedias and to some extent dictionaries, atlases, and other reference works make up a large part of the CD-ROM market and some may in time be accessible over the World Wide Web. These products are more marketable in electronic format than other types of book because they benefit more from the kind of automated searching and incorporation of multimedia elements (such as video and animation) that software allows. Educational publishing has also benefited from CD-ROM delivery for some of its products, as multimedia content is both attractive to students and can help them to understand complex concepts.
Multimedia games on CD-ROM are also a substantial part of the consumer market. These allow users to interact with characters and participate in adventures in virtual worlds that are often intricately designed and very complex. Multimedia games are usually not based on existing games but have largely grown out of the potential to manipulate images, video, and sound made possible by increasingly powerful personal computers.

ELECTRONIC PUBLISHING AND TRADITIONAL PRINT PUBLISHING

The move to electronic publishing for a traditional print-based publisher is not an easy one. Although most publishers own the rights to a great deal of text and images, these are often not held in digital format and have to be changed into machine-readable form before being published electronically. Text and images must then be “marked up” with some form of tagging, so that they can be accessed and manipulated by software. Any existing digital printer’s tapes that a publisher might hold also have to be converted, often laboriously, to a format that electronic-publishing software can use. If software has to be developed to display the text and images (for a multimedia product, for example), the cost to the publisher can be considerable. All this means that electronic publications are much more expensive to produce than print publications. However, manufacture of a CD-ROM is much less expensive than the manufacture of a book or journal, and once software has been developed it can sometimes be reused for other electronic products, so the high initial costs of electronic publication can eventually be offset.
Some publishers, especially science and journal publishers, now publish information simultaneously in print and electronic formats, or in electronic format alone. This allows them to originate text and images in digital format, with tagging that can be used for both print and electronic delivery. The speed of electronic publication over the Internet or through proprietary online services can be very valuable for scientific journal publications, since publishers can deliver rapidly changing data daily or even more frequently to subscribers. It is now so easy to publish on the World Wide Web that some researchers have even set up journals of their own, bypassing the traditional publishing process altogether. The quality of such publications can be an issue, however, since the editorial procedures that traditional publishers use to establish accuracy may also be bypassed.
Copyright is of greater concern for electronic publishers than for print publishers, since it is very easy for users to make digital copies of electronic publications and distribute them, or sell them, to other people. To address this issue, copyright law is being revised at both the national and international level to take electronic publication into account. Several systems exist, or are currently being developed, that allow publishers to encrypt their content in ways that will allow only authorized users to access it.
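As a rough illustration of that last idea, the following Python sketch encrypts a piece of content with a symmetric key so that only holders of the key can read it. It assumes the third-party cryptography package, which is not mentioned in the text; real rights-management systems add key distribution, licensing, and usage rules on top of this basic step.

from cryptography.fernet import Fernet  # third-party package; an assumption

def publish_encrypted(plaintext: bytes) -> tuple[bytes, bytes]:
    # Generate a key (to be distributed only to authorized users) and
    # encrypt the content with it.
    key = Fernet.generate_key()
    ciphertext = Fernet(key).encrypt(plaintext)
    return key, ciphertext

def read_with_key(key: bytes, ciphertext: bytes) -> bytes:
    # Only someone holding the key can recover the original content.
    return Fernet(key).decrypt(ciphertext)

if __name__ == "__main__":
    key, blob = publish_encrypted(b"Chapter 1: The Information Revolution")
    print(read_with_key(key, blob).decode())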

History of Electronic Publishing

HISTORY
Commercial electronic publishing began in the mid-1970s with the development of databases of scientific, legal, and business information. These were used mainly by information specialists in large companies and universities, and ran on mainframe computers or minicomputers. Few small businesses or people at home used any form of electronic publication, since the information itself was too specialized, and the software and hardware needed to access it were too expensive.
The consumer market for electronic publishing really began with the development of GUIs and the CD-ROM in the late 1980s. Some industry commentators expected that CD-ROM publishing would revolutionize the publishing industry but the market for CD-ROMs has been slow to develop. With the exception of encyclopedia and journal publishing, the changes to the publishing industry brought about by electronic publishing have been fairly limited so far. Some publishers who adopted the new technology as soon as it appeared found that the market was too small, and consequently they had to scale down their electronic-publishing activities. Publishers who were initially too conservative, however, have found it hard to catch up in such areas as reference, where electronic publishing could have benefited them.

Electronic Publishing

INTRODUCTION
Electronic Publishing, the distribution of information and entertainment in digital format, usually including software that allows users to interact with text and images. Most forms of information can be published electronically, but users normally require a personal computer and sometimes a connection to a network or the Internet to access the information. The advent of graphical user interfaces (GUIs) in the late 1980s made electronically published information much more marketable than it had been previously. This, along with more widespread availability of CD-ROM drives and intense interest in the potential of the Internet, has turned electronic publishing into a mass-market industry after years of being limited to specialist information.

FUTURE TRENDS IN TECHNOLOGY AND LEARNING

Radical technological developments in miniaturization, electronic communications, and multimedia hold the promise of affordable, truly personal and mobile computing. The move to digital data is blurring the boundary between broadcasting, publishing, and telephony by making all these media available through computer networks and computerized televisions. These developments are not only giving learners access to vast libraries and multimedia resources, but also live access to tutors and natural phenomena throughout the world.
As technology provides easier access for students to material previously supplied by the teacher, it enhances the role of the teacher as manager of the learning process rather than source of the content. Easier access for students to information, tutorials, and assessment, together with the use of IT tools such as word processors and spreadsheets, will help them learn more productively. There will be a clear split in the way schools and colleges organize learning. In areas of the curriculum that are structured and transferable to electronic format, students will work at different levels and on different content. Removing the burden of individualized learning from schools and colleges will free time for teachers to concentrate on the many other learning activities requiring a teacher as catalyst.
Developments in communications technology and the increase in personal ownership of technology will allow learning in schools and colleges to integrate with learning elsewhere. The boundaries between one institution and another and between institutions and the outside world will become less important. Crucially, technology will remove the barrier between school and home.
The momentum of the technological revolution creates rapid and disruptive changes in the way in which people live, work, and play. As the pace of technological advance shows no sign of slowing, the challenge is in learning to adapt to changes with the minimum of physical and mental stress. To make this possible, the learning systems and those who manage them must prepare people to work with new technologies competently and confidently. They need to expect and embrace constant change to skill requirements and work patterns, making learning a natural lifelong process.
However disturbing this challenge may at first seem, the nature of technology is that it not only poses problems but also offers solutions—constantly creating opportunities and providing new and creative solutions to the process of living and learning.

IT Education in various other parts of the world

A. IT IN EDUCATION IN THE UNITED STATES
Education in the United States is organized at state and school district level, but significant funding for IT in schools is provided through federal programmes. While all schools make some use of computers, the level of that use varies widely. Between 15 and 20 per cent of schools make extensive use of integrated learning systems. Multimedia computers have been used by some schools to develop pupils’ skills in producing essays containing text, sound, and still and moving images. The proposed extension of electronic communications systems, such as the Internet, to all “K-12” schools (kindergarten to grade 12, that is, up to age 18) has given rise to a number of pilots investigating how the education system could capitalize on the opportunities offered.
In his paper of February 23, 1993, Technology for America’s Growth, President Clinton declared that in teaching there should be an emphasis on high performance. He announced new public investment to support technology with the aim of increasing the productivity of teaching and learning in schools.

B. IT IN EDUCATION IN AUSTRALIA
In Australia the range and quality of IT-supported learning is comparable to that in Britain. A number of technology-led initiatives have been funded by federal and state departments. The federal government has identified the emerging information age as a major opportunity for Australian industry and society in general. A national strategy has been announced that is to explore ways of networking schools and colleges.

C. IT IN EDUCATION IN CANADA
Each individual provincial government in Canada has responsibility for running its schools, colleges, and universities. Although these may vary in their approach to education, they are all making substantial investments in IT. In particular, they are developing their use of communication technologies to support their school, college, and university systems. Provinces such as British Columbia and Nova Scotia have invested in extensive networks, which offer distance-learning programmes to overcome geographical barriers and to develop school and community use of technology. National involvement in this, and the development of a national “Schoolnet” network, is supported through the federal department Industry Canada.

DEVELOPMENT OF A STRATEGY FOR IT IN THE UNITED KINGDOM

From the early days of computers, the United Kingdom has recognized the need to develop a national strategy for the use of IT in education. England, Wales, Scotland, and Northern Ireland have developed separate but similar plans. The IT strategy for schools was initially developed in England, Wales, and Northern Ireland through the government-funded Microelectronics Education Programme, which had a research and development role from 1981 to 1986. Then followed the Microelectronics Education Support Unit, which provided professional support to local education authorities (LEAs). This merged in 1988 with the Council for Educational Technology to become the National Council for Educational Technology (NCET), with the wider remit of evaluating and promoting the use of new technologies in education and training. Scotland set up the Scottish Council for Educational Technology (SCET) to support developments for Scottish schools. NCET was a registered charity, funded primarily by the Department for Education and Employment (DfEE, retitled as the Department for Education and Skills or DfES after the 2001 general election). In April 1998 it was given a new role as the British Educational Communications and Technology Agency (BECTA).
In 1988 the Conservative government set up the Information Technology in Schools (ITIS) initiative to oversee expenditure in this area. Initial strategy focused on encouraging teacher training in new technologies and the provision of hardware in schools. Grants were made to LEAs; before obtaining the grant, each LEA was required to produce a policy statement and a five-year plan for the development of IT in its area. Different but similar initiatives were developed in Scotland, Wales, and Northern Ireland, with the general aim of stimulating schools and local authorities to support curriculum and management use of IT. Of substantial importance across the whole of the United Kingdom was the inclusion of IT as an essential component of the national curriculum for every student aged 5 to 16. The curriculum identifies a core set of IT capabilities and stresses that these should be developed by applying them across subject areas.
Grants to schools for IT development in England ceased in 1994. From 1994 to 1997 government strategy was based on providing information and advice to schools and stimulating the purchase of newer technologies. Following legislation in 1988, schools and colleges became increasingly autonomous in making their own purchasing and staffing decisions. The government was concerned to ensure the growth of viable and appropriate commercial markets for new IT products for schools and colleges. Through a number of NCET-managed intervention strategies, it stimulated specific areas, for example, the introduction of CD-ROMs in schools.
Between 1991 and 1995 some £12 million of government funding was made available through NCET for the purchase of CD-ROM systems by schools and the development of curriculum materials. This strategy resulted in over 90 per cent of secondary schools and more than 30 per cent of primary schools in England having access to CD-ROM systems, and in the development of an independent market for CD-ROM hardware and software for schools. Similar initiatives of varying scales and technologies, including portable computers for teachers, communications technologies, multimedia desktop computers, satellite technologies and integrated learning systems and libraries, have all contributed to keeping UK schools up to date with changes in technology. Research conducted by NCET showed clearly that IT changes what people learn and how they learn it.
After Labour came to power in May 1997, there was a marked change in government strategy on information and communications technology (ICT) for schools and colleges. Much of this strategy was based on developing a National Grid for Learning (NGfL). The concept was of a mosaic of networks and content providers linked together to create a nationwide learning network for schools, colleges, libraries, and, eventually, homes. To achieve this the government set targets for the year 2002: all schools should be connected to the NGfL, all teachers should be competent and confident to teach with ICT, all students should leave school with a good understanding of ICT, and all transactions between central and local government and schools should be electronic. To support this strategy the DfEE provided £50 million in 1998-1999, matched by LEAs and schools. Scotland and Northern Ireland developed similar initiatives, and by the end of 1999 all 1,300 schools in Northern Ireland were linked to the Internet. Plans were also made for substantial teacher-training programmes (costing over £200 million) across the United Kingdom; the scheme was run by the New Opportunities Fund (NOF), which reported in 2000 that, to that date, almost half of the teachers in England had registered for ICT training.

A. IT in Schools
In England, for example, there was in 1996 an average of 96 computers per secondary school and 13 per primary school. Expenditure on IT by schools steadily increased from £20 million per year in 1984 to £132 million in 1994, with well over half coming from schools’ budgets and the rest from central and local government sources. Despite this positive picture, hardware provision is variable, with some schools having a computer-to-pupil ratio of 1 to 3, while others have a ratio of 1 to 60. The average computer-to-pupil ratio in 1995-1996 was 1 to 19 in primary schools, and 1 to 9 in secondary schools. LEAs were set a target for the year 2001-2002 of 1 computer to 11 pupils in primary schools, and 1 to 7 in secondary schools.

B. IT in Further Education
The provision of hardware and software resources varies substantially in further education (FE) colleges. Learning resource centres now often contain learning materials published on CD-ROM, and most colleges are connected to the Internet. These technologies have the potential to develop “virtual campuses” and thus increase student access and participation. Although there is a trend towards individualized programmes of study for students, little use is made as yet of computer-managed learning. A programme of training in educational technology for FE staff, called the Quilt initiative, was launched in February 1997 as a joint venture between NCET, the Further Education Development Agency, the DfEE, and FE colleges.

C. IT in Higher Education
All UK universities are connected to the Internet via the academic network known as JANET. A high-speed broadband version of this network, SuperJANET, is being developed. It currently links 60 universities and enables high-quality moving video to be networked for remote teaching and research purposes. In 1993, through the Teaching and Learning Technology Programme, the Higher Education Funding Council provided over £11 million for 76 projects to develop software materials to support the university curriculum. Use of such materials is encouraged by 20 university centres set up under the Computers in Teaching Initiative. The use of the Internet and CD-ROM to access information continues to grow. In 2000 the Higher Education Funding Council for England (HEFCE) announced a new project, the 'e-University', to develop web-based learning for higher education institutions.

D. IT in Training
In 1994 research by a group called Benchmark found that use of computer-based training in public and private organizations in the United Kingdom had grown from 29 per cent in 1991 to 60 per cent in 1994. The use of other educational technologies was also evident: 12 per cent using interactive video, 6 per cent CD-I (compact disc-interactive), and 6 per cent CD-ROM.

III. IT AND THE CURRICULUM
As part of the IT curriculum, learners are encouraged to regard computers as tools to be used in all aspects of their studies. In particular, they need to make use of the new multimedia technologies to communicate ideas, describe projects, and order information in their work. This requires them to select the medium best suited to conveying their message, to structure information in a hierarchical manner, and to link together information to produce a multidimensional document.
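The hierarchical, linked structure mentioned above can be pictured with a small Python sketch: sections nest inside one another, and any section may also link sideways to related material. The class and the example project are purely illustrative.

from dataclasses import dataclass, field

@dataclass
class Section:
    # One node in a hierarchical document: child sections nest below it,
    # and "links" point sideways to related sections. Illustrative only.
    title: str
    body: str = ""
    children: list["Section"] = field(default_factory=list)
    links: list["Section"] = field(default_factory=list)

    def outline(self, depth: int = 0) -> str:
        lines = ["  " * depth + self.title]
        for child in self.children:
            lines.append(child.outline(depth + 1))
        return "\n".join(lines)

if __name__ == "__main__":
    project = Section("Rainforests project")
    climate = Section("Climate", "Rainfall and temperature data.")
    wildlife = Section("Wildlife", "Species observed on the field trip.")
    climate.links.append(wildlife)          # a sideways, cross-referencing link
    project.children += [climate, wildlife]
    print(project.outline())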
In addition to being a subject in its own right, IT has an impact on most other curriculum areas, since the National Curriculum requires all school pupils from 5 to 16 years to use IT in every compulsory subject. Science uses computers with sensors for logging and handling data; mathematics uses IT in modeling, geometry, and algebra; in design and technology, computers contribute to the pre-manufacture stages; for modern languages, electronic communications give access to foreign broadcasts and other materials; and in music, computers enable pupils to compose and perform without having to learn to play traditional instruments. For those with special educational needs, IT provides access to mainstream materials and enables students to express their thoughts in words, designs, and activities despite their disabilities.

IV. IT AND LEARNING PRODUCTIVITY
Using IT, learners can absorb more information and take less time to do so. Projects investigating the use of IT in learning demonstrate increased motivation in children and adults alike. In some cases it can mean success for people who have previously always failed. Learners may be more productive, challenge themselves more, be bolder, and have more confidence.

V. INTEGRATED LEARNING SYSTEMS
Another use of IT in learning is currently undergoing trials in the United Kingdom: integrated learning systems (ILS). These involve learning through rather than about IT, by providing structured, individualized tuition in numeracy and literacy. Using the system for short, regular sessions, learners progress through the programme at a steady but challenging rate. The system keeps a progress record, assesses the learner’s rate of performance, and produces reports for teachers, learners, and parents. This approach provides highly structured, targeted, and assessed learning for short periods of time.
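The record-keeping side of such a system can be sketched in a few lines of Python: each learner's short sessions are logged, a running success rate is kept, and a simple report is produced. The field names and the 80 per cent "move on" threshold below are illustrative assumptions, not features of any particular ILS product.

from dataclasses import dataclass, field

@dataclass
class LearnerRecord:
    # Progress record for one learner on one topic; one score per session.
    name: str
    topic: str
    scores: list[float] = field(default_factory=list)

    def log_session(self, correct: int, attempted: int) -> None:
        self.scores.append(correct / attempted)

    def report(self) -> str:
        average = sum(self.scores) / len(self.scores)
        # The 80 per cent threshold is an illustrative assumption.
        advice = "ready to move on" if average >= 0.8 else "needs more practice"
        return (f"{self.name} ({self.topic}): {len(self.scores)} sessions, "
                f"average {average:.0%}, {advice}")

if __name__ == "__main__":
    record = LearnerRecord("A. Pupil", "Fractions")
    record.log_session(correct=7, attempted=10)
    record.log_session(correct=9, attempted=10)
    print(record.report())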
Pupils and teachers alike find the individual ILS reports helpful and motivating, and teachers have never before had such detailed and accurate analysis of children’s abilities. The learning gains demonstrated so far have been encouraging. The multimedia attributes of the system make it possible to demonstrate complex concepts, and students can proceed at their own pace free from the pressure of their peers. Similar trials are taking place in Australia, Israel, and New Zealand. Integrated learning systems are used extensively in the United States.

VI. COMMUNICATION TECHNOLOGIES IN EDUCATION AND TRAINING
The use of communication tools such as e-mail, fax, and computer and video conferencing overcomes barriers of space and time, and opens new possibilities for learning. The use of such technology is increasing, and it is now possible to deliver training to a widely dispersed audience by means of on-demand two-way video over terrestrial broadband networks. The vocational training sector in Britain has been supported in developments in this area by projects funded by the Education and Employment Department and the European Commission’s Socrates and Leonardo programmes. Many schools have gained experience of communications through e-mail and electronic conferencing systems that run over the telephone network. The Education Department’s Superhighways Initiative comprises 25 projects—involving over 1,000 UK schools and colleges—that focus on the application of electronic communications in schools and colleges.
Schools and colleges are making increasing use of the Internet. In 1997 all FE colleges, most secondary schools, and some primary schools had access to the Internet, but it was expected that all schools would be online by 2002. Schools use the Internet both to access materials, people, and resources, and to display their own Web pages created by teachers and students. The use of videoconferencing is growing slowly and has helped some students learn foreign languages by talking directly to other students abroad. In January 2000, it was announced that teachers taking part in information technology training schemes would receive a subsidy of up to £500 to buy computing equipment.

A. Computer-Based Management Information Systems
Following a government initiative with LEAs in 1987, schools have made increasing use of computers for administration. The 1988 Education Act gave schools the responsibility for budgets, teacher and pupil records, and many other day-to-day administrative tasks. Many LEAs integrated their schools’ administrative systems with their own financial systems and provided extensive training and support for this. Between 1987 and 1997 schools and LEAs spent over £600 million on equipment and support. This has led to increasingly sophisticated uses of computer-based management information systems (CMIS), and the trend continues as communication technologies offer the opportunity for schools, LEAs, and government to exchange and compare data easily.

Information Technology in Education


INTRODUCTION
Information Technology in Education, effects of the continuing developments in information technology (IT) on education. The pace of change brought about by new technologies has had a significant effect on the way people live, work, and play worldwide. New and emerging technologies challenge the traditional process of teaching and learning, and the way education is managed. Information technology, while an important area of study in its own right, is having a major impact across all curriculum areas. Easy worldwide communication provides instant access to a vast array of data, challenging assimilation and assessment skills. Rapid communication, plus increased access to IT in the home, at work, and in educational establishments, could mean that learning becomes a truly lifelong activity—an activity in which the pace of technological change forces constant evaluation of the learning process itself.

PROBLEMS OF THE INFORMATION REVOLUTION

Public transport was challenged by the shift to cars, and familiar aspects of such public services as education and health could be challenged in the information revolution. Wider concerns over changing relations between private and public activities are reflected in debates about potential or already emerging problems, such as:
Threats to privacy (unauthorized access to personal data, increasing surveillance of public spaces by security cameras, etc.).
The alleged growth of “privatism” (a decline in shared activities as individuals pursue their own interests in isolation).
The scope for participation. (How far can people have a say in the use of IT in ways that affect them? Do new media support the emergence of new social movements, and of new forms of interest groups, lobbying, and mobilization?)
Questions about the ownership of knowledge. (Who has the right to charge for what kinds of information? Should large parts of the media be controlled by a few large companies? What is the scope for freedom of information to be practised by government?)
All this is in addition to the problems of information inequalities mentioned earlier. The information revolution fundamentally involves a change in the role of information-processing in our society. It is not surprising that fundamental questions are being raised about access to, and the use of, intimate and valuable sorts of information.
The outcome of the information revolution depends on social action and social choices as well as on technological developments. Just as industrial societies around the world take various forms, and there are very different ways of life within them, so it is likely that there will be a wide range of information societies. However, as new IT permits more global communication, and more firms expand into global markets, there are also strong forces at work to share elements of different cultures around the world on an unprecedented scale.

INFORMATION TECHNOLOGY AND THE CONSUMER

IT is diffusing into the home at different rates. The implications of consumer innovations can be substantial. Widespread use of cars facilitated new ways of life, with a growth of suburban living and out-of-town shopping centres, and a decline of train and bus services. The expansion of consumer IT is associated with changes in ways of working (for example, telework), playing (new home entertainment systems), shopping (teleshopping), and learning (multimedia products of various sorts, such as this encyclopedia). IT can be used in monitoring body conditions (digital thermometers, pulse meters, and blood-pressure meters are available), and in providing health and lifestyle monitoring and advice (recommending exercise levels, medical check-ups, or diets). Telephone helplines have long offered advice, counselling, and medical services; these and many other services are beginning, sometimes in rudimentary form, to be provided on the Internet.

TRENDS IN EMPLOYMENT

The tendency to fit a new technology into established structures, rather than to start afresh every time, has often been documented. It is one reason for the absence of the huge office job losses that were being predicted in the late 1970s and early 1980s, when word processing first began to be taken up on a large scale. However, this is no reason to assume that existing structures will endure. Industrial interest in new forms of organization, such as novel management structures, coordination of activities over large distances by means of telecommunications, teleworking, and other forms of distance working, indicates willingness to consider change.
The “hollow firm” is one effort to gain flexibility. The company attempts to dispense with the direct ownership and operation of many facilities that would traditionally have belonged to it, instead outsourcing production, distribution, and other tasks to other firms. Many computer companies, for example, buy in many or most of their components from specialist suppliers, and some firms do little more than design the computer for others to assemble.
A related idea is “de-layering”, or “flattening”, in which the company tries to do away with the numerous layers of middle management and administration that have traditionally processed information and communication flows between the senior staff and the shop floor or fieldworkers. New information systems are typically used to allow rapid communication across a reduced number of organizational layers.
By the late 1990s the integration of office IT is becoming apparent: material is increasingly exchanged by e-mail (which has finally established itself); many professionals use personal computers directly, often at home and while travelling, as well as in the office; and increasingly, personal computers are networked. Whether a loss of clerical jobs will result remains much debated. Some commentators point to job losses in office-based sectors such as financial services, which use IT intensively, as a harbinger of things to come. Others argue that the unemployment problems of industrial societies are related more to political and economic changes than to the use of new technology. Indeed, new information-related services are emerging, creating new jobs. While some office jobs may have gone, some other traditional clerical jobs have been upgraded to involve new functions made possible with new IT, such as desktop publishing, database management, and customer services.
A similar debate has concerned the quality of working life—whether skills have been enhanced or reduced, and whether working conditions have been improved or degraded, in the information revolution. Evidence to date indicates a mixed picture. There are certainly some areas in which conditions have worsened and skills have been lost, and many low-skill jobs have been created—for example, in cooking and serving fast food. Yet there is also a tendency for more jobs to be upgraded, and new technical skills and skill combinations are in demand. Large-scale deskilling has not taken place. Polarization of the workforce in terms of quality of work and levels of wages has ensued; at the same time a gulf has been opening between employed and unemployed people. Whether this is a result of the information revolution, or of more or less coincidental economic and political factors, the threat is evident of a widening social gulf between the “information-rich” and the “information-poor”. The former have information-processing skills, access to advanced technologies in their work, and the money to invest in IT at home for themselves and their children; the latter do not.

THE DIRECTION OF THE INFORMATION REVOLUTION

The outcome of the information revolution is seen by some commentators as likely to be as profound as the shift from agricultural to industrial society. Others see the transformation as essentially a change from one form of industrial society to another, as has happened in earlier technological revolutions.
One major issue is how rapidly social institutions adapt to take advantage of the new ways of doing things that new IT makes possible. While some jobs and some areas of people’s lives do seem to have changed rapidly, many others appear to have been affected relatively little. Historians point out that it can take a very long time for what in retrospect seems the obvious way to use a technology to become standard practice. For example, electric motors were first used as if they were steam engines, with one centralized motor powering numerous devices, rather than numerous small motors, each powering its own appliance.
New IT has often been introduced into well-established patterns of working and living without radically altering them. For example, the traditional office, with secretaries working at keyboards and notes being written on paper and manually exchanged, has remained remarkably stable, even if personal computers have replaced typewriters. Often the technology that gains acceptance is that which most easily fits within traditional ways of doing things. For example, the fax machine, which could take hand-written or typed notes, and was often delegated to a secretary to use, was hugely successful in the 1980s. At the beginning of that decade, it had been predicted that fax would rapidly die out and e-mail would take its place; but this proved to involve too much organizational change.

SOCIAL AND TECHNOLOGICAL DEVELOPMENTS

First, there are social and organizational changes. Information-processing has become increasingly visible and important in economic, social, and political life. One familiar piece of evidence is the statistical growth of occupations specializing in information activities. Numerous studies have demonstrated substantial growth in information-based occupations. These occupations now take the largest share of employment in the United States, the United Kingdom, and many other industrial societies. The biggest category is information processors—mainly office workers—followed by information producers, distributors, and infrastructure workers.
Second, there is technological change. The new information technology (IT) based on microelectronics, together with other innovations such as optical discs and fibre optics, underpins huge increases in the power, and decreases in the costs, of all sorts of information-processing. (The term “information-processing” covers the generation, storage, transmission, manipulation, and display of information, including numerical, textual, audio, and video data.) The information-processing aspects of all work can be reshaped through IT, so the revolution is not limited to information occupations: for example, industrial robots change the nature of factory work.
Computing and telecommunications (and also such areas as broadcasting and publishing) used to be quite distinct industries, involving distinct technologies. Now they have converged around certain key activities, such as use of the Internet. Using the same underlying technologies, modern computing and telecommunication devices handle data in digital form. Such data can be shared between, and processed by, many different devices and media, and used in a vast range of information-processing activities.
The pace of adoption of new IT has been very speedy: it is markedly more rapid than that of earlier revolutionary technologies, such as the steam engine or electric motor. Within 25 years of the invention of the microprocessor, it had become commonplace in practically every workplace and many homes: present not only in computers, but also in a huge variety of other devices, from telephones and television sets to washing machines and children’s toys.

Information Revolution

INTRODUCTION
Information Revolution, fundamental changes in the production and use of information, occurring in the late 20th century. Human societies throughout history have had “information specialists” (from traditional healers to newspaper editors); and they have had “information technologies” (from cave painting to accountancy); but two interrelated developments, social and technological, underpin the diagnosis that an information revolution is now occurring.

Wednesday, June 2, 2010

A brief overview of Information Technology - Advantages and Disadvantages

Our world today has changed a great deal with the aid of information technology. Things that were once done manually or by hand are now carried out by computerized systems, which often require just a single click of a mouse to get a task completed. With the aid of IT we are not only able to streamline our business processes but we are also able to obtain constant, up-to-the-minute information in 'real time'. The significance of IT can be seen from the fact that it has penetrated almost every aspect of our daily lives, from business to leisure and even society. Today personal PCs, cell phones, fax machines, pagers, email, and the internet have not only become an integral part of our very culture but also play an essential role in our day-to-day activities. With such a wide scope, for the purposes of this article we shall focus on the impact of the internet in information technology.

Some of the advantages of information technology include:

Globalization - IT has not only brought the world closer together, but it has allowed the world's economy to become a single interdependent system. This means that we can not only share information quickly and efficiently, but we can also bring down linguistic and geographic barriers. The world has developed into a global village with the help of information technology, allowing countries like Chile and Japan, which are separated not only by distance but also by language, to share ideas and information with each other.

Communication - With the help of information technology, communication has also become cheaper, quicker, and more efficient. We can now communicate with anyone around the globe by simply text messaging them or sending them an email and receive an almost instantaneous response. The internet has also opened up direct, face-to-face communication between people in different parts of the world thanks to the help of video conferencing.

Cost effectiveness - Information technology has helped to computerize business processes, thus streamlining businesses to make them extremely cost-effective, money-making machines. This in turn increases productivity, which ultimately gives rise to profits, which can mean better pay and less strenuous working conditions.

Bridging the cultural gap - Information technology has helped to bridge the cultural gap by helping people from different cultures to communicate with one another and allowing the exchange of views and ideas, thus increasing awareness and reducing prejudice.

More time - IT has made it possible for businesses to be open 24/7 all over the globe. This means that a business can be open anytime, anywhere, making purchases from different countries easier and more convenient. It also means that you can have your goods delivered right to your doorstep without having to move a single muscle.

Creation of new jobs - Probably the best advantage of information technology is the creation of new and interesting jobs. Computer programmers, systems analysts, hardware and software developers, and Web designers are just some of the many new employment opportunities created with the help of IT.

Some disadvantages of information technology include:

Unemployment - While information technology may have streamlined the business process, it has also created job redundancies, downsizing, and outsourcing. This means that many lower- and middle-level jobs have been done away with, causing more people to become unemployed.

Privacy - Though information technology may have made communication quicker, easier, and more convenient, it has also brought along privacy issues. From cell phone signal interceptions to email hacking, people are now worried about their once private information becoming public knowledge.

Lack of job security - Industry experts believe that the internet has made job security a big issue, since technology keeps changing every day. This means that one has to be in constant learning mode if he or she wishes to keep a job secure.

Dominant culture - While information technology may have made the world a global village, it has also contributed to one culture dominating another, weaker one. For example, it is now argued that the United States influences how many young teenagers all over the world act, dress, and behave. Languages too have become overshadowed, with English becoming the primary mode of communication for business and everything else.

Information Technology & Artificial Intelligence


Artificial Intelligence (AI): Bringing Common Sense, Expert Knowledge, and Superhuman Reasoning to Computers
Artificial Intelligence (AI) is the key technology in many of today's novel applications, ranging from banking systems that detect attempted credit card fraud, to telephone systems that understand speech, to software systems that notice when you're having problems and offer appropriate advice. These technologies would not exist today without the sustained federal support of fundamental AI research over the past three decades.
Although there are some fairly pure applications of AI -- such as industrial robots, or the IntellipathTM pathology diagnosis system recently approved by the American Medical Association and deployed in hundreds of hospitals worldwide -- for the most part, AI does not produce stand-alone systems, but instead adds knowledge and reasoning to existing applications, databases, and environments, to make them friendlier, smarter, and more sensitive to user behavior and changes in their environments. The AI portion of an application (e.g., a logical inference or learning module) is generally a large system, dependent on a substantial infrastructure. Industrial R&D, with its relatively short time-horizons, could not have justified work of the type and scale that has been required to build the foundation for the civilian and military successes that AI enjoys today. And beyond the myriad of currently deployed applications, ongoing efforts that draw upon these decades of federally-sponsored fundamental research point towards even more impressive future capabilities:
Autonomous vehicles: A DARPA-funded onboard computer system from Carnegie Mellon University drove a van all but 52 of the 2849 miles from Washington, DC to San Diego, averaging 63 miles per hour day and night, rain or shine;
Computer chess: Deep Blue, a chess computer built by IBM researchers, defeated world champion Garry Kasparov in a landmark performance;
Mathematical theorem proving: A computer system at Argonne National Laboratory proved a long-standing mathematical conjecture about algebra using a method that would be considered creative if done by humans;
Scientific classification: A NASA system learned to classify very faint signals as either stars or galaxies with superhuman accuracy, by studying examples classified by experts;
Advanced user interfaces: PEGASUS is a spoken language interface connected to the American Airlines EAASY SABRE reservation system, which allows subscribers to obtain flight information and make flight reservations via a large, on-line, dynamic database, accessed through their personal computer over the telephone.

In a 1977 article, the late AI pioneer Allen Newell foresaw a time when the entire man-made world would be permeated by systems that cushioned us from dangers and increased our abilities: smart vehicles, roads, bridges, homes, offices, appliances, even clothes. Systems built around AI components will increasingly monitor financial transactions, predict physical phenomena and economic trends, control regional transportation systems, and plan military and industrial operations. Basic research on common sense reasoning, representing knowledge, perception, learning, and planning is advancing rapidly, and will lead to smarter versions of current applications and to entirely new applications. As computers become ever cheaper, smaller, and more powerful, AI capabilities will spread into nearly all industrial, governmental, and consumer applications.
Moreover, AI has a long history of producing valuable spin-off technologies. AI researchers tend to look very far ahead, crafting powerful tools to help achieve the daunting tasks of building intelligent systems. Laboratories whose focus was AI first conceived and demonstrated such well-known technologies as the mouse, time-sharing, high-level symbolic programming languages (Lisp, Prolog, Scheme), computer graphics, the graphical user interface (GUI), computer games, the laser printer, object-oriented programming, the personal computer, email, hypertext, symbolic mathematics systems (Macsyma, Mathematica, Maple, Derive), and, most recently, the software agents which are now popular on the World Wide Web. There is every reason to believe that AI will continue to produce such spin-off technologies.
Intellectually, AI depends on a broad intercourse with computing disciplines and with fields outside computer science, including logic, psychology, linguistics, philosophy, neuroscience, mechanical engineering, statistics, economics, and control theory, among others. This breadth has been necessitated by the grandness of the dual challenges facing AI: creating mechanical intelligence and understanding the information basis of its human counterpart. AI problems are extremely difficult, far more difficult than was imagined when the field was founded. However, as much as AI has borrowed from many fields, it has returned the favor: through its interdisciplinary relationships, AI functions as a channel of ideas between computing and other fields, ideas that have profoundly changed those fields. For example, basic notions of computation such as memory and computational complexity play a critical role in cognitive psychology, and AI theories of knowledge representation and search have reshaped portions of philosophy, linguistics, mechanical engineering, and control theory.
Historical Perspective
Early work in AI focused on using cognitive and biological models to simulate and explain human information processing skills, on "logical" systems that perform common-sense and expert reasoning, and on robots that perceive and interact with their environment. This early work was spurred by visionary funding from the Defense Advanced Research Projects Agency (DARPA) and Office of Naval Research (ONR), which began on a large scale in the early 1960's and continues to this day. Basic AI research support from DARPA and ONR -- as well as support from NSF, NIH, AFOSR, NASA, and the U.S. Army beginning in the 1970's -- led to theoretical advances and to practical technologies for solving military, scientific, medical, and industrial information processing problems.
By the early 1980's an "expert systems" industry had emerged, and Japan and Europe dramatically increased their funding of AI research. In some cases, early expert systems success led to inflated claims and unrealistic expectations: while the technology produced many highly effective systems, it proved very difficult to identify and encode the necessary expertise. The field did not grow as rapidly as investors had been led to expect, and this translated into some temporary disillusionment. AI researchers responded by developing new technologies, including streamlined methods for eliciting expert knowledge, automatic methods for learning and refining knowledge, and common sense knowledge to cover the gaps in expert information. These technologies have given rise to a new generation of expert systems that are easier to develop, maintain, and adapt to changing needs.
Today developers can build systems that meet the advanced information processing needs of government and industry by choosing from a broad palette of mature technologies. Sophisticated methods for reasoning about uncertainty and for coping with incomplete knowledge have led to more robust diagnostic and planning systems. Hybrid technologies that combine symbolic representations of knowledge with more quantitative representations inspired by biological information processing systems have resulted in more flexible, human-like behavior. AI ideas also have been adopted by other computer scientists -- for example, "data mining," which combines ideas from databases, AI learning, and statistics to yield systems that find interesting patterns in large databases, given only very broad guidelines.
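A toy example of the "find interesting patterns" idea is sketched below in Python: a single pass over transaction records counts how often pairs of items occur together and reports those above a support threshold. Real data-mining systems are far more sophisticated; the shopping-basket data and the threshold are illustrative.

from collections import Counter
from itertools import combinations

def frequent_pairs(transactions: list[set[str]], min_support: int):
    # Count how often each pair of items appears together across all
    # transactions, and keep the pairs seen at least min_support times.
    counts = Counter()
    for basket in transactions:
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return [(pair, n) for pair, n in counts.items() if n >= min_support]

if __name__ == "__main__":
    baskets = [
        {"bread", "milk", "butter"},
        {"bread", "butter"},
        {"milk", "tea"},
        {"bread", "milk", "butter", "tea"},
    ]
    print(frequent_pairs(baskets, min_support=3))   # [(('bread', 'butter'), 3)]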


Case Studies

The following four case studies highlight application areas where AI technology is having a strong impact on industry and everyday life.


Authorizing Financial Transactions
Credit card providers, telephone companies, mortgage lenders, banks, and the U.S. Government employ AI systems to detect fraud and expedite financial transactions, with daily transaction volumes in the billions. These systems first use learning algorithms to construct profiles of customer usage patterns, and then use the resulting profiles to detect unusual patterns and take the appropriate action (e.g., disable the credit card). Such automated oversight of financial transactions is an important component in achieving a viable basis for electronic commerce.
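The two-stage approach described above can be sketched very simply in Python: build a profile of a customer's usual transaction amounts, then flag new transactions that fall far outside it. The z-score rule and the threshold of three standard deviations stand in for the learning algorithms real fraud systems use, and the figures are invented.

import statistics

def build_profile(history: list[float]) -> tuple[float, float]:
    # A customer's "profile" here is just the mean and spread of past amounts.
    return statistics.mean(history), statistics.stdev(history)

def is_unusual(amount: float, profile: tuple[float, float],
               threshold: float = 3.0) -> bool:
    # Flag anything more than `threshold` standard deviations from the mean.
    mean, spread = profile
    return abs(amount - mean) > threshold * spread

if __name__ == "__main__":
    past_spending = [12.50, 30.00, 22.75, 18.20, 25.00, 15.80]   # invented data
    profile = build_profile(past_spending)
    for amount in (27.00, 950.00):
        action = "flag for review" if is_unusual(amount, profile) else "approve"
        print(f"{amount:8.2f} -> {action}")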


Configuring Hardware and Software
AI systems configure custom computer, communications, and manufacturing systems, guaranteeing the purchaser maximum efficiency and minimum setup time, while providing the seller with superhuman expertise in tracking the rapid technological evolution of system components and specifications. These systems detect order incompletenesses and inconsistencies, employing large bodies of knowledge that describe the complex interactions of system components. Systems currently deployed process billions of dollars of orders annually; the estimated value of the market leader in this area is over a billion dollars.
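At its simplest, this kind of configuration checking can be pictured as a small knowledge base of "requires" and "excludes" rules applied to an order, as in the Python sketch below. The component names and rules are illustrative, not any vendor's actual catalogue.

# Illustrative "knowledge base" of component interactions.
REQUIRES = {"raid_controller": {"server_chassis"}, "tape_backup": {"scsi_card"}}
EXCLUDES = {"desktop_chassis": {"server_chassis"}}

def check_order(order: set[str]) -> list[str]:
    # Report missing prerequisites and mutually exclusive combinations.
    problems = []
    for item in order:
        for needed in REQUIRES.get(item, ()):
            if needed not in order:
                problems.append(f"incomplete: {item} requires {needed}")
        for banned in EXCLUDES.get(item, ()):
            if banned in order:
                problems.append(f"inconsistent: {item} excludes {banned}")
    return problems

if __name__ == "__main__":
    print(check_order({"desktop_chassis", "raid_controller"}))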


Diagnosing and Treating Problems
Systems that diagnose and treat problems -- whether illnesses in people or problems in hardware and software -- are now in widespread use. Diagnostic systems based on AI technology are being built into photocopiers, computer operating systems, and office automation tools to reduce service calls. Stand-alone units are being used to monitor and control operations in factories and office buildings. AI-based systems assist physicians in many kinds of medical diagnosis, in prescribing treatments, and in monitoring patient responses. Microsoft's Office Assistant, an integral part of every Office 97 application, provides users with customized help by means of decision-theoretic reasoning.
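The decision-theoretic flavour of such reasoning can be illustrated with a tiny Python sketch that updates prior beliefs about possible faults with the likelihood of an observed symptom (Bayes' rule) and reports the most probable cause. The faults, priors, and likelihoods are invented numbers, not taken from any real diagnostic system.

# Invented priors over possible photocopier faults and the likelihood of the
# observed symptom (streaked pages) under each fault.
PRIORS = {"empty toner": 0.2, "paper jam": 0.5, "fuser fault": 0.3}
LIKELIHOODS = {"empty toner": 0.7, "paper jam": 0.05, "fuser fault": 0.4}

def posterior(priors: dict[str, float],
              likelihoods: dict[str, float]) -> dict[str, float]:
    # Bayes' rule: multiply prior by likelihood, then normalize.
    unnormalized = {fault: priors[fault] * likelihoods[fault] for fault in priors}
    total = sum(unnormalized.values())
    return {fault: p / total for fault, p in unnormalized.items()}

if __name__ == "__main__":
    beliefs = posterior(PRIORS, LIKELIHOODS)
    for fault, p in sorted(beliefs.items(), key=lambda kv: -kv[1]):
        print(f"{fault:12s} {p:.2f}")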


Scheduling for Manufacturing
The use of automatic scheduling for manufacturing operations is exploding as manufacturers realize that remaining competitive demands an ever more efficient use of resources. This AI technology -- supporting rapid rescheduling up and down the "supply chain" to respond to changing orders, changing markets, and unexpected events -- has shown itself superior to less adaptable systems based on older technology. This same technology has proven highly effective in other commercial tasks, including job shop scheduling, and assigning airport gates and railway crews. It also has proven highly effective in military settings -- DARPA reported that an AI-based logistics planning tool, DART, pressed into service for operations Desert Shield and Desert Storm, completely repaid its three decades of investment in AI research.
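A greatly simplified Python sketch of the scheduling idea is given below: jobs with due dates are assigned to machines by an earliest-due-date rule, and the plan can simply be recomputed whenever orders change. Real planners use far richer constraint reasoning; the jobs and machines here are invented.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    hours: int
    due: int        # hours from now

def schedule(jobs: list[Job], machines: int) -> dict[int, list[str]]:
    # Earliest-due-date first, each job going to the least-loaded machine.
    finish_times = {m: 0 for m in range(machines)}
    plan: dict[int, list[str]] = {m: [] for m in range(machines)}
    for job in sorted(jobs, key=lambda j: j.due):
        machine = min(finish_times, key=finish_times.get)
        plan[machine].append(job.name)
        finish_times[machine] += job.hours
    return plan

if __name__ == "__main__":
    orders = [Job("gearbox", 4, 24), Job("casing", 2, 8), Job("axle", 3, 16)]
    print(schedule(orders, machines=2))     # recompute whenever orders change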


The Future
AI began as an attempt to answer some of the most fundamental questions about human existence by understanding the nature of intelligence, but it has grown into a scientific and technological field affecting many aspects of commerce and society.
Even as AI technology becomes integrated into the fabric of everyday life, AI researchers remain focused on the grand challenges of automating intelligence. Work is progressing on developing systems that converse in natural language, that perceive and respond to their surroundings, and that encode and provide useful access to all of human knowledge and expertise. The pursuit of the ultimate goals of AI -- the design of intelligent artifacts; understanding of human intelligence; abstract understanding of intelligence (possibly superhuman) -- continues to have practical consequences in the form of new industries, enhanced functionality for existing systems, increased productivity in general, and improvements in the quality of life. But the ultimate promises of AI are still decades away, and the necessary advances in knowledge and technology will require a sustained fundamental research effort.