TEXT: Multimedia
Active Vocabulary:
to peer, bowel, to malfunction, source, to spot, troublesome, complicated, to attach, to clarify, workstation, to involve, compound, infancy, to accelerate, phenomenon (phenomena), primary, to experience
Task 1. Read the text
A jet aircraft technician peers into the bowels of a malfunctioning engine searching for the source of the problem. Finally, he spots it. Buried deep within the engine is the troublesome part. He will have to replace it. A complicated procedure, to say the least.
The technician goes to his high-powered workstation attached to a network and calls up the information on the part and the replacement procedure. An image of the part seated in the engine appears. In another window, an instructor demonstrates the repair procedure in full motion video while the technician listens through the audio channel as the instructor explains the process. Diagrams pop up to further clarify key points. In a text window, he reviews lists of necessary parts and tools he will need to complete the repair.
Still confused about an irregularity in this situation, the technician presses the help key and a real-time image of a live supervisor pops up in another window. Using the attached microphone, the technician discusses the particular problem with the supervisor, who directs more information onto the technician’s screen. The technician points a video camera at the part in question, to show the supervisor the specific situation.
Welcome to the world of high-end multimedia. The situation described above is not quite here yet, but most of the pieces already exist to make this scenario become a reality using a networked RS/6000 or other high-powered workstation.
Or take this example of a scenario which is more likely today. A manager creates a detailed business presentation involving text, graphics, digitized photographic still images, and tables of spreadsheet data all combined in a single compound document. Before sending the document across the network to a colleague, the manager picks up the microphone and attaches an audio note to one of the tables, reminding the colleague about something unusual or potentially confusing in the accompanying figures.
Using a networked RS/6000 equipped with the necessary audio boards and Bolt Beranek and Newman's (Cambridge, Mass.) BBN/Slate, a compound office automation application, this scenario is possible today. High-end multimedia is only in its infancy, but it is here. And over the next few years, industry observers expect multimedia development to accelerate as current barriers are overcome.
Multimedia is not a new phenomenon, although it is new to business computing. We live in a multimedia world. At home, we experience a variety of media through our television: full-motion video, still images, graphics, sound, and animation. At school, we learn through systematic exposure to different media: the instructor’s words, text, audio tapes, graphics, and a variety of visuals and video.
Computers, however, have tended to be uni-medium. Traditionally, computers were text-based, and this continues to be the primary format for business information. A few systems have provided sound or graphics, but until recently, the efforts were rudimentary compared to the seamlessly integrated, high-quality visuals, video and audio we experience every evening at home.
Task 2. Read these sentences and decide which one best summarizes the text
1. Computers cannot yet match the technological achievements of conventional audio-visual systems.
2. Although multimedia computer systems are improving very fast, they do not yet reflect the multimedia world we already live in.
3. Multimedia computer technology will soon be widely used in business, in industry, and in the home.
Task 3. Decide whether the following statements are true [T] or false [F] in relation to the information in the text
1. The jet aircraft technician first locates the faulty part and makes a mark on it. [ ]
2. The technician calls up his supervisor from his workstation to get information about the faulty part and replacement procedure. [ ]
3. The technician can display a computer-generated graphical representation of a supervisor on his screen. [ ]
4. This kind of repair procedure has been possible for a long time. [ ]
5. You can already buy multimedia business presentation applications. [ ]
6. Industrial experts expect multimedia development to get faster and faster. [ ]
7. Multimedia existed long before the invention of the computer. [ ]
8. In terms of quality, multimedia computer systems have only recently become comparable with the media we already use. [ ]
Task 4. Work in pairs. You manage a company specializing in multimedia hardware and software. Prepare a leaflet to inform companies of the potential benefits of using multimedia. Invite them to contact you for a free consultation.
ADDITIONAL TEXTS
Computer
A computer is a device capable of performing a series of arithmetic or logical operations. A computer is distinguished from a calculating machine, such as an electronic calculator, by being able to store a computer program (so that it can repeat its operations and make logical decisions), by the number and complexity of the operations it can perform, and by its ability to process, store, and retrieve data without human intervention. Computers developed along two separate engineering paths, producing two distinct types of computer – analog and digital. An analog computer operates on continuously varying data; a digital computer performs operations on discrete data.
Computers are categorized by both size and the number of people who can use them concurrently. Supercomputers are sophisticated machines designed to perform complex calculations at maximum speed; they are used to model very large dynamic systems, such as weather patterns. Mainframes, the largest and most powerful general-purpose systems, are designed to meet the computing needs of a large organization by serving hundreds of computer terminals at the same time. Minicomputers, though somewhat smaller, also are multiuser computers, intended to meet the needs of a small company by serving up to a hundred terminals. Microcomputers, computers powered by a microprocessor, are subdivided into personal computers and workstations, the latter typically incorporating RISC processors. Although microcomputers were originally single-user computers, the distinction between them and minicomputers has blurred as microprocessors have become more powerful. Linking multiple microcomputers together through a local area network or by joining multiple microprocessors together in a parallel-processing system has enabled smaller systems to perform tasks once reserved for mainframes.
Advances in the technology of integrated circuits have spurred the development of smaller and more powerful general-purpose digital computers. Not only has this reduced the size of the large, multi-user mainframe computers – which in their early years were large enough to walk through – to that of large pieces of furniture, but it has also made possible powerful, single-user personal computers and workstations that can sit on a desktop. These, because of their relatively low cost and versatility, have largely replaced typewriters in the workplace and rendered the analog computer inefficient.
Computer architecture
Early computer designers labored to pack as much capability as possible into each instruction and thus gave a major share of resources to the control path. There were at least three reasons for adopting this strategy. First, more-powerful instructions would make life easier for the programmer. Second, they promised higher performance: If each instruction could accomplish more, then the computer could complete a task with fewer instructions. The third reason is related to the second: Programs made up of fewer instructions take up less room in memory. The principal drawback of the strategy was the grotesque complexity of the decoder and control unit needed to implement very complex instructions.
A solution to this problem was devised remarkably early in the history of electronic digital computing. In 1951 Maurice V. Wilkes of the University of Cambridge proposed the idea of microprogramming, which turns the control unit into a miniature computer within the computer. Instead of building a tangled web of hardware gates and latches to translate instruction words into control actions, the designer creates a sequence of microinstructions, or microcode, whose bits correspond more or less directly to the necessary control signals. The microcode is stored in a special memory array within the processor. The original program instruction – now called a macroinstruction – becomes a pointer into this memory, selecting which sequence of microinstructions is to be executed.
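The idea described above – a macroinstruction acting as a pointer that selects a sequence of microinstructions, each asserting control signals – can be sketched in a few lines of Python. The machine, instruction names, and control-signal names below are all invented for illustration, not taken from any real processor.

```python
# Toy microprogramming model: each macroinstruction selects a stored
# sequence of microinstructions, and each microinstruction is the set
# of control signals it asserts.

MICROCODE = {
    "ADD": [                        # macroinstruction "ADD" points here
        {"fetch_operand_a"},
        {"fetch_operand_b"},
        {"alu_add", "latch_result"},
        {"write_back"},
    ],
    "LOAD": [
        {"compute_address"},
        {"memory_read"},
        {"write_back"},
    ],
}

def execute(macroinstruction):
    """Expand a macroinstruction into its sequence of control actions."""
    signals = []
    for microinstruction in MICROCODE[macroinstruction]:
        signals.append(sorted(microinstruction))
    return signals

print(execute("ADD"))
```

A horizontal microprogram would collapse each sequence into one wide microinstruction; a vertical one would encode the signal sets more compactly and decode them at run time.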
Microprogramming was slow to catch on until it was adopted by IBM in the System/360 series of mainframe computers in the 1960s. Thereafter it became very popular, and by 1980 virtually all computers were microprogrammed. Two styles of microcode evolved. Horizontal microprograms are short and fat; in the limiting case the microprogram for each macroinstruction consists of a single wide microinstruction with one bit for each control signal in the processor. Vertical microprograms are tall and skinny, with narrower microinstructions that need some degree of decoding before they can be supplied to the control circuitry.
What is the Internet?
What is this computer phenomenon called the Internet, or the Net? Do you personally have need of it? Before you decide to get "on" the Internet, you may want to know something about it. In spite of all the hype, there are reasons to exercise caution, especially if there are children in the home. Imagine a room filled with many spiders, each spinning its own web. The webs are so interconnected that the spiders can travel freely within this maze. You now have a simplified view of the Internet - a global collection of many different types of computers and computer networks that are linked together. Just as a telephone enables you to talk to someone on the other side of the earth who also has a phone, the Internet enables a person to sit at his computer and exchange information with other computers and computer users anyplace in the world.
Some refer to the Internet as the information superhighway. Just as a road allows travel through different areas of a country, so the Internet allows information to flow through many different interconnected computer networks. As messages travel, each network that is reached contains information that assists in connecting to the adjacent network. The final destination may be in a different city or country. Each network can "speak" with its neighbor network by means of a common set of rules created by the Internet designers. Worldwide, how many networks are connected? Some estimates say over 30,000. According to recent surveys, these networks connect over 10,000,000 computers and some 30,000,000 users throughout the world. It is estimated that the number of connected computers is doubling each year.
What can people locate on the Internet? It offers a rapidly growing collection of information, with topics ranging from medicine to science and technology. It features exhaustive material on the arts as well as research material for students and coverage of recreation, entertainment, sports, shopping and employment opportunities. The Internet provides access to almanacs, dictionaries, encyclopedias and maps.
Internet
If you think of all the information people have been given about the increasing use of the Internet system, you could be forgiven for thinking that youngsters all over the world are using it. But you would be wrong.
It isn't as though they wouldn't want to, given the opportunity, but there isn't the time or the money in many junior schools to let them. Although computers are now used widely in schools, most lack the funds and teachers with enough technical expertise to be able to successfully install or operate an Internet system.
A specialist company called Research Machines (RM) develops and supplies information systems, software and services to junior and secondary schools, colleges and universities.
It specializes in the British education system and offers some very comprehensive packages. RM sets a fixed annual fee and this means that users have the advantage of spending a longer time on the Internet without continually having to worry about the cost.
Internet users communicate with one another by using telephone circuits, and, like RM, most Internet systems charge a standard fee with no time charges. This could lead to heavy Internet users taking advantage of this standard fee, and spending hours on the net. This jams local telephone circuits and may prevent ordinary bill-paying telephone users from making calls, even in emergencies.
Not only do Internet users jam telephone circuits and create inconvenience and possible danger to telephone users, but they also cost the telephone companies a great deal of money.
The companies frequently have to replace and install expensive new circuits. This is the result of the fact that Internet users are continually overloading their systems.
The problem is that while it is costing the telephone companies money, it is quite legal. Telephone companies claim that Internet users are abusing their networks and this is fast turning into war. But it is a war that will have to be fought on an international scale.
Of course it is inevitable that there will be drawbacks to something as powerful as an international communications system. But with RM, the benefits that young people stand to gain will certainly be to their advantage.
About Internet
Internet is the international computer network linking together thousands of individual networks at military and government agencies, educational institutions, nonprofit organizations, industrial and financial corporations of all sizes, and commercial enterprises (called gateways or service providers) that enable individuals to access the network. The most popular features of the Internet include electronic mail (e-mail), discussion groups (called newsgroups or bulletin boards, where users can post messages and look for responses on a system called Usenet), on-line conversations (called chats), adventure and role-playing games, information retrieval, and electronic commerce (e-commerce).
The public information stored in the multitude of computer networks connected to the Internet forms a huge electronic library, but the enormous quantity of data and number of linked computer networks also make it difficult to find where the desired information resides and then to retrieve it. A number of progressively easier-to-use interfaces and tools have been developed to facilitate searching. Among these are search engines, such as Archie, Gopher, and WAIS (Wide Area Information Server), and a number of commercial indexes, which are programs that use a proprietary algorithm to search a large collection of documents for keywords and return a list of documents containing one or more of the keywords. Telnet is a program that allows users of one computer to connect with another, distant computer in a different network. The File Transfer Protocol (FTP) is used to transfer information between computers in different networks. The greatest impetus to the popularization of the Internet came with the introduction of the World Wide Web (WWW), a hypertext system that makes browsing the Internet both fast and intuitive. Most e-commerce occurs over the Web.
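The keyword-index idea described above – search a collection of documents for keywords and return a list of documents containing one or more of them – can be sketched very simply. The document collection and keywords below are invented for illustration; real indexes use far more sophisticated, proprietary algorithms.

```python
# Minimal keyword search over a tiny invented document collection:
# return every document that contains at least one of the keywords.

documents = {
    "doc1": "the internet links thousands of computer networks",
    "doc2": "electronic mail is a popular internet feature",
    "doc3": "hypertext makes browsing fast and intuitive",
}

def search(keywords):
    """Return names of documents containing one or more keywords."""
    hits = []
    for name, text in documents.items():
        if set(text.split()) & set(keywords):
            hits.append(name)
    return hits

print(search(["internet"]))     # matches doc1 and doc2
```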
Each computer that is directly connected to the Internet is uniquely identified by a 32-bit binary number called its IP address. This address is usually seen as a four-part decimal number, each part equating to 8 bits of the 32-bit address in the decimal range 0-255. Because an address of the form 4.33.222.111 could be difficult to remember, a system of Internet addresses, or domain names, was developed in the 1980s. Reading from left to right, the parts of a domain name go from specific to general. For example, www.irs.ustreas.gov is a World Wide Web site at the Internal Revenue Service, which is part of the U.S. Treasury Dept., which is a government agency. The rightmost part, or top-level domain (or suffix or zone), can be a two-letter abbreviation of the country in which the computer is in operation; more than 250 abbreviations, such as “ca” for Canada and “uk” for United Kingdom, have been assigned. Although such an abbreviation exists for the United States (us), it is more common for a site in the United States to use a three-letter, specialized top-level domain such as edu (educational institution), gov (government), or mil (military) or one of the four domains designated for open registration worldwide, com (commercial), int (international), net (network), or org (organization). In 2000 seven additional top-level domains were approved for worldwide use. An Internet address is translated into an IP address by a domain-name server, a program running on an Internet-connected computer.
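The arithmetic behind the dotted-decimal notation described above can be checked in code: each of the four parts is one 8-bit field of the 32-bit address, in the range 0-255. This sketch uses the example address 4.33.222.111 from the text; the helper names are my own.

```python
# Convert between a 32-bit integer IP address and dotted-decimal notation.

def to_dotted(address):
    """Split a 32-bit integer into four 8-bit fields, high byte first."""
    parts = []
    for shift in (24, 16, 8, 0):
        parts.append(str((address >> shift) & 0xFF))
    return ".".join(parts)

def to_int(dotted):
    """Pack four decimal fields back into one 32-bit integer."""
    value = 0
    for part in dotted.split("."):
        value = (value << 8) | int(part)
    return value

print(to_dotted(to_int("4.33.222.111")))   # prints 4.33.222.111
```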
The Internet evolved from a secret feasibility study conceived by the U.S. Dept. of Defense in 1969 to test methods of enabling computer networks to survive military attacks, by means of the dynamic rerouting of messages. As the ARPAnet (Advanced Research Projects Agency network), it began by connecting three networks in California with one in Utah – these communicated with one another by a set of rules called the Internet Protocol (IP). By 1972, when the ARPAnet was revealed to the public, it had grown to include about 50 universities and research organizations with defense contracts, and a year later the first international connections were established with networks in England and Norway. A decade later, the Internet Protocol was enhanced with a set of communication protocols, the Transmission Control Program/Internet Protocol (TCP/IP), that supported both local and wide-area networks. Shortly thereafter, the National Science Foundation (NSF) created the NSFnet to link five supercomputer centers, and this, coupled with TCP/IP, soon supplanted the ARPAnet as the backbone of the Internet. In 1995, however, the NSF decommissioned the NSFnet, and responsibility for the Internet was assumed by the private sector. Fueled by the increasing popularity of personal computers, e-mail, and the World Wide Web (which was introduced in 1991 and saw explosive growth beginning in 1993), the Internet became a significant factor in the stock market and commerce during the second half of the decade.
World Wide Web
World Wide Web (WWW or W3), collection of globally distributed text and multimedia documents and files and other network services linked in such a way as to create an immense electronic library from which information can be retrieved quickly by intuitive searches. The Web represents the application of hypertext technology and a graphical interface to the Internet to retrieve information that is contained in specially formatted documents that may reside in the same computer or be distributed across many computers around the world. It consists of three main elements. The Hypertext Markup Language (HTML) comprises the programming codes, or tags, that define fonts, layouts, embedded graphics, and links (hyperlinks) to other documents accessible via the Web. The HyperText Transfer Protocol (HTTP) defines a set of standards for transmitting Web pages across the Internet. The Uniform Resource Locator (URL) is a standardized naming convention for identifying a Web document or file, in a sense the address of a link. The result is called the Web because it is made up of many sites, all linked together, with users traveling from one site to the next by clicking a computer's pointing device on a hyperlink.
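The parts of a URL can be pulled apart with Python's standard `urllib.parse` module, which makes the naming convention concrete. The address below is an invented example, not one from the text.

```python
# Split an example URL into its named components.
from urllib.parse import urlparse

url = "http://www.example.org/docs/index.html"
parts = urlparse(url)

print(parts.scheme)   # the transfer protocol: "http"
print(parts.netloc)   # the host (domain) name: "www.example.org"
print(parts.path)     # the document's location on that host
```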
Web sites, also called Web pages, are really Internet sites that all use the same techniques and HTML tags to create multimedia documents with hypertext links. Each Web page can contain many screens or printed pages of text, graphics, audio, and even video, and the starting point for any Web site is called its home page. Although each page is an Internet site, it must be accessed via a special program called a Web browser, which can translate the HTML into the graphical images, text, and hypertext links intended by the creator of the page.
Interactive television is a generic term that encompasses a variety of Web-related television technologies and products. Typically, a home television receiver and a telephone line are connected through a small appliance that accesses the Internet through the telephone line and converts the downloaded Web pages into a form that can be displayed on the receiver. A remote control interface allows the user to navigate through the Web and select the information to be displayed.
Ted Nelson, an American computer consultant, had promoted the idea of linking documents via hypertext during the 1960s, but the technology required was not to be available for another 20 years. The foundation of what we now think of as the Web originated with work done on the retrieval of information from distributed systems by Tim Berners-Lee at the European Laboratory for Particle Physics (CERN) during the 1980s. This culminated in the introduction of a text-only interface, or browser, to the scientific community in 1990 and to the public in 1991. Because of the difficulty of using this version, acceptance outside the scientific and academic communities was slow. Marc Andreessen, an undergraduate student working at the National Center for Supercomputing Applications (NCSA), developed a graphical browser for the Web, introducing a UNIX version in 1993. Versions for the Windows and Macintosh operating systems followed in 1994, and acceptance of the World Wide Web blossomed quickly. In the late 1990s the development of improved browsers with greater multimedia functionality, security, and privacy, as well as more powerful search engines capable of indexing the ever greater information on the Web, led to the commercialization of the Internet (see e-commerce).
The informatization of the society
Progress increasingly demands the creation of common conditions and a common language for interaction among the objects and subjects of the whole world. The first systems compelled to go this way were communications systems. The International Consultative Committee on Telephony and Telegraphy (CCITT) urgently, and with great success, standardizes the rules of international communications. This Committee covers the regulations for exchanging all types of information between different communications networks, the tarification of services, communications with computers; in general, everything that facilitates communication between nations. International and state information and computer networks perform the same functions. Personal magnetic cards for commercial settlements also promote the unification of the human-machine language of communication, thanks to their international status. This process was continued by the EDIFACT standard, approved by the UN Commission on paperless technology of communications in trade and transport.
Unfortunately, not all fields of application of computers and other informatization means have a base that allows the means of communication between people to be unified, and their communication to be made easier and better. It is impossible to work with a number of data banks, and it is rather difficult even to build models on the basis of information from several banks, without substantial expenditure of mental and manual labor. Many applied automated control systems are quite incompatible, even within the same industries of a number of countries, to say nothing of across different countries.
The foregoing facts suggest that the informatization of a society with high communicability of its objects and subjects is possible only if they are united by an international, unified, developing system that serves all the objects of this system really and usefully. Such a system may be a new infrastructure branch, like the communications system or transport, which, by analogy, may be called Informatics. It may pave the way for the processing, storage and transfer of data, and perform all the everyday duties of informational services for the objects and subjects of society, from the issuing of certificates to the carrying out of all types of orders and payments and the organization of medical, transport, consumer and other services.
Word processing
Use of a computer program or a dedicated hardware and software package to write, edit, format, and print a document. Text is most commonly entered using a keyboard similar to a typewriter's, although handwritten input (see pen-based computer) and audio input (as for dictation) devices have been introduced.
Word processors have various functions that allow a person to revise text without retyping an entire document. As the text is entered or after it has been retrieved, sections ranging from words and sentences to paragraphs and pages can be moved, copied, deleted, altered, and added to while displayed. As word processors have become more sophisticated, such functions as word counting, spell checking, footnoting, and index generation have been added. In addition, a document's format – type size, line spacing, margins, page length, and the like – usually can be easily altered. To aid in these alterations, the text is displayed as it will appear when printed, with indented paragraphs and lists, multiple columns, tables, etc.; this is called a what-you-see-is-what-you-get (WYSIWYG) display.
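Two of the functions mentioned above, word counting and revising text without retyping it, can be sketched with plain Python string operations. The sample document and helper names are invented for illustration; real word processors implement far richer versions of both.

```python
# Two basic word-processor functions over a sample document.

document = "Word processors let a person revise text without retyping text."

def word_count(text):
    """Count the words in a document, as a word-count feature would."""
    return len(text.split())

def replace_all(text, old, new):
    """Replace every occurrence of a word: revision without retyping."""
    return text.replace(old, new)

print(word_count(document))                          # prints 10
print(replace_all(document, "text", "a document"))
```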
Word processors are distinguished from text editors and desktop publishing systems. Text editors are designed for creating and editing computer programs. While they have features found in simple word processors, such as search and replace, that make the entry and editing of words and numbers easier, text editors provide only the most primitive facilities for text formatting and printing. Desktop publishers may include only simple word processing features but provide enhanced formatting functions, such as routines for merging text and graphics into complex page layouts.
Artificial Intelligence
Artificial intelligence (AI), the use of computers to model the behavioral aspects of human reasoning and learning. Research in AI is concentrated in some half-dozen areas. In problem solving, one must proceed from a beginning (the initial state) to the end (the goal state) via a limited number of steps; AI here involves an attempt to model the reasoning process in solving a problem, such as the proof of a theorem in Euclidean geometry.
In game theory (see games, theory of), the computer must choose among a number of possible “next” moves to select the one that optimizes its probability of winning; this type of choice is analogous to that of a chess player selecting the next move in response to an opponent's move. In pattern recognition, shapes, forms, or configurations of data must be identified and isolated from a larger group; the process here is similar to that used by a doctor in classifying medical problems on the basis of symptoms.
Natural language processing is an analysis of current or colloquial language usage without the sometimes misleading effect of formal grammars; it is an attempt to model the learning process of a translator faced with the phrase “throw mama from the train a kiss.”
Cybernetics is the analysis of the communication and control processes of biological organisms and their relationship to mechanical and electrical systems; this study could ultimately lead to the development of “thinking” robots (see robotics).
Machine learning occurs when a computer improves its performance of a task on the basis of its programmed application of AI principles to its past performance of that task. In the public eye, advances in chess-playing computer programs have become symbolic of progress in AI.
In 1948 British mathematician Alan Turing developed a chess algorithm for use with calculating machines – it lost to an amateur player in the one game that it played. Two years later American mathematician Claude Shannon articulated two chess-playing algorithms: brute force, in which all possible moves and their consequences are calculated as far into the future as possible; and selective mode, in which only the most promising moves and their more immediate consequences are evaluated.
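The brute-force idea just described – examine every move and its consequences to some depth, and play the move with the best outcome – is essentially minimax search. This sketch runs it over a tiny invented game tree, not a real chess position; the positions and scores are arbitrary.

```python
# Depth-limited brute-force search over a made-up two-level game tree.
# Leaves carry scores from the first player's point of view.

GAME_TREE = {
    "start": ["a", "b"],
    "a": ["a1", "a2"],
    "b": ["b1", "b2"],
}
SCORES = {"a1": 3, "a2": -2, "b1": 1, "b2": 5}

def minimax(position, maximizing):
    """Evaluate a position by exhaustively searching the tree below it."""
    if position in SCORES:                       # leaf: return its score
        return SCORES[position]
    values = [minimax(child, not maximizing) for child in GAME_TREE[position]]
    return max(values) if maximizing else min(values)

def best_move(position):
    """Play the move whose resulting position evaluates highest."""
    return max(GAME_TREE[position], key=lambda p: minimax(p, False))

print(best_move("start"))   # prints b
```

Shannon's selective mode differs only in pruning: instead of expanding every child, it expands only the most promising ones.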
In 1988 Hitech, a program developed at Carnegie-Mellon Univ., defeated former U.S. champion Arnold Denker in a four-game match, becoming the first computer to defeat a grandmaster. A year later, Garry Kasparov, the reigning world champion, bested Deep Thought, a program developed by the IBM Corp., in a two-game exhibition. In 1990 the German computer Mephisto Portorose became the first program to defeat a former world champion; while playing an exhibition of 24 simultaneous games, Anatoly Karpov bested 23 human opponents but lost to the computer. Kasparov in 1996 became the first reigning world champion to lose to a computer in a game played with regulation time controls; the Deep Blue computer, developed by the IBM Corp., won the first game of the match, lost the second, drew the third and fourth, and lost the fifth and sixth.