IV. Convert sentences from Active Voice into Passive Voice.



1. He connected his computer to the Internet over telephone lines.

2. We use the mouse to move a cursor on the screen of the monitor.

3. Sometimes we refer to the CPU as the processor.

4. High-capacity discs provide greater storage capacities than floppy discs.

5. Computers can perform four general operations.

V. Answer the following questions:

1. What is a computer?

2. What operations can a computer perform?

3. What are the components of a computer?

4. What are two common input devices?

5. What is the function of input devices?

6. What elements does the system board include?

7. What kinds of memory do you know?

8. What are the functions of storage devices?

9. What is meant by information processing?

10. What is the function of the communication devices?

 

Topics for Discussion

Examine your attitudes towards computers. Are they based on personal experience? Do you fear or distrust computers, and, if so, why? How do you think people’s attitudes towards computers might change as computers become more common at home, at school, and on the job?

Computer Generations

The first Generation, 1951-1958:

The Vacuum Tube

The beginning of the computer age may be dated June 14, 1951. In the first generation, vacuum tubes – electronic tubes about the size of light bulbs – were used as the internal computer components. They were used for calculation, control, and sometimes for memory. However, because thousands of such tubes were required, they generated a great deal of heat, causing many problems in temperature regulation and climate control. In addition, because all the tubes had to be working simultaneously, they were subject to frequent burnout – and the people operating the computer often did not know whether the problem was in the programming or in the machine. Input and output also tended to be slow, since both operations were generally performed on punched cards.

Another drawback was that the language used in programming was machine language, which uses numbers, rather than the present-day higher-level languages, which are more like English. Programming with numbers alone made using the computer difficult and time-consuming.

Therefore, as long as computers were tied down to vacuum tube technology, they could only be bulky, cumbersome, and expensive.

In the first generation the use of magnetism for data storage was pioneered. For primary storage, magnetic core was the principal form of technology used. This consisted of small, doughnut-shaped rings about the size of a pinhead, which were strung like beads on intersecting thin wires. Magnetic core was the dominant form of primary storage technology for two decades. To supplement primary storage, first-generation computers stored data on punched cards. In 1957, magnetic tape was introduced as a faster, more compact method of storing data. The early generation of computers was used primarily for scientific and engineering calculation rather than for business data processing applications. Because of the enormous size, unreliability, and high cost of these computers, many people assumed they would remain very expensive, specialized tools, not destined for general use.

The Second Generation, 1959-1964:

The Transistor

The invention of the transistor, or semiconductor, was one of the most important developments leading to the personal computer revolution. Bell Laboratories engineers John Bardeen, Walter Brattain, and William Shockley invented the transistor in 1948. The transistor, which essentially functions as a solid-state electronic switch, replaced the much less suitable vacuum tube. The transistor revolutionized electronics in general and computers in particular. Not only were transistors far smaller than vacuum tubes, but they also had numerous other advantages: they needed no warm-up time, consumed less energy, and were faster and more reliable.

The conversion to transistors began the trend toward miniaturization that continues to this day. Today’s small laptop (or palmtop) PC systems, which run on batteries, have more computing power than many earlier systems that filled rooms.

During this generation, another important development was the move from machine language to assembly languages. Assembly languages use abbreviations for instructions (for example, “L” for “LOAD”) rather than numbers. This made programming less cumbersome.
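The gain described above – mnemonics instead of raw numbers – can be illustrated with a toy assembler. This is only a sketch: the mnemonics and the opcode values here are invented for illustration and do not belong to any real machine's instruction set.

```python
# Hypothetical opcode table: each assembly mnemonic stands in for a
# numeric machine-language opcode the programmer once wrote by hand.
OPCODES = {
    "L": 0x58,   # LOAD  (illustrative value only)
    "A": 0x5A,   # ADD   (illustrative value only)
    "ST": 0x50,  # STORE (illustrative value only)
}

def assemble(mnemonic):
    """Translate one mnemonic into its numeric machine opcode."""
    return OPCODES[mnemonic]

# The programmer writes "L"; the assembler supplies the number.
print(hex(assemble("L")))
```

The point of the example is that the translation is mechanical, so it can be delegated to a program, freeing the human from memorizing numeric codes.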

After the development of the symbolic languages came higher-level languages. In 1951, mathematician and naval officer Grace Murray Hopper conceived the first compiler program for translating from a higher-level language to the computer’s machine language. The first language to receive widespread acceptance was FORTRAN (for FORmula TRANslator), developed in the mid-1950s as a scientific, mathematical, and engineering language. Higher-level languages allowed programmers to give more attention to solving problems. They no longer had to cope with all the details of the machines themselves. Also, in 1962 the first removable disc pack was marketed. Disc storage supplemented magnetic tape systems and enabled users to have fast access to desired data.

The rudiments of operating systems were also emerging. Loader programs loaded other programs into main memory from external media such as punched cards, paper tape, or magnetic tape. Monitor programs helped the programmer or computer operator load other programs, monitor their execution, and examine the contents of memory locations. An input-output control system consisted of a set of subroutines for manipulating input, output, and storage devices. By calling these subroutines, a program could communicate with external devices without becoming involved in the intricacies of their internal operations.

All these new developments made the second generation of computers less costly to operate – and thus began a surge of growth in computer systems.

The Third Generation, 1965-1970:

The Integrated Circuit

One of the most abundant elements in the earth’s crust is silicon, a nonmetallic substance found in common beach sand as well as in practically all rocks and clay. The element has given rise to the name “Silicon Valley” for Santa Clara County, which is about 30 miles south of San Francisco. In 1965 Silicon Valley became the principal site of the electronics industry making the so-called silicon chip.

In 1959, engineers at Texas Instruments invented the integrated circuit (IC), a semiconductor circuit that contains more than one transistor on the same base (or substrate material) and connects the transistors without wires. The first IC contained only six transistors. By comparison, the Intel Pentium Pro microprocessor used in many of today's high-end systems has more than 5.5 million transistors, and the integral cache built into some of these chips contains as many as an additional 32 million transistors. Today, many ICs have transistor counts in the multimillion range.

An integrated circuit is a complete electronic circuit on a small chip of silicon. The chip may be less than 1/8 inch square and contains hundreds of electronic components. Beginning in 1965, the integrated circuit began to replace the transistor in machines now called third-generation computers.

Silicon is used because it is a semiconductor. That is, it is a crystalline substance that will conduct electric current when it has been “doped” with chemical impurities shot into the latticelike structure of the crystal. A cylinder of silicon is sliced into wafers, each about 3 inches in diameter, and each wafer is “etched” repeatedly with a pattern of electrical circuitry.

Integrated circuits entered the market with the simultaneous announcement in 1959 by Texas Instruments and Fairchild Semiconductor that they had each independently produced chips containing several complete electronic circuits. The chips were hailed as a generational breakthrough because they had four desirable characteristics: reliability, compactness, low cost, and low power use.

In 1969, Intel introduced a 1K-bit memory chip, which was much larger than anything else available at the time. (1K bits equals 1,024 bits, and a byte equals 8 bits. This chip, therefore, stored only 128 bytes – not much by today’s standards.) Because of Intel’s success in chip manufacturing and design, Busicom, a Japanese calculator manufacturer, asked Intel to produce 12 different logic chips for one of its calculator designs. Rather than produce 12 separate chips, Intel engineers included all the functions of the chips in a single chip.
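The capacity figure quoted above follows from simple unit arithmetic, which can be checked directly:

```python
# Capacity of the 1K-bit memory chip described in the text.
bits = 1 * 1024       # "1K bits" means 1,024 bits
bits_per_byte = 8     # a byte equals 8 bits
capacity_bytes = bits // bits_per_byte
print(capacity_bytes)  # 128
```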

In addition to incorporating all the functions and capabilities of the 12-chip design into one multipurpose chip, the engineers designed the chip to be controlled by a program that could alter the function of the chip. The chip then was generic in nature, meaning that it could function in designs other than calculators. Previous designs were hard-wired for one purpose, with built-in instructions; this chip would read from memory a variable set of instructions that would control the function of the chip. The idea was to design almost an entire computing device on a single chip that could perform different functions, depending on what instructions it was given.

The third generation saw the advent of computer terminals for communicating with a computer from a remote location.

Operating systems (OS) came into their own in the third generation. The OS was given complete control of the computer system; the computer operator, programmers, and users all obtained services by placing requests with the OS via computer terminals. Turning over control of the computer to the OS made possible models of operation that would have been impossible with manual control. For example, in multiprogramming the computer is switched rapidly from program to program in round-robin fashion, giving the appearance that all programs are being executed simultaneously.
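The round-robin switching described above can be sketched in a few lines. This is a minimal model, not a real scheduler: programs are represented as Python generators that yield after each unit of work, and the "OS" cycles through them until all finish.

```python
from collections import deque

def program(name, steps):
    """Model a program as a sequence of small work steps."""
    for i in range(steps):
        yield f"{name} step {i}"

def round_robin(programs):
    """Give each program one step at a time, in rotation, until all finish."""
    queue = deque(programs)
    trace = []
    while queue:
        prog = queue.popleft()
        try:
            trace.append(next(prog))  # run one time slice
            queue.append(prog)        # not finished: back of the queue
        except StopIteration:
            pass                      # finished: drop it from the rotation
    return trace

trace = round_robin([program("A", 2), program("B", 2)])
print(trace)  # steps of A and B are interleaved: A, B, A, B
```

Because the slices alternate rapidly, each program appears to run continuously – the illusion of simultaneous execution that the text describes.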

An important form of multiprogramming is time-sharing, in which many users communicate with a single computer from remote terminals.

The Fourth Generation, 1971-Present:

The Microprocessor

Through the 1970s, computers gained dramatically in speed, reliability, and storage capacity, but entry into the fourth generation was evolutionary rather than revolutionary. The fourth generation was, in fact, an extension of third-generation technology. That is, in the early part of the third generation, specialized chips were developed for computer memory and logic. Thus, all the ingredients were in place for the next technological development, the general-purpose processor-on-a-chip, otherwise known as the microprocessor. First developed by an Intel Corporation design team headed by Ted Hoff in 1969, the microprocessor became commercially available in 1971.

Nowhere is the pervasiveness of computer power more apparent than in the explosive use of the microprocessor. In addition to the common applications of digital watches, pocket calculators, and microcomputers – small home and business computers – microprocessors can be anticipated in virtually every machine in the home or business. (To get a sense of how far we have come, try counting up the number of machines, microprocessor controlled or not, that are around your house. Would more than one or two have been in existence 50 years ago?)

The 1970s saw the advent of large-scale integration (LSI). The first LSI chips contained thousands of transistors; later, it became possible to place first tens and then hundreds of thousands of transistors on a single chip. LSI technology led to two innovations: embedded computers, which are incorporated into other appliances, such as cameras and TV sets, and microcomputers or personal computers, which can be bought and used by individuals. In 1975, very large scale integration (VLSI) was achieved. As a result, computers today are 100 times smaller than those of the first generation, and a single chip is far more powerful than ENIAC.

Computer environments have changed, with climate-controlled rooms becoming less necessary to ensure reliability; some recent models (especially minicomputers and microcomputers) can be placed almost anywhere.

Large computers, of course, did not disappear just because small computers entered the market. Mainframe manufacturers have continued to develop powerful machines, such as the UNIVAC 1100, the IBM 3080 series, and the supercomputers from Cray.

Countries around the world have been active in the computer industry; few are as renowned for their technology as Japan. The Japanese have long been associated with chip technology, but recently they announced an entirely new direction.
