
Guidelines

for practical classes

in the course 'Technical Translation (English)'

for students of all modes of study

in training programme 051000.62 Vocational Education (by industry), profile 'Computer Science and Computing Technology'

specialisation track 'Computer Technologies'

 

Yekaterinburg 2012


 

Guidelines for practical classes in the course 'Technical Translation (English)'. Yekaterinburg: Russian State Vocational Pedagogical University, 2012. 36 pp.

 

 

Compiled by: Senior Lecturer S.I. Unda

Senior Lecturer A.G. Nikolaeva

Reviewer: I.M. Kondyurina, Candidate of Pedagogical Sciences, Associate Professor

 

Approved at a meeting of the Department of Foreign Languages, Institute of Linguistics, RSVPU. Minutes No. 2 of 27 September 2012.

 

Head of the Department of Foreign Languages: I.D. Beleeva

 

Recommended for publication by the Methodological Committee of the Institute of Linguistics, RSVPU. Minutes No. 1 of 15 October 2012.

 

Chair of the Methodological Committee

of the Institute of Linguistics, RSVPU: A.A. Evtyugina

 

 

© Russian State Vocational Pedagogical University, 2012

 
 

 

Introduction

The aim of these guidelines is to prepare students for independent work with specialist literature, using the texts provided here as material. The guidelines contain 13 original texts together with questions for subsequent discussion.

The texts, all on information technology topics, are taken without adaptation from current scientific and popular-science publications (monographs, journals and newspapers). They were selected for their topical relevance and for the informative, engaging presentation of the material. It is recommended that the texts be translated in the order in which they appear, as they are arranged in order of increasing difficulty.

These guidelines are intended for students of all modes of study in training programme 051000.62 Vocational Education (by industry), profile 'Computer Science and Computing Technology', specialisation track 'Computer Technologies'.

Text 1. Computers Make the World Smaller and Smarter

Words and phrases to the text:

1. edutainment (n) a system that has both educational and entertainment value

2. smart device (n) a device that contains an embedded processor and memory

3. handheld (computer) (n) a small portable computer that can be held in one hand

The ability of tiny computing devices to control complex operations has transformed the way many tasks are performed, ranging from scientific research to producing consumer products. Tiny 'computers on a chip' are used in medical equipment, home appliances, cars and toys. Workers use handheld computing devices to collect data at a customer site, to generate forms, to control inventory, and to serve as desktop organisers.

Not only is computing equipment getting smaller, it is getting more sophisticated. Computers are part of many machines and devices that once required continual human supervision and control. Today, computers in security systems result in safer environments, computers in cars improve energy efficiency, and computers in phones provide features such as call forwarding, call monitoring, and call answering.

These smart machines are designed to take over some of the basic tasks previously performed by people; by so doing, they make life a little easier and a little more pleasant. Smart cards store vital information such as health records, drivers' licenses, bank balances, and so on. Smart phones, cars, and appliances with built in computers can be programmed to better meet individual needs. A smart house has a built-in monitoring system that can turn lights on and off, open and close windows, operate the oven, and more.

With small computing devices available for performing smart tasks like cooking dinner, programming the DVD recorder, and controlling the flow of information in an organization, people are able to spend more time doing what they often do best - being creative. Computers can help people work more creatively.

Multimedia systems are known for their educational and entertainment value, which we call 'edutainment'. Multimedia combines text with sound, video, animation, and graphics, which greatly enhances the interaction between user and machine and can make information more interesting and appealing to people. Expert systems software enables computers to 'think' like experts. Medical diagnosis expert systems, for example, can help doctors pinpoint a patient's illness, suggest further tests, and prescribe appropriate drugs.

Connectivity enables computers and software that might otherwise be incompatible to communicate and to share resources. Now that computers are proliferating in many areas and networks are available for people to access data and communicate with others, personal computers are becoming interpersonal PCs. They have the potential to significantly improve the way we relate to each other. Many people today telecommute - that is, use their computers to stay in touch with the office while they are working at home. With the proper tools, hospital staff can get a diagnosis from a medical expert hundreds or thousands of miles away. Similarly, the disabled can communicate more effectively with others using computers.

Distance learning and video conferencing are concepts made possible with the use of an electronic classroom or boardroom accessible to people in remote locations. Vast databases of information are currently available to users of the Internet, all of whom can send mail messages to each other. The information superhighway is designed to significantly expand this interactive connectivity so that people all over the world will have free access to all these resources.

People power is critical to ensuring that hardware, software, and connectivity are effectively integrated in a socially responsible way. People - computer users and computer professionals - are the ones who will decide which hardware, software, and networks endure and how great an impact they will have on our lives. Ultimately people power must be exercised to ensure that computers are used not only efficiently but in a socially responsible way.
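The expert systems mentioned above can be pictured as collections of if-then rules. The short Python sketch below is a hypothetical illustration only: the symptoms, rules and diagnoses are invented, not taken from any real medical system.

# A minimal rule-based 'expert system' sketch. The rules below are
# invented for illustration - a real medical system would encode
# thousands of expert-validated rules.

RULES = [
    # (required symptoms, suggested diagnosis, suggested follow-up)
    ({"fever", "cough"}, "possible flu", "suggest a throat swab"),
    ({"fever", "rash"}, "possible measles", "suggest a blood test"),
    ({"headache", "blurred vision"}, "possible migraine", "suggest a neurology referral"),
]

def diagnose(symptoms):
    """Return every rule whose required symptoms are all present."""
    matches = []
    for required, diagnosis, follow_up in RULES:
        if required <= symptoms:          # all required symptoms observed
            matches.append((diagnosis, follow_up))
    return matches

if __name__ == "__main__":
    for diagnosis, follow_up in diagnose({"fever", "cough", "fatigue"}):
        print(diagnosis, "-", follow_up)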

Comprehension check

1. Translate the text and find the sentences which best summarise each paragraph.

2. Translate these words and word combinations into English:

настольный, в настольном исполнении; запоминать, хранить, сохранять; карманный компьютер; усложнённый, утончённый; дистанционный, удалённый; аппаратные средства, аппаратура, оборудование; компьютерные программы, программное обеспечение.

3. Answer the questions on the text:

1. Name some types of devices that use 'computers on a chip'.

2. What uses of handheld computers are mentioned in the text?

3. What are the benefits of using computers with the following items?

a) Security systems

b) Cars

c) Phones

4. What smart devices are mentioned in the text?

5. What are smart cards used for?

6. What are the advantages of multimedia?

7. What can medical expert systems do?

8. How can computers help the disabled?

9. What types of computing systems are made available to people in remote locations using electronic classrooms or boardrooms?

10. What aspects of computing can people power determine?

Text 2. Cache Memory

Words and phrases to the text:

1. cache (n) fast memory used to temporarily store frequently used data to allow it to be accessed more quickly

2. cache controller (n) the set of electronic logic circuits that control the operation of cache memory

Most PCs are held back not by the speed of their main processor, but by the time it takes to move data in and out of memory. One of the most important techniques for getting around this bottleneck is the memory cache.

The idea is to use a small number of very fast memory chips as a buffer or cache between main memory and the processor. Whenever the processor needs to read data it looks in this cache area first. If it finds the data in the cache then this counts as a 'cache hit' and the processor need not go through the more laborious process of reading data from the main memory. Only if the data is not in the cache does it need to access main memory, but in the process it copies whatever it finds into the cache so that it is there ready for the next time it is needed. The whole process is controlled by a group of logic circuits called the cache controller.

One of the cache controller's main jobs is to look after cache coherency which means ensuring that any changes written to main memory are reflected within the cache and vice versa. There are several techniques for achieving this, the most obvious being for the processor to write directly to both the cache and main memory at the same time. This is known as a 'write-through' cache and is the safest solution, but also the slowest.

The main alternative is the 'write-back' cache which allows the processor to write changes only to the cache and not to main memory. Cache entries that have changed are flagged as 'dirty', telling the cache controller to write their contents back to main memory before using the space to cache new data. A write-back cache speeds up the write process, but does require a more intelligent cache controller.

Most cache controllers move a 'line' of data rather than just a single item each time they need to transfer data between main memory and the cache. This tends to improve the chance of a cache hit as most programs spend their time stepping through instructions stored sequentially in memory, rather than jumping about from one area to another. The amount of data transferred each time is known as the 'line size'.
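The behaviour described in this text can be modelled in a few lines of code. The following Python sketch is a toy illustration only, assuming an invented four-line cache and a line size of four words; it shows cache hits, write-back with dirty flags, and whole-line transfers, not any real cache controller.

# Toy write-back cache: data moves between 'main memory' and the cache
# in whole lines; changed lines are flagged dirty and written back only
# when their slot is needed for new data. (A write-through cache would
# instead update main_memory inside write() as well.)

LINE_SIZE = 4                                  # words per cache line
MAX_LINES = 4                                  # tiny cache: 4 lines

main_memory = list(range(64))                  # pretend RAM: 64 words
cache = {}                                     # line number -> (words, dirty)

def load_line(line_no):
    if line_no in cache:                       # cache hit
        return cache[line_no][0]
    if len(cache) >= MAX_LINES:                # evict oldest; write back if dirty
        victim, (words, dirty) = next(iter(cache.items()))
        if dirty:
            start = victim * LINE_SIZE
            main_memory[start:start + LINE_SIZE] = words
        del cache[victim]
    start = line_no * LINE_SIZE                # cache miss: fetch the whole line
    cache[line_no] = (main_memory[start:start + LINE_SIZE], False)
    return cache[line_no][0]

def read(address):
    words = load_line(address // LINE_SIZE)
    return words[address % LINE_SIZE]

def write(address, value):                     # write-back: touch the cache only
    line_no = address // LINE_SIZE
    words = load_line(line_no)
    words[address % LINE_SIZE] = value
    cache[line_no] = (words, True)             # mark the line dirty

write(5, 99)
print(read(5))                                 # a hit, served from the cache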

 

Comprehension check

Text 3. Data Mining

Words and phrases to the text:

1. artificial intelligence (n) an area of computing concerned with developing computer programs that perform tasks that can normally only be done using human intelligence

2. clustering (n) a method used in data mining that divides data into groups based on similar features or limited data ranges

3. cleanse (v) a term used in data mining meaning to remove duplicate information and erroneous data

 

Data mining is simply filtering through large amounts of raw data for useful information that gives businesses a competitive edge. This information is made up of meaningful patterns and trends that are already in the data but were previously unseen.

The most popular tool used when mining is artificial intelligence (AI). AI technologies try to work the way the human brain works, by making intelligent guesses, learning by example, and using deductive reasoning. Some of the more popular AI methods used in data mining include neural networks, clustering, and decision trees.

Neural networks look at the rules of using data, which are based on the connections found or on a sample set of data. As a result, the software continually analyses values and compares them to the other factors, and it compares these factors repeatedly until it finds patterns emerging. These patterns are known as rules. The software then looks for other patterns based on these rules or sends out an alarm when a trigger value is hit.

Clustering divides data into groups based on similar features or limited data ranges. Clusters are used when data isn't labelled in a way that is favourable to mining. For instance, an insurance company that wants to find instances of fraud wouldn't have its records labelled as fraudulent or not fraudulent. But after analyzing patterns within clusters, the mining software can start to figure out the rules that point to which claims are likely to be false.
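A minimal sketch of the clustering idea, in Python. The claim records and the two-cluster split are invented for illustration; real mining software works on far larger and richer data sets.

# A tiny k-means clustering sketch on made-up insurance claims
# (claim amount, days taken to file). The outlying cluster is the kind
# a fraud analyst might then inspect.
import random

claims = [(100, 2), (120, 3), (110, 1),        # typical claims
          (5000, 30), (5200, 28)]              # unusual claims

def kmeans(points, k, steps=10):
    centres = random.sample(points, k)
    for _ in range(steps):
        groups = [[] for _ in range(k)]
        for p in points:                       # assign each point to the
            i = min(range(k),                  # nearest centre
                    key=lambda c: (p[0] - centres[c][0]) ** 2
                                + (p[1] - centres[c][1]) ** 2)
            groups[i].append(p)
        for i, g in enumerate(groups):         # move centres to group means
            if g:
                centres[i] = (sum(p[0] for p in g) / len(g),
                              sum(p[1] for p in g) / len(g))
    return groups

for group in kmeans(claims, k=2):
    print(group)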

Decision trees, like clusters, separate the data into subsets and then analyse the subsets to divide them into further subsets, and so on (for a few more levels). The final subsets are then small enough that the mining process can find interesting patterns and relationships within the data.
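The decision-tree idea can be tried with the scikit-learn library (assumed here to be installed); the claims data below is invented, and the printed tree shows how the data is split into ever smaller subsets.

# Sketch of a decision tree dividing data into subsets, using
# scikit-learn. Features and labels are illustrative only.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row: [claim amount, days to file]; label: 0 = genuine, 1 = fraudulent
X = [[100, 2], [120, 3], [110, 1], [5000, 30], [5200, 28]]
y = [0, 0, 0, 1, 1]

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["amount", "days_to_file"]))
print(tree.predict([[4800, 25]]))   # classify a new claim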

Once the data to be mined is identified, it should be cleansed. Cleansing data frees it from duplicate information and erroneous data. Next, the data should be stored in a uniform format within relevant categories or fields. Mining tools can work with all types of data storage, from large data warehouses to smaller desktop databases to flat files. Data warehouses and data marts are storage methods that involve archiving large amounts of data in a way that makes it easy to access when necessary.
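A toy cleansing pass in Python, illustrating the two steps just described: removing duplicate records and rejecting erroneous values before storing the data in a uniform format. The records and the validity rule are invented.

# Minimal data cleansing: drop exact duplicates, reject rows with
# obviously erroneous values, and normalise the surviving records.
raw = [
    {"name": "Ann Lee", "age": "34"},
    {"name": "Ann Lee", "age": "34"},     # duplicate information
    {"name": "Bob Roe", "age": "-5"},     # erroneous data
    {"name": "Cy Moe",  "age": "41"},
]

seen, cleansed = set(), []
for row in raw:
    key = (row["name"].strip().lower(), row["age"])
    if key in seen:
        continue                          # remove the duplicate
    seen.add(key)
    age = int(row["age"])
    if 0 <= age <= 130:                   # keep only plausible values
        cleansed.append({"name": row["name"].strip(), "age": age})

print(cleansed)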

When the process is complete, the mining software generates a report. An analyst goes over the report to see if further work needs to be done, such as refining parameters, using other data analysis tools to examine the data, or even scrapping the data if it's unusable. If no further work is required, the report proceeds to the decision makers for appropriate action.

The power of data mining is being used for many purposes, such as analysing Supreme Court decisions, discovering patterns in health care, pulling stories about competitors from newswires, resolving bottlenecks in production processes, and analysing sequences in the human genetic makeup. There really is no limit to the type of business or area of study where data mining can be beneficial.

 

Comprehension check


Text 5. Linux

Linux has its roots in a student project. In 1992, an undergraduate called Linus Torvalds was studying computer science in Helsinki, Finland. Like most computer science courses, a big component of it was taught on (and about) Unix. Unix was the wonder operating system of the 1970s and 1980s: both a textbook example of the principles of operating system design, and sufficiently robust to be the standard OS in engineering and scientific computing. But Unix was a commercial product (licensed by AT&T to a number of resellers) and cost more than a student could pay.

Annoyed by the shortcomings of Minix (a compact Unix clone written as a teaching aid by Professor Andy Tanenbaum), Linus set out to write his own 'kernel' - the core of an operating system that handles memory allocation, talks to hardware devices, and makes sure everything keeps running. He used the GNU programming tools developed by Richard Stallman's Free Software Foundation, an organisation of volunteers dedicated to fulfilling Stallman's ideal of making good software that anyone could use without paying. When he'd written a basic kernel, he released the source code to the Linux kernel on the Internet.

Source code is important. It's the original from which compiled programs are generated. If you don't have the source code to a program, you can't modify it to fix bugs or add new features. Most software companies won't sell you their source code, or will only do so for an eye-watering price, because they believe that if they make it available it will destroy their revenue stream.

What happened next was astounding, from the conventional, commercial software industry point of view - and utterly predictable to anyone who knew about the Free Software Foundation. Programmers (mostly academics and students) began using Linux. They found that it didn't do things they wanted it to do - so they fixed it. And where they improved it, they sent the improvements to Linus, who rolled them into the kernel. And Linux began to grow.

There's a term for this model of software development; it's called Open Source (see www.opensource.org/ for more information).

Anyone can have the source code — it's free (in the sense of free speech, not free beer). Anyone can contribute to it. If you use it heavily you may want to extend or develop or fix bugs in it - and it is so easy to give your fixes back to the community that most people do so.

An operating system kernel on its own isn't a lot of use; but Linux was purposefully designed as a near-clone of Unix, and there is a lot of software out there that is free and was designed to compile on Linux. By about 1992, the first 'distributions' appeared.

A distribution is the Linux-user term for a complete operating system kit, complete with the utilities and applications you need to make it do useful things - command interpreters, programming tools, text editors, typesetting tools, and graphical user interfaces based on the X windowing system. X is a standard in academic and scientific computing, but not hitherto common on PCs; it's a complex distributed windowing system on which people implement graphical interfaces like KDE and Gnome.

As more and more people got to know about Linux, some of them began to port the Linux kernel to run on non-standard computers. Because it’s free, Linux is now the most widely-ported operating system there is.

 

 

Comprehension check

Text 6. User Interfaces

Words and phrases to the text:

1. multimodal input (n) the process of operating a user interface using a combination of types of input

2. intranet (n) a computer network that is internal to an organization and uses the TCP/IP protocol in the same way as the Internet

Cheaper and more powerful personal computers are making it possible to perform processor-intensive tasks on the desktop. Breakthroughs in technology, such as speech recognition, are enabling new ways of interacting with computers. And the convergence of personal computers and consumer electronics devices is broadening the base of computer users and placing a new emphasis on ease of use. Together, these developments will drive the industry in the next few years to build the first completely new interfaces since SRI International and Xerox's Palo Alto Research Center did their pioneering research into graphical user interfaces (GUIs) in the 1970s.

True, it's unlikely that you'll be ready to toss out the keyboard and mouse any time soon. Indeed, a whole cottage industry - inspired by the hyperlinked design of the World Wide Web - has sprung up to improve today's graphical user interface. Companies are developing products that organize information graphically in more intuitive ways. XML-based formats enable users to view content, including local and network files, within a single browser interface. But it is the more dramatic innovations such as speech recognition that are poised to shake up interface design.

Speech will become a major component of user interfaces, and applications will be completely redesigned to incorporate speech input. Palm-size and handheld PCs, with their cramped keyboards and basic handwriting recognition, will benefit from speech technology.

Though speech recognition may never be a complete replacement for other input devices, future interfaces will offer a combination of input types, a concept known as multimodal input. A mouse is a very efficient device for desktop navigation, for example, but not for changing the style of a paragraph. By using both a mouse and speech input, a user can first point to the appropriate paragraph and then say to the computer, 'Make that bold.' Of course, multimodal interfaces will involve more than just traditional input devices and speech recognition. Eventually, most PCs will also have handwriting recognition, text to speech (TTS), the ability to recognize faces or gestures, and even the ability to observe their surroundings.
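A sketch of how multimodal input might be wired together: the pointing device supplies the target and speech supplies the command, as in the 'Make that bold' example. All names here are invented; a real system would sit on top of speech-recognition and GUI toolkits.

# Toy multimodal dispatcher: a mouse event selects the target, a spoken
# command selects the action. Event handlers are hypothetical.
selected_paragraph = None

def on_mouse_click(paragraph_id):
    """Pointing fixes the target of the next command."""
    global selected_paragraph
    selected_paragraph = paragraph_id

def on_speech(command):
    """Speech supplies the action to apply to the current target."""
    if selected_paragraph is None:
        print("No target selected")
    elif command == "make that bold":
        print(f"Applying bold to paragraph {selected_paragraph}")

on_mouse_click(3)               # the user points at paragraph 3...
on_speech("make that bold")     # ...then speaks the command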

At The Intelligent Room, a project of Massachusetts Institute of Technology's Artificial Intelligence Lab, researchers have given sight to PCs running Microsoft Windows through the use of video cameras. 'Up to now, the PC hasn't cared about the world around it,' said Rodney A. Brooks, the Director of MIT's Artificial Intelligence Lab. 'When you combine computer vision with speech understanding, it liberates the user from having to sit in front of a keyboard and screen.'

It's no secret that the amount of information - both on the Internet and within intranets - at the fingertips of computer users has been expanding rapidly. This information onslaught has led to an interest in intelligent agents, software assistants that perform tasks such as retrieving and delivering information and automating repetitive tasks. Agents will make computing significantly easier. They can be used as Web browsers, help-desks, and shopping assistants. Combined with the ability to look and listen, intelligent agents will bring personal computers one step closer to behaving more like humans. This is not an accident. Researchers have long noted that users have a tendency to treat their personal computers as though they were human. By making computers more 'social,' they hope to also make them easier to use.

As these technologies enter mainstream applications, they will have a marked impact on the way we work with personal computers. Soon, the question will be not 'what does software look like' but 'how does it behave?'

Comprehension check


Text 9. Email Protocols

Words and phrases to the text:

1. batch mode (n) a process in which all the data is collected and processed together in a batch rather than one item at a time as it becomes available

2. folder (n) a storage area used for grouping files so that they can be easily located

 

Although the format of a mail message, as transmitted from one machine to another, is rigidly defined, different mail protocols transfer and store messages in slightly different ways. The mail system you're probably used to employs a combination of SMTP and POP3 to send and receive mail respectively. Others may use IMAP4 to retrieve mail, especially where bandwidth is limited or expensive.

Simple Mail Transfer Protocol

SMTP is used to transfer messages between one mail server and another. It's also used by email programs on PCs to send mail to the server. SMTP is very straightforward, providing only facilities to deliver messages to one or more recipients in batch mode. Once a message has been delivered, it can't be recalled or cancelled. It's also deleted from the sending server once it's been delivered. SMTP uses 'push' operation, meaning that the connection is initiated by the sending server rather than the receiver. This makes it unsuitable for delivering messages to desktop PCs, which aren't guaranteed to be switched on at all times.

In host-based mail systems, such as Unix and Web mail, SMTP is the only protocol the server uses. Received messages are stored locally and retrieved from the local file system by the mail program. In the case of Web mail, the message is then translated into HTML and transmitted to your browser. SMTP is the only protocol for transferring messages between servers. How they're then stored varies from system to system.
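In Python, sending a message over SMTP with the standard library looks like this; the host name and addresses are placeholders. Note that the sender initiates the connection - the 'push' operation described above.

# Sending a message with Python's built-in smtplib.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.com"       # placeholder addresses
msg["To"] = "bob@example.com"
msg["Subject"] = "Hello"
msg.set_content("Sent via SMTP.")

# The client opens the connection to the server ('push' operation).
with smtplib.SMTP("mail.example.com", 25) as server:
    server.send_message(msg)    # delivered in batch mode; cannot be recalled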

Post Office Protocol

POP is a message-retrieval protocol used by many PC mail clients to get messages from a server, typically your ISP's mail server. It only allows you to download all messages in your mailbox at once. It works in 'pull' mode, the receiving PC initiating the connection. PC-based POP3 mail clients can do this automatically at a preset interval. When you use your Web mail account to access a POP3 mailbox, the mail server opens a connection to the POP3 server just as a PC-based application would. The messages are then copied into your Web mailbox and read via a browser.

Since POP3 downloads all the messages in your mailbox, there's an option to leave messages on the server, so that they can be picked up from different machines without losing any. This does mean that you'll get every message downloaded every time you connect to the server. If you don't clean out your mailbox regularly, this could mean long downloads. When using a Web mail account to retrieve POP3 mail, be careful about leaving messages on the server - if too many build up, each download will take a long time and fill up your inbox. Many Web mail systems won't recognise messages you've already downloaded, so you'll get duplicates of ones you haven't deleted.
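Retrieving mail with POP3 from Python's standard library; again the host and credentials are placeholders. The loop downloads every message, and the commented-out dele() call shows the delete-after-download choice discussed above.

# Fetching mail with poplib - a 'pull' operation initiated by the
# receiving machine.
import poplib

box = poplib.POP3("pop.example.com")   # placeholder host
box.user("alice")
box.pass_("secret")

count, _size = box.stat()              # messages waiting in the mailbox
for i in range(1, count + 1):
    _resp, lines, _octets = box.retr(i)        # download message number i
    print(b"\n".join(lines).decode("utf-8", "replace")[:200])
    # box.dele(i)  # uncomment to avoid re-downloading it next time
box.quit()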

Internet Mail Access Protocol

IMAP is similar in operation to POP, but allows you more choice over what messages you download. Initially, only message headers are retrieved, giving information about the sender and subject. You can then download just those messages you want to read. You can also delete individual messages from the server, and some IMAP4 servers let you organise your mail into folders. This makes download times shorter and there's no danger of losing messages.
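An IMAP4 session from Python's standard library, fetching only the headers first and leaving the message bodies on the server, as described above; host and credentials are placeholders.

# Selective retrieval with imaplib: headers first, bodies on demand.
import imaplib

box = imaplib.IMAP4_SSL("imap.example.com")   # placeholder host
box.login("alice", "secret")
box.select("INBOX")                           # folders are supported

_status, data = box.search(None, "ALL")
for num in data[0].split():
    # Fetch only the sender and subject, not the whole message.
    _status, header = box.fetch(num, "(BODY.PEEK[HEADER.FIELDS (FROM SUBJECT)])")
    print(header[0][1].decode("utf-8", "replace"))
box.logout()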

Comprehension check


Text 11. Bluetooth

As portable computing devices get smarter and more capable, connectivity frustrations increase.

This is where Bluetooth comes in. The brainchild of Ericsson, IBM, Intel, Nokia and Toshiba, Bluetooth is a microwave high-speed wireless link system that's designed to work with portable equipment. To that end, it's low power, very small and very low cost. It uses the same frequencies as existing radio LANs (and, incidentally, microwave ovens) to create a secure 1 Mbit/s link between devices within 10m of each other. These devices can be laptops, PDAs, cellphones, wired telephone access points, even wristwatch devices, headphones, digital cameras and so on. With them, your notebook PC will be able to access your cellular phone — and thus the Internet — without your having to take the phone out of your pocket. Files can be exchanged and communications set up for voice and data between just about any device capable of handling the information.

Bluetooth operates in the unlicensed ISM (Industrial, Scientific and Medical) band at 2.45GHz, which is globally available for products.

There's 89MHz of bandwidth allocated here, and since Bluetooth is very low power, it actually radiates less than most national and international standards allow non-transmitting devices to leak as part of their normal operation. This is key, as it allows the technology to operate without restriction on aircraft.

As befits their status as radio frequency experts, Ericsson and Nokia developed the RF side of Bluetooth. The link works in a similar way to the IEEE 802.11 wireless networking system, with a packet-switching protocol based on fast-frequency hopping direct sequence spread spectrum. In other words, it constantly switches channel to avoid interference. It changes frequency 1,600 times a second through 79 frequency bands. It's expected that this will be so good at avoiding conflicting signals from other sources that the transmission power can be kept very low. Security is taken care of through the frequency hopping and 40-bit encryption. As the system uses radio, it can work through some barriers - briefcases, shirt pockets and desktops, for example - but it won't carry through office buildings.

The power level of the transmitter can be varied, with feedback from the remote side of the link used to set the output to the lowest level commensurate with error-free operation. This saves power and increases the usable density of devices. The device can operate at up to 1mW (an optional power amplifier can increase this to 100mW) and the whole lot consumes between 8mA and 30mA at 2.7V. Various power-saving modes can be used when a device isn't transmitting, trading off speed of response for battery life. These work with current levels between 300µA and 60µA.
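A toy model of the frequency hopping described above: 79 channels, 1,600 hops a second. The hop-selection rule here is a plain pseudo-random generator standing in for the real Bluetooth sequence, which is derived from the master device's address and clock.

# Illustrative frequency-hopping sketch; the seed stands in for the
# master's address/clock, which both ends of a real link share.
import random

CHANNELS = 79                      # 1MHz-wide bands in the ISM band
HOPS_PER_SECOND = 1600

rng = random.Random(0xB1)          # invented seed for the demonstration

def next_channel():
    """Pick the next channel in the pseudo-random hop sequence."""
    return rng.randrange(CHANNELS)

# Show the first ten hops (about 1/160th of a second of air time).
print([next_channel() for _ in range(10)])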

Within the 10m radius of a unit, up to 10 independent full-speed piconets can operate, with bandwidth reduced proportionately if more than this are in use. Each can handle up to eight devices, and can be further subdivided into separate services: 432Kbit/s full-duplex data, 721/56Kbit/s asymmetric duplex, or 384Kbit/s third-generation GSM. Each channel can also support three 64Kbit/s full-duplex voice channels. An optional variation in modulation technique would double the basic data rate to 2Mbit/s.

Power consumption and cost were very significant factors in Bluetooth's design, and it was decided not to make the system a fully-fledged LAN. As a result, there's no collision detection. All devices on a piconet are synchronized to a master device and are controlled by it to prevent simultaneous operation on the same frequency. Any device can be a master, and is elected dynamically when the link starts up.

The standard is open and royalty-free to members of the Bluetooth special interest group.

Comprehension check

Text 12. Doing the SAN thing

As companies rely more and more on ecommerce, online-transaction processing and databases, the amount of information that needs to be managed and stored on a network can intimidate even the most experienced of network managers.

While servers do a good job of storing data, their capacity is limited and they can become a bottleneck if too many users try to access the same information. Instead, most companies rely on peripheral storage devices, such as tape libraries, RAID disks and even optical storage systems. These devices are effective for backing up data online and storing large amounts of information.

But as server farms increase in size and companies rely more heavily on data-intensive applications, such as multimedia, the traditional storage model isn't quite as useful. This is because access to these peripheral devices can be slow, and it might not always be possible for every user to easily and transparently access each storage device.

The most basic way of expanding storage capacity on the network is to hang disk arrays or other storage devices off servers, using the SCSI interface or bus.

While SCSI has been a workhorse over the years for connecting peripherals at a relatively fast speed, distance limitations have kept this particular bus interface from evolving rapidly.

The SCSI standards put a bus length limit of about 6m on devices. While this distance limitation doesn't really affect connecting storage devices directly to a server, it does severely restrict placing RAID and tape libraries at other points on the network.

Enter the NAS

This is where the concept of Network Attached Storage (NAS) comes in. NAS is simple in concept and execution: disk arrays and other storage devices connect to the network through a traditional LAN interface, such as Ethernet. Storage devices would thus attach to network hubs, much the same as servers and other network devices. However, NAS does have a few drawbacks.

First, network bandwidth places throughput limitations on the storage devices. Another downside to NAS is the lack of cohesion among storage devices. While disk arrays and tape drives are on the LAN, managing the devices can prove challenging, since they are separate entities and not logically tied together. NAS has its place as a viable storage architecture, but large companies need something more.

Mr. SAN man

Large enterprises that want the ability to store and manage large amounts of information in a high-performance environment now have another option: the Storage Area Network (SAN). In a SAN, storage devices such as Digital Linear Tapes (DLTs) and RAID arrays are connected to many kinds of servers via a high-speed interconnection, such as Fibre Channel.

This high-speed link creates a separate, external network that's connected to the LAN but acts as an independent entity.

This setup allows for any-to-any communication among all devices on the SAN. It also provides alternative paths from server to storage device. In other words, if a particular server is slow or completely unavailable, another server on the SAN can provide access to the storage device. A SAN also makes it possible to mirror data, making multiple copies available.
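The alternative-path idea can be sketched as a simple routing table; all device and server names below are invented for illustration.

# Toy any-to-any routing on a SAN: if the preferred server path to a
# storage device is down, traffic is routed via another live server.
paths = {                        # storage device -> candidate server paths
    "raid-array-1": ["server-a", "server-b", "server-c"],
}
available = {"server-a": False, "server-b": True, "server-c": True}

def route(device):
    for server in paths[device]:
        if available.get(server):
            return server        # first live alternative path wins
    raise RuntimeError("no path to " + device)

print(route("raid-array-1"))     # server-a is down, so -> server-b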

SANs offer several advantages. First, they allow for the addition of bandwidth without burdening the main LAN. SANs also make it easier to conduct online backups without users feeling the bandwidth pinch. When more storage is needed, additional drives do not need to be connected to a specific server; rather, they can simply be added to the storage network and accessed from any point.

Another reason for the interest in SANs is that all the devices can be centrally managed. Instead of managing the network on a per-device basis, storage can be managed as a single entity, making it easier to deal with storage networks that could potentially consist of dozens or even hundreds of servers and devices.

You can connect almost any modern server to a SAN, because SAN-support hardware and software spans most PC midrange and mainframe platforms. Ideally, a SAN will lighten your server's workload by offloading many storage-related server tasks to the SAN and by better allocating storage resources to servers.

The most important piece of any SAN architecture is the underlying network technology that drives it. You can use ordinary Fast Ethernet, but Fibre Channel is emerging as the technology of choice for SAN implementations. Fibre Channel was developed by ANSI in the early 1990s as a means to transfer large amounts of data very quickly. Fibre Channel is compatible with SCSI, IP, IEEE 802.2, ATM Adaptation Layer for computer data, and Link Encapsulation, and it can be used over copper cabling or fibre-optic cable.

 

Comprehension check

Text 13. Futures

Talking to Professor Cochrane is probably as close as you can get to time travelling without leaving the current dimension, as his vision stretches far into this century and beyond. His seemingly unshakeable conviction is that anything is possible if you really put your mind to it. In fact, BT (British Telecom) is already sitting on a host of innovations poised to blow your mind during this century.

Designed for the 21st century, Peter Cochrane's signet ring is built around a chip that holds all the details of his passport, bank account, medical records and driving licence. According to Cochrane, it's set to revolutionise shopping.

The ring is already a fully operational prototype, but it will be some time before you'll be trading your credit card in for the ultimate fashion accessory.

It's not just jewellery that's set to get smarter. One of the biggest projects down at the Lab is looking at artificial intelligence as a way of creating software programs, networks, telephones and machines with a degree of intelligence built in. By sensing their environment, they should be able to develop new capacities as demands change. 'I have software that is breeding, which is interchanging genes and creating adaptable behaviour. This means you'll see the network come alive - it will watch what you do and it will adapt.'

It doesn't stop there, though, as BT has taken artificial intelligence one step further and created machines that are solving their own problems. 'We've created solutions that a human being could never have dreamed of. We have solutions, and although we don't understand how they work, they do work. We're effectively increasing the speed of evolution', says Cochrane.

It's already good to talk, but with artificially intelligent phones on the way it will be even better. Cochrane is at present working on smart phones that can translate English into German, Japanese and French in real-time. 'Some of it's rocket science, but a lot of it's extremely simple. What we've built is a kernel of understanding inside a machine that extracts meaning from the sentence itself - at the moment we can do simple things such as phrase books,' he says.

The system uses a non-linear approach that sends the English to the understanding kernel in the machine and then fans it out to all the other languages simultaneously.

There's no doubt that Cochrane is putting a lot of faith in intelligent machines, particularly when it comes to cutting through the deluge of information that he says is the downside of the electronic revolution. BT's solution is the development of intelligent agents that watch, learn and start communicating.

It's not all work down at the Lab, though. BT's also involved in an on-going trial that it claims will revolutionise our leisure time, in particular the way we watch TV. 'We put people on the Internet and broadcast TV at the same time, so that the people at home could actually influence what was happening on their TV sets. As a result, it became interactive and therefore more active.'

BT has its fingers in multiple pies and has made biotechnology another core focus of R&D. 'Personally, I think hospitals are very dangerous places to be. There are lots of viable alternatives. For a start, we can stop bunging up hospital wards by putting people online.' BT has already developed a pack for heart attack victims that monitors their progress and uploads information via a radio link back to the hospital.

So what will the future hold for us if Peter Cochrane and his futurologists have their way? Well, by the year 2015, it's likely that we will be eclipsed by a supercomputer more powerful than the human brain. And if that's got visions of Terminator dancing in your head, don't worry - Cochrane's got it covered. 'I'd really hate one morning to find myself considered an infestation of this planet. Our inclination is to nurture life and not to destroy it. Before we let loose a bunch of artificial intelligence, we ought to be thinking through the necessity of building in a number of rules that hold your life as a human being sacrosanct.'

Comprehension check

Guidelines

for practical classes

in the course 'Technical Translation (English)'

for students of all modes of study

in training programme 051000.62 Vocational Education (by industry), profile 'Computer Science and Computing Technology'

specialisation track 'Computer Technologies'

 

 

Signed for printing . Format 60×84/16. Paper for duplicating machines.

Flat printing. Conventional printed sheets . Published sheets . Print run copies. Order .

Russian State Vocational Pedagogical University. Yekaterinburg, 11 Mashinostroiteley St.

Risograph, Russian State Vocational Pedagogical University (RSVPU). Yekaterinburg, 11 Mashinostroiteley St.




