History of the development of mechanical computers. Computer Engineering. From calculator to computer

Lecture No. 10. HISTORY OF COMPUTER ENGINEERING DEVELOPMENT

1.1. THE INITIAL STAGE OF COMPUTER EQUIPMENT DEVELOPMENT

The need to automate data processing, including calculations, arose long ago. It is believed that the historically first and, accordingly, the simplest counting device was the abacus, which belongs to the class of hand-held counting devices.

The abacus board was divided into grooves: one groove corresponded to units, the next to tens, and so on. If more than 10 pebbles accumulated in a groove during counting, they were removed and one pebble was added to the next groove. In the countries of the Far East, a Chinese analogue of the abacus, the suan-pan, was widespread (counting on it was based not on ten but on five); in Russia, the schoty (Russian abacus) was used.

Abacus

Suan-pan (the number 1930 is set)

Russian abacus (the number 401.28 is set)
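The carry rule described above can be illustrated with a small sketch (an illustrative model only, not a description of any particular historical abacus): each groove holds at most nine pebbles, and ten pebbles in one groove are exchanged for a single pebble in the next groove.

```python
# Illustrative model of counting on an abacus with carries between grooves.
# grooves[0] holds units, grooves[1] tens, and so on.
def add_pebble(grooves):
    """Add one pebble to the units groove, carrying when a groove reaches ten."""
    grooves[0] += 1
    for i in range(len(grooves) - 1):
        if grooves[i] >= 10:          # ten pebbles collected...
            grooves[i] -= 10          # ...are removed...
            grooves[i + 1] += 1       # ...and one pebble goes to the next groove
    return grooves

grooves = [9, 9, 0]                   # the number 99
print(add_pebble(grooves))            # [0, 0, 1] -> the number 100
```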

The first attempt that has come down to us to solve the problem of creating a machine capable of adding multi-digit integers was a sketch of a 13-digit adding device developed by Leonardo da Vinci around 1500.

In 1642, Blaise Pascal invented a device that mechanically performed the addition of numbers. Having become acquainted with the works of Pascal and studied his arithmetic machine, Gottfried Wilhelm Leibniz made significant improvements to it, and in 1673 he designed an arithmometer that could mechanically perform all four arithmetic operations. From the 19th century onward, adding machines became very widespread and widely used. They were even used for very complex calculations, for example, the calculation of ballistic tables for artillery firing. There was even a special profession: the human calculator.

Despite the obvious progress compared to the abacus and similar devices for manual calculation, these mechanical computing devices required constant human participation in the calculation process. A person performing calculations on such a device controlled its operation and determined the sequence of operations to be performed.

The dream of the inventors of computer technology was to create a counting machine that would, without human intervention, carry out calculations according to a pre-compiled program.

In the first half of the 19th century, the English mathematician Charles Babbage tried to create a universal computing device - the Analytical Engine - which was supposed to perform arithmetic operations without human intervention. The Analytical Engine incorporated principles that have become fundamental to computing and provided all the basic components found in a modern computer. Babbage's Analytical Engine was to consist of the following parts:

1. The “mill” (“factory”) - the device in which all operations on all types of data are performed (the arithmetic-logic unit, ALU).

2. The “office” - the device that organizes the execution of the data-processing program and coordinates the operation of all parts of the machine during this process (the control unit, CU).

3. The “store” (“warehouse”) - the device intended for storing the initial data, intermediate values and results of data processing (the storage device, or simply memory).

4. Devices capable of converting data into a form accessible to the computer (encoding) - the input devices.

5. Devices capable of converting the results of data processing into a form understandable to humans - the output devices.

In its final version, the machine had three punched-card input devices, from which the program and the data to be processed were read.

Babbage was unable to complete the work: it turned out to be too difficult for the mechanical technology of the time. However, he developed the basic ideas, and in 1943 the American Howard Aiken, using the technology of the 20th century - electromechanical relays - was able to build such a machine, called the Mark-1, at one of IBM's plants. Mechanical elements (counting wheels) were used to represent numbers, and electromechanical elements were used for control.

1.2. THE BEGINNING OF THE MODERN HISTORY OF ELECTRONIC COMPUTING ENGINEERING

A true revolution in computing came with the use of electronic devices. Work on them began in the late 1930s simultaneously in the USA, Germany, Great Britain and the USSR. By this time, vacuum tubes, which became the technical basis of devices for processing and storing digital information, were already widely used in radio engineering.

One of the greatest American mathematicians, John von Neumann, made a huge contribution to the theory and practice of creating electronic computing technology at the initial stage of its development. The “von Neumann principles” have forever entered the history of science. Together, these principles gave rise to the classical (von Neumann) computer architecture. One of the most important of them - the stored-program principle - requires that the program be stored in the machine's memory in the same way as the original information is stored there. The first computer with a stored program (EDSAC) was built in Great Britain in 1949.

In our country, until the 70s, the creation of computers was carried out almost entirely independently and independently of the outside world (and this “world” itself was almost completely dependent on the United States). The fact is that electronic computer technology from the very moment of its initial creation was considered a top-secret strategic product, and the USSR had to develop and produce it independently. Gradually, the secrecy regime was softened, but even at the end of the 80s, our country could only buy outdated computer models abroad (and the most modern and powerful computers are still developed and produced by leading manufacturers - the USA and Japan - in secrecy mode).

The first domestic computer, MESM (“small electronic calculating machine”), was created in 1951 under the leadership of Sergei Aleksandrovich Lebedev, the foremost Soviet computer designer. A record holder among Soviet machines, and one of the best in the world for its time, was the BESM-6 (“large electronic calculating machine, model 6”), created in the mid-60s, which for a long time served as the basic machine for defense, space research, and scientific and technical work in the USSR. In addition to machines of the BESM series, computers of other series were also produced: “Minsk”, “Ural”, M-20, “Mir” and others.

With the start of serial production, computers began to be divided into generations; the corresponding classification is outlined below.

1.3. COMPUTER GENERATIONS

In the history of computer technology there is a peculiar periodization of computers by generation. It was initially based on a physical and technological principle: a machine is assigned to one generation or another depending on the physical elements used in it or the technology of their manufacture. The boundaries of generations in time are blurred, since machines of completely different levels were produced at the same time. When dates are given for generations, they most likely refer to the period of industrial production; the design work was carried out much earlier, and quite exotic devices can still be found in operation today.

Currently, the physical and technological principle is not the only criterion for deciding whether a particular computer belongs to a given generation. One should also take into account the level of software, speed, and other factors, the main of which are summarized in Table 4.1.

It should be understood that the division of computers into generations is very relative. The first computers, produced before the beginning of the 50s, were “one-off” products on which the basic principles were worked out; there is no particular reason to assign them to any generation. Nor is there unanimity in defining the characteristics of the fifth generation. In the mid-80s it was believed that the main characteristic of this (future) generation would be the full implementation of the principles of artificial intelligence. This task turned out to be much more difficult than it seemed at the time, and a number of experts are lowering the bar of requirements for this stage (some even claim that it has already taken place). There are analogues of this phenomenon in the history of science: for example, after the successful launch of the first nuclear power plants in the mid-50s, scientists announced that the launch of far more powerful, cheap and environmentally friendly thermonuclear power stations was imminent; however, they underestimated the gigantic difficulties along this path, and there are no thermonuclear power plants to this day.

At the same time, the differences among fourth-generation machines are extremely large, and therefore in Table 4.1 the corresponding column is divided into two: A and B. The dates indicated in the top row correspond to the first years of production of the computers. Many of the concepts reflected in the table will be discussed in subsequent sections of the textbook; here we limit ourselves to a brief comment.

The younger the generation, the more distinct the classification characteristics. Computers of the first, second and third generations today are, at best, museum pieces.

Which computers are first generation?

The first generation usually includes machines created around the turn of the 1950s. Their circuits used vacuum tubes. These computers were huge, inconvenient and very expensive machines that only large corporations and governments could afford. The tubes consumed enormous amounts of electricity and generated a lot of heat.

The set of instructions was small, the circuit of the arithmetic-logical device and the control device was quite simple, and there was practically no software. Indicators of RAM capacity and performance were low. Punched tapes, punched cards, magnetic tapes and printing devices were used for input and output.

Performance is about 10-20 thousand operations per second.

But this is only the technical side. Another thing is also very important - the ways of using computers, the programming style, and the features of the software.

Programs for these machines were written in the language of a specific machine. The mathematician who wrote the program sat at the control panel of the machine, entered and debugged the programs, and ran the calculations. Debugging took the longest time.

Despite the limited capabilities, these machines made it possible to perform complex calculations necessary for weather forecasting, solving nuclear energy problems, etc.

Experience with first-generation machines showed that there was a huge gap between the time spent developing programs and the calculation time.

Domestic machines of the first generation: MESM (small electronic calculating machine), BESM, Strela, Ural, M-20.

Which computers belong to the second generation?

Second-generation computing equipment comprises machines designed around 1955-65. They are characterized by the use of discrete transistor logic elements instead of vacuum tubes. Their RAM was built on magnetic cores. At this time, the range of input/output equipment began to expand, and high-performance devices appeared for working with magnetic tapes, magnetic drums and the first magnetic disks.

Performance: up to hundreds of thousands of operations per second; memory capacity: up to several tens of thousands of words.

So-called high-level languages appeared, whose facilities allow the entire required sequence of computational actions to be described in a clear, easily understandable form.

A program written in an algorithmic language is incomprehensible to the computer, which understands only the language of its own commands. Therefore, special programs called translators convert a program from a high-level language into machine language.
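The idea can be shown with a toy sketch (purely illustrative: the instruction names LOAD, ADD, STORE and the one-line source format are invented for this example, not taken from any real machine): a "translator" turns one high-level assignment into a sequence of machine-like commands.

```python
# Toy illustration of translation: one high-level assignment such as
# "z = x + y" is rewritten as a sequence of invented machine-like commands.
def translate(assignment):
    target, expr = assignment.split("=")
    a, op, b = expr.split()                    # expect the form "a <op> b"
    opcode = {"+": "ADD", "-": "SUB", "*": "MUL", "/": "DIV"}[op]
    return [f"LOAD {a}",                       # put the first operand in a register
            f"{opcode} {b}",                   # combine it with the second operand
            f"STORE {target.strip()}"]         # write the result back to memory

for command in translate("z = x + y"):
    print(command)
# LOAD x
# ADD y
# STORE z
```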

A wide range of library programs appeared for solving various mathematical problems. Monitor systems appeared, which controlled the translation and execution of programs. Monitor systems later grew into modern operating systems.

Thus, the operating system is a software extension of the computer's control unit.

Operating systems with limited capabilities have already been created for some second-generation machines.

Second-generation machines were characterized by software incompatibility, which made it difficult to organize large information systems. Therefore, in the mid-60s there was a transition to computers that were software-compatible and built on a microelectronic technological base.

What are the features of third generation computers?

Third generation machines were created approximately after the 60s. Since the process of creating computer technology was continuous, and involved many people from different countries dealing with various problems, it is difficult and futile to try to determine when a “generation” began and ended. Perhaps the most important criterion for distinguishing second and third generation machines is one based on the concept of architecture.

Third-generation machines are families of machines with a single architecture, i.e. software-compatible machines. They use integrated circuits, also called microcircuits, as their element base.

Third generation machines have advanced operating systems. They have multi-programming capabilities, i.e. simultaneous execution of several programs. Many tasks of managing memory, devices and resources began to be taken over by the operating system or the machine itself.

Examples of third-generation machines are the IBM-360, IBM-370 families, ES EVM (Unified Computer System), SM EVM (Family of Small Computers), etc.

The performance of machines within the family varies from several tens of thousands to millions of operations per second. The capacity of RAM reaches several hundred thousand words.

What is characteristic of fourth-generation machines?

Fourth generation is the current generation of computer technology developed after 1970.

The most important conceptual criterion by which these computers can be distinguished from third-generation machines is that fourth-generation machines were designed for the efficient use of modern high-level languages and for simplifying the programming process for the end user.

In terms of hardware, they are characterized by the widespread use of integrated circuits as the element base, as well as by the presence of high-speed random-access storage devices with capacities of tens of megabytes.

From a structural point of view, machines of this generation represent multiprocessor and multi-machine systems, working on shared memory and a common field of external devices. The performance is up to several tens of millions of operations per second, the RAM capacity is about 1 - 64 MB.

They are characterized by:

  • use of personal computers;
  • telecommunications data processing;
  • computer networks;
  • widespread use of database management systems;
  • elements of intelligent behavior of data processing systems and devices.

What should fifth generation computers be like?

The development of subsequent generations of computers is based on very-large-scale integrated circuits and on the use of optoelectronic principles (lasers, holography).

Development is also proceeding along the path of “intellectualization” of computers, eliminating the barrier between human and computer. Computers will be able to perceive information from handwritten or printed text, from forms, and from the human voice, recognize the user by voice, and translate from one language to another.

In fifth-generation computers there will be a qualitative transition from processing data to processing knowledge.

The architecture of future-generation computers will contain two main blocks. One of them is a traditional computer, but it no longer communicates with the user directly; that communication is handled by a block called the “intelligent interface”. Its task is to understand text written in natural language that contains the statement of a problem, and to translate it into a working program for the computer.

The problem of decentralization of computing will also be solved using computer networks, both large ones located at a considerable distance from each other, and miniature computers located on a single semiconductor chip.

Table 4.1. Computer generations

Generation (years of production): First (1951-1954) | Second (1958-1960) | Third (1965-1966) | Fourth A (1976-1979) | Fourth B (1985-?) | Fifth (?)

Processor element base: vacuum tubes | transistors | integrated circuits (ICs) | large-scale ICs (LSI) | very-large-scale ICs (VLSI) | optoelectronics, cryoelectronics

RAM element base: cathode-ray tubes | ferrite cores | ferrite cores | LSI | VLSI | VLSI

Maximum RAM capacity, bytes: 10^2 | 10^1 | 10^4 | 10^5 | 10^7 | 10^8 (?)

Maximum processor speed, op/s: 10^4 | 10^6 | 10^7 | 10^8 | 10^9, multiprocessing | 10^12, multiprocessing

Programming languages: machine code | assembler | procedural high-level languages (HLLs) | new procedural HLLs | non-procedural HLLs | new non-procedural HLLs

Means of communication between the user and the computer: control panel and punched cards | punched cards and punched tapes | alphanumeric terminal | monochrome graphic display, keyboard | color graphic display, keyboard, mouse, etc. | …

Ancient man had a counting instrument of his own - the ten fingers of his hands. Bending a finger meant adding; straightening it meant subtracting. And man realized that anything within reach could be used for counting: pebbles, sticks, bones. Later, people began to tie knots on ropes and to make notches on sticks and boards (Fig. 1.1).

Fig. 1.1. Knots (a) and notches on tablets (b)

The abacus period. An abacus (from the Greek abax, “board”) was a board covered with a layer of dust on which lines were drawn with a sharp stick, and objects were placed in the resulting columns according to the positional principle. In the 5th-4th centuries BC the oldest known counting board was created, the “Salamis board” (named after the island of Salamis in the Aegean Sea), which the Greeks and Western Europeans called the abacus. In Ancient Rome the abacus appeared in the 5th-6th centuries AD and was called calculi or abaculi. The abacus was made of bronze, stone, ivory and colored glass. A bronze Roman abacus has survived to this day, on which pebbles moved in vertically cut grooves (Fig. 1.2).

Fig. 1.2.

In the 15th-16th centuries, counting on lines, or on counting tables with tokens placed on them, was common in Europe.

In the 16th century the Russian abacus with a decimal number system appeared. In 1828, Major General F. M. Svobodskoy exhibited an original device consisting of many abacuses connected in a common frame (Fig. 1.3). All operations were reduced to the actions of addition and subtraction.

Fig. 1.3.

Period of mechanical devices. This period lasted from the beginning of the 17th to the end of the 19th century.

In 1623, Wilhelm Schickard described the design of a calculating machine in which the operations of addition and subtraction were mechanized. In 1642, the French mechanic Blaise Pascal designed the first mechanical calculating machine - “Pascalina” (Fig. 1.4).

In 1673, the German scientist Gottfried Leibniz created the first mechanical computing machine that performed all four arithmetic operations (addition, subtraction, multiplication and division).

Fig. 1.4.

In 1770 in Lithuania, E. Jacobson created a summing machine that determined quotients and was capable of working with five-digit numbers.

In 1801 - 1804 French inventor J.M. Jacquard was the first to use punched cards to control an automatic loom.

In 1823, the English scientist Charles Babbage developed a project for the “Difference Engine,” which anticipated the modern program-controlled automatic machine (Fig. 1.5).

In 1890, Vilgodt Odner, a resident of St. Petersburg, invented an adding machine and launched its production. By 1914, in Russia alone, there were more than 22 thousand Odner adding machines. In the first quarter of the 20th century these adding machines were the only mathematical machines widely used in various areas of human activity (Fig. 1.6).


Fig. 1.5. Babbage's machine. Fig. 1.6. Adding machine

Computer period. This period began in 1946 and continues today. It is characterized by the combination of advances in the field of electronics with new principles for constructing computers.

In 1946, under the leadership of J. Mauchly and J. Eckert, the first computer, ENIAC, was created in the USA (Fig. 1.7). It had the following characteristics: length 30 m, height 6 m, weight 35 tons, 18 thousand vacuum tubes, 1,500 relays, 100 thousand resistors and capacitors, and a speed of about 3,500 op/s. At the same time, these scientists began work on a new machine, EDVAC (Electronic Discrete Variable Automatic Computer), whose program was to be stored in the computer's memory. Mercury delay tubes of the kind used in radar were to serve as its internal memory.

Fig. 1.7.

In 1949, the EDSAC computer with a program stored in memory was built in Great Britain.

The question of which computer appeared first is still a matter of dispute. The Germans consider the first computer to be a machine for artillery crews created by Konrad Zuse in 1941, although it ran on electric relays and was therefore not electronic but electromechanical. For Americans it is ENIAC (1946, J. Mauchly and J. Eckert). Bulgarians consider the inventor of the computer to be John (Ivan) Atanasoff, who in 1941 in the USA designed a machine for solving systems of algebraic equations.

The British, having dug through their secret archives, stated that the first electronic computer was created in 1943 in England and was intended to decipher the communications of the German high command. This equipment was considered so secret that after the war it was destroyed on Churchill's orders and the plans were burned to prevent the secret from falling into the wrong hands.

The Germans conducted secret everyday correspondence using Enigma encryption machines (Latin: enigma - riddle). By the beginning of World War II, the British already knew how Enigma worked and were looking for ways to decipher its messages, but the Germans had another encryption system designed only for the most important messages. It was a Schlusselzusatz-40 machine manufactured by Lorenz in a small number of copies (the name translates as “cipher attachment”). Externally, it was a hybrid of an ordinary teletype and a mechanical cash register. The teletype translated the text typed on the keyboard into a sequence of electrical impulses and pauses between them (each letter corresponds to a set of five impulses and “empty spaces”). The “cash register” rotated two sets of five gears, which randomly added two more sets of five pulses and skips to each letter. The wheels had different numbers of teeth, and this number could be changed: the teeth were made movable, they could be moved to the side or pulled into place. There were two more “motor” wheels, each of which rotated its own set of gears.

At the beginning of the transmission of an encrypted message, the radio operator informed the recipient of the initial position of the wheels and the number of teeth on each of them. This setting data was changed before each transmission. By placing the same sets of wheels in the same position on his machine, the receiving radio operator ensured that the extra letters were automatically subtracted from the text, and the teletype printed the original message.
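The add-then-subtract scheme can be illustrated with a small sketch (a simplified model, not the actual Lorenz wheel logic; the keystream function here is just a stand-in for the gear wheels): combining 5-impulse letter codes with a key stream by modulo-2 addition of each impulse (XOR) is its own inverse, so applying the same key stream a second time restores the original message.

```python
# Simplified model of an additive cipher attachment working on 5-bit codes.
import random

def keystream(setting, length):
    # Stand-in for the gear wheels: a reproducible sequence of 5-bit key values
    # determined entirely by the initial "setting".
    rng = random.Random(setting)
    return [rng.randrange(32) for _ in range(length)]

def combine(codes, key):
    # XOR is addition modulo 2 on each of the five impulses; it undoes itself.
    return [c ^ k for c, k in zip(codes, key)]

plain = [3, 25, 29, 16, 10]                 # some 5-bit letter codes
key = keystream(setting=1943, length=len(plain))
cipher = combine(plain, key)                # sender "adds" the key impulses
recovered = combine(cipher, key)            # receiver with the same setting removes them
assert recovered == plain
print(cipher, recovered)
```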

In 1943, in England, the mathematician Max Newman developed the Colossus electronic machine. The wheels of the German machine were modeled by 12 groups of electron tubes - thyratrons. By automatically running through the possible states of each thyratron and their combinations (a thyratron can be in two states - passing or not passing electric current, i.e. giving an impulse or a pause), Colossus worked out the initial setting of the gears of the German machine. The first version of Colossus had 1,500 thyratrons, and the second, which went into operation in June 1944, had 2,500. In an hour the machine “swallowed” 48 km of punched tape onto which operators had punched the rows of ones and zeros of German messages; 5,000 letters were processed per second. This computer had a memory based on the charging and discharging of capacitors. It made it possible to read the top-secret correspondence of Hitler, Kesselring, Rommel and others.

Note. A modern computer works out the initial position of the wheels of the Schlusselzusatz-40 twice as slowly as Colossus did, so a problem that was solved in 15 minutes in 1943 takes a modern PC 18 hours! The point is that modern computers are designed to be universal, intended for a wide variety of tasks, and cannot always compete with old computers that could do only one thing, but did it very quickly.

The first domestic electronic computer, MESM, was developed in 1950. It contained more than 6,000 vacuum tubes. This generation of computers includes: BESM-1, M-1, M-2, M-3, Strela, Minsk-1, Ural-1, Ural-2, Ural-3, M-20, Setun, BESM-2, Hrazdan (Table 1.1). Their speed did not exceed 2-3 thousand op/s, and the RAM capacity was 2 K, or 2048 machine words (1 K = 1024) with a length of 48 binary digits.

Table 1.1. Characteristics of domestic computers

The table compares first- and second-generation machines by addressing, machine word length (in binary digits), speed, and memory element base (ferrite cores).

About half of the total volume of data in the world's information systems is stored on mainframe computers. For these purposes, IBM began producing the IBM/360 and IBM/370 computers (Fig. 1.8) back in the 1960s, and they became widespread throughout the world.

With the advent of the first computers in 1950, the idea of ​​using computer technology to control technological processes arose. Computer-based control allows you to maintain process parameters in a mode close to optimal. As a result, the consumption of materials and energy is reduced, productivity and quality are increased, and rapid restructuring of equipment to produce a different type of product is ensured.


Fig. 1.8.

The pioneer of the industrial use of control computers abroad was Digital Equipment Corporation (DEC), which in 1963 released the specialized PDP-5 computer for controlling nuclear reactors. The initial data were measurements obtained by analog-to-digital conversion, with an accuracy of 10-11 binary digits. In 1965, DEC released the first miniature computer, the PDP-8, the size of a refrigerator and costing 20 thousand dollars; its element base was integrated circuits.

Before the advent of integrated circuits, transistors were manufactured individually and had to be connected and soldered by hand when the circuits were assembled. In 1958, American scientist Jack Kilby figured out how to create several transistors on one semiconductor wafer. In 1959, Robert Noyce (the future founder of Intel) invented a more advanced method that made it possible to create transistors and all the necessary connections between them on one plate. The resulting electronic circuits became known as integrated circuits, or chips. Subsequently, the number of transistors that could be placed per unit area of ​​the integrated circuit approximately doubled every year. In 1968, Burroughs released the first integrated circuit computer, and in 1970, Intel began selling memory integrated circuits.

In 1970, another step was taken on the path to the personal computer: Marcian Edward Hoff of Intel designed an integrated circuit similar in its functions to the central processor of a mainframe computer. This is how the first microprocessor, the Intel 4004, appeared; it went on sale at the end of 1971. Of course, the capabilities of the Intel 4004 were much more modest than those of a mainframe's central processor: it worked much more slowly and could only process 4 bits of information at a time (mainframe processors handled 16 or 32 bits at a time). In 1973, Intel released the 8-bit Intel 8008 microprocessor, and in 1974 its improved version, the Intel 8080, which until the end of the 1970s remained the standard for the microcomputer industry (Table 1.2).

Table 1.2. Generations of computers and their main characteristics

Generation: First | Second | Third | Fourth (since 1975)

Computer element base: electronic tubes, relays | transistors, parametrons | integrated circuits | ultra-large-scale ICs (VLSI)

CPU performance: up to 3·10^5 op/s | up to 3·10^6 op/s | up to 3·10^7 op/s | 3·10^7 op/s

Type of random access memory (RAM): triggers, ferrite cores | miniature ferrite cores | semiconductor | semiconductor, more than 16 MB

Characteristic computer types of the generation: … | small, medium, large, special-purpose | mini- and microcomputers | supercomputers, PCs, special- and general-purpose machines, computer networks

Typical models of the generation: … | IBM 7090, BESM-6 | … | BH-2, IBM PC/XT/AT, PS/2, Cray, networks

Characteristic software: codes, autocodes, assemblers | programming languages, dispatchers, automated control systems, process control systems | application program packages, DBMS, CAD systems, high-level languages, operating systems | databases, expert systems, parallel programming systems

Generations of computers are determined by the element base (tubes, semiconductors, microcircuits of varying degrees of integration (Fig. 1.9)), by the architecture and by the computing capabilities (Table 1.3).

Table 1.3. Features of computer generations

Generation

Peculiarities

I generation (1946-1954)

Application of vacuum tube technology, use of memory systems on mercury delay lines, magnetic drums, cathode ray tubes. Punched tapes and punched cards, magnetic tapes and printing devices were used for data input and output

II generation (1955-1964)

Use of transistors. Computers have become more reliable and their performance has increased. With the advent of memory on magnetic cores, its operating cycle decreased to tens of microseconds. The main principle of the structure is centralization. High-performance devices for working with magnetic tapes and magnetic disk memory devices appeared

III generation (1965-1974)

Computers were designed on the basis of integrated circuits of low integration (SSI, 10 to 100 components per chip) and medium integration (MSI, 100 to 1,000 components per chip). At the end of the 1960s minicomputers appeared. In 1971, the first microprocessor appeared.

IV generation (since 1975)

The use of large integrated circuits (LSI from 1000 to 100 thousand components per chip) and ultra-large integrated circuits (VLSI from 100 thousand to 10 million components per chip) when creating computers. The main emphasis when creating computers is on their “intelligence”, as well as on an architecture focused on knowledge processing


Fig. 1.9. Computer element base: a - electronic tube; b - transistor; c - integrated circuit

The first microcomputer was the Altair-8800, created in 1975 by a small company in Albuquerque (New Mexico) based on the Intel-8080 microprocessor. At the end of 1975, Paul Allen and Bill Gates (future founders of Microsoft) created a Basic language interpreter for the Altair computer, which allowed users to write programs quite simply.

Subsequently, the TRS-80, PET and Apple personal computers appeared (Fig. 1.10).

Fig. 1.10.

Domestic industry produced DEC-compatible machines (the DVK-1 through DVK-4 interactive computing systems, based on the Elektronika MS-101, Elektronika 85 and Elektronika 32 computers) and IBM PC-compatible machines (EC 1840 - EC 1842, EC 1845, EC 1849, EC 1861, Iskra 4861), which were significantly inferior in their characteristics to those listed above.

Recently, personal computers produced by US companies (Compaq Computer, Apple (Macintosh), Hewlett-Packard, Dell, DEC), UK companies (Spectrum, Amstrad), the French company Micra, the Italian company Olivetti, and Japanese companies (Toshiba, Panasonic, Partner) have become widely known.

Personal computers from IBM (International Business Machines Corporation) are currently the most popular.

In 1983, the IBM PC XT computer with a built-in hard drive appeared, and in 1985, the IBM PC AT computer based on the 16-bit Intel 80286 processor (Fig. 1.11).

In 1989, the Intel 80486 processor was developed with modifications 486SX, 486DX, 486DX2 and 486DX4. Clock frequencies of 486DX processors, depending on the model, are 33, 66 and 100 MHz.


IBM's new family of PC models was named PS/2 (Personal System/2). The first models of the PS/2 family used the Intel 80286 processor and essentially reproduced the PC AT, but on the basis of a different architecture.

In 1993, Pentium processors with clock frequencies of 60 and 66 MHz appeared.

In 1994, Intel began producing Pentium processors with clock frequencies of 75, 90 and 100 MHz. In 1996, the clock speed of Pentium processors increased to 150, 166 and 200 MHz (Fig. 1.12).


Fig. 1.12. Multimedia computer configuration (system unit, mouse-type manipulator and other devices)

In 1997, Intel released a new Pentium MMX processor with clock frequencies of 166 and 200 MHz. The abbreviation MMX meant that this processor was optimized for working with graphics and video information. In 1998, Intel announced the release of the Celeron processor with a clock frequency of 266 MHz.

In 1998, Intel also announced a version of the Pentium® II Xeon™ processor with a clock frequency of 450 MHz (Table 1.4).

Table 1.4. IBM computers

The table lists, for each computer model, the CPU, its clock frequency (MHz) and the RAM capacity.

For a long time, processor manufacturers - primarily Intel and AMD - raised clock speeds to improve processor performance. However, at clock frequencies above 3.8 GHz the chips overheat and the benefits are lost. New ideas and technologies were required, and one of them was the idea of multi-core chips. In such a chip, two or more processors operate in parallel and provide greater performance at a lower clock frequency. A running program divides its data-processing tasks between the cores. This is most effective when both the operating system and the application programs are designed for parallel work, as, for example, in graphics processing.

Multi-core architecture is a variant of processor architecture that places two or more “execution,” or compute, Pentium® cores on a single processor. A multi-core processor is inserted into the processor socket, but the operating system treats each of its execution cores as a separate logical processor with all the corresponding execution resources (Figure 1.13).

Fig. 1.13.

This implementation of the internal processor architecture is based on the “divide and conquer” strategy: by dividing the computational work that a traditional microprocessor performs in a single Pentium core among several Pentium execution cores, a multi-core processor can do more work in a given time interval. For this, the software must support distributing the load among several execution cores. This capability is called thread-level parallelism, or threaded processing, and the applications and operating systems that support it (such as Microsoft Windows XP) are called multithreaded.
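The divide-and-combine idea behind thread-level parallelism can be sketched as follows (a simplified illustration, not Intel's implementation; the function names and the use of a process pool are choices made for this example): one large summation is split into two halves that the operating system can schedule on two cores, and the partial results are then combined.

```python
# Minimal sketch: splitting one computation across two execution cores.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum the integers in [start, stop) - the work handed to one core."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    n = 10_000_000
    halves = [(0, n // 2), (n // 2, n)]             # divide the task in two
    with ProcessPoolExecutor(max_workers=2) as pool:
        total = sum(pool.map(partial_sum, halves))  # combine the partial results
    assert total == n * (n - 1) // 2                # closed-form check of the sum
    print(total)
```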

Multi-core also affects the simultaneous operation of standard applications. For example, one processor core may be responsible for a program running in the background, while an antivirus program takes up the resources of the second core. In practice, dual-core processors do not perform calculations twice as fast as single-core processors: although the performance gain is significant, it depends on the type of application.

The first dual-core processors appeared on the market in 2005. Over time, more and more successors appeared. Therefore, “old” dual-core processors have seriously fallen in price today. They can be found in computers starting at $600 and laptops starting at $900. Computers with modern dual-core chips cost about $100 more than models equipped with “older” chips. One of the main developers of multi-core processors is Intel Corporation.

Before the advent of dual-core chips, manufacturers offered single-core processors with the ability to run several programs in parallel. Some Pentium 4 series processors had the Hyper-Threading function, which presents a single physical core to the operating system as two logical processors. It can be seen as a predecessor of the dual-core architecture. Dual-core operation means that while one core is busy running an application or, for example, checking for virus activity, the other core is available for other tasks: the user can surf the Internet or work with a spreadsheet. Although a Hyper-Threading processor had only one physical core, the chip was designed so that it could execute two programs simultaneously (Fig. 1.14).

Fig. 1.14. Scheme of using multiprocessing in a control panel: a single copy of the QNX Neutrino RTOS runs the command-line interface, routing, management/administration/maintenance, and hardware monitoring tasks across cores 0 and 1

The operating system recognizes such a chip as two separate processors. Conventional processors process 32 bits per clock cycle. The latest chips manage to process twice as much data in one clock cycle, i.e. 64 bits. This advantage is especially noticeable when processing large amounts of data (for example, when processing photographs). But in order to use it, the operating system and applications must support the 64-bit processing mode.

Under specially designed 64-bit versions of Windows XP and Windows Vista, 32- and 64-bit programs are launched, depending on the need.

The rapid development of digital computing technology (VT) and the emergence of a science of the principles of its construction and design began in the 1940s, when electronics and then microelectronics became the technical basis of VT, and achievements in the field of artificial intelligence became the basis for the development of the architecture of computers.

Before that time, for almost 500 years, VT had been reduced to the simplest devices for performing arithmetic operations on numbers. The basis of almost all devices invented over those five centuries was the gear wheel, designed to represent the 10 digits of the decimal number system. The world's first sketch of a thirteen-digit decimal adding device based on such wheels belongs to Leonardo da Vinci.

The first actually implemented mechanical digital computing device was the “Pascalina” of the great French scientist Blaise Pascal - a 6-digit (in other versions, 8-digit) device on gear wheels, designed for adding and subtracting decimal numbers (1642).

30 years after Pascalina, Gottfried Wilhelm Leibniz's "arithmetic instrument" appeared in 1673 - a twelve-digit decimal device for performing arithmetic operations, including multiplication and division.

At the end of the 18th century, two events occurred in France that were of fundamental importance for the further development of digital computing technology. Such events include:

• Joseph Jacquard's invention of program control of a loom using punched cards;

• the development by Gaspard de Prony of a computing technology that divided numerical calculations into three stages: development of a numerical method; compilation of a program for the sequence of arithmetic operations; carrying out the actual calculations by performing the arithmetic operations on numbers in accordance with the compiled program.

These innovations were later used by the Englishman Charles Babbage, who took a qualitatively new step in the development of computing: the transition from manual to automatic execution of calculations according to a compiled program. He developed a design for the Analytical Engine - a mechanical universal digital computer with program control (1830-1846). The machine was to consist of five devices: the arithmetic unit (AU), the storage unit (memory), the control unit (CU), the input devices, and the output devices.

It was these same devices that made up the first computers, which appeared 100 years later. The arithmetic unit was built on the basis of gear wheels, and it was proposed to implement the memory on them as well (for thousands of 50-digit numbers). The estimated speed of calculation was addition and subtraction in 1 second and multiplication and division in 1 minute. In addition to arithmetic operations, there was a conditional jump instruction.

It should be noted that although individual components of the machine were created, the entire machine could not be created due to its bulkiness. It would require more than 50,000 gear wheels alone. The inventor planned to use a steam engine to power his analytical engine.

In 1870 (a year before Babbage's death), the English mathematician Jevons designed the world's first "logical machine", which made it possible to mechanize the simplest logical conclusions.

The creators of logical machines in pre-revolutionary Russia were Pavel Dmitrievich Khrushchev (1849-1909) and Alexander Nikolaevich Shchukarev (1884-1936), who worked in educational institutions in Ukraine.

Babbage's brilliant idea was realized by the American scientist Howard Aiken, who created the first relay-mechanical computer in the United States in 1944. Its main blocks - arithmetic and memory - were executed on gear wheels. If Babbage was far ahead of his time, then Aiken, using the same gears, technically used outdated solutions when implementing Babbage's idea.

It should be noted that ten years earlier, in 1934, the German student Konrad Zuse, working on his graduation project, decided to build a digital computer with program control. This machine was the first in the world to use the binary number system. In 1937 the Z1 machine performed its first calculations. It was a binary 22-bit floating-point machine with a memory of 64 numbers, and it worked on a purely mechanical (lever) basis.

In the same 1937, when the world's first mechanical binary machine Z1 began operating, John Atanasov (a Bulgarian by birth who lived in the USA) began developing a specialized computer, using vacuum tubes (300 tubes) for the first time in the world.

In 1942-43, the Colossus computer was created in England (with the participation of Alan Turing). This machine, consisting of 2000 vacuum tubes, was intended to decipher radiograms of the German Wehrmacht. Since the works of Zuse and Turing were secret, few knew about them at that time and they did not cause any resonance in the world.

Only in 1946 did information appear about the ENIAC (Electronic Numerical Integrator and Computer), created in the USA by J. Mauchly and P. Eckert using electronic technology. The machine contained 18 thousand vacuum tubes and performed about 3 thousand operations per second. However, the machine remained decimal, and its memory held only 20 words. Programs were stored outside of RAM.

Almost simultaneously, in 1949-52, scientists in England, the Soviet Union and the USA created computers with a stored program (Maurice Wilkes - the EDSAC computer, 1949; Sergei Lebedev - the MESM, 1951; Isaac Brook - the M-1, 1952; John Mauchly, Presper Eckert and John von Neumann - the EDVAC, 1952).

In general, five generations of computers are distinguished.

The first generation (1945-1954) is characterized by the appearance of vacuum-tube technology. This is the era of the emergence of computing technology. Most first-generation machines were experimental devices built to test certain theoretical principles. Their weight and size were such that they often required separate buildings.

The founders of computer science are rightfully considered to be Claude Shannon, the creator of information theory; Alan Turing, the mathematician who developed the theory of programs and algorithms; and John von Neumann, the author of the design of computing devices that still underlies most computers. In those same years another new science related to computer science arose - cybernetics, the science of control as one of the main information processes. Its founder is the American mathematician Norbert Wiener.

In the second generation (1955-1964), transistors were used instead of vacuum tubes, and magnetic cores and magnetic drums - distant ancestors of modern hard drives - were used as memory devices. All this made it possible to sharply reduce the size and cost of computers, which then began to be built for sale for the first time.

But the main achievements of this era belong to the field of software. In the second generation, what is now called an operating system first appeared. At the same time, the first high-level languages were developed: Fortran, Algol, Cobol. These two important improvements made writing computer programs much easier and faster.

At the same time, the scope of computer applications expanded. Now it was no longer just scientists who could count on access to computing technology: computers were used in planning and management, and some large firms even began to computerize their bookkeeping, getting a twenty-year head start on the mass adoption of this practice.

In the third generation (1965-1974), integrated circuits were used for the first time - entire devices and assemblies of tens and hundreds of transistors made on a single semiconductor crystal (microcircuits). At the same time, semiconductor memory appeared, which is still used in personal computers as main memory.

During these years, computer production acquired an industrial scale. IBM was the first to sell a series of computers that were fully compatible with each other, from the smallest, the size of a small closet (they had never made anything smaller then), to the most powerful and expensive models. The most widespread in those years was the System/360 family from IBM, on the basis of which the ES series of computers was developed in the USSR. Back in the early 60s, the first minicomputers appeared - small, low-power computers affordable for small firms or laboratories. Minicomputers represented the first step towards personal computers, prototypes of which were released only in the mid-70s.

Meanwhile, the number of elements and connections between them that fit in one microcircuit was constantly growing, and in the 70s, integrated circuits already contained thousands of transistors.

In 1971, Intel released the first microprocessor, which was intended for desktop calculators that had just appeared. This invention was destined to create a real revolution in the next decade. The microprocessor is the main component of a modern personal computer.

At the turn of the 60s and 70s of the twentieth century (1969), the first wide-area computer network, ARPANET, the prototype of the modern Internet, was born. In the same year, 1969, the Unix operating system and the C programming language appeared, which had a huge impact on the software world and still maintain their leading positions.

The fourth generation (1975-1985) is characterized by fewer fundamental innovations in computer science. Progress proceeded mainly along the path of developing what had already been invented, primarily by increasing the power and miniaturization of the element base and of the computers themselves.

The most important innovation of the fourth generation is the appearance of personal computers in the early 80s. Thanks to personal computers, computing technology is becoming truly widespread and accessible to everyone. Despite the fact that personal and minicomputers still lag behind large machines in computing power, the lion's share of innovations, such as graphical user interfaces, new peripheral devices, and global networks, are associated with the emergence and development of this particular technology.

Large computers and supercomputers, of course, continue to develop. But now they no longer dominate the computer arena as they once did.

Some characteristics of the four generations of computing technology are given in Table 1.1.

Table 1.1. Generations of computing

Generation: First | Second | Third | Fourth

Main element: electron (vacuum) tube | transistor | integrated circuit | large-scale integrated circuit (microprocessor)

Number of computers in the world (units): … | … | tens of thousands | millions

Computer dimensions: … | significantly smaller | … | microcomputer

Performance (conditional units), op/s: a few units | a few tens | several thousand | several tens of thousands

Storage medium: punched card, punched tape | magnetic tape | … | …

The fifth generation (1986 to the present) is largely defined by the results of the work of the Japanese Committee for Scientific Research in the Field of Computers, published in 1981. According to this project, fifth-generation computers and computing systems, in addition to high performance and reliability at lower cost achieved using the latest technologies, must satisfy the following qualitatively new functional requirements:

• ensure ease of use of computers by implementing voice input/output systems, as well as interactive processing of information using natural languages;

• provide the possibility of learning, associative constructions and logical inference;

• simplify the process of creating software by automating the synthesis of programs from specifications of the original requirements in natural languages;

• improve the basic characteristics and operational qualities of computing technology to meet various social needs, and improve the cost-benefit ratio, speed, weight and compactness of computers;

• provide a variety of computing equipment, high adaptability to applications and reliability in operation.

Currently, intensive work is under way to create optoelectronic computers with massive parallelism and a neural structure - a distributed network of a large number (tens of thousands) of simple microprocessors that model the architecture of biological neural systems.

Computing devices and instruments from antiquity to the present day

The main stages in the development of computer technology are: Manual - until the 17th century, Mechanical - from the mid-17th century, Electromechanical - from the 90s of the 19th century, Electronic - from the 40s of the 20th century.

The manual period began at the dawn of human civilization.

In any activity, man has always invented and created a wide variety of means, devices and tools in order to expand his capabilities and facilitate work.

With the development of trade the need for counting arose. Many centuries ago, to carry out various calculations, people began to use first their own fingers, then pebbles, sticks, knots and the like. But over time the tasks facing man became more complicated, and it became necessary to find methods and invent devices that could help in solving them.

One of the first devices (5th century BC) that facilitated calculations was a special board, later called the abacus (from the Greek for “counting board”). Calculations on it were carried out by moving bones or pebbles in the recesses of boards made of bronze, stone, ivory and the like. In Greece the abacus existed as early as the 5th century BC. One groove corresponded to units, another to tens, and so on. If more than 10 pebbles accumulated in a groove during counting, they were removed and one pebble was added to the next digit. The Romans improved the abacus, moving from grooves and pebbles to marble boards with chiselled grooves and marble balls. With its help it was possible to perform the simplest mathematical operations of addition and subtraction.

The Chinese variety of abacus, the suanpan, appeared in the 6th century AD; the soroban is the Japanese abacus, derived from the Chinese suanpan, which was brought to Japan in the 15th-16th centuries. In the 16th century the Russian abacus with a decimal number system was created. Over the centuries it underwent significant changes, but it continued to be used until the 1980s.

At the beginning of the 17th century, the Scottish mathematician J. Napier introduced logarithms, which had a revolutionary impact on counting. The slide rule based on them was still in successful use as recently as fifteen years ago, having served engineers for more than 360 years. It is undoubtedly the crowning achievement of the computing tools of the manual era.

The development of mechanics in the 17th century became a prerequisite for the creation of computing devices and instruments using a mechanical method of calculation. Among mechanical devices there are adding machines (which can add and subtract) and multiplying devices (which multiply and divide); over time they were combined into one device, the arithmometer (which can perform all four arithmetic operations).

In the diaries of the brilliant Italian Leonardo da Vinci (1452-1519), a number of drawings were discovered in our time which turned out to be a sketch of a summing computer on gear wheels capable of adding 13-digit decimal numbers. In those distant years the brilliant scientist was probably the only person on Earth who understood the need to create devices to facilitate the work of performing calculations. However, the need for this was so small (or rather, it did not exist at all!) that only more than a hundred years after the death of Leonardo da Vinci was another European found - the German scientist Wilhelm Schickard (1592-1636), who, naturally, had not read the diaries of the great Italian - who proposed his own solution to this problem. The reason that prompted Schickard to develop a calculating machine for summing and multiplying six-digit decimal numbers was his acquaintance with the astronomer J. Kepler. Having become familiar with the work of the great astronomer, which was mainly related to calculations, Schickard was inspired by the idea of helping him in his difficult labors. In a letter addressed to Kepler, sent in 1623, he gave a drawing of the machine and described how it worked.

One of the first examples of such mechanisms was the “counting clock” of the German mathematician Wilhelm Schickard. In 1623 he created a machine that became the first automatic calculator. Schickard's machine could add and subtract six-digit numbers and rang a bell when the result overflowed. Unfortunately, history has not preserved information about the machine's further fate.

The inventions of Leonardo da Vinci and Wilhelm Schickard became known only in our time. They were unknown to their contemporaries.

The most famous of the first computing machines was the summing machine of Blaise Pascal, who in 1642 built the Pascalina, an adding machine for eight-digit numbers. B. Pascal began creating the Pascalina at the age of 19, watching the work of his father, a tax collector who often had to carry out long and tedious calculations. His only goal was to help his father with this work.

In 1673, the German mathematician Leibniz created the first arithmometer, which made it possible to perform all four arithmetic operations. “...My machine makes it possible to perform multiplication and division over huge numbers instantly, without resorting to sequential addition and subtraction,” Leibniz wrote to one of his friends. Leibniz's machine became known in most European countries.

The principle of calculations turned out to be successful; subsequently, the model was repeatedly refined in different countries by different scientists.

And from 1881, serial production of adding machines was organized, which were used for practical calculations until the sixties of the 20th century.

The most famous mass-produced model was the Russian-made Felix adding machine, which received a gold medal at the international exhibition in Paris in 1900.

The mechanical period also includes the theoretical development of Babbage's Analytical Engine, which was not implemented due to lack of funding. The theoretical work dates to 1820-1871. The Analytical Engine was to be the first machine using the principle of program control and intended to compute any algorithm; input and output were planned using punched cards, and it was to be driven by a steam engine. The Analytical Engine consisted of the following four main parts: a unit for storing initial, intermediate and resulting data (the store - memory); a data processing unit (the mill - the arithmetic unit); a unit controlling the sequence of calculations (the control unit); and a unit for inputting initial data and printing results (input/output devices), which later served as the prototype for the structure of all modern computers. Lady Ada Lovelace (daughter of the English poet George Byron) worked alongside the English scientist. She developed the first programs for the machine, laid down many ideas, and introduced a number of concepts and terms that have survived to this day. Countess Lovelace is considered the first computer programmer, and the ADA programming language is named after her. Although the project was not implemented, it was widely known and highly appreciated by scientists. Charles Babbage was a century ahead of his time.


At all times, starting from antiquity, people have needed to count. At first they used their fingers or pebbles. However, even simple arithmetic operations with large numbers are difficult for the human brain. Therefore, already in ancient times the simplest counting instrument was invented - the abacus, which appeared more than 15 centuries ago in the Mediterranean countries. This prototype of the modern abacus was a set of beads strung on rods and was used by merchants.

In the arithmetic sense, the abacus rods represent decimal places. Each bead on the first rod has a value of 1, on the second rod 10, on the third rod 100, and so on. Until the 17th century the abacus remained practically the only counting instrument.

In Russia, the so-called Russian abacus appeared in the 16th century. It is based on the decimal number system and allows arithmetic operations to be performed quickly (Fig. 6).

Fig. 6. Abacus

In 1614, mathematician John Napier invented logarithms.

A logarithm is the exponent to which a given base (the base of the logarithm) must be raised to obtain another given number. Napier's discovery was that any number can be expressed in this way, and that the sum of the logarithms of any two numbers equals the logarithm of their product. This made it possible to reduce multiplication to the simpler operation of addition. Napier created tables of logarithms. To multiply two numbers, one looks up their logarithms in the table, adds them, and then finds the number corresponding to this sum in the reverse table of antilogarithms. Based on these tables, R. Bissacar in 1654 and, independently, S. Partridge in 1657 developed the rectangular slide rule: the engineer's main calculating device until the middle of the 20th century (Fig. 7).

Fig. 7. Slide Rule
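The trick that Napier's tables (and the slide rule) exploit can be shown in a few lines. Below is a minimal Python sketch: the two numbers are arbitrary illustrations, not figures from the text, and the point is only that multiplication is replaced by adding logarithms and taking the antilogarithm of the sum.

```python
import math

# Napier's idea: to multiply two numbers, add their logarithms,
# then take the antilogarithm of the sum.
a, b = 37.5, 12.4

log_sum = math.log10(a) + math.log10(b)   # "look up" both logarithms and add them
product = 10 ** log_sum                   # antilogarithm of the sum

print(product)   # ~465.0
print(a * b)     # 465.0 (direct multiplication, for comparison)
```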

In 1642, Blaise Pascal invented a mechanical adding machine using the decimal number system. Each decimal place was represented by a wheel with ten teeth, denoting the digits from 0 to 9. There were 8 wheels in total, that is, Pascal's machine handled eight-digit numbers.

However, it was not the decimal number system that won out in digital computing, but the binary one. The main reason is that in nature there are many phenomena with two stable states, for example "on / off", "voltage present / no voltage", "statement false / statement true", but there are no phenomena with ten stable states. Why, then, is the decimal system so widespread? Simply because a person has ten fingers on two hands, and they are convenient for simple mental counting. But in electronic computing it is much easier to use a binary number system, with only two stable states of its elements and simple addition and multiplication tables. In modern digital computers, the binary system is used not only to record the numbers on which computational operations are performed, but also to record the commands for these calculations and even entire programs. All calculations and operations in a computer are thereby reduced to the simplest arithmetic operations on binary numbers.
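As a small illustration of this point, here is a hedged Python sketch (the value 1642 is chosen arbitrarily) showing that the same number can be written with ten symbols or with only two, and that addition is carried out directly on the binary digits:

```python
# Converting between decimal and binary notation: the same value,
# written with ten symbols (0-9) or with only two (0 and 1).
n = 1642

binary = bin(n)          # '0b11001101010'
back = int(binary, 2)    # 1642 again

print(binary, back)

# Inside the machine even addition works on these binary digits:
print(bin(0b1011 + 0b0110))  # 0b10001, i.e. 11 + 6 = 17
```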



One of the first to show interest in the binary system was the great German mathematician Gottfried Leibniz. In 1666, at the age of twenty, in his work "On the Art of Combinatorics", he developed a general method allowing any thought to be reduced to precise formal statements. This opened up the possibility of transferring logic (Leibniz called it the laws of thought) from the kingdom of words to the kingdom of mathematics, where relations between objects and statements are defined precisely and definitely. Thus Leibniz was a founder of formal logic. He also studied the binary number system and endowed it with a certain mystical meaning: he associated the digit 1 with God and 0 with emptiness. From these two digits, in his view, everything arose, and with their help any mathematical concept can be expressed. Leibniz was the first to suggest that the binary system could become a universal logical language.

Leibniz dreamed of building a "universal science". He wanted to single out the simplest concepts, from which, according to definite rules, concepts of any complexity could be formed. He dreamed of creating a universal language in which any thought could be written down as a mathematical formula, and he thought about a machine that could derive theorems from axioms and about turning logical statements into arithmetic ones. In 1673, he created a new type of adding machine - a mechanical calculator that not only adds and subtracts numbers, but also multiplies, divides, raises to powers, and extracts square and cube roots. The calculator itself, however, worked in the decimal system, even though Leibniz championed the binary one.

A universal logical language was created in 1847 by the English mathematician George Boole. He developed the propositional calculus, later named Boolean algebra in his honor. It is formal logic translated into the strict language of mathematics. The formulas of Boolean algebra are outwardly similar to the formulas of the algebra we know from school, but the similarity is not only external: Boolean algebra is a fully fledged algebra, subject to the set of laws and rules adopted at its creation. It is a notation system applicable to any objects - numbers, letters and sentences. Using this system, one can encode any statements that need to be proved true or false, and then manipulate them like ordinary numbers in mathematics.

George Boole (1815–1864) - English mathematician and logician, one of the founders of mathematical logic. Developed the algebra of logic (in the works “Mathematical Analysis of Logic” (1847) and “Study of the Laws of Thought” (1854)).

The American mathematician Charles Peirce played a huge role in the spread of Boolean algebra and its development.

Charles Peirce (1839–1914) was an American philosopher, logician, mathematician and natural scientist, known for his work on mathematical logic.

The subject of the algebra of logic is so-called statements, i.e., any assertions of which it can be said that they are either true or false: "Omsk is a city in Russia", "15 is an even number". The first statement is true, the second is false.

Complex statements obtained from simple ones using the conjunctions AND, OR, IF...THEN and the negation NOT can also be true or false. Their truth depends only on the truth or falsity of the simple statements that form them, for example: "If it is not raining outside, then you can go for a walk." The main task of Boolean algebra is to study this dependence. It considers the logical operations that allow complex statements to be constructed from simple ones: negation (NOT), conjunction (AND), disjunction (OR) and others.
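As a quick reference, the following minimal Python sketch enumerates the truth tables of these three basic operations, using Python's built-in Boolean values to stand in for true and false statements:

```python
# Truth tables for the basic operations of the algebra of logic.
# False and True play the role of the two possible values of a statement.
values = (False, True)

print("NOT:")
for a in values:
    print(f"  NOT {a} = {not a}")

print("AND / OR:")
for a in values:
    for b in values:
        print(f"  {a} AND {b} = {a and b}    {a} OR {b} = {a or b}")
```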

In 1804, J. Jacquard invented a loom for producing fabrics with large patterns. The pattern was programmed using a whole deck of punched cards - rectangular cards made of cardboard on which information about the pattern was recorded by punching holes (perforations) arranged in a certain order. When the loom was operating, these punched cards were probed by special pins, and in this purely mechanical way the information for weaving the programmed fabric pattern was read from them. Jacquard's loom was the prototype of the program-controlled machines created in the twentieth century.

In 1820, Thomas de Colmar developed the first commercial adding machine capable of multiplying and dividing. Since the 19th century, adding machines have become widespread when performing complex calculations.

In 1830, Charles Babbage tried to create a universal analytical engine that was supposed to perform calculations without human intervention. To do this, programs were introduced into it that were pre-recorded on punched cards made of thick paper using holes made on them in a certain order (the word “perforation” means “punching holes in paper or cardboard”). The programming principles for Babbage's Analytical Engine were developed in 1843 by Ada Lovelace, the daughter of the poet Byron.


Fig. 8. Charles Babbage


Fig. 9. Ada Lovelace

The Analytical Engine had to be able to remember data and intermediate results of calculations, that is, to have a memory. The machine was to contain three main parts: a device for storing numbers set with gear wheels (the memory), a device for operating on the numbers (the arithmetic unit), and a device controlling the sequence of operations by means of punched cards (the program control device). The work on the Analytical Engine was not completed, but the ideas embodied in it helped to build the first computers in the 20th century (translated from English, this word means "calculator").

In 1880 V.T. Odner in Russia created a mechanical adding machine with gear wheels, and in 1890 he launched its mass production. Subsequently, it was produced under the name “Felix” until the 50s of the 20th century (Fig. 11).


Fig. 10. V.T. Odner


Fig. 11. Mechanical adding machine "Felix"

In 1888, Herman Hollerith (Fig. 12) created the first electromechanical calculating machine - a tabulator, in which information printed on punched cards (Fig. 13) was deciphered by electric current. This machine made it possible to reduce the counting time for the US Census several times. In 1890, Hollerith's invention was used for the first time in the 11th American Census. The work that 500 employees had previously taken as long as 7 years to complete was completed by Hollerith and 43 assistants on 43 tabulators in one month.

In 1896, Hollerith founded the Tabulating Machine Co. In 1911, this company merged with two other companies specializing in the automation of statistical data processing, and in 1924 it received its modern name, IBM (International Business Machines). It became an electronics corporation, one of the world's largest manufacturers of all types of computers and software and a provider of global information networks. The founder of IBM was Thomas Watson Sr., who became head of the company in 1914, essentially created the IBM Corporation and led it for more than 40 years. From the mid-1950s, IBM took a leading position in the global computer market. In 1981, the company created its first personal computer, which became the industry standard. By the mid-1980s, IBM controlled about 60% of the world's production of electronic computers.


Fig. 12. Thomas Watson Sr.

Fig. 13. Herman Hollerith

At the end of the 19th century, punched tape was invented - paper or celluloid film, on which information was applied with a punch in the form of a set of holes.

Wide punched paper tape was used in the monotype, a typesetting machine invented by T. Lanston in 1892. The monotype consisted of two independent devices: a keyboard and a casting apparatus. The keyboard served to compose a typesetting program on punched tape, and the casting machine produced the type in accordance with the program previously composed on the keyboard, from a special typographic alloy - type metal.

Fig. 14. Punched card

Fig. 15. Punched tapes

The typesetter sat at the keyboard, looked at the text on the copy stand in front of him and pressed the appropriate keys. When a letter key was struck, the needles of the punching mechanism used compressed air to punch a code combination of holes in the paper tape. This combination corresponded to a given letter, sign or the space between them. After each key stroke the tape moved one step - 3 mm. Each horizontal row of holes on the punched tape corresponds to one letter, sign or space. The finished (punched) spool of tape was transferred to the casting machine, in which, also using compressed air, the information encoded on it was read and a set of type was cast automatically. The monotype is thus one of the first program-controlled machines in the history of technology. It belonged to the hot-metal typesetting machines and in time gave way first to phototypesetting and then to electronic typesetting.

Somewhat earlier than the monotype, in 1881, the pianola (or phonola) was invented - an instrument for automatically playing the piano. It, too, operated on compressed air. In a pianola, each key of an ordinary piano or grand piano corresponds to a hammer that strikes it. Together the hammers make up a counter-keyboard attached to the piano keyboard. A wide paper punched tape wound on a roller is inserted into the pianola. The holes in the tape are punched in advance while a pianist is playing - they are a kind of "sheet music". When the pianola operates, the tape is rewound from one roller to another, and the information recorded on it is read by a pneumatic mechanism. It activates the hammers corresponding to the holes in the tape, making them strike the keys and reproduce the pianist's performance. Thus the pianola was also a program-controlled machine. Thanks to surviving punched piano rolls, it has been possible to restore and re-record, using modern methods, the performances of such remarkable pianists of the past as the composer A.N. Scriabin. The pianola was used by the famous composers and pianists Rubinstein, Paderewski and Busoni.

Later, information was read from punched tape and punched cards using electrical contacts - metal brushes, which, when contacted with a hole, closed an electrical circuit. Then the brushes were replaced with photocells, and information reading became optical, contactless. This is how information was recorded and read in the first digital computers.

Logical operations are closely related to everyday life.

Using one OR element for two inputs, two AND elements for two inputs and one NOT element, you can build a logical circuit of a binary half-adder capable of performing the binary addition operation of two single-digit binary numbers (i.e., fulfilling the rules of binary arithmetic):

0 + 0 = 0; 0 + 1 = 1; 1 + 0 = 1; 1 + 1 = 0 with a carry of 1 into the next bit. In doing so, the circuit produces a separate carry bit.

However, such a circuit does not contain a third input to which a carry signal from the previous bit of the sum of binary numbers can be applied. Therefore, the half-adder is used only in the least significant bit of the logic circuit for summing multi-bit binary numbers, where there cannot be a carry signal from the previous binary bit. A full binary adder adds two multi-bit binary numbers, taking into account the carry signals from the addition in the previous binary bits.

By connecting binary adders in a cascade, you can obtain a logical adder circuit for binary numbers with any number of digits.

With some modifications, these logic circuits are also used to subtract, multiply and divide binary numbers. With their help, the arithmetic devices of modern computers were built.
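To make this wiring concrete, here is a hedged Python sketch of these circuits, built only from the AND, OR and NOT operations described above (XOR is expressed through them). The bit-list representation is just an illustration under these assumptions, not a description of any particular machine:

```python
# Gates built from the basic logical operations (bits are 0 or 1).
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # "a or b, but not both": (a OR b) AND NOT(a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # sum bit and carry bit for two one-digit binary numbers
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    # adds two bits plus the carry from the previous (lower) bit
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def ripple_adder(x_bits, y_bits):
    # adds two equal-length binary numbers given as lists of bits,
    # least significant bit first, by cascading full adders
    carry, result = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 1011 (11) + 0110 (6) = 10001 (17); bits are listed low-order first
print(ripple_adder([1, 1, 0, 1], [0, 1, 1, 0]))  # [1, 0, 0, 0, 1]
```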

In 1937, George Stibitz (Fig. 16) created a binary adder from ordinary electromechanical relays - a device capable of performing the operation of adding numbers in binary code. And today, the binary adder is still one of the main components of any computer, the basis of its arithmetic device.


Fig. 16. George Stibitz

In 1937–1942 John Atanasoff (Fig. 17) created a model of the first computer that ran on vacuum tubes. It used the binary number system. Punched cards were used to enter data and output calculation results. Work on this machine was almost completed in 1942, but due to the war, further funding was stopped.


Fig. 17. John Atanasoff

In 1937, Konrad Zuse (Fig. 12) created his first computer Z1 based on electromechanical relays. The initial data was entered into it using a keyboard, and the result of the calculations was displayed on a panel with many light bulbs. In 1938, K. Zuse created an improved model Z2. Programs were entered into it using punched tape. It was made by punching holes in used 35mm photographic film. In 1941, K. Zuse built a functioning computer Z3, and later Z4, based on the binary number system. They were used for calculations in the creation of aircraft and missiles. In 1942, Konrad Zuse and Helmut Schreier conceived the idea of ​​converting the Z3 from electromechanical relays to vacuum tubes. Such a machine was supposed to work 1000 times faster, but it was not possible to create it - the war got in the way.


Fig. 18. Konrad Zuse

In 1943–1944, at one of IBM's plants, in collaboration with scientists at Harvard University led by Howard Aiken, the Mark-1 computer was created. It weighed about 35 tons. The Mark-1 was based on electromechanical relays and operated on numbers encoded on punched tape.

When creating it, the ideas laid down by Charles Babbage in his Analytical Engine were used. Unlike Stibitz and Zuse, Aiken did not realize the advantages of the binary number system and used the decimal system in his machine. The machine could manipulate numbers up to 23 digits long; multiplying two such numbers took 4 seconds. In 1947, the Mark-2 machine was created, which already used the binary number system. In this machine, addition and subtraction took an average of 0.125 seconds, and multiplication 0.25 seconds.

The abstract science of the algebra of logic is close to practical life. It allows a wide variety of control problems to be solved.

The input and output signals of electromagnetic relays, like statements in Boolean algebra, also take only two values. When the winding is de-energized, the input signal is 0, and when current is flowing through the winding, the input signal is 1. When the relay contact is open, the output signal is 0, and when the contact is closed, it is 1.

It was precisely this similarity between statements in Boolean algebra and the behavior of electromagnetic relays that was noticed by the famous physicist Paul Ehrenfest. As early as 1910, he proposed using Boolean algebra to describe the operation of relay circuits in telephone systems. According to another version, the idea of using Boolean algebra to describe electrical switching circuits belongs to Peirce. In 1937, the founder of modern information theory, Claude Shannon, combined the binary number system, mathematical logic and electrical circuits in his famous master's thesis.

It is convenient to describe the connections between electromagnetic relays in circuits using the logical operations NOT, AND, OR, REPEAT (YES), etc. For example, a series connection of relay contacts implements an AND operation, while a parallel connection of the same contacts implements a logical OR. The operations AND, OR and NOT are performed similarly in electronic circuits, where the role of relays that close and open electrical circuits is played by contactless semiconductor elements - transistors, created in 1947–1948 by the American scientists J. Bardeen, W. Brattain and W. Shockley.
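The correspondence between contacts and logical operations can be checked in a few lines. The following minimal Python sketch is only an illustration, not a circuit simulator; it treats a closed contact as 1 and an open one as 0:

```python
# A closed contact is 1, an open contact is 0.
def series(contact1, contact2):
    # current flows through a series connection only if BOTH contacts are closed
    return contact1 and contact2   # behaves like AND

def parallel(contact1, contact2):
    # current flows through a parallel connection if AT LEAST ONE contact is closed
    return contact1 or contact2    # behaves like OR

for a in (0, 1):
    for b in (0, 1):
        print(f"contacts {a},{b}: series -> {series(a, b)}, parallel -> {parallel(a, b)}")
```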

Electromechanical relays were too slow. Therefore, already in 1943, the Americans began developing a computer based on vacuum tubes. In 1946, Presper Eckert and John Mauchly (Fig. 13) built the first electronic digital computer, ENIAC. It weighed 30 tons and occupied an area of 170 square meters. Instead of thousands of electromechanical relays, ENIAC contained 18,000 vacuum tubes. The machine counted in the decimal system and performed 5,000 addition operations or 300 multiplication operations per second. Not only the arithmetic unit but also the storage unit of this machine was built on vacuum tubes. Numerical data was entered using punched cards, while programs were entered using plugs and plugboards, so thousands of contacts had to be connected for each new program. It therefore took up to several days to prepare for solving a new problem, although the problem itself was solved in a few minutes. This was one of the main disadvantages of the machine.


Fig. 19. Presper Eckert and John Mauchly

The work of three outstanding scientists - Claude Shannon, Alan Turing and John von Neumann - became the basis for creating the structure of modern computers.

Claude Shannon (1916–2001) was an American engineer and mathematician, the founder of mathematical information theory.

In 1948, he published "A Mathematical Theory of Communication", setting out his theory of the transmission and processing of information, which covered all types of messages, including those transmitted along nerve fibers in living organisms. Shannon introduced the concept of the amount of information as a measure of the uncertainty of the state of a system that is removed when the information is received. He called this measure of uncertainty entropy, by analogy with the similar concept in statistical mechanics. When the observer receives information, the entropy - that is, the degree of his ignorance about the state of the system - decreases.

Alan Turing (1912–1954) was an English mathematician. His main works are on mathematical logic and computational mathematics. In 1936–1937 he wrote the seminal paper "On Computable Numbers", in which he introduced the concept of an abstract device later called the "Turing machine". In this device he anticipated the basic properties of the modern computer. Turing called his device a "universal machine", since it was supposed to solve any admissible (theoretically solvable) mathematical or logical problem. Data had to be entered into it from a paper tape divided into cells. Each cell either contained a symbol or did not. The Turing machine could process the symbols read from the tape and change them, that is, erase them and write new ones according to instructions stored in its internal memory.

John von Neumann (1903–1957) was an American mathematician and physicist, a participant in the development of atomic and hydrogen weapons. Born in Budapest, he lived in the USA from 1930. In his report published in 1945, which became the first work on digital electronic computers, he identified and described the "architecture" of the modern computer.

The next machine, EDVAC, had a more capacious internal memory, capable of storing not only the initial data but also the program of calculations. This idea - storing programs in the machine's memory - was put forward by the mathematician John von Neumann together with Mauchly and Eckert. He was the first to describe the structure of a universal computer (the so-called "von Neumann architecture" of the modern computer). For universality and efficient operation, according to von Neumann, a computer must contain a central arithmetic-logical unit, a central unit controlling all operations, a storage device (memory) and an input/output device, and programs must be stored in the computer's memory.

Von Neumann believed that a computer should operate on the basis of the binary number system, be electronic, and perform all operations sequentially, one after another. These principles are the basis of all modern computers.

A machine using vacuum tubes worked much faster than one using electromechanical relays, but the vacuum tubes themselves were unreliable. They often failed. To replace them in 1947, John Bardeen, Walter Brattain and William Shockley proposed using the switching semiconductor elements they invented - transistors.

John Bardeen (1908–1991) was an American physicist, one of the creators of the first transistor (1956 Nobel Prize in Physics together with W. Brattain and W. Shockley for the discovery of the transistor effect) and one of the authors of the microscopic theory of superconductivity (a second Nobel Prize in 1972, jointly with L. Cooper and J. Schrieffer).

Walter Brattain (1902–1987) - American physicist, one of the creators of the first transistor, winner of the 1956 Nobel Prize in Physics.

William Shockley (1910–1989) - American physicist, one of the creators of the first transistor, winner of the 1956 Nobel Prize in Physics.

In modern computers, microscopic transistors in an integrated circuit chip are grouped into systems of “gates” that perform logical operations on binary numbers. For example, with their help, the binary adders described above were built, which allow adding multi-digit binary numbers, subtracting, multiplying, dividing and comparing numbers with each other. Logic gates, acting according to certain rules, control the movement of data and the execution of instructions in the computer.

The improvement of the first types of computers led in 1951 to the creation of the UNIVAC computer, intended for commercial use. It became the first commercially produced computer.

The serial tube computer IBM 701, which appeared in 1952, performed up to 2200 multiplication operations per second.


IBM 701 computer

The initiative to create this system belonged to Thomas Watson Jr. In 1937, he began working for the company as a traveling salesman. He only stopped working for IBM during the war, when he was a pilot in the United States Air Force. Returning to the company in 1946, he became its vice president and headed IBM from 1956 to 1971. While remaining a member of the IBM board of directors, Thomas Watson served as the United States Ambassador to the USSR from 1979 to 1981.


Thomas Watson (Jr.)

In 1964, IBM announced the creation of six models of the IBM 360 family (System 360), which became the first computers of the third generation. The models had a single command system and differed from each other in the amount of RAM and performance. When creating models of the family, a number of new principles were used, which made the machines universal and made it possible to use them with equal efficiency both for solving problems in various fields of science and technology, and for processing data in the field of management and business. IBM System/360 (S/360) is a family of mainframe class universal computers. Further developments of IBM/360 were the 370, 390, z9 and zSeries systems. In the USSR, the IBM/360 was cloned under the name ES COMPUTER. They were software compatible with their American prototypes. This made it possible to use Western software in conditions of underdevelopment of the domestic “programming industry.”


IBM/360 computer


T. Watson (Jr.) and V. Lerson at the IBM/360 computer

The first Soviet computer, the Small Electronic Calculating Machine (MESM), built with vacuum tubes, was created in 1949–1951 under the leadership of Academician S.A. Lebedev. Independently of foreign scientists, S.A. Lebedev developed the principles of building a computer with a program stored in memory; MESM was the first such machine. In 1952–1954, under his leadership, the High-Speed Electronic Calculating Machine (BESM) was developed, performing 8,000 operations per second.


Lebedev Sergey Alekseevich

The creation of electronic computers in the USSR was led by the prominent Soviet scientists and engineers I.S. Brook, V.M. Glushkov, Yu.A. Bazilevsky, B.I. Rameev, L.I. Gutenmacher and N.P. Brusentsov.

The first generation of Soviet computers included the tube machines “BESM-2”, “Strela”, “M-2”, “M-3”, “Minsk”, “Ural-1”, “Ural-2” and “M-20”.

The second generation of Soviet computers includes the semiconductor small computers “Nairi” and “Mir”; the medium-sized computers for scientific calculations and information processing with a speed of 5–30 thousand operations per second “Minsk-2”, “Minsk-22”, “Minsk-32”, “Ural-14”, “Razdan-2”, “Razdan-3”, “BESM-4” and “M-220”; the control computers “Dnepr” and “VNIIEM-3”; as well as the ultra-high-speed BESM-6 with a performance of 1 million operations per second.

The founders of Soviet microelectronics were scientists who emigrated from the USA to the USSR: F.G. Staros (Alfred Sarant) and I.V. Berg (Joel Barr). They became the initiators, organizers and managers of the microelectronics center in Zelenograd near Moscow.


F.G. Staros

Third-generation computers based on integrated circuits appeared in the USSR in the second half of the 1960s. The Unified Computer System (ES COMPUTER) and the Small Computer System (SM COMPUTER) were developed and their mass production was organized. As mentioned above, this system was a clone of the American IBM/360 system.

Sergey Alekseevich Lebedev was an ardent opponent of the copying of the American IBM/360 system, which in its Soviet version was called the ES Computer and which began in the 1970s. The role of the ES computers in the development of domestic computers is ambiguous.

At the initial stage, the emergence of ES computers led to the unification of computer systems, made it possible to establish initial programming standards and organize large-scale projects related to the implementation of programs.

The price of this was the widespread curtailment of original domestic developments and complete dependence on the ideas and concepts of IBM, which were far from the best at that time. The abrupt transition from easy-to-use Soviet machines to the much more complex hardware and software of the IBM/360 meant that many programmers had to overcome difficulties associated with the shortcomings and errors of IBM's developers. The initial models of the ES computers were often inferior in performance characteristics to the domestic computers of that time.

At a later stage, especially in the 1980s, the widespread introduction of the ES computers turned into a serious obstacle to the development of software, databases and dialog systems. After expensive, pre-planned purchases, enterprises were forced to operate obsolete computer systems. In parallel, systems were developed on small machines and on personal computers, which became more and more popular.

Later, with the beginning of perestroika, from 1988–89 our country was flooded with foreign personal computers. No measures could stop the crisis of the ES computer series. The domestic industry was unable to create analogues or substitutes for the ES computers on the new element base. By that time the economy of the USSR could no longer afford to spend gigantic financial resources on the creation of microelectronic equipment. As a result, there was a complete transition to imported computers. Programs for the development of domestic computers were finally curtailed. Problems arose of transferring technologies to modern computers, modernizing those technologies, and employing and retraining hundreds of thousands of specialists.

S.A. Lebedev's forecast proved correct. Both in the USA and throughout the world, the path he proposed was subsequently followed: on the one hand, supercomputers are created, and on the other, a whole series of less powerful computers aimed at various applications - personal, specialized and so on.

The fourth generation of Soviet computers was implemented on the basis of large-scale (LSI) and ultra-large-scale (VLSI) integrated circuits.

An example of large fourth-generation computer systems was the Elbrus-2 multiprocessor complex with a speed of up to 100 million operations per second.

In the 1950s, the second generation of computers, based on transistors, was created. As a result, the speed of the machines increased 10 times, and their size and weight were significantly reduced. Storage devices on magnetic ferrite cores came into use, capable of retaining information indefinitely even when the computer is switched off; they were designed by Jay Forrester in 1951–1953. Large amounts of information were stored on external media, such as magnetic tape or a magnetic drum.

The first hard disk drive in the history of computing (the "winchester") was developed in 1956 by a group of IBM engineers led by Reynold B. Johnson. The device was called the 305 RAMAC (Random Access Method of Accounting and Control). The drive consisted of 50 aluminum disks with a diameter of 24 inches (about 60 cm) and a thickness of 2.5 cm each. A magnetic layer on which recording was carried out was applied to the surface of each aluminum plate. This entire stack of disks on a common axis rotated in operating mode at a constant speed of 1200 rpm, and the drive itself occupied an area of 3 x 3.5 m. Its total capacity was 5 MB. One of the most important principles used in the design of the RAMAC 305 was that the heads did not touch the surface of the disks, but hovered at a small fixed distance. For this purpose, special air nozzles directed a flow of air to the disk through small holes in the head holders, thereby creating a gap between the head and the surface of the rotating plate.

The winchester (hard drive) gave computer users the ability to store very large amounts of information and at the same time quickly retrieve the data they needed. After the creation of the hard drive, magnetic tape ceased to be the main storage medium.

In 1959, J. Kilby, J. Hoerni, K. Lehovec and R. Noyce (Fig. 14) invented the integrated circuit (chip), in which all the electronic components, along with the conductors, were placed inside a single silicon wafer. The use of chips in computers shortened the paths along which current flows during switching, and the speed of calculations increased tenfold. The dimensions of the machines also decreased significantly. The appearance of the chip made it possible to create the third generation of computers. In 1964, IBM began producing IBM-360 computers based on integrated circuits.


Fig. 14. J. Kilby, J. Hoerni, K. Lehovec and R. Noyce

In 1965, Douglas Engelbart (Fig. 15) created the first "mouse" - a hand-held computer manipulator. It was later used in the Apple Macintosh personal computer, released in 1984.


Fig. 19. Douglas Engelbart

In 1971, IBM began producing the floppy disk, invented by Yoshiro Nakamatsu - a removable flexible magnetic disk for the long-term storage of information. Initially the floppy disk had a diameter of 8 inches and a capacity of 80 KB; later a 5-inch version appeared. The modern 1.44 MB floppy disk, first released by Sony in 1982, is housed in a hard plastic case and has a diameter of 3.5 inches.

In 1969, the creation of a defense computer network began in the United States - the progenitor of the modern worldwide Internet.

In the 1970s, dot matrix printers were developed to print information output from computers.

In 1971, Intel employee Edward Hoff (Fig. 20) created the first microprocessor, the 4004, by placing several integrated circuits on a single silicon chip. Although it was originally intended for use in calculators, it was essentially a complete microcomputer. This revolutionary invention radically changed the idea of ​​computers as bulky, ponderous monsters. The microprocessor made it possible to create fourth-generation computers that fit on the user's desk.


Fig. 20. Edward Hoff

In the mid-1970s, attempts began to create a personal computer (PC), a computing machine intended for the private user.

In 1974, Edward Roberts (Fig. 21) created the first personal computer, Altair, based on the Intel 8080 microprocessor (Fig. 22). But without software it was ineffective: after all, a private user does not have his own programmer “at hand” at home.


Fig. 21. Edward Roberts


Fig. 22. First personal computer Altair

In 1975, Harvard University student Bill Gates and his friend Paul Allen learned about the creation of the Altair PC (Fig. 23). They were the first to understand the urgent need to write software for personal computers, and within a month they created a version of the BASIC language for the Altair. That same year they founded Microsoft, which quickly became a leader in personal computer software and grew into one of the richest companies in the world.


Fig. 23. Bill Gates and Paul Allen


Fig. 24. Bill Gates

In 1973, IBM developed a hard magnetic disk (hard drive) for a computer. This invention made it possible to create large-capacity long-term memory, which is retained when the computer is turned off.

The first Altair-8800 microcomputers were just a collection of parts that still needed to be assembled. In addition, they were extremely inconvenient to use: they had neither a monitor, nor a keyboard, nor a mouse. Information was entered into them using switches on the front panel, and the results were displayed using LED indicators. Later they began to display results using a teletype - a telegraph machine with a keyboard.

In 1976, 26-year-old engineer Steve Wozniak from Hewlett-Packard created a fundamentally new microcomputer. He was the first to use a keyboard similar to a typewriter keyboard to enter data, and an ordinary TV to display information. Symbols were displayed on its screen in 24 lines of 40 characters each. The computer had 8 KB of memory, half of which was occupied by the built-in BASIC language, and half the user could use to enter his programs. This computer was significantly superior to the Altair-8800, which had only 256 bytes of memory. S. Wozniak provided a connector (the so-called “slot”) for his new computer for connecting additional devices. Steve Wozniak’s friend Steve Jobs was the first to understand and appreciate the prospects of this computer (Fig. 25). He offered to organize a company for its serial production. On April 1, 1976, they founded the Apple company, and officially registered it in January 1977. They called the new computer Apple-I (Fig. 26). Within 10 months, they managed to assemble and sell about 200 copies of Apple-I.


Fig. 25. Steve Wozniak and Steve Jobs


Fig. 26. Apple-I Personal Computer

At this time, Wozniak was already working on improving it. The new version was called the Apple-II (Fig. 23). The computer was given a plastic case, a graphics mode, sound, color, expanded memory and 8 expansion connectors (slots) instead of one. A cassette recorder was used to save programs. The basis of the first Apple II model was, as in the Apple I, the 6502 microprocessor from MOS Technology with a clock frequency of 1 megahertz. BASIC was stored in read-only memory. The 4 KB of RAM could be expanded to 48 KB. Information was displayed on a color or black-and-white TV operating in the NTSC standard used in the USA. In text mode, 24 lines of 40 characters each were displayed; in graphics mode the resolution was 280 by 192 pixels (six colors). The main advantages of the Apple II were the ability to expand its RAM to 48 KB and the 8 connectors for attaching additional devices. Thanks to its color graphics, it could be used for a wide variety of games (Fig. 27).


Fig. 27. Apple II personal computer

Thanks to its capabilities, the Apple II has gained popularity among people of various professions. Its users were not required to have knowledge of electronics or programming languages.

The Apple II became the first truly personal computer for scientists, engineers, lawyers, businessmen, housewives and schoolchildren.

In July 1978, the Apple II was supplemented with the Disk II drive, which significantly expanded its capabilities. The disk operating system Apple-DOS was created for it. And at the end of 1978, the computer was improved again and released under the name Apple II Plus. Now it could be used in the business sphere to store information, conduct business, and help in decision making. The creation of such application programs as text editors, organizers, and spreadsheets began.

In 1979, Dan Bricklin and Bob Frankston created VisiCalc, the world's first spreadsheet. This tool was best suited for accounting calculations. Its first version was written for the Apple II, which was often purchased only to work with VisiCalc.

Thus, in a few years, the microcomputer, largely thanks to Apple and its founders Steven Jobs and Steve Wozniak, turned into a personal computer for people of various professions.

In 1981, the IBM PC personal computer appeared, which soon became the standard in the computer industry and displaced almost all competing personal computer models from the market. The only exception was Apple. In 1984, the Apple Macintosh was created, the first computer with a graphical interface controlled by a mouse. Thanks to its advantages, Apple managed to stay in the personal computer market. It has conquered the market in education and publishing, where the outstanding graphics capabilities of Macintoshes are used for layout and image processing.

Today, Apple controls 8–10% of the global personal computer market, and the remaining 90% is IBM-compatible personal computers. Most Macintosh computers are owned by users in the United States.

In 1979, the optical compact disc (CD) appeared, developed by Philips and intended only for listening to music recordings.

In 1979, Intel developed the 8088 microprocessor for personal computers.

Personal computers of the IBM PC model, created in 1981 by a group of IBM engineers led by William C. Lowe, became widespread. The IBM PC had an Intel 8088 processor with a clock frequency of 4.77 MHz, 16 KB of memory expandable to 256 KB, and the DOS 1.0 operating system (Fig. 24). The DOS 1.0 operating system was created by Microsoft. In just one month, IBM managed to sell 241,683 IBM PCs. Under the agreement with Microsoft's executives, IBM paid the creators of the program a certain amount for each copy of the operating system installed on an IBM PC. Thanks to the popularity of the IBM PC, Microsoft's leaders Bill Gates and Paul Allen soon became billionaires, and Microsoft took a leading position in the software market.


Fig. 28. Personal computer model IBM PC

The IBM PC applied the principle of open architecture, which made it possible to make improvements and additions to existing PC designs. This principle means the use of ready-made blocks and devices in the design when assembling a computer, as well as the standardization of methods for connecting computer devices.

The principle of open architecture contributed to the widespread adoption of IBM PC-compatible clone microcomputers. A large number of companies around the world began assembling them from ready-made blocks and devices. Users, in turn, were able to independently upgrade their microcomputers and equip them with additional devices from hundreds of manufacturers.

In the late 1990s, IBM PC-compatible computers accounted for 90% of the personal computer market.


Over the last decades of the 20th century, computers have greatly increased their speed and the amount of information they process and store.

In 1965, Gordon Moore, one of the founders of Intel Corporation, a leader in the field of computer integrated circuits - “chips”, suggested that the number of transistors in them would double every year. Over the next 10 years, this prediction came true, and then he suggested that this number would now double every 2 years. Indeed, the number of transistors in microprocessors doubles every 18 months. Computer scientists now call this trend Moore's Law.


Fig. 29. Gordon Moore
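As an illustration of what an 18-month doubling period implies, here is a hedged Python sketch; the starting point (about 2,300 transistors, roughly the Intel 4004 of 1971) is an assumption chosen only to make the arithmetic concrete:

```python
# Illustrative only: how a transistor count grows if it doubles every 18 months.
# The starting figure (~2,300 transistors, roughly the Intel 4004 of 1971) is an assumption.
start_year, start_count = 1971, 2300

for year in range(1971, 2007, 5):
    months = (year - start_year) * 12
    count = start_count * 2 ** (months / 18)   # 18-month doubling period
    print(year, f"{count:,.0f}")
```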

A similar pattern is observed in the development and production of RAM and information storage devices. Incidentally, there is no doubt that by the time this book is published, many of the figures quoted here for capacity and speed will already be out of date.

The development of software, without which it is generally impossible to use a personal computer, and, above all, operating systems that ensure interaction between the user and the PC, has not lagged behind.

In 1981, Microsoft developed the MS-DOS operating system for IBM personal computers.

In 1983, the improved personal computer IBM PC/XT from IBM was created.

In the 1980s, black-and-white and color inkjet and laser printers were created to print information output from computers. They are significantly superior to dot matrix printers in terms of print quality and speed.

In 1983–1993, the global computer network Internet and E-mail were created, which were used by millions of users around the world.

In 1992, Microsoft released the Windows 3.1 operating system for IBM PC-compatible computers. A windowed operating system allows you to work with several documents at once and provides a so-called "graphical interface": a system of interaction with the PC in which the user deals with "icons" - pictures that can be controlled with the computer mouse. This graphical interface and window system was first created at the Xerox research center in 1975 and was first applied in Apple's PCs.

In 1995, Microsoft released the Windows 95 operating system for IBM PC-compatible computers, more advanced than Windows 3.1; in 1998 came its modification Windows 98, in 2000 Windows 2000, and in 2001 Windows XP. A number of application programs were developed for them: the Word text editor, Excel spreadsheets, the Internet Explorer program for using the Internet and e-mail, the Paint graphics editor, standard applications (calculator, clock, dialer), the Microsoft Schedule diary, a universal media player, a sound recorder and a CD player.

In recent years it has become possible to combine text and graphics with sound and moving images on the personal computer. This technology is called “multimedia”. Optical CD-ROMs (Compact Disk Read Only Memory - i.e. read-only memory on a CD) are used as storage media in such multimedia computers. Outwardly, they do not differ from audio CDs used in players and music centers.

The capacity of one CD-ROM reaches 650 MB; in terms of capacity, it occupies an intermediate position between floppy disks and a hard drive. A CD drive is used to read CDs. Information on a CD is written only once in an industrial environment, and on a PC it can only be read. A wide variety of games, encyclopedias, art albums, maps, atlases, dictionaries and reference books are published on CD-ROM. All of them are equipped with convenient search engines that allow you to quickly find the material you need. The memory capacity of two CD-ROMs is enough to accommodate an encyclopedia larger in volume than the Great Soviet Encyclopedia.

In the late 1990s, write-once CD-R and rewritable CD-RW optical compact discs and drives were created, allowing the user to make any audio and video recordings to their liking.

In 1990–2000, in addition to desktop personal computers, “laptop” PCs were released in the form of a portable suitcase and even smaller pocket “palmtops” (handhelds) - as their name suggests, they fit in your pocket and on the palm of your hand. Laptops are equipped with a liquid crystal display screen located in the hinged lid, and for palmtops - on the front panel of the case.

In 1998–2000, miniature solid-state “flash memory” (without moving parts) was created. Thus, Memory Stick memory has the dimensions and weight of a piece of chewing gum, and SD memory from Panasonic has the size and weight of a postage stamp. Meanwhile, the volume of their memory, which can be stored indefinitely, is 64–128 MB and even 2–8 GB or more.

In addition to portable personal computers, supercomputers are being created to solve complex problems in science and technology - weather and earthquake forecasts, rocket and aircraft calculations, nuclear reactions, deciphering the human genetic code. They use from several to several dozen microprocessors that perform parallel calculations. The first supercomputer was developed by Seymour Cray in 1976.

In 2002, the NEC Earth Simulator supercomputer was built in Japan, performing 35.6 trillion operations per second. At the time, it was the fastest supercomputer in the world.


Fig. 30. Seymour Cray


Fig. 31. Supercomputer Cray-1


Fig. 32. Supercomputer Cray-2

In 2005, IBM developed the Blue Gene supercomputer with a performance of over 30 trillion operations per second. It contains 12,000 processors and has a thousand times more power than the famous Deep Blue, with which world champion Garry Kasparov played chess in 1997. IBM and researchers from the Swiss Polytechnic Institute in Lausanne have attempted to model the human brain for the first time.

In 2006, personal computers turned 25 years old. Let's see how they have changed over the years. The first of them, equipped with an Intel microprocessor, operated with a clock frequency of only 4.77 MHz and had 16 KB of RAM. Modern PCs equipped with a Pentium 4 microprocessor, created in 2001, have a clock frequency of 3–4 GHz, RAM 512 MB - 1 GB and long-term memory (hard drive) with a capacity of tens and hundreds of GB and even 1 terabyte. Such gigantic progress has not been observed in any branch of technology except digital computing. If the same progress had been made in increasing the speed of aircraft, then they would have been flying at the speed of light long ago.

Millions of computers are used in almost all sectors of the economy, industry, science, technology, pedagogy, and medicine.

The main reasons for this progress are the unusually high rates of microminiaturization of digital electronics devices and programming advances that have made the “communication” of ordinary users with personal computers simple and convenient.