Generations of Computers

A high degree of integration increases the packaging density of electronic equipment and improves its reliability, which in turn leads to higher speed.

Introduction

1. First generation of computers 1950-1960s

2. Second generation of computers: 1960-1970s

3. Third generation of computers: 1970-1980s

4. Fourth generation of computers: 1980-1990s

5. Fifth generation of computers: 1990-present

Conclusion

Introduction

Since 1950, roughly every 7-10 years the design-technological and software-algorithmic principles of building and using computers have been radically renewed. In this sense it is legitimate to speak of generations of computers. Conventionally, each generation spans about ten years.

Computers have come a long evolutionary way in terms of the element base (from lamps to microprocessors) as well as in the sense of the emergence of new capabilities, expanding the scope and nature of their use.

The division of computers into generations is a very conditional, loose classification of computing systems according to the degree of development of hardware and software, as well as methods of communication with the computer.

The first generation of computers includes machines created around the turn of the 1950s: vacuum tubes were used in their circuits. There were few commands, the controls were simple, and RAM capacity and performance were low: roughly 10-20 thousand operations per second. Printing devices, magnetic tapes, punched cards and punched paper tapes were used for input and output.

The second generation of computers includes machines designed in 1955-65. They used both vacuum tubes and transistors, and RAM was built on magnetic cores. At this time magnetic drums and the first magnetic disks appeared, as did so-called high-level languages, whose means allow the entire sequence of calculations to be described in a visual, easily understandable form. A large set of library programs appeared for solving various mathematical problems. Second-generation machines suffered from software incompatibility, which made it difficult to organize large information systems, so in the mid-60s there was a transition to computers that were software compatible and built on a microelectronic technological base.

Third generation of computers. These are machines created from the late 1960s onward that share a single architecture, i.e. are software compatible. Multiprogramming capabilities appeared, i.e. the simultaneous execution of several programs. Third-generation computers used integrated circuits.

Fourth generation of computers. This is the current generation of computers developed after 1970. 4th generation machines were designed to effectively use modern high-level languages ​​and simplify the programming process for the end user.

In terms of hardware, they are characterized by the use of large integrated circuits as the element base and by high-speed random-access storage with a capacity of several megabytes.

Fourth-generation machines are multi-processor and multi-machine complexes that share external memory and a common set of external devices. Performance reaches tens of millions of operations per second, and memory capacity several million words.

The transition to the fifth generation of computers has already begun. It consists in a qualitative transition from data processing to knowledge processing and in increasing the basic parameters of a computer. The main emphasis will be on “intelligence”.

To date, the actual "intelligence" demonstrated by the most complex neural networks is below the level of an earthworm, however, no matter how limited the capabilities of neural networks are today, many revolutionary discoveries may be just around the corner.

1. First generation of computers 1950-1960s

Logic circuits were created using discrete radio components and electronic vacuum tubes with a filament. Random access memory devices used magnetic drums, acoustic ultrasonic mercury and electromagnetic delay lines, and cathode ray tubes (CRTs). Drives on magnetic tapes, punched cards, punched tapes and plug-in switches were used as external storage devices.

The programming of this generation of computers was carried out in the binary number system in machine language, that is, the programs were strictly focused on a specific model of the machine and “died” along with these models.

In the mid-1950s, machine-oriented languages such as symbolic coding languages (SCLs) appeared, which made it possible to write commands and addresses in abbreviated verbal (letter) form and decimal numbers instead of binary notation. In 1957 the first high-level programming language for mathematical problems, Fortran, was created, followed in 1958 by the universal programming language Algol.

Computers, starting from UNIVAC and ending with BESM-2 and the first models of the Minsk and Ural computers, belong to the first generation of computers.

2. Second generation of computers: 1960-1970s

Logic circuits were built on discrete semiconductor and magnetic elements (diodes, bipolar transistors, toroidal ferrite microtransformers). Printed circuit boards (made of foil-clad getinax, a paper-based laminate) served as the design and technological basis. The block principle of machine design came into wide use, allowing a large number of different external devices to be connected to the main units and providing greater flexibility in the use of computers. Clock frequencies of electronic circuits increased to hundreds of kilohertz.

External drives on hard magnetic disks began to be used as an intermediate level of memory between magnetic tape drives and RAM.

In 1964, the first computer monitor appeared: the IBM 2250. It was a monochrome display with a 12 x 12 inch screen, a resolution of 1024 x 1024 pixels, and a 40 Hz frame rate.

Control systems created on the basis of computers demanded higher performance from computers, and most importantly, reliability. Error detection and correction codes and built-in control circuits have become widely used in computers.

In second-generation machines, batch processing and teleprocessing modes of information were implemented for the first time.

The first computer to partially use semiconductor devices instead of vacuum tubes was the SEAC (Standards Eastern Automatic Computer), created in 1950.

In the early 60s, semiconductor machines began to be produced in the USSR.

3. Third generation of computers: 1970-1980s

In 1959, Robert Noyce invented the small silicon integrated circuit, which could house dozens of transistors in a small area. These circuits later became known as Small Scale Integration (SSI) circuits. By the late 60s, integrated circuits were already being used in computers.

The logic circuits of 3rd-generation computers were already built entirely on small integrated circuits. Clock frequencies of electronic circuits increased to several megahertz, while supply voltages (a few volts) and machine power consumption decreased. The reliability and speed of computers increased significantly.

Random access memories used smaller ferrite cores, ferrite plates, and magnetic films with a rectangular hysteresis loop. Disk drives have become widely used as external storage devices.

Two more levels of storage devices appeared: scratchpad memory on trigger (flip-flop) registers, with very high speed but small capacity (tens of words), and high-speed cache memory.

Since the widespread adoption of integrated circuits in computers, technological progress in computing can be tracked using the well-known Moore's Law: in 1965, Gordon Moore, one of the founders of Intel, observed that the number of transistors on a single chip doubles roughly every 1.5 years.
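As a rough illustration of this doubling rule (not part of the original text; the 1965 baseline transistor count below is an arbitrary assumption chosen for the example), the growth can be projected with a few lines of Python:

```python
def transistors(year, base_year=1965, base_count=64, period=1.5):
    """Estimate transistors per chip, doubling every `period` years.

    base_count=64 for 1965 is an illustrative assumption, not a sourced figure.
    """
    return base_count * 2 ** ((year - base_year) / period)

# Each 1.5-year step doubles the count: 1971 is four doublings after 1965,
# so the estimate grows by a factor of 16 over those six years.
for year in (1965, 1971, 1980):
    print(year, round(transistors(year)))
```

Whatever baseline is chosen, the exponential form means the count multiplies a thousandfold roughly every 15 years, which is why the element base changed so dramatically between generations.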

Due to the significant complexity of both the hardware and logical structure of 3rd generation computers, they often began to be called systems.

Thus, the first computers of this generation were models of IBM systems (a number of IBM 360 models) and PDP (PDP 1). In the Soviet Union, in collaboration with the countries of the Council for Mutual Economic Assistance (Poland, Hungary, Bulgaria, East Germany, etc.), models of the Unified System (EU) and the system of small computers (SM) began to be produced.

In third-generation computers, significant attention was paid to reducing the complexity of programming, increasing the efficiency of program execution, and improving communication between the operator and the machine. This was ensured by powerful operating systems, advanced programming automation, efficient program-interruption systems, time-sharing and real-time operating modes, multiprogram operating modes, and new interactive communication modes. An effective video terminal for communication between the operator and the machine also appeared: the video monitor, or display.

Much attention was paid to increasing the reliability and serviceability of computers and to facilitating their maintenance. Reliability was ensured by the widespread use of codes with automatic error detection and correction (Hamming codes and cyclic codes).
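To illustrate the principle behind the Hamming codes mentioned above (a minimal sketch in Python, not code from any machine of that era), here is a Hamming(7,4) encoder with single-bit error correction: three parity bits cover overlapping subsets of bit positions, and the parity-check "syndrome" directly names the position of a flipped bit:

```python
def encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword (positions 1..7)."""
    c = [0, 0, d[0], 0, d[1], d[2], d[3]]   # parity slots at positions 1, 2, 4
    c[0] = c[2] ^ c[4] ^ c[6]               # parity over positions 3, 5, 7
    c[1] = c[2] ^ c[5] ^ c[6]               # parity over positions 3, 6, 7
    c[3] = c[4] ^ c[5] ^ c[6]               # parity over positions 5, 6, 7
    return c

def correct(c):
    """Recompute the parity checks; a nonzero syndrome is the 1-based
    position of the single flipped bit, which is then repaired."""
    c = c[:]
    syndrome = ((c[0] ^ c[2] ^ c[4] ^ c[6])
                + 2 * (c[1] ^ c[2] ^ c[5] ^ c[6])
                + 4 * (c[3] ^ c[4] ^ c[5] ^ c[6]))
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

word = encode([1, 0, 1, 1])
word[4] ^= 1                                # flip one bit "in transit"
assert correct(word) == encode([1, 0, 1, 1])
```

Flipping any one of the seven bits yields a nonzero syndrome equal to that bit's position, so the hardware can repair it automatically; this is the detection-and-correction property the text attributes to third-generation machines.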

The modular organization of computers and the modular construction of their operating systems have created ample opportunities for changing the configuration of computer systems. In this regard, a new concept of “architecture” of a computing system has emerged, which defines the logical organization of this system from the point of view of the user and programmer.

4. Fourth generation of computers: 1980-1990s

A revolutionary event that closed the era of third-generation machines was the creation of large and very large scale integrated circuits (Large Scale Integration, LSI, and Very Large Scale Integration, VLSI), the microprocessor (1971), and the personal computer. Since 1980, almost all computers have been built on the basis of microprocessors, and the personal computer has become the most widespread type of computer.

Logic integrated circuits in these computers began to be built on unipolar field-effect CMOS transistors with direct connections, which operate at smaller voltage amplitudes (a few volts) and consume less power than bipolar ones, thereby enabling more advanced fabrication processes (at the time, on the scale of a few microns).

The first Apple personal computer was created in April 1976 by two friends: Steve Jobs (b. 1955), an Atari employee, and Stephen Wozniak (b. 1950), who worked at Hewlett-Packard. Working evenings in a garage, they built a simple computer around an inexpensive 8-bit microprocessor, programmable in BASIC, and it was a wild success. In early 1977 the Apple company was registered, and serial production of Apple personal computers began.

5. Fifth generation of computers: 1990-present

Features of the architecture of the modern generation of computers are discussed in detail in this course.

Briefly, the basic concept of a fifth-generation computer can be formulated as follows:

1. Computers on ultra-complex microprocessors with a parallel-vector structure, simultaneously executing dozens of sequential program instructions.

2. Computers with many hundreds of parallel working processors, allowing the construction of data and knowledge processing systems, efficient network computer systems.

Sixth and subsequent generations of computers

Electronic and optoelectronic computers with massive parallelism, neural structure, with a distributed network of a large number (tens of thousands) of microprocessors modeling the architecture of neural biological systems.

Conclusion

All stages of computer development are conventionally divided into generations.

The first generation was created on the basis of vacuum tubes; the machine was controlled from an operator console and punched cards using machine codes. These computers occupied several large metal cabinets filling entire rooms.

The second generation appeared in the 1960s. Computer elements were built on semiconductor transistors. These machines processed information under the control of programs written in assembly language. Data and programs were entered from punched cards and punched tapes.

The third generation was built on microcircuits containing hundreds or thousands of transistors on a single wafer. An example of a third-generation machine is the ES EVM (Unified System computer). These machines were controlled from alphanumeric terminals, using high-level languages and assembly language. Data and programs were entered both from the terminal and from punched cards and punched tapes.

The fourth generation was created on the basis of large-scale integrated circuits (LSI). Its most prominent representatives are personal computers (PCs): a universal single-user microcomputer is called a personal computer. Communication with the user was carried out through a color graphic display, using high-level languages.

The fifth generation is based on ultra-large-scale integrated circuits (VLSI), which are distinguished by the colossal density of logic elements on the chip.

It is assumed that in the future, input of information into a computer from voice, communication with a machine in natural language, computer vision, machine touch, the creation of intelligent robots and robotic devices will become widespread.

Comparison of computer generations

Characteristic | First | Second | Third | Fourth
Period of time | 1946 - 1959 | 1960 - 1969 | 1970 - 1979 | since 1980
Element base (CU, ALU) | Vacuum tubes | Semiconductors (transistors) | Integrated circuits | Large-scale integrated circuits (LSI)
Main type of computer | Large | Large | Small (mini) | Micro
Basic input devices | Operator console, punched cards, punched tape | Alphanumeric display and keyboard added | Alphanumeric display, keyboard | Color graphic display, scanner, keyboard
Main output devices | Alphanumeric printing device (ADP), punched-tape output | — | — | Plotter, printer
External memory | Magnetic tapes, drums, punched tapes, punched cards | Magnetic disk added | Punched paper tapes, magnetic disk | Magnetic and optical disks
Key software solutions | Universal programming languages, translators | Batch operating systems, optimizing translators | Interactive operating systems, structured programming languages | User-friendly software, network operating systems
Computer operating mode | Single-program | Batch | Time sharing | Personal work and network processing
Purpose of use | Scientific and technical calculations | Technical and economic calculations | Management and economic calculations | Telecommunications, information services

Table - Main characteristics of computers of various generations


Generation | 1 | 2 | 3 | 4
Period, years | 1946-1960 | 1955-1970 | 1965-1980 | 1980-present
Element base | Vacuum tubes | Semiconductor diodes and transistors | Integrated circuits | Very large-scale integrated circuits
Architecture | Von Neumann architecture | Multiprogram mode | Local computer networks, shared computing systems | Multiprocessor systems, personal computers, global networks
Performance | 10-20 thousand op/s | 100-500 thousand op/s | About 1 million op/s | Tens and hundreds of millions of op/s
Software | Machine languages | Operating systems, algorithmic languages | Operating systems, dialog systems, computer graphics systems | Application packages, databases and knowledge bases, browsers
External devices | Punched-tape and punched-card input devices | Alphanumeric printers, teleprinters, magnetic tape and magnetic drum drives | Video terminals, hard disk drives | Floppy disk drives, modems, scanners, laser printers
Application | Computational problems | Engineering, scientific and economic tasks | Automated control systems, CAD, scientific and technical tasks | Management tasks, communications, workstations, text processing, multimedia
Examples | ENIAC, UNIVAC (USA); BESM-1, BESM-2, M-1, M-20 (USSR) | IBM 701/709 (USA); BESM-4, M-220, Minsk, BESM-6 (USSR) | IBM 360/370, PDP-11/20, Cray-1 (USA); ES-1050, ES-1066, Elbrus-1, Elbrus-2 (USSR) | Cray T3E, SGI (USA); PCs, servers, workstations from various manufacturers

Over the course of 50 years, several generations of computers have appeared, each replacing the previous one. The rapid development of computing technology throughout the world has been driven both by advances in the element base and by architectural solutions.
Since a computer is a system consisting of hardware and software, it is natural to understand a generation as the set of computer models characterized by the same technological and software solutions (element base, logical architecture, software). Meanwhile, in a number of cases it proves very difficult to classify machines by generation, because the line between generations becomes more and more blurred.
First generation.
Element base: electron tubes and relays; RAM was implemented on flip-flops, later on ferrite cores. Reliability was low and a cooling system was required; the computers were physically large. Performance: 5-30 thousand arithmetic op/s. Programming: in machine codes; autocodes and assemblers appeared later. Programming was done by a narrow circle of mathematicians, physicists, and electronics engineers. First-generation computers were used mainly for scientific and technical calculations.

Second generation.
Semiconductor element base. Reliability and performance increased significantly, while dimensions and power consumption decreased. Input/output facilities and external memory developed. A number of progressive architectural solutions appeared, along with further development of programming technology: the time-sharing mode and the multiprogramming mode (overlapping the work of the central processor for data processing with the input/output channels, and parallelizing the fetching of commands and data from memory).
Within the second generation, the differentiation of computers into small, medium and large began to appear clearly. The scope of computer applications expanded significantly to planning and economic problems, production process control, and so on.
Automated control systems (ACS) were created for enterprises, entire industries and technological processes. The end of the 1950s is marked by the emergence of a number of problem-oriented high-level programming languages (HLLs): FORTRAN, ALGOL-60 and others. Software development advanced with the creation of libraries of standard programs in various programming languages and for various purposes, as well as monitors and dispatchers for controlling computer operating modes and scheduling resources, which laid the foundations for the operating-system concepts of the next generation.

Third generation.
Element base on integrated circuits (ICs). Series of computer models appeared that were software compatible from the bottom up and had increasing capabilities from model to model. The logical architecture of computers and their peripheral equipment became more complex, which significantly expanded functionality and computing capabilities. Operating systems (OS) became part of the computer. Many tasks of managing memory, input/output devices and other resources were taken over by the OS or directly by the computer hardware. Software became more powerful: database management systems (DBMS) and computer-aided design (CAD) systems for various purposes appeared, and automated control systems and process control systems were improved. Much attention was paid to creating application program packages (APPs) for various purposes.
Languages and programming systems continued to develop. Examples: the IBM/360 series of models (USA, serial production since 1964); the ES EVM series (USSR and CMEA countries, since 1972).
Fourth generation.
The element base became large-scale (LSI) and very-large-scale (VLSI) integrated circuits. Computers were now designed for the efficient use of particular software environments (for example, UNIX-like computers, best immersed in the UNIX software environment, and Prolog machines oriented toward artificial-intelligence tasks). Telecommunications-based information processing developed rapidly thanks to improved communication channels and satellite links. National and transnational information and computer networks were created, which makes it possible to speak of the beginning of the computerization of human society as a whole.
Further intellectualization of computer technology is determined by the creation of more developed human-computer interfaces, knowledge bases, expert systems, parallel programming systems, etc.
The element base has made it possible to achieve great success in miniaturization, increasing the reliability and performance of computers. Micro- and mini-computers have appeared, surpassing the capabilities of medium-sized and large computers of the previous generation at a significantly lower cost. The production technology of VLSI-based processors accelerated the pace of computer production and made it possible to introduce computers to the broad masses of society. With the advent of a universal processor on a single chip (microprocessor Intel-4004, 1971), the era of the PC began.
The first PC can be considered the Altair-8800, created by Ed Roberts in 1974-75 on the basis of the Intel 8080. P. Allen and W. Gates wrote a translator for the popular BASIC language, significantly increasing the usefulness of the first PC (they later founded the famous Microsoft Inc.). The face of the 4th generation is largely determined by the creation of supercomputers characterized by high performance (average speed of 50-130 megaflops; 1 megaflop = 1 million floating-point operations per second) and non-traditional architecture (parallelization based on pipelined processing of commands). Supercomputers are used in solving problems of mathematical physics, cosmology and astronomy, modeling complex systems, etc. Since powerful computers play and will continue to play an important switching role in networks, network issues are often discussed together with supercomputer topics. Among domestic developments, the Elbrus series machines and the PS-2000 and PS-3000 computing systems, containing up to 64 processors controlled by a common command stream, can be named; performance on a number of tasks reached on the order of 200 megaflops. At the same time, given the complexity of developing and implementing modern supercomputer projects, which require intensive fundamental research in computer science, advanced electronic technologies, high production standards and serious financial outlays, it seems unlikely that domestic supercomputers matching the main characteristics of the best foreign models will be created in the foreseeable future.
It should be noted that with the transition to IC technology for computer production, the defining emphasis of generations increasingly shifted from the element base to other indicators: logical architecture, software, user interface, application areas, etc.
Fifth generation.

Third generation of computers

The rapidly developing aviation, space technology and other fields of science and technology required miniature, reliable and fast computing devices. Therefore, the further development of electronic computing technology required the development of new technology, and such technology was not slow to appear. New breakthroughs in performance, reliability and miniaturization were made possible by integrated circuit technology, which marked the transition to the third generation of computers created from 1964 to 1974.

The use of integrated circuits has provided a number of advantages:

1. Computer reliability has increased. The reliability of integrated circuits is an order of magnitude higher than the reliability of similar circuits using discrete components. The increase in reliability is primarily due to the reduction of inter-circuit connections, which are one of the weakest links in the design of a computer. Increased reliability, in turn, led to a significant reduction in the cost of computer operation.

2. By increasing the packing density of electronic circuits, the time of signal transmission along conductors has decreased and, as a result, the speed of the computer has increased.

3. The production of integrated circuits lends itself well to automation, which, in mass production, sharply reduces production costs and contributes to the popularization and expansion of the scope of computer applications.

4. The high packing density of electronic circuits has reduced the dimensions, weight and power consumption of computers by several orders of magnitude, which has made it possible to use them in previously inaccessible areas of science and technology, such as aviation and space technology.

Despite the obvious advantages of integrated circuit technology, widespread use of integrated circuits in computers began in practice only 12 years after the concept was published in 1952 by Geoffrey Dummer of the British Ministry of Defence. Dummer, however, only expressed the idea of creating electronic elements as a single block of semiconductor layers of the same material; he did not indicate how, in practice, to place several elements in a single monolith. In 1956 Dummer tried to turn his ideas into reality, but the devices he developed proved inoperable.

Jack Kilby from Texas Instruments and Robert Noyce from the small company Fairchild Semiconductor managed to implement the ideas outlined in practice.


In May 1958, Jack Kilby took a job at Texas Instruments, where he began developing transistors, capacitors and resistors (he had previously worked at Centralab, making transistor-based hearing aids). One day the team Kilby worked in was tasked with exploring options for alternative micromodules. Various options were proposed, and Kilby, pondering the problem, concluded that it would be most profitable for the company to produce only semiconductor elements: resistors and capacitors could be made from the same material as the active elements and placed with them in a single monolithic block. While mulling over this idea, Kilby worked out the topology of a multivibrator circuit. Thus, on July 24, 1958, the idea of a practical implementation of the integrated circuit was born.

After outlining his ideas to his superiors, Kilby was tasked with creating a prototype to prove the validity of his reasoning. A trigger (flip-flop) circuit was then built from discrete germanium elements, and on August 28, 1958, Kilby demonstrated the mock-up to Willis Adcock.

After approval from his superiors, Kilby began creating a true monolithic integrated circuit - a phase-shift oscillator.

In parallel with Jack Kilby, Robert Noyce was developing an integrated circuit. Robert really didn't like the technology of producing discrete elements. He said that the labor-intensive process of cutting a silicon wafer into individual elements and then connecting them into a single circuit seemed rather pointless. Noyce proposed isolating individual transistors in a crystal from each other with reverse-biased p-n junctions, and covering the surface with an insulating oxide. Contact between individual elements was carried out through areas etched in the insulating oxide according to a special pattern on the surface of the microcircuit. These sections were connected to each other by thin aluminum lines.

Kilby created his chip and applied for a patent a little earlier than Noyce, however, Noyce’s technology was more thoughtful and convenient, and the application documents were prepared more carefully. As a result, Noyce received a patent for the invention earlier - in April 1961, and Kilby - only in June 1964.

The numerous trials that followed and the war for the right to be considered the inventor of the technology ended in peace. Ultimately, the Court of Appeal upheld Noyce's claim to technological primacy, but ruled that Kilby was credited with creating the first working microcircuit.

Serial production of integrated circuits began in 1961. In the same year, Texas Instruments, commissioned by the US Air Force, created the first experimental computer based on integrated circuits. Development took nine months and was completed in 1961. The computer had only 15 instructions in a single-address format; the clock frequency was 100 kHz, the storage capacity was only 30 numbers, numbers were represented with 11 binary digits, power consumption was only 16 W, the weight was 585 g, and the occupied volume was 100 cubic centimeters.

The first integrated circuits were low-density, but over time the technology for their production was fine-tuned and the density increased. Third-generation computers used low- and medium-density integrated circuits, which made it possible to combine hundreds of elements in one chip. Such microcircuits could be used as separate operational circuits - registers, decoders, counters, etc.

The advent of integrated circuits made it possible to improve the block structure of second-generation computers. The tightly coupled control unit (CU) and arithmetic-logic unit (ALU) were combined into a single unit, which became known as the processor. Moreover, a processor could contain several arithmetic-logic units, each performing its own function: for example, one ALU handling integers, another floating-point numbers, and a third addresses. There could also be several control units: one central, and several peripheral ones used to control individual computer blocks.

Often computers consisted of several processors, which made it possible to make full use of the new prospects in parallel problem solving.

In third-generation computers, the memory hierarchy is already clearly distinguished. RAM is divided into independent blocks with their own control systems, operating in parallel. The structure of RAM is divided into pages and segments. The internal memory of the processor is also developing - prerequisites are being created for the introduction of memory caching.
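The caching idea mentioned above can be sketched with a toy model (a hypothetical Python illustration, not tied to any real machine): a small, fast memory keeps recently used addresses, repeated accesses "hit" it instead of the slower RAM, and conflicting addresses evict one another:

```python
class DirectMappedCache:
    """Toy direct-mapped cache: each address maps to exactly one line."""

    def __init__(self, n_lines=8):
        self.lines = [None] * n_lines   # each line remembers the cached address
        self.hits = self.misses = 0

    def access(self, addr):
        line = addr % len(self.lines)   # the one line this address may occupy
        if self.lines[line] == addr:
            self.hits += 1              # found in fast memory
        else:
            self.misses += 1            # fetch from "RAM", evict old contents
            self.lines[line] = addr

cache = DirectMappedCache()
# Address 8 collides with address 0 (8 % 8 == 0), so it evicts it.
for addr in [0, 1, 2, 0, 1, 2, 8, 0]:
    cache.access(addr)
print(cache.hits, cache.misses)         # → 3 5
```

Even this crude model shows why caching pays off: the repeated accesses to addresses 0, 1 and 2 are served from the fast memory, while the collision between 0 and 8 illustrates eviction, the cost that more sophisticated cache organizations try to reduce.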

External storage devices (ESD) are connected through a special selector channel controller (SCC). Their capacity and speed increase significantly. So in June 1973, the IBM 3340 hard drive was released as an external storage device.

The drive was sealed - this protected the working surfaces of the disks from dust and dirt, which made it possible to place the heads very close to the magnetic surface of the disk. For the first time, the principle of an aerodynamic magnetic head was applied, which literally hovered above the rotating surface of the hard drive under the influence of aerodynamic force.

All this made it possible to significantly increase the recording density (up to 1.7 Mbit per square inch) and increase the capacity to 30 MB (on non-removable media). The drive also had removable media with a capacity of 30 MB.

Along with the improvement of logical devices and memory, the modernization of input/output devices was in full swing. The speed of new computers required a faster and more reliable data input/output system than punched card readers and teletypes. They were replaced by keyboards, graphic input panels, light pen displays, plasma panels, raster graphics systems and other devices.

A wide variety of peripheral devices, their relatively high speed, and the need to separate I/O operations from the computing process led to the creation of a specialized multiplex channel controller (MCC), which allowed processors to work in parallel with data I/O.

A generalized block diagram of a third generation computer, illustrating the above, is shown in the diagram below.

On the diagram:

UVV – input-output device;
RAM – one or more random access memory devices;
ALU – one or more arithmetic-logic units;
CU – one or more control devices;
MK – multiplex channel controller (channel for connecting slow devices);
SK – selector channel controller (channel for connecting high-speed devices);
ESD – external storage device.

The use of integrated technologies significantly reduced the cost of computers, which immediately led to an increase in demand. Many organizations purchased computers and successfully operated them. An important factor was the drive toward standardization and the release of entire series of upward software-compatible computers.

There is a huge need for application software products, and since the software market has not yet developed, and it is almost impossible to find ready-made, reliable and cheap software, there is a gigantic increase in the popularity of programming and the demand for competent software developers. Each enterprise strives to organize its own staff of programmers; specialized teams arise that develop software and strive to occupy a piece of an as yet untapped niche in the arena of rapidly growing computer technology.

The software market is developing rapidly: software packages for solving standard problems, problem-oriented programming languages, and entire software systems for managing the operation of computers — later to be called operating systems — are being created.

The first operating systems began to appear back in the days of second-generation computers. Thus in 1957 Bell Labs developed the BESYS (Bell Operating System), and in 1962 General Electric developed GCOS (General Comprehensive Operating System), designed to run on mainframes. But these were all just precursors of truly popular and in-demand operating systems. By the end of the 1960s a number of operating systems had been created, implementing many of the functions needed to manage a computer; in total, more than a hundred different operating systems were in use.

Among the most developed operating systems were:

OS/360, developed by IBM in 1964 to manage mainframe computers;

MULTICS – one of the first time-sharing operating systems;

UNIX, developed in 1969 and subsequently grown into a whole family of operating systems, many of which are among the most popular today.

The use of operating systems simplified working with computers and contributed to the popularization of electronic computing technology.

Against the background of a significant increase in interest in electronic computing in the USA, Europe, Japan and other countries, progress in this field slowed in the USSR. Thus in 1969 the Soviet Union entered into an agreement on cooperation in the development of a Unified Computer System, modeled on one of the best computers of the time, the IBM 360. The USSR's orientation toward foreign achievements subsequently led to a significant lag in the field of computer technology.

Among the third generation computers, the most significant developments were:

IBM System/360 – a whole family of computers, production of which began in 1964. All models of the family had a single instruction set and differed from each other in RAM capacity and performance; they were universal, equally suited to complex logical problems and to economic calculations. This versatility is reflected in the name: 360 stands for 360 degrees, i.e. the ability to work in any direction. Development of System/360 cost about $5 billion — roughly twice what the United States spent during World War II on the Manhattan Project to create the atomic bomb; the project was second in cost only to the Apollo program. The System/360 architecture proved extremely successful and largely determined the direction of development of computing technology;

PDP-8 – a minicomputer introduced on March 22, 1965 by Digital Equipment Corporation (DEC). The term "mini" is relative: this computer was approximately the size of a refrigerator, but compared to other computers of the day it was truly miniature. The project was commercially very successful: in total about 50,000 units were sold. The PDP-8 spawned many similar designs — clones — all over the world; in the USSR several analogues were developed: Elektronika-100, Saratov-2, etc.;

Nairi 3 – one of the first third-generation computers independently developed in the USSR, released in 1970 at the Yerevan Research Institute of Mathematical Machines. It used a simplified machine language to make programming easier, and some problems could be entered in mathematical notation;

ES Computer – a unified system of electronic computers, based on the successful and well-proven IBM System/360 architecture. The first machines of this series were created in the USSR in 1971. The performance of the first models ranged from 2,750 operations per second (ES-1010) to 350,000 operations per second (ES-1040). Subsequently performance was raised to several tens of millions of operations per second, but practically all these developments were halted in the 1990s after the collapse of the USSR;

ILLIAC 4 – one of the most productive third-generation computers. Created in 1972 at the University of Illinois, it had an array architecture of 64 processing elements. The computer was intended for solving systems of partial differential equations and had a speed of about 200 million operations per second.

This list can be continued, but it is clear that computers have already firmly and for a long time entered our lives, and their further development and improvement cannot be stopped. With the development of integrated circuit production technology, the density of the elements has gradually increased. Super-large integrated circuits began to appear, and third-generation computers, built on low- and medium-density integrated circuits, gradually began to be replaced by fourth-generation computers on large and super-large integrated circuits.


After the creation of the EDSAC model in England in 1949, a powerful impetus was given to the development of general purpose computers, which stimulated the emergence of computer models that made up the first generation in a number of countries. Over the course of more than 40 years of development of computer technology (CT), several generations of computers have appeared, replacing each other.

The first generation computers used vacuum tubes and relays as their elemental base; RAM was performed on flip-flops, later on ferrite cores; performance was, as a rule, in the range of 5-30 thousand arithmetic op/s; they were characterized by low reliability, required cooling systems and had significant dimensions. The programming process required considerable skill, good knowledge of computer architecture and its software capabilities. At the beginning of this stage, programming in computer codes (machine code) was used, then autocodes and assemblers appeared. As a rule, first-generation computers were used for scientific and technical calculations, and the programming process itself was more like an art, which was practiced by a very narrow circle of mathematicians, electrical engineers and physicists.

EDSAC computer, 1949

2nd generation computer

The creation of the first transistor in the USA, announced on July 1, 1948, did not at first foreshadow a new stage in the development of computer technology and was associated primarily with radio engineering. At first it was more a prototype of a new electronic device, requiring serious research and refinement. In 1951 William Shockley demonstrated the first reliable junction transistor. However, transistors were still quite expensive (up to $8 apiece), and only after the development of silicon technology did their price drop sharply, accelerating the miniaturization of electronics, which also affected computer technology.

It is generally accepted that the second generation begins with the RCA-501 computer, which appeared in 1959 in the USA and was built on a semiconductor element base. Meanwhile, back in 1955, an onboard transistor computer had been created for the ATLAS intercontinental ballistic missile. The new element technology made it possible to dramatically increase the reliability of computing equipment, reduce its dimensions and power consumption, and significantly raise performance. This allowed the creation of computers with greater logical capability and performance, which broadened the range of applications to problems of economic planning, production process control, etc. Within the second generation, the differentiation of computers into small, medium and large became ever clearer. The end of the 50s is marked by the beginning of programming automation, which led to the appearance of the programming languages Fortran (1957), Algol-60 and others.

3rd generation computer

The third generation is associated with the advent of computers whose elemental base was integrated circuits (ICs). In September 1958 Jack Kilby created the first IC, a thin germanium plate about 1 cm long. To demonstrate the capabilities of integrated technology, Texas Instruments created for the US Air Force an onboard computer containing 587 ICs, with a volume (40 cm³) 150 times smaller than a comparable computer of the old style. But Kilby's IC had a number of significant shortcomings, which were eliminated with the advent of Robert Noyce's planar ICs the following year. From that moment, IC technology began its triumphal march, capturing ever new sections of modern electronics and, first of all, computer technology.

The software that ensures the functioning of the computer in various operating modes is becoming significantly more powerful. Developed database management systems (DBMS) and computer-aided design (CAD) systems appear; much attention is paid to creating application program packages for various purposes. New programming languages and systems continue to appear, and existing ones develop.

4th generation computer

The design and technological basis of 4th-generation computer technology is large-scale (LSI) and very-large-scale (VLSI) integrated circuits, created in the 70s and 80s respectively. Such ICs contain tens or hundreds of thousands, even millions, of transistors on a single crystal (chip). LSI technology was already partially used in projects of the previous generation (IBM/360, ES Computer Series-2, etc.). The most important conceptual criterion separating 4th-generation computers from the 3rd generation is that the former were designed for efficient use by the end user, simplifying the programming process for the application programmer. In hardware terms, they are characterized by extensive use of IC technology and high-speed storage devices. The best-known series of fourth-generation computers is the IBM/370, which, unlike the equally well-known 3rd-generation IBM/360 series, has a richer instruction set and makes wider use of microprogramming. The older models of the 370 series implemented virtual memory, which creates for the user the appearance of unlimited RAM resources.

The phenomenon of the personal computer (PC) traces back to the creation in 1965 of the first minicomputer, the PDP-8, which emerged from the universalization of a specialized processor for controlling a nuclear reactor. The machine quickly gained popularity and became the first mass-produced computer of its class; by the early 70s the number of machines exceeded 100 thousand units. A further important step was the transition from mini- to microcomputers; this new structural level of computer technology began to take shape at the turn of the 70s, when the advent of LSI made it possible to put a universal processor on a single chip. The first microprocessor, the Intel 4004, was created in 1971 and contained 2,250 elements; the first general-purpose microprocessor, the Intel 8080, created in 1974, contained 4,500 elements, became the standard for microcomputer technology and served as the basis for the first PCs. In 1979 one of the most powerful and versatile 16-bit microprocessors, the Motorola 68000 with 70,000 elements, was released, and in 1981 Hewlett-Packard released the first 32-bit microprocessor, with 450 thousand elements.

PC Altair-8800

The first PC can be considered the Altair-8800, created on the basis of the Intel 8080 microprocessor in 1974 by Edward Roberts. The computer was sold by mail order, cost only $397, and was expandable with peripherals (it shipped with only 256 bytes of RAM!). For the Altair-8800, Paul Allen and Bill Gates wrote a translator for the popular Basic language, significantly increasing the intelligence of the first PC (they later founded the now-famous Microsoft). Equipping the PC with a color monitor led to a competing model, the Z-2; within a year of the first Altair-8800, more than 20 different companies and firms had joined PC production, and a PC industry began to take shape (PC manufacture, sales, periodical and non-periodical publications, exhibitions, conferences, etc.). Already in 1977 three PC models went into mass production: the Apple II (Apple Computer), TRS-80 (Tandy Radio Shack) and PET (Commodore); Apple, initially lagging in the competition, soon became the leader in PC production (its Apple II was a huge success). By 1980 Apple had entered Wall Street with the largest share capital and an annual income of $117 million.

But already in 1981, IBM, so as not to lose the mass market, began producing its now widely known IBM PC/XT/AT and PS/2 series, which opened a new era of personal computer technology. The entry of the giant IBM into the PC arena put PC production on an industrial basis, which made it possible to solve a number of issues important to the user (standardization, unification, developed software, etc.), to which the company had already paid great attention within the framework of the IBM/360 and IBM/370 series. It is fair to say that in the short period from the debut of the Altair-8800 to the IBM PC, more people joined computer technology than in the entire long period from Babbage's Analytical Engine to the invention of the first PCs.

The first computer to open the supercomputer class proper can be considered the Amdahl 470V/6 model, created in 1975 and compatible with the IBM series. The machine used an effective parallelization principle based on pipelined processing of commands, and its elemental base used LSI technology. Currently, the class of supercomputers includes models with an average speed of at least 20 megaflops (1 megaflops = 1 million floating-point operations per second). The first model with such performance was the largely unique ILLIAC-IV computer, created in 1975 in the USA and having a maximum speed of about 50 megaflops. This model had a huge impact on the subsequent development of supercomputers with matrix architecture. A bright page in the history of supercomputers is associated with the Cray series of S. Cray, the first model of which, Cray-1, was created in 1976 and had a peak speed of 130 megaflops. The architecture of the model was based on the pipelined principle of vector and scalar data processing with an elemental base on VLSI. It was this model that laid the foundation for the class of modern supercomputers. It should be noted that despite a number of interesting architectural solutions, the success of the model was achieved mainly through successful technological solutions. The subsequent models Cray-2, Cray X-MP, Cray-3 and Cray-4 brought the series' performance to about 10 thousand megaflops, and the Cray MP model, using a new architecture with 64 processors and an elemental base on new silicon chips, had a peak performance of about 50 gigaflops.

Concluding this excursion into the history of modern computer technology, with its varying detail on individual stages, several significant observations should be made. First of all, the transition from one generation of computers to another is becoming ever smoother: the ideas of the new generation mature, to one degree or another, and are even implemented within the previous generation. This was especially noticeable in the transition to IC technology, when the defining emphasis of a generation shifted from the element base to other indicators: logical architecture, software, user interface, application areas, etc. The most diverse computing equipment appears, whose characteristics do not fit into traditional classification frameworks; one gets the impression that we are at the beginning of a kind of universalization of computer technology, when all its classes strive to level out their computing capabilities. Many elements of the fifth generation are already, to one degree or another, characteristic of today's computing.

The development of computers is divided into several periods. Generations of computers of each period differ from each other in their elemental base and software.

First generation of computers

The first generation (1945-1958) of computers was built on vacuum tubes — diodes and triodes. Most first-generation machines were experimental devices, built to test one or another theoretical principle. The use of vacuum-tube technology and of memory systems based on mercury delay lines, magnetic drums and cathode-ray tubes (Williams tubes) made their operation very unreliable. In addition, such computers were heavy and occupied large areas, sometimes entire buildings. Punched tapes and punched cards, magnetic tapes and printing devices were used for data input and output.

The concept of the stored program was implemented. The software of 1st-generation computers consisted mainly of standard subroutines; their speed ranged from 10 to 20 thousand operations per second.

Machines of this generation — ENIAC (USA), MESM (USSR), BESM-1, M-1, M-2, M-3, "Strela", "Minsk-1", "Ural-1", "Ural-2", "Ural-3", M-20, "Setun", BESM-2, "Hrazdan", IBM 701 — consumed a great deal of electricity and contained a very large number of vacuum tubes. For example, the Strela machine contained 6,400 vacuum tubes and 60,000 semiconductor diodes. The performance of such machines did not exceed 2-3 thousand operations per second, and RAM did not exceed 2 KB. Only the M-2 machine (1958) had 4 KB of RAM and a speed of 20 thousand operations per second.

Second generation of computers

2nd-generation computers were developed in 1959-1967. The main element was no longer the vacuum tube but semiconductor diodes and transistors, while magnetic cores served as main memory and magnetic drums — distant ancestors of modern hard drives — as storage devices. Computers became more reliable, their performance increased, energy consumption fell, and the overall dimensions of the machines shrank.

With the advent of memory on magnetic cores, its operating cycle decreased to tens of microseconds. The main structural principle was centralization. High-performance devices for working with magnetic tapes, and memory devices on magnetic disks, appeared. In addition, it became possible to program in algorithmic languages: the first high-level languages — Fortran, Algol, Cobol — were developed. The performance of 2nd-generation machines reached 100 thousand to 5 million operations per second.

Examples of second-generation machines: BESM-6, BESM-4 and Minsk-22, designed for scientific, technical and economic planning problems; Minsk-32 (USSR); the M-40 and M-50 computers, for missile defense systems; and Ural-11, -14 and -16 — general-purpose computers oriented toward engineering and technical problems.

Third generation of computers

Third-generation computers (1968-1973) were built on integrated circuits. The development in the 60s of integrated circuits — entire devices and assemblies of tens and hundreds of transistors fabricated on a single semiconductor crystal (what are now called microcircuits) — led to the creation of 3rd-generation computers. At the same time semiconductor memory appeared, which is still used in personal computers as main memory. The use of integrated circuits greatly increased the capabilities of computers.

Now the central processor has the ability to work in parallel and control numerous peripheral devices. Computers could simultaneously process several programs (the principle of multiprogramming). As a result of the implementation of the multiprogramming principle, it became possible to work in time-sharing mode in an interactive mode. Users remote from the computer were given the opportunity, independently of each other, to quickly interact with the machine.
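The time-sharing mode described above amounts to giving each program a short slice of processor time in turn. A minimal round-robin sketch (job names and quantum are illustrative, not any historical system's parameters):

```python
# A minimal round-robin scheduler sketch: each "program" gets a fixed time
# slice (quantum) in turn, which is the essence of time-sharing. Jobs that
# still have work left rejoin the back of the queue.
from collections import deque

def round_robin(jobs, quantum):
    """jobs: dict of name -> remaining work units. Returns execution order."""
    queue = deque(jobs.items())
    order = []
    while queue:
        name, remaining = queue.popleft()
        order.append(name)                      # the job runs for one quantum
        remaining -= quantum
        if remaining > 0:
            queue.append((name, remaining))     # not finished: back of the queue
    return order

order = round_robin({"A": 3, "B": 2, "C": 1}, quantum=1)
print(order)  # ['A', 'B', 'C', 'A', 'B', 'A']
```

Because every job reappears at short intervals, each remote user sees the machine responding to them "independently" of the others, exactly as the paragraph above describes.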

Computers were designed on the basis of integrated circuits of low degree of integration (SSI, 10-100 components per chip) and medium degree of integration (MSI, 100-1000 components per chip). The idea arose, and was implemented, of designing a family of computers with a common architecture, based primarily on software compatibility. In the late 60s minicomputers appeared; in 1971, the first microprocessor. The speed of 3rd-generation computers reached about 1 million operations per second.

During these years, computer production acquired an industrial scale. Starting with the 3rd generation, the development of serial computers became traditional. Although machines of the same series differed greatly in capability and performance, they were information-, software- and hardware-compatible. The most common in those years was IBM's System/360 family. The CMEA countries produced the unified "ES Computer" series: ES-1022, ES-1030, ES-1033, ES-1046, ES-1061, ES-1066, etc. Computers of this generation also include the IBM-370, Elektronika-100/25, Elektronika-79, SM-3, SM-4, etc.

For computer series, the software was greatly expanded (operating systems, high-level programming languages, application programs, etc.). In 1969 the Unix operating system appeared, followed in the early 70s by the C programming language; both had a huge impact on the software world and still hold leading positions.

Fourth generation of computers

In fourth-generation computers (1974-1982), the use of large-scale integrated circuits (LSI — 1,000-100,000 components per chip) and very-large-scale integrated circuits (VLSI — 100,000-10,000,000 components per chip) increased performance to tens and hundreds of millions of operations per second.

The beginning of this generation is considered to be 1975, when Amdahl Corp. released six AMDAHL 470 V/6 computers, which used LSI as their elemental base. High-speed memory on integrated circuits began to be used — MOS RAM with a capacity of several megabytes. When the machine was turned off, the data in MOS RAM was preserved by automatic transfer to disk. On power-up, the system started via a boot program stored in ROM (read-only memory), which loaded the operating system and resident software into MOS RAM.

The development of 4th-generation computers went in two directions. The first was the creation of supercomputers — complexes of multiprocessor machines. The speed of such machines reaches several billion operations per second; they are capable of processing enormous volumes of information. These include the ILLIAC-IV, CRAY, CYBER, Elbrus-1 and Elbrus-2 complexes, etc. Multiprocessor computing complexes (MCC) Elbrus-2 were actively used in the Soviet Union in areas requiring large volumes of calculation, above all in the defense industry.

The second direction was the further development, on the basis of LSI and VLSI, of microcomputers and personal computers (PCs). The first representatives of these machines were computers from Apple, the IBM PC (XT, AT, PS/2), and the domestic "Iskra", "Elektronika", "Mazovia", "Agat", "ES-1840", "ES-1841", etc. Starting from this generation, electronic computing machines came to be called simply computers. The software was supplemented by databases and data banks.

Fifth generation of computers

The fifth generation computer is the computer of the future. The development program for the so-called fifth generation of computers was adopted in Japan in 1982. It was assumed that by 1991 fundamentally new computers would be created, focused on solving problems of artificial intelligence. With the help of the Prolog language and innovations in computer design, it was planned to come close to solving one of the main problems of this branch of computer science - the problem of storing and processing knowledge. In short, for fifth-generation computers there would be no need to write programs, but it would be enough to explain in “almost natural” language what is required of them.
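The "storing and processing knowledge" the project aimed at can be illustrated with a minimal forward-chaining inference sketch. This is Python rather than the Prolog the project actually used, and the facts and the single rule are invented for illustration; the point is only the mechanism of drawing logical conclusions from stored facts.

```python
# Minimal forward-chaining inference, in the spirit of logic programming:
# new facts are derived from stored facts and rules until nothing changes.
# Facts and the grandparent rule below are purely illustrative.

facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

def forward_chain(initial_facts):
    known = set(initial_facts)
    changed = True
    while changed:
        changed = False
        # rule: parent(X, Y) and parent(Y, Z)  =>  grandparent(X, Z)
        for (p1, x, y1) in list(known):
            for (p2, y2, z) in list(known):
                if p1 == p2 == "parent" and y1 == y2:
                    new_fact = ("grandparent", x, z)
                    if new_fact not in known:
                        known.add(new_fact)
                        changed = True
    return known

derived = forward_chain(facts)
print(("grandparent", "alice", "carol") in derived)  # True
```

In a real logic-programming system the rule would be stated declaratively (in Prolog, roughly `grandparent(X, Z) :- parent(X, Y), parent(Y, Z).`) and the engine, not the programmer, would decide how to evaluate it — which is the "no need to write programs" ambition described above.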

It is assumed that their elemental base will not be VLSI, but devices created on their basis with elements of artificial intelligence. To increase memory and speed, advances in optoelectronics and bioprocessors will be used.

For fifth-generation computers, completely different tasks are posed than in the development of all previous computers. Whereas the developers of computers of the 1st through 4th generations faced tasks such as increasing performance in numerical calculation and achieving large memory capacity, the main task of the developers of 5th-generation computers is the creation of machine artificial intelligence (the ability to draw logical conclusions from presented facts) — the "intellectualization" of computers, eliminating the barrier between man and computer.

Unfortunately, the Japanese fifth-generation computer project repeated the tragic fate of early research in the field of artificial intelligence. More than 50 billion yen of investment were wasted, the project was discontinued, and the developed devices turned out to be no higher in performance than mass-produced systems of that time. However, the research conducted during the project and the experience gained in knowledge representation and parallel inference methods have greatly helped progress in the field of artificial intelligence systems in general.

Already now, computers are able to perceive information from handwritten or printed text, from forms, from the human voice, recognize the user by voice, and translate from one language to another. This allows all users to communicate with computers, even those who do not have special knowledge in this area.

Many of the advances of artificial intelligence are used in industry and the business world. Expert systems and neural networks are effectively used for classification tasks (spam filtering, text categorization, etc.). Genetic algorithms serve people conscientiously (used, for example, to optimize portfolios in investment activity), as do robotics (in industry) and multi-agent systems. Other areas of artificial intelligence — for example, distributed knowledge representation and problem solving on the Internet — are not standing still either: thanks to them, a revolution in a number of areas of human activity can be expected in the next few years.

Computers at the present stage

The need for faster, cheaper and more versatile processors forces manufacturers to keep increasing the number of transistors in them. However, this process is not endless. The exponential growth in this number, predicted by Gordon Moore in 1965, is becoming increasingly difficult to sustain. Experts say that this law will cease to apply as soon as the gates of the transistors that regulate the flow of information in the chip become commensurate with the wavelength of the electron (in silicon, on which production is currently based, this is about 10 nanometers). And this will happen somewhere between 2010 and 2020. As computer architectures approach this physical limit, the cost of designing, manufacturing and testing chips grows. Thus the stage of evolutionary development will sooner or later give way to revolutionary changes.
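The "exponential growth" is concrete arithmetic: a fixed doubling period compounds rapidly. The sketch below uses the commonly quoted approximation of doubling every two years and the 2,250-element Intel 4004 figure given earlier in this text as the starting point; both numbers are inputs to an illustration, not measured data.

```python
# Moore's-law-style compounding: a count that doubles every fixed period.
# Starting count (2,250 elements, Intel 4004, 1971) is taken from the text;
# the 2-year doubling period is the commonly quoted approximation.

def transistors(start_count, start_year, year, doubling_period=2):
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Ten doublings in 20 years multiply the count by 1024:
print(round(transistors(2250, 1971, 1991)))  # 2304000
```

That thousandfold growth per two decades is what makes the trend "increasingly difficult to sustain" as physical limits approach.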

As a result of the race to increase productivity, many problems arise. The most acute of them is overheating in ultra-dense packaging, caused by a significantly smaller heat transfer area. The energy concentration in modern microprocessors is extremely high. Current strategies for dissipating the generated heat, such as reducing the supply voltage or selectively activating only the necessary parts in microcircuits, are ineffective unless active cooling is used.

As the size of transistors has decreased, the insulating layers have become thinner, which means their reliability has also decreased, since electrons can penetrate through thin insulators (tunnel effect). This problem can be solved by reducing the control voltage, but only to certain limits.

Today the main avenue for increasing processor performance is parallelism. A microprocessor processes a sequence of instructions (commands) that make up a program; if instructions are executed in parallel (that is, simultaneously), overall performance rises significantly. Parallelism is achieved through pipelining of calculations, superscalar architecture and branch prediction.

Multi-core architecture. This architecture integrates several simple microprocessor cores on a single chip, each executing its own stream of instructions. Each core is significantly simpler than a multi-threaded processor core, which makes chip design and testing easier. But at the same time the memory-access problem becomes more acute, and compilers have to be reworked.
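How much parallelism can actually buy is bounded by the serial fraction of a program — the standard formulation is Amdahl's law (fittingly, Gene Amdahl also appears earlier in this text). The formula is standard; the 95%/64-core numbers below are illustrative.

```python
# Amdahl's law: if a fraction p of a program can run in parallel on n cores,
# the overall speedup is capped at 1 / ((1 - p) + p / n).

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelized, 64 cores give well under 64x:
print(round(amdahl_speedup(0.95, 64), 1))  # 15.4
```

This is why the text stresses branch prediction, pipelining and compiler rework alongside core counts: shrinking the serial fraction matters as much as adding cores.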

Multi-threaded processor. These processors are similar in architecture to trace processors: the chip is divided into processing elements resembling a superscalar microprocessor. Unlike a trace processor, each element here processes instructions from different threads within one clock cycle, thereby achieving thread-level parallelism. Naturally, each thread has its own program counter and register set.

"Tile" architecture. Proponents believe that software should be compiled directly into the hardware, as this will provide maximum parallelism. This approach requires quite complex compilers, which have not yet been created. The processor in this case consists of many “tiles”, each of which has its own RAM and is connected to other “tiles” in a kind of lattice, the nodes of which can be turned on and off. The order in which instructions are executed is set by the software.

Multi-storey architecture. Here we are talking not about logical, but about physical structure. The idea is that the chips would contain vertical "stacks" of microcircuits made using thin-film transistor technology borrowed from TFT display manufacturing. In this case, relatively long horizontal interconnects are converted into short vertical ones, which reduces signal latency and increases processor performance. The idea of ​​“three-dimensional” chips has already been implemented in the form of working samples of eight-story memory chips. It is quite possible that it is also acceptable for microprocessors, and in the near future all microchips will be expanded not only horizontally, but also vertically.