A brief history of the development of computer technology and the generations of computers. Early counting devices


The need for devices to speed up counting arose thousands of years ago. At first, simple aids such as counting sticks were used. Later came the abacus, the ancestor of the familiar counting frame, which allowed only the simplest arithmetic operations to be performed. A lot has changed since then: almost every home now has a computer, and almost every pocket a smartphone. All of this falls under the general name of computing, or computer technology. In this article you will learn a little more about the history of its development.

1623 Wilhelm Schickard thinks: “Why don’t I invent the first adding machine?” And he invents it: a mechanical device capable of performing the basic arithmetic operations (addition, subtraction, multiplication and division) that works by means of gears and cylinders.

1703 Gottfried Wilhelm Leibniz describes the binary number system in his treatise “Explication de l’Arithmétique Binaire” (“Explanation of Binary Arithmetic”). Computers are much simpler to implement in binary, and Leibniz himself understood this: back in 1679 he had sketched a design for a binary calculating machine. In practice, however, the first such device appeared only in the middle of the 20th century.
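
As a small aside (my own illustration, not part of the original article), here is a minimal Python sketch of the idea behind binary notation: any number can be written using only the digits 0 and 1, as a sum of powers of two.

    # Decompose a few numbers into powers of two, the way binary notation does.
    for n in [5, 13, 42]:
        bits = bin(n)[2:]                      # e.g. 13 -> '1101'
        terms = [2 ** i for i, b in enumerate(reversed(bits)) if b == '1']
        print(n, '=', bits, '=', ' + '.join(str(t) for t in sorted(terms, reverse=True)))
    # Output includes: 13 = 1101 = 8 + 4 + 1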

1804 Punched cards appear for the first time; their use continued into the 1970s. They are sheets of thin cardboard with holes punched in certain places, and information was encoded by the sequences of these holes.

1820 Charles Xavier Thomas (yes, almost like Professor X) releases the Thomas Adding Machine, which goes down in history as the first mass-produced counting device.

1835 Charles Babbage describes his Analytical Engine. Initially the purpose of the device was to calculate logarithmic tables with high accuracy, but Babbage later changed his mind: his dream was now a general-purpose machine. At the time, building such a device was quite possible, but working with Babbage proved difficult because of his character, and as a result of disagreements the project was closed.

1845 Israel Staffel creates the first ever device capable of extracting square roots from numbers.

1905 Percy Ludgate publishes a design for a programmable mechanical computer.

1936 Konrad Zuse decides to create his own computer. He calls it Z1.

1941 Konrad Zuse releases the Z3, the world's first software-controlled computer. Subsequently, several dozen more Z series devices were released.

1961 Launch of ANITA Mark VII, the world's first fully electronic calculator.

A few words about computer generations.

1st generation. These are so-called tube computers. They work using vacuum tubes. The first such device was created in the middle of the 20th century.

2nd generation. Everyone was using 1st-generation computers until, in 1947, Walter Brattain and John Bardeen invented a very important thing - the transistor. This is how the second generation of computers appeared: they consumed much less energy and were more productive. These devices were common in the 1950s and 1960s, until the integrated circuit was invented in 1958.

3rd generation. The operation of these computers was based on integrated circuits, each of which combines many transistors on a single chip (early circuits held dozens to hundreds of elements; modern ones hold billions). The creation of the third generation did not, however, stop the production of second-generation computers.

4th generation. In 1969, Ted Hoff came up with the idea of replacing many integrated circuits with one small device, later called a microprocessor. This made it possible to create very small microcomputers. The first such device was released by Intel. By the 1980s, microprocessors and microcomputers had become the most common kind of computing device, and we still use them today.

This was a brief history of the development of computing technology. I hope it piqued your interest. Goodbye!

As soon as people grasped the concept of “quantity”, they began looking for tools to simplify and speed up counting. Today, super-powerful computers based on the principles of mathematical calculation process, store and transmit information - the most important resource and engine of human progress. It is not difficult to get an idea of how computer technology developed by briefly considering the main stages of this process.

The main stages of the development of computer technology

The most common classification divides the development of computer technology into chronological stages:

  • Manual stage. It began at the dawn of the human era and continued until the middle of the 17th century. During this period the basics of counting emerged. Later, with the formation of positional number systems, devices appeared (the abacus, the counting frame, and later the slide rule) that made digit-by-digit calculation possible.
  • Mechanical stage. It began in the middle of the 17th century and lasted almost until the end of the 19th century. The level of science during this period made it possible to create mechanical devices that perform the basic arithmetic operations and automatically carry over to the higher digits.
  • Electromechanical stage. This is the shortest stage in the history of computer technology, lasting only about 60 years: from the invention of the first tabulator in 1887 until 1946, when the first electronic computer (ENIAC) appeared. The new machines, based on electric drives and electric relays, made it possible to perform calculations with much greater speed and accuracy, but the counting process still had to be controlled by a person.
  • Electronic stage. It began in the second half of the last century and continues today. This is the story of six generations of electronic computers, from the very first giant machines built on vacuum tubes to the ultra-powerful modern supercomputers with huge numbers of processors working in parallel, capable of executing many instructions simultaneously.

The division into these stages by chronology is rather arbitrary: while some types of computers were still in use, the prerequisites for the next were already being created.

The very first counting devices

The earliest counting tool known to the history of the development of computer technology is the ten fingers on human hands. Counting results were initially recorded using fingers, notches on wood and stone, special sticks, and knots.

With the advent of writing, various ways of writing numbers appeared and developed, and positional number systems were invented (decimal in India, sexagesimal in Babylon).

Around the 4th century BC, the ancient Greeks began to count using an abacus. Initially it was a flat clay tablet with lines scratched into it with a sharp object. Counting was carried out by placing small stones or other small objects on these lines in a certain order.

In China, in the 4th century AD, a seven-bead abacus appeared: the suanpan. Nine or more wires or cords were stretched across a rectangular wooden frame, and another wire, stretched perpendicular to the rest, divided the suanpan into two unequal parts. In the larger compartment, called “earth”, five beads were strung on each wire; in the smaller one, called “heaven”, there were two. Each wire corresponded to one decimal place.
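
To make the idea concrete, here is a small Python sketch (mine, not from the original text) of how a number is read off such an abacus, assuming each “heaven” bead moved toward the beam counts as five and each “earth” bead as one.

    def rod_value(heaven_moved, earth_moved):
        # One rod represents a single decimal digit.
        return 5 * heaven_moved + earth_moved

    # Rods listed from the highest decimal place to the lowest:
    rods = [(0, 3), (1, 2)]          # digits 3 and 7
    number = 0
    for heaven, earth in rods:
        number = number * 10 + rod_value(heaven, earth)
    print(number)                    # 37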

The traditional soroban abacus became popular in Japan in the 16th century, having arrived there from China. Around the same time, the counting frame (schoty) appeared in Russia.

In the 17th century, building on the logarithms discovered by the Scottish mathematician John Napier, the Englishman Edmund Gunter invented the slide rule. This device was continually improved and has survived to this day. It allows numbers to be multiplied and divided, raised to powers, and logarithms and trigonometric functions to be determined.
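
The trick behind the slide rule is the identity log(a·b) = log a + log b: multiplying two numbers reduces to adding lengths proportional to their logarithms. A minimal Python illustration of the principle (my own, not part of the original text):

    import math

    a, b = 2.0, 8.0
    # Adding the logarithms and taking the antilog reproduces the product,
    # which is exactly what sliding the two logarithmic scales does.
    product = 10 ** (math.log10(a) + math.log10(b))
    print(product)   # 16.0 (up to floating-point rounding)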

The slide rule became a device that completed the development of computer technology at the manual (pre-mechanical) stage.

The first mechanical calculating devices

In 1623, the German scientist Wilhelm Schickard created the first mechanical "calculator", which he called a counting clock. The mechanism of this device resembled an ordinary clock, consisting of gears and sprockets. However, this invention became known only in the middle of the last century.

A quantum leap in the field of computing technology was the invention of the Pascalina adding machine in 1642. Its creator, French mathematician Blaise Pascal, began work on this device when he was not even 20 years old. "Pascalina" was a mechanical device in the form of a box with a large number of interconnected gears. The numbers that needed to be added were entered into the machine by turning special wheels.

In 1673, the German mathematician and philosopher Gottfried von Leibniz invented a machine that performed the four basic arithmetic operations and could extract square roots. Leibniz also described the binary number system, which much later became the foundation of digital computers.

In 1818, the Frenchman Charles Xavier Thomas de Colmar, taking Leibniz's ideas as a basis, invented an adding machine that could multiply and divide. Two years later, the Englishman Charles Babbage began constructing a machine capable of performing calculations to an accuracy of 20 decimal places. That project remained unfinished, but in 1830 its author designed another: an analytical engine for accurate scientific and technical calculations. The machine was to be controlled by a program, and punched cards with different arrangements of holes were to be used to input and output information. Babbage's project anticipated electronic computing and the problems it could solve.

It is noteworthy that the fame of the world's first programmer belongs to a woman: Lady Ada Lovelace (née Byron). It was she who wrote the first programs for Babbage's machine. One of the programming languages was later named after her.

Development of the first computer analogues

In 1887, the history of computer technology entered a new stage. The American engineer Herman Hollerith designed the first electromechanical calculating machine: the tabulator. Its mechanism used relays, counters and a special sorting box. The device read and sorted statistical records punched onto cards. The company founded by Hollerith later became the backbone of the world-famous computing giant IBM.

In 1930, the American Vannevar Bush created the differential analyzer. It was powered by electricity, and vacuum tubes were used to store data. This machine was capable of quickly finding solutions to complex mathematical problems.

Six years later, the English scientist Alan Turing developed the concept of a machine that became the theoretical basis of modern computers. It had the key property of modern computing technology: it could carry out, step by step, operations programmed in its internal memory.

A year after that, George Stibitz, a scientist from the United States, built the country's first electromechanical device capable of performing binary addition. Its operation was based on Boolean algebra, the mathematical logic created in the mid-19th century by George Boole using the logical operators AND, OR and NOT. The binary adder would later become an integral part of the digital computer.
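
As an illustration of the principle (my own sketch, not Stibitz's circuit), one bit of binary addition can be expressed with nothing but the Boolean operators just mentioned; XOR is built out of AND, OR and NOT.

    def xor(a, b):
        # XOR expressed through OR, AND and NOT.
        return (a or b) and not (a and b)

    def full_adder(a, b, carry_in):
        # Adds three bits and returns (sum bit, carry-out bit).
        s = xor(xor(a, b), carry_in)
        carry_out = (a and b) or (carry_in and xor(a, b))
        return int(bool(s)), int(bool(carry_out))

    print(full_adder(1, 1, 0))   # (0, 1): 1 + 1 = 10 in binary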

In 1938, Claude Shannon, then at the Massachusetts Institute of Technology, outlined the principles of the logical design of a computer that uses electrical circuits to solve Boolean algebra problems.

The beginning of the computer era

The governments of the countries involved in World War II were aware of the strategic role of computing in the conduct of military operations. This was the impetus for the development and parallel emergence of the first generation of computers in these countries.

A pioneer in the field of computer engineering was Konrad Zuse, a German engineer. In 1941, he created the first computer controlled by a program. The machine, called the Z3, was built on telephone relays, and programs for it were encoded on perforated tape. This device was able to work in the binary system, as well as operate with floating point numbers.

The next model of Zuse's machine, the Z4, is officially recognized as the first truly working programmable computer. Zuse also went down in history as the creator of the first high-level programming language, called Plankalkül.

In 1942, American researchers John Atanasoff (Atanasoff) and Clifford Berry created a computing device that ran on vacuum tubes. The machine also used binary code and could perform a number of logical operations.

In 1943, in an English government laboratory, in an atmosphere of secrecy, the first computer called “Colossus” was built. Instead of electromechanical relays, it used about 2,000 vacuum tubes for storing and processing information. It was intended to break the encrypted messages produced by the German Lorenz cipher machine used by the Wehrmacht high command. The existence of this device was kept in the strictest confidence for a long time; after the end of the war, the order for its destruction was signed personally by Winston Churchill.

Architecture development

In 1945, the Hungarian-American mathematician John von Neumann (János Lajos Neumann) described the prototype of the architecture of modern computers. He proposed storing the program itself, in the form of code, directly in the machine's memory, i.e. keeping programs and data together in the same memory.
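
A toy Python sketch (mine, not a description of any historical machine) of this stored-program idea: instructions sit in memory alongside data, and the machine repeatedly fetches the next instruction and executes it.

    memory = [            # the program is just data in memory
        ("LOAD", 7),      # put 7 into the accumulator
        ("ADD", 5),       # add 5 to it
        ("PRINT", None),  # output the accumulator
        ("HALT", None),
    ]
    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]   # fetch
        pc += 1
        if op == "LOAD":       # decode and execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            print(acc)         # prints 12
        elif op == "HALT":
            break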

These ideas grew out of work on ENIAC, the first general-purpose electronic computer, which was being built in the United States at the time. This giant weighed about 30 tons and occupied 170 square meters of floor space. The machine used 18,000 vacuum tubes and could perform 300 multiplications or 5,000 additions per second.

Europe's first universal programmable computer was created in 1950 in the Soviet Union (Ukraine). A group of Kyiv scientists led by Sergei Alekseevich Lebedev designed the small electronic calculating machine (MESM). Its speed was 50 operations per second, and it contained about 6,000 vacuum tubes.

In 1952, domestic computer technology was replenished with BESM, a large electronic calculating machine, also developed under the leadership of Lebedev. This computer, which performed up to 10 thousand operations per second, was at that time the fastest in Europe. Information was entered into the machine's memory using punched paper tape, and data was output via photo printing.

During the same period, a series of large computers under the general name “Strela” was produced in the USSR (the lead developer was Yuri Yakovlevich Bazilevsky). From 1954, serial production of the universal computer “Ural” began in Penza under the leadership of Bashir Rameev. Later models were hardware- and software-compatible with one another and offered a wide choice of peripheral devices, allowing machines of various configurations to be assembled.

Transistors. Release of the first serial computers

However, vacuum tubes failed very often, which made working with these machines difficult. The transistor, invented in 1947, solved this problem. Using the electrical properties of semiconductors, it performed the same tasks as vacuum tubes while occupying far less space and consuming far less energy. Together with the advent of ferrite cores for organizing computer memory, the use of transistors made it possible to significantly reduce the size of machines and make them more reliable and faster.

In 1954, the American company Texas Instruments began mass-producing transistors, and two years later the first second-generation computer built on transistors, the TX-0, appeared in Massachusetts.

In the middle of the last century, a significant part of government organizations and large companies used computers for scientific, financial, engineering calculations, and working with large amounts of data. Gradually, computers acquired features familiar to us today. During this period, plotters, printers, and storage media on magnetic disks and tape appeared.

The active use of computer technology expanded its areas of application and required new software technologies. High-level programming languages appeared (Fortran, Cobol and others) that made it possible to transfer programs from one machine to another and simplified the writing of code, along with translator programs that convert code in these languages into commands the machine can execute directly.

The emergence of integrated circuits

In 1958-1960, thanks to the American engineers Robert Noyce and Jack Kilby, the world learned of the existence of integrated circuits. Miniature transistors and other components, sometimes hundreds or thousands of them, were mounted on a silicon or germanium crystal. The chips, just over a centimeter in size, were much faster than discrete transistor circuits and consumed much less power. The history of computer technology links their appearance with the emergence of the third generation of computers.

In 1964, IBM released the first computers of the System/360 family, based on integrated circuits. The era of mass production of computers can be counted from this time; in total, more than 20,000 machines of this family were produced.

In 1972, production of the ES (Unified System) computers began in the USSR. These were standardized complexes for computer centers with a common command system, and the American IBM System/360 was taken as the basis.

Somewhat earlier, in 1965, DEC had released the PDP-8 minicomputer, the first commercially successful project in this area. The relatively low cost of minicomputers made it possible for small organizations to use them.

During the same period, software was constantly improving. Operating systems were developed to support the maximum number of external devices, and new programs appeared. In 1964, BASIC was developed, a language designed specifically for teaching novice programmers. Five years later, Pascal appeared, which turned out to be very convenient for solving many applied problems.

Personal computers

After 1970, production of the fourth generation of computers began. The development of computer technology in this period is characterized by the introduction of large-scale integrated circuits. Such machines could now perform billions of operations per second, and their RAM capacity grew to 500 million bits. The falling cost of microcomputers meant that, gradually, ordinary people could afford to buy them.

Apple was one of the first manufacturers of personal computers. Its founders, Steve Jobs and Steve Wozniak, designed their first PC model in 1976 and named it the Apple I; it sold for $666.66. A year later the company presented its next model, the Apple II.

For the first time, the computer resembled a household appliance: besides its compact size, it had an elegant design and a user-friendly interface. The spread of personal computers at the end of the 1970s caused demand for mainframe computers to fall markedly, which seriously worried their main manufacturer, IBM, and in 1979 the company decided to enter the personal computer market itself.

Two years later, the company's first open-architecture microcomputer appeared, based on Intel's 16-bit 8088 microprocessor. The computer was equipped with a monochrome display, two drives for five-inch floppy disks, and 64 kilobytes of RAM. At IBM's request, Microsoft developed an operating system specifically for this machine. Numerous IBM PC clones soon appeared on the market, stimulating the growth of industrial production of personal computers.

In 1984, Apple developed and released a new computer - the Macintosh. Its operating system was extremely user-friendly: it presented commands in the form of graphic images and allowed them to be entered using a mouse. This made the computer even more accessible, since now no special skills were required from the user.

Some sources date the fifth generation of computing technology to 1992-2013. Briefly, its main concept is as follows: these are computers built on highly complex microprocessors with a parallel-vector structure, which makes it possible to execute dozens of program instructions simultaneously. Machines with several hundred processors working in parallel allow data to be processed ever more quickly and accurately and efficient networks to be built.

The development of modern computer technology already allows us to speak of sixth-generation computers: electronic and optoelectronic machines running on tens of thousands of microprocessors, characterized by massive parallelism and modelled on the architecture of biological neural systems, which allows them to successfully recognize complex images.

Looking back over all the stages of the development of computer technology, an interesting fact stands out: inventions that proved themselves at each stage have survived to this day and continue to be used successfully.

Classes of computers

There are various options for classifying computers.

By purpose, computers are divided into:

  • universal computers - capable of solving a wide variety of mathematical, economic, engineering, scientific and other problems;
  • problem-oriented computers - solving a narrower range of problems, usually associated with managing particular processes (recording data, accumulating and processing small amounts of information, performing calculations according to simple algorithms); they have more limited software and hardware resources than the first group;
  • specialized computers - usually solving strictly defined tasks; they have a highly specialized structure and, despite the relative simplicity of their design and control, are quite reliable and productive in their field. These include, for example, controllers and adapters that manage various devices, as well as programmable microprocessors.

Based on size and computing capacity, modern electronic computers are divided into:

  • ultra-large (supercomputers);
  • large computers;
  • small computers;
  • ultra-small (microcomputers).

Thus, we have seen that the devices people invented, first to keep track of resources and valuables and later to carry out complex calculations quickly and accurately, have constantly developed and improved.

Technical means for implementing information processes

The history of the development of computing technology falls into several periods: mechanical, electromechanical and electronic.

To carry out calculations in Ancient Babylon (around 3,000 BC), and later in Ancient Greece and Ancient Rome (4th century BC), counting boards called abacuses were used. The abacus was a clay board with recesses into which pebbles were placed. Later the recesses were replaced by wires with beads strung on them (the prototype of the counting frame).

In the 17th century in Europe, mathematicians (W. Schickard in 1623, Blaise Pascal in 1642, G. Leibniz in 1671) invented mechanical machines capable of automatically performing arithmetic operations (the prototype of the adding machine).

In the first third of the 19th century, the English mathematician C. Babbage developed a design for a programmable automatic mechanical computing device, known as Babbage's Analytical Engine. Countess Ada Augusta Lovelace, who supported the project, wrote programs for this “analytical engine”.

In 1888, H. Hollerith created an electromechanical machine consisting of a punch, a punched-card sorter, and an adding machine, called a tabulator. It was first used in the USA to process census results.

The speed of calculation in mechanical and electromechanical machines was limited, and so in the 1930s the development of electronic computers began, whose element base was the three-electrode vacuum tube.

In 1946, an electronic computer called ENIAC was built at the University of Pennsylvania (USA). The machine weighed 30 tons, occupied an area of 200 square meters and contained 18,000 vacuum tubes. Programming was carried out by setting switches and plugging in connectors, so even the simplest program took a very long time to create and run. The difficulties of programming ENIAC prompted John von Neumann, a consultant on the project, to develop new principles for constructing computer architecture.

In the USSR, work on the first computer began in 1948.

The history of computer development is usually considered by generation.

First generation (1946-1960): the period in which the von Neumann machine architecture took shape. These machines were built on vacuum tubes, with speeds of 10-20 thousand operations per second. First-generation computers were bulky and unreliable, and software was limited to machine languages.

In 1950, the MESM (small electronic calculating machine) was put into operation in the USSR, and two years later the large electronic calculating machine (10 thousand operations per second) appeared.

Second generation (1960-1964): machines built on transistors, with speeds up to hundreds of thousands of operations per second. Magnetic drums were used for external memory and magnetic cores for main memory. In the same period, high-level algorithmic languages such as Algol, Cobol and Fortran were developed, making it possible to write programs without regard to the particular machine. The first computer with the distinctive features of the second generation was the IBM 704.

Third generation (1964-1970): characterized by the use of integrated circuits (ICs) and semiconductor memory in place of discrete transistors.

Most third-generation machines belonged, by their characteristics, to the System/360 series (family) released by IBM in the mid-1960s (the ES computers were its analogue). The machines in this series had a single architecture and were software-compatible.

At this time, the first Soviet supercomputer, the BESM-6, appeared, with a performance of 1 million operations per second.

Fourth generation (1970-1980): machines built on large-scale integrated circuits (LSI). Such circuits contain up to several tens of thousands of elements per chip. Computers of this generation perform tens and hundreds of millions of operations per second.

In 1971, the world's first four-bit microprocessor, the Intel 4004, containing 2,300 transistors on a chip, appeared, followed a year later by the eight-bit Intel 8008. The creation of microprocessors laid the basis for the personal computer (PC), i.e. a device that performs the same functions as a large computer but is designed for use by a single user.

In 1973, Xerox created the first prototype of a personal computer.

In 1974, the first commercially distributed personal computer, the Altair-8800, appeared; at the end of 1975, Paul Allen and Bill Gates wrote a BASIC interpreter for it.

In August 1981, IBM released the IBM PC, built around the then-new 16-bit Intel 8088 microprocessor. The PC was designed in accordance with the principles of open architecture, so users could upgrade their computers themselves and equip them with additional devices from various manufacturers. Within a year or two, the IBM PC took a leading position in the market, displacing 8-bit computer models.

Today there are many types of computers, which are classified according to their element base, principles of operation, cost, size, performance, purpose and areas of application.

Supercomputers and mainframes are used for complex scientific calculations and for processing large flows of information in large enterprises. As a rule, they serve as the central computers of corporate computer networks.

Mini- and microcomputers are used to build control systems for large and medium-sized enterprises.

Personal computers are intended for the end user. PCs, in turn, are divided into desktop, portable (notebook) and pocket (palmtop) models.


History of the creation and development of computer technology

    In computer technology there is a particular periodization of the development of electronic computers: a computer is assigned to one generation or another depending on the type of its main elements or on the technology of their manufacture. Clearly, the boundaries between generations are very blurred in time, since computers of different types were actually produced at the same time; for an individual machine, however, the question of which generation it belongs to is resolved quite simply.

    Even in ancient cultures, people had to solve problems involving trade calculations, reckoning time, determining the area of land, and so on. The growing volume of these calculations even led to specially trained people, skilled in arithmetic techniques, being invited from one country to another. So sooner or later devices had to appear that would make everyday calculation easier. Thus, in Ancient Greece and Ancient Rome counting devices called abacuses were created; the abacus is also called the Roman abacus. These abacuses were boards of bone, stone or bronze with grooves called strips. Counters were placed in the grooves, and counting was carried out by moving them.

    In the countries of the Ancient East there were Chinese abacuses, with five and two beads on each cord or wire; counting was done in ones and fives. In Russia, the Russian counting frame, which appeared in the 16th century, was used for arithmetic calculations, and in places it can still be found today.

    The development of counting devices kept pace with the achievements of mathematics. Soon after the discovery of logarithms, in 1623, the slide rule was invented by the English mathematician Edmund Gunter. The slide rule was destined to have a long life: from the 17th century to the present day.

    However, neither the abacus, nor the counting frame, nor the slide rule meant mechanization of the calculation process. In the 17th century, the outstanding French scientist Blaise Pascal invented a fundamentally new calculating device: the arithmetic machine. Pascal based its operation on the well-known idea of performing calculations with metal gear wheels. In 1645 he built the first adding machine, and in 1660-1680, almost simultaneously, a calculating machine performing all four arithmetic operations was designed by the great German mathematician Gottfried Leibniz.

    The calculating machines of Pascal and Leibniz became the prototype of the arithmometer. The first arithmometer for the four arithmetic operations to find practical application was built only a hundred years later, in 1790, by the German watchmaker Hahn. The design of the arithmometer was subsequently improved by many mechanics from England, France, Italy, Russia and Switzerland. Arithmometers were used for complex calculations in the design and construction of ships, bridges and buildings, and in financial transactions. But their productivity remained low, and the automation of calculation became an urgent demand of the time.

    In 1833, the English scientist Charles Babbage, who was engaged in compiling navigation tables, developed a design for an “analytical engine”. According to his plan, this machine was to become a giant program-controlled calculating machine, including arithmetic and storage units. His machine became the prototype of future computers, but it relied on far-from-perfect components: for example, it used gear wheels to remember the digits of decimal numbers. Babbage failed to implement his project because the technology of the time was not sufficiently developed, and the “analytical engine” was forgotten for a while.

    Only about 100 years later did Babbage's machine attract the attention of engineers. At the end of the 1930s, the German engineer Konrad Zuse developed the first binary digital machine, the Z1. It made extensive use of electromechanical relays, that is, mechanical switches actuated by electric current. In 1941, Zuse created the Z3 machine, which was completely controlled by software.

    In 1944, the American Howard Aiken, at one of the IBM enterprises, built the Mark 1, a powerful machine for those times. This machine used mechanical elements - counting wheels - to represent numbers, and electromechanical relays were used for control.

    Computer generations

    It is convenient to describe the history of the development of computers using the idea of ​​generations of computers. Each generation of computer is characterized by design features and capabilities. Let's begin to describe each of the generations, but we must remember that the division of computers into generations is conditional, since machines of different levels were produced at the same time.

    First generation

    A sharp leap in the development of computing technology came in the 1940s, after the Second World War, with the advent of qualitatively new electronic devices: vacuum tubes, which worked much faster than circuits based on electromechanical relays. Relay machines were quickly replaced by more productive and reliable electronic computers. The use of computers significantly expanded the range of problems being solved; tasks that previously had simply not been posed became tractable: calculations of engineering structures, of the motion of the planets, ballistic calculations, and so on.

    The first such computer was created in 1943-1946 in the USA and was called ENIAC. It contained about 18,000 vacuum tubes and many electromechanical relays, and about 2,000 tubes failed every month. ENIAC, like other early computers, had a serious drawback: the executable program was not stored in the machine's memory but was set up in a laborious way using external jumpers.

    In 1945, the famous mathematician and physicist-theorist von Neumann formulated the general principles of operation of universal computing devices. According to von Neumann, the computer was supposed to be controlled by a program with sequential execution of commands, and the program itself was to be stored in the machine’s memory. The first computer with a stored program was built in England in 1949.

    In 1951, the MESM was created in the USSR; this work was carried out in Kyiv at the Institute of Electrodynamics under the leadership of S. A. Lebedev, the foremost designer of Soviet computer technology.

    Computers were constantly improved, and by the mid-1950s their performance had risen from several hundred to several tens of thousands of operations per second. However, the vacuum tube remained the least reliable element of the computer, and the use of tubes began to hold back the further progress of computing technology.

    Subsequently, semiconductor devices replaced vacuum tubes, completing the first stage of computer development. Computers of this stage are usually called first-generation computers.

    Indeed, first-generation computers were located in large computer rooms, consumed a lot of electricity and required cooling with powerful fans. Programs for these computers had to be written in machine code, and this could only be done by specialists who knew the details of the computer structure.

    Second generation

    Computer developers have always followed the progress in electronic technology. When semiconductor devices replaced vacuum tubes in the mid-50s, the conversion of computers to semiconductors began.

    Semiconductor devices (transistors, diodes) were, firstly, much more compact than their tube predecessors. Secondly, they had a significantly longer service life. Thirdly, the energy consumption of semiconductor computers was significantly lower. With the introduction of digital elements on semiconductor devices, the creation of second-generation computers began.

    Thanks to the use of a more advanced element base, relatively small computers began to be created, and a natural division of computers into large, medium and small took place.

    In the USSR, the “Hrazdan” and “Nairi” series of small computers were developed and widely used. The Mir machine, developed in 1965 at the Institute of Cybernetics of the Academy of Sciences of the Ukrainian SSR, was unique in its architecture. It was intended for engineering calculations that were performed on a computer by the user himself without the help of an operator.

    Medium computers included the domestic machines of the Ural, M-20 and Minsk series. But the record-holder among domestic machines of this generation, and one of the best in the world, was the BESM-6 (“large electronic calculating machine”, model 6), created by the team of Academician S. A. Lebedev. The performance of the BESM-6, at more than 1 million operations per second, was two to three orders of magnitude higher than that of small and medium-sized computers. Abroad, the most common second-generation machines were the Elliott (England), Siemens (Germany) and Stretch (USA).

    Third generation

    The next change in computer generations occurred at the end of the 60s when semiconductor devices in computer devices were replaced with integrated circuits. An integrated circuit (microcircuit) is a small wafer of silicon crystal on which hundreds and thousands of elements are placed: diodes, transistors, capacitors, resistors, etc.

    The use of integrated circuits made it possible to increase the number of electronic elements in a computer without increasing its physical size. Computer speed rose to 10 million operations per second. In addition, writing computer programs became feasible for ordinary users, not only for electronics specialists.

    In the third generation, large series of computers appeared, differing in performance and purpose. These include the family of large and medium-sized IBM 360/370 machines developed in the USA. In the Soviet Union and the CMEA countries similar series were created: the ES computers (Unified System of Computers, large and medium machines), the SM computers (System of Small Computers) and “Elektronika” (a microcomputer system).

    The very first computing devices were human fingers. When this means was not enough, pebbles, sticks and shells were used. By grouping them into tens and then hundreds, people learned to count and to measure quantities. It was with pebbles and shells that the history of computing technology began. By arranging them in different columns (places) and adding or removing the required number of pebbles, it was possible to add and subtract large numbers, and by repeated addition even to perform such a complex operation as multiplication.

    Then begins the history of counting devices proper. The first such device was the counting frame, invented in Rus'. In it, numbers were divided into tens by horizontal guides with beads strung on them. It became an indispensable assistant to traders, officials, clerks and managers, who learned to use the counting frame with real mastery. Later this useful device also made its way into Europe.

    The very first mechanical counting device known to the history of computing technology was the calculating machine built by the outstanding French scientist Blaise Pascal in 1642. His mechanical “computer” could perform addition and subtraction. The machine was called the “Pascaline” and consisted of an assembly of vertically mounted wheels carrying the digits 0 to 9. When a wheel completed a full revolution, it engaged the adjacent wheel and turned it by one digit. The number of wheels determined the number of digits of the machine: with five wheels it could already handle numbers up to 99999.
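
    The carry principle just described can be sketched in a few lines of Python (my own illustration, not a model of the actual mechanism): when a wheel passes 9 it returns to 0 and advances the next wheel by one.

        def add_digit(wheels, digit, position=0):
            # wheels[0] is the units wheel; each wheel holds a value 0-9.
            # 'digit' is a single decimal digit added at the given position.
            wheels[position] += digit
            while wheels[position] > 9:          # the wheel has gone past 9
                wheels[position] -= 10           # it returns to the 0-9 range
                position += 1
                wheels[position] += 1            # and nudges the next wheel
            return wheels

        print(add_digit([9, 9, 0, 0, 0], 1))     # [0, 0, 1, 0, 0], i.e. 100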

    Then, in 1673, the German mathematician Leibniz created a device that could not only add and subtract but also multiply and divide. Unlike the Pascaline, its wheels were toothed and had teeth of nine different lengths, which made such “complex” actions as multiplication and division possible. The history of computing technology knows many names, but one of them is familiar even to non-specialists: the English mathematician Charles Babbage, who is deservedly called the father of all modern computing technology. It was he who came up with the idea that a computer needs a device that will store numbers, and that this device must not only store numbers but also issue commands telling the computer what to do with them.

    Babbage's idea formed the basis for the design and development of all modern computers; in a modern machine, such a unit is the processor working together with memory. However, the scientist left no drawings or descriptions of the machine he invented. This was done by one of his students in an article written in French. The article was read by Countess Ada Augusta Lovelace, daughter of the famous poet George Byron, who translated it into English and wrote her own programs for the machine. Thanks to her contribution, the Ada programming language was later named in her honour.

    The 20th century gave a new impetus to the development of computing technology, this time associated with electricity. An electronic device that stored electrical signals was invented: the vacuum-tube trigger (flip-flop). The first computers built with it could count thousands of times faster than the most advanced mechanical calculating machines, but they were still very bulky: the first computers weighed about 30 tons and occupied rooms of more than 100 square meters. Further progress came with an extremely important invention, the transistor, and modern computing technology is unthinkable without the microprocessor, a complex integrated circuit first developed in 1971. This has been a brief history of the development of computing technology. Modern achievements of science and technology have raised today's computers to unprecedented heights.