Von Neumann's principles of program control: the advent of computers and the von Neumann architecture


Also known as the von Neumann model or Princeton architecture, it is based on a design described in 1945 by the mathematician and physicist John von Neumann in his "First Draft of a Report on the EDVAC".

Architecture diagram

Von Neumann's report described a design for an electronic digital computer whose parts include:

  • an arithmetic logic unit;
  • processor registers;
  • a control unit containing an instruction register and a program counter;
  • memory for storing data and instructions;
  • external mass storage;
  • input and output mechanisms.

A defining feature of the design is that an instruction fetch and a data operation cannot take place at the same time, because instructions and data share a common bus. Von Neumann discussed this constraint in the "First Draft", which records his thinking on what the architecture should be. The situation is now called the "von Neumann bottleneck", and it often limits system performance.

A stored-program computer keeps its program instructions, together with its data, in read-write random-access memory (RAM). The principles behind this are likewise set out in von Neumann's "First Draft". Stored-program computers were an advance over program-controlled machines such as ENIAC, which was programmed by setting switches and inserting patch cables to route data and control signals between functional units. The vast majority of modern computers use memory in the stored-program way. The von Neumann architecture differs from, for example, the Harvard architecture, which keeps instructions and data in separate memories; many modern processors combine the two, pairing a unified main memory with split instruction and data caches.

Background

The earliest computing machines had predetermined, fixed programs. Some very simple computers still use this design, either for simplicity or for educational purposes. A desktop calculator, for example, is a fixed-program computer: it can do basic arithmetic, but it cannot be turned into a word processor or a game console. Changing the program of a fixed-program machine requires rewiring, restructuring, or physically redesigning it. The earliest computers were not so much programmed as designed for a particular task. Reprogramming, where it was possible at all, was a laborious process that began with flowcharts and paper notes and ended with detailed engineering designs; physically rewiring and rebuilding the machine was especially arduous. It could take three weeks to set up a program on ENIAC and get it working.

New idea

With the introduction of computers that stored programs in memory, everything changed. A program held in memory is simply a set of instructions, which means the machine can immediately be given a sequence of commands and carry out a computation.

Such a design also allows for self-modifying code. One early motivation for it was the need to increment or otherwise modify the address portion of instructions, which in early designs had to be done by hand. This became less important once index registers and indirect addressing became standard features of von Neumann machines. Another use was to embed frequently used data in the instruction stream using immediate addressing. Self-modifying code has, however, largely fallen out of favor: it tends to be difficult to understand and debug, and it also defeats the pipelining and caching schemes of modern processors.
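The contrast between modifying an instruction's address portion by hand and using indirect addressing can be sketched in a toy form. Everything below is illustrative Python over a flat "memory" list, not any historical instruction set: with indirection, the program reads a pointer cell that is ordinary data, so the code itself never has to change.

```python
# Toy memory: one list holds both program state and data, in the spirit
# of a von Neumann machine. Summing an array via indirect addressing:
# the instruction consults a pointer cell, and only that data cell is
# updated between passes -- the "code" stays fixed.

def sum_with_indirection(memory, pointer_addr, length):
    """Sum `length` cells starting at the address stored in memory[pointer_addr]."""
    total = 0
    for _ in range(length):
        addr = memory[pointer_addr]   # indirect step: fetch the pointer
        total += memory[addr]         # then fetch the operand it names
        memory[pointer_addr] += 1     # advance the pointer (data, not code)
    return total

memory = [0] * 16
memory[8:12] = [10, 20, 30, 40]   # the data array in cells 8-11
memory[0] = 8                     # pointer cell holds the array's start address
print(sum_with_indirection(memory, 0, 4))   # -> 100
```

Without indirection, the equivalent loop would have to rewrite the address field of its own load instruction on every pass, which is exactly the self-modifying style the text describes.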

By and large, the ability to treat instructions as data is what makes assemblers, compilers, linkers, loaders, and the other tools of automated programming possible: programs that write programs, so to speak. On a smaller scale, repetitive I/O-intensive operations, such as the BitBlt image-manipulation primitive or the pixel and vertex shaders of modern 3D graphics, were found inefficient to run without custom hardware, and can instead be accelerated by generating code on the fly.

Development of the concept of a stored program

Alan Turing, a mathematician who became interested in mathematical logic after Max Newman's lectures at Cambridge University, wrote a paper in 1936, published by the London Mathematical Society. In it he described a hypothetical machine that he called the "universal computing machine" and which is now known as the universal Turing machine. It had an infinite store (memory, in modern terminology) that held both instructions and data. Von Neumann met Turing while a visiting professor at Cambridge in 1935, and again during Turing's doctoral work at the Institute for Advanced Study in Princeton, New Jersey, in 1936-1937.

Independently, J. Presper Eckert and John Mauchly, who were developing ENIAC at the Moore School of Electrical Engineering of the University of Pennsylvania, wrote about the stored-program concept in December 1943. In planning a new machine, EDVAC, Eckert wrote in January 1944 that data and programs would be stored in a new addressable memory device, a mercury delay line. This was the first practical proposal for the construction of a stored-program machine. At the time, he and Mauchly were not aware of Turing's work.

Computer Architecture: Von Neumann's Principle

Von Neumann was involved in the Manhattan Project at the Los Alamos National Laboratory, which required enormous amounts of computation. This drew him to the ENIAC project in the summer of 1944, where he joined the discussions on the design of the EDVAC. As part of that group he wrote a paper entitled "First Draft of a Report on the EDVAC", based on the work of Eckert and Mauchly. It was still unfinished when his colleague Herman Goldstine circulated it with only von Neumann's name on it (Eckert and Mauchly, incidentally, were dumbfounded by the news). The document was read by dozens of von Neumann's colleagues in America and Europe and had a major influence on the next stage of computer development.

While the basic principles of von Neumann's architecture were gaining wide currency through the "First Draft", Turing was preparing his own report on an electronic calculator, which treated the engineering and the programming in detail and set out his design for a machine called the Automatic Computing Engine (ACE). He presented it to the executive committee of the British National Physical Laboratory in 1946. In time there were several successful implementations of the ACE design.

Start of project implementation

Both von Neumann's report and Turing's papers described stored-program computers, but von Neumann's paper achieved wider circulation, and the architecture it described became known as the von Neumann architecture.

In 1945 von Neumann, who was then attached to the Moore School of Electrical Engineering in Philadelphia, where ENIAC had been built, issued on behalf of a group of his colleagues a report on the logical design of digital computers. The report contained a fairly detailed proposal for the design of the machine that has since become known as EDVAC. The machine itself had only recently been completed in America, but the report inspired the construction of the EDSAC at Cambridge.

MANIACs and JOHNNIACs

In 1947, Burks, Goldstine, and von Neumann published another report, describing the design of a different type of machine (this time a parallel one) that would be extremely fast, perhaps capable of performing up to 20,000 operations per second. They noted that an unresolved problem in building it was the design of a suitable memory whose entire contents had to be instantly accessible. They first proposed using a special vacuum tube, the Selectron, invented at RCA's laboratories in Princeton. Such tubes proved expensive and very difficult to make, and von Neumann subsequently decided to build a machine based on Williams-tube memory. This machine, completed in June 1952 at Princeton, became widely known as MANIAC. Its design inspired the construction of half a dozen or more similar machines then being built in America, jokingly called JOHNNIACs.

Creation principles

One of the most modern digital computers, embodying developments and improvements in automatic electronic computing, was demonstrated at the National Physical Laboratory at Teddington, where it was designed and built by a small team of mathematicians and electronics research engineers, with the assistance of a number of production engineers from the English Electric Company, Ltd. The equipment so far built in the laboratory is only a pilot model of a much larger installation known as the Automatic Computing Engine, but although comparatively small and containing only about 800 thermionic valves, it is an extremely fast and versatile calculating machine.

The basic concepts and abstract principles of computation by machine were formulated by Dr. Turing in a paper for the London Mathematical Society back in 1936, but work on such machines in Britain was delayed by the war. In 1945, examination of the problem was resumed at the National Physical Laboratory by J. R. Womersley, superintendent of the laboratory's Mathematics Division. He was joined by Turing and a small staff of specialists, and by 1947 preliminary planning was sufficiently advanced to justify the establishment of a special group.

The first computers based on von Neumann architecture

The "First Draft" described a design that was used by many universities and corporations to build their computers. Among them, only ILLIAC and ORDVAC had compatible instruction sets.

The classical von Neumann architecture was first embodied in the Manchester Small-Scale Experimental Machine (SSEM), nicknamed "Baby", at the University of Manchester, which ran its first program as a stored-program machine on June 21, 1948.

EDSAC, at the University of Cambridge, the first practical electronic stored-program computer, ran successfully for the first time in May 1949.

Development of created models

The IBM SSEC had the ability to treat instructions as data and was publicly demonstrated on January 27, 1948; the ability was claimed in a US patent. The machine, however, was partially electromechanical rather than fully electronic, and in practice instructions were read from paper tape because of its limited memory.

Baby was the first fully electronic computer to run a stored program. On June 21, 1948, after simpler programs performing a division and showing that two numbers are coprime, it ran a factoring program for 52 minutes.

ENIAC was modified to operate as a primitive read-only stored-program computer, still using the same hardware, and was demonstrated in this role on September 16, 1948, with Adele Goldstine running a program with von Neumann's help.

BINAC ran several test programs in February, March, and April 1949, although it was not completed until September 1949. Test runs (some successful) of other electronic computers with this architecture were also carried out in this period. Von Neumann, incidentally, continued to work on the Manhattan Project all the while: a remarkably versatile man.

Evolution of bus system architecture

Decades later, in the 1960s and 1970s, computers became smaller and faster, which led to some evolutions of the von Neumann architecture. For example, memory-mapped I/O allows input and output devices to be addressed as if they were memory locations, so their data and instructions are handled through the same memory system. A single system bus could be used to build a modular system at lower cost; this is sometimes called "streamlining" the architecture. In later decades, simple microcontrollers sometimes omitted features of the standard model to reduce cost and size, while larger computers added features to the established architecture to improve performance.
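Memory-mapped I/O, mentioned above, can be sketched in a few lines. This is a toy Python model, not any real bus protocol: one address in an invented address space is "wired" to an output device, so an ordinary store drives the device and the processor needs no special I/O instructions.

```python
# A toy address space in which one address belongs to a device rather
# than to RAM: a store to OUTPUT_ADDR is captured by the "device",
# while every other address behaves as ordinary memory.

OUTPUT_ADDR = 0xFF  # hypothetical device-register address

class Bus:
    def __init__(self, size=256):
        self.ram = [0] * size
        self.printed = []             # what the output "device" received

    def store(self, addr, value):
        if addr == OUTPUT_ADDR:       # the device, not RAM, answers here
            self.printed.append(value)
        else:
            self.ram[addr] = value

    def load(self, addr):
        return self.ram[addr]

bus = Bus()
bus.store(10, 42)          # an ordinary memory write
bus.store(OUTPUT_ADDR, 7)  # the same store operation, but it drives the device
print(bus.load(10), bus.printed)   # -> 42 [7]
```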

In 1946, J. von Neumann, H. Goldstine, and A. Burks, in a joint paper, set out new principles for the construction and operation of computers. The first two generations of computers were subsequently built on these principles. Later generations saw some changes, although von Neumann's principles remain relevant today.

In fact, von Neumann managed to summarize the scientific developments and discoveries of many other scientists and to formulate something fundamentally new on that basis.

Von Neumann's principles

  1. Use of the binary number system in computers. Its advantage over the decimal system is that devices can be made quite simple, and arithmetic and logical operations in binary are also performed quite simply.
  2. Program control of the computer. The operation of the computer is directed by a program consisting of a set of commands, executed sequentially one after another. The creation of a machine with a stored program was the beginning of what we today call programming.
  3. Computer memory is used to store not only data but also programs. Both commands and data are encoded in the binary number system, i.e. recorded in the same way, so in certain situations the same actions can be performed on commands as on data.
  4. Computer memory cells have sequentially numbered addresses. Any memory cell can be accessed at any time by its address. This principle opened up the possibility of using variables in programming.
  5. Possibility of a conditional jump during program execution. Although commands are executed sequentially, a program can transfer control to any section of code.
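Principle 3, a single binary word serving as data or as a command depending only on how it is used, can be illustrated directly. The 4-bit opcode field and 12-bit address field below are an illustrative choice, not any real instruction format:

```python
# One 16-bit word in memory has no intrinsic type: read it as a number
# and it is data; split it into opcode and address fields and it is a
# command. The field widths here are invented for illustration.

word = 0x1A05

as_number = word                    # read as data: just the value 6661
opcode = (word >> 12) & 0xF         # read as a command: top 4 bits
address = word & 0xFFF              # low 12 bits name a memory cell

print(as_number, opcode, address)   # -> 6661 1 2565
```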

The most important consequence of these principles is that the program was no longer a permanent part of the machine (as it is, for example, in a calculator). It became possible to change the program easily, while the hardware remains unchanged and comparatively simple.

By comparison, the program of the ENIAC computer (which did not have a stored program) was set by jumpers on patch panels. Reprogramming the machine (re-plugging the jumpers) could take more than a day. And although programs for modern computers may take years to write, they run on millions of computers after a few minutes of installation.

How does a von Neumann machine work?

A von Neumann machine consists of a storage device (memory), an arithmetic logic unit (ALU), a control unit (CU), and input and output devices.

Programs and data are entered into memory from the input device through the arithmetic logic unit. All program commands are written to adjacent memory cells, while data for processing can be held in arbitrary cells. In any program, the last command must be the halt command.

A command consists of an indication of which operation is to be performed (from the operations the hardware provides), the addresses of the memory cells holding the data on which the operation is to be performed, and the address of the cell where the result should be written (if it needs to be stored in memory).
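The three-address command just described can be sketched as bit-field packing. The widths (a 4-bit opcode and three 8-bit addresses) and the `ADD` opcode value are illustrative choices, not any historical format:

```python
# Packing a three-address command -- operation code, two operand
# addresses, and a result address -- into a single word, and unpacking
# it again. Field widths are invented for illustration.

ADD = 0x1  # hypothetical opcode

def encode(op, a1, a2, dest):
    """Pack a 4-bit opcode and three 8-bit addresses into one word."""
    return (op << 24) | (a1 << 16) | (a2 << 8) | dest

def decode(word):
    """Split the word back into its (opcode, a1, a2, dest) fields."""
    return (word >> 24) & 0xF, (word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF

w = encode(ADD, 10, 11, 12)   # "add cell 10 to cell 11, store in cell 12"
assert decode(w) == (ADD, 10, 11, 12)
print(hex(w))                 # -> 0x10a0b0c
```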

The arithmetic logic unit performs the operations specified by the instructions on the specified data.

From the arithmetic logic unit, results are sent to memory or to an output device. The fundamental difference between the two is that memory stores data in a form convenient for processing by the computer, whereas output devices (printer, monitor, etc.) present it in a form convenient for a person.

The control unit controls all parts of the computer. From the control device, other devices receive signals “what to do”, and from other devices the control unit receives information about their status.

The control unit contains a special register (cell) called the program counter. After the program and data are loaded into memory, the address of the program's first instruction is written to the program counter. The control unit reads from memory the contents of the cell whose address is in the program counter and places it in a special device, the instruction register. It then determines the command's operation, locates in memory the data whose addresses the command specifies, and directs the command's execution. The operation itself is performed by the ALU or other computer hardware.

As each command is executed, the program counter advances by one and thus points to the next command of the program. When the command to be executed is not the next in sequence but lies some number of addresses away, a special jump command contains the address of the cell to which control must be transferred.
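The cycle described above, a program counter naming the next cell, sequential execution, a jump command that overwrites the counter, and a mandatory final halt, can be sketched as a toy simulator. The instruction set here is invented for illustration and is not EDVAC's or any real machine's:

```python
# A minimal von Neumann machine: one list serves as memory for both
# commands (op, arg) and data; the program counter (pc) selects the
# next command, and JNZ overwrites it to form a loop.

def run(memory):
    """Execute until HALT; return the final memory state."""
    pc = 0      # program counter: address of the next command
    acc = 0     # accumulator register
    while True:
        op, arg = memory[pc]   # fetch the word the PC points at
        pc += 1                # by default, the next command in sequence
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JNZ":      # conditional jump: overwrite the PC
            if acc != 0:
                pc = arg
        elif op == "HALT":     # the mandatory final command
            return memory

# Program: count cell 9 down to zero, adding 1 to cell 10 each pass.
memory = [
    ("LOAD", 9), ("ADD", 11), ("STORE", 9),    # cells 0-2: counter -= 1
    ("LOAD", 10), ("ADD", 12), ("STORE", 10),  # cells 3-5: total += 1
    ("LOAD", 9),                               # cell 6: reload counter
    ("JNZ", 0),                                # cell 7: loop while counter != 0
    ("HALT", 0),                               # cell 8: stop
    3,    # cell 9: counter
    0,    # cell 10: total
    -1,   # cell 11: constant -1
    1,    # cell 12: constant +1
]
out = run(memory)
print(out[10])   # -> 3
```

Note that commands occupy adjacent cells at the start of memory while data sits in arbitrary later cells, exactly as the text describes.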


Von Neumann's principles

The principle of memory homogeneity

Commands and data are stored in the same memory and are outwardly indistinguishable there: they can be recognized only by the way they are used. The same value in a memory cell can serve as data, as a command, or as an address, depending only on how it is accessed. This makes it possible to perform the same operations on commands as on numbers, which opens up a number of possibilities. For example, by cyclically changing the address part of a command, one can access successive elements of a data array. This technique is called command modification, and it is discouraged from the standpoint of modern programming. A more useful consequence of the homogeneity principle is that the instructions of one program can be produced as the result of executing another program. This possibility underlies translation: converting program text from a high-level language into the language of a specific computer.
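The command-modification technique described above can be shown on a toy machine. The instruction format is invented for illustration; the point is that the loop rewrites the address field of its own command, which works but is hard to read and debug, exactly why the text discourages it:

```python
# Command modification in miniature: one memory list holds a command
# whose address field the loop itself rewrites, stepping through
# successive cells of a data array.

memory = [
    ["LOAD", 5],              # cell 0: its address field is rewritten below
    None, None, None, None,   # cells 1-4: unused in this sketch
    7, 11, 13,                # cells 5-7: the data array
]

total = 0
for _ in range(3):
    op, addr = memory[0]      # fetch the (possibly modified) command
    total += memory[addr]     # execute it: load and accumulate
    memory[0][1] += 1         # modify the command's own address field
print(total)   # -> 31
```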

The addressing principle

Structurally, main memory consists of numbered cells, and any cell is available to the processor at any time. Binary codes of commands and data are divided into units of information called words and stored in memory cells; to access them, the numbers of the corresponding cells (their addresses) are used.

Program control principle

All calculations provided for by the algorithm for solving the problem must be presented in the form of a program consisting of a sequence of control words - commands. Each command prescribes some operation from a set of operations implemented by the computer. Program commands are stored in sequential memory cells of the computer and are executed in a natural sequence, that is, in the order of their position in the program. If necessary, using special commands, this sequence can be changed. The decision to change the order of execution of program commands is made either based on an analysis of the results of previous calculations, or unconditionally.

Binary coding principle

According to this principle, all information, both data and commands, is encoded with binary digits 0 and 1. Each type of information is represented by a binary sequence and has its own format. A sequence of bits in a format that has a specific meaning is called a field. In numeric information, there is usually a sign field and a significant digits field. In the command format, two fields can be distinguished: the operation code field and the addresses field.
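The fields described above for numeric information, a sign field and a significant-digits field, can be illustrated with a 16-bit sign-and-magnitude encoding; the width is an arbitrary illustrative choice, not a claim about any particular machine:

```python
# A 16-bit sign-and-magnitude numeric format: one sign bit followed by
# a fifteen-bit magnitude (significant-digits) field.

def to_sign_magnitude(n):
    """Encode an integer with |n| < 2**15 into a 16-bit word."""
    sign = 1 if n < 0 else 0
    return (sign << 15) | (abs(n) & 0x7FFF)

def from_sign_magnitude(word):
    """Decode the word back into a signed integer."""
    magnitude = word & 0x7FFF              # significant-digits field
    return -magnitude if (word >> 15) & 1 else magnitude

w = to_sign_magnitude(-300)
print(bin(w), from_sign_magnitude(w))   # -> 0b1000000100101100 -300
```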

Another truly revolutionary idea, whose importance is difficult to overestimate, is the "stored program" principle proposed by von Neumann. Initially, a program was set by installing jumpers on a special patch panel. This was very laborious: changing the program of the ENIAC machine, for instance, took several days, while a computation itself could not last more than a few minutes before the tubes failed. Von Neumann was the first to realize that a program could also be stored as a series of zeros and ones in the same memory as the numbers it processed. The absence of a fundamental difference between program and data made it possible for the computer to shape a program for itself according to the results of its calculations.

Von Neumann not only put forward the fundamental principles of the logical structure of a computer but also proposed a structure that was reproduced through the first two generations of computers. The main blocks, according to von Neumann, are the control unit (CU) and arithmetic logic unit (ALU), usually combined into a central processor, along with memory, external memory, and input and output devices. The scheme of such a computer is shown in Fig. 1. Note that external memory differs from input and output devices in that data is held in it in a form convenient for the computer but inaccessible to direct human perception. Thus a magnetic disk drive is external memory, while a keyboard is an input device and a display and printer are output devices.

Fig. 1. Computer architecture built on von Neumann's principles. Solid lines with arrows indicate the direction of information flows; dotted lines indicate control signals from the processor to other computer nodes

The control unit and the arithmetic logic unit in modern computers are combined into a single unit, the processor, which transforms the information coming from memory and from external devices (this includes fetching instructions from memory, decoding them, performing arithmetic and other operations, and coordinating the operation of the computer's units). The functions of the processor are discussed in more detail below.

Memory stores information (data) and programs. The storage system in modern computers is "multi-tiered": it includes random-access memory (RAM), which holds the information the computer is working with at a given moment (the executing program, part of the data it needs, some control programs), and external storage devices (ESDs) of much larger capacity than RAM but with significantly slower access (and significantly lower cost per byte of stored information). RAM and ESDs do not exhaust the classification of memory devices: cache memory, read-only memory (ROM), and other subtypes of computer memory each perform particular functions.

In a computer built according to this scheme, instructions are read sequentially from memory and executed. The number (address) of the memory cell from which the next program command will be fetched is indicated by a special device, the program counter, in the control unit. Its presence is also one of the characteristic features of this architecture.

The fundamentals of the architecture of computing devices developed by von Neumann turned out to be so fundamental that the literature named it the "von Neumann architecture". The vast majority of computers today are von Neumann machines. The only exceptions are certain types of systems for parallel computing, which have no program counter, do not implement the classical concept of a variable, and differ from the classical model in other fundamental ways (examples include dataflow and reduction computers).

Apparently, a significant departure from the von Neumann architecture will come with the development of the idea of fifth-generation machines, in which information processing is based not on calculations but on logical inference.

The foundations of the doctrine of computer architecture were laid by the outstanding American mathematician John von Neumann. He became involved in the creation of the world's first tube computer, ENIAC, in 1944, when its design had already been settled. During this work, in numerous discussions with his colleagues H. Goldstine and A. Burks, von Neumann put forward the idea of a fundamentally new computer. In 1946 the scientists set out their principles for constructing computers in the now classic paper "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument". Half a century has passed since then, but the propositions put forward in it remain relevant today.

The article convincingly substantiates the use of the binary system to represent numbers (it is worth recalling that previously all computers stored processed numbers in decimal form). The authors convincingly demonstrated the advantages of the binary system for technical implementation, the convenience and simplicity of performing arithmetic and logical operations in it. Later, computers began to process non-numeric types of information - text, graphic, sound and others, but binary data coding still forms the information basis of any modern computer.

Another truly revolutionary idea, the importance of which is difficult to overestimate, is the “stored program” principle proposed by Neumann. Initially, the program was set by installing jumpers on a special patch panel. This was a very labor-intensive task: for example, it took several days to change the program of the ENIAC machine (while the calculation itself could not last more than a few minutes the lamps failed). Neumann was the first to realize that a program could also be stored as a series of zeros and ones, in the same memory as the numbers it processed. The absence of a fundamental difference between the program and the data made it possible for the computer to form a program for itself in accordance with the results of the calculations.

Von Neumann not only put forward the fundamental principles of the logical structure of a computer, but also proposed its structure, which was reproduced during the first two generations of computers. The main blocks according to Neumann are a control unit (CU) and an arithmetic-logical unit (ALU) (usually combined into a central processor), memory, external memory, input and output devices. The design diagram of such a computer is shown in Fig. 1. It should be noted that external memory differs from input and output devices in that data is entered into it in a form convenient for a computer, but inaccessible to direct perception by a person. Thus, the magnetic disk drive is an external memory, and the keyboard is an input device, display and printing are output devices.

Rice. 1. Computer architecture built on von Neumann principles. Solid lines with arrows indicate the direction of information flows, dotted lines indicate control signals from the processor to other computer nodes

The control device and the arithmetic-logical unit in modern computers are combined into one block processor, which is a converter of information coming from memory and external devices (this includes retrieving instructions from memory, encoding and decoding, performing various, including arithmetic, operations, coordination of the operation of computer nodes). The functions of the processor will be discussed in more detail below.

Memory stores information (data) and programs. The storage system in modern computers is "multi-tiered": it includes random access memory (RAM), which holds the information the computer is working with at a given moment (the executable program, part of the data it needs, some control programs), and external storage devices (ESDs) of much larger capacity than RAM but with significantly slower access (and significantly lower cost per byte of stored information). The classification of memory devices does not end with RAM and ESDs; certain functions are also performed by SRAM (static RAM), ROM (read-only memory), and other subtypes of computer memory.

In a computer built according to the described scheme, instructions are read from memory and executed sequentially. The number (address) of the memory cell from which the next program command will be fetched is indicated by a special device, the program counter, in the control unit. Its presence is another characteristic feature of the architecture in question.

The fundamentals of the architecture of computing devices developed by von Neumann turned out to be so fundamental that they received the name “von Neumann architecture” in the literature. The vast majority of computers today are von Neumann machines. The only exceptions are certain types of systems for parallel computing, in which there is no program counter, the classical concept of a variable is not implemented, and there are other significant fundamental differences from the classical model (examples include streaming and reduction computers).

Apparently, a significant deviation from the von Neumann architecture will occur as the idea of fifth-generation machines develops, in which information processing is based not on calculations but on logical inference.

Something Like Nostalgia: Von Neumann's Principles

I was unable to find the notebook with the first part of the lectures on Computer Architecture, so the data had to be taken from other sources.
In 1945, John von Neumann, a Hungarian-born physicist and mathematician who worked in the United States on the ENIAC project, published a report outlining the basic principles of building a computer. The provisions expressed in the report were called "Von Neumann's Principles."

1. The principle of program control.
A program consists of a set of instructions executed sequentially by the processor. Instructions are fetched from memory in the order given by the program counter. Fetching stops when the “stop” command is reached and executed.


It follows that the commands of a program are executed sequentially, one after another. The von Neumann architecture also allows conditional and unconditional branches, used when the next command to execute does not immediately follow the current one but is located elsewhere in memory. This does not violate the principle of sequential execution, since only one command is executed at a time.
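A branch can be sketched in the same toy-machine style; the opcodes here (LOAD-immediate, DEC, JNZ, HALT) are hypothetical. Note that a jump simply overwrites the program counter, so even a backward loop still executes one instruction at a time.

```python
# Hypothetical opcodes: 1 = LOAD n (immediate), 2 = DEC,
# 3 = JNZ addr (jump if accumulator != 0), 0 = HALT.
program = [
    1, 3,    # acc = 3
    2, 0,    # acc -= 1                      } loop body
    3, 2,    # if acc != 0, jump back to address 2
    0, 0,    # HALT
]

pc, acc, steps = 0, 0, 0
while program[pc] != 0:
    op, arg = program[pc], program[pc + 1]
    pc += 2                      # default: next instruction in sequence
    if op == 1:
        acc = arg
    elif op == 2:
        acc -= 1
    elif op == 3 and acc != 0:
        pc = arg                 # branch: override the program counter
    steps += 1

print(acc, steps)                # -> 0 7
```

Seven instructions execute in total, one at a time, even though the program text is only four instructions long.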

2. The principle of memory homogeneity.
Programs and data are encoded in binary code and stored in the same memory. You can perform the same actions on commands as on data.


Thus, for memory it does not matter what is stored in a given cell: data or commands. This principle also allows a program to process itself during execution (this is how loops and subroutines are organized). The commands of one program can be obtained as the results of executing another program; translation (compiling program text from a high-level programming language into the command language of a specific machine) is based on this principle.
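Memory homogeneity means a running program can treat its own instructions as data. In the hypothetical toy machine below, a STORE instruction patches the operand of the following LOAD while the program runs; the opcodes are invented for illustration.

```python
# Because instructions are just numbers in memory, a running program
# can overwrite them. Hypothetical opcodes: 1 = LOAD addr, 3 = STORE addr,
# 0 = HALT.
memory = [
    1, 9,    # LOAD  memory[9]            -> acc = 8
    3, 5,    # STORE acc into cell 5: patches the operand of the
             #       next instruction while the program is running
    1, 0,    # LOAD  (operand rewritten to 8 before this executes)
    0,       # HALT
    0,       # (unused)
    99,      # data at address 8
    8,       # data at address 9
]

pc, acc = 0, 0
while memory[pc] != 0:
    op, addr = memory[pc], memory[pc + 1]
    if op == 1:
        acc = memory[addr]
    elif op == 3:
        memory[addr] = acc
    pc += 2

print(acc)   # -> 99: the patched LOAD read from address 8
```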

Different types of data can in turn be distinguished by formats.

3. The principle of addressing.

Structurally, main memory (RAM) consists of numbered cells. Any cell is available to the processor at any time.


RAM is, in effect, divided into cells of a fixed length. Each such cell has an address (essentially a number) by which its contents can be accessed.
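A Python bytearray makes a convenient model of this: a row of fixed-size (one-byte) cells, each reachable by its numeric address.

```python
# RAM as a row of fixed-size numbered cells: a bytearray in Python.
ram = bytearray(16)      # 16 one-byte cells, addresses 0..15

ram[5] = 200             # write to the cell at address 5
print(ram[5])            # -> 200
print(len(ram))          # -> 16 cells, each holding a value 0..255
```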

These principles determined the development of computers for a long time. Only in the 1960s did a theory of computing systems emerge that went beyond von Neumann's principles (the main difference being parallel computation). But this applied to large professional computers, and personal computers followed these principles until recently. Our computer architecture teacher, Igor Yusupovich, told us that the von Neumann computer had practically exhausted itself. At the time I had no idea what that would entail, but now two- and four-core processors have become commonplace.

Computer architecture and von Neumann principles

The term "architecture" is used to describe the principle of operation, configuration and interconnection of the main logical nodes of a computer. Architecture is a multi-level hierarchy of hardware and software from which a computer is built.

The foundations of the doctrine of computer architecture were laid by the outstanding American mathematician John von Neumann. The first ENIAC computer was created in the USA in 1946. The group of its creators included von Neumann, who proposed the basic principles of computer construction: a transition to the binary number system for representing information, and the principle of a stored program.

It was proposed to place the calculation program in the computer's storage device, which would ensure automatic execution of commands and, as a consequence, increase the speed of the computer. (Recall that previously all computers stored the numbers being processed in decimal form, while programs were specified by installing jumpers on a special patch panel.) Von Neumann was the first to realize that a program could also be stored as a set of zeros and ones, in the same memory as the numbers it processes.

Basic principles of computer construction:

1. Any computer consists of three main components: processor, memory, and input/output (I/O) devices.

2. The information with which the computer works is divided into two types:

    a set of processing commands (programs);
    data to be processed.

3. Both commands and data are entered into memory (RAM) – the stored program principle.

4. The processing is controlled by the processor, whose control unit (CU) selects commands from RAM and organizes their execution, and the arithmetic-logical unit (ALU) performs arithmetic and logical operations on the data.


5. Input/output devices (I/O) are connected to the processor and RAM.

Von Neumann not only put forward the fundamental principles of the logical structure of computers, but also proposed a structure that was reproduced during the first two generations of computers.

Fig. 1. Computer architecture built on von Neumann principles (blocks: processor, random access memory (RAM), external storage device (ESD), input and output devices; solid lines show the direction of information flows, dotted lines the control signals from the processor to other computer nodes)

The fundamentals of the architecture of computing devices developed by von Neumann turned out to be so fundamental that they received the name “von Neumann architecture” in the literature. The vast majority of computers today are von Neumann machines.

The emergence of the third generation of computers was due to the transition from transistors to integrated circuits, which led to an increase in processor speed. Now the processor was forced to idle, waiting for information from slower input/output devices, and this reduced the efficiency of the entire computer as a whole. To solve this problem, special circuits were created to control the operation of external devices, or simply controllers.

The architecture of modern personal computers is based on the backbone-modular principle. Information exchange between computer devices is carried out through the system bus (also called the system highway).

A bus is a cable consisting of many conductors. One group of conductors, the data bus, carries the information being processed; another, the address bus, carries the addresses of memory cells or external devices accessed by the processor. The third part of the highway, the control bus, carries control signals (for example, a signal that a device is ready for operation, a signal to start a device, and so on).
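A minimal sketch of a bus cycle, assuming a simplified model in which one transaction carries an address, a data word, and a single read/write control line (real buses have many more control signals):

```python
from dataclasses import dataclass

@dataclass
class BusTransaction:
    address: int   # driven on the address bus
    data: int      # driven on the data bus (ignored for reads)
    write: bool    # control bus: read (False) or write (True)

ram = {}           # memory seen from the bus: address -> stored word

def bus_cycle(t: BusTransaction):
    """Perform one simplified bus transaction against the memory."""
    if t.write:
        ram[t.address] = t.data
        return None
    return ram.get(t.address, 0)

bus_cycle(BusTransaction(address=0x10, data=42, write=True))
print(bus_cycle(BusTransaction(address=0x10, data=0, write=False)))  # -> 42
```

The three fields of the transaction correspond directly to the three groups of conductors described above.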

How does the system bus work? We have already said that ones and zeros exist only in the heads of programmers. For the processor, only the voltages at its contacts are real. Each pin corresponds to one bit, and the processor only needs to distinguish two voltage levels: yes/no, high/low. An address, for the processor, is therefore a sequence of voltages on a special group of contacts called the address bus. You can imagine that after voltages are set on the contacts of the address bus, voltages appear on the contacts of the data bus, encoding the number stored at the specified address. This picture is very rough, because retrieving data from memory takes time. To keep everything in step, the processor's operation is governed by a special clock generator: it produces pulses that divide the processor's work into separate steps. The unit of processor time is one clock cycle, the interval between two pulses of the clock generator.
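The correspondence between an address and the voltage levels on the address-bus pins can be sketched as a list of bits, one per pin (high voltage = 1, low = 0); the function name here is ours, for illustration only.

```python
def address_lines(addr, width=16):
    # Each address-bus pin carries one bit: high voltage = 1, low = 0.
    # Returns the levels from the most significant pin to the least.
    return [(addr >> i) & 1 for i in range(width - 1, -1, -1)]

print(address_lines(0x2F, 8))  # -> [0, 0, 1, 0, 1, 1, 1, 1]
```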

The voltages appearing on the processor's address bus are called the physical address. In real mode, the processor works only with physical addresses. Protected mode, by contrast, is interesting because a program works with logical addresses, which the processor invisibly converts into physical ones. The Windows system uses the processor's protected mode. Modern operating systems and programs require so much memory that protected mode has become much more “real” than real mode.

The system bus is characterized by its clock frequency and bit depth. The number of bits transmitted simultaneously on the bus is called the bus width. The clock frequency characterizes the number of elementary data-transfer operations per second. Bus width is measured in bits; clock frequency is measured in megahertz.
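These two characteristics determine the bus's peak throughput: width in bytes times transfers per second. A back-of-the-envelope calculation for an assumed 64-bit bus at 100 MHz:

```python
bus_width_bits = 64      # bits transferred in one elementary operation
clock_mhz = 100          # millions of elementary transfers per second

# bytes per transfer * transfers per second = bytes per second
throughput = bus_width_bits // 8 * clock_mhz * 1_000_000
print(throughput // 1_000_000, "MB/s")   # -> 800 MB/s
```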


Any information transmitted from the processor to other devices via the data bus is accompanied by an address transmitted over the address bus. This can be the address of a memory cell or of a peripheral device. The address-bus width must be sufficient to transmit the address of any memory cell. Thus, the bus width limits the amount of computer RAM: it cannot be greater than 2^n cells, where n is the bus width. It is also important that the performance of all devices connected to the bus be matched: it makes no sense to pair a fast processor with slow memory, or a fast processor and memory with a slow hard drive.
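The limit is easy to compute: an n-bit address bus can distinguish 2^n cells. A small sketch:

```python
def max_addressable(n_bits):
    # An n-bit address bus can name 2**n distinct memory cells.
    return 2 ** n_bits

print(max_addressable(16))            # -> 65536 cells (64 KiB if byte-addressed)
print(max_addressable(32) // 2**30)   # -> 4 (GiB with one-byte cells)
```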

Fig. 2. Diagram of a computer built on the backbone principle

Modern computers implement the open architecture principle, which allows the user to assemble the computer configuration he needs and, if necessary, upgrade it.

The configuration of a computer is the actual set of components from which it is assembled. The principle of open architecture allows the composition of computer devices to be changed: additional peripheral devices can be connected to the information highway, and some device models can be replaced by others.

The hardware connection of a peripheral device to the backbone at the physical level is carried out through a special block, a controller (other names: adapter, board, card). The motherboard has special connectors, called slots, for installing controllers.

Software control of the operation of a peripheral device is carried out through a driver program, a component of the operating system. Since a huge variety of devices can be installed in a computer, each device usually comes with its own driver, which interacts directly with that device.

The computer communicates with external devices through ports, special connectors on the back panel of the computer. Serial and parallel ports are distinguished. Serial (COM) ports are used to connect pointing devices and modems, and to transmit small amounts of information over long distances. Parallel (LPT) ports are used to connect printers and scanners, and to transmit large amounts of information over short distances. Recently, universal serial bus (USB) ports, to which a wide variety of devices can be connected, have become widespread.






