History of the emergence and development of the computer. The evolution of the personal computer


This article describes the main stages of computer development, the main directions in which computer technology has evolved, and the reasons behind that evolution.

The main stages of computer development

During the evolution of computer technology, hundreds of different computers have been developed. Many of them have long been forgotten, while the influence of others on modern ideas turned out to be quite significant. In this article, we will give a brief overview of some key historical moments to better understand how designers arrived at the concept of the modern computer. We will consider only the main milestones, leaving many details aside. The computers we will discuss are presented in the table below.

The main stages in the history of computer development:

Year | Computer | Creator | Notes
1834 | Analytical Engine | Babbage | First attempt to build a digital computer
1936 | Z1 | Zuse | First relay calculating machine
1943 | COLOSSUS | British government | First electronic computer
1944 | Mark I | Aiken | First American general-purpose computer
1946 | ENIAC I | Eckert/Mauchley | The history of modern computers begins with this machine
1949 | EDSAC | Wilkes | First computer with programs stored in memory
1951 | Whirlwind I | MIT | First real-time computer
1952 | IAS | von Neumann | The design used in most modern computers
1960 | PDP-1 | DEC | First minicomputer (50 copies sold)
1961 | 1401 | IBM | A very popular small business computer
1962 | 7094 | IBM | A dominant machine for scientific computing in the early 1960s
1963 | B5000 | Burroughs | First machine designed for a high-level language
1964 | 360 | IBM | First family of computers
1964 | 6600 | CDC | First supercomputer for scientific calculations
1965 | PDP-8 | DEC | First mass-market minicomputer (50,000 units sold)
1970 | PDP-11 | DEC | The minicomputers that dominated the market in the 1970s
1974 | 8080 | Intel | First general-purpose 8-bit computer on a chip
1974 | CRAY-1 | Cray | First vector supercomputer
1978 | VAX | DEC | First 32-bit superminicomputer
1981 | IBM PC | IBM | The era of modern personal computers begins
1981 | Osborne-1 | Osborne | First portable computer
1983 | Lisa | Apple | First PC with a graphical user interface
1985 | 386 | Intel | First 32-bit ancestor of the Pentium line
1985 | MIPS | MIPS | First commercial RISC computer
1987 | SPARC | Sun | First SPARC-based RISC workstation
1990 | RS6000 | IBM | First superscalar computer
1992 | Alpha | DEC | First 64-bit personal computer
1993 | Newton | Apple | First pocket (palmtop) computer

In total, six stages can be distinguished in the history of computer development: mechanical computers, vacuum-tube computers (such as ENIAC), transistor computers (the IBM 7094), the first computers built on integrated circuits (the IBM 360), personal computers (the Intel CPU lines), and the so-called invisible computers.

Zero generation - mechanical computers (1642-1945)

The first person to create a calculating machine was the French scientist Blaise Pascal (1623-1662), after whom one of the programming languages is named. Pascal designed this machine in 1642, when he was just 19 years old, for his father, a tax collector. It was a mechanical design with gears and a manual drive. Pascal's calculating machine could only perform addition and subtraction operations.

Thirty years later, the great German mathematician Gottfried Wilhelm Leibniz (1646-1716) built another mechanical machine that could perform multiplication and division in addition to addition and subtraction. In essence, three centuries ago Leibniz built something like a four-function pocket calculator.

Another 150 years later, Charles Babbage (1792-1871), a professor of mathematics at Cambridge University and inventor of the speedometer, designed and built the Difference Engine. This mechanical machine, which, like Pascal's, could only add and subtract, computed tables of numbers for sea navigation. The machine was built around a single algorithm: the method of finite differences applied to polynomials. It also had a rather interesting way of outputting information: the results were pressed with a steel die into a copper plate, which anticipated later write-once media such as punched cards and compact discs.
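To make that single algorithm concrete, here is a minimal Python sketch (purely illustrative - Babbage's engine was, of course, mechanical) of tabulating a polynomial by the method of finite differences: once the initial differences are set up, every new table entry requires only additions.

```python
# Tabulate a polynomial the way a difference engine does: seed the initial
# finite differences, then produce each new value using additions only.

def difference_table(poly, n):
    """Return p(0..n-1), where poly holds coefficients in ascending order of power."""
    degree = len(poly) - 1
    # Seed values p(0), p(1), ..., p(degree), computed directly once.
    seed = [sum(c * x**i for i, c in enumerate(poly)) for x in range(degree + 1)]
    # Initial column of finite differences: p(0), delta p(0), delta^2 p(0), ...
    diffs = []
    row = seed[:]
    for _ in range(degree + 1):
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    # From here on, only additions are needed (this is the "engine" part).
    values = []
    for _ in range(n):
        values.append(diffs[0])
        for i in range(degree):
            diffs[i] += diffs[i + 1]
    return values

# p(x) = 1 + 3x + 2x^2
print(difference_table([1, 3, 2], 10))  # [1, 6, 15, 28, 45, 66, 91, 120, 153, 190]
```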

Although his device worked quite well, Babbage soon became bored with a machine that could execute only one algorithm. He spent a great deal of time, most of his family fortune, and another £17,000 of government money developing the Analytical Engine. The Analytical Engine had four components: a storage device (the memory), a computing device (the mill), an input device (for reading punched cards), and an output device (a punch and a printer). The memory consisted of 1000 words of 50 decimal digits each, holding variables and results. The computing device accepted operands from memory, performed addition, subtraction, multiplication, or division, and returned the result to memory. Like the Difference Engine, this device was entirely mechanical.

The advantage of the Analytical Engine was that it could perform different tasks. It read commands from punched cards and carried them out. Some commands told the machine to take two numbers from memory, transfer them to the computing device, perform an operation on them (for example, addition), and send the result back to the storage device. Other commands tested a number and could branch depending on whether it was positive or negative. If punched cards with a different program were inserted into the reader, the machine performed a different set of operations. That is, unlike the Difference Engine, the Analytical Engine could execute several different algorithms.

Since the Analytical Engine was programmed in a rudimentary assembly language, it needed software. To create this software, Babbage hired a young woman, Ada Augusta Lovelace, daughter of the famous British poet Byron. Ada Lovelace was thus the world's first computer programmer. The modern programming language Ada is named in her honor.

Unfortunately, like many modern engineers, Babbage never got his hardware fully working. He needed thousands upon thousands of gears manufactured to a precision that was unavailable in the 19th century. But Babbage's ideas were far ahead of their time, and even today most modern computers resemble the Analytical Engine in structure. It is therefore fair to say that Babbage was the grandfather of the modern digital computer.

In the late 1930s, the German engineer Konrad Zuse designed several automatic calculating machines using electromagnetic relays. He was unable to obtain government funding for his work because the war had begun. Zuse knew nothing of Babbage's work, and his machines were destroyed during the bombing of Berlin in 1944, so his work had no influence on the future development of computer technology. Nevertheless, he was one of the pioneers of the field.

A little later, calculating machines were designed in America as well. John Atanasoff's machine was extremely advanced for its time. It used binary arithmetic, and its memory consisted of capacitors that were periodically refreshed to keep the data from leaking away. Modern dynamic memory (DRAM) works on exactly the same principle. Unfortunately, the machine never became fully operational. In a sense, Atanasoff was like Babbage: a dreamer not satisfied with the technology of his time.

George Stibitz's computer actually worked, although it was more primitive than Atanasoff's machine. Stibitz demonstrated his machine at a conference at Dartmouth College in 1940. Among those attending was John Mauchley, a then-unremarkable professor of physics at the University of Pennsylvania, who would later become very famous in the field of computer development.

While Zuse, Stibitz and Atanasoff were designing automatic calculating machines, young Howard Aiken at Harvard was laboriously grinding out calculations by hand as part of his doctoral research. After completing the dissertation, Aiken understood the importance of automating calculation. He went to the library, read about Babbage's work, and decided to build out of relays the computer that Babbage had failed to build out of gears.

Aiken's first computer, the Mark I, was completed in 1944. It had 72 words of 23 decimal digits each and could execute an instruction in 6 seconds. Input and output used punched paper tape. By the time Aiken completed its successor, the Mark II, relay computers were already obsolete. The era of electronics had begun.

First generation - vacuum tubes (1945-1955)

The impetus for the creation of the electronic computer was the Second World War. During the early phase of the war, German submarines were destroying British ships. The German admirals sent commands to the submarines by radio, and although the British could intercept these commands, the problem was that the messages were encoded using a device called ENIGMA, whose forerunner had been designed by the amateur inventor and former US president Thomas Jefferson.

At the beginning of the war, the British managed to acquire an ENIGMA machine from the Poles, who, in turn, had stolen it from the Germans. However, deciphering an encoded message required an enormous amount of computation, and it had to be done very soon after the radiogram was intercepted. So the British government founded a secret laboratory to create an electronic computer called COLOSSUS. The famous British mathematician Alan Turing took part in the creation of this machine. COLOSSUS was already working in 1943, but since the British government kept complete control over the project and treated it as a military secret for 30 years, COLOSSUS did not become the basis for further computer development. We mention it only because it was the world's first electronic digital computer.

The Second World War also influenced the development of computer technology in the United States. The Army needed tables for aiming heavy artillery. Hundreds of women were hired to do the calculations on manual adding machines and fill in the fields of these tables (it was believed that women were more accurate at computation than men). Nevertheless, the process was time-consuming and errors crept in.

John Mauchley, who was familiar with the work of Atanasoff and Stibitz, realized that the Army was interested in calculating machines. He asked the Army to fund the construction of an electronic computer. The request was granted in 1943, and Mauchley and his student J. Presper Eckert began building an electronic computer, which they called ENIAC (Electronic Numerical Integrator and Computer). ENIAC consisted of 18,000 vacuum tubes and 1,500 relays, weighed 30 tons, and consumed 140 kilowatts of electricity. The machine had 20 registers, each of which could hold a 10-digit decimal number. (A decimal register is a very small memory that can hold a number up to some maximum number of digits, rather like an odometer that remembers how many miles a car has traveled.) The ENIAC was programmed by setting 6,000 multi-position switches and connecting a multitude of cables to sockets.

Work on the machine was completed in 1946, when it was no longer needed - at least to achieve the original goals.

Since the war was over, Mauchley and Eckert were allowed to organize a summer school where they described their work to fellow scientists. It was at that school that interest in building large digital computers was born.

After the summer school, other researchers set about building electronic computers. The first of these to become operational was EDSAC (1949), designed by Maurice Wilkes at the University of Cambridge. It was followed by JOHNIAC at the Rand Corporation, ILLIAC at the University of Illinois, MANIAC at the Los Alamos Laboratory, and WEIZAC at the Weizmann Institute in Israel.

Eckert and Mauchley soon began work on the EDVAC (Electronic Discrete Variable Automatic Computer). Unfortunately, the project faltered when they left the University of Pennsylvania to found a computer corporation in Philadelphia (there was no Silicon Valley at that time). After a series of mergers, this company became the Unisys Corporation.

Eckert and Mauchley wanted to obtain a patent on the invention of the digital computer. After several years of litigation it was ruled that the patent was invalid, since Atanasoff had invented the digital computer first, even though he had never patented it.

While Eckert and Mauchley were working on the EDVAC, one of the ENIAC project participants, John von Neumann, went to the Institute for Advanced Study in Princeton to build his own version of the EDVAC, the IAS machine (named after the Institute). Von Neumann was a genius in the same league as Leonardo da Vinci. He knew many languages, was an expert in physics and mathematics, and had a phenomenal memory: he remembered everything he had ever heard, seen or read. He could quote verbatim from memory the text of books he had read years before. By the time he became interested in computers, he was already the most famous mathematician in the world.

Von Neumann soon realized that building computers out of masses of switches and cables was slow and tedious work. He came to the idea that the program should be stored in the computer's memory in digital form, together with the data. He also noted that the decimal arithmetic used in ENIAC, where each digit was represented by ten vacuum tubes (1 on and 9 off), should be replaced by parallel binary arithmetic - something Atanasoff had realized years earlier.

The basic design that von Neumann first described is now known as the von Neumann machine. It was used in EDSAC, the first stored-program computer, and even now, more than half a century later, it is the basis of most modern digital computers. The design itself and the IAS machine had a very great influence on the further development of computer technology, so it is worth describing von Neumann's project briefly. It should be kept in mind that although the design is associated with von Neumann's name, other scientists took an active part in its development, in particular Goldstine. The structure of the machine is described below.

The von Neumann machine had five basic parts: the memory, the arithmetic logic unit, the control unit, and the input and output equipment. The memory consisted of 4096 words, each 40 bits wide (a bit being a 0 or a 1). Each word held either two 20-bit instructions or one 40-bit signed integer. Within an instruction, 8 bits specified the instruction type and the remaining 12 bits identified one of the 4096 memory words. The arithmetic logic unit and the control unit together formed the "brain" of the computer. In modern machines they are combined on a single chip called the central processing unit (CPU).

Inside the arithmetic logic unit was a special internal 40-bit register called the accumulator. A typical instruction added a word of memory to the accumulator or stored the contents of the accumulator in memory. The machine had no floating-point arithmetic, because von Neumann believed that any competent mathematician could keep track of the binary point in his or her head.
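As a rough illustration of the word layout described above, the following Python sketch packs and unpacks a 40-bit IAS-style word containing two 20-bit instructions, each with an 8-bit opcode and a 12-bit address. The opcode values used are made up for the example and are not the real IAS instruction codes.

```python
# Illustrative packing/unpacking of a 40-bit IAS-style word:
# two 20-bit instructions, each = 8-bit opcode + 12-bit address.

def pack_word(op1, addr1, op2, addr2):
    """Pack two (opcode, address) pairs into one 40-bit integer."""
    assert 0 <= op1 < 256 and 0 <= op2 < 256          # 8-bit opcodes
    assert 0 <= addr1 < 4096 and 0 <= addr2 < 4096    # 12-bit addresses (4096 words)
    left = (op1 << 12) | addr1      # first 20-bit instruction
    right = (op2 << 12) | addr2     # second 20-bit instruction
    return (left << 20) | right     # full 40-bit word

def unpack_word(word):
    """Return ((op1, addr1), (op2, addr2)) from a 40-bit word."""
    left, right = word >> 20, word & 0xFFFFF
    return ((left >> 12, left & 0xFFF), (right >> 12, right & 0xFFF))

# Hypothetical opcodes: 0x01 = "load accumulator", 0x02 = "add to accumulator"
w = pack_word(0x01, 100, 0x02, 101)
print(f"{w:010x}", unpack_word(w))   # ... ((1, 100), (2, 101))
```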

Around the same time that Von Neumann was working on the IAS machine, MIT researchers were developing their Whirlwind I computer. Unlike the IAS, ENIAC, and other machines of the same type with long words, the Whirlwind I machine had 16-bit words and was intended for work in real time. This project led to Jay Forrester's invention of magnetic core memory and then the first mass-produced minicomputer.

At that time, IBM was a small company that made punched cards and mechanical machines for sorting punched cards. Although IBM partially financed Aiken's project, it had no interest in computers and did not build the 701 computer until 1953, many years after Eckert and Mauchley's UNIVAC had become number one in the computer market.

The 701 had 2048 words of 36 bits, each word holding two instructions. It was the first of a line of scientific computers that came to dominate the industry within a decade. Three years later came the 704, which had 4 KB of magnetic core memory, 36-bit instructions, and floating-point hardware. In 1958, IBM began production of its last vacuum-tube computer, the 709, essentially a more sophisticated 704.

Second generation - transistors (1955-1965)

The transistor was invented at Bell Laboratories by John Bardeen, Walter Brattain, and William Shockley, for which they received the 1956 Nobel Prize in Physics. Within ten years the transistor revolutionized computer manufacturing, and by the end of the 1950s vacuum-tube computers were hopelessly obsolete. The first transistorized computer was built in a laboratory at MIT (the Massachusetts Institute of Technology). It had 16-bit words, like the Whirlwind I. It was called the TX-0 (Transistorized eXperimental computer 0) and was intended merely as a testbed for the future TX-2 machine.

The TX-2 itself was not of great importance, but one of the engineers at that laboratory, Kenneth Olsen, founded DEC (Digital Equipment Corporation) in 1957 to build a commercial machine similar to the TX-0. That machine, the PDP-1, did not appear until four years later, mainly because DEC's financial backers considered computer manufacturing unprofitable; for a while, DEC sold mostly small electronic circuit boards.

The PDP-1 finally appeared in 1961. It had 4096 words of 18 bits and executed 200,000 instructions per second. This was half the performance of the 7090, the transistorized successor of the 709 and the fastest computer in the world at the time. But the PDP-1 cost $120,000, while the 7090 cost millions. DEC sold dozens of PDP-1 computers, and the minicomputer industry was born.

One of the first PDP-1s was given to MIT, where it quickly attracted the attention of some promising young researchers. One of the PDP-1's innovations was a 512 x 512 pixel display on which dots could be drawn. Before long, MIT students had written a special program for the PDP-1 to play Spacewar, the world's first computer game.

A few years later, DEC developed the PDP-8, a 12-bit computer that cost much less than the PDP-1 ($16,000). Its main innovation was a single bus, the omnibus: a bus is a set of parallel wires used to connect the components of a computer. This innovation radically distinguished the PDP-8 from the IAS-style machines, and the structure has been used in nearly all small computers since. DEC sold 50,000 PDP-8s and became the leader of the minicomputer market.

As noted above, once the transistor had been invented, IBM built a transistorized version of the 709, the 7090, and later the 7094. The 7094 had a cycle time of 2 microseconds and a memory of 32,768 words of 36 bits. The 7090 and 7094 were the last of the ENIAC-style machines, but they were widely used for scientific computing throughout the 1960s.

IBM also produced the 1401 for commercial computing. This machine could read and write magnetic tapes and punched cards and print output nearly as fast as the 7094, but at a lower price. It was unsuited to scientific computation, but it was very convenient for business record keeping.

The 1401 had no registers and no fixed word length. Its memory held 4000 8-bit bytes (later models expanded this to a then-unimaginable 16,000 bytes). Each byte contained a 6-bit character, an administrative bit, and a bit marking the end of a word. The MOVE instruction, for example, had a source address and a destination address; it moved bytes from the first address to the second until it had moved a byte whose end-of-word bit was set to 1.
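Following only the description above (the real 1401 stored its word marks alongside characters in core memory and had its own addressing conventions), a small Python sketch of such a variable-length MOVE might look like this:

```python
# Model memory as a list of (character, end_of_word_bit) pairs and move a
# variable-length field: copy bytes until one with the end-of-word bit is moved.

def move(memory, src, dst):
    """Copy characters from src to dst until the end-of-word bit terminates the field."""
    while True:
        char, mark = memory[src]
        memory[dst] = (char, mark)
        if mark:                 # end-of-word bit set: the field is complete
            break
        src += 1
        dst += 1

# A 16-cell toy memory; the field "HELLO" starts at 0, its last character is marked.
memory = [(" ", 0)] * 16
for i, ch in enumerate("HELLO"):
    memory[i] = (ch, 1 if i == len("HELLO") - 1 else 0)

move(memory, src=0, dst=8)
print("".join(ch for ch, _ in memory[8:13]))   # HELLO
```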

In 1964, CDC (Control Data Corporation) released the 6600, which was almost an order of magnitude faster than the 7094. This number-cruncher was very popular, and CDC took off. The secret of its high performance was that the CPU (central processing unit) was internally a highly parallel machine: it had several functional units for addition, multiplication, and division, and all of them could operate simultaneously. To make the machine run fast, a carefully crafted program was required, and with some effort it could be made to execute 10 instructions at once.

The 6600 also had several small computers built into it. The central processor could therefore devote itself entirely to crunching numbers, while the remaining functions (managing the machine and handling input and output) were delegated to the small computers. Some of the principles used in the 6600 are found in modern computers as well.

The designer of the 6600, Seymour Cray, was a legendary figure, like von Neumann. He devoted his whole life to building extremely powerful computers, which are now called supercomputers; among them were the 6600, the 7600, and the Cray-1. Seymour Cray is also the author of the famous "car-buying algorithm": you go to the dealer closest to your house, point to the car closest to the door, and say, "I'll take that one." This algorithm lets you spend a minimum of time on unimportant matters (buying cars) and most of your time on important ones (designing supercomputers).

Another computer worth mentioning is the Burroughs B5000. The developers of the PDP-1, 7094, and 6600 machines focused entirely on the hardware, trying to make it cheaper (DEC) or faster (IBM and CDC); software was largely an afterthought. The B5000 designers took a different route: they built the machine with the intention of programming it in Algol 60 (a predecessor of C and Java), designing the hardware so as to simplify the compiler's task. Thus arose the idea that software must also be taken into account when designing a computer. But this idea was soon forgotten.

Third generation - integrated circuits (1965-1980)

The invention of the silicon integrated circuit in 1958 by Robert Noyce meant that dozens of transistors could be placed on one small chip. Integrated circuit computers were smaller, faster, and cheaper than their transistor-based predecessors.

By 1964, IBM was the leader in the computer market, but there was one big problem: the 7094 and 1401 computers it produced were incompatible with each other. One of them was intended for complex calculations, it used binary arithmetic on 36-bit registers, the second used the decimal number system and words of different lengths. Many buyers had both of these computers and didn't like that they were completely incompatible.

When the time came to replace these two series, IBM took a bold step. It released the System/360, a single line of computers based on integrated circuits and intended for both scientific and commercial computing. The System/360 contained many innovations. It was a whole family of computers sharing one assembly language, with each successive model larger and more powerful than the previous one. The company could replace the 1401 with the 360 Model 30 and the 7094 with the 360 Model 75. The Model 75 was bigger, faster, and more expensive, but programs written for one could run on the other. In practice, programs written for a small model ran on a large model without difficulty, whereas a program moved from a large machine to a small one might not have enough memory. Still, the creation of such a line of computers was a major achievement. The idea of families of computers quickly caught on, and within a few years most computer companies were offering series of similar machines at different prices and performance levels. The table below lists some parameters of the first models in the 360 family; other models of this family will be discussed later.

The first models of the IBM 360 series:

Parameter | Model 30 | Model 40 | Model 50 | Model 65
Relative performance | 1 | 3.5 | 10 | 21
Cycle time (ns) | 1000 | 625 | 500 | 250
Maximum memory capacity (bytes) | 65,536 | 262,144 | 262,144 | 524,288
Bytes fetched from memory per cycle | 1 | 2 | 4 | 16
Maximum number of data channels | 3 | 3 | 4 | 6

Another innovation in the 360 was multiprogramming: several programs could reside in memory at once, so that while one program was waiting for input/output to complete, another could run. As a result, processor resources were used more efficiently.

The 360 was also the first machine that could fully emulate other computers. The small models could emulate the 1401, and the large ones could emulate the 7094, so customers could keep running their old programs unchanged on the 360. Some 360 models ran 1401 programs so much faster than the 1401 itself that rewriting those programs made no sense.

The 360 series computers could emulate other computers because they were built using microprogramming. It was necessary to write only three microprograms: one for the 360 instruction set, another for the 1401 instruction set, and a third for the 7094 instruction set. The requirement for flexibility was one of the main reasons for using microprogramming.

The 360 also resolved the dilemma between binary and decimal arithmetic: it had 16 32-bit registers for binary arithmetic, but its memory was byte-oriented like that of the 1401, and it had 1401-style instructions for moving records of varying sizes from one part of memory to another.

The memory capacity of the 360 was 2^24 bytes (16 MB). At the time, that seemed like an enormous amount of memory. The 360 line was later succeeded by the 370 line, then the 4300, 3080, and 3090 lines, all with a similar architecture. By the mid-1980s, 16 MB was no longer enough, and IBM had to give up some compatibility in order to move to the 32-bit addressing needed for memories of 2^32 bytes.

One might think that since the machines had 32-bit words and registers, they might as well have 32-bit addresses. But at that time, no one could even imagine a computer with a memory capacity of 16 MB. Blaming IBM for lack of foresight is the same as blaming modern personal computer manufacturers for having only 32-bit addresses. Perhaps in a few years the memory capacity of computers will be much larger than 4 GB, and then 32-bit addresses will not be enough.
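The arithmetic behind these address-size limits is easy to check; here is a minimal Python sketch (illustration only) showing why 24-bit addresses top out at 16 MB and 32-bit addresses at 4 GB on a byte-addressable machine.

```python
# Addressable memory as a function of address width on a byte-addressable machine.
def addressable_bytes(address_bits):
    return 2 ** address_bits               # number of distinct byte addresses

print(addressable_bytes(24) // 2**20, "MB")   # 16 MB -- the System/360 limit
print(addressable_bytes(32) // 2**30, "GB")   # 4 GB  -- the 32-bit limit discussed above
```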

The world of minicomputers took a big step forward in the third generation with DEC's introduction of the PDP-11 line, the successor to the PDP-8, with 16-bit words. In many ways the PDP-11 was a little brother to the 360, just as the PDP-1 had been a little brother to the 7094. Both the 360 and the PDP-11 had registers, words, and byte-oriented memory, and both lines spanned a considerable range of prices and capabilities. The PDP-11 was widely used, especially in universities, and kept DEC in the lead among minicomputer makers.

Fourth generation - very large scale integration (1980-?)

The appearance of very large scale integration (VLSI) in the 1980s made it possible to put first tens of thousands, then hundreds of thousands, and finally millions of transistors on a single chip. This led to smaller and faster computers. Before the PDP-1, computers were so big and expensive that companies and universities had to have special departments (computing centers) to run them. By the 1980s, prices had fallen so far that not only organizations but also individuals could afford a computer. The era of the personal computer had begun.

Personal computers were used for quite different purposes than their predecessors: for word processing, spreadsheets, and highly interactive applications (such as games) that large computers could not handle well.

The first personal computers were sold as kits. Each kit contained a printed circuit board, a set of integrated circuits (typically including an Intel 8080), some cables, a power supply, and sometimes an 8-inch floppy disk drive. The buyer had to assemble the computer from these parts and write any software himself, since none was supplied. Later came the CP/M operating system, written by Gary Kildall for the Intel 8080. It was distributed on a floppy disk and included a file system and an interpreter for executing user commands typed at the keyboard.

Another personal computer, the Apple (and later the Apple II), was developed by Steve Jobs and Steve Wozniak. It became extremely popular among home users and in schools, making Apple a serious player in the market.

Watching what other companies were doing, IBM, then the leader of the computer market, decided to produce a personal computer as well. But instead of designing the machine from scratch out of IBM components, which would have taken far too long, IBM gave one of its employees, Philip Estridge, a large sum of money, told him to go somewhere far away from the meddling bureaucrats at corporate headquarters in Armonk, New York, and not to come back until he had a working personal computer. Estridge set up shop far from headquarters, in Florida, chose the Intel 8088 as the CPU, and built the IBM PC from off-the-shelf components. The IBM PC appeared in 1981 and became the best-selling computer in history.

However, IBM did one thing it later came to regret. Instead of keeping the machine's design secret (or at least protecting it with patents), as it usually did, the company published the complete design, including all the circuit diagrams, in a book that sold for $49. The book was published so that other companies could make plug-in boards for the IBM PC, thereby increasing the machine's compatibility and popularity. Unfortunately for IBM, once the design was widely known, many companies began making PC clones and often sold them far more cheaply than IBM (since all the components could be bought off the shelf). Thus began the explosive growth of the personal computer.

Although some companies (such as Commodore, Apple, and Atari) built personal computers around their own processors rather than Intel's, the momentum of the IBM PC was so great that the others struggled to compete. Only a few survived, and only because they specialized in narrow niches such as workstations or supercomputers.

The first version of the IBM PC was equipped with the MS-DOS operating system, supplied by the then-tiny Microsoft Corporation. IBM and Microsoft later jointly developed OS/2, the successor to MS-DOS, whose characteristic feature was a graphical user interface (GUI) similar to that of the Apple Macintosh. Meanwhile, Microsoft also developed its own operating system, Windows, which ran on top of MS-DOS, in case OS/2 did not catch on. OS/2 indeed failed to catch on, and Microsoft went on to release ever more successful versions of Windows, which caused an enormous rift between IBM and Microsoft. The legend of how tiny Intel and even tinier Microsoft managed to overtake IBM, one of the largest, richest, and most powerful corporations in world history, is told in detail in business schools around the world.

The initial success of the 8088 encouraged Intel to keep improving it. Particularly noteworthy is the 386, released in 1985, the first 32-bit ancestor of the Pentium line. Modern Pentium processors are much faster than the 386, but from an architectural standpoint they are essentially more powerful versions of it.

In the mid-1980s, the CISC (Complex Instruction Set Computer) began to give way to the RISC (Reduced Instruction Set Computer). RISC instructions were simpler and much faster. In the 1990s, superscalar processors appeared that could execute many instructions simultaneously, often out of the order in which they appear in the program.

Up until 1992, personal computers were 8-, 16-, and 32-bit. Then came DEC's revolutionary 64-bit Alpha, the ultimate RISC computer that far outperformed all other PCs. However, at that time the commercial success of this model turned out to be very modest - only a decade later did 64-bit machines gain popularity, and then only as professional servers.

Fifth generation - invisible computers

In 1981, the Japanese government announced plans to grant national companies $500 million to develop fifth-generation computers based on artificial intelligence, which were supposed to supersede the "dim-witted" fourth-generation machines. Having watched Japanese companies rapidly capture market positions in industries from cameras to stereos to televisions, American and European manufacturers panicked and rushed to demand similar subsidies and other support from their own governments. Despite all the noise, however, the Japanese fifth-generation project ultimately failed and was quietly shelved. In a sense, the situation resembled the one Babbage had faced: the idea was so far ahead of its time that no adequate technological basis existed for realizing it.

Nevertheless, something that could be called the fifth generation of computers did materialize, but in an unexpected way: computers began to shrink rapidly. The Apple Newton, which appeared in 1993, showed clearly that a computer could fit in a case the size of a cassette player. The Newton's handwriting input was far from perfect, but the user interfaces of subsequent machines of this class, now called personal digital assistants (PDAs), or simply pocket computers, were improved, and they gained wide popularity. Many handheld computers today are no less powerful than the ordinary PCs of two or three years earlier.

But even pocket computers were not the truly revolutionary development. Far greater importance attaches to the so-called "invisible" computers: those built into household appliances, watches, bank cards, and a huge number of other devices. Processors of this type provide extensive functionality and an equally wide range of applications at a very modest price. Whether these chips really constitute a separate generation is debatable (they have been around since the 1970s), but they are expanding the capabilities of household and other devices by orders of magnitude. The influence of invisible computers on world industry is already very large, and it will only grow over the years. One peculiarity of this type of computer is that their hardware and software are often designed together (co-design).

Conclusion

To summarize: the first generation comprised vacuum-tube computers (such as ENIAC), the second transistorized machines (the IBM 7094), the third the first computers built on integrated circuits (the IBM 360), and the fourth personal computers (the Intel CPU lines). As for the fifth generation, it is associated not so much with a specific architecture as with a paradigm shift. The computers of the future will be embedded in every conceivable device and will thereby truly become invisible. They will become a firm part of everyday life, opening doors, turning on lamps, dispensing money, and performing thousands of other duties. This model, developed by Mark Weiser late in his career, was originally called ubiquitous computing, although the term pervasive computing is also widely used today. This phenomenon promises to change the world no less profoundly than the industrial revolution did.

Based on material from the book "Computer Architecture" by Andrew Tanenbaum, 5th edition (published in English as "Structured Computer Organization").

1. First generation of computers
The first generation of computers saw the light of day in 1942, when the first electronic digital computer was created. This invention belongs to the American physicist Atanasoff. In 1943, the Englishman Alan Turing developed Colossus, a secret computer designed to decipher intercepted messages of the German forces. These computers ran on vacuum tubes and were the size of a room. In 1945, the mathematician John von Neumann showed that a computer could efficiently perform any computation using suitable program control, without any change to the hardware. This principle became the basic rule for future generations of high-speed digital computers.

2. Second generation of computers
In 1947, the engineers John Bardeen and Walter Brattain invented the transistor. Transistors were quickly adopted in radio engineering, replacing the bulky and inconvenient vacuum tube. In the 1960s, transistors became the element base of second-generation computers. Machine performance began to reach hundreds of thousands of operations per second, and the volume of internal memory grew hundreds of times compared with first-generation computers. High-level programming languages began to develop actively: Fortran, Algol, Cobol.
3. Third generation of computers
The transition to the third generation is associated with substantial changes in computer architecture. Machines now ran on integrated circuits, and several programs could be run on one computer at a time. The speed of many machines reached several million operations per second. Magnetic disks appeared, and input/output devices came into wide use.
4. Fourth generation of computers.
Another revolutionary event in electronics occurred in 1971, when the American company Intel announced the creation of the microprocessor. By connecting microprocessors to input/output devices and external memory, a new type of computer was obtained: the microcomputer, the fourth generation of computers. These computers were small and cheap, and they used a color graphics display, pointing devices, and a keyboard.
In 1977, one of the first mass-produced personal computers, the Apple II, was created. The first Soviet personal computer was the Agat (1985). From 1980 onward, the American company IBM set the tone in the computer market; in 1981 it released its first personal computer, the IBM PC. Another line in the development of fourth-generation machines is the supercomputer; among Soviet machines, the Elbrus computers belonged to this class. Fifth-generation computers are machines of the near future. Their main quality should be a high intellectual level: they will allow voice input, voice communication, and machine "vision" and "touch". Much has already been done in this direction.

If you are reading this text, it means that a couple of seconds ago your computer performed a huge number of complex operations. Today we will talk about the times when that same process would have taken a couple of minutes, and about the times when it was simply impossible. The evolution of the computer is the subject of this material, prepared by Amateur.media and Rostec.

Human Computer

Computers appeared in England around the 15th century - and this is not a belated April Fool's joke. "Computers" were originally people in England whose job was to carry out complex arithmetic calculations. The word computer itself comes from the Latin "computo", "I calculate". Of course, the functionality of modern computers has long gone beyond purely mathematical operations, but the first electronic computing machines were created precisely for that purpose.

"Global Village"

In 1822, young English mathematician Charles Babbage (in the future, by the way, a member of the Imperial Academy of Sciences in St. Petersburg) brought a mechanism with many gears and levers to a meeting of the Royal Astronomical Society. The difference machine, as the inventor himself called it, shocked everyone present: it could, for example, calculate the values ​​of polynomials of the 7th degree. However, this invention remained just a successful experiment, since its small memory did not allow it to carry out the calculations necessary for astronomers.

The computing section of the Difference Engine

After the Difference Engine, Babbage decided to aim at something more ambitious and designed the Analytical Engine, in whose image modern computers are built. The Analytical Engine was never completed: in its final form it would have been no smaller than a railway locomotive. Its structure was more reminiscent of the description of a village: inside there was a "store" (what we would now call memory), a "mill" (in modern terminology, a processor), a control element (apparently Babbage could not think of a rural name for this one), and devices for the input and output of data. Essentially, all that remained to be done to create a computer was to come up with the stored-program scheme - and that took more than a hundred years.


Charles Babbage described the approximate design of a computer in the middle of the 19th century

IBM: the "Big Blue" giant with giant computers

The IBM company (at that time still called TMC, the Tabulating Machine Company) appeared in 1896; its founder was Herman Hollerith, a descendant of German emigrants. Initially the company specialized in producing tabulating machines. In 1911, Hollerith decided it was time to retire and sold the company to the extraordinarily successful millionaire entrepreneur Charles Flint. One reason for Flint's success was that he did not turn down any orders: the company carried out US government contracts, and at the same time, according to Edwin Black, author of the book "IBM and the Holocaust", statistics on imprisoned Jews in Nazi Germany were also kept on IBM machines.


IBM employees include five Nobel laureates

In 1941, as Europe was being shaken by the Second World War, the Harvard mathematician Howard Aiken and his four assistants assembled the first American programmable computer, the Mark I, at IBM. The giant weighed 4.5 tons, was about 17 meters long, and stood taller than a person; the total length of its wiring was almost 800 kilometers. The Mark I took more than a minute to compute a single logarithm, but could perform a division in just 15 seconds. Every program for it was a huge roll of tape from which it read its instructions. Many researchers nevertheless call it the first really working computer, since no human intervention was needed while it ran. Howard Aiken later left IBM and continued developing the Mark line on his own, creating the Mark IV in 1952.


A small part of the Mark I

The PDP-1: an expensive pleasure

The year 1960 was a turning point for computer technology: DEC brought to market the first minicomputer, the PDP-1, which was equipped with a keyboard and a monitor. True, only 50 copies were sold, and only three of them survive today.


This machine sold for $120,000

By the way, the first computer mouse was wooden

A few years later the last attribute without which it is hard to imagine a modern computer appeared: Douglas Engelbart invented the computer mouse. It was indeed wooden, and the scientist initially planned to give it five buttons, one for each finger. He called it a mouse because the wire coming out of the back looked like a tail. The cursor, in turn, also received a funny name: at first it was called a "bug". In the USSR, a computer mouse called the "Manipulator Kolobok" was released much later.


This type of mouse was soon replaced by ball drive mice

The beginning of the computer era

If computer technology took a significant step forward in 1960, then in 1969 it ran a marathon. That year the Pentagon created the ARPAnet network, rightly considered the predecessor of the Internet; the first floppy disk was in the final stages of development (floppies, incidentally, are still used in the US presidential administration); and Honeywell announced the release of the H316 "Kitchen Computer". It was the H316 that became the world's first home computer.


Also in 1969, the schoolboy Steve Jobs met the student Steve Wozniak. The two began building their own computers in a garage, went on to dominate the field of computer technology for many years, and became some of the most famous people of our time.

Ivan Steinert

The computer is one of the greatest inventions of its time. Billions of people worldwide use computers in their everyday lives.

Over the decades, the computer has evolved from a very expensive and slow device to today's extremely smart machines with incredible processing power.

No single person is credited with inventing the computer; many believe that Konrad Zuse and his Z1 machine were the first in the long line of innovations that gave us the computer. Konrad Zuse was a German who gained fame for creating the first freely programmable mechanical computing device in 1936. Zuse's Z1 was built around three main elements that are still used in modern calculators. Later, Konrad Zuse created the Z2 and Z3.

The first computers of the Mark series were built at Harvard. The Mark I was created in 1944; it was the size of a room, 55 feet long and 8 feet high, and could perform a wide range of calculations. It proved a successful invention and was used by the US Navy until 1959.

The ENIAC computer was one of the most important advances in computing. It was commissioned by the American military during World War II. It used vacuum tubes rather than electric motors and levers, allowing very fast calculation: it was thousands of times faster than any other computing device of the time. The computer was huge and cost a total of $500,000. ENIAC remained in service until 1955.

RAM, or random access memory, was introduced in 1964. The first RAM was a metal plate positioned close to a vacuum tube that detected differences in electrical charge. It was an easy way to store computer instructions.

There were many innovations in the 1940s. In Manchester, researchers who had come from the Telecommunications Research Establishment developed the first computer to run a stored program; it became operational in 1948. Its successor, the Manchester Mark I, carried the work further and by 1951 had demonstrated enormous progress.

UNIVAC was built by the creators of ENIAC. It was the fastest and most innovative computer of its day, capable of handling many kinds of calculations. It was a masterpiece of its time and was highly praised by the public.

IBM produced the first personal computers to be widely used and affordable. The IBM 701 was the first general-purpose computer developed by IBM. A new computer language called Fortran was used in the subsequent 704 model. The IBM 7090 was also a great success, and IBM dominated business computing over the following 20 years. In the late 1970s and in 1980, IBM developed the personal computer known as the PC. IBM has had a huge influence on the computers used today.

With the growth of the personal computer market in the early and mid-1980s, many companies realized that a graphical user interface (GUI) was more user-friendly. This led to Microsoft's development of the operating system named Windows. The first version was called Windows 1.0, and Windows 2.0 and 3.0 followed. Microsoft has only grown more popular since.

Today, computers are extremely powerful and more affordable than ever. They have penetrated practically every aspect of our lives and serve as powerful tools for communication and commerce. The future of computers is enormous.

The personal computer (PC) has greatly changed humanity's relationship with computing resources. With each new PC model, people transferred more and more functions onto the shoulders of the machine, starting from simple calculations and ending with accounting or design. That is why malfunctions, failures, and downtime of computer technology have become not just unwanted misunderstandings, but a real disaster that can lead to direct economic losses and other unacceptable consequences.

The first milestones in the development of personal computers


In the second half of the 20th century, only large companies had computers, not only because of the high price of the equipment but also because of its impressive size. Enterprises that developed and manufactured computer equipment therefore strove to miniaturize and cut the cost of their products. Microminiaturization and the widespread development of integrated circuits eventually allowed a computer to fit on a desk, and in 1973 Xerox introduced the Alto, the first personal computer. For the first time, programs and files were displayed on the screen in the form of "windows".

In 1975, the first commercial PC, the Altair-8800, was released, built around the Intel 8080 microprocessor. Its RAM was 256 bytes, and the machine was controlled from a panel of switches. For data input and output, an 8-inch floppy disk drive, purchased separately, could be attached. The first version of the i8080 microprocessor was manufactured in a 48-pin planar package with a maximum clock frequency of 2 MHz. However, the processor had a serious flaw that could cause it to freeze; only a "reset" signal could revive the system. A corrected and improved version of the processor, the 8080A, was released six months later. It was manufactured in a DIP-40 package, and its maximum clock frequency rose to 2.5 MHz.

The beginning of the journey of Apple and Intel


In 1976, Steve Jobs and Steve Wozniak, working in Palo Alto, assembled a working computer board called the Apple I. It was housed in a wooden case and had no keyboard or screen of its own. The board carried the processor and 8 KB of RAM and provided for the possibility of outputting information to a screen.

In 1977, Wozniak and Jobs developed the first complete PC, the Apple II, in a plastic case, with an integrated keyboard, and a TV used as a display. That same year, Commodore introduced a PC called the PET.

In June 1978, Intel created the first 16-bit microprocessor, the i8086. Thanks to its segmented memory organization, it could address up to 1024 KB of RAM. The i8086 used an instruction set that is still used in modern processors, and with its appearance the x86 architecture was born. The processor's clock frequency ranged from 4 to 10 MHz. It is worth noting that the 8086 gained popularity mainly thanks to the Compaq DeskPro computer.
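The segmented addressing that lets 16-bit registers reach 1024 KB works by combining a 16-bit segment and a 16-bit offset as segment x 16 + offset, producing a 20-bit physical address. A small illustrative sketch in Python:

```python
# 8086 real-mode address calculation: physical = segment * 16 + offset.
# A 20-bit result -> up to 2**20 = 1,048,576 bytes (1024 KB) of addressable memory.

def physical_address(segment, offset):
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return ((segment << 4) + offset) & 0xFFFFF   # wraps at the 1 MB boundary

print(hex(physical_address(0xB800, 0x0000)))  # 0xb8000 (a classic text-video segment)
print(2 ** 20 // 1024, "KB addressable")      # 1024 KB
```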

In 1980, Osborne Computer began producing the first portable PCs, which had the dimensions of a suitcase and weighed 11 kg.

IBM's first steps


In 1981, IBM released the IBM PC, a microcomputer with an open architecture based on the 16-bit Intel 8088 microprocessor. The 16-bit i8088, with its 8-bit external data bus, had clock speeds from 5 to 10 MHz. The PC was equipped with a monochrome text display, two 5.25-inch floppy disk drives of 160 KB each, and 64 KB of RAM.

In 1983, the IBM PC XT (eXtended Technology) appeared, with 256 KB of RAM and a 10 MB hard disk. Its processor clock frequency was 5 MHz.

The IBM PC AT (Advanced Technology) was introduced in 1984. The computer ran on an Intel 80286 microprocessor and the ISA architecture and came with a 20 MB hard drive. The Intel 80286 (produced since February 1982) made it possible to switch to the AT bus: a 16-bit data bus and a 24-bit address bus. It became possible to address up to 16 MB of RAM (compared with 640 KB in the original IBM PC). The motherboard carried a battery to power a clock chip, which kept the time and stored configuration data in about 50 bytes of memory. Processor clock speeds: 80286-6 - 6 MHz, 80286-8 - 8 MHz, 80286-10 - 10 MHz, 80286-12 - 12.5 MHz.

In October 1985, Intel created its first 32-bit microprocessor, the i80386, containing about 275 thousand transistors. The first PC to use this microprocessor was the Compaq DeskPro 386. A cheaper alternative, the i80386SX with a 16-bit external data bus (after whose appearance the original processor received the DX suffix), did not arrive until June 1988. It was the 386 that brought a noticeable rise in the clock speeds of personal computers: different models of 386 processors ran at 16, 20, 25, 33, and 40 MHz.

Intel's Colossal Breakthrough


In 1989, Intel released the 486DX microprocessor. It had 1.2 million transistors on a single chip and was fully compatible with earlier x86 processors. This chip was the first to combine on one die a central processor, a mathematical coprocessor, and cache memory. Clock frequencies of the various 486 models ranged from 16 to 150 MHz, and computers based on the 486 reached 133 MHz (the so-called DX4). The 486DX2 processors had a clock multiplier of 2 (with a 50 MHz system bus, the processor ran at 100 MHz). Later, processors with the DX4 index were produced; their multiplier was not 4 but 3. After Intel's 486 processors left the market, AMD released the 486DX4-120 and 486DX4-133. With the introduction of multipliers, the concept of overclocking appeared for the first time: raising performance by increasing the bus clock frequency or the multiplication factor. Systems were on sale in which i486 processors were overclocked to 160 MHz.
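The multiplier arithmetic behind these figures is simple: core frequency = bus frequency x multiplier. A tiny illustrative sketch (the configurations in the comments are examples, not a catalogue of shipping parts):

```python
# Core clock = system-bus clock * internal multiplier.
def core_clock_mhz(bus_mhz, multiplier):
    return bus_mhz * multiplier

print(core_clock_mhz(50, 2))   # 100 MHz -- a 486DX2 on a 50 MHz bus
print(core_clock_mhz(40, 3))   # 120 MHz -- a DX4-style x3 part on a 40 MHz bus
print(core_clock_mhz(40, 4))   # 160 MHz -- a hypothetical overclocked configuration
```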

In March 1993, Intel began shipping 60 and 66 MHz versions of the Pentium processor. Pentium-based PCs were fully compatible with computers using the i8088, i80286, i80386, and i486. The new processor contained approximately 3.1 million transistors and had a 32-bit address bus and a 64-bit external data bus.

In May 1997, Intel introduced the Pentium II processor, based on the Pentium Pro. A unit for processing MMX instructions was added to the P6 core. The second-level cache was moved out of the processor package onto the cartridge, which helped make the Pentium II affordable and contributed to its mass adoption. Clock speeds of Pentium II processors rose noticeably; different models ran at 233, 266, 300, 333, 350, 400, 433, 450, 466, 500, and 533 MHz.

The sixth-generation 32-bit Intel Pentium III microprocessor was released in February 1999. It largely replicated the Pentium II but added new features: 70 new SSE instructions (Streaming SIMD Extensions, also called MMX2) aimed at multimedia, and an improved L1 cache controller. Clock frequencies of the Pentium III (Katmai) were 450, 500, 533, 550, and 600 MHz; Coppermine-based models ranged from 533 to 1133 MHz; and Pentium III processors on the Tualatin core ran at 1000 to 1400 MHz.

The era of multi-core processors


At the end of November 2000, Intel introduced Pentium 4 processors with clock speeds above 1 GHz, built on the NetBurst architecture and using fast Rambus memory with an effective system bus frequency of 400 MHz. The processors added 144 new SSE2 instructions. Clock speeds of the first Pentium 4 processors ranged from 1.4 to 2.0 GHz; in later modifications the clock frequency rose from 2.2 to 3.8 GHz.

In July 2006, Intel released its dual-core Core 2 processors; the first models in this line were the Intel Core 2 Duo and Intel Core 2 Extreme. They were based on the new Intel Core microarchitecture, which the company called the most significant step in the development of its microprocessors since the appearance of the Pentium brand in 1993. Thanks to EM64T technology, Intel Core 2 processors can operate in both 32-bit and 64-bit modes. The main differences from the Pentium 4 family are lower heat output and power consumption, along with greater overclocking headroom. Core 2 Duo frequencies range from 1.5 to 3.5 GHz.

At the beginning of 2007, the Core 2 Quad, a quad-core processor, was introduced. Clock frequencies – from 2.33 to 3.2 GHz.

In January 2010, Intel Core i3 processors appeared. They added a built-in graphics processor, which performs calculations in "graphics" mode. The family also gained a built-in "intelligent" auto-acceleration feature: at medium and low loads the processor runs at its rated performance and saves energy, while under increased load it automatically raises its performance. The cache (the processor's internal fast memory) was enlarged and is dynamically shared between the cores depending on load. The new processors run hotter, especially during automatic overclocking, and accordingly require a more effective cooling system. Clock frequencies of the i-series processors (i3, i5, i7) range from 2.66 to 3.6 GHz.






