I. History of computer development
As early as the 17th century, a number of mathematicians in Europe began to design and build machines for basic arithmetic in digital form. In 1642 the French mathematician Blaise Pascal built the earliest decimal adding machine, using gear trains similar to clockwork. In 1678 the German mathematician Leibniz built a calculator that went further, handling decimal multiplication and division as well.
In 1822, while building a model of his Difference Engine, the British mathematician Charles Babbage proposed that a machine should automate a complete sequence of operations rather than carrying out one arithmetic step at a time. In 1834 Babbage went on to design a general-purpose, program-controlled Analytical Engine. Although this Analytical Engine already sketched the prototype of the program-controlled computer, the technical conditions of the time prevented it from being realized.
In the more than 100 years after Babbage's idea, electromagnetism, electrical engineering and electronics made significant progress. In components and devices, the vacuum diode and the vacuum triode were invented; in systems technology, radio telegraphy, television and radar appeared. All these achievements prepared the technical and material conditions for the development of the modern computer.
At the same time, mathematics and physics flourished accordingly. By the 1930s the various fields of physics had entered a quantitative stage, and some of the mathematical equations describing physical processes had become difficult to solve with classical analytical methods. Numerical analysis therefore attracted attention: numerical integration, numerical differentiation and the numerical solution of differential equations reduce a computation to a huge number of basic operations, laying the foundation for the numerical algorithms of the modern computer.
Society's urgent need for advanced computing tools was the fundamental driving force behind the birth of the modern computer. After the turn of the 20th century, the difficulty of computation in various scientific and technical fields hampered the further development of those disciplines. Around the outbreak of the Second World War in particular, military science and technology urgently needed high-speed computing tools. During this period, Germany, the United States and Britain began developing electromechanical and electronic computers almost simultaneously.
In 1936 the British mathematician Alan Turing published his landmark paper "On Computable Numbers, with an Application to the Entscheidungsproblem", in which he put forward the concept of the "Turing machine". The Turing machine is not a specific machine but an abstract computational model. Turing proved that every computable function has a corresponding Turing machine, whereas a function for which no such machine exists is one whose values cannot be computed by any algorithm. This means that any solvable problem, however complex, can be reduced to a sequence of simple, repetitive operations.
Germany's Konrad Zuse was the first to build a computer from electrical components. His automatic relay computer Z-3, completed in 1941, had such characteristics of modern computers as floating-point notation, binary arithmetic and a storage-address instruction format. In the United States, the relay computers MARK-1, MARK-2, Model-1, Model-5 and others were built between 1940 and 1947. However, the switching time of a relay is about one hundredth of a second, which severely limited the computing speed of these machines.
The development of the electronic computer evolved from components to complete machines, from special-purpose to general-purpose machines, and from "external programs" to "stored programs". In 1938 the Bulgarian-American scholar Atanasoff first built an arithmetic unit for an electronic computer. In 1943 the British Foreign Office's communications department built the "Colossus", a dedicated cipher-analysis machine that was used during the Second World War.
The Electronic Numerical Integrator And Computer (ENIAC), completed at the Moore School of the University of Pennsylvania in February 1946, was originally dedicated to calculating artillery firing tables; after many improvements it became a general-purpose machine capable of various scientific calculations. Performing arithmetic, logic and information storage entirely with electronic circuits, it was 1,000 times faster than a relay computer. This is the machine people usually refer to as the world's first computer. However, its program was still external and its storage capacity too small, so it did not yet fully possess the main characteristics of the modern computer.
The next major breakthrough was made by the design team led by the mathematician von Neumann. In March 1945 they published the design of an entirely new stored-program general-purpose computer, the Electronic Discrete Variable Automatic Computer (EDVAC). Then, in June 1946, von Neumann and his colleagues put forward a more complete design report, "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument". In July and August of that year they lectured on the theory and techniques of computer design to experts from more than 20 institutions in the United States and Britain, which promoted the design and manufacture of stored-program computers.
In 1949 the Mathematical Laboratory of the University of Cambridge took the lead in completing the Electronic Delay Storage Automatic Calculator (EDSAC), and in 1950 the Standards Eastern Automatic Computer (SEAC) was built in the United States. At this point the pioneering stage of the electronic computer came to an end, and the era of modern computer development began.
Alongside the creation of the digital computer, another important kind of computing tool was developed: the analog computer. When physicists summarize the laws of nature, they describe a process with mathematical equations; conversely, a mathematical equation can also be solved by simulating it with a physical process. After the invention of logarithms, the slide rule, made in 1620, computed by turning multiplication and division into addition and subtraction. Maxwell cleverly transformed the calculation of an integral (an area) into a measurement of length, producing an integrator in 1855.
Another significant achievement of 19th-century mathematical physics, Fourier analysis, played a direct role in the development of analog computers. In the late 19th and early 20th centuries, various analyzers for computing Fourier coefficients and differential analyzers for solving differential equations were built. However, when attempts were made to extend the differential analyzer to partial differential equations and to use analog machines for general scientific computation, people gradually realized the limitations of the analog computer in generality and accuracy, and turned their main energy to the digital computer.
After the advent of the electronic digital computer, analog computers continued to evolve and were combined with digital computers to produce hybrid computers. Analog and hybrid machines have developed into a special branch of modern computers, serving as efficient information-processing or simulation tools in specific fields.
Since the middle of the 20th century, the computer has been in a period of rapid development, growing from hardware alone into computer systems comprising three subsystems: hardware, software and firmware. The performance-price ratio of computer systems has increased by an average of two orders of magnitude every 10 years. The variety of computers has also diversified into microcomputers, minicomputers and general-purpose computers (including supercomputers, large and medium-sized machines), as well as various special-purpose machines (such as control computers and analog-digital hybrid computers).
Computer components, advancing from vacuum tubes to transistors and from discrete components to integrated circuits and microprocessors, have driven three great leaps in the development of the computer.
In the vacuum-tube computer period (1946-1959), computers were used mainly for scientific calculation. Main memory was the chief factor determining the face of computer technology. The main memories of the time included mercury delay-line memory, cathode-ray-tube electrostatic memory, magnetic drum and magnetic core memory, and computers were usually classified accordingly.
By the transistor computer period (1959-1964), main memory was core memory, with magnetic drums and disks as the principal auxiliary memory. Not only did scientific computing continue to develop, but small and medium-sized computers, especially inexpensive small data-processing machines, began to be mass-produced.
In 1964, with the development of the integrated-circuit computer, the computer also entered a period of product serialization. Semiconductor memory gradually replaced core memory as main memory, the disk became an indispensable auxiliary memory, and virtual storage technology was widely adopted. With the rapid development of various semiconductor read-only and rewritable read-only memories, and the development and application of microprogramming, the firmware subsystem began to appear in computer systems.
After the 1970s, the integration level of computer circuits rose rapidly from small scale to large and very large scale; the microprocessor and the microcomputer came into being, and the performance of all kinds of computers improved quickly. As microcomputers with word lengths of 4, 8, 16, 32 and 64 bits were introduced and widely used, the demand for minicomputers, general-purpose computers and special-purpose computers rose correspondingly.
After microcomputers were applied in society in large numbers, an office building, a school or a warehouse often contained dozens or even hundreds of computers. Local area networks arose to interconnect them, further promoting the development of computer application systems from centralized to distributed.
In the vacuum-tube computer period, some computers were equipped with assembly languages and subroutine libraries, and Fortran, the high-level language for scientific computing, first appeared. In the transistor computer period, COBOL for transaction processing, ALGOL for scientific computation, and high-level languages such as Lisp for symbolic processing began to enter practical use. The operating system took its initial shape, changing the use of the computer from manual operation to automatic operation management.
After entering the integrated-circuit computer period, a software subsystem of considerable scale took shape in the computer: high-level languages were further improved, and operating systems matured, offering such functions as batch processing, time-sharing and real-time processing. Database management systems, communication-processing programs, network software and so on were continuously added to the software subsystem. The steadily growing capability of the software subsystem visibly changed how computers are used and greatly improved their efficiency.
In modern computers, the value of peripheral devices generally accounts for more than half of the hardware subsystem, and their technical level largely determines the technological face of the computer. Peripheral technology is highly interdisciplinary: it depends not only on integrated knowledge of electronics, mechanics, optics and magnetics, but also on the level of precision mechanical engineering, electrical and electronic manufacturing, and measurement technology.
Peripheral devices fall into two major categories: auxiliary memory and input/output devices. Auxiliary memory includes disks, drums, tapes, laser memory, mass memory, micro-memory and so on; input/output devices include input, output, conversion and pattern-information-processing equipment and terminal devices. Among these many kinds of equipment, those with the most significant impact on computer technology are disks, terminal devices, pattern-information-processing equipment and conversion equipment.
A new generation of computers is an intelligent computer system combining information acquisition, storage, processing, communication and artificial intelligence. It can not only perform general information processing but also handle knowledge, with abilities of formal inference, association, learning and explanation, and will help humanity explore unknown domains and obtain new knowledge.
The extensive application of computers in all walks of life often produces significant economic and social benefits, leading to major changes in industrial structure, product structure, and modes of management and service. New industries such as computer manufacturing and computer services, as well as the knowledge industry, have already appeared in the industrial structure.
To say that the computer was "invented" is in fact somewhat inappropriate, because the computer was not invented by one person, nor even by a single independent organization or institution; later generations always climb onward standing on the shoulders of their predecessors. The computer is the crystallization of the efforts of countless generations; human civilization has always been accompanied by the invention and improvement of tools, so it should be said that the computer is a product of human civilization. Moreover, I believe that once civilization develops to a certain degree it will inevitably produce the computer. McLuhan said that media are extensions of the human being; the computer is the extension of the human brain, and when the body has been extended to a certain degree, humanity needs a more advanced and efficient extension.
The most important founders of the computer were the British scientist Alan Turing and the Hungarian-American scientist John von Neumann. Turing's contribution was to establish the theoretical model of the Turing machine, which laid a foundation for artificial intelligence. Von Neumann was the first to put forward the idea of the modern computer architecture. In 1945 von Neumann proposed the stored-program principle: treat the program itself as data, store the program and the data it processes in the same way, and define the five components of the stored-program computer and its basic mode of operation. Over more than half a century, computer manufacturing technology has changed enormously, but the von Neumann architecture is still in use, and von Neumann is often called the "father of the computer". The following focuses on the Turing machine and the von Neumann architecture.

II. The Turing machine

Turing's basic idea was to use a machine to simulate the process of doing mathematical operations with paper and pen, which he reduced to two simple actions: writing or erasing a symbol on the paper, and moving attention from one position on the paper to another. At each stage, the next action is determined by (a) the symbol at the position on the paper the person is currently attending to and (b) the person's current state of mind. To simulate this human process, Turing constructed an imaginary machine consisting of the following parts:
- An infinitely long tape (TAPE). The tape is divided into small squares, one after another, each containing a symbol from a finite alphabet, which includes a special symbol representing a blank. The squares are numbered 0, 1, 2, ... from left to right, and the right end of the tape can be extended indefinitely.
- A read-write head (HEAD). The head can move along the tape; it can read the symbol in the current square and can change it.
- A set of control rules (TABLE). Based on the machine's current state and the symbol under the read-write head, it determines the head's next action and changes the value of the state register, putting the machine into a new state.
- A state register. It stores the current state of the Turing machine. The number of possible states of a Turing machine is finite, and among them is a special state called the halting state.
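The four components above can be sketched directly in code. The following is a minimal, hypothetical Python simulator; the rule encoding, the state names, and the example machine (which appends a 1 to a unary number) are illustrative assumptions, not part of Turing's original formulation.

```python
def run_turing_machine(rules, tape, state="q0", halt="HALT", max_steps=1000):
    """Simulate a Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right).  Tape cells not yet
    visited hold the blank symbol "B" (the special blank of the
    finite alphabet).  The state register is the `state` variable.
    """
    cells = {i: s for i, s in enumerate(tape)}  # the TAPE
    head = 0                                    # the read-write HEAD
    for _ in range(max_steps):
        if state == halt:                       # the halting state
            break
        symbol = cells.get(head, "B")           # read the current square
        new_symbol, move, state = rules[(state, symbol)]  # consult the TABLE
        cells[head] = new_symbol                # write, then move the head
        head += move
    # Read back the tape in square order, dropping blanks.
    return "".join(cells[i] for i in sorted(cells)).strip("B")

# Example machine: scan right over the 1s, write one more 1 on the
# first blank square, then halt -- unary increment.
rules = {
    ("q0", "1"): ("1", +1, "q0"),    # keep moving right over the digits
    ("q0", "B"): ("1", +1, "HALT"),  # append a 1 and stop
}

print(run_turing_machine(rules, "111"))  # unary 3 becomes unary 4: "1111"
```

Even this toy machine shows the essential point: the table of rules is finite, yet with an unbounded tape the same mechanism suffices, in principle, for any computable function.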
Note that every part of the machine is finite; only the tape is potentially infinite in length, so the machine is an idealized device. Turing believed that such a machine could simulate any computational process a human being can carry out. In some models the read-write head moves along a fixed tape, and the instruction to be executed (q1) is shown at the head. In this model the "blank" tape is all 0s; the shaded squares, including the blank being scanned, the squares marked 1, 1, b, and the head symbol together make up the state of the system (after Minsky (1967), p. 121).

III. The von Neumann architecture
- Data and instructions processed by the computer are both represented in binary.
- Stored program, executed sequentially. While the computer runs, the program to be executed and the data to be processed are first placed in main memory; the computer then automatically fetches instructions from main memory one after another and executes them. This is the stored-program principle with sequential execution.
- The computer hardware consists of five parts: the arithmetic unit, the control unit, the memory, input devices and output devices.
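The three points above can be illustrated with a toy fetch-decode-execute loop. The instruction set, the tuple encoding, and the memory layout below are invented purely for illustration; a real von Neumann machine encodes everything in binary and is far more elaborate. The key property shown is that instructions and data live in the same memory.

```python
def run(memory):
    """Execute a tiny stored program.  Each instruction is a tuple
    (opcode, address) held in the same list as the data it uses."""
    acc = 0    # accumulator: stands in for the arithmetic unit
    pc = 0     # program counter: the control unit steps it sequentially
    while True:
        op, arg = memory[pc]   # fetch the next instruction from main memory
        pc += 1
        if op == "LOAD":       # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":      # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":    # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# The program occupies cells 0-3; its data lives in cells 4-6
# of the very same memory.
memory = [
    ("LOAD", 4),   # 0: load the value at cell 4
    ("ADD", 5),    # 1: add the value at cell 5
    ("STORE", 6),  # 2: store the result in cell 6
    ("HALT", 0),   # 3: stop
    20, 22, 0,     # 4-6: data
]
run(memory)
print(memory[6])  # 42
```

Because the program is ordinary memory content, it could itself be read or overwritten like data, which is exactly what the stored-program principle means.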
Chapter 1. The Invention of the Computer