It is no exaggeration to say that UNIX is the prototype of the modern operating system. Whether it is the classic UNIX lineage, such as AIX, Solaris, HP-UX, FreeBSD, NetBSD, ... or Unix-like systems such as Linux, or even the various Microsoft operating systems built on the Windows NT architecture, the basic ideas all derive from UNIX. These systems have grown more complicated, but remember one sentence: all fundamental ideas must be simple, and simply stated.
Many people may feel a little dismissive here. After all, they consider themselves technical geeks who can prove their knowledge and skill only by playing with complicated things, and they believe UNIX V6 is an obsolete system from the 1970s whose contents have long been ruthlessly abandoned by the years. But in fact, if you really read the UNIX source code of that era, or read Lions' famous commentary on it, you will find that those simple ideas are not outdated at all: many of the technologies we rely on today rest on ideas that had already been implemented, simply, as early as 1975. Ignoring history is despising the future.
If you want to know what we will do next, listening to others is useless; you must open up history in person. If you want to know the next trend of a curve, treat the present as part of that curve and extrapolate, because everything follows from history; this is why reading history is wise. I had dinner with friends a few days ago, among them a chip expert who cared about nothing except chips. There is a traditional Chinese saying that every trade has its own specialization, as if, in the whole world, only chips are high technology and everything else is child's play. Did that make you think of something? This friend is a typical product of Chinese schooling. It is exactly this so-called specialization that explains why China has never actually led an era. As we said earlier, the agricultural revolution took place in the middle and lower reaches of the Nile; the Bronze Age was led by the nomadic peoples of western and eastern Europe; the idea of democracy came from Greece; imperialism came from Rome; the industrial revolution and the electrical revolution came from western Europe; and the computer revolution came from the United States. I deliberately did not mention the Renaissance, because if I did, many people would say that China also had many such things, for example the Gonghe regency of Zhou, the First Emperor of Qin sweeping the six states, the reign of Emperor Wu of Han, the Guangwu restoration, the Zhenguan era, ... Well, but you must realize that, like the Renaissance, these were all local events. In any case, we live in high-rises, play with mobile phones and computers, drive cars, and ride high-speed trains, and none of these is our own invention. The reason is that our thinking does not diverge. We write software without knowing that some of its ideas can also be used in hardware; we diligently learn OO and become proficient in C++ and Java without knowing that OO came from AI...
I know the relationship between the livemix flight and the computer memory access model...
It may seem useless to say so much; in fact I want to make a different point. Many programming experts have never gone near the 1975 UNIX source code, that is, near history. Such an expert knows how to design classes and how to implement a sorting algorithm, but that is all: he is merely a highly paid mercenary who will never become a general, and he may be cut down at any time. I wrote this series of articles to show that you gain more, and see further, by viewing the development of operating systems, and of any technology, from a historical perspective. UNIX is a treasure: any idea you can find in modern software has its shadow in UNIX.
Before the official start, let us briefly cover the background. The computer we deal with now is not the computer of 40 years ago. Driven by industrialization and the need to lower the threshold of entry, the computer and communications industries went through many rounds of integration; in the end we strictly separated software from hardware, and strictly drew boundaries between every component of each. This result favors the social division of labor, and people within the industry have taken the specialized, professional road (in fact, many thinkers have attacked specialization since the age of Solon in ancient Greece), but the drawback is just as obvious: "For the geek, modern computers are no longer fun."
Modern computers are no longer fun.
After reading Linus's autobiography, I agree that today's computers are no longer fit for this, mainly because they are no longer fun. Linus has always believed that technology is born for fun! Note that the title of his autobiography, Just for Fun, does not mean "computers are no longer fun"; rather, Linus believes the modern computer has become "as complicated as your car" and is no longer something you can tinker with on your own.
In fact, since computers are no longer fun, we can hardly create another "era" around the computer itself. Computing has become a profession, a means of making a living, and its content has been reduced to routine and rules. To create an era you must not be an industry specialist; you must be a complete, well-rounded person.
Once an era has been created, I think, its creators were mostly unintentional about creating it; then, when all the rules of the game are standardized, profit dominates everything. Something you can really play with must be simple. Something too complex has no aesthetic value in art, and in engineering you must spend enormous energy on complexity management, so the essence of the thing hides beneath the complexity, where it is hard to explore and appreciate. I have read a few books recently and was quite touched. You can show off overly complex things to people who do not know them, but in fact your listeners have no idea what you are talking about; you are only talking to yourself and amusing yourself. Have you ever wondered why the masters appear in clusters? The ancient Greek philosophers, the Chinese pre-Qin philosophers, the painters of the Renaissance, the physicists of the early 20th century, the computer talents of the 1950s, the hackers of the 1970s...
The computer industry also began in an age when everyone tinkered, comparable to the Greece of Solon's reforms; but just as Greece then entered the age of Pericles, industrialized specialization ended the computer geek era. Now only people like Richard Stallman and Eric S. Raymond are trying their best to carry the geek era forward. Today's specialization leaves many would-be geeks facing complex computation with only limited power. So how did the geek era end? Let us start from the place where software and hardware meet.
Where hardware and software meet
Machines were originally designed not for everyone to control; they belonged to the elite. But once someone discovers the huge business opportunity in these machines, a popularization movement begins. When popularization develops to a certain point, the technology has become complex enough that it turns professional again; the technology gradually moves away from the masses and is once more monopolized by elites, just as it was at the beginning.
For computer technology, I call the technology of the earlier elite age hardware, and the technology of the later elite age software. In the hardware field there are silicon-crystal technologies and hardware description languages (HDLs); in the software field there are terms such as Java, Python, PHP, OO, design patterns, and agile development. Professionalization makes all of these exclusive. Yet there is one place in the computer era where software and hardware meet: the place where C evolved out of assembly language.
You see, if you do not understand the machine, you cannot use assembly language well; and if you do not understand the machine, you cannot use C to write the most efficient code. Conversely, if you understand only C, you cannot fully exploit the machine's capability. Assembly and C are the places where software and hardware meet. The times are what they created; history is what they wrote.
Assembly-language programming forces people to follow machine instructions; it is hard for a programmer to build high-level logic of his own, because so much energy goes into the instructions themselves. After C appeared, people's logical thinking was liberated. You can write a statement such as a = b + c without spelling out "load this immediate into a register, load that immediate into another register, add the two registers (or add a register to an immediate), and the result goes to...". C's biggest contribution is solving the addressing problem, which in effect solves every problem, because computer programming can be classified as the art of addressing. In C, all addressing problems come down to one concept: the pointer! You must understand that the only fundamental problem in the operation of a modern stored-program computer is addressing: both data and instructions live in memory, and you must find them in the right place. The C pointer hides all addressing details and makes it easy to build application logic, including complex conditional statements, loops, goto statements, and even buffer overflows.
The relationship between hardware and software is that of "how to do" versus "what to do". The programmer tells the hardware what to do and can do so with confidence, because he trusts that the hardware knows how to do it. The interface between them is the C language, while the hardware's internal circuit logic implements the logic of "how to do it".
The water does not know how cold the ice is, and the ice does not know how warm the water is. Once one thing is split in two, the halves are divided as if by a mountain. Hardware and software were originally one, but they have become two exclusive industries; even within the software industry there are further exclusive silos such as "low-level", "protocol stack", "writing classes", ... Full-stack programmers who span both hardware and software are now seen as cool (or pitied). But 30 to 40 years ago, the computer age was created by full-stack programmers! The place where software and hardware meet belonged to people who were at once hardware and software elites; they led a freewheeling age in which everyone played with computers, an era carried forward by the hackers of the 1970s. As we all know, almost every famous name in the industry came out of the era in which UNIX launched the hacker spirit: Steve Jobs's Apple, Bill Gates's Microsoft, Richard Stallman's GNU, Bill Joy's BSD, the IBM PC, Intel IA-32, ... the list goes on. It was an era of simplicity and purity, of simplification and concentration.
PC and Unix Technologies
The place where software and hardware met was where UNIX was born. UNIX, a nutritious root, was destined to grow into a towering tree bearing so much fruit that almost every technology we currently use hangs from its branches.
PC technology miniaturized the computer, and miniaturization became a trend: as technology advanced, hardware grew smaller and cheaper, and huge business opportunities drove PC technology forward. Software, however, did not keep up in real time. UNIX was then the most popular multi-user, multi-tasking time-sharing system, but it was constrained by all sorts of non-technical factors, which let Microsoft, Apple, and other companies take the lead and chase each other ever since. Another reason for this outcome is that UNIX never belonged to the grassroots (at least not until GNU), whereas both Apple and Microsoft trace back to a grassroots organization called the Homebrew Computer Club, where everyone showed off machines and techniques in a relatively relaxed atmosphere; that era was the age of hackers. It can be said that UNIX missed the wave of the PC era, but that is UNIX. Just as the Enlightenment thinkers did not themselves take part in the French Revolution, UNIX as an ideological pioneer carries ideas so old that even Windows 8 still uses them. The SVR4 VM design subverted the traditional understanding of memory; file abstraction made the I/O interface simple; the idea of page-based virtual memory with demand paging has influenced almost every operating system design; the idea of hierarchical storage is better still: inside the CPU, the cache is a cache of memory; in the VM, physical memory is a cache of virtual memory; through the MMU, physical memory is a cache of the disk or the network; ... and most important of all is the UNIX process model (both the thread model and the process-group model are extensions of the process model). There is more here than I can make clear.
I also have to mention the relationship between Intel, Microsoft, and PC technology, but the topic is too large. I can only remind you that PC technology created two empires, and that the cooperation between Intel and Microsoft was so tacit that it became something like the Roman Empire...
Mac OS X Technology
It can be said that PC technology developed along the parallel tracks of Intel technology and Microsoft technology, accumulating too many fancy features along the way, many of which were solidified directly into hardware. Today Apple also uses Intel chips, but it does not use Microsoft technology; instead it built its own operating system, Mac OS X, on UNIX's Mach/BSD. It can be said that Apple pulled PC technology back toward UNIX. In another direction, Android and iOS compete for the market, yet both have underlying systems built directly on UNIX ideas: one on Linux, the other on Mach/BSD. Back to basics: UNIX technology has lasted 40 years with its basic concepts and core unchanged, which is enough to show how good its design is.
The idea behind UNIX's success is that it never fixates on implementation details, because details pull you away from the goal. UNIX provides only basic concepts. That is why the implementations of AIX, Solaris, Mach, and the BSDs are entirely different, yet all are called UNIX. If Microsoft wished, Windows NT could also be called UNIX, because the NT line implements perhaps 70% of the basic UNIX philosophy.
Besides the UNIX path there is another evolutionary path, namely the evolution of complexity. It strictly tracks the latest hardware features: software caters to hardware, hardware indulges software, each flatters the other, and everyone forgets that they are really one. In some cases, complex but non-universal mechanisms are solidified into hardware for the sake of a single commercial interest. The counterexample to this is the hierarchical storage architecture, which is the direct embodiment of the UNIX spirit in the hardware field.
An example of complexity evolution
Data addresses are naturally aligned to multiples of their type's size; this is an unwritten programming convention. But why do we make such a convention? Is unaligned access impossible? On some architectures the processor really cannot handle it: several RISC processors simply fault on an unaligned access. For programming convenience, however, the cores of CISC processors such as the Intel/AMD x86 family will perform this laborious operation for you. Where is the root of all this?
Before analyzing the whole process, we must understand that the modern microprocessor is the crystallization of very-large-scale integration: circuits carved into semiconductors such as silicon with extreme precision. Simplicity of wiring is therefore the foundation of everything; engraving a circuit onto a silicon crystal at all is amazing (do you still remember what we learned in the nuclear-boat note?). Unlike high-level-language programming, silicon engraving does not even allow you to make things too complex (mainly because of the difficulty of routing), so many things must be ruled out, such as unaligned data access. The ideal processor architecture needs only a minimal instruction set to complete any operation; with a simple architecture, more silicon can be devoted to efficient pipelines rather than to complex instructions (a complex instruction often shares a common part with other complex instructions, which wastes space). Such a processor is called a RISC processor. These facts were summarized at the start of the new era of integrated-circuit technology. Unfortunately, Intel had already been the first to eat the crab and had achieved great success, and AMD was pulled onto the same path by instruction-set compatibility.
We know that, aside from the computational logic that must be implemented inside the microprocessor, its external interfaces are addressing and I/O.
When Intel implemented its first 16-bit microprocessors, it had to provide a hardware instruction for nearly every addressing mode one could think of, and that is what it did. Open any assembly-language book and you are first greeted by a mass of addressing forms. Look at Intel's manuals and you will find that many registers do not appear as operands at all: they are built into the instruction itself! Thus `mov eax, 1` and `mov ebx, 1` may be two entirely different instructions, not merely the same instruction with different operands. Implementing such complex instructions occupies a large area of the chip, and because the work done by individual instructions is fragmented and of varying length, it was hard for Intel to build an efficient long pipeline. Machine instructions are the interface between hardware and software, the only interface through which software requests are realized by hardware, and to stay compatible with existing software, the interface could not be changed at will. By the time Intel realized that instructions must be simple and single-purpose enough to allow a long pipeline, its applications had already blossomed everywhere. So Intel and AMD adopted a strategy of changing the implementation of instructions: after fetching an instruction, they split a single complex instruction into multiple simple ones. In other words, they applied the metaphor you can never escape: add an intermediate layer!
Therefore Intel instructions are actually implemented by simple operations. These operations are called micro-operations; the set of micro-operations implementing one instruction is called a micro-program, and all the micro-programs together are called the microcode. As a result, the x86 architecture is a CISC architecture only at the interface; its core is already a RISC machine!
To explain the memory-alignment problem, we must look at the physical structure. The key is not how the memory chip is designed, but how the memory chip's pins map to the address bus coming from the CPU; establishing that relationship is what matters most. Do not forget that outside the CPU, the only problem a computer must solve is addressing! If data could be aligned arbitrarily, then when the CPU issues one addressing operation, the datum might straddle two rows on the chip; with simple wiring, that one access must be completed in two bus operations. If the bus is locked between the two operations, efficiency drops sharply; if it is not locked, there is an atomicity problem. The easiest way out is to require the compiler or the programmer to emit aligned accesses, so the task falls on the compiler's or programmer's head. This is why data on some machines must be naturally aligned.
At the beginning of UNIX
Today, with applications booming, it is hard to imagine the original goal of UNIX and the other time-sharing systems when they were first built. A time-sharing system replaced pipeline-style batch processing so that users no longer had to wait in long queues; in fact, the time-sharing system presented its users with an illusion built from time slices. The system's throughput and total latency were not actually improved; people simply accepted the illusion. That was the early time-sharing system. And UNIX? UNIX went further in many ways. For AT&T, whose telephone business was huge, the hope was naturally to serve users through such an illusion with a more cost-effective investment, without users feeling the difference. Bell's earliest time-sharing work was aimed not at running arbitrary applications but at controlling voice calls; only later did it move to terminals.
Telephone users, printers, and terminal users could all be connected easily, because they were remote users sharing the same host's resources. Before the time-sharing system appeared, the choices were either waiting or building parallel systems, satisfactory neither to users nor to return on investment; the time-sharing system solved both problems. File abstraction likewise existed to make I/O more convenient, and from it we can see the role early UNIX I/O played. Like a router or a switch, early UNIX provided a control plane at the process-model abstraction layer and a data plane at the file abstraction layer; its kernel was born for the control plane, while the data plane should use user-space I/O as much as possible. Following this principle pays off, and it still holds today.
Fortunately, many vendors have already taken this path, and many such technologies, for example PF_RING, have emerged. All of this was settled as early as the plain old UNIX era.
Simple UNIX: open up history