A company's Unix architecture is likely to be the most important part of its entire IT infrastructure. It supports your email system, your web server, and even your most critical enterprise applications. Although Unix systems are comparatively secure operating systems, in this era of malicious code and hacking we cannot take the security of a Unix architecture lightly. This article discusses the key practices that any Unix administrator should understand in order to keep Unix secure.
So what makes up a Unix architecture? There is no single standard answer, but in general most companies run servers that provide services to customers or the public; these are the "public" servers, and everything that serves the outside world has its own particular requirements. A second group consists of machines that users can log on to, whether those users hold legitimate ISP accounts or belong to an in-house development team; we call these "login servers," and they must be treated with special care as well. Finally, note that a considerable number of machines in a typical network architecture provide services only to other servers and should be accessible only to superusers.
Public Servers
The first task is to review every server that provides services to the outside world and ask whether it really needs to be exposed. In most cases such servers can sit behind a firewall, or behind a combined firewall and proxy layer. For example, if you run a customer-facing web site on four web servers, you can minimize their exposure by placing a proxy server, or a pair of redundant proxy servers, in front of them to accept all client connections and then inspect and clean up the traffic before passing it on. That is exactly the role of a proxy server: it reduces the risk to the backend web servers, which are no longer directly reachable from the Internet.
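As an illustration of that arrangement, here is a minimal sketch of a reverse proxy written with Python's standard library: it accepts client connections on the front end and forwards them to a backend web server that is never exposed directly. The backend address and ports are hypothetical, and a real deployment would normally use a purpose-built proxy such as nginx, HAProxy, or Squid rather than a hand-rolled script.

```python
# A minimal reverse-proxy sketch using only the Python standard library.
# The backend address below is hypothetical; production sites normally use
# a dedicated proxy such as nginx, HAProxy, or Squid instead.
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import Request, urlopen

BACKEND = "http://10.0.0.10:8080"  # hypothetical internal web server, never exposed directly

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Forward the client's request path to the hidden backend (GET only, for brevity).
        upstream = Request(BACKEND + self.path)
        try:
            with urlopen(upstream, timeout=5) as resp:
                body = resp.read()
                self.send_response(resp.status)
                self.send_header("Content-Type",
                                 resp.headers.get("Content-Type", "application/octet-stream"))
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
        except OSError:
            # If the backend is unreachable, answer on its behalf rather than exposing details.
            self.send_error(502, "Bad gateway")

if __name__ == "__main__":
    # Only this front-end process needs to be reachable from outside.
    ThreadingHTTPServer(("0.0.0.0", 8000), ProxyHandler).serve_forever()
```

Only the front-end process needs a route from the Internet; the backend web servers can live on a private network and be patched on their own schedule.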
The most common causes of security problems are missed patches and forgotten services. A web server that has been neglected for a long time is typically running an old Apache release, vulnerable PHP scripts, or an outdated kernel, and having to prescribe remedies for the resulting catastrophic failures is all too common. If the web server is hidden behind a proxy, however, a forgotten patch or service poses far less risk.
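One way to catch a forgotten service before an attacker does is to check periodically what is actually answering on a host. The sketch below is only an illustration, not a complete audit tool: it probes a handful of well-known TCP ports on the local machine, and the port list is purely an example.

```python
# A quick sketch that probes a few well-known TCP ports to spot forgotten
# services; the port list is illustrative, not a complete inventory.
import socket

COMMON_PORTS = {22: "ssh", 25: "smtp", 80: "http", 443: "https", 3306: "mysql", 8080: "http-alt"}

def probe(host="127.0.0.1"):
    open_ports = []
    for port, name in sorted(COMMON_PORTS.items()):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(0.5)
            # connect_ex returns 0 when the connection succeeds, i.e. something is listening.
            if sock.connect_ex((host, port)) == 0:
                open_ports.append((port, name))
    return open_ports

if __name__ == "__main__":
    for port, name in probe():
        print(f"port {port} ({name}) is open -- is the service behind it still needed and patched?")
```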
The same is true for other services. Some sites impose very strict controls, for example requiring the firewall administrator to verify that any new network application is safe and works correctly before it goes live. More commonly, though, the company network is completely open, its web applications are insecure, and the servers those applications interact with can often be reached from the Internet for no good reason.
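A related check, sketched below for a Linux host (the /proc/net/tcp interface is Linux-specific, so other Unix variants need a different source such as netstat), is to list the TCP ports that are listening on every interface; anything bound to 0.0.0.0 is a natural candidate to question when deciding what really needs to be reachable from the Internet.

```python
# A minimal, Linux-specific sketch that lists TCP ports listening on all
# interfaces (0.0.0.0) by reading /proc/net/tcp.

def world_listening(path="/proc/net/tcp"):
    ports = set()
    with open(path) as f:
        next(f)  # skip the header line
        for line in f:
            fields = line.split()
            local_addr, state = fields[1], fields[3]
            ip_hex, port_hex = local_addr.split(":")
            # State 0A means LISTEN; address 00000000 means bound to every interface.
            if state == "0A" and ip_hex == "00000000":
                ports.add(int(port_hex, 16))
    return sorted(ports)

if __name__ == "__main__":
    for port in world_listening():
        print(f"TCP port {port} is listening on all interfaces -- should it be?")
```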
Login Servers
Remote users are limited to specific interfaces, such as e-mail services, web applications, or B2B services, but local users with shell access are far less constrained. If a malicious user gets into your system, he can eventually obtain root access unless you take extreme measures to stop him. Updates matter here as well, especially kernel updates that require an upgrade and a reboot: they should be applied as soon as the new kernel is released. In short, the operating system must be hardened, and during architecture design you must take special care to ensure that each user can reach only the areas intended for them.
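A simple starting point for that kind of review is to enumerate which accounts actually have an interactive login shell. The sketch below assumes a conventional /etc/passwd layout, and the set of shells it treats as interactive is only illustrative.

```python
# A minimal sketch that lists accounts with an interactive login shell,
# assuming the standard /etc/passwd layout; the shell list below is illustrative.
INTERACTIVE_SHELLS = {"/bin/bash", "/bin/sh", "/bin/zsh", "/bin/ksh", "/bin/csh"}

def login_accounts(passwd_path="/etc/passwd"):
    accounts = []
    with open(passwd_path) as f:
        for line in f:
            fields = line.strip().split(":")
            # /etc/passwd has seven colon-separated fields; the last one is the login shell.
            if len(fields) == 7 and fields[6] in INTERACTIVE_SHELLS:
                accounts.append((fields[0], fields[6]))
    return accounts

if __name__ == "__main__":
    for user, shell in login_accounts():
        print(f"{user} logs in with {shell} -- confirm this account still needs shell access")
```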
If some developers on your network have root access to certain machines, the potential for damage increases. The chance of a developer turning malicious may be small, but it cannot be ruled out, and even well-intentioned developers can break a system with unfamiliar new programs they install; a quick way to see who actually holds that level of access is sketched below. Worms show how fast a single weakness can spread: the Slammer worm propagated extremely quickly across Windows networks by exploiting a buffer-overflow vulnerability in Microsoft SQL Server 2000 to take control of a host and then generating large numbers of random IP addresses to attack, so it spread rapidly, consumed huge amounts of network bandwidth, and effectively created a denial-of-service attack.
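The sketch referred to above simply lists the members of a few administrative groups. The group names "sudo", "wheel", and "adm" are common conventions rather than universal, so adjust them to your own site.

```python
# A minimal sketch that lists members of common administrative groups;
# the group names here are typical conventions, not a universal standard.
import grp

PRIVILEGED_GROUPS = ("sudo", "wheel", "adm")

def privileged_members():
    members = {}
    for name in PRIVILEGED_GROUPS:
        try:
            # gr_mem lists supplementary members only; users whose primary
            # group is the privileged group will not appear here.
            members[name] = grp.getgrnam(name).gr_mem
        except KeyError:
            pass  # this group does not exist on the current system
    return members

if __name__ == "__main__":
    for group, users in privileged_members().items():
        print(f"{group}: {', '.join(users) or '(no supplementary members)'}")
```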
Other Problems
In theory, most enterprise computers do not face the Internet at all. If limiting exposure is the top priority, these internal servers can be ignored to some extent, because their only weakness is the interfaces they provide. "If my web applications are patched on a regular basis, there is no need to worry about the operating system itself," some people argue.
That view may be correct if the enterprise can genuinely restrict access to system administration, and keeping applications patched regularly is certainly a sound measure. However, if a vulnerability goes unaddressed simply because the administrator never thought about it, your entire architecture is at risk, not just one of your servers: once an attacker gets into one system, reaching the other servers is usually quite simple.
There appear to be two ways to keep an architecture secure, then: limit its exposure and hope, and it is only hope, that an unexpected threat never materializes, or actively harden every system so that attackers who do try to penetrate your defenses cannot cause any harm. In reality, most enterprises will not even admit that their security policy amounts to only one of these two approaches.
Firewalls are easier to defeat than many assume, especially when the applications exposed through them are vulnerable. In fact, a seemingly secure firewall often attracts more attackers, not because they want a challenge, but because they know the systems behind it are exposed to the outside world. We should remember that the perimeter defenses of most enterprises are still very fragile.