Algorithms are much more important than we think.

Source: Internet
Author: User

Algorithms are important because any program or piece of software is composed of many algorithms and data structures. In that sense algorithms matter a great deal, but this does not mean they are equally important to the day-to-day work of every software developer. The software industry can be roughly divided into algorithm-intensive products, such as search engines; business-logic-intensive products, such as ERP systems; and experience-intensive products, a typical example being a full-Flash website. So not every software practitioner needs strong algorithm skills; what matters most is your ability to solve problems and to quickly pick up things you don't know.

To develop games, you need to be familiar with a variety of algorithms, and Flash Lite games are no exception: you have to find an algorithm that suits the game at hand. This takes time and accumulated experience; we usually need to think more, read more, and ask more (sometimes a problem one person puzzles over is solved faster when two or more people discuss it).
When developing Flash Lite games for mobile phones, we must also optimize around the computing limitations of the handset.
I remember when I was learning C in college, I asked my teacher, "What can I do with this?" The teacher told me to think about it myself. My classmates didn't have much of an answer either, because at the time Java seemed far more direct for building software and websites. Now I know just how useful C is!
After getting started with Flash Lite, I gradually came to appreciate the importance of algorithms. Whether it is Java, C, Flash Lite, PHP, and so on, you cannot do without them. This article also touches on how the algorithms you will use in Flash Lite games can be applied.

For example, how do ants find food? "Ant colony optimization (ACO), also known as the ant colony algorithm, is a probabilistic technique for finding optimal paths in a graph. It was introduced by Marco Dorigo in his 1992 doctoral thesis and is inspired by the way ants discover paths while searching for food."
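To make the idea concrete, here is a minimal Python sketch of the ant-colony idea (the article itself targets Flash Lite, so treat this as pseudocode for that environment). The graph, the parameter values, and the node names "nest" and "food" are all made-up assumptions for illustration, not part of the original text.

```python
import random

# A minimal ant colony optimization sketch for finding short paths on a
# small weighted graph. Graph, parameters, and node names are illustrative.
GRAPH = {  # adjacency list: node -> {neighbor: edge_length}
    "nest": {"A": 2.0, "B": 4.0},
    "A": {"nest": 2.0, "B": 1.0, "food": 7.0},
    "B": {"nest": 4.0, "A": 1.0, "food": 3.0},
    "food": {"A": 7.0, "B": 3.0},
}
ALPHA, BETA = 1.0, 2.0      # influence of pheromone vs. edge length
EVAPORATION = 0.5           # fraction of pheromone lost each round
pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}

def walk(start="nest", goal="food"):
    """One ant walks from start to goal, choosing edges probabilistically."""
    path, node = [start], start
    while node != goal:
        choices = [n for n in GRAPH[node] if n not in path] or list(GRAPH[node])
        weights = [
            pheromone[(node, n)] ** ALPHA * (1.0 / GRAPH[node][n]) ** BETA
            for n in choices
        ]
        node = random.choices(choices, weights=weights)[0]
        path.append(node)
    return path

def path_length(path):
    return sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

best = None
for _ in range(50):                      # 50 rounds of 10 ants each
    paths = [walk() for _ in range(10)]
    for key in pheromone:                # evaporation
        pheromone[key] *= (1 - EVAPORATION)
    for p in paths:                      # shorter paths deposit more pheromone
        deposit = 1.0 / path_length(p)
        for a, b in zip(p, p[1:]):
            pheromone[(a, b)] += deposit
            pheromone[(b, a)] += deposit
    shortest = min(paths, key=path_length)
    if best is None or path_length(shortest) < path_length(best):
        best = shortest

print(best, path_length(best))
```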

Some people may ask, "Computers are so fast these days; do algorithms still matter?" In fact, there will never be a computer that is too fast, because we will always come up with new applications. Thanks to Moore's Law, computing power grows rapidly every year while prices keep falling. But we should not forget that the amount of information to be processed is growing exponentially. Every day, each of us produces a large amount of data (photos, videos, voice, text, and so on). Ever more advanced recording and storage methods have led to an explosive increase in the amount of information about each of us. The Internet's traffic and log volumes are also growing rapidly. In scientific research, as methods advance, data volumes have reached unprecedented levels. 3D graphics, massive data processing, machine learning, and speech recognition all demand enormous amounts of computation. In the Internet era, more and more challenges need to be solved with superior algorithms.
Let's look at another example from the Internet age. In web and mobile search, if you want to find a nearby coffee shop, how should the search engine handle the request? The simplest way is to find all the coffee shops in the city, calculate the distance from each of them to you, sort them, and return the nearest results. But how do we calculate the distance? Graph theory offers many algorithms that can solve this problem.
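As a quick illustration of the "simplest way" just described, here is a small Python sketch: compute every distance, sort, and return the closest shops. The shop names, coordinates, and the straight-line distance metric are assumptions made for the example.

```python
import math

# Naive nearest-coffee-shop search: measure every distance, sort, take the top k.
def distance(a, b):
    """Straight-line distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_shops(user, shops, k=3):
    """Return the k shops closest to the user -- O(n log n) per query."""
    return sorted(shops, key=lambda s: distance(user, s["pos"]))[:k]

shops = [
    {"name": "Cafe A", "pos": (2.0, 3.0)},
    {"name": "Cafe B", "pos": (5.0, 1.0)},
    {"name": "Cafe C", "pos": (0.5, 0.5)},
]
print(nearest_shops((1.0, 1.0), shops, k=2))
```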
This brute-force approach may be the most intuitive, but it is definitely not the fastest. If a city has only a handful of coffee shops, it works fine and costs little effort. But if a city has a great many coffee shops and many users run similar searches, the server comes under a lot of pressure. In that case, how can we optimize the algorithm?
First, we can "preprocess" the city's coffee shops. For example, divide the city into a number of grid cells, place the user into a cell according to their location, and sort only the coffee shops in that cell by distance.
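A minimal sketch of this preprocessing step, assuming a fixed cell width and made-up shop coordinates: shops are bucketed by cell ahead of time, and a query sorts only the shops in the user's own cell.

```python
from collections import defaultdict
import math

CELL = 2.0  # width of one grid cell, chosen arbitrarily for this sketch

def cell_of(pos):
    """Map a coordinate to the integer grid cell that contains it."""
    return (int(pos[0] // CELL), int(pos[1] // CELL))

def build_grid(shops):
    """Preprocessing: bucket every shop by its grid cell."""
    grid = defaultdict(list)
    for shop in shops:
        grid[cell_of(shop["pos"])].append(shop)
    return grid

def nearby(user, grid):
    """Query: sort only the shops that share the user's grid cell."""
    candidates = grid[cell_of(user)]
    return sorted(candidates, key=lambda s: math.hypot(s["pos"][0] - user[0],
                                                       s["pos"][1] - user[1]))

shops = [{"name": "Cafe A", "pos": (1.0, 1.5)},
         {"name": "Cafe B", "pos": (1.8, 0.2)},
         {"name": "Cafe C", "pos": (9.0, 9.0)}]
print(nearby((0.5, 0.5), build_grid(shops)))
```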
The problem is that if every cell is the same size, the vast majority of results will fall into the cells in the city center, while a suburban cell may contain only a few. In that case, the city-center cells should be subdivided further. Taking this further, the grid should be a tree structure: a big cell at the top covering the whole city, with the cells getting smaller and smaller as you descend layer by layer, which supports precise searches. If the bottom-level cell does not return enough results, you can expand the search scope step by step.
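The tree of progressively smaller cells described here behaves much like a quadtree. Below is a rough Python sketch under that assumption; the capacity limit, cell sizes, and sample points are invented for illustration, and a production index would differ in many details.

```python
# Quadtree-style sketch: the whole city is the root cell, a cell splits into
# four children once it holds too many shops, and a query descends to the
# smallest cell around the user, climbing back up if too few results are found.
CAPACITY = 4  # split a cell once it holds more than this many shops

class Cell:
    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size   # lower-left corner and width
        self.shops = []                          # points stored in a leaf
        self.children = None                     # four sub-cells once split

    def contains(self, p):
        return (self.x <= p[0] < self.x + self.size and
                self.y <= p[1] < self.y + self.size)

    def insert(self, p):
        if self.children is not None:
            for child in self.children:
                if child.contains(p):
                    child.insert(p)
                    return
        self.shops.append(p)
        if len(self.shops) > CAPACITY and self.size > 1.0:
            self.split()

    def split(self):
        half = self.size / 2
        self.children = [Cell(self.x, self.y, half),
                         Cell(self.x + half, self.y, half),
                         Cell(self.x, self.y + half, half),
                         Cell(self.x + half, self.y + half, half)]
        for p in self.shops:
            for child in self.children:
                if child.contains(p):
                    child.insert(p)
                    break
        self.shops = []

    def candidates(self, user, min_results=3):
        """Collect shops from the smallest cell around the user, falling back
        to the parent cell when too few results were found."""
        if self.children is not None:
            for child in self.children:
                if child.contains(user):
                    found = child.candidates(user, min_results)
                    if len(found) >= min_results:
                        return found
        return self.all_shops()

    def all_shops(self):
        if self.children is None:
            return list(self.shops)
        return [p for c in self.children for p in c.all_shops()]

# Usage: build the "city" and query around a user location.
city = Cell(0.0, 0.0, 16.0)
for p in [(1.0, 1.0), (1.0, 2.0), (2.0, 1.0), (2.0, 2.0), (3.0, 3.0), (12.0, 12.0)]:
    city.insert(p)
print(city.candidates((1.5, 1.5), min_results=3))
```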
The above approach works well for coffee shops, but is it universal? The answer is no. In the abstract, a coffee shop is a "point". What if you want to search for an "area"? For example, if a user wants to go to a reservoir, and the reservoir has several entrances, which entrance is closest to the user? In that case, the tree structure above should be replaced with an R-tree, because each internal node of an R-tree stores a range with its bounding rectangle rather than a single point.
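Here is a small sketch of the key quantity an R-tree query works with: the minimum possible distance from the user to a node's bounding rectangle, which lets whole subtrees (for example, an area like the reservoir) be pruned or explored first. The rectangle format and coordinates below are assumptions for illustration.

```python
import math

def min_dist_to_rect(point, rect):
    """Smallest distance from a point to an axis-aligned rectangle
    (xmin, ymin, xmax, ymax); zero if the point lies inside it."""
    px, py = point
    xmin, ymin, xmax, ymax = rect
    dx = max(xmin - px, 0.0, px - xmax)
    dy = max(ymin - py, 0.0, py - ymax)
    return math.hypot(dx, dy)

# The reservoir occupies an area; its bounding rectangle tells an R-tree
# whether anything inside could be closer than the best result found so far.
reservoir_bbox = (10.0, 10.0, 20.0, 15.0)
print(min_dist_to_rect((12.0, 8.0), reservoir_bbox))   # 2.0: user below the box
print(min_dist_to_rect((15.0, 12.0), reservoir_bbox))  # 0.0: user inside the box
```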
This small example shows that application requirements are ever-changing. In many cases, we need to break a complex problem down into several simple sub-problems, and then choose the appropriate algorithms and data structures.

The example above is just a small case at Google! Every day Google handles more than a billion searches, Gmail stores 2 GB mailboxes for tens of millions of users, and Google Earth lets hundreds of thousands of users "travel" around the globe at the same time, serving suitable imagery to each of them over the Internet. Without good algorithms, none of these applications could become reality.
In these applications, even the most basic problems pose great challenges for traditional computing. For example, more than a billion users visit Google's site every day, and using Google's services generates a huge number of logs. Because the logs grow rapidly every second, we must have a clever way to handle them. In interviews I have asked candidates how they would analyze and process these logs. Many give answers that are logically correct but nearly impossible in practice: with their algorithms, even tens of thousands of machines could not process the data as fast as it is generated.
So how does Google solve these problems?
First, in the Internet era, even the best algorithms must be executed in a parallel computing environment. In Google's data centers we use very large parallel clusters. However, with traditional parallel algorithms, efficiency drops quickly as the number of machines increases: if ten machines yield a five-fold speedup, a thousand servers may yield only a few dozen times. No company can afford that cost. Moreover, with many parallel algorithms, a single failing node wastes all the computation.
So how does Google develop efficient and fault-tolerant parallel computing?
Jeff Dean, one of Google's most senior computer scientists, recognized that the vast majority of the data Google processes can be handled with one simple parallel model: Map and Reduce. This model achieves high efficiency and scalability for many kinds of computation (that is, even if a thousand machines cannot deliver a thousand-fold speedup, they can at least deliver several hundred-fold). Another major feature of Map and Reduce is that it can combine large numbers of cheap machines into a powerful server farm. Finally, it is fault tolerant: even if some machines in the farm fail mid-run, the whole job can still complete. It is precisely because of this insight that the MapReduce algorithm exists. With it, Google can scale its computation almost without limit and keep pace with ever-changing Internet applications.
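As a rough, single-process illustration of the Map and Reduce model (using the classic word-count example rather than Google's actual implementation), the sketch below shows the map, shuffle, and reduce phases. A real MapReduce system runs these phases in parallel across many machines and handles failures; this toy version only shows the shape of the computation.

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit (word, 1) pairs for every word in one document."""
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    """Group intermediate pairs by key, as the framework would do."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Reduce: combine all values for one key into a single result."""
    return key, sum(values)

documents = ["the quick brown fox", "the lazy dog", "the quick dog"]
intermediate = [pair for doc in documents for pair in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(counts)  # {'the': 3, 'quick': 2, ...}
```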

Here is an example from outside the computer field: in high-energy physics research, many experiments produce several terabytes of data per second. Yet, due to insufficient processing and storage capacity, scientists have to discard most of that data unprocessed. But keep in mind that evidence of new elements may well be hidden in the data we cannot process. Similarly, in many other fields algorithms can change human life: the study of human genes may, thanks to algorithms, lead to new medical treatments; in national security, effective algorithms may prevent the next 9/11; in meteorology, algorithms can better predict future natural disasters and save lives.
Therefore, if you place the development of computing in the context of rapidly growing applications and data, you will find that the importance of algorithms is not decreasing but increasing.