Database consistency
1. The to-do list trick and transactions (write-ahead logging)
1. To-do list trick: before performing an operation on the data, first write it to a log on durable storage. Even if the database operation fails partway, it can be completed or corrected by replaying the log. Logged operations are idempotent: executing any entry in the log once or many times has exactly the same effect.
2. Transaction: a transaction either completes as a whole, or, if it fails midway, the operations already performed are undone according to the log (that is, each is reversed: what was added is subtracted), returning the database to its state before the transaction began (ROLLBACK TRANSACTION). A transaction is therefore atomic and indivisible, which prevents the situation where some of its operations are executed and others are not.
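The to-do list trick above can be sketched in a few lines. This is a toy illustration, not a real database engine: the "log" is an in-memory list, and entries are idempotent because each one records the absolute value to set, so replaying once or many times gives the same result.

```python
# Minimal sketch of the to-do list (write-ahead logging) trick.
# Account names and amounts are made up for illustration.

log = []                          # write-ahead log (would live on disk)
accounts = {"A": 100, "B": 50}


def replay():
    # Replaying is idempotent: each entry sets an absolute balance,
    # so applying the log once or many times yields the same state.
    for account, new_balance in log:
        accounts[account] = new_balance


def transfer(src, dst, amount):
    # 1. Record the planned results in the log BEFORE touching the data.
    log.append((src, accounts[src] - amount))
    log.append((dst, accounts[dst] + amount))
    # 2. Apply the operations; a crash here is recoverable by replay().
    replay()


transfer("A", "B", 30)
replay()                          # replaying again changes nothing
print(accounts)                   # {'A': 70, 'B': 80}
```

If the process crashed between step 1 and step 2, running `replay()` on restart would still bring the accounts to the logged state, which is the whole point of writing the log first.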
2. Prepare-then-commit trick (two-phase commit protocol)
1. For a replicated database, a transaction is executed in two phases. Prepare phase: the primary database locks the affected rows, writes the new data to its write-ahead log, and sends the new data to each backup database; each backup locks its corresponding rows and writes the new data to its own log. Commit phase: each backup reports whether its preparation succeeded. If any one fails, the primary rolls back the transaction and tells the other backups to roll back as well; otherwise the primary tells every database to commit the transaction.
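The two phases can be sketched as follows. This is a simplified model with made-up class and method names (`Replica`, `prepare`, `commit`, `rollback`); real two-phase commit also has to handle crashed participants and lost messages, which are omitted here.

```python
# Minimal sketch of the prepare-then-commit (two-phase commit) trick.

class Replica:
    def __init__(self, name, will_succeed=True):
        self.name = name
        self.will_succeed = will_succeed   # simulate a prepare failure
        self.data = {}
        self.pending = None

    def prepare(self, key, value):
        # Phase 1: lock the row and write the new value to the log.
        self.pending = (key, value)
        return self.will_succeed

    def commit(self):
        # Phase 2: make the pending write permanent.
        key, value = self.pending
        self.data[key] = value
        self.pending = None

    def rollback(self):
        self.pending = None


def two_phase_commit(replicas, key, value):
    # Phase 1: every replica must prepare successfully.
    if all(r.prepare(key, value) for r in replicas):
        for r in replicas:                 # Phase 2: commit everywhere
            r.commit()
        return True
    for r in replicas:                     # any failure: roll back everywhere
        r.rollback()
    return False


replicas = [Replica("backup1"), Replica("backup2")]
print(two_phase_commit(replicas, "x", 42))            # True
print(all(r.data.get("x") == 42 for r in replicas))   # True

failing = [Replica("b1"), Replica("b2", will_succeed=False)]
print(two_phase_commit(failing, "y", 1))              # False
```

The key property: after the protocol runs, either every replica has the new value or none does.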
3. Relational databases and the virtual table trick
1. Principle: each table stores a different set of information, but rows in different tables are usually connected in some way. The user's information starts as one table; the repeated information is then split out into a new table, and the original table and the new table are matched through the duplicated key that each of them keeps (much as a program moves frequently repeated code into a function or class). A small amount of repetition thus replaces a large amount of repetition, saving space, and it has the further advantage that a change made in one place takes effect everywhere.
2. Virtual table trick: to answer a query, the tables are matched (that is, joined) through the duplicated keys each of them keeps, forming a temporary virtual table. A projection operation removes unneeded columns (those that are neither displayed nor used as query criteria), then a selection keeps the rows that satisfy the query criteria and discards the rest into a new virtual table. Finally, discarding the columns that are not to be displayed yields a virtual table that answers the original query.
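The join, selection, and projection steps can be sketched with plain Python dictionaries standing in for table rows. The table names, columns, and data below are all made up for illustration.

```python
# Minimal sketch of the virtual table trick: join two normalized
# tables on a shared key, select matching rows, project the columns.

students = [
    {"id": 1, "name": "Ada",   "dept_id": 10},
    {"id": 2, "name": "Bob",   "dept_id": 20},
    {"id": 3, "name": "Carol", "dept_id": 10},
]
departments = [
    {"dept_id": 10, "dept_name": "Math"},
    {"dept_id": 20, "dept_name": "Physics"},
]

# Join: build a virtual table by matching rows on the duplicated key.
virtual = [
    {**s, **d}
    for s in students
    for d in departments
    if s["dept_id"] == d["dept_id"]
]

# Selection: keep only the rows that satisfy the query condition.
selected = [row for row in virtual if row["dept_name"] == "Math"]

# Projection: discard the columns that are not to be displayed.
answer = [{"name": row["name"]} for row in selected]
print(answer)   # [{'name': 'Ada'}, {'name': 'Carol'}]
```

In SQL this whole pipeline is the single statement `SELECT name FROM students JOIN departments ON students.dept_id = departments.dept_id WHERE dept_name = 'Math'`; the database builds the intermediate virtual tables internally.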
Image recognition (i.e. classification)
There are two ways to learn: 1. being taught in detail by someone else; 2. studying examples and teaching yourself. The first is hard for a computer to achieve, so only the second is used. Computer learning is therefore the process of letting the computer automatically "learn" how to classify samples. The basic strategy is to feed the computer a large amount of labelled data (already-classified samples) so that it can identify the characteristic features of each class, and then classify unclassified samples according to those features.
Principle: classification means locating the sample to be classified relative to the learning samples.
1. Nearest-neighbor classification trick
Compare the sample to be classified against the learning samples; the class of the closest learning sample becomes the class of the new sample.
No training is needed and no explicit classification rules are learned, but each classification requires comparing against a large number of samples.
Cases: 1. by location: find the one (or the K) learning samples geographically closest to the sample to be classified; 2. by image: find the learning sample whose image differs least from the sample to be classified.
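The nearest-neighbor trick fits in a few lines. This sketch uses made-up 2-D points with two labels and Euclidean distance; for images, the same code works if each sample is a vector of pixel values.

```python
# Minimal sketch of the nearest-neighbor classification trick.

def distance(p, q):
    # Euclidean distance between two feature vectors.
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5


# Labelled learning samples: (features, class). Data is made up.
training = [
    ((0.0, 0.0), "blue"),
    ((0.2, 0.1), "blue"),
    ((1.0, 1.0), "red"),
    ((0.9, 1.2), "red"),
]


def classify(sample):
    # No training phase: each classification scans every learning
    # sample and takes the label of the closest one.
    nearest = min(training, key=lambda t: distance(sample, t[0]))
    return nearest[1]


print(classify((0.1, 0.2)))   # blue
print(classify((1.1, 0.9)))   # red
```

The K-nearest variant would instead take the majority label among the K closest samples, which makes the answer more robust to a single mislabelled neighbor.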
2. Decision Tree
Using successive yes/no questions (dichotomies), the sample to be classified is eventually routed by the appropriate conditions to a class. These questions are produced by training the computer on the learning samples and are continuously refined afterwards.
Training is required, but explicit classification rules are obtained, so classifying a sample afterwards takes very little time.
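A decision tree is just nested yes/no questions. The tiny tree below is written by hand to show the structure (the features `income`, `years_employed`, `has_collateral` and the thresholds are invented); a real system would learn the questions and thresholds from training samples.

```python
# Minimal sketch of decision-tree classification: each node asks one
# yes/no question, and a leaf gives the class. The tree and data are
# made up for illustration; in practice the tree is learned.

def classify(sample):
    if sample["income"] > 50000:
        if sample["years_employed"] > 2:
            return "approve"
        return "review"
    if sample["has_collateral"]:
        return "review"
    return "deny"


print(classify({"income": 60000, "years_employed": 5,
                "has_collateral": False}))   # approve
print(classify({"income": 30000, "years_employed": 1,
                "has_collateral": False}))   # deny
```

Once the tree exists, classifying a sample costs only as many comparisons as the tree is deep, which is why the notes say classification time is very short.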
3. Neural networks
Deciding something is a process of synthesizing a conclusion from various conditions, and the conclusion depends on each condition (i.e., each variable).
In reality the conditions do not all influence the conclusion equally, and many conditions affect it at once, so they must be combined; the model that fits this situation is the neural network. Conditions (variables) feed into neurons; each neuron combines part of the variables with weights, and its output in turn feeds other neurons, so each node represents a weighted combination of the nodes behind it. Training samples are then fed through the network to derive the final weights, from which the conclusion is drawn.
Training process: the network and the node weights are preset (more or less at random), then tested on the training samples, and the network and weights are adjusted to bring the output closer to the conclusion each sample should give. When there are enough training examples, the neural network receives enough adjustment and then classifies new samples with good accuracy.
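The adjust-toward-the-right-answer loop above can be shown with a single artificial neuron (a perceptron, the simplest case of the idea, not the full multi-layer networks the book discusses). Here it learns the AND function: weights start at arbitrary values and each wrong answer nudges them toward the target.

```python
# Minimal sketch of the training loop: preset weights, test on each
# sample, adjust weights toward the expected conclusion, repeat.
# A single neuron learning the AND function.

def step(x):
    return 1 if x > 0 else 0


weights = [0.1, -0.1]   # preset (arbitrary) weights
bias = 0.0
lr = 0.1                # learning rate: size of each adjustment

training = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

for _ in range(20):                 # repeated passes over the samples
    for inputs, target in training:
        output = step(sum(w * x for w, x in zip(weights, inputs)) + bias)
        error = target - output
        # Nudge each weight toward the conclusion the sample should give.
        weights = [w + lr * error * x for w, x in zip(weights, inputs)]
        bias += lr * error

results = [step(sum(w * x for w, x in zip(weights, inp)) + bias)
           for inp, _ in training]
print(results)   # [0, 0, 0, 1] -- the neuron has learned AND
```

Real networks stack many such neurons in layers and use a smoother adjustment rule (backpropagation), but the principle is the same: error on a sample drives a small correction to the weights.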
Problem: even when the accuracy is high enough, some nodes of the trained neural network still look random. But this resembles the brain: most connections between neurons also look random, yet as a whole these loosely linked connections produce intelligent human behavior.
<Nine Algorithms That Changed the Future> reading notes, part two