Review of AI in game development
(Personal review notes; some items are just my own revision tips (=w=). If reproduced, please credit the source: http://blog.csdn.net/hcbbt/article/details/42815479)
Supporting materials: Artificial Intelligence in game development
Knowledge points
- Movement
- Bresenham's algorithm, line-of-sight chase (omitted), intercept
```cpp
// Bresenham
if (deltaCol > deltaRow) {
    fraction = deltaRow * 2 - deltaCol;
    while (nextCol != endCol) {
        if (fraction >= 0) {
            nextRow = nextRow + stepRow;
            fraction = fraction - deltaCol;
        }
        nextCol = nextCol + stepCol;
        fraction = fraction + deltaRow;
        pathRow[currentStep] = nextRow;
        pathCol[currentStep] = nextCol;
        currentStep++;
    }
} else {
    fraction = deltaCol * 2 - deltaRow;
    while (nextRow != endRow) {
        if (fraction >= 0) {
            nextCol = nextCol + stepCol;
            fraction = fraction - deltaRow;
        }
        nextRow = nextRow + stepRow;
        fraction = fraction + deltaCol;
        pathRow[currentStep] = nextRow;
        pathCol[currentStep] = nextCol;
        currentStep++;
    }
}
```
```cpp
// Intercept (similar in spirit to line of sight)
void DoIntercept(void) {
    Vector u, v;
    bool left = false;
    bool right = false;
    Vector Vr, Sr, St;
    double tc;

    Vr = Prey.vVelocity - Predator.vVelocity;     // closing (relative) velocity
    Sr = Prey.vPosition - Predator.vPosition;     // relative position (range to target)
    tc = Sr.Magnitude() / Vr.Magnitude();         // estimated time to close the range
    St = Prey.vPosition + (Prey.vVelocity * tc);  // predicted interception point

    u = VRotate2D(-Predator.fOrientation, (St - Predator.vPosition));
    u.Normalize();

    if (u.x < -_TOL)
        left = true;
    else if (u.x > _TOL)
        right = true;

    Predator.SetThrusters(left, right);
}
```
- Movement patterns:
- Rectangular pattern
- Patrol pattern
- Pattern movement under simulated physics
- Flocking
- The core of Reynolds' basic flocking algorithm: the cohesion rule, the alignment rule, and the separation rule (see the sketch below).
- The set of neighboring units is determined by a view function of the view angle and the view radius.
- Obstacle-avoidance idea: mount a virtual feeler in front of each unit; when the feeler touches something, the unit needs to turn.
- Follow the leader
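Below is a minimal sketch of the three flocking rules, assuming a trivial Vec2/Boid setup invented for this note (not the textbook's code); each rule returns a steering vector and the caller would weight and sum them.

```cpp
#include <cmath>
#include <vector>

struct Vec2 {
    float x, y;
    Vec2(float x_ = 0.0f, float y_ = 0.0f) : x(x_), y(y_) {}
};
Vec2 operator+(const Vec2& a, const Vec2& b) { return Vec2(a.x + b.x, a.y + b.y); }
Vec2 operator-(const Vec2& a, const Vec2& b) { return Vec2(a.x - b.x, a.y - b.y); }
Vec2 operator*(const Vec2& a, float s)       { return Vec2(a.x * s, a.y * s); }

struct Boid { Vec2 pos, vel; };

// Cohesion rule: steer toward the average position of the visible neighbors.
Vec2 Cohesion(const Boid& self, const std::vector<Boid>& neighbors) {
    if (neighbors.empty()) return Vec2();
    Vec2 center;
    for (const Boid& b : neighbors) center = center + b.pos;
    center = center * (1.0f / static_cast<float>(neighbors.size()));
    return center - self.pos;
}

// Alignment rule: steer toward the average heading (velocity) of the neighbors.
Vec2 Alignment(const Boid& self, const std::vector<Boid>& neighbors) {
    if (neighbors.empty()) return Vec2();
    Vec2 avgVel;
    for (const Boid& b : neighbors) avgVel = avgVel + b.vel;
    avgVel = avgVel * (1.0f / static_cast<float>(neighbors.size()));
    return avgVel - self.vel;
}

// Separation rule: steer away from neighbors, weighted more strongly the closer they are.
Vec2 Separation(const Boid& self, const std::vector<Boid>& neighbors) {
    Vec2 push;
    for (const Boid& b : neighbors) {
        Vec2 away = self.pos - b.pos;
        float d = std::sqrt(away.x * away.x + away.y * away.y);
        if (d > 0.0f) push = push + away * (1.0f / (d * d));
    }
    return push;
}
```

The `neighbors` list would be built first using the view-angle/view-radius test mentioned above, and obstacle avoidance via feelers would add a fourth steering contribution.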
- Potential function
- Lennard-Jones potential function:
U(r) = -A / r^n + B / r^m
- Differentiating with respect to r gives the force:
F(r) = -nA / r^(n+1) + mB / r^(m+1)
- A and B set the strengths of attraction and repulsion respectively; n and m control how quickly those two forces fall off with distance.
- Applications: chasing/evading, obstacle avoidance, and swarming (a small sketch follows).
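As a small sketch, the force formula above can be coded directly; the function name and parameters here are illustrative, not from the notes.

```cpp
#include <cmath>

// Magnitude of the Lennard-Jones-style steering force at separation r:
//   F(r) = -n*A / r^(n+1) + m*B / r^(m+1)
// A negative result attracts (dominates at long range), a positive result repels (short range).
double LennardJonesForce(double r, double A, double B, double n, double m) {
    return -n * A / std::pow(r, n + 1.0) + m * B / std::pow(r, m + 1.0);
}
```

Applied along the unit vector from one unit toward another, this lets swarm members chase each other from a distance while repelling each other up close; one common design choice is to give obstacles only the repulsive term.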
- Path Search
- Basic pathfinding (omitted)
- Obstacle avoidance by random movement, and its advantages
- Going around obstacles: segment-tracing detour, line-of-sight detour
- Breadcrumb path following: 1) initialize the footprint array; 2) record the player's position; 3) drop breadcrumbs; 4) follow the breadcrumbs (see the sketch below).
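A minimal sketch of the four breadcrumb steps, assuming a fixed-size circular footprint array on a tile grid; the names and the capacity are invented for illustration.

```cpp
const int kMaxCrumbs = 64;                 // assumed capacity of the footprint array

struct Crumb { int row, col; };

Crumb crumbs[kMaxCrumbs];                  // 1. initialize the footprint array
int head = 0;                              // index where the next crumb will be dropped
int tail = 0;                              // oldest crumb the AI has not yet reached

// 2 & 3. Whenever the player moves, record its position by dropping a crumb.
void DropCrumb(int playerRow, int playerCol) {
    crumbs[head].row = playerRow;
    crumbs[head].col = playerCol;
    head = (head + 1) % kMaxCrumbs;
}

// 4. Follow the breadcrumbs: step toward the oldest crumb and, once it is
//    reached, advance to the next one.
void FollowCrumbs(int& aiRow, int& aiCol) {
    if (tail == head) return;              // no crumbs left to follow
    const Crumb& target = crumbs[tail];
    if (aiRow < target.row) aiRow++; else if (aiRow > target.row) aiRow--;
    if (aiCol < target.col) aiCol++; else if (aiCol > target.col) aiCol--;
    if (aiRow == target.row && aiCol == target.col)
        tail = (tail + 1) % kMaxCrumbs;    // crumb reached, move on to the next one
}
```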
- Walking along a road: follow the path according to its weights
- Waypoint pathfinding: a table of waypoint nodes
- A*
(I hear A* will not be on the exam)
- BFS + priority_queue (open list & closed list); see the sketch below
- Terrain Cost
- Influence mapping
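A compact, illustrative A* sketch on a grid of per-cell terrain costs: a priority_queue serves as the open list and a visited grid as the closed list. It assumes 4-way movement and per-cell costs of at least 1 so the Manhattan-distance heuristic stays admissible; it is not the course's reference implementation.

```cpp
#include <cmath>
#include <cstdlib>
#include <queue>
#include <vector>

struct Node { int row, col; float f; };                  // f = g + h, the priority key
struct CheaperF { bool operator()(const Node& a, const Node& b) const { return a.f > b.f; } };

// Returns the cheapest path cost from (sr, sc) to (gr, gc), or -1 if unreachable.
float AStar(const std::vector<std::vector<float>>& terrainCost,
            int sr, int sc, int gr, int gc) {
    const int rows = terrainCost.size(), cols = terrainCost[0].size();
    const float INF = 1e30f;
    std::vector<std::vector<float>> g(rows, std::vector<float>(cols, INF));       // best cost so far
    std::vector<std::vector<bool>> closed(rows, std::vector<bool>(cols, false));  // closed list

    auto h = [&](int r, int c) {                         // heuristic: Manhattan distance to the goal
        return static_cast<float>(std::abs(r - gr) + std::abs(c - gc));
    };

    std::priority_queue<Node, std::vector<Node>, CheaperF> open;                  // open list
    g[sr][sc] = 0.0f;
    open.push({sr, sc, h(sr, sc)});

    const int dr[4] = {1, -1, 0, 0}, dc[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        Node cur = open.top(); open.pop();
        if (closed[cur.row][cur.col]) continue;          // stale entry: already expanded more cheaply
        closed[cur.row][cur.col] = true;
        if (cur.row == gr && cur.col == gc) return g[gr][gc];
        for (int k = 0; k < 4; k++) {
            int nr = cur.row + dr[k], nc = cur.col + dc[k];
            if (nr < 0 || nr >= rows || nc < 0 || nc >= cols || closed[nr][nc]) continue;
            float ng = g[cur.row][cur.col] + terrainCost[nr][nc];   // pay the terrain cost of entering
            if (ng < g[nr][nc]) {
                g[nr][nc] = ng;
                open.push({nr, nc, ng + h(nr, nc)});
            }
        }
    }
    return -1.0f;                                        // goal unreachable
}
```

Terrain cost enters only through `terrainCost[nr][nc]`; making some cells more expensive steers paths around them, and one simple way to use an influence map is to add its values onto those costs.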
- Scripted AI and its composition
- Scripting overview: a scripting language is a programming language designed to simplify complex tasks for a particular program.
- Composition: the scripting language and the script engine
- The stack machine is a basic form commonly used in scripting-language virtual machine technology; it consists of the virtual machine, the stack, and the instruction pointer (see the sketch below).
- Example uses: describing attributes, behaviors, events, and so on.
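A toy sketch of the stack-machine idea: the virtual machine is just a stack plus an instruction pointer. The instruction set (PUSH/ADD/MUL/PRINT/HALT) is invented for illustration and is not any particular engine's bytecode.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

enum Op { PUSH, ADD, MUL, PRINT, HALT };     // hypothetical instruction set
struct Instr { Op op; int arg; };

// Run a compiled script on the virtual machine.
void Run(const std::vector<Instr>& program) {
    std::vector<int> stack;                  // the stack
    std::size_t ip = 0;                      // the instruction pointer
    while (ip < program.size()) {
        const Instr& in = program[ip++];
        switch (in.op) {
            case PUSH:  stack.push_back(in.arg); break;
            case ADD: { int b = stack.back(); stack.pop_back(); stack.back() += b; break; }
            case MUL: { int b = stack.back(); stack.pop_back(); stack.back() *= b; break; }
            case PRINT: std::printf("%d\n", stack.back()); break;
            case HALT:  return;
        }
    }
}

int main() {
    // Script for "(2 + 3) * 4": push 2, push 3, add, push 4, multiply, print.
    Run({{PUSH, 2}, {PUSH, 3}, {ADD, 0}, {PUSH, 4}, {MUL, 0}, {PRINT, 0}, {HALT, 0}});
}
```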
- Finite state machine
- Basic concepts
- A finite state machine is a behavior model made up of states, transitions, and actions. Starting from the initial (Start) state, the machine moves to the next state when it receives an input event, and a guard condition decides whether the transition may fire. A finite state machine can be viewed as a special kind of directed graph.
- Converting between the state diagram and the program
- States and the transition function: a switch statement (see the sketch below)
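A minimal sketch of the switch translation; the guard's states (PATROL/CHASE/FLEE) and the transition conditions are made-up examples.

```cpp
enum State { PATROL, CHASE, FLEE };

// One update of an NPC's finite state machine: the switch dispatches on the current
// state, and each if-condition is a guard deciding whether a transition fires.
State UpdateGuard(State current, bool playerVisible, bool lowHealth) {
    switch (current) {
        case PATROL:
            if (playerVisible) return CHASE;    // transition: spotted the player
            return PATROL;                      // action: keep patrolling
        case CHASE:
            if (lowHealth) return FLEE;         // transition: badly hurt, run away
            if (!playerVisible) return PATROL;  // transition: lost sight of the player
            return CHASE;                       // action: keep chasing
        case FLEE:
            if (!playerVisible && !lowHealth) return PATROL;
            return FLEE;
    }
    return current;                             // unreachable, keeps the compiler happy
}
```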
- Fuzzy logic
- Meaning and Essence
- Build a mathematical model of fuzzy objects: fuzzy sets take infinitely many membership values within the interval [0, 1].
- Steps
- Fuzzification
- Generate fuzzy output with the fuzzy rules
- Defuzzification
- How it differs from Boolean logic in its operations: the definition steps, membership functions, fuzzy axioms, and evaluation operations
- Fuzzification in applications
- Hedge functions: change the shape of the membership function
- Fuzzy axioms
- Intersection, union, and complement
- Defuzzification
- Find the geometric center (centroid) of the area under the output fuzzy set
- Singleton (single-value) output membership functions (see the sketch below)
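A small end-to-end sketch of fuzzify, apply rules, then defuzzify, assuming triangular membership functions and singleton outputs; every number, rule, and name here is an invented example.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Triangular membership function centered at c with half-width w (an assumed shape).
double Triangle(double x, double c, double w) {
    return std::max(0.0, 1.0 - std::fabs(x - c) / w);
}

int main() {
    double distance = 32.0;                    // crisp input: distance to the enemy

    // 1. Fuzzification: degrees of membership in "close" and "far".
    double muClose = Triangle(distance, 0.0, 80.0);
    double muFar   = Triangle(distance, 100.0, 80.0);

    // 2. Fuzzy rules (assumed): IF close THEN attack; IF far THEN wander.
    double attack = muClose;
    double wander = muFar;

    // 3. Defuzzification with singleton output membership functions:
    //    attack -> aggression 90, wander -> aggression 10 (assumed singleton positions).
    double aggression = (attack * 90.0 + wander * 10.0) / (attack + wander);
    std::printf("aggression = %f\n", aggression);   // about 74 with these numbers
}
```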
- Rule-based AI
- Definition and composition of a rule system
- Put a set of productions together and let them cooperate: the conclusion of one production can serve as a precondition of another, and the problem is solved through this interplay. Such a system is called a rule (production) system.
- A production system can also be regarded as a deductive system.
- Form: a set of if-then rules used to make inferences or action decisions.
- A rule system has two main parts: working memory (facts and assertions) and rule memory.
- Alternatively, it consists of three parts: a global database (the data base), a set of production rules (the rule base), and a control system (the rule interpreter).
- Deduction
- Matching: pair the rules against the facts stored in working memory, i.e. check whether the if part of each rule matches some set of facts and assertions in working memory.
- Conflict resolution: examine all matching rules and pick, by some strategy, the rule to fire.
- Firing: once a rule is selected it is fired, i.e. its then part is executed (see the sketch below).
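A toy sketch of the match / conflict-resolution / fire cycle over a working memory of string facts; the specific rules and the "fire the first match" strategy are illustrative assumptions.

```cpp
#include <cstdio>
#include <set>
#include <string>
#include <vector>

struct Rule {
    std::vector<std::string> ifFacts;   // the "if" part: all facts must be in working memory
    std::string thenFact;               // the "then" part: the fact asserted when the rule fires
};

int main() {
    std::set<std::string> workingMemory = {"player-visible", "has-ammo"};   // facts and assertions
    std::vector<Rule> ruleMemory = {                                        // rule memory
        {{"player-visible", "has-ammo"}, "attack"},
        {{"player-visible"}, "alert"},
        {{"attack"}, "play-attack-sound"},
    };

    bool changed = true;
    while (changed) {                               // keep deducing until nothing new is produced
        changed = false;
        std::vector<const Rule*> matches;
        for (const Rule& r : ruleMemory) {          // 1. match: if-part vs. working memory
            bool ok = !workingMemory.count(r.thenFact);   // skip rules whose conclusion already holds
            for (const std::string& f : r.ifFacts) ok = ok && workingMemory.count(f);
            if (ok) matches.push_back(&r);
        }
        if (!matches.empty()) {
            const Rule* chosen = matches.front();   // 2. conflict resolution: pick the first match
            workingMemory.insert(chosen->thenFact); // 3. fire: execute the then-part
            std::printf("fired: %s\n", chosen->thenFact.c_str());
            changed = true;
        }
    }
}
```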
- Inductive method
- Expert system
Probability theory and Bayesian techniques
- Standard probability
- That is, classical probability: p = P(E) = n/N, the number of favorable outcomes n divided by the total number of equally likely outcomes N.
- Bayesian Network and its composition
- A graphical network based on probabilistic inference; for a specific problem it can concisely express the relationships between random variables.
- A Bayesian network has two parts: the network structure and the conditional probability tables. The network structure is a directed acyclic graph whose nodes represent random variables and whose arcs (connections) represent the causal relationships between those random variables.
- Prior probability, posterior probability, and conditional probability (the cancer test positive/negative example)
- Prior probability: a probability of an event occurring, obtained from historical data or subjective judgment.
- Posterior probability: the more realistic probability obtained by using Bayes' formula to revise the prior probability with new information gathered from surveys and other sources.
- Conditional probability: the probability that one event occurs given that another event has already occurred.
- Conditional probability formula, Bayesian formula
- Conditional probability (Bayes) formula: P(B|A) = P(B) P(A|B) / P(A)
- Bayes' formula: P(Bi|A) = P(Bi) P(A|Bi) / Σ_{j=1..n} P(Bj) P(A|Bj)
- Exam point: the cancer test example (a small worked example follows)
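A tiny worked example of the cancer-test calculation in code; the prior, the sensitivity, and the false-positive rate are invented numbers used only to show how the formula is applied.

```cpp
#include <cstdio>

int main() {
    // Assumed numbers, purely for illustration of the formula:
    double pCancer = 0.01;            // prior P(cancer)
    double pPosGivenCancer = 0.9;     // P(positive | cancer)
    double pPosGivenHealthy = 0.05;   // P(positive | no cancer), the false-positive rate

    // Total probability of testing positive (the denominator of Bayes' formula).
    double pPos = pPosGivenCancer * pCancer + pPosGivenHealthy * (1.0 - pCancer);

    // Posterior P(cancer | positive) = P(cancer) * P(positive | cancer) / P(positive).
    double posterior = pCancer * pPosGivenCancer / pPos;
    std::printf("P(cancer | positive) = %f\n", posterior);   // about 0.154 with these numbers
}
```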
- Inference methods
- Causal chain: predictive inference
- Common-cause network: diagnostic inference
- Common-effect network: explaining away
- Predictive algorithms and diagnostic algorithms
- Input: a Bayesian network B (its structure, i.e. m nodes and the connections between some of them, together with the conditional or joint conditional probabilities from cause nodes to intermediate nodes); an evidence (fact) vector F stating, for some cause nodes, whether they occurred; and a node T to be predicted.
- Output: the probability that node T occurs.
- Predictive algorithm
- (1) Enter the evidence vector into Bayesian network B;
- (2) For each unprocessed node N in B, if there is a fact (evidence) stating whether it occurred, mark it as processed;
- (3) If any of N's parent nodes is unprocessed, leave N unprocessed for now; otherwise continue with the next step;
- (4) Compute node N's probability distribution from the probabilities of all of its parent nodes and the conditional (or joint conditional) probabilities, then mark node N as processed;
- Repeat steps (2)-(4) m times in total. At that point the probability distribution of node T gives the probability that it occurs / does not occur, and the algorithm ends.
- Diagnostic algorithm
- (1) Enter the evidence vector into Bayesian network B;
- (2) For each unprocessed node N in B, if there is a fact (evidence) stating whether it occurred, mark it as processed;
- (3) If any of N's child nodes is unprocessed, leave N unprocessed for now; otherwise continue with the next step;
- (4) Compute node N's probability distribution from the probabilities of its child nodes and the conditional (or joint conditional) probabilities, using the conditional probability (Bayes) formula, then mark node N as processed;
- Repeat steps (2)-(4) m times in total. At that point the probability distribution of the cause node T gives the probability that it occurs / does not occur, and the algorithm ends.
Neural network
- Meaning and structure
- An abstraction and simulation of several basic characteristics of the human brain or of natural neural networks.
- "An artificial neural network (ANN) is a dynamical system with a directed-graph topology that processes information by making state responses to continuous or intermittent inputs."
- Features: approximation of nonlinear relationships, robustness and fault tolerance, parallel distributed processing, learning and adaptivity, and simultaneous handling of quantitative and qualitative knowledge.
- Structure: divided into an input layer (predictors), an output layer (target variables), and hidden layers (processing).
- Feedforward Neural Networks
- Each node's value comes from the nodes in front of it: the incoming values are weighted by the connection weights, summed, and fed into an activation function to produce a new value, which then propagates on to the next node.
- When the output is wrong, the neural network "learns".
- The weight of a connection between nodes can be viewed as how much the later node "trusts" the earlier node.
- The weights are adjusted using a penalty method.
- Total node input = SUM(output of each preceding node * weight of the connection) + bias term
- An activation function makes the output nonlinear: the logistic (sigmoid) function, the step function, the hyperbolic tangent, or a linear activation function.
- Bias term = bias value * bias weight. It shifts the total input horizontally, effectively changing the neuron's activation threshold (see the sketch below).
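A minimal sketch of the total-input, bias, and activation formulas above for a single neuron with a logistic activation; the weights, inputs, and names are arbitrary examples.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Logistic (sigmoid) activation function.
double Logistic(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Total input = SUM(preceding node's output * connection weight) + bias value * bias weight;
// the activation function then turns the total input into the neuron's output.
double NeuronOutput(const std::vector<double>& inputs,
                    const std::vector<double>& weights,
                    double biasValue, double biasWeight) {
    double total = biasValue * biasWeight;   // the bias term shifts the activation threshold
    for (size_t i = 0; i < inputs.size(); i++)
        total += inputs[i] * weights[i];
    return Logistic(total);
}

int main() {
    double out = NeuronOutput({0.5, 0.8}, {0.4, -0.2}, /*biasValue=*/1.0, /*biasWeight=*/0.1);
    std::printf("neuron output = %f\n", out);
}
```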
- Supervised and unsupervised learning networks
- A supervised learning network is given training examples from the problem domain, including both the input data and the corresponding output data.
- An unsupervised learning network does not know, while learning, whether its classification results are correct.
- Training
- The goal of training is to find the connection weights between all the neurons such that the inputs produce the desired output values.
- First, a training data set is needed, consisting of input data and the corresponding output data;
- Then, using some technique, run the whole network repeatedly to find a set of weights with which the network reproduces, for each group of inputs in the training set, the corresponding output;
- Finally, run the network on new data that is not in the training set and have it produce reasonable outputs.
- Backpropagation and its procedure
- The essence of backpropagation is to minimize the error by trial and adjustment: given the inputs and the resulting outputs, compare the outputs with the known desired outputs and use the mean squared error to quantify how well the two agree.
- 1. Prepare the training data set
- 2. Initialize the weights to small random numbers
- 3. Compute the output values
- 4. Compare with the desired values and compute the error (mean squared error)
- 5. Adjust the weights (correction = learning rate * neuron error * output value of the neuron feeding the connection), then repeat (see the sketch below)
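A minimal sketch of step 5's update rule for a single sigmoid output neuron (the delta rule); the learning rate, inputs, and target value are arbitrary examples, and a full network would also propagate the error back through the hidden layers.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

double Logistic(double x) { return 1.0 / (1.0 + std::exp(-x)); }

int main() {
    std::vector<double> w = {0.3, -0.1};     // connection weights
    std::vector<double> x = {1.0, 0.5};      // outputs of the preceding (input) neurons
    double desired = 1.0;                    // known desired output
    double learningRate = 0.5;

    for (int epoch = 0; epoch < 1000; epoch++) {
        // Forward pass: weighted sum through the logistic activation.
        double net = 0.0;
        for (size_t i = 0; i < x.size(); i++) net += w[i] * x[i];
        double out = Logistic(net);

        // Neuron error for a sigmoid output: (desired - out) * out * (1 - out).
        double error = (desired - out) * out * (1.0 - out);

        // Correction = learning rate * neuron error * output of the neuron feeding that weight.
        for (size_t i = 0; i < w.size(); i++)
            w[i] += learningRate * error * x[i];
    }

    std::printf("trained weights: %f %f\n", w[0], w[1]);
}
```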
- Genetic algorithm
- Concept
- Population: organisms evolve in groups; such a group is called a population.
- Population size: The number of individual chromosomes in a population.
- Individual: a single organism making up the population, i.e. one candidate solution to the optimization problem.
- Gene: a genetic factor.
- Chromosome: contains a set of genes. Chromosomes are usually represented as simple strings of characters or digits, though other problem-specific representations exist; this representation step is called encoding.
- Survival of the fittest: individuals that are highly adapted to the environment get more chances to reproduce and leave more offspring; poorly adapted individuals are less likely to reproduce and leave fewer offspring.
- Heredity and mutation: a new individual inherits parts of its genes from each parent and, at the same time, has a certain probability of gene mutation.
- Crossover probability: Controls the frequency at which the crossover operator is used.
- Mutation Probability: Controls the frequency of use of the mutation operator.
- Mechanism
- First, the algorithm randomly generates a certain number of individuals, computes their fitness, and sorts them;
- Next, it produces the individuals of the next generation and assembles the new population; this is done through selection and evolution;
- A common selection method is proportional selection, i.e. the roulette-wheel algorithm;
- Evolution includes crossover and mutation.
- Termination criteria: a generation limit, a resource limit, reaching an optimal value, fitness saturation, or human intervention
Simple algorithm
Roulette-wheel selection algorithm
```cpp
/* Select one individual at random according to preset probabilities;
   P[i] is the probability that the i-th individual is selected. */
int RWS() {
    m = 0;
    r = Random(0, 1);    // r is a random number between 0 and 1
    for (i = 1; i <= N; i++) {
        /* If r falls within (m, m + P[i]], individual i is considered selected,
           so i is selected with probability P[i]. */
        m = m + P[i];
        if (r <= m)
            return i;
    }
}
```
Simple genetic algorithm (pseudocode)
```
/* Pc: probability that crossover occurs; Pm: probability that mutation occurs;
   M: population size; G: generation limit for terminating evolution;
   Tf: if the fitness of any individual produced by evolution exceeds Tf,
   the evolutionary process may be terminated. */
Initialize the parameters Pm, Pc, M, G, Tf, etc.
Randomly generate the first-generation population Pop;
do {
    Compute the fitness F(i) of every individual in population Pop;
    Initialize an empty population newPop;
    do {
        Select 2 individuals from Pop using fitness-proportional selection;
        if (random(0, 1) < Pc) {
            Perform the crossover operation on the 2 individuals;
        }
        if (random(0, 1) < Pm) {
            Perform the mutation operation on the 2 individuals;
        }
        Add the 2 new individuals to population newPop;
    } until (M offspring have been created);
    Replace Pop with newPop;
} until (the fitness of some chromosome exceeds Tf, or the number of generations exceeds G);
```