Fusion of image velocity data and accelerometer data based on Kalman filtering algorithm


Before recently improving the visual position-hold algorithm, I only had a position loop; now I plan to cascade a velocity loop inside it. Estimating the drone's translational velocity, however, turned out to be quite a headache, and there is very little information about it online, so I had to work it out myself.
The first problem is measuring horizontal velocity. The traditional method is GPS. In the scenario my UAV is designed for, a GPS signal is available but not very stable, and GPS accuracy does not meet the requirements of the visual guidance I need, because a GPS position loop does not feed back the position of the target object I care about.
Second, if you simply integrate the accelerometer to get velocity, you will find that after a while the computed velocity has drifted off to who knows where; the drift is far too large to meet the design requirements.
However, there are plenty of examples of fusing GPS with an accelerometer to measure velocity, so in the same way we can fuse the position information computed from the image with the accelerometer data.

At first I used a complementary filter to fuse the image data with the accelerometer data. The result was not satisfactory, so I will only post the code here. There is little difference between the first-order and second-order versions; below is just the first-order complementary filter code:

    para_pos.filter_vx = para_pos.filter_vx_coeff * para_pos.v_x
        + (1 - para_pos.filter_vx_coeff) * (para_pos.filter_vx + ax * dt);
    para_pos.filter_vy = para_pos.filter_vy_coeff * para_pos.v_y
        + (1 - para_pos.filter_vy_coeff) * (para_pos.filter_vy + ay * dt);

Fusion is done in two directions: the X and Y velocities are fused separately. Adjusting the two coefficients filter_vx_coeff and filter_vy_coeff sets how much to trust the integrated accelerometer velocity versus the velocity computed from the image (see the standalone sketch below). As you can see, the code is very simple, yet it does work to some extent. On a two-wheel balancing robot, for example, if the requirements on the angle estimate are not too high, this algorithm is enough to keep the robot upright.
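
For reference, here is the same first-order complementary filter written as a standalone function. The function name, the way dt and the coefficient are passed in, and the 0.95f example value are my own illustrative choices, not taken from the original flight code:

    /* Minimal sketch of the first-order complementary filter above. */
    static float comp_filter_vx = 0.0f;   /* fused X-velocity estimate */

    float complementary_filter_vx(float v_img,  /* velocity from image differencing */
                                  float acc_x,  /* X acceleration from the IMU      */
                                  float dt,     /* loop period in seconds           */
                                  float coeff)  /* trust in the image velocity, 0..1 */
    {
        /* coeff close to 1 trusts the noisy but drift-free image velocity;
         * coeff close to 0 trusts the smooth but drifting accelerometer integral. */
        comp_filter_vx = coeff * v_img
                       + (1.0f - coeff) * (comp_filter_vx + acc_x * dt);
        return comp_filter_vx;
    }

It would be called once per control loop, for example comp_filter_vx = complementary_filter_vx(para_pos.v_x, ax, dt, 0.95f);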

Now let's get to the main point: Kalman filter data fusion of the two sensors.
The Kalman filter mainly uses five equations (the derivation is not given here): two prediction equations and three update equations.
Prediction equations:

    x̂(k|k-1) = A·x̂(k-1|k-1) + B·u(k)
    P(k|k-1) = A·P(k-1|k-1)·Aᵀ + Q

Update equations:

    K(k)   = P(k|k-1)·Hᵀ·(H·P(k|k-1)·Hᵀ + R)⁻¹
    x̂(k|k) = x̂(k|k-1) + K(k)·(z(k) - H·x̂(k|k-1))
    P(k|k) = (I - K(k)·H)·P(k|k-1)
The Kalman filter produces the optimal estimate for a linear system, so the sensor or system must be (close to) linear for Kalman filtering to apply. The filter does not need a long history of system states, because it relies only on the previous state and the covariance matrix that describes the uncertainty of that state, which is corrected probabilistically at each step. This keeps the computation light and satisfies the system's real-time requirements. The Kalman filter has two groups of equations: the prediction equations and the update equations. The prediction equations predict the current state from the previous state and the control input, while the update equations decide how much to trust the sensor measurement versus the prediction (determined by the Kalman gain K). The general working principle is to predict the current state with the prediction equations and then correct that prediction with the update equations, repeating this cycle so the state estimate is continuously refreshed. The relationship between the prediction equations and the update equations is shown in Figure 3-3.
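
To make the predict/update cycle concrete, here is a minimal one-dimensional sketch in C. It assumes the simplest possible model (the state is a single constant scalar, so A = 1, B = 0, H = 1); the names and the noise variances q and r are purely illustrative:

    /* One-dimensional Kalman filter: the state is a single scalar x,
     * the model is "x stays constant", and z is a noisy measurement of x. */
    typedef struct {
        float x;   /* state estimate             */
        float p;   /* estimate variance          */
        float q;   /* process noise variance     */
        float r;   /* measurement noise variance */
    } kalman1d_t;

    float kalman1d_step(kalman1d_t *kf, float z)
    {
        /* Prediction: with A = 1 and B = 0 only the variance grows. */
        kf->p += kf->q;

        /* Update: compute the gain, correct the state, shrink the variance. */
        float k = kf->p / (kf->p + kf->r);
        kf->x += k * (z - kf->x);
        kf->p *= (1.0f - k);

        return kf->x;
    }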

The passage above is excerpted from an article on applying the Kalman filter to the fusion of gyroscope and accelerometer data, which explains the Kalman filter. Following that explanation, together with some matrix theory (matrix multiplication, covariance matrices, and so on) and the five equations above, you can understand the entire derivation and eventually implement it in code.

The Kalman filter code is written mainly from the five formulas (3-2) ~ (3-6). In that application, the integral of the Y-axis angular velocity over time drives the prediction, i.e. the control input is u(k) = gyro_y, and the pitch angle computed from the accelerometer (pitch = arctan(accel_x / accel_z)) is the observation, i.e. z(k) = pitch. Since we need both the Y-axis angular velocity and the pitch angle, the Kalman filter has to estimate two quantities: the pitch angle and the gyro drift q_bias. From formula (3-2), the angle measurement model is established as follows (forgive the direct screenshot in the original; the formulas are troublesome to typeset).
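
Written out in its usual form (the original screenshot may differ in notation), that model is:

    pitch(k)  = pitch(k-1) + (gyro_y - q_bias(k-1))·dt
    q_bias(k) = q_bias(k-1)
    z(k)      = pitch(k)

that is, with state x = [pitch, q_bias]ᵀ, the matrices are A = [[1, -dt], [0, 1]], B = [dt, 0]ᵀ and H = [1, 0]. The velocity-fusion code below follows exactly the same structure, with the image velocity in place of the pitch angle and the accelerometer in place of the gyro.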

I recommend looking at this document on Baidu Wenku; I think it explains things quite clearly, and it is how I came to fully understand the derivation:
https://wenku.baidu.com/view/3c42b7733186bceb18e8bb29.html

Finally, here is my Kalman fusion code, for the X direction only:

    void Kalman_filter(float vx, float acc_x, float dt)
    {
        /* Prediction: integrate the bias-corrected acceleration to predict the velocity. */
        filter_vx.v_acc += (acc_x - filter_vx.q_bias) * dt;

        /* Innovation: difference between the image velocity and the predicted velocity. */
        filter_vx.v_err = vx - filter_vx.v_acc;

        /* Predict the error covariance matrix P (Pdot holds its rate of change). */
        filter_vx.Pdot[0] = filter_vx.Q_v - filter_vx.P[0][1] - filter_vx.P[1][0];
        filter_vx.Pdot[1] = -filter_vx.P[1][1];
        filter_vx.Pdot[2] = -filter_vx.P[1][1];
        filter_vx.Pdot[3] = filter_vx.Q_acc;

        filter_vx.P[0][0] += filter_vx.Pdot[0] * dt;
        filter_vx.P[0][1] += filter_vx.Pdot[1] * dt;
        filter_vx.P[1][0] += filter_vx.Pdot[2] * dt;
        filter_vx.P[1][1] += filter_vx.Pdot[3] * dt;

        /* Compute the Kalman gain. */
        filter_vx.Pct_0 = filter_vx.C_0 * filter_vx.P[0][0];
        filter_vx.Pct_1 = filter_vx.C_0 * filter_vx.P[1][0];
        filter_vx.E     = filter_vx.R_angle + filter_vx.C_0 * filter_vx.Pct_0;
        filter_vx.K_0   = filter_vx.Pct_0 / filter_vx.E;
        filter_vx.K_1   = filter_vx.Pct_1 / filter_vx.E;

        /* Update the error covariance matrix. */
        filter_vx.t_0 = filter_vx.Pct_0;
        filter_vx.t_1 = filter_vx.C_0 * filter_vx.P[0][1];
        filter_vx.P[0][0] -= filter_vx.K_0 * filter_vx.t_0;
        filter_vx.P[0][1] -= filter_vx.K_0 * filter_vx.t_1;
        filter_vx.P[1][0] -= filter_vx.K_1 * filter_vx.t_0;
        filter_vx.P[1][1] -= filter_vx.K_1 * filter_vx.t_1;

        /* Correct the velocity estimate and the accelerometer bias. */
        filter_vx.v_acc  += filter_vx.K_0 * filter_vx.v_err;
        filter_vx.q_bias += filter_vx.K_1 * filter_vx.v_err;

        para_pos.filter_vx = filter_vx.v_acc;
    }
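
The struct definitions behind this code are not shown in the post. As a sketch, a layout consistent with the members used above would look like the following; the initial values for P, Q_v, Q_acc, R_angle and C_0 are my own placeholders and would have to be tuned:

    /* Sketch of the filter state implied by the code above; not from the original post. */
    typedef struct {
        float v_acc;     /* fused X-velocity estimate                    */
        float q_bias;    /* estimated accelerometer bias                 */
        float v_err;     /* innovation: image velocity minus prediction  */
        float P[2][2];   /* error covariance matrix                      */
        float Pdot[4];   /* rate of change of P                          */
        float Q_v;       /* process noise of the velocity state          */
        float Q_acc;     /* process noise of the bias state              */
        float R_angle;   /* measurement noise of the image velocity      */
        float C_0;       /* first element of the measurement matrix H    */
        float Pct_0, Pct_1, E, K_0, K_1, t_0, t_1;  /* intermediates     */
    } kalman_vx_t;

    kalman_vx_t filter_vx = {
        .P = {{1.0f, 0.0f}, {0.0f, 1.0f}},
        .Q_v = 0.001f, .Q_acc = 0.003f,   /* placeholder tuning values */
        .R_angle = 0.5f, .C_0 = 1.0f,
    };

The filter would then be called once per control period, for example Kalman_filter(para_pos.v_x, ax, dt); after which para_pos.filter_vx (defined elsewhere in the flight code) holds the fused X velocity.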

Finally, let's look at the fusion result. In the plot, the top trace is the position computed from the image; the middle trace, full of spikes and noise, is the velocity obtained by differencing the image position; and the green trace is the fused velocity.

As you can see, the fused velocity waveform basically reflects the direction and magnitude of the motion seen in the image, without the heavy noise of the raw image velocity. The result is quite good.


Finally, running the fused data through a low-pass filter makes it ready to use. For controlling a velocity loop this accuracy is sufficient. To improve it further, the image-processing part of the algorithm needs more work: the current image algorithm can still occasionally lose the target, and with further refinement the result should be even better.
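
As an illustration of that last smoothing step, a first-order low-pass filter is enough; the coefficient alpha here is an assumption to be tuned, not a value from the original code:

    /* Simple first-order low-pass filter for the fused velocity.
     * alpha in (0, 1]: smaller alpha gives stronger smoothing but more lag. */
    float lowpass_filter(float prev_out, float input, float alpha)
    {
        return prev_out + alpha * (input - prev_out);
    }

    /* Example: smoothed_vx = lowpass_filter(smoothed_vx, para_pos.filter_vx, 0.2f); */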

In addition, many optical-flow modules on the market now (mostly copies of Pixflow) can also be fused with the accelerometer, and the result should be even better. I have modified Pixflow's code and parameters myself, but I still have doubts about removing the flow caused by rotation; simple linear compensation does not work well. I originally intended to add an optical-flow module to my vision system, but later decided it probably was not suitable for my application scenario, so I did not use it. Friends with optical-flow experience are welcome to discuss with me how to remove rotation-induced flow noise. When I have time I will write up some of my own research on optical-flow modules.

Please indicate the source when reprinting: http://blog.csdn.net/gyh_420/article/details/76762118
