Transferred from Su Ying (MSN: suyuwang3@hotmail.com)
Contents:
Objective
Driver section:
1. Schematic diagram of the Micron sensor ISP
2. Functional framework of the sensor
3. Sensor initialization steps
4. Sensor settings at preview time
5. Sensor settings at capture time
6. Debugging power-frequency (mains flicker) interference
7. Brightness and night-view mode
Debugging section:
1. Sharpness testing
2. Gray-scale reproduction
3. Picture uniformity and vignetting (dark-corner) compensation
4. Distortion
5. White-balance debugging
Objective
The Micron sensor is the image sensor our company uses most, and it is also a well-regarded mainstream product on the market today. The purpose of this article is to give whoever takes over sensor debugging a preliminary feel for the work and some understanding of the characteristics of Micron sensors, in the hope that future debugging can avoid a few detours. For the detailed working principles of each sensor and deeper image-engineering knowledge, refer to the individual sensor datasheets and to the camera test data that can be found online.
Driver section:
Schematic diagram of the Micron sensor ISP:
The following figure shows the functional framework of the sensor:
The Sensor Core registers directly control the sensor core (page 0 of the sensor register map).
The registers in the Image Flow Processor mainly control the sensor's algorithms; the color pipeline mainly handles the output data and some control signals, such as base configuration, lens shading, resize, and output format (page 1).
Camera Control covers the sensor-core algorithms; control of the sensor core is done through this register group (page 2), e.g. AE, AWB, flicker, and the camera control sequencer.
Sensor initialization steps:
Initializing a sensor generally consists of the following steps:
1. Power up the sensor. The Micron sensor's supplies are divided into digital power, analog power, and I/O power. There is no strict sequencing requirement among the three, and they can be enabled at the same time in code.
2. Output MCLK to the sensor and configure the frequency at which PCLK is sampled; this is the key to receiving the sensor data correctly.
3. Configure the polarity of the V and H sync outputs. If the polarity is configured incorrectly, the image cannot be captured properly and the display is naturally garbled.
4. Hardware reset. The Micron sensor's reset is active low and must be held for at least 1 µs.
5. Software reset. Since this reset is performed by software, the baseband or multimedia chip must already be able to write the sensor's registers; that is, I2C must be able to write data into the sensor ISP, which is the foundation for all software debugging. The software reset usually varies from sensor to sensor: the MT9D111, for example, contains an MCU, so the MCU must be reset at the same time, whereas the MT9M111 has no MCU. Note that after the hardware reset some time must be left before using the I2C bus, usually more than 10 µs.
6. Micron's MT9D111-series 2-megapixel sensors with a built-in ISP hold a default register set after reset, so they can output an image without a single I2C register write; at that point the sensor's input clock is twice its output clock. Early in bring-up this can be used to verify that the power supplies, reset, and so on are correct in both hardware and software. When the back-end image engine can only synchronize with MCLK, the receive sampling frequency must be configured correctly, otherwise a correct image cannot be captured.
The picture above shows the symptom of a mismatched sampling frequency. Note how it differs from a YUV/RGB byte-order configuration error; the figure below shows an inverted byte order:
7. Write the initialization registers supplied by the Micron engineer, and configure the output frequency and the output image resolution.
8. Read the sensor version number. If it matches our product's version, initialization has completed correctly.
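The power-up and reset portion of the steps above (steps 1, 2, and 4) can be sketched as below. This is only an illustrative sequence: the HAL function names, the 24 MHz MCLK, and the 10 µs delays are assumptions, not datasheet values; step 3 is receiver-side configuration and steps 5 onward are I2C writes, so they are omitted here.

```c
#include <stdio.h>

/* Hypothetical platform hooks -- substitute your board's BSP calls. */
static void power_enable(const char *rail)       { printf("enable %s\n", rail); }
static void set_mclk_hz(unsigned long hz)        { printf("mclk %lu\n", hz); }
static void gpio_set(const char *pin, int level) { printf("%s=%d\n", pin, level); }
static void delay_us(unsigned us)                { (void)us; /* busy-wait or timer */ }

int sensor_power_up(void)
{
    /* Step 1: the three rails have no strict order, so enable them together. */
    power_enable("DVDD");   /* digital */
    power_enable("AVDD");   /* analog  */
    power_enable("IOVDD");  /* I/O     */

    /* Step 2: drive MCLK before releasing reset (24 MHz is an assumed value). */
    set_mclk_hz(24000000UL);

    /* Step 4: reset is active low and must be held at least 1 us;
       10 us is used here for margin (assumed, not a datasheet figure). */
    gpio_set("RESET_N", 0);
    delay_us(10);
    gpio_set("RESET_N", 1);
    delay_us(10);           /* settle before the first I2C access */

    return 0;
}
```

Once this returns, step 5's software reset can proceed over I2C.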
The rationale for initializing the sensor is simple, and on a mature platform it may succeed on the first try; otherwise a lot of time can go into chasing problems, especially instability when I2C writes to the sensor, or the sensor accepting I2C commands and data but not producing the expected output, which is more troublesome. Since the 2M sensor ISP already carries an MCU, it can be treated as a small application processor to maintain: when writing registers that change the sensor's internal working state, pay attention to the required delays. Sometimes a register must be written several times before it takes effect reliably, a piece of experience for which no explanation has yet been found. Checking along the process above should accurately locate more than 80% of problems.
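The "write a register several times" experience above can be made systematic with a write-and-verify helper. The sketch below is illustrative only: the I2C functions are stubs that simulate a sensor dropping the first write, and the function names and the sample address/value are assumptions, not a real driver API.

```c
#include <stdint.h>

/* Simulated flaky register: the very first write is "lost", mimicking the
   instability described above. These stubs stand in for real I2C calls. */
static uint16_t shadow;
static int writes_seen;

static int i2c_write_reg(uint8_t addr, uint8_t reg, uint16_t val)
{
    (void)addr; (void)reg;
    if (++writes_seen > 1)      /* drop only the first write */
        shadow = val;
    return 0;
}

static int i2c_read_reg(uint8_t addr, uint8_t reg, uint16_t *val)
{
    (void)addr; (void)reg;
    *val = shadow;
    return 0;
}

/* Write a sensor register and read it back, retrying up to 'tries' times. */
int sensor_write_verified(uint8_t addr, uint8_t reg, uint16_t val, int tries)
{
    while (tries-- > 0) {
        uint16_t rb;
        if (i2c_write_reg(addr, reg, val) == 0 &&
            i2c_read_reg(addr, reg, &rb) == 0 &&
            rb == val)
            return 0;           /* confirmed inside the sensor */
    }
    return -1;                  /* give up after N attempts */
}
```

For state-changing registers, a bounded retry like this is safer than a single fire-and-forget write.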
Sensor settings at preview time:
To get a higher frame rate, preview normally uses a low-resolution output, i.e. width and height are only half of the maximum, and an even smaller output resolution can be set. In that mode the sensor outputs data at intervals rather than as a uniformly sampled stream, and because the CoreLogic multimedia chip cannot receive an irregular PCLK, this mode cannot be used there; the camera interfaces of some other platforms do not have this problem. The aim is to output an image directly at the screen size, reducing the complexity of back-end processing and the size of the preview buffer.
Sensor settings at capture time:
Capture needs higher resolution and better image quality, so a high-resolution output must be used, which means the sensor needs a different set of register settings when switching to capture. Micron sensors give the user two relatively independent contexts that can hold two register groups; by default preview uses context A and capture uses context B, where context A is normally a low-power mode and context B a full-power mode. Because CoreLogic can only use MCLK to synchronize its sampling, it can only accept a regular PCLK, otherwise sampling goes wrong; this forces both context A and context B into full-power mode so that, whether previewing or capturing, the sensor outputs PCLK at a fixed frequency and the back-end chip can receive the image data normally, without garbled frames or wrong colors. This is not the solution Micron recommends but a workaround, so at least half of the problems on the Vienna platform trace back to this CoreLogic flaw and to the sensor-output changes made to paper over it. If the multimedia application processor is changed later, consider whether the new chip can accept an irregular PCLK: at present the Vienna and Qualcomm platforms do not support a changing PCLK, while the Vision platform does.
Output in low-power mode
Output in full-power mode
This demonstrates that CoreLogic cannot accept the sensor output of low-power mode.
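The context selection itself boils down to a register write. In the sketch below the register address and values are placeholders, not the real MT9M111 encoding (take those from the datasheet), and the I2C stub merely records the write.

```c
#include <stdint.h>

/* Placeholder context-select encoding -- NOT real MT9M111 values;
   consult the sensor datasheet for the actual address and bits. */
#define CTX_SELECT_REG  0xC8    /* placeholder address */
#define CTX_A_PREVIEW   0x0000  /* placeholder value   */
#define CTX_B_CAPTURE   0x0001  /* placeholder value   */

/* Stub bus driver that records the last write for inspection. */
static uint16_t last_reg, last_val;
static int i2c_write_reg(uint16_t reg, uint16_t val)
{
    last_reg = reg;
    last_val = val;
    return 0;
}

/* Switch to capture (context B). Both contexts are programmed for
   full-power output so PCLK stays regular, as described above. */
int sensor_enter_capture(void)
{
    return i2c_write_reg(CTX_SELECT_REG, CTX_B_CAPTURE);
}
```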
Because the output clock frequency is fixed for both preview and capture, and a CMOS sensor uses line exposure, the exposure time equals the shutter width (in lines) multiplied by the line time, where line time = HSYNC time + HBLANK time. For a 1.3-megapixel sensor, preview outputs VGA, so the HSYNC time is proportional to 640 pixels, while at capture it is proportional to 1280 pixels; with the shutter width K unchanged, the line time, and hence the exposure time, changes enormously, and the symptom is that the captured picture is obviously overexposed. Micron, however, provides a register that changes the shutter width; by adjusting it the combined exposure time can be kept constant, and the problem is solved. In aging tests a wrong exposure often occurs because this register was not written, or was written but the sensor did not act on it. Since the value used during the preview->capture->preview transitions is computed by software in real time, anything that disturbs the I2C reads and writes, or an ambient-brightness change that makes the sensor's brightness calculation wrong, will affect exposure, so this code must be maintained with particular care. If a captured photo differs greatly in quality from the preview image, start checking from this code.
The image output during preview.
The captured image, showing obvious overexposure.
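The bookkeeping above can be condensed into one helper: since exposure = shutter width × line time and the line time scales with the line length, doubling the line length (VGA to SXGA) means halving the shutter width to keep exposure constant. The function name and units are illustrative, and a constant pixel clock across the two modes is assumed.

```c
#include <stdint.h>

/* Rescale the shutter width (in lines) when the line length changes,
   keeping total exposure = shutter_width * line_time constant.
   Line lengths are in pixel clocks; PCLK is assumed unchanged. */
uint32_t rescale_shutter(uint32_t shutter_width,
                         uint32_t old_line_px, uint32_t new_line_px)
{
    /* Round to nearest to minimise the brightness step at the switch. */
    return (uint32_t)(((uint64_t)shutter_width * old_line_px + new_line_px / 2)
                      / new_line_px);
}
```

For example, a preview shutter width of 400 lines at a 640-pixel line becomes 200 lines at a 1280-pixel line.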
In our experience, a mode-switch failure does not necessarily prevent the megapixel image from being output at all, but if the output is abnormal, the mode switch has failed. The time a mode switch takes, and its success rate, are related to the frame rate: in general, the faster the frame rate, the shorter the switch time and the higher the success rate. When switching from capture back to preview, the exposure value must also be set so that the image does not suddenly darken; if the picture gets darker and darker, the settings applied on returning to preview have most likely failed.
Note: during a mode switch an oscilloscope shows the sensor suddenly pulling VSYNC low to form a longer blanking interval before it outputs the signal of the other mode. One occasional instability is that this blanking interval is too long, especially at low frame rates, so the back-end ISP or DSP collects no data and cannot raise the camera interrupt, causing the task to hang (drop into the idle task) or the system to reboot via the watchdog. The specific cause and countermeasure can be analyzed according to each platform's implementation.
At this point the debugging of the sensor's two basic states is essentially complete; what remains is maintaining this code.
Power-frequency (mains flicker) interference:
If the phone shows water ripples like those in the figure below, that is power-frequency interference, caused by the flicker of indoor fluorescent lamps. Sensors built on the two different processes, CMOS and CCD, show power-frequency interference differently, because they expose in different ways.
CMOS uses line exposure, i.e. the exposure time of each line determines the brightness of that line. For example, with a 50 Hz light source the voltage curve is a sine wave; qualitatively, the energy curve can be taken as the absolute value of the voltage curve, so the energy has a period of 1/100 second. The exposure time must therefore be an integer multiple of 1/100 second; if it is not, each line may receive a different exposure, producing water ripples within a single image. A CCD exposes the whole frame at once, so power-frequency interference appears there as a slight flicker of the image; the underlying mechanism is the same as for a CMOS sensor.
If you run into this problem, first calculate the exposure time and then fine-tune from that starting point; the interference should soon be tuned away.
Micron provides a register that adjusts the exposure to eliminate power-frequency interference.
Algorithm: line time × R0x58 (page 2) / PCLK = n/100. (The value given by this formula still has to be fine-tuned by test. R0x58 is an MT9M111 register; other sensors use a different address, but a similar register can always be found. n is a natural number. The datasheet does not explain this, but if you understand the CMOS exposure principle it is easy to follow.)
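As a worked example of the formula: with the line time measured in pixel clocks, solving line_time × R / PCLK = n/100 for R gives R = n × PCLK / (100 × line_time). The function name and the sample numbers below are illustrative only, not taken from any datasheet.

```c
#include <stdint.h>

/* Solve line_time * R / pclk = n/100 for the anti-flicker register R
   (R0x58 on the MT9M111; other sensors use a similar register).
   line_time_pclks: line time in pixel clocks; pclk_hz: pixel clock in Hz;
   n: natural number of 1/100 s mains-energy periods per exposure step. */
uint32_t flicker_reg(uint32_t line_time_pclks, uint32_t pclk_hz, uint32_t n)
{
    return (uint32_t)((uint64_t)n * pclk_hz / (100ULL * line_time_pclks));
}
```

With an assumed 27 MHz PCLK and a 1560-clock line, n = 1 gives a register value of 173, which is then fine-tuned by test as noted above.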
Brightness and night-view mode:
Everyone now knows that image brightness depends on exposure time, so for dark scenes to show up clearly the sensor must expose the dark image longer: the line time is set longer than in normal mode, giving the CMOS sensor more exposure time and therefore a brighter picture.
The picture above was taken without night-view mode; the picture below was taken with it.
There are two ways to control image brightness: adjusting the AE target, or increasing gain. We use the AE target, so the picture colors are more lifelike. This method lowers the frame rate, and once the frame rate reaches our lower limit, brightness is raised further by increasing the analog gain, which amplifies the image noise at the same time. In general there is no need to raise the digital gain to adjust image brightness.
Debugging section
Unlike a typical IC driver, a sensor needs its image quality debugged in addition to basic function and stability, and every sensor goes through the same evaluation tests and debugging. Camera debugging usually covers the following main areas:
Sharpness testing
Tested with the ISO 12233 standard chart:
Center vertical resolution
Center horizontal resolution
Mainly read off the chart value at which the naked eye can just distinguish the lines.
Sharpness is determined mainly by the sensor fabrication process and the lens parameters, but it can be enhanced by adjusting the sharpening setting.
The side effect is that heavier sharpening hurts the smoothness of the image, making object edges overly conspicuous and even jagged.
Gray-scale reproduction test
A typical sensor has a set of registers for adjusting the gamma curve, the so-called gamma table. Because of my limited experience, this debugging is usually done by Micron engineers.
Gamma plot
Picture uniformity and vignetting (dark-corner) compensation:
The figure above shows that the picture is not very uniform: the brightness of the center and the edges differs noticeably. Because of the lens, the pixels in the middle of the sensor always receive comparatively more exposure. Micron's lens-shading correction can be tuned to fix this and make the picture much more even.
The picture after tuning
This tuning must guard against introducing ring-shaped (aperture) artifacts.
Distortion:
Distortion comes from the lens and is determined by the module's manufacturing process, so it cannot be improved by changing ISP settings.
The figure below is a sample chart for testing distortion.
White balance:
Before discussing white balance, we first need the concept of color temperature.
In photography, light sources are mostly defined by their color temperature. The unit of color temperature is the kelvin: the color a black body radiates at a given temperature is that color temperature. When a black body is heated it begins to glow, first turning dark red; as the temperature keeps rising it turns yellow, then white, and would eventually turn blue (watch the filament in a light bulb, though you will never see it turn blue because of its temperature limit). In short, the phenomenon is very common in daily life.
The image above shows the same set of objects lit by sources of different color temperatures.
The human brain carefully analyzes the signals from the eyes and thus perceives the same white under different color temperatures. A camera cannot: photos taken in the morning come out reddish and at dusk yellowish, and even the same sheet of white paper photographed under different conditions, at different times or under different light sources, shows varying degrees of color cast.
Adjusting the white balance means giving the camera a definition of white, so that the colors our eyes see are recorded correctly. Micron sensors provide two white-balance control methods: automatic white balance (AWB) and manual white balance (MWB).
Automatic white balance is the default setting of the sensor's internal ISP. The ISP contains a sophisticated decision chart with which it determines the scene's white-balance reference point and adjusts accordingly. Since a phone camera does not fall into the category of cameras with strict color requirements, we generally use automatic white balance, which works well as long as the light source is not especially complex.
Manual white balance requires setting the R, G, and B gain values yourself; Micron provides dedicated registers for these three values.
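To illustrate what the three gains do (the fixed-point format and the function below are assumptions for this sketch, not the actual Micron register encoding): photograph a patch that should be neutral grey, average its R/G/B, and scale R and B toward G.

```c
#include <stdint.h>

/* Derive manual-white-balance gains (x256 fixed point, so 256 = 1.0)
   from the average R/G/B of a patch that should be neutral grey.
   Green is kept as the reference channel. */
void mwb_gains(uint32_t r, uint32_t g, uint32_t b,
               uint16_t *gain_r, uint16_t *gain_g, uint16_t *gain_b)
{
    *gain_g = 256;                                  /* reference: 1.0 */
    *gain_r = (uint16_t)((g * 256 + r / 2) / r);    /* raise R to G   */
    *gain_b = (uint16_t)((g * 256 + b / 2) / b);    /* raise B to G   */
}
```

A warm (low color temperature) light source gives a high red average, hence a red gain below 1.0 and a blue gain above it, which pulls the picture back to neutral.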