Hardware Debugging of the Camera on the Qualcomm Android Platform

Source: Internet
Author: User

 

1. Introduction to camera imaging principles

Camera Workflow

The imaging principle of a camera can be summarized as follows:

The scene is projected through the lens onto the surface of the image sensor as an optical image, which is converted into an electrical signal. After A/D (analog-to-digital) conversion it becomes a digital image signal, which is sent to a digital signal processor (DSP) for processing, then transmitted through the I/O interface to the CPU, and finally shown on the display.

A charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor receives the image projected by the optical lens, converts it into a digital signal through an analog-to-digital (A/D) converter, and stores it after encoding.

The process is as follows:
1. The CCD/CMOS sensor converts the optical signal from the subject into an electrical signal, i.e. an electronic image (an analog signal).
2. An analog-to-digital converter (ADC) chip converts the analog signal into a digital signal.
3. The digital signal is then compressed by the DSP or an encoding library and converted into a specific image file format for storage.
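The A/D step in the pipeline above can be sketched in a few lines. This is a minimal, illustrative model of quantization, not any specific sensor's ADC; the function name and the rounding scheme are assumptions.

```c
/* Minimal sketch of the A/D step: quantize an analog voltage in the
 * range 0..vref to an n-bit digital code. The function name and the
 * rounding behavior are illustrative, not a specific sensor's ADC. */
static unsigned adc_sample(double volts, double vref, int bits)
{
    unsigned max_code = (1u << bits) - 1u;

    if (volts < 0.0)
        volts = 0.0;              /* clamp below the input range */
    if (volts > vref)
        volts = vref;             /* clamp above the input range */
    /* scale to the code range and round to the nearest code */
    return (unsigned)(volts / vref * max_code + 0.5);
}
```

With a 10-bit ADC and a 3.0 V reference, for example, a half-scale 1.5 V input quantizes to code 512 out of 1023; this loss of continuous information is exactly what "analog signal becomes digital signal" means in step 2.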

The optical lens of a digital camera is the same as that of a traditional camera: it focuses the image onto a photosensitive device, the charge-coupled device (CCD). The CCD takes the place of film in a traditional camera. Its function is to convert optical signals into electrical signals, just as in a TV camera.

The CCD is a semiconductor device and the core of a digital camera. The number of photosensitive units it contains determines the imaging quality: the more pixels, the better the image, so pixel count is usually quoted as a digital camera's grade and key technical indicator.

2. Android camera framework

The Android camera subsystem provides a framework for taking photos and recording videos.

It connects the upper-layer camera applications with the application framework and the user-space libraries; the user library in turn communicates with the camera hardware layer to operate the camera hardware.

3. Code structure of the Android camera

The Android camera code lives mainly in the following directories:
Java part of the camera
packages/apps/Camera/. Camera.java is the main implementation file; the build target is Camera.apk.
The com.android.camera package contains the following main class files:
Photo viewer: GalleryPicker.java (all image sets) ---> ImageGallery.java (the list in one folder) ---> ViewImage.java (view a specific image)

Video player: GalleryPicker.java (all video sets) ---> MovieView.java (watch a video)
Camera: Camera.java (camera preview and photo taking)
Video camera: VideoCamera.java (video recording)

The camera framework classes that upper-layer applications call:

frameworks/base/core/java/android/hardware/Camera.java

The build target is framework.jar.

JNI part of the camera
frameworks/base/core/jni/android_hardware_Camera.cpp
This part is compiled into libandroid_runtime.so.

Camera UI library
frameworks/base/libs/ui/camera
This part is compiled into the library libcamera_client.so.

Camera service
frameworks/base/camera/libcameraservice/
This part is compiled into the library libcameraservice.so.

Camera HAL layer
hardware/msm7k/libcamera
or
vendor/qcom/android-open/libcamera2
To implement the actual camera functions, a hardware-specific camera library is required at the HAL layer (for example, one that calls the Video4Linux driver and a JPEG encoding program, or that directly uses a private library implemented by the chip vendor, such as Qualcomm's libcamera.so and libqcamera.so). This library implements the interface specified by CameraHardwareInterface, calls the relevant libraries, drives the corresponding drivers, and operates the camera hardware. It is in turn called by the camera service library libcameraservice.so.
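The relationship above (a service calling vendor code only through a fixed interface) can be sketched as an operations table in C. All names here are illustrative stand-ins, not the real CameraHardwareInterface API or the Qualcomm library.

```c
#include <stddef.h>

/* Sketch of the idea behind CameraHardwareInterface: the camera service
 * talks to an abstract operations table, and each vendor library (for
 * example Qualcomm's libcamera.so) supplies the implementations. All
 * names here are illustrative, not the real Android API. */
struct camera_hw_ops {
    int (*start_preview)(void);
    int (*take_picture)(void);
};

/* A stub "vendor" implementation standing in for the HAL library. */
static int stub_started = 0;
static int stub_start_preview(void) { stub_started = 1; return 0; }
static int stub_take_picture(void)  { return stub_started ? 0 : -1; }

static const struct camera_hw_ops stub_camera = {
    stub_start_preview,
    stub_take_picture,
};

/* The service layer sees only the ops table, never the vendor code,
 * so a different sensor library can be swapped in without changes. */
static int service_snapshot(const struct camera_hw_ops *hw)
{
    if (hw->start_preview() != 0)
        return -1;
    return hw->take_picture();
}
```

This is why replacing hardware/msm7k/libcamera with vendor/qcom/android-open/libcamera2 does not require touching libcameraservice.so: only the table's implementation changes.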

Hardware debugging on Qualcomm Android platform-camera

We debugged two camera sensor models on Qualcomm's Android platform. One is an OV 5 MP YUV sensor that supports JPEG output and AF; its debugging was relatively simple because other projects already use it, so only the relevant drivers had to be ported. The other is Samsung's new 3 MP YUV fixed-focus sensor. This article takes debugging that sensor as an example to describe, from the perspective of the underlying driver, how to debug a camera on Qualcomm Android; the architecture and principles of the Qualcomm camera platform are not covered in detail.
I. Preparations
From the project perspective, the software must be ready before the hardware (board) is ready. From the perspective of the underlying driver, the software can be divided into two parts: one related to the Qualcomm platform and one related to the sensor. The common practice is to port the sensor-related settings into the Qualcomm platform framework. For this you need the sensor spec and the register setting file provided by the vendor. The spec explains the timing of communication between the Qualcomm platform and the sensor (register reads/writes) and the related parameter settings; the vendor's setting file contains the values that must be written to the sensor to use the various camera functions (preview, snapshot, ...).
In this project the Qualcomm platform is MSM7x27 and the camera is the Samsung 5CA. According to the spec, the sensor's I2C ID is 0x78 and I2C communication uses dual-byte addressing; the spec also clarifies the rules for reading and writing the sensor registers, which is basically enough from a debugging perspective. The setting file provided by the vendor is actually a register list that tells us when to write which value to which register, usually as register-address/register-value pairs. However, Samsung provides it as a text file for debugging on a PC, which must be converted into a two-dimensional array in C. From the file, the register data can be divided into several parts: initialization, IQ settings (tuning-related), clock settings, preview settings, and snapshot settings. These are basically enough; with the spec you can also adjust brightness, set special effects, and set the white balance.
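The text-to-C-array conversion mentioned above can be sketched as follows. The input format shown ("0xADDR 0xVALUE" per line) and the function name are assumptions; Samsung's actual PC-debugging format may differ.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical converter for the vendor's PC-format register list:
 * each line is assumed to look like "0x3000 0x0001" (address, value).
 * The real Samsung file format may differ; this only sketches the
 * text-to-C-array step described above. */
struct reg_pair {
    unsigned short addr;
    unsigned short value;
};

static int parse_reg_list(const char *text, struct reg_pair *out, int max)
{
    int n = 0;
    unsigned a, v;

    while (n < max && sscanf(text, "%x %x", &a, &v) == 2) {
        out[n].addr  = (unsigned short)a;
        out[n].value = (unsigned short)v;
        n++;
        text = strchr(text, '\n');   /* advance to the next line */
        if (text == NULL)
            break;
        text++;
    }
    return n;                        /* number of pairs parsed */
}
```

The resulting array of pairs is exactly the shape the driver wants for the initialization, preview, and snapshot register tables.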
After the sensor part is completed, the next step is to modify the driver of the Qualcomm Camera part, mainly including:
Kernel:
1. Check the sensor power supply configuration and modify the settings in the software. This project uses three power rails: 2.8 V / 1.8 V / 1.5 V.
2. Check and modify the sensor reset settings. Note that the reset timing must be consistent with the spec; otherwise the sensor may not work.
3. Modify the I2C driver to use the dual-byte read/write interface, and implement the interface for reading the sensor ID. This is used to check whether I2C communication is OK.
4. Import the register settings. Write the corresponding register values for initialization, preview, and snapshot.
Note: resets and register writes must follow the spec's rules about inserting delays; otherwise the sensor may behave abnormally.
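The dual-byte addressing in step 3 boils down to the byte layout of each I2C message. The sketch below shows only that layout, with an illustrative function name; a real kernel driver would hand the buffer to i2c_transfer() with the 0x78 slave address.

```c
/* Sketch of the dual-byte (16-bit register address) I2C write used by
 * this sensor: the register address goes out MSB first, followed by
 * the 16-bit data. Buffer layout only; a real driver passes this
 * buffer to the kernel's i2c_transfer() addressed to slave 0x78.
 * The function name is illustrative. */
static int pack_i2c_write(unsigned short reg, unsigned short val,
                          unsigned char *buf)
{
    buf[0] = (unsigned char)(reg >> 8);    /* register address MSB */
    buf[1] = (unsigned char)(reg & 0xff);  /* register address LSB */
    buf[2] = (unsigned char)(val >> 8);    /* data MSB */
    buf[3] = (unsigned char)(val & 0xff);  /* data LSB */
    return 4;                              /* bytes in the message body */
}
```

A single-byte-address sensor would emit only one address byte, which is exactly why reusing an old I2C routine unmodified fails on this part.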

User space:
This section mainly configures the VFE based on the hardware specifications: the sensor output data format, interface mode, resolution, synchronization signal mode, and so on. It is relatively simple, but check carefully; debugging fails if any part is incorrect.
So far, the preparation of the software has come to an end.

II. Preparing the debugging environment (the board is ready, but the sensor sample is not)
First, prepare the test point.
Before debugging, think about how you will debug the sensor if it does not work. This requires measuring signals such as power, reset, I2C, M/P CLK, H/V sync, and data, so make sure these signals have accessible test points.
Next, choose the software debugging environment. Here we run Qualcomm's mm-qcamera-test program in the ADB environment for debugging; the related traces can be printed out.
In this way, everything is ready, and only the sensor is missing.

 

III. Debugging (the sensor finally arrived)
Connect the sensor to the board, power it up, and run the debugging program in ADB. No preview appears: failure. A little disappointing; I had hoped to get it working in one go, but after all this is a brand-new sensor, and missing any detail anywhere leads to failure. Time to find the reason.
1. First, the trace shows that I2C has read the sensor ID 0x05CA, which indicates that I2C communication is normal.
2. Check the sensor power supply configuration and measure the three rails supplying the sensor. All OK.

3. Measure MCLK. This is provided to the sensor and is normal (24 MHz).
4. Measure PCLK. This is output by the sensor and is normal (58 MHz; Qualcomm's maximum is 96 MHz), consistent with the register configuration.
5. Measure the H/V sync signals, output by the sensor: normal, and consistent with the FPS and resolution.
6. Measure the data signals, output by the sensor: normal (the data activity can be seen on the oscilloscope).
So the sensor is working normally; then why is there no preview? Check the Qualcomm settings.
From the trace, Qualcomm's VFE has been reset and started. But strangely, although the sensor is already outputting preview data, the VFE does not emit any data after receiving it. Can the VFE not recognize this sensor's output? To verify this, I measured the output waveforms of the OV sensor on another board, mainly M/P CLK and the H/V sync signals, and compared them. No anomaly was found except that the H/V sync signals differed: mainly, the duty cycles were different. Could this be the problem? For further verification I measured the H/V signals together with the data signal, and found that the OV sensor outputs its data during the low level of the V (frame) sync signal, while the Samsung 5CA outputs its data during the high level. Could the VFE be failing to latch the sensor data because the V signal polarity is set incorrectly? Checking the Qualcomm VFE settings again, there is indeed a parameter for the polarity of the V signal; its default is active-low, and I had not changed it. I changed it to active-high, rebuilt, downloaded, started the system, and ran the test: OK, the preview displays normally. So far, the hardware debugging of the sensor can be considered complete; other functions can be refined later.
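The polarity mismatch described here can be modeled in a few lines. This is a toy simulation of the latching behavior, not the Qualcomm VFE API; the function and signal arrays are illustrative.

```c
/* Illustrative model of the polarity bug found above: the VFE latches
 * pixel data only while VSYNC sits at its configured active level. If
 * the sensor drives data during VSYNC-high but the VFE is configured
 * active-low, every sample is dropped and no preview frame appears.
 * This is a toy simulation, not the Qualcomm VFE API. */
static int count_latched(const int *vsync, const int *data, int n,
                         int active_high)
{
    int latched = 0;
    int i;

    for (i = 0; i < n; i++) {
        /* latch only when VSYNC matches the configured polarity */
        if (vsync[i] == (active_high ? 1 : 0) && data[i])
            latched++;
    }
    return latched;
}
```

With a Samsung-5CA-style trace (valid data inside VSYNC-high), the active-low configuration latches zero samples, which is exactly the "VFE started but no data" symptom seen in the trace.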

Camera debugging on the FSL platform

The FSL camera HAL layer does not implement the interfaces for passing parameter settings down from the upper layer, so you need to implement them yourself. Fortunately, the path from the application down to the HAL layer is already complete; otherwise the workload would be larger.
The parameter-setting function called at the HAL layer is status_t CameraHal::setParameters(const CameraParameters &params). Each parameter is set in this function, mainly through the CameraParameters class. Looking at this class, it has a get() function through which each parameter can be obtained individually. For example,
const char *white_balance = params.get(CameraParameters::KEY_WHITE_BALANCE);
obtains the current white balance parameter as the return value. Then branch on the returned value, for example:
if (strcmp(white_balance, CameraParameters::WHITE_BALANCE_AUTO) == 0) { // automatic white balance
    LOGV("white_balance to IOCTL is auto!\n");
    ctl.id = V4L2_CID_AUTO_WHITE_BALANCE; // auto white balance command; ctl is a struct v4l2_control, which is very useful
    ctl.value = 1;
    if (ioctl(camera_device, VIDIOC_S_CTRL, &ctl) < 0) { // pass the ctl structure down through VIDIOC_S_CTRL
        LOGE("set control failed\n");
        // return -1;
    }
} else if (strcmp(white_balance, CameraParameters::WHITE_BALANCE_INCANDESCENT) == 0) { // incandescent mode
    LOGV("white_balance to IOCTL is incandescent!\n");
    ctl.id = V4L2_CID_DO_WHITE_BALANCE; // this command is used for the other white balance modes
    ctl.value = 2; // the value is numbered according to the white balance modes we define
    if (ioctl(camera_device, VIDIOC_S_CTRL, &ctl) < 0) { // pass the ctl struct via VIDIOC_S_CTRL, then branch on the value
        LOGE("set control failed\n");
        // return -1;
    }
}

In the driver file mxc_v4l2_capture.c, mxc_v4l_ioctl calls mxc_v4l_do_ioctl, and mxc_v4l_do_ioctl dispatches the command as follows:
/*!
 * V4L2 VIDIOC_S_CTRL ioctl
 */
case VIDIOC_S_CTRL: {
    pr_debug("case VIDIOC_S_CTRL\n");
    retval = mxc_v4l2_s_ctrl(cam, arg);
    break;
}
This reaches mxc_v4l2_s_ctrl. In mxc_v4l2_s_ctrl, branch on the control ID:
switch (c->id) {
......
case V4L2_CID_AUTO_WHITE_BALANCE:
    ipu_csi_enable_mclk_if(CSI_MCLK_I2C, cam->csi, true, true);
    ret = vidioc_int_s_ctrl(cam->sensor, c); // this corresponds to s_ctrl in the V4L2 ov7670 driver
    ipu_csi_enable_mclk_if(CSI_MCLK_I2C, cam->csi, false, false);
    break;
case V4L2_CID_DO_WHITE_BALANCE:
    ipu_csi_enable_mclk_if(CSI_MCLK_I2C, cam->csi, true, true);
    ret = vidioc_int_s_ctrl(cam->sensor, c);
    ipu_csi_enable_mclk_if(CSI_MCLK_I2C, cam->csi, false, false);
    break;
......
vidioc_int_s_ctrl() corresponds to ioctl_s_ctrl in the V4L2 ov7670 driver; the full code is omitted for space.
There, branch on the ID of the control struct:
switch (vc->id) {
.....
case V4L2_CID_AUTO_WHITE_BALANCE:
    retval = ov7670_autowhitebalance(vc->value);
    break;
case V4L2_CID_DO_WHITE_BALANCE:
    retval = ov7670_dowhitebalance(vc->value);
    break;
......
The following is the implementation of the white balance functions.
static int ov7670_autowhitebalance(int value)
{
    unsigned char v = 0;
    int ret;

    printk("ov7670_autowhitebalance called\n");
    ret = ov7670_read(ov7670_data.i2c_client, REG_COM8, &v);
    if (value)
        v |= COM8_AWB; // enable automatic white balance

    msleep(10); /* FIXME */
    ret += ov7670_write(ov7670_data.i2c_client, 0x01, 0x56);
    ret += ov7670_write(ov7670_data.i2c_client, 0x02, 0x44);
    ret += ov7670_write(ov7670_data.i2c_client, REG_COM8, v);

    return ret;
}

static int ov7670_dowhitebalance(int value)
{
    unsigned char v = 0;
    int ret;

    printk("ov7670_dowhitebalance called value: %d\n", value);
    ret = ov7670_read(ov7670_data.i2c_client, REG_COM8, &v);
    if (value)
        v &= ~COM8_AWB; // disable automatic white balance

    msleep(10); /* FIXME */
    ret += ov7670_write(ov7670_data.i2c_client, REG_COM8, v);
    if (value == 2) { // incandescent; the value comes from ctl.value
        ret += ov7670_write(ov7670_data.i2c_client, 0x01, 0x8c);
        ret += ov7670_write(ov7670_data.i2c_client, 0x02, 0x59);
    } else if (value == 3) { // fluorescent
        ret += ov7670_write(ov7670_data.i2c_client, 0x01, 0x7e);
        ret += ov7670_write(ov7670_data.i2c_client, 0x02, 0x49);
    } else if (value == 4) { // daylight
        ret += ov7670_write(ov7670_data.i2c_client, 0x01, 0x52);
        ret += ov7670_write(ov7670_data.i2c_client, 0x02, 0x66);
    }

    return ret;
}
Registers 0x01 and 0x02 in these functions are the blue and red channel gain registers, respectively.

The above is the parameter-setting flow from the HAL layer down to the sensor. Other settings, such as color effects and scene modes, follow the same flow:
set the specific registers for each case (for example, night mode).
Color effects are mainly achieved by setting the UV values.
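What "setting the UV value" means can be shown with a toy example in C. This is illustrative only: a real sensor applies the effect in hardware via registers, and the struct layout and example U/V values here are assumptions.

```c
/* Illustrative only: a color effect implemented by forcing fixed U/V
 * (chroma) values on YUV pixels, which is what "setting the UV value"
 * amounts to. A real sensor does this in hardware via registers; the
 * struct and the example U/V values below are assumptions. */
struct yuv {
    unsigned char y, u, v;
};

static void apply_fixed_uv(struct yuv *px, int n,
                           unsigned char u, unsigned char v)
{
    int i;

    for (i = 0; i < n; i++) {
        px[i].u = u;   /* chroma channels are overwritten ... */
        px[i].v = v;
        /* ... luma (Y) is left untouched, so image detail survives */
    }
}
```

Forcing U = V = 128 gives grayscale, for instance, while a fixed offset toward warm chroma gives a sepia-like effect; the night-mode and scene-mode registers work on the same principle of overriding the normal pipeline output.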

This article from the Linux community website (www.linuxidc.com) original link: http://www.linuxidc.com/Linux/2011-09/42372p2.htm

 
