The Android camera code is distributed as follows:
1. The architecture of the upper-layer camera app is not analyzed here; it should already be familiar to everyone.
2. The client side of the C/S structure commonly used by Android multimedia:
frameworks/base/core/java/android/hardware/Camera.java: the android.hardware.Camera class.
frameworks/base/core/jni/android_hardware_Camera.cpp: built into libandroid_runtime.so.
frameworks/base/libs/camera: built into libcamera_client.so; implements the native Camera class, which inherits BnCameraClient and IBinder::DeathRecipient.
The server side:
frameworks/base/camera/libcameraservice: built into libcameraservice.so.
Implements the CameraService class, which inherits BnCameraService (ICameraService) and provides instantiate(). It contains an inner class Client : public BnCamera; this is the implementation of the ICamera interface, and openCameraHardware() is called from this class.
3. The Tegra2-specific part:
hardware/tegra/hal/libnvomxcamera: built into libcamera.so; implements the CameraHardwareInterface, and openCameraHardware() is implemented in this library.
hardware/tegra/core/drivers/openmax/ilclient: built into libnvomxilclient.so.
This library dynamically loads libnvomx.so and fills a struct NvxFramework with the nine OMX core entry points. This approach bypasses PVMF; PVMF itself is really just another OMX client.
hardware/tegra/core/drivers/openmax/il: built into libnvomx.so, the OMX core library.
libnvodm_imager.so: the ODM imager HAL library. By default NVIDIA provides only the binary; during a full build the library is copied into the system directory and packed into system.img.
libnvodm_query.so: the ODM query library. GPIO, power-supply, I2C, and other related hardware configuration is done in this library.
For a phone that supports two or more cameras, the upper layer tells the lower layer which camera to use; then, each time OMX rebuilds the OMX graph, a different piece of camera hardware is selected in the final enable-port step. The upper-layer handling is essentially the same for every camera.
Adding a camera on the Android Tegra2 platform
The Tegra chip is not yet well integrated with the Android system; after all, NVIDIA has not been supporting Android for very long. I have heard that better integration is in progress and should land with Android 3.0, but that is just speculation.
Because of this, the camera driver is not placed under the kernel. It currently lives under hardware/tegra/odm/<product>. To add a camera and its driver, follow these steps:
1) odm_kit/query/include/nvodm_query_discovery_imager.h
Define a GUID for the sensor, for example:
#define QQ1234_GUID NV_ODM_GUID('s','_','Q','Q','1','2','3','4')
2) odm_kit/query/subboards/nvodm_query_discovery_e***_addresses.h
Configure the camera's hardware connection parameters:
#define QQ1234_PINS (NVODM_CAMERA_DEVICE_IS_DEFAULT)
static const NvOdmIoAddress s_ffaImagerQQ1234Addresses[] =
{
    /* I2C configuration */
    /* reset GPIO configuration */
    /* powerdown GPIO configuration */
    /* camera VDD configuration */
    /* VCSI configuration */
    /* video input configuration */
    /* external clock (CSUS) configuration */
};
3) odm_kit/query/subboards/nvodm_query_discovery_e***_peripherals.h
Associate the camera device's entry addresses with its GUID:
// QQ1234
{
    QQ1234_GUID,
    s_ffaImagerQQ1234Addresses,
    NV_ARRAY_SIZE(s_ffaImagerQQ1234Addresses),
    NvOdmPeripheralClass_Imager
},
4) odm_kit/adaptations/imager/Android.mk
LOCAL_SRC_FILES += sensor_yuv_qq1234.c
5) odm_kit/adaptations/imager/imager_hal.c
Add the camera to the device types enumerated by the HAL layer:
#include "sensor_yuv_qq1234.h"
DeviceHalTable g_SensorHalTable[] = {
    ....
    { QQ1234_GUID, SensorYuvQQ1234_GetHal },
    ....
};
6) odm_kit/adaptations/imager/sensor_yuv_qq1234.c
odm_kit/adaptations/imager/sensor_yuv_qq1234.h
NvBool SensorYuvQQ1234_GetHal(NvOdmImagerHandle hImager);
These files hold the concrete implementation of the camera device's configuration and functions. Hardware tuning is mostly done by modifying sensor_yuv_qq1234.c.
NvBool SensorYuvQQ1234_GetHal(NvOdmImagerHandle hImager)
{
    if (!hImager || !hImager->pSensor)
        return NV_FALSE;

    hImager->pSensor->pfnOpen = SensorYuv_Open;
    hImager->pSensor->pfnClose = SensorYuv_Close;
    hImager->pSensor->pfnGetCapabilities = SensorYuv_GetCapabilities;
    hImager->pSensor->pfnListModes = SensorYuv_ListModes;
    hImager->pSensor->pfnSetMode = SensorYuv_SetMode;
    hImager->pSensor->pfnSetPowerLevel = SensorYuv_SetPowerLevel;
    hImager->pSensor->pfnGetPowerLevel = SensorYuv_GetPowerLevel;
    hImager->pSensor->pfnSetParameter = SensorYuv_SetParameter;
    hImager->pSensor->pfnGetParameter = SensorYuv_GetParameter;

    return NV_TRUE;
}
From: http://qiuzhenqing.blog.edu.cn/2010/581021.html
http://qiuzhenqing.blog.edu.cn/2010/591863.html