Tags: android camera
Hardware platform: Atmel SAMA5D3 SoC + OV2640 Camera Sensor
Android version: 4.2.2
The mediaserver process hosts the Camera Service and dynamically loads the Camera HAL and the Gralloc HAL. A video frame first travels from the camera driver up into the Camera HAL, where it is copied from the video capture buffer into a gralloc buffer.
The surfaceflinger process, acting as the display server, dynamically loads the HWComposer HAL and the Gralloc HAL. In the HWComposer HAL, the frame is copied from the gralloc buffer into a video output buffer.
After these steps, the image captured by the camera sensor finally appears on the panel through the LCDC HEO overlay.
In the figure, the solid red lines show the flow of video frames; the two ends of a red line without an arrow refer to the same block of memory.
Three blocks of memory are involved:
video capture buffer: /dev/video1
gralloc buffer: anonymous shared memory that both the mediaserver and surfaceflinger processes can access
video output buffer: /dev/video0
Two data copies take place:
mediaserver process, Camera HAL: video capture buffer -> gralloc buffer
surfaceflinger process, HWComposer HAL: gralloc buffer -> video output buffer
Allocation and memory mapping of the video capture buffer
https://github.com/Android4SAM/platform_hardware_atmel/blob/android4sam_v4.0/camera/CameraHardwareSam.cpp
https://github.com/Android4SAM/platform_hardware_atmel/blob/android4sam_v4.0/camera/V4L2Camera.cpp
Requesting the video capture buffers
CameraHardwareSam::startPreviewInternal -> V4L2Camera::startPreview -> isi_v4l2_reqbufs
ret = ioctl(fp, VIDIOC_REQBUFS, &req);
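For reference, this is a minimal sketch of what such a VIDIOC_REQBUFS request looks like on the capture side; the buffer count, buffer type, and error handling here are illustrative assumptions, not code copied from V4L2Camera.cpp:

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Ask the V4L2 capture driver (/dev/video1) for 'count' mmap-able buffers.
 * The driver writes back into req.count how many it actually allocated. */
static int request_capture_buffers(int fd, unsigned int count)
{
    struct v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count  = count;                       /* e.g. kBufferCount */
    req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE; /* capture queue */
    req.memory = V4L2_MEMORY_MMAP;            /* buffers live in the driver, mapped by userspace */
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0)
        return -1;
    return req.count;                         /* number of buffers actually granted */
}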
Memory mapping
CameraHardwareSam::startPreviewInternal
mPreviewHeap = mGetMemoryCb((int)mV4L2Camera->getCameraFd(), aligned_buffer_size, kBufferCount, 0);
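mGetMemoryCb is the camera_request_memory callback handed to the HAL by the camera service. Because the V4L2 camera fd and the aligned per-buffer size are passed in, the returned camera_memory_t (mPreviewHeap) is, as far as I understand the framework side, a heap mapped over that fd rather than freshly allocated ashmem, so mPreviewHeap->data becomes the userspace view of the capture buffers. The callback's signature in hardware/libhardware/include/hardware/camera.h for this Android generation is approximately:

/* Approximate declaration, shown for reference only. */
typedef camera_memory_t* (*camera_request_memory)(
        int fd,                /* fd to mmap; -1 means allocate anonymous shared memory instead */
        size_t buf_size,       /* size of one buffer, e.g. aligned_buffer_size */
        unsigned int num_bufs, /* number of buffers, e.g. kBufferCount */
        void *user);           /* opaque cookie, 0 in the call above */

The returned camera_memory_t exposes the mapping through its data and size fields, which is why previewThread below can address frames as ((char *)mPreviewHeap->data) + offset.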
Obtaining a gralloc buffer and copying the video frame from the video capture buffer into the gralloc buffer
https://github.com/Android4SAM/platform_hardware_atmel/blob/android4sam_v4.0/camera/CameraHardwareSam.cpp
CameraHardwareSam::previewThread
if (mPreviewWindow && mGrallocHal) {
    buffer_handle_t *buf_handle;
    int stride;
    /* Take a free gralloc buffer from the preview window. */
    if (0 != mPreviewWindow->dequeue_buffer(mPreviewWindow, &buf_handle, &stride)) {
        ALOGE("Could not dequeue gralloc buffer!\n");
        goto callbacks;
    }

    void *vaddr;
    /* Lock the buffer for CPU writes to obtain its virtual address. */
    if (!mGrallocHal->lock(mGrallocHal, *buf_handle,
                           GRALLOC_USAGE_SW_WRITE_OFTEN,
                           0, 0, width, height, &vaddr)) {
        char *frame = ((char *)mPreviewHeap->data) + offset;

        // the code below assumes YUV, not RGB
        {
            int h;
            char *src = frame;
            char *ptr = (char *)vaddr;
            /* Copy the whole frame from the capture heap into the gralloc buffer. */
            memcpy(ptr, src, frame_size);
            //YUY2toYV12(frame, vaddr, width, height);
        }
        mGrallocHal->unlock(mGrallocHal, *buf_handle);
    } else {
        ALOGE("%s: could not obtain gralloc buffer", __func__);
    }

    /* Return the filled buffer to the preview window's queue for SurfaceFlinger to consume. */
    if (0 != mPreviewWindow->enqueue_buffer(mPreviewWindow, buf_handle)) {
        ALOGE("Could not enqueue gralloc buffer!\n");
        goto callbacks;
    }
}
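One detail worth noting: the memcpy above copies frame_size bytes in one pass, which implicitly assumes the gralloc buffer's row stride equals the frame width; the stride returned by dequeue_buffer is ignored. If the allocator rounded the stride up, a row-by-row copy would be needed instead. The helper below is a hypothetical sketch of such a copy; the function name and the single-plane YUYV (2 bytes per pixel) assumption are mine, not taken from CameraHardwareSam.cpp:

#include <stdint.h>
#include <string.h>

/* Hypothetical stride-aware copy for a packed single-plane format such as YUYV.
 * src is tightly packed (width * 2 bytes per row); dst uses the stride (in pixels)
 * reported by dequeue_buffer. */
static void copy_preview_frame(uint8_t *dst, const uint8_t *src,
                               int width, int height, int dst_stride_px)
{
    const int src_row = width * 2;          /* bytes per source row */
    const int dst_row = dst_stride_px * 2;  /* bytes per destination row */
    for (int y = 0; y < height; y++) {
        memcpy(dst, src, src_row);          /* copy one row of pixels */
        src += src_row;
        dst += dst_row;                     /* skip the allocator's row padding */
    }
}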
Allocation and memory mapping of the video output buffer
https://github.com/Android4SAM/platform_hardware_atmel/blob/android4sam_v4.0/hwcomposer/hwcomposer.cpp
https://github.com/Android4SAM/platform_hardware_atmel/blob/android4sam_v4.0/hwcomposer/v4l2_utils.cpp
Requesting the video output buffers
hwc_prepare -> assign_heo_overlay_window -> v4l2_overlay_req_buf
ret = ioctl(win->fd, VIDIOC_REQBUFS, &reqbuf);
Memory mapping
hwc_prepare -> assign_heo_overlay_window -> v4l2_overlay_map_buf
*len = buf.length;
*start = mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED, fd, buf.m.offset);
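buf.length and buf.m.offset are filled in by the driver when the buffer is queried first. The sketch below shows the standard query-then-map sequence for one output buffer; the structure and error handling are illustrative and not copied from v4l2_utils.cpp:

#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev2.h>

/* Query buffer 'index' on the HEO output device (/dev/video0) and map it into
 * this process. Returns the mapped address, or MAP_FAILED on error. */
static void *map_output_buffer(int fd, unsigned int index, size_t *len)
{
    struct v4l2_buffer buf;
    memset(&buf, 0, sizeof(buf));
    buf.type   = V4L2_BUF_TYPE_VIDEO_OUTPUT; /* output (display) queue */
    buf.memory = V4L2_MEMORY_MMAP;
    buf.index  = index;
    if (ioctl(fd, VIDIOC_QUERYBUF, &buf) < 0) /* driver fills buf.length and buf.m.offset */
        return MAP_FAILED;

    *len = buf.length;
    return mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED, fd, buf.m.offset);
}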
Copying the video frame from the gralloc buffer into the video output buffer
hwc_set -> copy_heo_src_content
for (unsigned int i = 0; i < cur_layer->visibleRegionScreen.numRects; i++) {
    uint8_t *cur_dst_addr = dst_addr;
    uint8_t *cur_src_addr = src_addr;
    for (int j = 0; j < h; j++) {
        memcpy(cur_dst_addr, cur_src_addr, cpy_size);
        cur_dst_addr = &cur_dst_addr[cpy_size];
        cur_src_addr = &cur_src_addr[(cur_layer->displayFrame.right - cur_layer->displayFrame.left) *
                                     (prev_handle->uiBpp / 8)];
    }
    cur_rect++;
}
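The copy alone does not put the frame on screen: with a V4L2 output device, the filled buffer still has to be queued back to the driver and streaming enabled before the LCDC HEO overlay scans it out. The sketch below shows that standard VIDIOC_QBUF / VIDIOC_STREAMON sequence; the function name and error handling are illustrative assumptions, not code taken from v4l2_utils.cpp:

#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* Queue the filled buffer 'index' on the HEO output device and make sure
 * streaming is on, so the LCDC reads the frame at the next scan-out. */
static int display_output_buffer(int fd, unsigned int index, int *streaming)
{
    struct v4l2_buffer buf;
    memset(&buf, 0, sizeof(buf));
    buf.type   = V4L2_BUF_TYPE_VIDEO_OUTPUT;
    buf.memory = V4L2_MEMORY_MMAP;
    buf.index  = index;
    if (ioctl(fd, VIDIOC_QBUF, &buf) < 0)           /* hand the frame to the driver */
        return -1;

    if (!*streaming) {
        int type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
        if (ioctl(fd, VIDIOC_STREAMON, &type) < 0)  /* start scanning out queued buffers */
            return -1;
        *streaming = 1;
    }
    return 0;
}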