After studying Android's surface system for a while, I had stubbornly assumed that anything shown on a surface or on the screen must first be converted to RGB, and that YUV data therefore always needs a color-space conversion to RGB before it can be displayed. But while recently digging into how stagefright displays video, I simply could not find the code that converts the OMX decoder's YUV output to RGB: after render() is called, the YUV data seems to go nowhere, yet the picture really does appear on screen. That overturned my "truth" that YUV must be converted to RGB before it can be displayed.
A quick look at the AwesomePlayer code shows that each video frame is rendered through SoftwareRenderer. I tried using SoftwareRenderer to render raw YUV data directly, and it worked. This is a big breakthrough: for example, YUV frames captured by a network camera could be pushed straight to a surface for display, with no need for the slow and inefficient YUV-to-RGB conversion (a looped-rendering sketch for that case follows the full program below).
The code is my own; I am posting it here to share. It targets the Android 4.4 platform. (The YUV test data can be downloaded from the link in the post — click to open the link — and should be placed in the /mnt/sdcard directory.)
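In case the download link is not reachable, here is a minimal, hypothetical sketch for producing a compatible test file. It assumes the YV12 layout that main.cpp below expects (a full-resolution Y plane followed by quarter-resolution V and then U planes) and writes a 320x240 frame to /mnt/sdcard/yuv_320_240.yuv; the gradient test pattern itself is made up purely for illustration.

// Hypothetical helper: write one 320x240 YV12 test frame so the demo below has input.
// YV12 layout assumed: w*h bytes of Y, then (w/2)*(h/2) bytes of V, then (w/2)*(h/2) bytes of U.
#include <stdio.h>

int main(void) {
    const int w = 320, h = 240;
    const char *path = "/mnt/sdcard/yuv_320_240.yuv";  // or write it locally and adb push it
    FILE *fp = fopen(path, "wb");
    if (fp == NULL) {
        printf("cannot open %s\n", path);
        return -1;
    }
    // Y plane: a horizontal luma gradient
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            fputc((x * 255) / w, fp);
    // V plane, then U plane: neutral chroma (128), i.e. a grayscale-looking picture
    for (int i = 0; i < (w / 2) * (h / 2); ++i)
        fputc(128, fp);
    for (int i = 0; i < (w / 2) * (h / 2); ++i)
        fputc(128, fp);
    fclose(fp);
    return 0;
}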
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#include <binder/ProcessState.h>
#include <binder/IPCThreadState.h>
#include <binder/IServiceManager.h>
#include <utils/String8.h>
#include <ui/DisplayInfo.h>
#include <gui/ISurfaceComposer.h>
#include <gui/Surface.h>
#include <gui/SurfaceComposerClient.h>
#include <system/graphics.h>
#include <media/stagefright/MetaData.h>
#include "include/SoftwareRenderer.h"  // resolved via LOCAL_C_INCLUDES := frameworks/av/media/libstagefright

using namespace android;

bool getYV12Data(const char *path, unsigned char *pYUVData, int size) {
    FILE *fp = fopen(path, "rb");
    if (fp == NULL) {
        printf("read %s fail !!!!!!!!!!!!!!!!!!!\n", path);
        return false;
    }
    fread(pYUVData, size, 1, fp);
    fclose(fp);
    return true;
}

int main(void) {
    // set up the thread-pool
    sp<ProcessState> proc(ProcessState::self());
    ProcessState::self()->startThreadPool();

    // create a client to surfaceflinger
    sp<SurfaceComposerClient> client = new SurfaceComposerClient();
    sp<IBinder> dtoken(SurfaceComposerClient::getBuiltInDisplay(
            ISurfaceComposer::eDisplayIdMain));
    DisplayInfo dinfo;
    // query the display for its width, height, dpi, etc.
    status_t status = SurfaceComposerClient::getDisplayInfo(dtoken, &dinfo);
    printf("w=%d,h=%d,xdpi=%f,ydpi=%f,fps=%f,ds=%f\n",
            dinfo.w, dinfo.h, dinfo.xdpi, dinfo.ydpi, dinfo.fps, dinfo.density);
    if (status)
        return -1;

    // create the surface; this may fail on some systems, and dinfo.w/dinfo.h
    // can also be replaced with fixed values
    sp<SurfaceControl> surfaceControl = client->createSurface(String8("showYUV"),
            dinfo.w, dinfo.h, PIXEL_FORMAT_RGBA_8888, 0);

    /************************* get yuv data from file **************************/
    printf("[%s][%d]\n", __FILE__, __LINE__);
    int width, height;
    width = 320;
    height = 240;
    int size = width * height * 3 / 2;
    unsigned char *data = new unsigned char[size];
    const char *path = "/mnt/sdcard/yuv_320_240.yuv";
    getYV12Data(path, data, size);  // get yuv data from file

    /************************* configure the surface ***************************/
    SurfaceComposerClient::openGlobalTransaction();
    surfaceControl->setLayer(100000);        // set the Z order
    surfaceControl->setPosition(100, 100);   // position, with (0,0) at the top-left corner
    surfaceControl->setSize(width, height);  // size of the video display area
    SurfaceComposerClient::closeGlobalTransaction();

    sp<Surface> surface = surfaceControl->getSurface();
    printf("[%s][%d]\n", __FILE__, __LINE__);

    /****************************************************************************/
    sp<MetaData> meta = new MetaData;
    meta->setInt32(kKeyWidth, width);
    meta->setInt32(kKeyHeight, height);
    /* Specify the yuv format. The following formats are supported:
     *   OMX_COLOR_FormatYUV420Planar
     *   OMX_TI_COLOR_FormatYUV420PackedSemiPlanar
     *   HAL_PIXEL_FORMAT_YV12
     * Anything else apparently gets converted to OMX_COLOR_Format16bitRGB565.
     */
    meta->setInt32(kKeyColorFormat, HAL_PIXEL_FORMAT_YV12);
    // setRect can be left out; I am not sure what it is for or how it works,
    // but if you do set it, the parameters must be correct
    meta->setRect(
            kKeyCropRect,
            0,            // left
            0,            // top
            width - 1,    // right
            height - 1);  // bottom
    printf("[%s][%d]\n", __FILE__, __LINE__);

    SoftwareRenderer *sr = new SoftwareRenderer(surface, meta);  // initialize the renderer
    printf("[%s][%d]\n", __FILE__, __LINE__);
    sr->render(data, size, NULL);  // the key call: this puts the frame on screen
    delete[] data;
    printf("[%s][%d]\n", __FILE__, __LINE__);

    IPCThreadState::self()->joinThreadPool();  // keeps the picture on screen; otherwise it vanishes instantly
    IPCThreadState::self()->stopProcess();
    return 0;
}
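To make the camera idea mentioned earlier concrete: once the surface, MetaData and SoftwareRenderer are set up as in the program above, continuous display is just one render() call per frame. The sketch below is only an assumption of what such a loop could look like when reading back-to-back YV12 frames from a dump file; renderYuvStream, frameCount and the 25 fps pacing are invented for illustration, while SoftwareRenderer::render is used exactly as in the single-frame demo.

// Hypothetical follow-on: render a sequence of YV12 frames (e.g. a camera dump)
// through the SoftwareRenderer created in main() above, at roughly 25 fps.
// Assumes the same includes as main.cpp plus <unistd.h> for usleep().
void renderYuvStream(SoftwareRenderer *sr, const char *path, int size, int frameCount) {
    FILE *fp = fopen(path, "rb");
    if (fp == NULL) {
        printf("read %s fail\n", path);
        return;
    }
    unsigned char *frame = new unsigned char[size];
    for (int i = 0; i < frameCount; ++i) {
        if (fread(frame, 1, size, fp) != (size_t)size)
            break;                      // stop on end of file / short read
        sr->render(frame, size, NULL);  // same call as the single-frame demo
        usleep(40 * 1000);              // ~25 fps pacing
    }
    delete[] frame;
    fclose(fp);
}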
LOCAL_PATH:= $(call my-dir)

include $(CLEAR_VARS)

LOCAL_SRC_FILES:= \
    main.cpp

LOCAL_STATIC_LIBRARIES := \
    libstagefright_color_conversion

LOCAL_SHARED_LIBRARIES := \
    libcutils \
    libutils \
    libbinder \
    libui \
    libgui \
    libstagefright \
    libstagefright_foundation

LOCAL_C_INCLUDES := \
    frameworks/native/include/media/openmax \
    frameworks/av/media/libstagefright

LOCAL_MODULE:= showYUV

LOCAL_MODULE_TAGS := tests

include $(BUILD_EXECUTABLE)
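A brief usage note, as an assumption about the typical workflow rather than the exact steps used here: place main.cpp and this Android.mk in a directory inside an Android 4.4 source tree, build the showYUV module with mm, push the resulting binary to the device with adb, and run it from a shell with sufficient privileges (a root shell is simplest) so it can create a surface through SurfaceFlinger. The details will vary with your tree layout and device.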