[Original] Making Android Support RTSP (a live555 Analysis)

How to make Android support the C++ exception mechanism

Android's NDK does not support C++ exceptions out of the box; if you need them, you must link a more complete C++ library at build time.
A suitable C++ library for Android can be found in the CrystaX NDK (after unpacking, copy libsupc++.a into your build tree):
http://www.crystax.net/en/android/ndk/7
Add the following flags when compiling:
-fexceptions -lstdc++
and also link against libsupc++.a.
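For ndk-build users, the settings above might be wired into Android.mk roughly as follows (a sketch only — the module name, source file, and the location of libsupc++.a are placeholder assumptions):

```makefile
# Hypothetical Android.mk fragment; adjust module/source names and paths.
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE    := live555_rtsp
LOCAL_SRC_FILES := rtsp_jni.cpp
LOCAL_CPPFLAGS  += -fexceptions
LOCAL_LDLIBS    += -lstdc++ $(LOCAL_PATH)/libsupc++.a
include $(BUILD_SHARED_LIBRARY)
```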

 

An example of porting live555 to Android

https://github.com/boltonli/ohbee/tree/master/android/streamer/jni

 

The RTSP protocol

References: RFC 2326, RFC 3550, RFC 3984

RTP header structure [#0]
 

 0                   1                   2                   3
 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
|V=2|P|X|  CC   |M|     PT      |       sequence number         |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
|                           timestamp                           |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
|           synchronization source (SSRC) identifier            |
+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+
|            contributing source (CSRC) identifiers             |
|                             ....                              |
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
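To make the field layout concrete, here is a small stand-alone parser for the fixed 12-byte header (illustrative only, not live555 code; the struct and function names are my own):

```cpp
#include <cstdint>
#include <cstddef>

// Fields of the fixed RTP header, per RFC 3550.
struct RtpHeader {
    uint8_t  version;        // V: should be 2
    bool     padding;        // P
    bool     extension;      // X
    uint8_t  csrcCount;      // CC
    bool     marker;         // M
    uint8_t  payloadType;    // PT, e.g. 96 for dynamically-assigned H.264
    uint16_t sequenceNumber;
    uint32_t timestamp;
    uint32_t ssrc;
};

// Returns false if the buffer is shorter than the fixed 12-byte header.
bool parseRtpHeader(const uint8_t* p, size_t len, RtpHeader* h) {
    if (len < 12) return false;
    h->version        = p[0] >> 6;
    h->padding        = (p[0] >> 5) & 1;
    h->extension      = (p[0] >> 4) & 1;
    h->csrcCount      = p[0] & 0x0F;
    h->marker         = (p[1] >> 7) & 1;
    h->payloadType    = p[1] & 0x7F;
    h->sequenceNumber = (uint16_t)((p[2] << 8) | p[3]);
    h->timestamp      = ((uint32_t)p[4] << 24) | ((uint32_t)p[5] << 16)
                      | ((uint32_t)p[6] << 8)  |  (uint32_t)p[7];
    h->ssrc           = ((uint32_t)p[8] << 24) | ((uint32_t)p[9] << 16)
                      | ((uint32_t)p[10] << 8) |  (uint32_t)p[11];
    return true;
}
```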

 

The H.264 video format

References: RFC 3984, and the articles "NAL Technology in H.264" and "Parsing the H.264 NAL Layer"

 

The AAC audio format

Reference: ISO/IEC 13818-7 (ISO_IEC_13818-7.pdf)

 

live555 architecture analysis

0  Overview

0.1  The analysis below uses H.264 video plus AAC audio as its running example.

0.2  Among the demos shipped with live555, live555MediaServer is the RTSP server, and openRTSP serves as a debugging client.

0.3  A trace_bin() helper can be added to live555 to trace the stream data as it is processed:

    void trace_bin(const unsigned char *bytes_ptr, int bytes_num)
    {
        #define LOG_LINE_BYTES 16
        int i, j;
        for (i = 0; i <= bytes_num / LOG_LINE_BYTES; i++) {
            for (j = 0;
                 j < ( (i < (bytes_num / LOG_LINE_BYTES))
                       ? LOG_LINE_BYTES
                       : (bytes_num % LOG_LINE_BYTES) );
                 j++)
            {
                if (0 == j) printf("%04d   ", i * LOG_LINE_BYTES);
                if (LOG_LINE_BYTES/2 == j) printf("   ");
                printf(" %02x", bytes_ptr[i * LOG_LINE_BYTES + j]);
            }
            printf("\n");
        }
    }

 

1  Macro-level flow

1.1  A session is created for every play request, with a subsession for each audio/video track; the subsession is the unit that actually processes the media stream.

------------------------------------------------------[#1]--
session     <--->  client request
   |
subsession  <--->  audio/video
------------------------------------------------------------

1.2  Data processing flow:

------------------------------------------------------[#2]--
source --> filter(source) ... --> sink
|                           |      |
+-------+-------------------+      |
        |                          v
        v                    subsession.createNewRTPSink()
subsession.createNewStreamSource()
------------------------------------------------------------

1.3  BasicTaskScheduler::SingleStep()

BasicTaskScheduler is live555's task scheduler; its main work is all done in SingleStep().

SingleStep() performs three kinds of work:

    void BasicTaskScheduler::SingleStep(unsigned maxDelayTime) {
        // 1. Handle I/O tasks
        ...
        int selectResult = select(fMaxNumSockets, &readSet, &writeSet,
                                  &exceptionSet, &tv_timeToDelay);
        ...
        while ((handler = iter.next()) != NULL) {
            ...
            (*handler->handlerProc)(handler->clientData, resultConditionSet);
            break;
        }
        ...

        // 2. Handle any newly-triggered event
        ...

        // 3. Handle any delayed event
        fDelayQueue.handleAlarm();
    }

RTSP request handling, connection setup, and the start of playback happen mainly in step 1, while ongoing media delivery is driven mainly by step 3.
Taking AAC playback as an example:

    void ADTSAudioFileSource::doGetNextFrame() {
        // Read data and do some simple processing
        ...
        int numBytesRead = fread(fTo, 1, numBytesToRead, fFid);
        ...
        // Schedule FramedSource::afterGetting on fDelayQueue.
        // FramedSource::afterGetting consumes the data just read and in
        // turn calls ADTSAudioFileSource::doGetNextFrame() again, so the
        // file is read in a loop.
        nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
                    (TaskFunc*)FramedSource::afterGetting, this);
    }

1.4  DelayQueue
fDelayQueue is the queue of tasks waiting to run; each SingleStep() pass executes only the task at head(). Each element of the DelayQueue carries its own DelayTime, which says how long to wait before it runs. The elements are ordered by delay, smallest first, and each element's fDeltaTimeRemaining records its delay relative to the element in front of it. See DelayQueue::addEntry() for how elements are enqueued.
For example (the numbers in [] are the relative delays, i.e. fDeltaTimeRemaining):

    [0]->[1]->[3]->[2]->...->[1]->NULL
     ^
     |
    head()
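The relative-delay bookkeeping behind DelayQueue::addEntry() can be sketched in miniature like this (an illustrative stand-in, not live555's code; delays are plain integers here instead of timevals):

```cpp
#include <list>
#include <iterator>

// The queue stores each entry's delay RELATIVE to the entry before it,
// so only the head entry needs to be aged on each scheduler pass.
void addEntry(std::list<long>& q, long absDelay) {
    std::list<long>::iterator it = q.begin();
    // Walk past entries that fire no later than us, converting our
    // absolute delay into a delta against the entry in front.
    while (it != q.end() && absDelay >= *it) {
        absDelay -= *it;
        ++it;
    }
    it = q.insert(it, absDelay);
    // The entry now behind us becomes relative to us.
    std::list<long>::iterator next = std::next(it);
    if (next != q.end()) *next -= absDelay;
}
```

For absolute delays 5, 2, 3 inserted in that order, the queue ends up holding the deltas 2, 1, 2.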

Before processing the DelayQueue, a clock synchronization (synchronize()) is usually done first. Because the stored delays are relative, normally only the head element needs adjusting; if synchronization would leave any element with a negative delay, it is reset to DELAY_ZERO (meaning: run immediately).
Executing a task:

    void DelayQueue::handleAlarm() {
        ...
        toRemove->handleTimeout();
    }

    void AlarmHandler::handleTimeout() {
        (*fProc)(fClientData);
        DelayQueueEntry::handleTimeout();
    }

A task is deleted once it has finished being handled.

 

2  Class relationships

   * The main flow analysis of live555 lives in this chapter; refer to chapters 3 and 4 for the function-call and object relationships.
2.1  Relationship diagram of the main classes involved:
------------------------------------------------------[#3]--
Medium
  +ServerMediaSubsession
  |  +OnDemandServerMediaSubsession
  |     +FileServerMediaSubsession
  |        +H264VideoFileServerMediaSubsession      //h264
  |        +ADTSAudioFileServerMediaSubsession      //aac
  |
  +MediaSink
  |  +RTPSink
  |     +MultiFramedRTPSink
  |        +VideoRTPSink
  |        |  +H264VideoRTPSink                     //h264
  |        +MPEG4GenericRTPSink                     //aac
  |
  +MediaSource
     +FramedSource      //+doGetNextFrame(); +fAfterGettingFunc;
        +FramedFilter
        |  +H264FUAFragmenter                       //h264
        |  +MPEGVideoStreamFramer
        |     +H264VideoStreamFramer                //h264
        +FramedFileSource
           +ByteStreamFileSource                    //h264
           +ADTSAudioFileSource                     //aac

StreamParser
  +MPEGVideoStreamParser
     +H264VideoStreamParser                         //h264
------------------------------------------------------------

Let us look at which members FramedFilter and FramedFileSource add on top of FramedSource:

    FramedFilter {
        FramedSource* fInputSource;
    }

    FramedFileSource {
        FILE* fFid;
    }

The names and the added members make their roles clear: FramedFilter corresponds to the filter in [#2], while FramedFileSource is a source that takes a local file as its input.

2.2  How is a pipeline with filters realized?
It relies on FramedFilter's fInputSource member. Taking H.264 as an example:

    H264VideoStreamFramer.fInputSource = ByteStreamFileSource;
    H264FUAFragmenter.fInputSource = H264VideoStreamFramer;

Assigning the upstream source to the downstream filter's fInputSource is all it takes; for H.264 this yields the following processing pipeline:

    ByteStreamFileSource -> H264VideoStreamFramer -> H264FUAFragmenter -> H264VideoRTPSink
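The chaining pattern behind this pipeline can be reduced to a toy sketch (illustrative only — apart from fInputSource, the class and member names below are invented; live555's real interface is FramedSource with asynchronous callbacks):

```cpp
// Each stage exposes the same "source" interface, so stages compose.
struct Source {
    virtual ~Source() {}
    virtual int getNextFrame() = 0;   // deliver one "frame"
};

// Stands in for ByteStreamFileSource: produces raw input.
struct FileSource : Source {
    int counter = 0;
    int getNextFrame() override { return ++counter; }  // stand-in for fread()
};

// Stands in for a FramedFilter: pulls from fInputSource, transforms, passes on.
struct Filter : Source {
    Source* fInputSource;             // upstream source, as in FramedFilter
    explicit Filter(Source* in) : fInputSource(in) {}
    int getNextFrame() override {
        return fInputSource->getNextFrame() * 10;  // stand-in transformation
    }
};
```

Chaining two Filter stages onto a FileSource mirrors file -> framer -> fragmenter: each call to the last stage pulls data through the whole chain.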

The parent class of H264VideoStreamFramer, MPEGVideoStreamFramer, also adds a new member:

    MPEGVideoStreamFramer {
        MPEGVideoStreamParser* fParser;
    }

    MPEGVideoStreamFramer.fParser = H264VideoStreamParser;

H264VideoStreamParser does the actual processing of the video data during the filter stage.

The RTP header [#0] is added in MultiFramedRTPSink::buildAndSendPacket().

 

3  Function relationships

3.1  H.264 call chain

------------------------------------------------------[#4]--
RTSPServer::RTSPClientSession::handleCmd_SETUP()
OnDemandServerMediaSubsession::getStreamParameters(
    streamToken: new StreamState(
       fMediaSource: H264VideoFileServerMediaSubsession::createNewStreamSource() ) )
**********
RTSPServer::RTSPClientSession::handleCmd_DESCRIBE()
ServerMediaSession::generateSDPDescription()
OnDemandServerMediaSubsession::sdpLines()
H264VideoFileServerMediaSubsession::createNewStreamSource()
H264VideoStreamFramer::createNew( fInputSource: ByteStreamFileSource::createNew(),
                                  fParser: new H264VideoStreamParser(
                                     fInputSource: H264VideoStreamFramer.fInputSource) )
**********
RTSPServer::RTSPClientSession::handleCmd_PLAY()
H264VideoFileServerMediaSubsession::startStream() [OnDemandServerMediaSubsession::startStream()]
StreamState::startPlaying()
H264VideoRTPSink::startPlaying() [MediaSink::startPlaying(fMediaSource)] //got in handleCmd_SETUP()
H264VideoRTPSink::continuePlaying()
    fSource, fOurFragmenter: H264FUAFragmenter(fInputSource: fMediaSource)
MultiFramedRTPSink::continuePlaying()
MultiFramedRTPSink::buildAndSendPacket()
MultiFramedRTPSink::packFrame()
H264FUAFragmenter::getNextFrame() [FramedSource::getNextFrame()]
H264FUAFragmenter::doGetNextFrame() {1}
1)=No NALU=
  H264VideoStreamFramer::getNextFrame() [FramedSource::getNextFrame()]
  MPEGVideoStreamFramer::doGetNextFrame()
  H264VideoStreamParser::registerReadInterest()
  MPEGVideoStreamFramer::continueReadProcessing()
  H264VideoStreamParser::parse()
  H264VideoStreamFramer::afterGetting() [FramedSource::afterGetting()]
  H264FUAFragmenter::afterGettingFrame()
  H264FUAFragmenter::afterGettingFrame1()
  goto {1}  //Now we have got a NALU
2)=Has NALU=
  FramedSource::afterGetting()
  MultiFramedRTPSink::afterGettingFrame()
  MultiFramedRTPSink::afterGettingFrame1()
  MultiFramedRTPSink::sendPacketIfNecessary()
------------------------------------------------------------
4  Object relationships

4.1  H.264 object relationship diagram

------------------------------------------------------[#5]--
ServerMediaSession{1}.fSubsessionsTail =
H264VideoFileServerMediaSubsession{2}.fParentSession = {1}
fStreamStates[] {
  .subsession  = {2}
  .streamToken = StreamState {
                   .fMaster = {2}
                   .fRTPSink = H264VideoRTPSink{5}.fSource/fOurFragmenter
                             = H264FUAFragmenter{4} {
                                 .fInputSource = H264VideoStreamFramer{3}
                                 .fAfterGettingFunc = MultiFramedRTPSink::afterGettingFrame()
                                 .fAfterGettingClientData = {5}
                                 .fOnCloseFunc = MultiFramedRTPSink::ourHandleClosure()
                                 .fOnCloseClientData = {5}
                               }
                   .fMediaSource = {3} {
                                     .fParser = H264VideoStreamParser {
                                                  .fInputSource = ByteStreamFileSource{6}
                                                  .fTo = [{5}.]fOutBuf->curPtr()
                                                }
                                     .fInputSource = {6}
                                     .fAfterGettingFunc = H264FUAFragmenter::afterGettingFrame()
                                     .fAfterGettingClientData = {4}
                                     .fOnCloseFunc = FramedSource::handleClosure()
                                     .fOnCloseClientData = {4}
                                   }
                 }
}
------------------------------------------------------------

4.2  AAC object relationship diagram

------------------------------------------------------[#6]--
ServerMediaSession{1}.fSubsessionsTail =
ADTSAudioFileServerMediaSubsession{2}.fParentSession = {1}
fStreamStates[] {
  .subsession  = {2}
  .streamToken = StreamState {
                   .fMaster = {2}
                   .fRTPSink = MPEG4GenericRTPSink {
                                 .fOutBuf = OutPacketBuffer
                                 .fSource = ADTSAudioFileSource {3}
                                 .fRTPInterface = RTPInterface.fGS = Groupsock
                               }
                   .fMediaSource = {3}
                 }
}
------------------------------------------------------------

 

5  RTSP

5.1  Mapping between RTSP commands and the live555 functions that handle them:

    RTSP command           live555 handler
    ---------------------------------------------
    OPTIONS        <--->  handleCmd_OPTIONS
    DESCRIBE       <--->  handleCmd_DESCRIBE
    SETUP          <--->  handleCmd_SETUP
    PLAY           <--->  handleCmd_PLAY
    PAUSE          <--->  handleCmd_PAUSE
    TEARDOWN       <--->  handleCmd_TEARDOWN
    GET_PARAMETER  <--->  handleCmd_GET_PARAMETER
    SET_PARAMETER  <--->  handleCmd_SET_PARAMETER
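As a rough illustration of this dispatch (not live555's actual code — internally RTSPServer compares the method name in the request and calls the handleCmd_* member functions directly; this free function merely mirrors the table):

```cpp
#include <cstring>
#include <string>

// Returns the name of the handler that would service each RTSP method.
// Unknown methods fall through to handleCmd_notSupported.
std::string dispatchRtspCommand(const char* cmd) {
    if (!strcmp(cmd, "OPTIONS"))       return "handleCmd_OPTIONS";
    if (!strcmp(cmd, "DESCRIBE"))      return "handleCmd_DESCRIBE";
    if (!strcmp(cmd, "SETUP"))         return "handleCmd_SETUP";
    if (!strcmp(cmd, "PLAY"))          return "handleCmd_PLAY";
    if (!strcmp(cmd, "PAUSE"))         return "handleCmd_PAUSE";
    if (!strcmp(cmd, "TEARDOWN"))      return "handleCmd_TEARDOWN";
    if (!strcmp(cmd, "GET_PARAMETER")) return "handleCmd_GET_PARAMETER";
    if (!strcmp(cmd, "SET_PARAMETER")) return "handleCmd_SET_PARAMETER";
    return "handleCmd_notSupported";
}
```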

5.2  Sample RTSP playback exchange (openRTSP)

--------------------------------------------------------------------------------
ubuntu$ ./openRTSP rtsp://192.168.43.1/grandma.264
Opening connection to 192.168.43.1, port 554...
...remote connection opened
Sending request: OPTIONS rtsp://192.168.43.1/grandma.264 RTSP/1.0
CSeq: 2
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)

Received 152 new bytes of response data.
Received a complete OPTIONS response:
RTSP/1.0 200 OK
CSeq: 2
Date: Tue, Jan 25 2011 21:02:53 GMT
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER

--------------------------------------------------------------------------------
Sending request: DESCRIBE rtsp://192.168.43.1/grandma.264 RTSP/1.0
CSeq: 3
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
Accept: application/sdp

Received 682 new bytes of response data.
Received a complete DESCRIBE response:
RTSP/1.0 200 OK
CSeq: 3
Date: Tue, Jan 25 2011 21:02:53 GMT
Content-Base: rtsp://192.168.43.1/grandma.264/
Content-Type: application/sdp
Content-Length: 517

v=0
o=- 1295989373493698 1 IN IP4 0.0.0.0
s=H.264 Video, streamed by the LIVE555 Media Server
i=grandma.264
t=0 0
a=tool:LIVE555 Streaming Media v2012.02.04
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:H.264 Video, streamed by the LIVE555 Media Server
a=x-qt-text-inf:grandma.264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=4D4033;sprop-parameter-sets=Z01AM5p0FidCAAADAAIAAAMAZR4wZUA=,aO48gA==
a=control:track1

Opened URL "rtsp://192.168.43.1/grandma.264", returning a SDP description:
v=0
o=- 1295989373493698 1 IN IP4 0.0.0.0
s=H.264 Video, streamed by the LIVE555 Media Server
i=grandma.264
t=0 0
a=tool:LIVE555 Streaming Media v2012.02.04
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:H.264 Video, streamed by the LIVE555 Media Server
a=x-qt-text-inf:grandma.264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=4D4033;sprop-parameter-sets=Z01AM5p0FidCAAADAAIAAAMAZR4wZUA=,aO48gA==
a=control:track1

Created receiver for "video/H264" subsession (client ports 56488-56489)
--------------------------------------------------------------------------------
Sending request: SETUP rtsp://192.168.43.1/grandma.264/track1 RTSP/1.0
CSeq: 4
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
Transport: RTP/AVP;unicast;client_port=56488-56489

Received 205 new bytes of response data.
Received a complete SETUP response:
RTSP/1.0 200 OK
CSeq: 4
Date: Tue, Jan 25 2011 21:02:53 GMT
Transport: RTP/AVP;unicast;destination=192.168.43.244;source=192.168.43.1;client_port=56488-56489;server_port=6970-6971
Session: 7626020D

Setup "video/H264" subsession (client ports 56488-56489)
Created output file: "video-H264-1"
--------------------------------------------------------------------------------
Sending request: PLAY rtsp://192.168.43.1/grandma.264/ RTSP/1.0
CSeq: 5
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
Session: 7626020D
Range: npt=0.000-

Received 186 new bytes of response data.
Received a complete PLAY response:
RTSP/1.0 200 OK
CSeq: 5
Date: Tue, Jan 25 2011 21:02:53 GMT
Range: npt=0.000-
Session: 7626020D
RTP-Info: url=rtsp://192.168.43.1/grandma.264/track1;seq=26490;rtptime=1809652062

Started playing session
Receiving streamed data (signal with "kill -HUP 6297" or "kill -USR1 6297" to terminate)...
Received RTCP "BYE" on "video/H264" subsession (after 35 seconds)
--------------------------------------------------------------------------------
Sending request: TEARDOWN rtsp://192.168.43.1/grandma.264/ RTSP/1.0
CSeq: 6
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
Session: 7626020D

Received 65 new bytes of response data.
Received a complete TEARDOWN response:
RTSP/1.0 200 OK
CSeq: 6
Date: Tue, Jan 25 2011 21:03:28 GMT

--------------------------------------------------------------------------------