Enabling Android to Support RTSP (live555 Analysis)

How to enable Android to support the C++ exception mechanism

The Android NDK does not support the C++ exception mechanism by default. If you need it, you must link a complete C++ runtime at compile time.
A C++ library that supports Android can be found in the CrystaX NDK (decompress the package and look for libsupc++.a):
http://www.crystax.net/en/android/ndk/7
Add these flags when compiling:
-fexceptions -lstdc++
You also need to link libsupc++.a.
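As a quick check that the exception machinery actually works once those flags are in place, a minimal sketch (parse_port is a made-up helper for illustration, not part of live555):

```cpp
// Minimal round-trip through throw/catch; with the old Android NDK this
// only builds when compiled with -fexceptions -lstdc++ and linked
// against libsupc++.a.
#include <cassert>
#include <stdexcept>
#include <string>

// Hypothetical helper: validates an RTSP port number, throwing on bad input.
static int parse_port(int p) {
    if (p < 0 || p > 65535)
        throw std::out_of_range("port out of range: " + std::to_string(p));
    return p;
}
```

If the runtime is missing, code like this fails at link time rather than at run time, which is the usual symptom of forgetting libsupc++.a.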

 

Examples of migrating live555 to Android

https://github.com/boltonli/ohbee/tree/master/android/streamer/jni

 

RTSP Protocol

Reference: RFC 2326, RFC 3550, RFC 3984

RTP Header structure [#0]

     0                   1                   2                   3
     0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    |V=2|P|X|  CC   |M|     PT      |       sequence number         |
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    |                           timestamp                           |
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    |           synchronization source (SSRC) identifier            |
    +=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+=+
    |            contributing source (CSRC) identifiers             |
    |                             ....                              |
    +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
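As a sketch of how the fields above sit in the fixed 12 bytes, a parser for the header in C++ (parseRtpHeader and RtpHeader are made-up names for illustration, not live555 code):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Fields of the fixed RTP header (RFC 3550), as laid out in the diagram.
struct RtpHeader {
    uint8_t  version, padding, extension, csrcCount, marker, payloadType;
    uint16_t sequenceNumber;
    uint32_t timestamp, ssrc;
};

// Parse the fixed 12-byte header; all multi-byte fields are network byte order.
static bool parseRtpHeader(const uint8_t *p, size_t len, RtpHeader *h) {
    if (len < 12) return false;
    h->version     = p[0] >> 6;          // V (should be 2)
    h->padding     = (p[0] >> 5) & 1;    // P
    h->extension   = (p[0] >> 4) & 1;    // X
    h->csrcCount   = p[0] & 0x0F;        // CC
    h->marker      = p[1] >> 7;          // M
    h->payloadType = p[1] & 0x7F;        // PT
    h->sequenceNumber = (uint16_t)((p[2] << 8) | p[3]);
    h->timestamp = ((uint32_t)p[4] << 24) | ((uint32_t)p[5] << 16)
                 | ((uint32_t)p[6] << 8)  |  (uint32_t)p[7];
    h->ssrc      = ((uint32_t)p[8] << 24) | ((uint32_t)p[9] << 16)
                 | ((uint32_t)p[10] << 8) |  (uint32_t)p[11];
    return true;
}
```

CSRC identifiers (CC of them) follow the fixed part and are omitted here for brevity.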

 

H.264 video format

Reference: RFC 3984, "NAL Technology in H.264", "H.264 NAL Layer Analysis"

 

AAC audio format

Reference: ISO_IEC_13818-7.pdf

 

Live555 Architecture Analysis

0. Summary

0.1 This analysis takes H.264 + AAC as its basis.

0.2 live555 demo description: the RTSP server is live555MediaServer, and openRTSP is the debugging client.

0.3 A trace_bin function can be implemented in live555 to trace how the streaming media data is processed.

    void trace_bin(const unsigned char *bytes_ptr, int bytes_num)
    {
        #define LOG_LINE_BYTES 16
        int i, j;
        for (i = 0; i <= bytes_num / LOG_LINE_BYTES; i++) {
            /* Skip the trailing empty row (and its stray newline) when
               bytes_num is an exact multiple of LOG_LINE_BYTES. */
            if (i == bytes_num / LOG_LINE_BYTES &&
                bytes_num % LOG_LINE_BYTES == 0)
                break;
            for (j = 0;
                 j < ( (i < (bytes_num / LOG_LINE_BYTES))
                       ? LOG_LINE_BYTES
                       : (bytes_num % LOG_LINE_BYTES) );
                 j++)
            {
                if (0 == j) printf("%04d   ", i * LOG_LINE_BYTES);  /* offset */
                if (LOG_LINE_BYTES/2 == j) printf("   ");           /* mid-row gap */
                printf(" %02x", bytes_ptr[i * LOG_LINE_BYTES + j]);
            }
            printf("\n");
        }
    }

 

1. Macro process

1.1 A session is created for each playback request, with a subsession for each corresponding audio or video track. The subsession is the unit for processing streaming media.

------------------------------------------------------[#1]--
session     <--->  client request
   |
subsession  <--->  audio/video
------------------------------------------------------------

1.2 Data processing flow:

------------------------------------------------------[#2]--
source --> filter(source) ... --> sink
  |             |                  |
  +------+------+                  |
         |                         v
         v              subsession.createNewRTPSink()
subsession.createNewStreamSource()
------------------------------------------------------------

1.3 BasicTaskScheduler::SingleStep()

BasicTaskScheduler is the task processor of live555. Its main work is completed in SingleStep().

SingleStep() mainly completes the following three tasks:

    void BasicTaskScheduler::SingleStep(unsigned maxDelayTime) {
        // 1. Process I/O tasks
        ...
        int selectResult = select(fMaxNumSockets, &readSet, &writeSet,
                                  &exceptionSet, &tv_timeToDelay);
        ...
        while ((handler = iter.next()) != NULL) {
            ...
            (*handler->handlerProc)(handler->clientData, resultConditionSet);
            break;
        }
        ...
        // 2. Handle any newly-triggered event
        ...
        // 3. Handle any delayed event
        fDelayQueue.handleAlarm();
    }

RTSP requests, connection setup, and starting playback are mainly completed in step 1, while ongoing media delivery is driven by step 3.
Take AAC as an example:

    void ADTSAudioFileSource::doGetNextFrame() {
        // Read data and do some simple processing
        ...
        int numBytesRead = fread(fTo, 1, numBytesToRead, fFid);
        ...
        // Add FramedSource::afterGetting to fDelayQueue.
        // FramedSource::afterGetting processes the data just read and calls
        // ADTSAudioFileSource::doGetNextFrame() again, so that the file is
        // read in a loop.
        nextTask() = envir().taskScheduler().scheduleDelayedTask(0,
            (TaskFunc*)FramedSource::afterGetting, this);
    }
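The read/afterGetting ping-pong above can be sketched with a trivial task queue standing in for live555's scheduler (Source, pump and gTasks are illustrative names, not live555 API):

```cpp
// Sketch of live555's doGetNextFrame()/afterGetting() loop: each read
// schedules a callback, and the callback requests the next read.
#include <cassert>
#include <functional>
#include <queue>
#include <vector>

static std::queue<std::function<void()>> gTasks;  // stand-in for fDelayQueue

struct Source {
    std::vector<int> file{1, 2, 3};  // stand-in for the ADTS file
    size_t pos = 0;
    std::vector<int> frames;         // frames delivered downstream

    void doGetNextFrame() {
        if (pos >= file.size()) return;           // EOF: stop scheduling
        int frame = file[pos++];                  // "fread" one frame
        gTasks.push([this, frame] { afterGetting(frame); });
    }
    void afterGetting(int frame) {
        frames.push_back(frame);                  // deliver downstream
        doGetNextFrame();                         // request the next frame
    }
};

// Drive the loop the way SingleStep() drains the delay queue.
static void pump(Source &s) {
    s.doGetNextFrame();
    while (!gTasks.empty()) {
        auto t = gTasks.front();
        gTasks.pop();
        t();
    }
}
```

The point of the indirection through the queue is that each read returns to the event loop before the next one starts, so one file never starves the scheduler.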

1.4 DelayQueue
fDelayQueue is a queue of pending tasks. Each SingleStep() executes only the first task, head(). A task corresponds to a DelayQueue entry; each entry has its own delay time, which indicates how long to wait before it is executed. Entries are arranged in ascending order of delay time, and each entry's fDeltaTimeRemaining records its delay relative to the entry before it. See the DelayQueue::addEntry() function for how entries are inserted.
For example, the numbers in [] are relative delays (fDeltaTimeRemaining):

    [0]->[1]->[3]->[2]->...->[1]->NULL     ^     |    head()
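The relative-delay bookkeeping of DelayQueue::addEntry() can be sketched as follows (Entry and addEntry here are simplified stand-ins, not live555's real code):

```cpp
#include <cassert>
#include <list>

// Each entry stores its delay relative to its predecessor, like
// fDeltaTimeRemaining in DelayQueue. Absolute delay = prefix sum of deltas.
struct Entry { long delta; };

static void addEntry(std::list<Entry> &q, long absDelay) {
    auto it = q.begin();
    for (; it != q.end(); ++it) {
        if (absDelay < it->delta) break;   // new entry goes before this one
        absDelay -= it->delta;             // make delay relative to it
    }
    if (it != q.end()) it->delta -= absDelay;  // re-relativize the successor
    q.insert(it, Entry{absDelay});
}
```

Inserting absolute delays of 3, 1 and 2 leaves the queue with relative deltas 1, 1, 1, matching the picture above: only the head's delta ever needs to be counted down.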

When processing the DelayQueue, a time synchronization operation, synchronize(), is usually performed first. Because the delays of DelayQueue entries are relative, only the first entry needs to be processed; if its delay after synchronization is less than 0, it is changed to DELAY_ZERO (that is, an operation to be executed immediately).
Execute the task:

    void DelayQueue::handleAlarm() {
        ...
        toRemove->handleTimeout();
    }
    void AlarmHandler::handleTimeout() {
        (*fProc)(fClientData);
        DelayQueueEntry::handleTimeout();
    }

After the task is processed, it is deleted.

 

2. Class relationships

* The live555 process analysis is mainly contained in this chapter. To look up function relationships or object relationships, see sections 3 and 4.
2.1 Relationship diagram of the main classes involved:

------------------------------------------------------[#3]--
Medium
  +ServerMediaSubsession
  |  +OnDemandServerMediaSubsession
  |     +FileServerMediaSubsession
  |        +H264VideoFileServerMediaSubsession      //h264
  |        +ADTSAudioFileServerMediaSubsession      //aac
  +MediaSink
  |  +RTPSink
  |     +MultiFramedRTPSink
  |        +VideoRTPSink
  |        |  +H264VideoRTPSink                     //h264
  |        +MPEG4GenericRTPSink                     //aac
  +MediaSource
     +FramedSource      //+doGetNextFrame(); +fAfterGettingFunc;
        +FramedFilter
        |  +H264FUAFragmenter                       //h264
        |  +MPEGVideoStreamFramer
        |     +H264VideoStreamFramer                //h264
        +FramedFileSource
           +ByteStreamFileSource                    //h264
           +ADTSAudioFileSource                     //aac
StreamParser
  +MPEGVideoStreamParser
     +H264VideoStreamParser                         //h264
------------------------------------------------------------

Let's look at the members that FramedFilter and FramedFileSource add to FramedSource:

    FramedFilter {
        FramedSource* fInputSource;
    }
    FramedFileSource {
        FILE* fFid;
    }

Their respective roles are clear from the names and the added members: FramedFilter corresponds to the filter in [#2], while FramedFileSource is a source that reads its input from a local file.

2.2 How the filter flow is implemented:
This uses the fInputSource member of FramedFilter. Take H.264 as an example:

    H264VideoStreamFramer.fInputSource = ByteStreamFileSource;     H264FUAFragmenter.fInputSource = H264VideoStreamFramer;

Assigning the upstream source to the downstream filter's fInputSource yields the following processing flow for H.264:

    ByteStreamFileSource -> H264VideoStreamFramer -> H264FUAFragmenter -> H264VideoRTPSink
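The chaining above can be sketched with a toy source/filter hierarchy (the class bodies here are illustrative stand-ins, not live555's real interfaces, which are pull-based and asynchronous):

```cpp
#include <cassert>
#include <string>

// Minimal source/filter chain in the style of FramedSource/FramedFilter:
// each filter pulls from its fInputSource and transforms the data.
struct FramedSource {
    virtual ~FramedSource() {}
    virtual std::string getNextFrame() = 0;
};

struct FileSource : FramedSource {            // like ByteStreamFileSource
    std::string getNextFrame() override { return "raw"; }
};

struct Framer : FramedSource {                // like H264VideoStreamFramer
    FramedSource *fInputSource;
    explicit Framer(FramedSource *in) : fInputSource(in) {}
    std::string getNextFrame() override {
        return "NALU(" + fInputSource->getNextFrame() + ")";
    }
};

struct Fragmenter : FramedSource {            // like H264FUAFragmenter
    FramedSource *fInputSource;
    explicit Fragmenter(FramedSource *in) : fInputSource(in) {}
    std::string getNextFrame() override {
        return "FU-A[" + fInputSource->getNextFrame() + "]";
    }
};
```

Because every filter is itself a FramedSource, the sink at the end of the chain only ever talks to one fInputSource and never knows how many filters sit behind it.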

Look at the member added in MPEGVideoStreamFramer, the parent class of H264VideoStreamFramer:

    MPEGVideoStreamFramer {
        MPEGVideoStreamParser* fParser;
    }
    MPEGVideoStreamFramer.fParser = H264VideoStreamParser;

H264VideoStreamParser is used to parse and filter the video data.

The RTP header [#0] is added in MultiFramedRTPSink::buildAndSendPacket().

 

3. Function relationships

3.1 H.264 function call relationships

------------------------------------------------------[#4]--
RTSPServer::RTSPClientSession::handleCmd_SETUP()
OnDemandServerMediaSubsession::getStreamParameters(
    streamToken: new StreamState(
       fMediaSource: H264VideoFileServerMediaSubsession::createNewStreamSource() ) )
**********
RTSPServer::RTSPClientSession::handleCmd_DESCRIBE()
ServerMediaSession::generateSDPDescription()
OnDemandServerMediaSubsession::sdpLines()
H264VideoFileServerMediaSubsession::createNewStreamSource()
H264VideoStreamFramer::createNew(
    fInputSource: ByteStreamFileSource::createNew(),
    fParser: new H264VideoStreamParser(
       fInputSource: H264VideoStreamFramer.fInputSource) )
**********
RTSPServer::RTSPClientSession::handleCmd_PLAY()
H264VideoFileServerMediaSubsession::startStream() [OnDemandServerMediaSubsession::startStream()]
StreamState::startPlaying()
H264VideoRTPSink::startPlaying() [MediaSink::startPlaying(fMediaSource)]  //got in handleCmd_SETUP()
H264VideoRTPSink::continuePlaying()
    fSource, fOurFragmenter: H264FUAFragmenter(fInputSource: fMediaSource)
MultiFramedRTPSink::continuePlaying()
MultiFramedRTPSink::buildAndSendPacket()
MultiFramedRTPSink::packFrame()
H264FUAFragmenter::getNextFrame() [FramedSource::getNextFrame()]
H264FUAFragmenter::doGetNextFrame() {1}
1)=No NALU=
  H264VideoStreamFramer::getNextFrame() [FramedSource::getNextFrame()]
  MPEGVideoStreamFramer::doGetNextFrame()
  H264VideoStreamParser::registerReadInterest()
  MPEGVideoStreamFramer::continueReadProcessing()
  H264VideoStreamParser::parse()
  H264VideoStreamFramer::afterGetting() [FramedSource::afterGetting()]
  H264FUAFragmenter::afterGettingFrame()
  H264FUAFragmenter::afterGettingFrame1()
  goto {1}  //Now we have got a NALU
2)=Has NALU=
  FramedSource::afterGetting()
  MultiFramedRTPSink::afterGettingFrame()
  MultiFramedRTPSink::afterGettingFrame1()
  MultiFramedRTPSink::sendPacketIfNecessary()
------------------------------------------------------------
4. Object relationships

4.1 H.264 object relationship diagram

------------------------------------------------------[#5]--
ServerMediaSession{1}.fSubsessionsTail =
H264VideoFileServerMediaSubsession{2}.fParentSession = {1}
fStreamStates[] {
  .subsession  = {2}
  .streamToken = StreamState {
                   .fMaster = {2}
                   .fRTPSink = H264VideoRTPSink{5}.fSource/fOurFragmenter
                             = H264FUAFragmenter{4} {
                                 .fInputSource = H264VideoStreamFramer{3}
                                 .fAfterGettingFunc = MultiFramedRTPSink::afterGettingFrame()
                                 .fAfterGettingClientData = {5}
                                 .fOnCloseFunc = MultiFramedRTPSink::ourHandleClosure()
                                 .fOnCloseClientData = {5}
                               }
                   .fMediaSource = {3} {
                                     .fParser = H264VideoStreamParser {
                                                  .fInputSource = ByteStreamFileSource{6}
                                                  .fTo = [{5}.]fOutBuf->curPtr()
                                                }
                                     .fInputSource = {6}
                                     .fAfterGettingFunc = H264FUAFragmenter::afterGettingFrame()
                                     .fAfterGettingClientData = {4}
                                     .fOnCloseFunc = FramedSource::handleClosure()
                                     .fOnCloseClientData = {4}
                                   }
                 }
}
------------------------------------------------------------

4.2 AAC object relationship diagram

------------------------------------------------------[#6]--
ServerMediaSession{1}.fSubsessionsTail =
ADTSAudioFileServerMediaSubsession{2}.fParentSession = {1}
fStreamStates[] {
  .subsession  = {2}
  .streamToken = StreamState {
                   .fMaster = {2}
                   .fRTPSink = MPEG4GenericRTPSink {
                                 .fOutBuf = OutPacketBuffer
                                 .fSource = ADTSAudioFileSource {3}
                                 .fRTPInterface = RTPInterface.fGS = Groupsock
                               }
                   .fMediaSource = {3}
                 }
}
------------------------------------------------------------

 

5. RTSP

5.1 Relationship between RTSP commands and their handler functions:

RTSP command          live555 handler
OPTIONS         <---> handleCmd_OPTIONS
DESCRIBE        <---> handleCmd_DESCRIBE
SETUP           <---> handleCmd_SETUP
PLAY            <---> handleCmd_PLAY
PAUSE           <---> handleCmd_PAUSE
TEARDOWN        <---> handleCmd_TEARDOWN
GET_PARAMETER   <---> handleCmd_GET_PARAMETER
SET_PARAMETER   <---> handleCmd_SET_PARAMETER

5.2 RTSP playback interaction example (openrtsp)

--------------------------------------------------------------------------------
ubuntu$ ./openRTSP rtsp://192.168.43.1/grandma.264
Opening connection to 192.168.43.1, port 554...
...remote connection opened
Sending request: OPTIONS rtsp://192.168.43.1/grandma.264 RTSP/1.0
CSeq: 2
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)

Received 152 new bytes of response data.
Received a complete OPTIONS response:
RTSP/1.0 200 OK
CSeq: 2
Date: Tue, Jan 25 2011 21:02:53 GMT
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER
--------------------------------------------------------------------------------
Sending request: DESCRIBE rtsp://192.168.43.1/grandma.264 RTSP/1.0
CSeq: 3
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
Accept: application/sdp

Received 682 new bytes of response data.
Received a complete DESCRIBE response:
RTSP/1.0 200 OK
CSeq: 3
Date: Tue, Jan 25 2011 21:02:53 GMT
Content-Base: rtsp://192.168.43.1/grandma.264/
Content-Type: application/sdp
Content-Length: 517

v=0
o=- 1295989373493698 1 IN IP4 0.0.0.0
s=H.264 Video, streamed by the LIVE555 Media Server
i=grandma.264
t=0 0
a=tool:LIVE555 Streaming Media v2012.02.04
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:H.264 Video, streamed by the LIVE555 Media Server
a=x-qt-text-inf:grandma.264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=4D4033;sprop-parameter-sets=Z01AM5p0FidCAAADAAIAAAMAZR4wZUA=,aO48gA==
a=control:track1

Opened URL "rtsp://192.168.43.1/grandma.264", returning a SDP description:
v=0
o=- 1295989373493698 1 IN IP4 0.0.0.0
s=H.264 Video, streamed by the LIVE555 Media Server
i=grandma.264
t=0 0
a=tool:LIVE555 Streaming Media v2012.02.04
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:H.264 Video, streamed by the LIVE555 Media Server
a=x-qt-text-inf:grandma.264
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:500
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=4D4033;sprop-parameter-sets=Z01AM5p0FidCAAADAAIAAAMAZR4wZUA=,aO48gA==
a=control:track1

Created receiver for "video/H264" subsession (client ports 56488-56489)
--------------------------------------------------------------------------------
Sending request: SETUP rtsp://192.168.43.1/grandma.264/track1 RTSP/1.0
CSeq: 4
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
Transport: RTP/AVP;unicast;client_port=56488-56489

Received 205 new bytes of response data.
Received a complete SETUP response:
RTSP/1.0 200 OK
CSeq: 4
Date: Tue, Jan 25 2011 21:02:53 GMT
Transport: RTP/AVP;unicast;destination=192.168.43.244;source=192.168.43.1;client_port=56488-56489;server_port=6970-6971
Session: 7626020D

Setup "video/H264" subsession (client ports 56488-56489)
Created output file: "video-H264-1"
--------------------------------------------------------------------------------
Sending request: PLAY rtsp://192.168.43.1/grandma.264/ RTSP/1.0
CSeq: 5
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
Session: 7626020D
Range: npt=0.000-

Received 186 new bytes of response data.
Received a complete PLAY response:
RTSP/1.0 200 OK
CSeq: 5
Date: Tue, Jan 25 2011 21:02:53 GMT
Range: npt=0.000-
Session: 7626020D
RTP-Info: url=rtsp://192.168.43.1/grandma.264/track1;seq=26490;rtptime=1809652062

Started playing session
Receiving streamed data (signal with "kill -HUP 6297" or "kill -USR1 6297" to terminate)...
Received RTCP "BYE" on "video/H264" subsession (after 35 seconds)
--------------------------------------------------------------------------------
Sending request: TEARDOWN rtsp://192.168.43.1/grandma.264/ RTSP/1.0
CSeq: 6
User-Agent: ./openRTSP (LIVE555 Streaming Media v2012.02.29)
Session: 7626020D

Received 65 new bytes of response data.
Received a complete TEARDOWN response:
RTSP/1.0 200 OK
CSeq: 6
Date: Tue, Jan 25 2011 21:03:28 GMT
--------------------------------------------------------------------------------