1. handleCmd_SETUP()

After a client connection is accepted, RTSPServer::incomingConnectionHandler is invoked; it creates an RTSPClientSession* _pClientSession, which later handles the SETUP request in RTSPClientSession::handleCmd_SETUP(). That method calls subsession->getStreamParameters(). (Note: DynamicRTSPServer::lookupServerMediaSession calls the ServerMediaSession* createNewSMS function, which creates the session and its subsessions, so the subsession here is of type H264VideoBufferServerMediaSubsession, ADTSAudioBufferServerMediaSubsession, or MP3AudioFileServerMediaSubsession.)

The key function is getStreamParameters(). Taking H264VideoBufferServerMediaSubsession as the example, the call is dispatched through an H264VideoBufferServerMediaSubsession object pointer:

Step 1: Create the data-source object: FramedSource* mediaSource = createNewStreamSource(clientSessionId, streamBitrate). This actually executes H264VideoBufferServerMediaSubsession::createNewStreamSource; from the code, its return value is an H264BufferStreamFramer*.

Step 2: Create the sink: rtpSink = createNewRTPSink(rtpGroupsock, rtpPayloadType, mediaSource). This actually executes H264VideoBufferServerMediaSubsession::createNewRTPSink, whose return value is an H264VideoRTPSink*.

Step 3: Bundle everything into a stream token: streamToken = fLastStreamToken = new StreamState(*this, serverRTPPort, serverRTCPPort, rtpSink, udpSink, streamBitrate, mediaSource, rtpGroupsock, rtcpGroupsock);

2. handleCmd_PLAY()

Step 1: fStreamStates[i].subsession->startStream is called, i.e. OnDemandServerMediaSubsession::startStream, which in turn calls
Step 2: StreamState::startPlaying(), which calls fRTPSink->startPlaying() (i.e. MediaSink::startPlaying).
Step 3: H264VideoRTPSink::continuePlaying(). On its first invocation this also creates an H264FUAFragmenter* object, then it calls MultiFramedRTPSink::continuePlaying().
Step 4: MultiFramedRTPSink::buildAndSendPacket(Boolean isFirstPacket)
Step 5: MultiFramedRTPSink::packFrame()
Step 6: fSource->getNextFrame, i.e. FramedSource::getNextFrame
Step 7: H264BufferStreamFramer::doGetNextFrame()
Step 8: MultiFramedRTPSink::sendPacketIfNecessary() — only here is the RTP packet actually sent.