live555 Source code Download (VC6 project): http://download.csdn.net/detail/leixiaohua1020/6374387
The source code of the liveMedia project (http://www.live555.com/) consists of four basic libraries, various test programs, and a media server. The four basic libraries are UsageEnvironment & TaskScheduler, groupsock, liveMedia, and BasicUsageEnvironment. The UsageEnvironment and TaskScheduler classes are used for event scheduling, such as registering handlers for asynchronous read events and outputting error messages. There is also a HashTable class that defines a generic hash table used by the rest of the code. These are abstract classes; in your application you implement your own subclasses of them. The groupsock library encapsulates the network interfaces used to send and receive packets. As the name suggests, groupsock is mainly aimed at multicast sending and receiving, but it also supports unicast delivery. The liveMedia library contains a large series of classes whose base class is Medium; these classes cover the different streaming media types and codecs.
The testProgs directory contains a variety of test programs, such as openRTSP, whose code helps in understanding how liveMedia is used in applications.
The media server is a pure RTSP server. It supports media files in the following formats:
* TS stream files, extension .ts
* PS stream files, extension .mpg
* MPEG-4 video elementary stream files, extension .m4e
* MP3 files, extension .mp3
* WAV files (PCM), extension .wav
* AMR audio files, extension .amr
* AAC files (ADTS format), extension .aac
Developing applications with live555
To develop a program based on liveMedia, you need classes that handle event scheduling, data reading and writing, and error output, obtained by subclassing the abstract classes UsageEnvironment and TaskScheduler. The live555 source code contains a basic implementation of these classes: the BasicUsageEnvironment library. BasicUsageEnvironment is aimed at simple console applications and uses select() to acquire and dispatch events. It uses a Unix or Windows console for input and output and is intended for prototyping and debugging; with it you can develop traditional command-line console applications. By supplying your own subclasses of the UsageEnvironment and TaskScheduler abstract classes, such applications can be made to run in a specific environment without much modification. Note that in a graphical environment (GUI toolkit), a subclass of TaskScheduler should integrate doEventLoop() with the event-handling framework of that GUI environment.
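A minimal sketch (not taken from the article; variable names are illustrative) of how a console application typically sets up the BasicUsageEnvironment library and enters the select()-based event loop:

#include "BasicUsageEnvironment.hh"

int main() {
  // Create the scheduler and the usage environment that wraps it:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // ... create sources, sinks, or an RTSP server here ...

  // Run the select()-based event loop; it normally never returns.
  env->taskScheduler().doEventLoop();
  return 0;
}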
Basic Concepts
First, familiarize yourself with the concepts of source, sink, and filter in the liveMedia library. A sink is an object that consumes data; for example, an object that stores received data to a file is a sink. A source is an object that produces data, for example by reading data via RTP. A data flow passes through several sources and a sink, as in the following example:
'source1' -> 'source2' (a filter) -> 'source3' (a filter) -> 'sink'
A source that receives data from another source is also called a filter; a module is a source, a filter, or a sink. The endpoint of data reception is a sink class, and MediaSink is the base class of all sink classes. A sink processes data by implementing the pure virtual function continuePlaying(); continuePlaying() usually calls fSource->getNextFrame() to hand the source a data buffer, the callback function that will handle the data, and so on. fSource is a class member of MediaSink of type FramedSource*.
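The following is a minimal sketch of this sink-side pattern (the class name MySink and the buffer size are invented for illustration): continuePlaying() hands a buffer and an "after getting" callback to the upstream source via getNextFrame(), and the callback consumes the frame and then asks for the next one.

#include "MediaSink.hh"

class MySink : public MediaSink {
public:
  static MySink* createNew(UsageEnvironment& env) { return new MySink(env); }

protected:
  MySink(UsageEnvironment& env) : MediaSink(env) {}

private:
  // Matches FramedSource's afterGettingFunc signature:
  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned /*numTruncatedBytes*/,
                                struct timeval /*presentationTime*/,
                                unsigned /*durationInMicroseconds*/) {
    MySink* sink = (MySink*)clientData;
    // ... consume sink->fBuffer[0 .. frameSize) here, e.g. write it to a file ...
    sink->continuePlaying();  // then request the next frame
  }

  virtual Boolean continuePlaying() {
    if (fSource == NULL) return False;
    // fSource is the FramedSource* member inherited from MediaSink:
    fSource->getNextFrame(fBuffer, sizeof fBuffer,
                          afterGettingFrame, this,
                          onSourceClosure, this);
    return True;
  }

  unsigned char fBuffer[100000];
};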
The basic control flow of a liveMedia-based application is as follows. The application is an event-driven loop of the form:
while (1) {
    find a task that needs to be done, by looking at the list of readable network handles and the delay queue;
    perform that task;
}
For each sink, before entering this loop the application usually calls someSinkObject->startPlaying() to start the task that needs to be performed. At any point, a module fetches data by calling FramedSource::getNextFrame() on the module immediately upstream of it. This is implemented through the pure virtual function FramedSource::doGetNextFrame(), for which each source module has its own implementation.
Each source module's implementation of doGetNextFrame() works by arranging for an "after getting" function to be called (from an event handler) when new data becomes available for the caller.
Note that this flow of data from sources to sinks happens within each application and doesn't necessarily correspond to the sending or receiving of network packets. For example, a server application (such as testMP3Streamer) that sends RTP packets will do so using one or more RTPSink modules. These RTPSink modules receive data from other *Source modules (e.g., modules that read data from a file) and, as a side effect, transmit RTP packets.
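As a brief usage sketch (placeholder function and variable names, not the article's code), this is how an application typically kicks off that data flow before entering the event loop:

#include "liveMedia.hh"

// Called once the sink has consumed the whole stream (e.g. the source closed):
static void afterPlaying(void* clientData) {
  MediaSink* sink = (MediaSink*)clientData;
  sink->envir() << "...done streaming\n";
}

void startStreaming(MediaSink* sink, FramedSource* source) {
  // startPlaying() records the source in the sink and triggers the first
  // continuePlaying()/getNextFrame() call; doEventLoop() then drives the rest.
  sink->startPlaying(*source, afterPlaying, sink);
}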
live555 Code Interpretation (1): Establishing an RTSP Connection
The RTSPServer class is used to build an RTSP server; internally it defines an RTSPClientSession class that handles an individual client session.
First the RTSP server is created (the implementation class is DynamicRTSPServer). During creation, a socket (ourSocket) is set up to listen on TCP port 554, and the connection-handling function handle (RTSPServer::incomingConnectionHandler) together with the socket handle is passed to the task scheduler (TaskScheduler).
The task scheduler puts the socket handle into the set of socket handles (fReadSet) used in subsequent select() calls and associates the socket handle with incomingConnectionHandler. The main program then enters the task scheduler's main loop (doEventLoop()), where the select() system call blocks, waiting for network connections.
When an RTSP client connects to the server (for example with the URL rtsp://192.168.1.109/1.mpg), select() returns the corresponding socket, and the handle of the corresponding handler function is looked up via the association saved earlier; this is the incomingConnectionHandler mentioned above. Inside incomingConnectionHandler, an RTSPClientSession is created to begin processing this client's session.
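A minimal sketch of this setup using the stock live555 API (the article's DynamicRTSPServer is a subclass of RTSPServer defined in the media-server code; plain RTSPServer::createNew() is used here for illustration):

#include "BasicUsageEnvironment.hh"
#include "liveMedia.hh"

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  // createNew() opens ourSocket, binds it to TCP port 554, and registers
  // incomingConnectionHandler with the task scheduler, as described above.
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 554);
  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  env->taskScheduler().doEventLoop();  // select() loop; does not return
  return 0;
}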
live555 Code Interpretation (2): Processing the DESCRIBE Request
After the RTSP server receives a DESCRIBE request from the client, it finds the corresponding streaming media resource based on the request URL (rtsp://192.168.1.109/1.mpg) and returns a response message. In live555, the ServerMediaSession class is used to represent the session description; it contains one or more child session descriptions (ServerMediaSubsession), one per audio or video track.
In the previous part we saw that when the RTSP server receives a client's connection request, it creates an RTSPClientSession object to handle that individual client session. While the RTSPClientSession is being set up, the newly created socket handle (clientSocket) and the RTSP request handler RTSPClientSession::incomingRequestHandler are passed to the task scheduler and associated with each other. When the client issues an RTSP request, the select() call in the server's main loop returns, the corresponding incomingRequestHandler is found via the socket handle, and message processing begins. The message is parsed first; if the request turns out to be DESCRIBE, handleCmd_DESCRIBE() is entered. Based on the suffix of the URL requested by the client (for example, 1.mpg), the member function DynamicRTSPServer::lookupServerMediaSession() is called to look up the corresponding stream description, a ServerMediaSession. If the ServerMediaSession does not exist but a file 1.mpg does exist locally, a new ServerMediaSession is created. During the creation of the ServerMediaSession, an MPEG-1 or 2 demultiplexer (MPEG1or2FileServerDemux) is created based on the .mpg file suffix. MPEG1or2FileServerDemux then creates a child session description, MPEG1or2DemuxedServerMediaSubsession. Finally, the ServerMediaSession assembles the SDP information for the response message (the SDP assembly process is described below), and the response message is sent back to the client, completing one message exchange.
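A hedged sketch of what this amounts to, modeled on the testOnDemandRTSPServer pattern (the helper function name is invented, and the on-demand lookup done by DynamicRTSPServer::lookupServerMediaSession() is summarized rather than reproduced):

#include "liveMedia.hh"

ServerMediaSession* createMpgSession(UsageEnvironment& env,
                                     RTSPServer* rtspServer,
                                     char const* fileName /* e.g. "1.mpg" */) {
  ServerMediaSession* sms = ServerMediaSession::createNew(
      env, fileName, fileName, "MPEG-1 or 2 Program Stream");

  // The demux creates one subsession (MPEG1or2DemuxedServerMediaSubsession)
  // per elementary stream in the file:
  MPEG1or2FileServerDemux* demux =
      MPEG1or2FileServerDemux::createNew(env, fileName, False /*reuseFirstSource*/);
  sms->addSubsession(demux->newVideoServerMediaSubsession());
  sms->addSubsession(demux->newAudioServerMediaSubsession());

  rtspServer->addServerMediaSession(sms);
  return sms;  // its SDP description is then used for the DESCRIBE response
}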
SDP Message Assembly process
ServerMediaSession is responsible for generating the session-level description, while each child session description is generated by MPEG1or2DemuxedServerMediaSubsession. MPEG1or2DemuxedServerMediaSubsession generates its session description in the parent-class member function OnDemandServerMediaSubsession::sdpLines(). Inside sdpLines(), a dummy FramedSource (the concrete classes are MPEG1or2AudioStreamFramer and MPEG1or2VideoStreamFramer) and a dummy RTPSink (the concrete classes are MPEG1or2AudioRTPSink and MPEG1or2VideoRTPSink) are created, and finally the member function setSDPLinesFromRTPSink(...) is called to generate the child session description.
The classes mentioned above and their inheritance relationships:
Medium <- ServerMediaSession
Medium <- ServerMediaSubsession <- OnDemandServerMediaSubsession <- MPEG1or2DemuxedServerMediaSubsession
Medium <- MediaSource <- FramedSource <- FramedFileSource <- ByteStreamFileSource
Medium <- MediaSource <- FramedSource <- MPEG1or2DemuxedElementaryStream
Medium <- MPEG1or2FileServerDemux
Medium <- MPEG1or2Demux
Medium <- MediaSource <- FramedSource <- FramedFilter <- MPEGVideoStreamFramer <- MPEG1or2VideoStreamFramer
Medium <- MediaSink <- RTPSink <- MultiFramedRTPSink <- VideoRTPSink <- MPEG1or2VideoRTPSink
live555 Code Interpretation (3): Processing the SETUP and PLAY Requests
The RTSPClientSession class, mentioned earlier, handles an individual client session. Its member function handleCmd_SETUP() handles the client's SETUP request. It calls parseTransportHeader() to parse the Transport header of the SETUP request, then calls the child session's getStreamParameters() (implemented here in OnDemandServerMediaSubsession) to obtain the transport parameters for sending the media stream. The parameters are assembled into a response message that is returned to the client.
The procedure for obtaining the transport parameters:
getStreamParameters() calls the child session's createNewStreamSource(...) (the concrete implementation class is MPEG1or2DemuxedServerMediaSubsession) to create an MPEG1or2VideoStreamFramer, chooses the send transport parameters, and calls the child session's createNewRTPSink(...) to create an MPEG1or2VideoRTPSink. This information is also stored in a StreamState object, which records the state of the stream.
The client sends two SETUP requests, which establish the RTP reception of the audio and video streams respectively.
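The following illustrative sketch (class name, file name, and bitrate estimate are hypothetical) shows the two virtual functions of the OnDemandServerMediaSubsession pattern that getStreamParameters() relies on; the real MPEG1or2DemuxedServerMediaSubsession obtains its input from the MPEG1or2Demux rather than directly from a file:

#include "liveMedia.hh"

class MyVideoSubsession : public OnDemandServerMediaSubsession {
public:
  MyVideoSubsession(UsageEnvironment& env)
    : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

protected:
  // Called while handling SETUP, to build the source chain:
  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate) {
    estBitrate = 500;  // kbps, a rough estimate
    FramedSource* fileSource =
        ByteStreamFileSource::createNew(envir(), "video.mpv");
    return MPEG1or2VideoStreamFramer::createNew(envir(), fileSource);
  }

  // Called to build the sink that packetizes and sends RTP:
  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char /*rtpPayloadTypeIfDynamic*/,
                                    FramedSource* /*inputSource*/) {
    return MPEG1or2VideoRTPSink::createNew(envir(), rtpGroupsock);
  }
};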
PLAY request processing:
The RTSPClientSession member function handleCmd_PLAY() handles the client's playback request. It first calls the child session's startStream(), which internally calls MediaSink::startPlaying(...), then MultiFramedRTPSink::continuePlaying(), and then MultiFramedRTPSink::buildAndSendPacket(...). buildAndSendPacket() sets up the RTP header and then calls MultiFramedRTPSink::packFrame() to fill in the encoded frame data.
packFrame() calls FramedSource::getNextFrame(), which is followed by MPEGVideoStreamFramer::doGetNextFrame(), then MPEGVideoStreamFramer::continueReadProcessing(), FramedSource::afterGetting(...), MultiFramedRTPSink::afterGettingFrame(...), and MultiFramedRTPSink::afterGettingFrame1(...). After this rather tedious chain of calls, MultiFramedRTPSink::sendPacketIfNecessary() is finally reached, and this is where the RTP packet is actually sent. The time for the next packet is then calculated, and the MultiFramedRTPSink::sendNext(...) function handle is passed to the task scheduler as a delayed-event task. In the main loop, when sendNext() is scheduled, MultiFramedRTPSink::buildAndSendPacket(...) is called again, starting a new round of data sending, so that the client continuously receives RTP packets from the server.
How the interval between RTP packets is calculated: update the time at which the next packet should be sent, based on the duration of the frame that was just packed into it (a simplified sketch follows).
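A simplified, self-contained sketch of this pacing mechanism (the class PacketPacer and the 40 ms frame duration are illustrative; the real logic lives in MultiFramedRTPSink::sendPacketIfNecessary() and sendNext()):

#include "BasicUsageEnvironment.hh"

class PacketPacer {
public:
  PacketPacer(UsageEnvironment& env) : fEnv(env) {}

  void sendPacket() {
    // ... build and transmit one RTP packet here ...
    // Suppose the frame we just packed lasts 40 ms (25 fps video):
    int64_t uSecondsToGo = 40000;
    // When the delay expires, doEventLoop() invokes sendNext(), which calls
    // sendPacket() again, restarting the cycle:
    fEnv.taskScheduler().scheduleDelayedTask(uSecondsToGo, sendNext, this);
  }

private:
  static void sendNext(void* clientData) {  // matches the TaskFunc signature
    ((PacketPacer*)clientData)->sendPacket();
  }

  UsageEnvironment& fEnv;
};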
Some of the classes involved:
MPEGVideoStreamFramer: a filter that breaks up an MPEG video elementary stream into headers and frames
MPEG1or2VideoStreamFramer: a filter that breaks up an MPEG-1 or 2 video elementary stream into frames for: Video_Sequence_Header, GOP_Header, Picture_Header
MPEG1or2DemuxedElementaryStream: an MPEG-1 or 2 elementary stream, demultiplexed from a program stream
MPEG1or2Demux: demultiplexer for an MPEG-1 or 2 program stream
ByteStreamFileSource: a file source that is a plain byte stream (rather than frames)
MPEGProgramStreamParser: class for parsing an MPEG program stream
StreamParser: abstract class for parsing a byte stream
StreamState: a class that represents the state of an ongoing stream
Introduction to RTSP (reposted)
The Real Time Streaming Protocol (RTSP) is an application-layer protocol for efficiently delivering streaming media data over IP networks; it was jointly proposed by RealNetworks and Netscape. RTSP provides an extensible framework for controllable, on-demand delivery of real-time data such as audio and video. The source data can be live feeds or stored files. RTSP gives streaming media controls such as pause and fast-forward, but it does not transmit the media data itself; RTSP acts like a remote control for the streaming media server. The media data can be carried over TCP or UDP at the transport layer, and RTSP also provides some methods that build on the RTP transport mechanism.
RTSP message Format:
RTSP messages fall into two main categories, request messages and response messages, and the two categories have different formats.
Request message:
Method URI RTSP-version CR LF
Message head CR LF CR LF
Message body CR LF
The method includes all the commands listed in the OPTIONS response; the URI is the address of the receiving side, for example rtsp://192.168.20.136.
The RTSP version is generally RTSP/1.0. The CR LF at the end of each line is a carriage return and line feed, which the receiving end must parse accordingly; the last message header must be followed by two CR LF sequences.
Response message:
RTSP-version Status-code Reason-phrase CR LF
Message head CR LF CR LF
Message body CR LF
The RTSP version is generally RTSP/1.0; the status code is a numeric value, where 200 indicates success; the reason phrase is the textual explanation corresponding to the status code.
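As a small illustration (the helper function and its error handling are ad hoc, not part of live555), a response status line such as "RTSP/1.0 200 OK" can be split into its three parts like this:

#include <cstdio>
#include <string>

bool parseStatusLine(const char* line, std::string& version,
                     int& statusCode, std::string& reason) {
  char ver[32] = {0};
  int code = 0;
  int consumed = 0;
  // e.g. "RTSP/1.0 200 OK"
  if (std::sscanf(line, "%31s %d %n", ver, &code, &consumed) < 2) return false;
  version = ver;             // "RTSP/1.0"
  statusCode = code;         // 200 means success
  reason = line + consumed;  // the remainder is the reason phrase ("OK")
  return true;
}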
A simple RTSP interaction process:
C represents the RTSP client, S represents the RTSP server
1. C -> S: OPTIONS request        // C asks S what methods are available
   S -> C: OPTIONS response       // S's response lists all the methods it provides
2. C -> S: DESCRIBE request       // C asks for the media initialization description provided by S
   S -> C: DESCRIBE response      // S returns the media initialization description, mainly SDP
3. C -> S: SETUP request          // C sets the session properties and transport mode and asks S to establish the session
   S -> C: SETUP response         // S establishes the session and returns the session identifier and related information
4. C -> S: PLAY request           // C asks to start playing
   S -> C: PLAY response          // S responds to the request
   S -> C: sends streaming media data
5. C -> S: TEARDOWN request       // C asks to close the session
   S -> C: TEARDOWN response      // S responds to the request
The above is a standard, friendly RTSP flow, but actual implementations do not necessarily follow every step. Steps 3 and 4 are required. Step 1, the OPTIONS request, can be omitted as long as the server and client already agree on which methods are available. In step 2, if the media initialization information can be obtained in some other way (for example, via an HTTP request), the DESCRIBE request in RTSP is not needed. Whether step 5 is needed depends on the design requirements of the system.
Common methods in RTSP:
1. OPTIONS
The purpose is to get the available methods provided by the server:
OPTIONS rtsp://192.168.20.136:5000/xxx666 RTSP/1.0
CSeq: 1          // every message carries a sequence number; the first packet is usually the OPTIONS request
User-Agent: VLC media player (LIVE555 Streaming Media v2005.11.10)
The server's response information includes some of the methods provided, such as:
RTSP/1.0 200 OK
Server: userver 0.9.7_rc1
CSeq: 1          // the CSeq of each response must match the CSeq of the corresponding request
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, SCALE, GET_PARAMETER          // the methods available on the server
2.DESCRIBE
C sends a DESCRIBE request to S in order to obtain the session description information (SDP):
DESCRIBE rtsp://192.168.20.136:5000/xxx666 RTSP/1.0
CSeq: 2
Token
Accept: application/sdp
User-Agent: VLC media player (LIVE555 Streaming Media v2005.11.10)
The server responds with some descriptive information (SDP) about this session:
RTSP/1.0 200 OK
Server: userver 0.9.7_rc1
CSeq: 2
x-prev-url: rtsp://192.168.20.136:5000
x-next-url: rtsp://192.168.20.136:5000
x-Accept-Retransmit: our-retransmit
x-Accept-Dynamic-Rate: 1
Cache-Control: must-revalidate
Last-Modified: Fri, Nov 2006 12:34:38 GMT
Date: Fri, Nov 2006 12:34:38 GMT
Expires: Fri, Nov 2006 12:34:38 GMT
Content-Base: rtsp://192.168.20.136:5000/xxx666/
Content-Length: 344
Content-Type: application/sdp
v=0          // the SDP information starts here
o=onewaveuserverng 1451516402 1025358037 IN IP4 192.168.20.136
s=/xxx666
u=http:///
e=admin@
c=IN IP4 0.0.0.0
t=0 0
a=isma-compliance:1,1.0,1
a=range:npt=0-
m=video 0 RTP/AVP 96          // m= is the media description; what follows describes the video track of the session
a=rtpmap:96 MP4V-ES/90000
a=fmtp:96 profile-level-id=245;config=000001b0f5000001b509000001000000012000c888b0e0e0fa62d089028307
a=control:trackID=0          // trackID=0 indicates that this video stream uses track 0
3.SETUP
The client asks the server to establish a session and sets the transport mode:
SETUP rtsp://192.168.20.136:5000/xxx666/trackID=0 RTSP/1.0
CSeq: 3
Transport: RTP/AVP/TCP;unicast;interleaved=0-1
User-Agent: VLC media player (LIVE555 Streaming Media v2005.11.10)
The trackID=0 in the URI indicates which track is being set up. The Transport parameter sets the transport mode and the packet framing. When RTP is interleaved over the RTSP TCP connection, the second byte of each framed packet header is the interleaved channel number, which differs per channel: for trackID=0 the interleaved values are 0 and 1, where 0 carries RTP packets and 1 carries RTCP packets, so the receiving end can tell the packet types apart by the interleaved value (see the sketch below).
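For reference, here is a small sketch (assuming buf points at one complete interleaved frame) of the RFC 2326 '$' framing that these interleaved channel numbers refer to: a '$' byte, a 1-byte channel id, a 2-byte big-endian length, and then the embedded RTP or RTCP packet:

#include <cstddef>
#include <cstdint>

struct InterleavedFrame {
  uint8_t channel;       // matches the interleaved= channel number (0 = RTP, 1 = RTCP here)
  uint16_t length;       // length of the embedded RTP or RTCP packet
  const uint8_t* data;   // points just past the 4-byte framing header
};

bool parseInterleaved(const uint8_t* buf, size_t size, InterleavedFrame& out) {
  if (size < 4 || buf[0] != '$') return false;       // not an interleaved frame
  out.channel = buf[1];
  out.length  = (uint16_t)((buf[2] << 8) | buf[3]);  // network byte order
  if (size < 4u + out.length) return false;          // frame not complete yet
  out.data = buf + 4;
  return true;
}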
Server Response Information:
RTSP/1.0 200 OK
Server: userver 0.9.7_rc1
CSeq: 3
Session: 6310936469860791894          // the session identifier returned by the server
Cache-Control: no-cache
Transport: RTP/AVP/TCP;unicast;interleaved=0-1;ssrc=6B8B4567
4.PLAY
The client sends a playback request:
PLAY rtsp://192.168.20.136:5000/xxx666 RTSP/1.0
CSeq: 4
Session: 6310936469860791894
Range: npt=0.000-          // sets the playback time range
User-Agent: VLC media player (LIVE555 Streaming Media v2005.11.10)
Server Response Information:
RTSP/1.0 200 OK
Server: userver 0.9.7_rc1
CSeq: 4
Session: 6310936469860791894
Range: npt=0.000000-
RTP-Info: url=trackID=0;seq=17040;rtptime=1467265309
seq and rtptime are values taken from the RTP packets themselves: the sequence number and the RTP timestamp.
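To show where those values come from, here is a sketch of reading the relevant fields of the RTP fixed header (RFC 3550); the struct is illustrative and the byte-level parsing is written out explicitly:

#include <cstdint>

struct RtpHeaderFields {
  uint16_t sequenceNumber;  // corresponds to "seq=..." in RTP-Info
  uint32_t timestamp;       // corresponds to "rtptime=..." in RTP-Info
  uint32_t ssrc;
};

// buf must point at a complete 12-byte RTP fixed header.
RtpHeaderFields readRtpHeader(const uint8_t* buf) {
  RtpHeaderFields h;
  h.sequenceNumber = (uint16_t)((buf[2] << 8) | buf[3]);
  h.timestamp = ((uint32_t)buf[4] << 24) | ((uint32_t)buf[5] << 16) |
                ((uint32_t)buf[6] << 8) | buf[7];
  h.ssrc = ((uint32_t)buf[8] << 24) | ((uint32_t)buf[9] << 16) |
           ((uint32_t)buf[10] << 8) | buf[11];
  return h;
}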
5.TEARDOWN
The client sends a request to close the session:
TEARDOWN rtsp://192.168.20.136:5000/xxx666 RTSP/1.0
CSeq: 5
Session: 6310936469860791894
User-Agent: VLC media player (LIVE555 Streaming Media v2005.11.10)
Server response:
RTSP/1.0 200 OK
Server: userver 0.9.7_rc1
CSeq: 5
Session: 6310936469860791894
Connection: Close
The methods above are the ones most commonly used in the interaction; there are other important methods such as GET_PARAMETER, SET_PARAMETER, PAUSE, REDIRECT, and so on.
Appendix
The format of the SDP
Session description:
v=<version>                                        (protocol version)
o=<username> <session id> <version> <network type> <address type> <address>   (owner/creator and session identifier)
s=<session name>                                   (session name)
i=* <session description>                          (session information)
u=* <URI>                                          (URI of the description)
e=* <email address>                                (email address)
p=* <phone number>                                 (phone number)
c=* <network type> <address type> <connection address>   (connection information)
b=* <modifier>:<bandwidth-value>                   (bandwidth information)
Time description:
t=<start time> <stop time>                         (time the session is active)
r=* <repeat interval> <active duration> <list of offsets from start-time>   (zero or more repeat times)
z=* <adjustment time> <offset> <adjustment time> <offset> ...   (time zone adjustments)
k=* <method> or k=<method>:<encryption key>        (encryption key)
a=* <attribute> or a=<attribute>:<value>           (zero or more session attribute lines)
Media description:
m=<media> <port> <transport> <fmt list>            (media name and transport address)
i=* (media title)
c=* (connection information; optional here if it is included at the session level)
b=* (bandwidth information)
k=* (encryption key)
a=* (zero or more media attribute lines)
Fields marked with * are optional.
Reference articles: RFC 2326 (RTSP); RFC 2327 (SDP)
An example RTSP on-demand message flow
(Client: VLC; RTSP server: LIVE555 Media Server)
1) C (client) -> M (media server)
OPTIONS rtsp://192.168.1.109/1.mpg RTSP/1.0
CSeq: 1
User-Agent: VLC media player (LIVE555 Streaming Media v2007.02.20)
1) M -> C
RTSP/1.0 200 OK
CSeq: 1
Date: Wed, Feb 2008 07:13:24 GMT
Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE
2) C -> M
DESCRIBE rtsp://192.168.1.109/1.mpg RTSP/1.0
CSeq: 2
Accept: application/sdp
User-Agent: VLC media player (LIVE555 Streaming Media v2007.02.20)
2) M -> C
RTSP/1.0 200 OK
CSeq: 2
Date: Wed, Feb 2008 07:13:25 GMT
Content-Base: rtsp://192.168.1.109/1.mpg/
Content-Type: application/sdp
Content-Length: 447
v=0
o=- 2284269756 1 IN IP4 192.168.1.109
s=MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server
i=1.mpg
t=0 0
a=tool:LIVE555 Streaming Media .02.08
a=type:broadcast
a=control:*
a=range:npt=0-66.181
a=x-qt-text-nam:MPEG-1 or 2 Program Stream, streamed by the LIVE555 Media Server
a=x-qt-text-inf:1.mpg
m=video 0 RTP/AVP
c=IN IP4 0.0.0.0
a=control:track1
m=audio 0 RTP/AVP
c=IN IP4 0.0.0.0
a=control:track2
3) C -> M
SETUP rtsp://192.168.1.109/1.mpg/track1 RTSP/1.0
CSeq: 3
Transport: RTP/AVP;unicast;client_port=1112-1113
User-Agent: VLC media player (LIVE555 Streaming Media v2007.02.20)
3) M -> C
RTSP/1.0 200 OK
CSeq: 3
Date: Wed, Feb 2008 07:13:25 GMT
Transport: RTP/AVP;unicast;destination=192.168.1.222;source=192.168.1.109;client_port=1112-1113;server_port=6970-6971
Session: 3
4) C -> M
SETUP rtsp://192.168.1.109/1.mpg/track2 RTSP/1.0
CSeq: 4
Transport: RTP/AVP;unicast;client_port=1114-1115
Session: 3
User-Agent: VLC media player (LIVE555 Streaming Media v2007.02.20)
4) M -> C
RTSP/1.0 200 OK
CSeq: 4
Date: Wed, Feb 2008 07:13:25 GMT
Transport: RTP/AVP;unicast;destination=192.168.1.222;source=192.168.1.109;client_port=1114-1115;server_port=6972-6973
Session: 3
5) C -> M
PLAY rtsp://192.168.1.109/1.mpg/ RTSP/1.0
CSeq: 5
Session: 3
Range: npt=0.000-
User-Agent: VLC media player (LIVE555 Streaming Media v2007.02.20)
5) M -> C
RTSP/1.0 200 OK
CSeq: 5
Range: npt=0.000-
Session: 3
RTP-Info: url=rtsp://192.168.1.109/1.mpg/track1;seq=9200;rtptime=214793785,url=rtsp://192.168.1.109/1.mpg/track2;seq=12770;rtptime=31721
(The server then starts transmitting the streaming media data ...)