Live streaming is no longer unfamiliar. Below is a comparison chart of today's mainstream live-streaming protocols, along with some personal insights.
| Protocol | HTTP-FLV | RTMP | HLS | DASH |
| --- | --- | --- | --- | --- |
| Transport layer | HTTP stream | TCP stream | HTTP | HTTP |
| Video format | FLV | FLV tag | TS file | MP4 / 3GP / WebM |
| Latency | low | low | very high | high |
| Data segmentation | continuous stream | continuous stream | sliced files | sliced files |
| HTML5 playback | yes, by demuxing in JS (flv.js) | not supported | yes, by demuxing in JS (hls.js) | directly playable if the segment list is MP4/WebM |
HTTP-FLV & RTMP
The two protocols actually transmit the same data: the tags of an FLV file. HTTP-FLV is effectively an infinitely long HTTP stream file; compared with RTMP it can only pull the live stream, while RTMP also supports publishing (pushing) the stream and other operations. HTTP's advantage is that it is ordinary port-80 HTTP traffic, so it penetrates firewalls easily, whereas RTMP is a non-open protocol.
These two protocols are the mainstream on live-streaming platforms today, mainly because of their low latency.
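As an illustration, here is a minimal sketch (not from any particular server) of the response an HTTP-FLV server can send before the endless FLV body: a plain HTTP reply with no fixed length, so the connection stays open as long as the broadcast runs. `send_all` is a hypothetical socket helper.

```cpp
#include <cstddef>
#include <string>

void send_all(int fd, const void* buf, std::size_t len); // hypothetical socket helper

void send_http_flv_response_header(int client_fd) {
    const std::string header =
        "HTTP/1.1 200 OK\r\n"
        "Content-Type: video/x-flv\r\n"     // FLV carried over ordinary HTTP
        "Transfer-Encoding: chunked\r\n"    // no Content-Length: the stream never ends
        "Connection: keep-alive\r\n"
        "\r\n";
    send_all(client_fd, header.data(), header.size());
    // ...the FLV bytes follow for as long as the broadcast lasts
}
```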
HLS
HLS is a live-streaming protocol launched by Apple in which the video stream is sliced into file segments. The client first requests an m3u8 file, which may list streams at different bitrates, or directly a list of TS files; the client then plays the TS files sequentially from the given addresses. For live streaming, the client keeps re-requesting the m3u8 file to check whether new TS slices have been appended to the list.
The main disadvantage of this way of broadcasting is that the latency is too large; the minimum latency is the duration of a single TS file.
DASH
DASH actually works much like HLS, but it is not limited to MPEG-TS files: DASH supports a variety of slice formats, such as MP4 slices. When slicing to MP4, the client can play the segments directly in HTML5 under JS control. In the same way as HLS, DASH suffers from latency.
How does HTTP-FLV play?
Here we mainly study HTTP-FLV and HLS. Looking at several mainstream web live-streaming platforms, almost all of them rely primarily on HTTP-FLV for live broadcasts. So how does the popular HTTP-FLV actually achieve live streaming?
First of all, we know that almost all mainstream media formats carry H264-encoded video. The FLV data in today's HTTP-FLV live streams is likewise mainly H264 plus AAC. FLV's packaging unit is the tag; a tag can be an audio tag, a video tag, a script tag, or another type.
It is worth noting that byte order in FLV is network byte order (big-endian).
Format of FLV:
flv header + [script tag (metadata)] + [first video tag (H264 SPS/PPS)] + [first audio tag (AAC header)] + [second video tag (first H264 keyframe)] + alternating audio and video tags from then on.
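For concreteness, here is a minimal sketch of the 9-byte FLV file header that starts this sequence, plus the 4-byte PreviousTagSize0 that immediately follows it (always 0):

```cpp
#include <cstdint>
#include <vector>

std::vector<uint8_t> make_flv_header(bool has_audio, bool has_video) {
    std::vector<uint8_t> h = {
        'F', 'L', 'V',              // signature
        0x01,                       // version
        0x00,                       // flags: bit 0 = video present, bit 2 = audio present
        0x00, 0x00, 0x00, 0x09,     // header size, big-endian, always 9
        0x00, 0x00, 0x00, 0x00      // PreviousTagSize0, always 0
    };
    if (has_audio) h[4] |= 0x04;
    if (has_video) h[4] |= 0x01;
    return h;
}
```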
Format of a tag:
type [1 byte] + body size [3 bytes] + timestamp [4 bytes] + stream id [3 bytes] + [body data] + [previous tag size, 4 bytes]
The 4-byte timestamp field is actually laid out as [timestamp, 3 bytes][timestamp extended, 1 byte].
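A sketch of writing this 11-byte tag header; every multi-byte field is big-endian, and note how the timestamp is split into its lower 24 bits followed by an extension byte:

```cpp
#include <cstdint>
#include <vector>

void append_tag_header(std::vector<uint8_t>& out,
                       uint8_t type,           // 8 = audio, 9 = video, 18 = script
                       uint32_t body_size,     // must fit in 24 bits
                       uint32_t timestamp_ms) {
    out.push_back(type);
    out.push_back((body_size >> 16) & 0xFF);    // body size, 3 bytes
    out.push_back((body_size >> 8)  & 0xFF);
    out.push_back(body_size & 0xFF);
    out.push_back((timestamp_ms >> 16) & 0xFF); // timestamp, lower 24 bits first
    out.push_back((timestamp_ms >> 8)  & 0xFF);
    out.push_back(timestamp_ms & 0xFF);
    out.push_back((timestamp_ms >> 24) & 0xFF); // then the extension byte (bits 24..31)
    out.push_back(0); out.push_back(0); out.push_back(0); // stream id, always 0
    // body data follows, then PreviousTagSize = 11 + body_size (4 bytes, big-endian)
}
```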
Body of an H264 video tag:
The H264 stored here is not a raw NAL stream; it is stored in the tag body as:
[frame type / keyframe flag + codec id, 1 byte] + 0x01 + [composition time, 3 bytes] + [H264 NALU size, 4 bytes] + [NALU data]
The composition time is the offset between the DTS and PTS produced by the H264 encoder.
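A sketch of building such a video tag body; the NALU is length-prefixed (AVCC style), and `composition_time` is the PTS minus DTS offset in milliseconds described above:

```cpp
#include <cstdint>
#include <vector>

std::vector<uint8_t> make_avc_nalu_body(bool keyframe, int32_t composition_time,
                                        const uint8_t* nalu, uint32_t nalu_size) {
    std::vector<uint8_t> body;
    body.push_back(keyframe ? 0x17 : 0x27);          // frame type (1=key, 2=inter) | codec id 7 (AVC)
    body.push_back(0x01);                            // AVCPacketType: 1 = NALU (0 = sequence header)
    body.push_back((composition_time >> 16) & 0xFF); // composition time, 3 bytes
    body.push_back((composition_time >> 8)  & 0xFF);
    body.push_back(composition_time & 0xFF);
    body.push_back((nalu_size >> 24) & 0xFF);        // 4-byte NALU length
    body.push_back((nalu_size >> 16) & 0xFF);
    body.push_back((nalu_size >> 8)  & 0xFF);
    body.push_back(nalu_size & 0xFF);
    body.insert(body.end(), nalu, nalu + nalu_size); // the NALU itself
    return body;
}
```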
Body of an AAC audio tag:
0xAF + 0x01 + raw AAC data
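A corresponding sketch for the audio tag body. 0xAF encodes AAC (codec 10) with the sample-rate, sample-size, and stereo flags set; the second byte distinguishes the configuration record of the first audio tag (0x00) from raw frames (0x01):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

std::vector<uint8_t> make_aac_body(bool is_config,   // true only for the first audio tag
                                   const uint8_t* data, std::size_t size) {
    std::vector<uint8_t> body = {0xAF, static_cast<uint8_t>(is_config ? 0x00 : 0x01)};
    body.insert(body.end(), data, data + size);      // AudioSpecificConfig or raw AAC frame
    return body;
}
```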
That covers the FLV format in detail. As you can see, the format is simple and the coupling between consecutive packets is very small: once we have the audio header and the video header, playback can start from any subsequent keyframe.
Of course, for HTTP-FLV to play back normally the metadata must not be missing. It lives in the first script tag and specifies the resolution and the audio and video encoding formats.
HTTP-FLV broadcasting is really just sending FLV tags to the client: send the key FLV tags first, and make the first frame that follows a keyframe.
Suppose the publishing client is streaming software such as OBS, pushing to the server over RTMP. After the initial handshake and createStream complete, the server sends the publish result command to OBS, the metadata exchange finishes, and a series of data follows. OBS then starts pushing FLV tag data to the server, the live broadcast begins, and the server has its FLV data.
What does the server do when a client wants to fetch the live data, for example over HTTP-FLV, to watch the stream?
The server first sends the initial FLV tags: header + metadata + SPS/PPS + AAC header. Once these tags are sent, the server looks through the live tag stream for the latest video keyframe tag and starts sending data from that keyframe. Why? Because the video stream consists of interdependent I/B/P frames: an I-frame is complete on its own, while decoding B and P frames requires the I-frame and preceding frames. So valid video data must be sent starting from an I-frame. This is where the GOP interval matters, and it is also the principle behind RTMP's low-latency instant startup.
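Putting this together, here is a sketch of that server-side relay logic (all types and helpers are hypothetical, not from the program described later): cache the header tags plus the tags since the latest keyframe, and replay them to each new viewer.

```cpp
#include <cstdint>
#include <deque>
#include <vector>

void send_all(int fd, const std::vector<uint8_t>& bytes); // hypothetical socket helper

struct FlvTag { bool is_video = false, is_keyframe = false; std::vector<uint8_t> bytes; };

struct LiveStream {
    std::vector<uint8_t> flv_header, metadata_tag, sps_pps_tag, aac_header_tag;
    std::deque<FlvTag> gop_cache;              // tags since (and including) the latest keyframe
};

void on_publisher_tag(LiveStream& s, FlvTag tag) {
    if (tag.is_video && tag.is_keyframe)
        s.gop_cache.clear();                   // a new GOP starts: drop the old one
    s.gop_cache.push_back(std::move(tag));     // ...and also forward to connected viewers
}

void on_new_viewer(const LiveStream& s, int client_fd) {
    send_all(client_fd, s.flv_header);         // FLV file header
    send_all(client_fd, s.metadata_tag);       // script tag: resolution, codecs, ...
    send_all(client_fd, s.sps_pps_tag);        // AVC sequence header (SPS/PPS)
    send_all(client_fd, s.aac_header_tag);     // AudioSpecificConfig
    for (const FlvTag& t : s.gop_cache)        // starts at an I-frame, so decodable at once
        send_all(client_fd, t.bytes);
}
```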
At this point the HTTP-FLV client can receive the FLV stream data and decode it to watch the live broadcast.
How does HLS play?
HLS is comparatively simple and crude: the server takes the live H264 and AAC data and encapsulates the slices into TS files. When a client wants the live stream, it first requests the m3u8 file. The following is an example m3u8 file:
    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:5      (maximum TS duration: 5 s)
    #EXT-X-MEDIA-SEQUENCE:2      (sequence number of the first TS file)
    #EXTINF:4.993,               (first TS file: duration 4.993 s, URL address below)
    /hls/2.ts
    #EXTINF:4.034,
    /hls/3.ts
    #EXTINF:4.980,
    /hls/4.ts
For live playback, the client keeps re-requesting this m3u8 file; when new TS files appear in the list, the client requests them and appends them to the local playback sequence.
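A sketch of one pass of that refresh loop, assuming hypothetical helpers `http_get` and `enqueue_ts`: fetch the playlist again and enqueue only the TS URLs not played yet.

```cpp
#include <set>
#include <sstream>
#include <string>

std::string http_get(const std::string& url);  // hypothetical HTTP helper
void enqueue_ts(const std::string& url);       // hypothetical: append to the play queue

void refresh_playlist(const std::string& m3u8_url, std::set<std::string>& seen) {
    std::istringstream in(http_get(m3u8_url));
    std::string line;
    while (std::getline(in, line)) {
        if (!line.empty() && line.back() == '\r') line.pop_back();
        if (line.empty() || line[0] == '#') continue;  // skip directives/comments
        if (seen.insert(line).second)                  // first time we see this TS?
            enqueue_ts(line);
    }
}
```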
A few words on TS encapsulation: the TS container is more complex than FLV. The basic data unit is the TS packet; every packet carries a PID and has a fixed size, normally 188 bytes (when no FEC/CRC trailer is used). Three packet types matter here: PAT, PMT, and PES. The PAT is the entry point: when parsing, look through the packet list for the packet whose PID is 0x0, which is the PAT. The PAT records the PID of the PMT packet, and the PMT in turn records the PIDs of the elementary streams, for example audio packets on PID 0x102 and video packets on PID 0x101. The packets on those PIDs are PES packets; parsing the PES packets and reassembling the elementary streams yields data that can be decoded and played.
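A sketch of the packet walk this describes; real parsing must also handle adaptation fields, section pointers, and CRCs, so this only shows the 188-byte stride and PID extraction (the PMT PID would then be read out of the PAT's section data):

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>

constexpr std::size_t TS_PACKET_SIZE = 188;

uint16_t ts_pid(const uint8_t* pkt) {          // PID: 13 bits spread over bytes 1 and 2
    return static_cast<uint16_t>(((pkt[1] & 0x1F) << 8) | pkt[2]);
}

void scan_ts(const uint8_t* data, std::size_t size) {
    for (std::size_t off = 0; off + TS_PACKET_SIZE <= size; off += TS_PACKET_SIZE) {
        const uint8_t* pkt = data + off;
        if (pkt[0] != 0x47) continue;          // lost sync; a real parser would resync
        uint16_t pid = ts_pid(pkt);
        if (pid == 0x0000)
            std::printf("PAT at offset %zu\n", off);  // entry point: lists the PMT PID
        else
            std::printf("PID 0x%03X at offset %zu\n", pid, off);
    }
}
```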
A small program
Knowing how live playback works, I wrote a small program. It records the local desktop, the system audio output, and the microphone, encodes them as H264 and AAC, and uses IOCP to build a simple local server that provides a web service and a live-streaming service, supporting both HTTP-FLV and HLS.
The following is the architecture diagram of the program:
Run:
After startup, fill in the port number and bitrate, then select the live mode; if the screen comes out black, switch to the low-level API mode.
Three libraries used by the program:
libx264: video encoding
libfaac: audio encoding
libswscale: BGRA to YUV420 conversion (see the sketch below)
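The swscale conversion is roughly as follows (a sketch: error handling omitted, and in a real capture loop the SwsContext would be created once and reused rather than per frame):

```cpp
extern "C" {
#include <libswscale/swscale.h>
#include <libavutil/pixfmt.h>
}
#include <cstdint>

void bgra_to_yuv420(const uint8_t* bgra, int width, int height,
                    uint8_t* dst_data[4], int dst_stride[4]) {
    SwsContext* ctx = sws_getContext(width, height, AV_PIX_FMT_BGRA,
                                     width, height, AV_PIX_FMT_YUV420P,
                                     SWS_BILINEAR, nullptr, nullptr, nullptr);
    const uint8_t* src[1]   = { bgra };
    const int src_stride[1] = { 4 * width };   // 4 bytes per BGRA pixel
    sws_scale(ctx, src, src_stride, 0, height, dst_data, dst_stride);
    sws_freeContext(ctx);
}
```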
Download (1.97 MB): http://files.cnblogs.com/files/luconsole/DesktopLiveStreaming.zip