HTTP Adaptive Streaming is abbreviated HAS; its Chinese name translates to "bitrate adaptive". With the growth in demand for video playback on mobile devices in recent years, HAS technology has gradually become a hot topic.
A HAS streaming media server needs to prepare media data at several bitrates and then cut the media data of each bitrate into slices. Every slice has a fixed length, generally 2-10 seconds, and each slice is composed of complete GOP sequences. A GOP sequence contains one or more I-frames, and the first frame of a GOP sequence must be an I-frame, so each slice can be decoded and displayed independently. An index file must be created for the slices of each bitrate so that the client can locate slices for playback.
A HAS streaming media player needs to choose slice files of different bitrates for playback according to its network bandwidth. The player first downloads the slice index information for all bitrates and then selectively plays slice files; if bandwidth tightens along the way, it can switch to a lower-bitrate slice file at any time. Because the streaming server has already time-synchronized and image-synchronized the slice files of the different bitrates, they differ only in bitrate and resolution, and therefore in the network bandwidth they require.
In essence, HAS means the streaming server prepares streams at several bitrates, all carrying the same audiovisual content over the same timeline, while the client player schedules among the different streams in step with changes in network bandwidth, achieving stutter-free playback as bandwidth varies. HAS needs close cooperation between the server side and the client side to show its advantages.
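The client-side scheduling described above can be sketched in Python. This is an illustrative heuristic, not part of any HAS specification; the function name and the 0.8 safety margin are assumptions:

```python
def select_bitrate(available_bitrates, measured_bandwidth_bps, safety=0.8):
    """Pick the highest bitrate that fits within a fraction of the
    measured bandwidth, leaving headroom for bandwidth fluctuation."""
    usable = safety * measured_bandwidth_bps
    candidates = [b for b in sorted(available_bitrates) if b <= usable]
    # If even the lowest bitrate exceeds the usable bandwidth, fall back
    # to the lowest one rather than stop playback.
    return candidates[-1] if candidates else min(available_bitrates)
```

A real player would smooth the bandwidth estimate over several slice downloads and also consider buffer occupancy before switching.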
There are currently four main implementations of HAS: Apple HTTP Live Streaming, Microsoft Smooth Streaming, Adobe HTTP Dynamic Streaming, and MPEG DASH.
Traditional streaming media transmission generally uses RTP/RTSP/RTCP, whereas HAS is based on HTTP, with the following advantages:
1. There are many HTTP web servers (Lighttpd, Nginx, Apache, IIS), and the proxy caching mechanisms of web servers allow highly concurrent streaming media distribution;
2. The player dynamically switches among media sources of different bitrates according to the network bandwidth, giving good adaptability;
3. HAS standards are driven by companies with strong technical strength, making them reliable and practical.
HAS technology grew out of Internet video; applied to traditional industries, it can also bring good results.
I. Apple HTTP Live Streaming (HLS)
The file slice format is TS;
the index file is M3U8;
live and time shift are supported;
it mainly targets the iPhone/iPad video player, but players on the Android platform also support the protocol, and on the PC it can be tested directly with VLC;
the HLS protocol is relatively simple and can be implemented by yourself on any platform.
Slice files of the HLS protocol on the streaming media server:
An m3u8 file stores a list of TS files; contents of 1.m3u8:
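The listing itself is not reproduced in this copy; a minimal media playlist in standard HLS syntax might look like the following (segment names and durations are hypothetical):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST
```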
playlist.m3u8 stores the bitrates of the 1.m3u8 and 2.m3u8 slice lists; contents of playlist.m3u8:
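This listing is not reproduced in this copy; a typical variant playlist in standard HLS syntax might look like the following (the BANDWIDTH values are hypothetical):

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1280000
1.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=640000
2.m3u8
```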
HLS Protocol Standard Documentation:
http://tools.ietf.org/html/draft-pantos-http-live-streaming-11
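A variant playlist like playlist.m3u8 above can be parsed with a short Python sketch. The function name is an assumption, and this only handles the EXT-X-STREAM-INF/URI line pairs; a real client must implement the full grammar from the specification:

```python
def parse_master_playlist(text):
    """Return (bandwidth, uri) pairs from an HLS variant playlist.

    Minimal sketch: attribute lists containing quoted commas
    (e.g. CODECS="avc1,mp4a") are not handled here.
    """
    variants = []
    pending_bw = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-STREAM-INF:"):
            # Scan the attribute list for the BANDWIDTH value.
            for part in line.split(":", 1)[1].split(","):
                key, _, value = part.partition("=")
                if key == "BANDWIDTH":
                    pending_bw = int(value)
        elif line and not line.startswith("#") and pending_bw is not None:
            # The first non-comment line after EXT-X-STREAM-INF is the URI.
            variants.append((pending_bw, line))
            pending_bw = None
    return variants
```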
II. Microsoft Smooth Streaming
The file slice format is MP4;
the index file is ISM/ISMC;
live and time shift are supported;
it mainly targets the Windows Phone video player;
IIS 7 is required as the web server, although Nginx and Apache also have third-party modules that support the protocol; the player needs to use Silverlight technology.
The video storage file list is:
LiveSmoothStream.ism
LiveSmoothStream.ismc
Stream101.isma
Stream201.ismv
Stream202.ismv
The contents of the LiveSmoothStream.ism file are:
<?xml version="1.0" encoding="utf-16"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <meta name="clientManifestRelativePath" content="LiveSmoothStream.ismc"/>
  <body>
    <switch>
      <audio src="Stream101.isma" systemBitrate="64000" systemLanguage="eng">
        <param name="manifestOutput" value="false" valueType="data"/>
        <param name="trackID" value="101" valueType="data"/>
        <param name="trackName" value="audio101_eng" valueType="data"/>
      </audio>
      <video src="Stream201.ismv" systemBitrate="1200000" systemLanguage="und">
        <param name="manifestOutput" value="false" valueType="data"/>
        <param name="trackID" value="201" valueType="data"/>
        <param name="trackName" value="video" valueType="data"/>
      </video>
      <video src="Stream202.ismv" systemBitrate="600000" systemLanguage="und">
        <param name="manifestOutput" value="false" valueType="data"/>
        <param name="trackID" value="202" valueType="data"/>
        <param name="trackName" value="video" valueType="data"/>
      </video>
    </switch>
  </body>
</smil>
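As an illustration, the bitrate ladder in such an ISM file can be read with Python's standard library. The embedded sample and function name are illustrative; attribute names follow the IIS Smooth Streaming convention (systemBitrate):

```python
import xml.etree.ElementTree as ET

ISM_SAMPLE = """<?xml version="1.0"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <body>
    <switch>
      <video src="Stream201.ismv" systemBitrate="1200000"/>
      <video src="Stream202.ismv" systemBitrate="600000"/>
    </switch>
  </body>
</smil>"""

# The smil default namespace must be given explicitly to findall().
NS = {"smil": "http://www.w3.org/2001/SMIL20/Language"}

def list_video_bitrates(ism_text):
    """Return (src, bitrate) pairs for every <video> entry in the manifest."""
    root = ET.fromstring(ism_text)
    return [(v.get("src"), int(v.get("systemBitrate")))
            for v in root.findall(".//smil:video", NS)]
```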
III. Adobe HTTP Dynamic Streaming (HDS)
The file slice format is FLV/F4V/MP4;
the index file is F4M (the top-level F4M here is just an index of the stream files below; each stream file can in turn be fragmented, yielding more fine-grained fragment index information);
live and time shift are supported.
The video storage file list is:
hds_sample1_manifest.f4m
sample1_150kbps.f4v
sample1_700kbps.f4v
sample1_1000kbps.f4v
The contents of the hds_sample1_manifest.f4m file are:
<manifest xmlns="http://ns.adobe.com/f4m/2.0">
  <media href="../hds-vod/sample1_150kbps.f4v.f4m" bitrate="150"/>
  <media href="../hds-vod/sample1_700kbps.f4v.f4m" bitrate="700"/>
  <media href="../hds-vod/sample1_1000kbps.f4v.f4m" bitrate="1000"/>
</manifest>
IV. MPEG Dynamic Adaptive Streaming over HTTP (MPEG DASH)
The file slice format is fragmented MP4 or TS;
the index file is the MPD (Media Presentation Description);
an open-source packager is DASHEncoder: https://github.com/slederer/dashencoder
DASHEncoder depends on x264, FFmpeg, MP4Box, and the MySQL client libraries.
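For comparison with the other index formats above, a minimal static MPD might look like the following. This is an illustrative skeleton (IDs, resolutions, and durations are hypothetical), not a complete, playable manifest:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
     type="static"
     mediaPresentationDuration="PT30S"
     profiles="urn:mpeg:dash:profile:isoff-on-demand:2011">
  <Period>
    <AdaptationSet mimeType="video/mp4" segmentAlignment="true">
      <Representation id="video-600k" bandwidth="600000" width="854" height="480"/>
      <Representation id="video-1200k" bandwidth="1200000" width="1280" height="720"/>
    </AdaptationSet>
  </Period>
</MPD>
```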
Play protocol                 | Play URL
Microsoft Smooth Streaming    | http://www.example.com/LiveSmoothStream.isml/Manifest
Apple HTTP Live Streaming     | http://www.example.com/video.m3u8
Adobe HTTP Dynamic Streaming  | http://www.example.com/video.f4m
MPEG DASH                     | http://www.example.com/video.mpd