In PHP 5.1.6, 4.4.4, and earlier versions, a possible buffer overflow can be triggered when UTF-8 encoding is selected while looking up the associated character encoding for htmlspecialchars() and htmlentities().
"While we are were
Reproduced from: http://www.cnblogs.com/my_life/articles/5593892.html. Thanks to the blogger for the generous contribution.
The following content is reproduced:
http://blog.chinaunix.net/uid-26000296-id-4932817.html
http://blog.chinaunix.net/uid-26000296-id-4932822.html
http://blog.csdn.net/zhangxinrun/article/details/50739237
In live streaming applications, RTMP and HLS can basically cover all viewing clients.
http://www.cnblogs.com/haibindev/archive/2013/01/30/2880764.html
I suddenly realized that I had not written a blog post for more than half a year and felt rather ashamed of it. In the second half of 2012, one thing after another happened at home and there was simply no time. With the New Year approaching, I have finally found a quiet moment to summarize my recent technical work into articles and share them with you. Some days ago, a project required it, so I spent some time studying HTTP Live Streaming (HLS) technology.
1. DASH Introduction
DASH, also known as MPEG-DASH (Dynamic Adaptive Streaming over HTTP), is a streaming technology that delivers video over the Internet at a dynamically adapting bitrate, similar to Apple's HLS. DASH uses a Media Presentation Description (MPD) to slice the video content into short file fragments, each available at several different bitrates; the DASH client can choose which bitrate to play based on current network conditions, enabling seamless switching between bitrates according to its needs.
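To make the bitrate-switching idea concrete, here is a minimal Python sketch (not from any DASH specification) of how a client might pick a representation: measure recent throughput and choose the highest bitrate that fits under it with a safety margin. The representation list and the 0.8 safety factor are made-up assumptions for illustration.

```python
def pick_bitrate(measured_kbps: float,
                 available_kbps=(350, 800, 1500, 3000, 6000),
                 safety: float = 0.8) -> int:
    """Pick the highest representation that fits under measured throughput
    times a safety margin (an illustrative rule, not the DASH standard)."""
    budget = measured_kbps * safety
    candidates = [b for b in available_kbps if b <= budget]
    return max(candidates) if candidates else min(available_kbps)

# Example: with about 2.2 Mbps of measured throughput the client picks 1500 kbps.
print(pick_bitrate(2200))  # -> 1500
```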
Live stream slicing tool (stream segmenter)
The live stream slicing tool reads live data from the network, segments it in real time, and outputs HLS-compliant live streams to the Internet. It generally receives the TS stream output by an encoder or another system over the UDP protocol and divides that TS stream in real time into small files of a fixed playback length.
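As a rough illustration only, the Python sketch below receives an MPEG-TS stream over UDP and rotates output files by wall-clock time. A real segmenter cuts on keyframe/PAT/PMT boundaries and also maintains the m3u8 playlist; the port number, segment length, and file names here are assumptions.

```python
import socket
import time

UDP_PORT = 1234          # assumed port the encoder sends the TS stream to
SEGMENT_SECONDS = 10     # assumed fixed segment duration
TS_PACKET = 188          # an MPEG-TS packet is always 188 bytes

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))

seq = 0
segment = open(f"segment-{seq}.ts", "wb")
started = time.time()

while True:
    data, _ = sock.recvfrom(7 * TS_PACKET)   # encoders commonly send 7 TS packets per datagram
    segment.write(data)
    if time.time() - started >= SEGMENT_SECONDS:
        segment.close()                      # a real tool would now add this segment to the m3u8
        seq += 1
        segment = open(f"segment-{seq}.ts", "wb")
        started = time.time()
```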
Program framework and implementation
From the above analysis, the logic and flow of the HLS live encoder (HLS LiveEncoder) are basically clear: start separate audio and video capture threads using DirectShow (or another capture) technology, then call libx264 and libfaac for video and audio encoding respectively. After the two encoding threads have encoded the audio and video data in real time, the output is multiplexed into MPEG-TS segments for storage.
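As a sketch of that thread layout only, the Python below wires two capture-and-encode threads into a single muxing thread through a queue. The capture_*, encode_* and mux functions are hypothetical stand-ins for DirectShow capture, libx264/libfaac calls and a TS muxer; only the threading/queue structure reflects the design described above.

```python
import queue
import threading
import time

ts_q = queue.Queue()   # encoded audio/video packets waiting to be muxed into MPEG-TS

# --- Hypothetical stand-ins; real code would call DirectShow, libx264 and libfaac ---
def capture_video_frame() -> bytes:
    time.sleep(1 / 25)            # pretend we capture at 25 fps
    return b"raw-yuv-frame"

def capture_audio_block() -> bytes:
    time.sleep(1024 / 44100)      # pretend we capture one AAC frame worth of PCM
    return b"raw-pcm-block"

def encode_h264(raw: bytes) -> bytes:
    return b"h264-nal"            # stand-in for a libx264 encode call

def encode_aac(pcm: bytes) -> bytes:
    return b"aac-frame"           # stand-in for a libfaac encode call

# --- Thread layout described in the text: two encoder threads feeding one TS muxer ---
def video_worker() -> None:
    while True:
        ts_q.put(("video", encode_h264(capture_video_frame())))

def audio_worker() -> None:
    while True:
        ts_q.put(("audio", encode_aac(capture_audio_block())))

def mux_worker() -> None:
    while True:
        kind, payload = ts_q.get()
        # A real muxer would wrap payloads into 188-byte TS packets and cut segments here.
        print(f"mux {kind}: {len(payload)} bytes")

for fn in (video_worker, audio_worker, mux_worker):
    threading.Thread(target=fn, daemon=True).start()

time.sleep(1)   # let the sketch run briefly
```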
Live streaming is a process of "consuming while transmitting". This means that we need a closer look at the intermediate step (encoding) that turns the original content elements (images and audio) into the finished product (a video file or stream).
Video encoding and compression
Let us understand video coding and compression in an intuitive way. To make video content easier to store and transmit, its volume usually has to be reduced, that is, the original content elements have to be encoded and compressed.
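A quick back-of-the-envelope calculation shows why this compression step is unavoidable. The numbers below (1080p at 25 fps, YUV 4:2:0, and a 4 Mbps H.264 target bitrate) are illustrative assumptions, not figures from the original article:

```python
width, height, fps = 1920, 1080, 25
bytes_per_pixel = 1.5                      # YUV 4:2:0 stores 12 bits per pixel

raw_bps = width * height * bytes_per_pixel * 8 * fps
encoded_bps = 4_000_000                    # an assumed H.264 target bitrate for 1080p

print(f"raw:     {raw_bps / 1e6:.0f} Mbps")       # ~622 Mbps
print(f"encoded: {encoded_bps / 1e6:.0f} Mbps")   # 4 Mbps
print(f"ratio:   {raw_bps / encoded_bps:.0f}:1")  # ~156:1
```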
Transmission used to rely on media such as coaxial cable; with the development of IP networks, their excellent transmission capabilities are now used instead.
Related technologies and protocols:
Transport protocols: RTP and RTCP, RTSP, RTMP, HTTP, HLS (HTTP Live Streaming), etc.
Control signaling: SIP and SDP, SNMP, etc.
4. Decoding the data: use the relevant hardware or software to decode the received, encoded audio and video data and obtain images and sound that can be displayed directly.
Giants have released the HLS, HDS, and Smooth Streaming protocols and hide all the details in their dedicated SDKs. Developers cannot freely modify the multimedia engine logic in the player: you cannot change the adaptive bitrate rules or the cache size, or even the length of the slices. These players may be easy to use, but there are not many options to customize them; even poor functionality has to be tolerated.
However, with the increase in different applications...
For playback, you need to decompress (decode / restore) the data.
4. Video decoding (decompression)
After video content is encoded and compressed, it is easier to store and transmit; when you want to watch the video, it must be decoded.
Before encoding and decoding, the encoder and decoder must agree on a convention that both sides can understand.
For example, video image encoding and decoding:
The encoder encodes the first frame completely using an image (intra-frame) encoding method, and then describes in some way how each subsequent frame differs from the adjacent frames; the decoder reverses this process to reconstruct the images.
Format: MP4, MOV, AVI, and RMVB are in fact container (packaging) formats. Apart from RMVB, which is more special, the video inside these formats is usually encoded with H.264. H.264 achieves a high compression rate, with compression efficiency exceeding MPEG-2, but there is no free lunch: H.264 is also roughly three times harder to decode.
Video bitrate: the size of the video file divided by the length of the video is defined as the bitrate. Bitrate and resolution together largely determine the clarity of the video.
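As a quick sanity check of that definition, with made-up numbers (a 300 MB file lasting 10 minutes):

```python
file_size_mb, duration_s = 300, 600                  # hypothetical 300 MB file, 10-minute video
bitrate_kbps = file_size_mb * 8 * 1024 / duration_s  # bitrate = size / duration
print(f"{bitrate_kbps:.0f} kbps")                    # -> 4096 kbps, about 4 Mbps
```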
This article is from the NetEase Cloud community.
Preface
This article is aimed at newcomers to mobile live-video development. To help them get started quickly, it combines the powerful Google search engine with my own understanding to summarize the background knowledge of video live streaming.
Background Knowledge
Terminology
Push-Stream Protocol
RTMP
Real-Time Messaging Protocol
With Flash Player as the playback client: Flash Player is installed on nearly 99% of the world's PCs, so RTMP playback is generally well supported on the PC side.
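As a concrete example of publishing over RTMP, one commonly used approach is to push a stream with the ffmpeg command line; the input file, server address and stream key below are placeholders:

```
ffmpeg -re -i input.mp4 \
       -c:v libx264 -preset veryfast -tune zerolatency \
       -c:a aac -ar 44100 -b:a 128k \
       -f flv rtmp://example.com/live/streamkey
```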
The maximum transmission unit (MTU) is at most 1500 bytes. When using the IP/UDP/RTP protocol stack, this includes at least 20 bytes of IP header, 8 bytes of UDP header, and 12 bytes of RTP header, so the headers take at least 40 bytes and the maximum RTP payload is 1460 bytes. Taking H.264 as an example, if a frame of data is larger than 1460 bytes, it must be fragmented before sending and then reassembled into a complete frame at the receiving end before decoding.
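To illustrate the arithmetic and the need for fragmentation, here is a small Python sketch that splits an encoded frame into RTP-sized payloads. It only slices by size; real senders use the FU-A fragmentation units defined in RFC 6184, which add their own headers and preserve the NAL unit type.

```python
MTU = 1500
IP_HDR, UDP_HDR, RTP_HDR = 20, 8, 12
MAX_PAYLOAD = MTU - IP_HDR - UDP_HDR - RTP_HDR   # = 1460 bytes

def split_frame(frame: bytes, max_payload: int = MAX_PAYLOAD) -> list:
    """Naive size-based split; real RTP senders use FU-A units (RFC 6184)."""
    return [frame[i:i + max_payload] for i in range(0, len(frame), max_payload)]

frame = bytes(5000)                # pretend this is one encoded H.264 frame of 5000 bytes
chunks = split_frame(frame)
print(MAX_PAYLOAD, len(chunks))    # -> 1460, and 4 packets for the 5000-byte frame
```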
RTMP live application and latency analysis
Original address: https://github.com/ossrs/srs/wiki/v1_CN_LowLatency Author: Winlin
Related articles:
Take you to thoroughly understand RTMP
Some considerations for implementing the RTMP protocol for live push streaming
How live platforms use RTMP for low-latency video streaming
RTMP protocol analysis and interaction process
In live streaming applications, RTMP and HLS can basically cover all clients.
For ready-made streaming media and live video services, you can refer to the Qiniu live streaming cloud service released by Qiniu Cloud.
Encapsulation
Having introduced video encoding, let us now talk about encapsulation. As mentioned above, encapsulation can be understood as the kind of truck used for transport, that is, the media container.
The so-called container is a standard for packaging the multimedia content (video, audio, subtitles, chapter information, etc.) generated by the encoder into a single file according to a given specification.
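A simple way to see that the container is separate from the encoding is to remux a file with ffmpeg: the command below (file names are placeholders) copies the existing audio and video streams into a new MP4 container without re-encoding them.

```
ffmpeg -i input.flv -c copy output.mp4
```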
A frame is first encoded completely with an image (intra-frame) encoding method, and subsequent frames are then described by how they differ from the adjacent frames.
Clients can choose to download the same resource at different rates from many different alternative sources, allowing a streaming session to adapt to different data rates. When a streaming session starts, the client downloads an extended M3U (m3u8) playlist file containing metadata used to locate the available media streams (an example playlist is shown after the client list below).
Client support:
iOS supports HLS as a standard feature starting from iOS 3.0.
Adobe Flash Player supports HLS from 11.0 onwards.
Google Android supports HLS natively starting from Android 3.0 (Honeycomb).
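For reference, an HLS media playlist is plain text in the extended M3U format. A minimal example (the segment names and durations are made up) looks like this:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment-0.ts
#EXTINF:10.0,
segment-1.ts
#EXTINF:10.0,
segment-2.ts
```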
Mac live server: Nginx configuration support for HLS
In the previous article on building a live server with nginx+rtmp on the Mac, we already set up an nginx+rtmp live server. Now we need to add HLS support to that Nginx server. Adding HLS support in Nginx is fairly simple: just make a small modification to the configuration file nginx.conf (a sample configuration is sketched below).
Installing Nginx and the RTMP module
For the installation of Nginx and the nginx-rtmp-module, refer to the previous article.
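For reference, HLS support with the nginx-rtmp-module is usually enabled by a handful of directives in nginx.conf. The sketch below is an assumed minimal configuration: the listen ports, paths and fragment lengths are placeholders, and the exact directives available depend on the nginx-rtmp-module version you built.

```
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;                           # slice the incoming RTMP stream into HLS
            hls_path /usr/local/var/www/hls;  # where .ts segments and the .m3u8 are written
            hls_fragment 5s;                  # length of each segment
            hls_playlist_length 30s;          # how much of the stream the playlist keeps
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /usr/local/var/www;          # /hls/xxx.m3u8 -> /usr/local/var/www/hls/xxx.m3u8
            add_header Cache-Control no-cache;
        }
    }
}
```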
The encoders supported by FFmpeg include mpeg4, mpeg2video, pcm, and flv, among others.
3.3 FFmpeg decoder support
The FFmpeg source code contains a large number of decoders, which are mainly used for decoding input; decoding can also be understood as decompressing the compressed encoding. To see which decoders are supported, use the ./configure --list-decoders command:
The decoders supported by FFmpeg include mpeg4, h264, hevc (H.265), and mp3, among others.
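For example, assuming you are inside the FFmpeg source tree for the first command and have an installed ffmpeg binary for the second, you can check whether particular decoders were built in:

```
# inside the FFmpeg source tree
./configure --list-decoders | grep -i -E 'h264|hevc'

# against an installed build
ffmpeg -hide_banner -decoders | grep -i -E 'h264|hevc'
```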