Second, what are streaming media and streaming?
2.1 Streaming Media
"Streaming media" refers to the use of "streaming" in the Internet to play media format, the most important feature of streaming media is "edge-to-bottom broadcast", commonly used streaming media format has FLV (using Flash as a video carrier), TS, etc.
2.2 Streaming
"Streaming" refers to the technology that transmits streaming media over the network. Streaming is divided into "live streaming" and "sequential streaming". In general, if the video is live streaming, it is streamed in real time. If the video is not live, the file is sent in sequential streams, which is sequential streaming.
Third, what is demultiplexing?
"Demultiplexing" (demuxing) means separating the individual "audio data" and "video data" out of an "audio/video source"; for example, demuxing an FLV file into "H.264 video data" and "AAC audio data".
Four, RTMP, RTP, RTSP
4.1 RTMP
Name: Real-Time Messaging Protocol
Protocol family: the protocol is based on TCP and is a protocol family, including the basic RTMP protocol and its variants such as RTMPT, RTMPS, RTMPE, etc.
Operating environment: mainly used for audio and video communication between the "Flash platform" and a "streaming media server"
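To give a feel for what "based on TCP" means in practice, here is a minimal sketch of the plain (unencrypted) RTMP handshake against a hypothetical server at push.example.com; real push SDKs perform this handshake (and the chunked messaging that follows) for you.

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;
import java.util.Random;

public class RtmpHandshake {
    public static void main(String[] args) throws IOException {
        // "push.example.com" is a hypothetical host; plain RTMP listens on TCP port 1935.
        try (Socket socket = new Socket("push.example.com", 1935)) {
            OutputStream out = socket.getOutputStream();
            DataInputStream in = new DataInputStream(socket.getInputStream());

            // C0: one byte, protocol version 3.
            // C1: 1536 bytes = 4-byte timestamp + 4 zero bytes + 1528 random bytes.
            byte[] c1 = new byte[1536];
            new Random().nextBytes(c1);
            for (int i = 0; i < 8; i++) c1[i] = 0;   // timestamp 0 + required zero field
            out.write(3);
            out.write(c1);
            out.flush();

            // The server answers with S0 (1 byte), then S1 and S2 (1536 bytes each).
            byte[] s0 = new byte[1];
            byte[] s1 = new byte[1536];
            byte[] s2 = new byte[1536];
            in.readFully(s0);
            in.readFully(s1);
            in.readFully(s2);
            System.out.println("server RTMP version: " + s0[0]);

            // C2 echoes S1 back, completing the handshake; RTMP chunk streams follow after this.
            out.write(s1);
            out.flush();
        }
    }
}
```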
4.2 RTP
Name: Real-time Transport Protocol
Composition: the RTP standard defines two protocols: RTP (the data transport protocol) and RTCP (the control protocol)
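As a small illustration of the data-transport side, the sketch below parses the 12-byte fixed RTP header defined in RFC 3550 from a received packet. The field layout comes from the RFC; the class and method names are just for this example.

```java
import java.nio.ByteBuffer;

public class RtpHeader {
    // Parses the 12-byte fixed RTP header (RFC 3550) from the start of a packet.
    public static void parse(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet);
        int b0 = buf.get() & 0xFF;
        int version     = (b0 >> 6) & 0x03;    // should be 2
        int csrcCount   =  b0       & 0x0F;
        int b1 = buf.get() & 0xFF;
        boolean marker  = (b1 & 0x80) != 0;
        int payloadType =  b1 & 0x7F;           // e.g. 96 is a common dynamic type for H.264
        int sequence    = buf.getShort() & 0xFFFF;
        long timestamp  = buf.getInt() & 0xFFFFFFFFL;
        long ssrc       = buf.getInt() & 0xFFFFFFFFL;
        System.out.printf("v=%d pt=%d seq=%d ts=%d ssrc=%d marker=%b csrc=%d%n",
                version, payloadType, sequence, timestamp, ssrc, marker, csrcCount);
    }
}
```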
4.3 RTSP
Name: Real Time Streaming Protocol
Definition: This protocol defines how to establish/negotiate a real-time streaming session between a client and a server
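As a rough illustration of that session negotiation, the sketch below opens a TCP connection to a hypothetical RTSP server and sends an OPTIONS request, the usual first step before DESCRIBE/SETUP/PLAY; the host and stream URL are made up.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.Socket;

public class RtspOptionsProbe {
    public static void main(String[] args) throws IOException {
        String host = "example.com";                              // hypothetical server
        String url  = "rtsp://example.com/live/stream1";          // hypothetical stream URL
        try (Socket socket = new Socket(host, 554);                // 554 is the default RTSP port
             OutputStream out = socket.getOutputStream();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            // RTSP is a text-based, HTTP-like protocol; CSeq numbers each request.
            String request = "OPTIONS " + url + " RTSP/1.0\r\n"
                           + "CSeq: 1\r\n"
                           + "User-Agent: demo-client\r\n"
                           + "\r\n";
            out.write(request.getBytes("US-ASCII"));
            out.flush();
            String line;
            while ((line = in.readLine()) != null && !line.isEmpty()) {
                // Typical reply: "RTSP/1.0 200 OK" plus a "Public:" header listing supported methods.
                System.out.println(line);
            }
        }
    }
}
```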
4.4 Architecture
Mapped onto the network model, the protocols involved in live-streaming technology are distributed as follows.
Application layer protocols: RTSP, RTMP
Transport layer protocols: RTCP, RTP, TCP, UDP
Network layer protocols: RSVP, IP
Five, H.264 and AAC
5.1 H.264
H.264 is a "video encoding format" (sometimes called a video compression format), also known as "MPEG-4 Part 10". It sits at the same level as MPEG-2 Part 2.
5.2 AAC
AAC is an "audio encoding format" (sometimes called an audio compression format); it sits at the same level as MP3, FLAC, APE, and WavPack.
5.3 Summary
In general, users do not get "video encoding format" files and "audio encoding format" files directly. Instead, they have a "multimedia container format" file and obtain the encoded audio and video from it by demultiplexing.
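On Android, for instance, this demultiplexing step can be done with the platform's MediaExtractor. A minimal sketch follows; the file path is hypothetical, and MediaExtractor does not support every container (FLV in particular is not guaranteed).

```java
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;

public class DemuxProbe {
    // Lists the elementary streams inside a container file, e.g. an MP4 or TS.
    public static void printTracks(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        try {
            extractor.setDataSource(path);   // e.g. "/sdcard/demo.mp4" (hypothetical path)
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                // Typical output: "video/avc" (H.264) and "audio/mp4a-latm" (AAC)
                System.out.println("track " + i + ": " + mime);
            }
        } finally {
            extractor.release();
        }
    }
}
```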
Six, the difference between software decoding and hardware decoding
6.1 Software decoding
Software decoding ("soft decoding") means decoding with the CPU only.
6.2 Hardware decoding
Hardware decoding ("hard decoding") means decoding with the GPU, with the CPU playing only an auxiliary role.
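On Android, whether you get hardware or software decoding comes down to which MediaCodec is selected. The sketch below lists the available H.264 decoders; note that the isHardwareAccelerated() check only exists on API 29 and above, so on older devices the distinction is usually read from the codec name (e.g. "OMX.google.*" / "c2.android.*" decoders are software).

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public class AvcDecoderList {
    // Prints the available H.264 ("video/avc") decoders and, on API 29+,
    // whether each one is hardware-accelerated.
    public static void printAvcDecoders() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if ("video/avc".equalsIgnoreCase(type)) {
                    boolean hw = android.os.Build.VERSION.SDK_INT >= 29
                            && info.isHardwareAccelerated();
                    System.out.println(info.getName() + (hw ? " (hardware)" : ""));
                }
            }
        }
    }
}
```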
Seven, YUV and PCM
When "video encoding formats" and "audio encoding formats" were introduced above, H.264 and AAC were also called a "video compression format" and an "audio compression format". That is because the raw video data format, YUV (the video format output directly by the video capture chip), and the raw audio data format, PCM (the audio format output directly by the audio capture chip), take up a great deal of space, so they have to be encoded (compressed) into the much smaller H.264 and AAC formats.
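A quick back-of-the-envelope calculation shows why compression is unavoidable; the numbers below assume 1080p at 30 fps with YUV 4:2:0 video and CD-quality stereo PCM audio.

```java
public class RawMediaSize {
    public static void main(String[] args) {
        // Raw video: 1080p at 30 fps, YUV 4:2:0 uses 1.5 bytes per pixel.
        long videoBytesPerSecond = 1920L * 1080 * 3 / 2 * 30;
        // Raw audio: 44.1 kHz, 16-bit samples, 2 channels of PCM.
        long audioBytesPerSecond = 44100L * 2 * 2;
        System.out.printf("raw YUV420 1080p30: %.1f MB/s%n", videoBytesPerSecond / 1e6);   // ~93.3 MB/s
        System.out.printf("raw PCM 44.1kHz/16-bit stereo: %.1f KB/s%n", audioBytesPerSecond / 1e3); // ~176.4 KB/s
    }
}
```

Roughly 93 MB/s of raw video versus the few Mbit/s a typical H.264 live stream uses, which is why the capture output is always encoded before it is pushed.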
Eight, what is the push stream SDK?
From my point of view (I am an Android programmer), a push-stream SDK is a third-party support library or jar package that runs on Android.
The SDK typically provides audio and video capture, beauty filters, image processing, noise control, flow (bitrate) control, and other functions, as sketched below.
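To make that feature list concrete, here is a purely hypothetical sketch of the kind of surface such an SDK might expose; none of these names come from any real vendor's SDK.

```java
// Hypothetical push-stream SDK surface; method names are illustrative only.
public interface PushStreamClient {
    void startCameraPreview();                            // audio/video capture
    void setBeautyFilter(boolean enabled);                // beauty filter / image processing
    void setNoiseSuppression(boolean enabled);            // noise control
    void setVideoBitrateRange(int minKbps, int maxKbps);  // flow (bitrate) control
    void startPush(String pushUrl);                       // connect and publish to the push URL
    void stopPush();
}
```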
Nine, what is the push stream address (also called push stream URL)?
We said that pushing a stream is the process of sending the audio and video data collected by the client to the server. So the client needs to know where the server is and how to push the stream to it. The push-stream URL is the locator that tells the client where to "push the stream" to, and it is assigned by the server.
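For illustration, a push-stream URL handed out by an RTMP server typically has the shape assembled below; the host, application name, stream name, and auth parameters are all made up, since the real values come from the server side.

```java
public class PushUrlExample {
    public static void main(String[] args) {
        // Typical shape: rtmp://<host>/<app>/<streamName>?<auth params>
        String host = "push.example.com";                 // hypothetical push host
        String app = "live";                              // application name on the server
        String streamName = "room_12345";                 // per-stream name/key
        String authQuery = "token=PLACEHOLDER";           // placeholder auth query string
        String pushUrl = "rtmp://" + host + "/" + app + "/" + streamName + "?" + authQuery;
        System.out.println(pushUrl);
    }
}
```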
Ten, what is the difference between a URI and a URL (hehe)?
A URI is a syntactic structure that does not necessarily contain information about locating Web resources.
A URL is a special case of a URI that must contain information about locating a Web resource.
It can be said that a URL has all the functionality of a URI, but a URI does not necessarily have the functionality of a URL.
This is a bit like inheritance in Java: the URL is the subclass and the URI is the parent class.
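The same distinction can be seen directly in java.net: a URN-style URI identifies a resource without saying where to fetch it, while a URL also locates the resource. The host below is hypothetical.

```java
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;

public class UriVsUrl {
    public static void main(String[] args) throws URISyntaxException, MalformedURLException {
        // A URN-style URI names a resource but carries no locator (no host, no path to fetch).
        URI name = new URI("urn:isbn:0451450523");
        System.out.println(name.getScheme());        // "urn"

        // A URL is a URI that also locates the resource (scheme + host + path).
        URL locator = new URL("http://push.example.com/live/room_12345");  // hypothetical host
        System.out.println(locator.getHost());       // "push.example.com"
    }
}
```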
Eleven, introduction to the MPEG family
MPEG stands for Moving Picture Experts Group.
11.1 MPEG-1
MPEG-1 was the first video/audio compression standard and was later adopted by the VCD standard. It consists of a series of sub-standards called "parts" (sometimes also translated as "volumes"), structured as follows:
Part 1: Systems
Part 2: Video
Part 3: Audio
The audio in Part 3 is itself divided into three layers:
Layer I
Layer II
Layer III: this is what we usually call MP3
So MP3 is not MPEG-3; it is MPEG-1 Part 3 Layer III (or MPEG-2 Audio Layer III). MP4, however, does come from MPEG-4.
11.2 MPEG-2
Nothing special to add, except that it was adopted by the DVD standard.
11.3 MPEG-3
Its development was halted midway.
11.4 MPEG-4
This is the famous MPEG-4 (MP4). Its tenth part (Part 10) is the important one: a video coding standard issued jointly by ISO, IEC, and ITU-T, namely H.264.
11.5 MPEG-7 and MPEG-21
MPEG-7 is a descriptive standard for multimedia content and has already been completed. MPEG-21 is still in the making; its goal is to provide a complete platform for future multimedia applications.