Transferred from: http://blog.csdn.net/neustar1/article/details/19480863
This article describes the video processing flow. The video signal path of a two-way video session is shown in Figure 1.
Figure 1 Video Flow
Taking a video session as an example, the flow is divided mainly into the following threads:
1) Video source generation thread: the Camera produces the video picture, packages it into video frames, and delivers them to the next module at a certain frame rate;
2) Capture thread: the Capturer captures video frames, applies processing such as brightness adjustment, and then hands the frames to the encoding module of each transmission link to be encoded and sent out;
3) Receive thread: RTP/RTCP receives and parses the RTP/RTCP packets;
4) Decode thread: the Decoder decodes the encoded video frames;
5) Delivery thread: the Renderer receives the decoded video frames, buffers them, and delivers them to the display device;
6) Display thread: the Player draws or outputs the video picture, supporting either multiple windows or multiple pictures within a single window.
Video decoding takes longer than encoding, so a separate thread is opened to perform decoding. In addition, multiple video streams are not mixed the way multiple audio channels are; each stream is rendered and displayed separately.
WEBRTC Source Analysis Three: Video processing flow