[Project 1: Live Broadcast] 2. Pulling and Playing Live Streams
I. Introduction to Video Technology

1. What Is Video?
Video is structured data. The data transmitted during a live broadcast consists of small units of video.
Video = image + audio
2. Real-Time Video Transmission
- Video is structured data; live video is structured data that is transmitted in real time.
- Any "real-time" event (live broadcast) inevitably involves some latency. To improve the quality of a live broadcast, we need to reduce that latency.
3. Video Encoding and Compression
- Raw video content is generally quite large. To make it easier to store and transmit, the original content elements (images and audio) are usually compressed. (The compression algorithm determines the encoding/compression format.)
- The video data transmitted during live streaming usually needs to be compressed (encoded); the less data transmitted, the better. During playback, it must be decompressed (decoded/restored).
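To see why compression is necessary, here is a rough back-of-the-envelope calculation of the size of uncompressed video. This is a sketch only; the 1080p resolution, 30 fps frame rate, and 3-bytes-per-pixel (24-bit RGB) figures are illustrative assumptions, not values from the text above.

```python
# Rough size of one second of uncompressed 1080p video at 30 fps,
# assuming 3 bytes (24-bit RGB) per pixel -- illustrative numbers only.
width, height = 1920, 1080
bytes_per_pixel = 3
fps = 30

bytes_per_frame = width * height * bytes_per_pixel   # ~6.2 MB per frame
bytes_per_second = bytes_per_frame * fps             # ~186 MB per second

print(f"{bytes_per_frame / 1e6:.1f} MB per frame")
print(f"{bytes_per_second / 1e6:.1f} MB per second")
```

At roughly 186 MB per second, raw video is impractical to stream, which is why encoding/compression comes before transmission.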
4. Video Decoding (Decompression)
- Encoding and compressing video content makes it easier to store and transmit. To watch the video, it must be decoded.
- Before encoding and decoding, both sides must agree on a format that the encoder and the decoder can both understand.
- For example, video image encoding and decoding:
The encoder encodes multiple images into a GOP (Group of Pictures) and transmits it. During playback, the decoder reads a GOP, decodes it into images, and renders them.
GOP (Group of Pictures): a group of consecutive images; the basic unit that the video encoder and decoder work with. It consists of one I-frame followed by several B/P-frames (the I-frame is also called a key frame; B/P-frames are also called reference frames). This pattern repeats until the end of the video.
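As a conceptual sketch of the GOP structure described above (the frame labels and splitting logic are illustrative; a real decoder operates on compressed bitstreams, not strings), a stream of frames can be split into GOPs at each I-frame:

```python
def split_into_gops(frames):
    """Split a sequence of frame types into GOPs.

    Each GOP starts at an I-frame (key frame) and contains the
    B/P-frames (reference frames) that follow it, until the next I-frame.
    """
    gops = []
    for frame in frames:
        if frame == "I":        # a key frame starts a new GOP
            gops.append([frame])
        elif gops:              # reference frames join the current GOP
            gops[-1].append(frame)
    return gops

# A typical repeating pattern: one I-frame followed by B/P-frames.
stream = ["I", "B", "B", "P", "B", "P", "I", "B", "P"]
print(split_into_gops(stream))
# -> [['I', 'B', 'B', 'P', 'B', 'P'], ['I', 'B', 'P']]
```

This is why a decoder can only start playback at an I-frame: the B/P-frames in a GOP are meaningless without the key frame they reference.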
5. How Is Video Transmitted?
- The encoder encodes multiple video images and produces a GOP, which is transmitted as a segment.
- The video is divided into "molecules" (GOPs), and the molecules are divided into "atoms" (I-frames, B-frames, and P-frames).
Think about it: if instead of transmitting a whole "object" we transmit its "atoms" one by one, sending the smallest particles so fast that the human eye perceives them as continuous, what kind of experience would that be?
--- That experience is essentially live broadcasting.
6. What Protocol Is Used for Transmission?
Live video is the process of streaming every frame of data (video/audio/data frames) tagged with a time sequence. The sending side (the broadcaster) continuously captures audio and video, encodes it, packages it, and pushes it out. The data then travels through a relay/distribution network (that is, the broadcaster's video data is sent to a server, and the server forwards it to the viewers). The playback side continuously downloads the data, decodes it, and plays it in order. Together this realizes the "produce, transmit, consume" pipeline of live video.
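A minimal sketch of what "tagging every frame with a time sequence" means (the class and field names are illustrative inventions, not part of RTMP or any real protocol's message format):

```python
from dataclasses import dataclass
from itertools import count

# Illustrative packet: real protocols define their own binary message
# formats; only the idea of a timestamped frame is shown here.
@dataclass
class FramePacket:
    kind: str          # "video", "audio", or "data"
    timestamp_ms: int  # position in the time sequence
    payload: bytes     # the encoded frame data

_clock = count(step=33)  # ~30 fps: one video frame every 33 ms

def make_video_packet(payload: bytes) -> FramePacket:
    """Tag an encoded frame with the next timestamp in the sequence."""
    return FramePacket("video", next(_clock), payload)

p0 = make_video_packet(b"\x00\x01")
p1 = make_video_packet(b"\x02\x03")
print(p0.timestamp_ms, p1.timestamp_ms)  # -> 0 33
```

The timestamps are what allow the playback side to decode and render frames in the correct order and at the correct pace, even if packets arrive unevenly.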
7. Live Video Business Logic
One-to-many model:
Broadcaster -- RTMP protocol --> Live streaming media server -- RTMP/HLS protocol --> Audience (fans)
- RTMP protocol: low latency and fast transmission. (The broadcaster uses this protocol to upload video data to the server immediately.)
- HLS protocol: the server slices the stream uploaded by the broadcaster into segments and transmits them to viewers. (Playback is based on time points; viewers use this protocol to watch live, and it is also used for replay.)
Protocol differences:
- RTMP (Real Time Messaging Protocol): runs over TCP (persistent connection) and forwards each piece of data immediately upon receipt; latency is about 1–3 seconds.
- HLS (HTTP Live Streaming): runs over HTTP (short connections); it collects data for a period of time, generates .ts segment files, and updates an .m3u8 playlist; latency is usually greater than 10 seconds.
- RTMP-FLV (RTMP over HTTP): runs over HTTP (persistent connection), carrying RTMP-style data over HTTP; latency is about 1–3 seconds.
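To make the HLS side concrete, here is a minimal example of the kind of .m3u8 playlist the server keeps updating (the segment names and 10-second durations are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
```

Each #EXTINF entry names one .ts slice. For a live stream, the server appends new segments and advances #EXT-X-MEDIA-SEQUENCE as old ones expire; the player must buffer at least one whole segment before playing, which is a major reason HLS latency exceeds 10 seconds.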
8. Latency (Delay)
- Physical latency
  - Latency is the time difference between sending and receiving on a stable network.
  - The more forwarding hops there are, the greater the latency.
  - It is computable:
    Broadcaster (Shanghai) -- forward --> Server 1 (Beijing) -- forward --> ... -- forward --> Server n (Hong Kong) -- forward --> Viewer
  - Physical latency depends on the number of server hops and the distance between the broadcaster and the viewers.
- Jitter delay
  - Jitter is unstable, extra delay; when the network signal is poor, jitter increases.
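Since physical latency is computable, a toy model simply sums the per-hop delays along the forwarding path. The hop names and millisecond values below are made-up illustrations, not measurements:

```python
# Toy model of physical latency: sum the one-way delay of each hop
# along the path broadcaster -> server 1 -> ... -> server n -> viewer.
# All hop names and millisecond values are illustrative only.
path_delays_ms = {
    "broadcaster (Shanghai) -> server 1 (Beijing)": 30,
    "server 1 (Beijing) -> server 2 (Guangzhou)": 25,
    "server 2 (Guangzhou) -> server n (Hong Kong)": 10,
    "server n (Hong Kong) -> viewer": 20,
}

total_ms = sum(path_delays_ms.values())
print(f"estimated physical latency: {total_ms} ms")  # -> 85 ms
```

This also shows why fewer forwarding hops and geographically closer servers reduce latency: each hop removed subtracts its delay from the total.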
9. Live Broadcast Process
A complete live broadcast process mainly includes: capture, processing, encoding, packaging, pushing the stream, transmission, transcoding, distribution, pulling the stream, decoding, and playback.
The lower the latency, the better the user experience.
10. Encoders
Encoding can be done in software (soft encoding) or in hardware (hard encoding).
Soft encoding uses the CPU and performs encoding in software. (Soft encoding is the more common choice.)
Hard encoding uses the GPU, that is, the graphics/display hardware. (A good phone handles it smoothly; a weak phone may stutter.)
II. Live Video / Video Player

1. Introduction to the ijkplayer Framework
For video playback we use ijkplayer, an open-source third-party framework from Bilibili. ijkplayer already integrates FFmpeg for us.
- FFmpeg is an audio/video processing tool that provides both encoding and decoding functions and can also be used as a player.
- Most players are built on FFmpeg. AVPlayer, provided by Apple, cannot play live-stream formats (for example, it cannot play RTMP, while ijkplayer can). So live playback really needs to be based on FFmpeg, and Bilibili has already wrapped it for us, which is why we use ijkplayer.
- Douyu uses it too. You can use ijkplayer if your company does not want to spend money on a commercial solution.
VLC: a video player that supports many encoding formats (an rtmp:// address can be entered to test a live stream).
Test address: rtmp://live.hkstv.hk.lxdns.com/live/hks
2. Using ijkplayer
There are two main tasks:
1) Configure the dependency environment: download and compile FFmpeg.
2) Package the IJKMediaPlayer (player) project into a framework so it can be integrated into your own project.
- Download ijkplayer and configure the dependency environment:
1) Search for ijkplayer on GitHub.
2) Open the "Build iOS" section of README.md and follow the steps.
3) Clone the project: git clone https://github.com/Bilibili/ijkplayer.git ijkplayer-ios
4) Enter the ijkplayer-ios directory: cd ijkplayer-ios, then run: git checkout -B latest k0.7.7.1
5) Run: ./init-ios.sh (this downloads FFmpeg)
(ijkplayer is based on FFmpeg; because FFmpeg is large, downloading it is a separate, slow step.)
6) After the download finishes, enter the ios directory with cd ios to compile FFmpeg.
7) Compile FFmpeg with the following commands:
./compile-ffmpeg.sh clean
./compile-ffmpeg.sh all
8) After the compilation succeeds, we can run IJKMediaDemo.
- Package the IJKMediaPlayer project into a framework