Audio and video parameters of streaming media
Resolution
Resolution is an important concept related to images: it is a technical parameter that measures how much image detail can be expressed. High resolution is an important prerequisite for a sharp picture on a color monitor. Resolution indicates the precision of the screen image, that is, how many pixels the display can show, usually expressed as the number of pixels in each direction (width x height). The higher the resolution, the more pixels can be displayed and the finer the picture.
Video resolution refers to the frame size of the image produced by a video imaging product, expressed as width x height in pixels. Common video resolutions on mobile phones include 480x270 and 640x768; 480x270 corresponds to a 16:9 screen aspect ratio.
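As a quick illustration (a minimal sketch, not taken from the original text), the total pixel count and the reduced aspect ratio follow directly from the width and height:

```python
from math import gcd

def describe_resolution(width: int, height: int) -> str:
    """Summarize a resolution: total pixel count and reduced aspect ratio."""
    pixels = width * height
    divisor = gcd(width, height)
    return f"{width}x{height}: {pixels} pixels, aspect ratio {width // divisor}:{height // divisor}"

print(describe_resolution(480, 270))    # 480x270: 129600 pixels, aspect ratio 16:9
print(describe_resolution(1920, 1080))  # 1920x1080: 2073600 pixels, aspect ratio 16:9
```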
Bit Rate
The bit rate (data rate) is the number of data bits transmitted per unit of time. The unit generally used is kbps, that is, kilobits per second. Bit rate is the most important factor in video encoding quality control: at the same resolution, the higher the bit rate of the video stream, the lower the compression ratio and the higher the image quality.
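As a rough illustration (assuming a constant bit rate and ignoring container overhead; the numbers below are only an example), the approximate stream size follows directly from the bit rate and the duration:

```python
def estimate_file_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate size in megabytes of a constant-bit-rate stream.

    bitrate_kbps is in kilobits per second; container overhead is ignored.
    """
    total_kilobits = bitrate_kbps * duration_s
    return total_kilobits / 8 / 1000  # kilobits -> kilobytes -> megabytes

# A 10-minute video encoded at 1500 kbps is roughly 112.5 MB of raw stream data.
print(estimate_file_size_mb(1500, 10 * 60))  # 112.5
```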
Bit Rate Unit
Kbps: first, note that "ps" means "/s", that is, per second. Kbps describes a transmission speed: the number of kilobits of information transferred per second (K stands for kilo and b for bits). To make network speeds look intuitive, operators generally quote them in kilobits (thousands of bits). Since 1 byte = 8 bits, 1 KB/s = 8 Kbps. For example, an ADSL line rated at 512 Kbps delivers 512 / 8 = 64 KB/s when converted to bytes (that is, 64 kilobytes per second).
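A small helper (an illustrative sketch, not part of the original text) makes the conversion explicit:

```python
def kbps_to_kilobytes_per_s(kbps: float) -> float:
    """Convert kilobits per second to kilobytes per second (1 byte = 8 bits)."""
    return kbps / 8

print(kbps_to_kilobytes_per_s(512))  # 64.0 -> a 512 Kbps ADSL line moves about 64 KB per second
```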
Frame Rate
A frame is a single static image, and frames shown in rapid succession form moving pictures, such as a television image. The frame rate, usually measured in fps (frames per second), is the number of frames transmitted or displayed per second; it can also be understood as how many times per second the graphics processor refreshes the image. Because each frame is a still image, displaying frames quickly one after another creates the illusion of motion. A higher frame rate therefore yields smoother, more realistic animation: the more frames per second, the smoother the displayed action.
Relationship between frame rate, resolution, and bit rate
The frame rate is the number of images per second; the resolution is the size of each image, that is, its pixel count; and the bit rate is the amount of data produced per second after the video is compressed. Compression removes spatial redundancy within each image and temporal redundancy between frames. Consequently, for static scenes a low bit rate can still deliver good image quality, whereas for scenes with intense motion even a high bit rate may not. In short: the frame rate determines how smooth playback looks, the resolution determines the size of the image you see, and the bit rate should be chosen according to the camera and the scene, adjusted during on-site debugging until an acceptable image quality is reached.
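As a rough rule-of-thumb sketch (the bits-per-pixel factor below is an assumed illustrative value, not from the original text), a starting bit rate can be estimated from the resolution and frame rate:

```python
def estimate_bitrate_kbps(width: int, height: int, fps: float,
                          bits_per_pixel: float = 0.1) -> float:
    """Rule-of-thumb starting bit rate for a compressed stream, in kbps.

    bits_per_pixel is an assumed tuning factor: lower for static scenes,
    higher for scenes with intense motion.
    """
    return width * height * fps * bits_per_pixel / 1000  # bits/s -> kbps

# 480x270 at 25 fps with a 0.1 bits-per-pixel budget -> 324 kbps as a starting point
print(estimate_bitrate_kbps(480, 270, 25))
```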
Sampling Rate
The sampling rate defines the number of samples extracted per second from a continuous signal to form a discrete signal; it is expressed in Hz. The reciprocal of the sampling frequency is the sampling period (or sampling interval), which is the time between consecutive samples. In plain terms, the sampling frequency is the number of sound samples a computer collects per second. It describes the quality and fidelity of a sound file and is one measure of the quality of sound cards and audio files.
On today's mainstream capture cards, the sampling frequency generally comes in three levels: 22.05 kHz, 44.1 kHz, and 48 kHz. 22.05 kHz only achieves the sound quality of FM broadcast; 44.1 kHz is the theoretical limit of CD sound quality; 48 kHz is more precise still. The human ear can no longer distinguish sampling frequencies above 48 kHz, so higher rates are of little practical value on a computer.
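A short sketch (illustrative only) relating the sampling rate to the sampling period and to the number of samples produced over a given duration:

```python
def sampling_summary(sample_rate_hz: int, duration_s: float):
    """Return the sampling period in seconds and the sample count for a duration."""
    period = 1.0 / sample_rate_hz
    num_samples = int(sample_rate_hz * duration_s)
    return period, num_samples

period, n = sampling_summary(44_100, 1.0)
print(period)  # ~2.27e-05 s between samples at CD quality
print(n)       # 44100 samples for one second of audio
```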
Streaming media audio and video formats
Streaming media Video Format H.264
H.264 is the new-generation digital video compression format after MPEG-4, proposed jointly by the International Organization for Standardization (ISO) and the International Telecommunication Union (ITU). The most valuable feature of H.264 is undoubtedly its higher data compression ratio. Under the same image-quality conditions, the compression ratio of H.264 can be 2-3 times that of the MPEG-2 used in current DVD systems, and higher still than that of MPEG-4. Because of this, video data compressed with H.264 requires less bandwidth and is more economical to transmit over a network: where MPEG-2 needs a 6 Mbps transfer rate, H.264 requires only 1-2 Mbps for comparable quality.
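To make the bandwidth difference concrete, here is a back-of-the-envelope sketch using the rates quoted above (the one-hour duration is just an assumed example, and container overhead is ignored):

```python
def stream_size_gb(bitrate_mbps: float, duration_s: float) -> float:
    """Raw stream size in gigabytes at a constant bit rate."""
    return bitrate_mbps * duration_s / 8 / 1000  # megabits -> megabytes -> gigabytes

hour = 3600
print(stream_size_gb(6.0, hour))   # ~2.7 GB for one hour of MPEG-2 at 6 Mbps
print(stream_size_gb(1.5, hour))   # ~0.675 GB for the same hour of H.264 at 1.5 Mbps
```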
Streaming Media Audio Format
AAC (Advanced Audio Coding) is a compression format designed specifically for audio data. Unlike MP3, it adopts a newer encoding algorithm that is more efficient and offers a better quality-to-bitrate trade-off, so compressed AAC audio does not sound noticeably degraded to the listener.