1. Why compression?
The data size of raw digital video is enormous. For example, NTSC video delivered at roughly 640x480 resolution, 24 bits/pixel, and 30 frames per second produces 640 x 480 x 24 x 30 = 221 Mb/s, or about 28 MB per second. Such a large data stream is unacceptable for most transmission lines and impractical to store: you could neither watch the video over the Internet nor keep it on a local device. A two-hour movie at that rate would take 28 MB x 60 x 60 x 2, roughly 199 GB, which no ordinary disk could hold. Likewise, no one can afford to transmit 28 MB of data per second over a network. Compression is therefore required.
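The arithmetic above can be checked with a few lines of Python (the figures match the 221 Mb/s and ~28 MB/s quoted in the text):

```python
# Raw (uncompressed) bitrate of 640x480 video at 24 bits/pixel, 30 fps.
WIDTH, HEIGHT = 640, 480
BITS_PER_PIXEL = 24
FPS = 30

bits_per_second = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS
megabits_per_second = bits_per_second / 1_000_000        # ~221 Mb/s
megabytes_per_second = bits_per_second / 8 / 1_000_000   # ~27.6 MB/s

# Storage needed for a two-hour movie at that raw rate.
two_hours_gb = megabytes_per_second * 3600 * 2 / 1000    # ~199 GB

print(round(megabits_per_second, 1))   # 221.2
print(round(megabytes_per_second, 1))  # 27.6
print(round(two_hours_gb, 1))          # 199.1
```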
The same is true for audio.
2. What compression algorithms are there?
Video Compression Algorithm:
The three standards MPEG-1, MPEG-2, and MPEG-4 have become the preferred algorithms for video compression systems thanks to their standardization, large compression ratios, and good image quality. MPEG-1 achieves a high compression ratio but relatively poor image quality; MPEG-2 emphasizes image quality at a lower compression ratio, so it requires more storage space; MPEG-4 is a popular technology that saves space, improves image quality, and reduces network transmission bandwidth.
Of course, there are other video compression algorithms as well, such as H.261.
Audio compression algorithms: the MPEG-1 Audio Layer I, II, and III algorithms. Layer III is the well-known MP3 compression algorithm.
3. Audio and video compression (encoding) can be divided into hardware compression and software compression, and the same is true of decompression (decoding)
In hardware compression, all video and audio signal processing and compression are performed by a high-speed DSP on the capture card. The CPU is not involved in the compression itself; it only takes the data stream, already compressed by the DSP into a fixed encoding format, and sends it to the network or writes it to the hard disk. Using DSP hardware to run the compression algorithm is a powerful way to record and process voice signals.
Software compression is different: the CPU performs the DSP's task using ordinary software instructions, compressing the data itself and then sending the compressed stream to the network or writing it to the hard disk. This is obviously slower.
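The software-compression path can be sketched in a few lines. This is only an illustration of the idea that every byte passes through the CPU; zlib (a lossless general-purpose compressor) stands in for a real video codec, and the flat-gray "frame" is fabricated for the example:

```python
import zlib

def software_compress(frame: bytes, level: int = 6) -> bytes:
    """CPU-side compression: the processor does all the work that a
    hardware DSP would otherwise do. zlib is a lossless stand-in for
    a real video codec here."""
    return zlib.compress(frame, level)

# A fake 640x480 grayscale frame (uniform gray, so it compresses well).
frame = bytes([128]) * (640 * 480)
packet = software_compress(frame)

# The application would now write `packet` to disk or send it over
# the network, exactly as described above.
print(len(frame), len(packet))
```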
Decoding is likewise divided into software decoding and hardware decoding. Software decoding is what the player applications on our computers do; the most typical example of a hardware decoder is the DVD player we used in the past.
4. Actual Work Process
The working principle of a camera is roughly as follows: the lens projects an optical image of the scene onto the image sensor surface, where it is converted into an electrical signal; after A/D (analog-to-digital) conversion, it becomes a digital image signal and is sent to the digital signal processing chip (DSP) for processing. Everything up to this point is done in hardware. The image is then transmitted to the computer through the USB interface and displayed on the screen; this part is done in software. In short, once the CMOS sensor is exposed to light through the lens, it generates charge, and the DSP control chip processes the resulting signals.
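The capture chain described above (light → sensor charge → A/D conversion → DSP) can be sketched as a toy pipeline. All the numeric values here are illustrative assumptions, not real sensor characteristics:

```python
def sensor(light_intensity: float) -> float:
    """CMOS photosite: light in, analog charge out (arbitrary units).
    The 0.8 conversion factor is an assumed efficiency for the sketch."""
    return light_intensity * 0.8

def adc(charge: float, bits: int = 8) -> int:
    """A/D conversion: quantize the analog charge to a digital code."""
    full_scale = 1.0
    return int(min(charge, full_scale) / full_scale * (2**bits - 1))

def dsp(code: int) -> int:
    """DSP stage: a stand-in tone adjustment for the real processing."""
    return min(255, int(code * 1.1))

# One pixel flowing through the whole chain.
pixel = dsp(adc(sensor(0.5)))
print(pixel)
```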
Video: the camera (DSP) performs image capture (analog-to-digital conversion) and image processing (compression and encoding); an upper-layer application then reads the compressed, encoded data and stores it on the hard disk or sends it over the network. How does an application interact with the camera? Through an interface driver. For example, a USB camera is read by the application through the USB driver, while the mini2440's camera is connected through GPIO ports and must be read through its corresponding driver.
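The application side of this interaction boils down to reading frames through the driver's device interface. The sketch below uses an in-memory buffer standing in for a device node (on a real Linux system it might be something like `/dev/video0`, opened through the driver); the fixed frame size is an assumption made for the example:

```python
import io

FRAME_SIZE = 1024  # assumed fixed size of one encoded frame (for the sketch)

def read_frames(device, frame_size=FRAME_SIZE):
    """Pull fixed-size encoded frames from a device-like file object,
    the way an application reads camera data through a driver."""
    while True:
        data = device.read(frame_size)
        if len(data) < frame_size:   # short read: stream has ended
            break
        yield data

# Stand-in for opening the camera device node on a real system.
fake_device = io.BytesIO(b"\x00" * (FRAME_SIZE * 3 + 100))
frames = list(read_frames(fake_device))
print(len(frames))  # 3 complete frames; the trailing 100 bytes are dropped
```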
5. Differences between the compression algorithm and the compression format.
Audio and video data are stored in a particular file format using a particular compression algorithm, and the file gets a matching suffix. For example, the ASF format uses the MPEG-4 compression algorithm, and its suffix is .asf.
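The distinction between the container (named by the suffix) and the compression algorithm inside it can be made concrete with a small lookup table. The codec pairings below are illustrative common combinations, not an exhaustive or authoritative mapping:

```python
# Suffix -> (typical video codec, typical audio codec).
# Illustrative pairings only; real containers can hold many codecs.
CONTAINER_TO_TYPICAL_CODECS = {
    ".asf": ("MPEG-4 Visual / WMV", "WMA"),
    ".avi": ("MPEG-4 / DivX", "MP3"),
    ".mp4": ("H.264 / MPEG-4", "AAC"),
}

def describe(filename: str) -> str:
    """Report the container and the codecs it typically carries."""
    suffix = filename[filename.rfind("."):].lower()
    video, audio = CONTAINER_TO_TYPICAL_CODECS.get(
        suffix, ("unknown", "unknown"))
    return f"{suffix} container, typically video={video}, audio={audio}"

print(describe("movie.ASF"))
```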
An encoder compresses and stores the raw data (using a compression algorithm). Some encoding formats are common; others are professional formats that are rarely used. The decoding software used to play back these videos and audios on a home device or computer is usually called a plug-in.
During video playback, software must first recognize the encapsulation of the video file (commonly called its 'format'). After the data is demultiplexed, it is sent to the decoder chip for decoding, and the decoded data is then played. This demultiplexing and playback task is performed by the playing software (the player). A player can generally recognize multiple video encapsulation formats (file formats): for example, CorePlayer can play AVI, WMV, MP4, and other formats, while RealPlayer can play videos in RM and RMVB formats. File name suffixes such as .mp4, .3gp, .wmv, .avi, .rm, and .rmvb are in fact encapsulation types. The real video format is not the file name but the video encoding scheme and audio encoding scheme inside the file. Which files can be played depends on which player is used and on whether the hardware decoding chip can recognize the encoding scheme in the file.
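The player flow just described can be sketched as demux-then-decode. Everything here is a simplified stand-in (dictionaries for containers, strings for decoded output), not a real demuxer or decoder:

```python
def demux(container: dict):
    """Split a 'container' into its video and audio elementary streams,
    as the player does after recognizing the encapsulation format."""
    return container["video_stream"], container["audio_stream"]

def decode(stream: dict, supported=("H.264", "MP3")) -> str:
    """'Decode' a stream only if its codec is one this player supports;
    otherwise the file cannot be played, even if the container is known."""
    codec = stream["codec"]
    if codec not in supported:
        raise ValueError(f"no decoder for {codec}")
    return f"decoded {codec}"

clip = {
    "format": "AVI",                      # the encapsulation type
    "video_stream": {"codec": "H.264"},   # the real video format
    "audio_stream": {"codec": "MP3"},     # the real audio format
}
video, audio = demux(clip)
print(decode(video), "|", decode(audio))
```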
Every video format pursues a balance between compression ratio and image quality.