Project address (stars welcome): https://github.com/979451341/Audio-and-video-learning-materials/tree/master/FFmpeg (mp4%e8%bd%acyuv%ef%bc%89
This time the goal is to decode an MP4 file into a YUV file, so first a quick introduction to YUV. YUV is a pixel format in which the luminance parameter and the chrominance parameters are stored separately. The advantage of this separation is not only that the two avoid interfering with each other, but also that the chroma can be sampled at a lower rate without noticeably degrading image quality.
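To make the MP4-to-YUV step concrete, here is a minimal sketch of demuxing a file and writing the decoded frames out as raw YUV420P using the FFmpeg C API. This is not the project's actual code: the file names are placeholders, error handling and decoder draining at end of file are omitted, and it assumes the decoder outputs AV_PIX_FMT_YUV420P (real code should check frame->format).

#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

/* Sketch: decode input.mp4 and append each frame's Y, U and V planes to out.yuv. */
int main(void) {
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, "input.mp4", NULL, NULL) < 0) return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0) return -1;

    int vid = -1;
    for (unsigned i = 0; i < fmt->nb_streams; i++)
        if (fmt->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_VIDEO) { vid = (int)i; break; }
    if (vid < 0) return -1;

    const AVCodec *dec = avcodec_find_decoder(fmt->streams[vid]->codecpar->codec_id);
    AVCodecContext *ctx = avcodec_alloc_context3(dec);
    avcodec_parameters_to_context(ctx, fmt->streams[vid]->codecpar);
    if (avcodec_open2(ctx, dec, NULL) < 0) return -1;

    FILE *out = fopen("out.yuv", "wb");
    AVPacket *pkt = av_packet_alloc();
    AVFrame  *frm = av_frame_alloc();

    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == vid && avcodec_send_packet(ctx, pkt) >= 0) {
            while (avcodec_receive_frame(ctx, frm) >= 0) {
                /* Copy row by row because linesize may be padded wider than the image. */
                for (int y = 0; y < frm->height; y++)
                    fwrite(frm->data[0] + y * frm->linesize[0], 1, frm->width, out);
                for (int y = 0; y < frm->height / 2; y++)
                    fwrite(frm->data[1] + y * frm->linesize[1], 1, frm->width / 2, out);
                for (int y = 0; y < frm->height / 2; y++)
                    fwrite(frm->data[2] + y * frm->linesize[2], 1, frm->width / 2, out);
            }
        }
        av_packet_unref(pkt);
    }

    fclose(out);
    av_frame_free(&frm);
    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    avformat_close_input(&fmt);
    return 0;
}

The resulting out.yuv can then be inspected with a raw YUV player, provided the player is told the width, height and pixel format.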
The color display principle of a computer color monitor is the same as that of a color TV set: additive mixing of R (red), G (green), and B (blue). Three electron beams of different intensities are emitted to make the red, green, and blue phosphor materials on the inside of the screen emit light and produce color. This representation is called the RGB color space, and it is also the most commonly used color space in multimedia.
Recently, while working with cameras, I looked up some material about YUV, mainly about the YUV422 signal format (used by mobile phone cameras).
There are many kinds of YUV signals; YUV420 and YUV422 are the most commonly used, and most CMOS sensors output YUV422.
The YUV422 format itself is divided into several sub-formats. Depending on the arrangement of Y, U, and V there are four byte orders: YUYV, YVYU, UYVY, and VYUY, of which YUYV is the one usually meant when people simply say YUV422.
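As a quick illustration of those byte orders, the sketch below shows how the Y, U and V samples of one YUYV macropixel (two horizontally adjacent pixels sharing one U and one V) can be pulled apart; the helper name and the comment table are mine, not from the original article.

#include <stdint.h>

/* One YUV422 macropixel covers two pixels and occupies 4 bytes.
 * Byte order per macropixel for the four packings:
 *   YUYV: Y0 U0 Y1 V0
 *   YVYU: Y0 V0 Y1 U0
 *   UYVY: U0 Y0 V0 Y1
 *   VYUY: V0 Y0 U0 Y1
 */
static void read_yuyv_macropixel(const uint8_t *p,
                                 uint8_t *y0, uint8_t *y1,
                                 uint8_t *u, uint8_t *v) {
    *y0 = p[0];   /* luma of the first pixel  */
    *u  = p[1];   /* shared chroma U          */
    *y1 = p[2];   /* luma of the second pixel */
    *v  = p[3];   /* shared chroma V          */
}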
--arch=arm \
--enable-cross-compile \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS" \
$ADDITIONAL_CONFIGURE_FLAG
#make
}
CPU=arm
PREFIX=$(pwd)/android/$CPU
ADDI_CFLAGS="-marm"
build_one
The key settings are the NDK path (configure it according to your actual installation), the native API level, and the choice of cross-compiler toolchain; also pay attention to the difference between 32-bit and 64-bit x86 hosts. The target platform here is ARM. With the configuration above, the build can then be run.
This article describes in detail how to read and write YUV files in MATLAB, and gives the full code with analysis. The sample program begins as follows:
close all; clear;
fid1 = fopen('D:\HM-14.0-ROI\bin\vc10\Win32\Release\Result\background modeling result\hall_cif_352x288_300\qp=22\Bgf_rec.yuv', 'rb');
fid2 = fopen('D:\HM-14.0-ROI\bin\vc10\Win32\Release\Result\background modeling result\hall_cif_352x288_300\qp=22\cal_rec.yuv', 'rb');
The previous article mainly used the SoftwareRenderer class from AwesomePlayer directly to display YUV. To use that class, the program has to depend, at considerable cost, on libstagefright, libstagefright_color_conversion and other dynamic and static libraries, which gives the program a very high degree of coupling and also does not help us understand, at a deeper level, how YUV data is displayed directly.
About the YUV format: the YUV format typically has two main classes, the packed format and the planar format. The former stores the Y, U and V components in the same array, usually with several neighboring pixels grouped into a macropixel (macro-pixel), while the latter uses three separate arrays to store the Y, U and V components.
The main sampling formats are YCbCr 4:2:0, YCbCr 4:2:2, YCbCr 4:1:1, and YCbCr 4:4:4. Of these, YCbCr 4:2:0 is the most commonly used. It means that each pixel stores an 8-bit luminance value (the Y value), while each 2x2 block of pixels shares one Cr and one Cb value, and to the human eye the image hardly changes. So, where the original RGB model (R, G and B each 8-bit unsigned) needs 8x3 = 24 bits per pixel (with full sampling, YUV likewise occupies 24 bits per pixel), now only 8 + (8/4) + (8/4) = 12 bits per pixel are needed on average.
Differences between YV12 and I420
Generally, the video data captured directly from a camera is in RGB24 format. One RGB24 frame has size = width × height × 3 bytes, an RGB32 frame has size = width × height × 4 bytes, and an I420 frame (the standard YUV420 planar format) has size = width × height × 1.5 bytes. After RGB24 data is captured, this data is compressed for the first time by converting the image's color space from RGB to YUV.
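A quick arithmetic check of those sizes, written as small helpers; the function names are mine, and the byte counts follow directly from the bits-per-pixel figures above (24 bpp for RGB24, 32 bpp for RGB32, 12 bpp for I420).

#include <stddef.h>

/* Frame sizes in bytes for a given resolution (assumes even width and height for I420). */
static size_t rgb24_frame_size(size_t w, size_t h) { return w * h * 3; }                      /* 24 bpp */
static size_t rgb32_frame_size(size_t w, size_t h) { return w * h * 4; }                      /* 32 bpp */
static size_t i420_frame_size(size_t w, size_t h)  { return w * h + 2 * (w / 2) * (h / 2); }  /* 12 bpp: Y + U + V */

/* Example: 640x480 gives 921600 bytes (RGB24), 1228800 (RGB32) and 460800 (I420).
 * YV12 has the same size as I420; the two differ only in the order of the U and V planes. */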
Dark areas: RGB values close to 0.
Mid-tones: RGB values in the middle of the 0-255 range.
Highlights: RGB values close to 255.
RGB and YUV are both color spaces and can be converted into each other. YUV (also called YCrCb) is the color encoding method used by European TV systems; it is mainly used to optimize the transmission of color video signals.
These past two days on Linux I have been capturing YUV video with V4L2 and then encoding it into an H.264 file.
I saved the YUV video to a file and played it with the pyuv player; unfortunately the playback came out as a garbled picture.
The capture parameters were:
Size: 640*480
YUV format: YUYV (i.e. YUV422)
At first I kept thinking my own code was at fault.
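For reference, requesting that capture format from V4L2 looks roughly like the sketch below; it assumes a camera at /dev/video0 and leaves out buffer setup, streaming and error recovery.

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void) {
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 640;                      /* matches the 640*480 capture above */
    fmt.fmt.pix.height = 480;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;  /* packed YUV422 (YUYV) */
    fmt.fmt.pix.field = V4L2_FIELD_NONE;

    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); close(fd); return 1; }

    /* The driver may adjust the requested values; always check what it actually set. */
    printf("got %ux%u, bytesperline=%u\n",
           fmt.fmt.pix.width, fmt.fmt.pix.height, fmt.fmt.pix.bytesperline);
    close(fd);
    return 0;
}

A mismatch between the format the driver actually delivers and the format the player is told to expect is one common cause of the kind of garbled playback described above.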
The simplest video encoder: based on libx265 (encoding YUV as H.265)
This article records a simple H.265 (HEVC) video encoder based on libx265. The encoders recorded previously used FFmpeg to call libx265 for encoding, for example:
The simplest FFmpeg-based video encoder, new version (encoding YUV as HEVC (H.265)). Compared with that encoder, the encoder recorded in this article is a "lightweight" one.
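The article's code is not reproduced here, but the overall call sequence of the libx265 API that such an encoder is built around looks roughly like this sketch; the resolution, frame rate and file names are placeholders, and error handling is omitted.

#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>
#include <x265.h>

/* Sketch: encode raw I420 frames from input.yuv into raw HEVC NAL units in output.hevc. */
int main(void) {
    const int w = 640, h = 480;

    x265_param *param = x265_param_alloc();
    x265_param_default(param);
    param->sourceWidth  = w;
    param->sourceHeight = h;
    param->fpsNum   = 25;
    param->fpsDenom = 1;
    param->internalCsp = X265_CSP_I420;

    x265_encoder *enc = x265_encoder_open(param);
    x265_picture *pic = x265_picture_alloc();
    x265_picture_init(param, pic);

    /* One contiguous I420 frame buffer: Y plane, then U, then V. */
    uint8_t *buf = malloc((size_t)w * h * 3 / 2);
    pic->planes[0] = buf;
    pic->planes[1] = buf + w * h;
    pic->planes[2] = buf + w * h * 5 / 4;
    pic->stride[0] = w;
    pic->stride[1] = w / 2;
    pic->stride[2] = w / 2;

    FILE *in  = fopen("input.yuv", "rb");
    FILE *out = fopen("output.hevc", "wb");
    x265_nal *nals; uint32_t nnal;

    while (fread(buf, 1, (size_t)w * h * 3 / 2, in) == (size_t)w * h * 3 / 2) {
        x265_encoder_encode(enc, &nals, &nnal, pic, NULL);
        for (uint32_t i = 0; i < nnal; i++)
            fwrite(nals[i].payload, 1, nals[i].sizeBytes, out);
    }
    while (x265_encoder_encode(enc, &nals, &nnal, NULL, NULL) > 0)   /* flush delayed frames */
        for (uint32_t i = 0; i < nnal; i++)
            fwrite(nals[i].payload, 1, nals[i].sizeBytes, out);

    x265_encoder_close(enc);
    x265_picture_free(pic);
    x265_param_free(param);
    fclose(in); fclose(out);
    free(buf);
    return 0;
}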
YUV is the color encoding method adopted by European television systems. In a modern color TV system, a three-tube color camera or a color CCD camera is usually used to capture the image. The color image signal is then color-separated, and each component is amplified and corrected to obtain RGB; after a matrix transformation circuit, the luminance signal Y and two color-difference signals R-Y (U) and B-Y (V) are obtained. Finally, the sending end encodes the luminance signal and the two color-difference signals separately and transmits them over the same channel.
The main sampling formats are YCbCr 4:2:0, YCbCr 4:2:2, YCbCr 4:1:1, and YCbCr 4:4:4, of which YCbCr 4:2:0 is commonly used. It means that each pixel stores an 8-bit brightness value (the Y value), while each 2x2 block of pixels shares one Cr and one Cb value, and to the naked eye the image hardly seems to change. So, where the original RGB model (R, G and B all 8-bit unsigned) needs 8x3 = 24 bits per pixel (as in the first figure), now only 8 + (8/4) + (8/4) = 12 bits per pixel are needed on average (as in the second figure), which cuts the image data in half.
Continuing from the previous article:
http://blog.csdn.net/openswc/article/details/51597755
Second, use FFmpeg to encode the YUV into an .h264 file
1. Download and install FFmpeg
./configure --enable-libx264 --enable-gpl --enable-shared
make
make install
2. Use the installed ffmpeg command line to encode the YUV into .h264
ffmpeg -s 480x272 -i ds_480x272.yuv -r 25 -vcodec libx264 ds2.h264
Good articles, collected as links: https://www.douban.com/note/76361504/ and http://blog.sina.com.cn/s/blog_a85e142101010h8n.html. Below is what I pasted from the linked articles, to make it easy for me to look things up.
(1) RGB
RGB (red-green-blue) is a color space defined by the colors the human eye can recognize, and it can represent most colors. However, the RGB color space is generally not used in scientific research, because its details are difficult to adjust digitally: it represents hue, brightness and saturation together, which makes them hard to separate.
Reprinted: http://www.cnblogs.com/soniclq/archive/2012/02/02/2335974.html
About YUV format
The YUV format generally falls into two categories: the packed format and the planar format. The former stores the YUV components in the same array, usually with several adjacent pixels forming a macropixel (macro-pixel), while the latter uses three separate arrays to store the Y, U and V components.
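To make the packed versus planar distinction concrete, here is a rough sketch of repacking a packed YUYV (4:2:2) buffer into planar I420 (4:2:0); averaging the chroma of each pair of lines is one common way to do the vertical downsampling, but not the only one, and the function name is mine.

#include <stdint.h>

/* Convert packed YUYV (4:2:2) to planar I420 (4:2:0).
 * src holds width*height*2 bytes; dst holds width*height*3/2 bytes (Y plane, then U, then V).
 * Assumes even width and height. */
static void yuyv_to_i420(const uint8_t *src, uint8_t *dst, int width, int height) {
    uint8_t *dst_y = dst;
    uint8_t *dst_u = dst + width * height;
    uint8_t *dst_v = dst_u + (width / 2) * (height / 2);

    for (int y = 0; y < height; y++) {
        const uint8_t *line = src + y * width * 2;
        for (int x = 0; x < width; x += 2) {
            /* Macropixel layout: Y0 U Y1 V */
            dst_y[y * width + x]     = line[x * 2 + 0];
            dst_y[y * width + x + 1] = line[x * 2 + 2];
            if ((y & 1) == 0) {
                /* Average this line's chroma with the next line's to go from 4:2:2 to 4:2:0. */
                const uint8_t *next = line + width * 2;
                int u = (line[x * 2 + 1] + next[x * 2 + 1] + 1) / 2;
                int v = (line[x * 2 + 3] + next[x * 2 + 3] + 1) / 2;
                dst_u[(y / 2) * (width / 2) + x / 2] = (uint8_t)u;
                dst_v[(y / 2) * (width / 2) + x / 2] = (uint8_t)v;
            }
        }
    }
}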
This article introduces a simple video encoder based on FFmpeg that encodes YUV420P pixel data into H.264 compressed data. The encoder's code is very simple, but every line of it matters, so it is well worth studying carefully. Once this code is clear, FFmpeg's encoding flow is basically clear. Although the program already works, there are still places I have not fully understood, so I need to keep exploring and then add to this content.
This program uses the FFmpeg libraries libavformat and libavcodec; the key API calls are listed further below.
The following coefficients are used in the conversion process:
C = Y - 16
D = U - 128
E = V - 128
Using the previous coefficients, and noting that clip() denotes clamping a value to the range 0 to 255, the following formulas give the conversion from YUV to RGB:
R = clip((298 * C + 409 * E + 128) >> 8)
G = clip((298 * C - 100 * D - 208 * E + 128) >> 8)
B = clip((298 * C + 516 * D + 128) >> 8)
These formulas use coefficients that require more than 8 bits of precision to produce each 8-bit result.
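Put together, those formulas amount to a small conversion routine; the sketch below is just the integer approximation quoted above wrapped in a function, with names of my choosing.

#include <stdint.h>

static uint8_t clip(int x) {               /* clamp to the 0..255 range */
    return (uint8_t)(x < 0 ? 0 : (x > 255 ? 255 : x));
}

/* Convert one YUV triplet to RGB using the integer formulas above. */
static void yuv_to_rgb(uint8_t Y, uint8_t U, uint8_t V,
                       uint8_t *R, uint8_t *G, uint8_t *B) {
    int C = Y - 16;
    int D = U - 128;
    int E = V - 128;
    *R = clip((298 * C + 409 * E + 128) >> 8);
    *G = clip((298 * C - 100 * D - 208 * E + 128) >> 8);
    *B = clip((298 * C + 516 * D + 128) >> 8);
}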
The main functions in this encoding flow are as follows (a minimal skeleton using them appears after the list):
av_register_all(): Registers all of FFmpeg's codecs.
avformat_alloc_output_context2(): Initializes the AVFormatContext of the output stream.
avio_open(): Opens the output file.
av_new_stream(): Creates an AVStream for the output stream.
avcodec_find_encoder(): Finds the encoder.
avcodec_open2(): Opens the encoder.
avformat_write_header(): Writes the file header (this call is not needed for some container formats that have no file header, such as MPEG-TS).
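As promised above, here is a minimal skeleton showing how those calls fit together at the start of the encoding flow. It is a sketch, not the article's source: it uses avformat_new_stream() in place of the long-deprecated av_new_stream(), the output file name and encoder settings are placeholders, and the per-frame loop and trailer writing are only indicated in a comment.

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void) {
    const char *out_path = "out.h264";              /* placeholder output file */

    av_register_all();                              /* unnecessary since FFmpeg 4.0, removed in 5.0; shown only to mirror the list */

    AVFormatContext *ofmt = NULL;
    avformat_alloc_output_context2(&ofmt, NULL, NULL, out_path);
    avio_open(&ofmt->pb, out_path, AVIO_FLAG_WRITE);

    AVStream *st = avformat_new_stream(ofmt, NULL); /* modern replacement for av_new_stream() */

    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVCodecContext *cctx = avcodec_alloc_context3(codec);
    cctx->width = 640;
    cctx->height = 480;
    cctx->pix_fmt = AV_PIX_FMT_YUV420P;
    cctx->time_base = (AVRational){1, 25};
    avcodec_open2(cctx, codec, NULL);
    avcodec_parameters_from_context(st->codecpar, cctx);

    avformat_write_header(ofmt, NULL);

    /* ... per-frame loop: fill an AVFrame with YUV420P data, send it with
       avcodec_send_frame(), receive AVPackets with avcodec_receive_packet(),
       write each packet with av_write_frame(), then call av_write_trailer(ofmt)
       and clean up ... */
    return 0;
}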