Network Abstraction Layer Unit (NALU): the NALU header consists of a single byte with the following syntax:

+---------------+
|0|1|2|3|4|5|6|7|
+-+-+-+-+-+-+-+-+
|F|NRI|  Type   |
+---------------+

F: 1 bit. forbidden_zero_bit. Must be 0, as stipulated in the H.264 specification.
NRI: 2 bits. nal_ref_idc. Values 00-11 indicate the importance of this NALU; for example, a decoder can discard a NALU with NRI 00 without affecting playback of the image.
Type: 5 bits. nal_unit_type. The type of this NALU unit is…
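A minimal sketch of parsing that header byte, with the field layout exactly as in the diagram above:

```python
def parse_nalu_header(byte0: int):
    """Split the first NALU byte into its three fields."""
    f   = (byte0 >> 7) & 0x1   # forbidden_zero_bit, must be 0
    nri = (byte0 >> 5) & 0x3   # nal_ref_idc: importance, 0..3
    typ =  byte0       & 0x1F  # nal_unit_type: e.g. 5 = IDR slice, 7 = SPS, 8 = PPS
    return f, nri, typ

# Example: 0x67 = 0110 0111 -> F=0, NRI=3, Type=7 (an SPS)
print(parse_nalu_header(0x67))  # (0, 3, 7)
```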
Original: http://blog.csdn.net/gubenpeiyuan/article/details/19548019  Theme: FFmpeg. Overview: This article introduces how FFmpeg, the well-known open-source audio/video codec library, decodes an H264 stream, and describes its H264 stream input process, decoding principle, and decoding procedure in detail. At the same time, in most application environments, the original stream's video size display is not th…
Transferred from: http://cache2.weidaohang.org/h/index.php?q=aHR0cDovL2Jsb2cuY3Nkbi5uZXQvemh1cWluZ183MzkvYXJ0aWNsZS9kZXRhaWxzLzY2MzY4NTc=  One question had long puzzled me: everyone says FFmpeg is very powerful, but I had never studied it; today I finally saw some of its strengths! First of all, FFmpeg must of course be compiled and installed successfully on Linux; for the installation process, you can refer to my earlier blog post. Here is a direct introduction of how to…
Intel Sandy Bridge HW H264 Encoder: GOP (Group of Pictures) settings
You can set h264 encoder parameters by referring to the sample code provided by the Intel Media SDK:
Intel(R) Media SDK encoding sample
Usage: sample_encode.exe h264|mpeg2 [Options] -i InputYUVFile -o OutputEncodedFile -w width -h height
Options: [-nv12] - input is in NV12 color format; if not specified…
http://bbs.chinavideo.org/viewthread.php?tid=7575
I believe many of you, like me, want to play H264 streaming video. However, a newbie often does not know where to start, and searching Baidu, Google, and other sources turns up a mountain of material. After N weeks of thinking I made some progress, though a lot of effort was wasted. I spent a week reading the English protocol, and only later learned that there was a Chinese version; in addition, the g…
It seems the problem can only be solved this way. More testing is now needed to guard against new problems. Currently it does not affect the existing code, and the erroneous frame is simply dropped rather than displayed.
Idea: I asked about H264 decoding on set-top boxes. They use hardware decoding and simply call an interface the hardware decoder provides to set the error-handling mode. I suspect this error-handling mode discards erroneous frames, so…
Awesome video conferencing website: http://wmnmtm.blog.163.com/blog/#m=0
http://wmnmtm.blog.163.com/blog/static/38245714201192491746701/
When using RTP to transmit H264, an SDP description is required. Two items in it, the sequence parameter set (SPS) and the picture parameter set (PPS), are needed, so where are these two items obtained? The answer is to get them from the…
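In the H264 RTP payload format (RFC 6184), the SDP carries SPS and PPS base64-encoded in the `sprop-parameter-sets` attribute of the `a=fmtp:` line. A minimal sketch of pulling them out; the fmtp line below is made up for illustration:

```python
import base64

def sps_pps_from_sdp(sdp: str):
    """Extract base64 SPS/PPS from an SDP 'sprop-parameter-sets' attribute."""
    for line in sdp.splitlines():
        if "sprop-parameter-sets=" in line:
            value = line.split("sprop-parameter-sets=")[1].split(";")[0]
            sps_b64, pps_b64 = value.split(",")[:2]
            return base64.b64decode(sps_b64), base64.b64decode(pps_b64)
    return None

# Hypothetical fmtp line for illustration only:
sdp = "a=fmtp:96 packetization-mode=1;sprop-parameter-sets=Z0LAHtkDxWhAAAADAEAAAAwDxYuS,aMuMsg=="
sps, pps = sps_pps_from_sdp(sdp)
print(sps[0] & 0x1F, pps[0] & 0x1F)  # 7 8 -> nal_unit_type of SPS and PPS
```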
1. Encoding H264 video into an MP4 file
See: Encoding H264 video into an MP4 file
See: Compiling Mp4v2 under VS2010 and using it in projects. A recent project required encapsulating H264 files as MP4 files. I found the MP4V2 library on the Internet, but after downloading it I did not know where to start; on the official site https://code.google.com/p/mp4v2/ in the comp…
I. Video Encoding
1.1 Goals of video compression and encoding
1) Ensure compression ratio
2) Ensure recovery quality
3) Easy to implement, low cost, and reliable
1.2 Starting points of compression (feasibility)
1) Time Correlation
In a video sequence, two adjacent frames differ very little. This is temporal correlation.
2) Spatial correlation
Within the same frame, adjacent pixels are strongly correlated; the closer two pixels are, the stronger the correlation.
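A toy numeric illustration of the temporal correlation described above: two consecutive "frames" (tiny 4x4 grayscale grids, values made up) differ in only a few samples, so coding the residual is far cheaper than coding the second frame from scratch.

```python
# Two consecutive tiny "frames"; frame2 changes only one pixel.
frame1 = [
    [10, 10, 12, 12],
    [10, 11, 12, 13],
    [40, 41, 42, 43],
    [40, 40, 42, 42],
]
frame2 = [row[:] for row in frame1]
frame2[1][2] = 20  # one small moving detail

# Residual = frame2 - frame1; almost everywhere zero.
residual = [[b - a for a, b in zip(r1, r2)] for r1, r2 in zip(frame1, frame2)]
nonzero = sum(1 for row in residual for v in row if v != 0)
print(nonzero, "of 16 residual values are nonzero")  # 1 of 16
```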
If you want to transmit video streams in real time when using Android phones for H264 hardware encoding, you need to know the sequence parameter set (SPS) and picture parameter set (PPS) of the video stream.
Today I figured out how to obtain the SPS and PPS, and I record it here in the hope it helps you.
First, let's take a look at the prerequisites. The video recording parameters I set are:
mMediaRecorder.setOutputFormat(MediaRecorder.Outp…
I wrote an earlier article analyzing the format of H264 RTP packetization: RTP encapsulation of H264. However, it seems that fragmentation and some situations needing attention were not clearly stated, so here is a supplement that also serves as my own memo (my memory does not seem that good).
Note that the sampling rate of H264 is…
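For reference, the H264 RTP payload format (RFC 6184) specifies a 90 kHz timestamp clock. A minimal sketch of the per-frame timestamp increment at common frame rates:

```python
RTP_CLOCK_HZ = 90_000  # H264 RTP timestamp clock rate per RFC 6184

def timestamp_increment(fps: float) -> int:
    """RTP timestamp ticks to add per video frame at a given frame rate."""
    return round(RTP_CLOCK_HZ / fps)

print(timestamp_increment(25))  # 3600
print(timestamp_increment(30))  # 3000
```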
To play a raw H264 stream, the task can be split into three jobs:
1. Decode the raw H264 stream to get YUV data.
2. Convert the YUV data to RGB data and fill a picture.
3. Display the resulting picture.
To complete job 1 we can directly use the HiSilicon decoding library; since it is a C++ dynamic library, calling it from C# can be done by referring to HiSilicon…
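Job 2 above amounts to a per-pixel color-space conversion. A minimal sketch using the BT.601 full-range coefficients, which are one common choice; the correct matrix for a given stream depends on its signaled colorimetry:

```python
def yuv_to_rgb(y: int, u: int, v: int):
    """BT.601 full-range YUV -> RGB for a single pixel."""
    d, e = u - 128, v - 128  # center the chroma components
    r = max(0, min(255, round(y + 1.402 * e)))
    g = max(0, min(255, round(y - 0.344136 * d - 0.714136 * e)))
    b = max(0, min(255, round(y + 1.772 * d)))
    return r, g, b

print(yuv_to_rgb(255, 128, 128))  # (255, 255, 255) -> white
print(yuv_to_rgb(0, 128, 128))    # (0, 0, 0) -> black
```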
iOS audio (AAC) and video (H264) encoding and streaming best practices. The project is personal research and experimentation; there may be many flaws or mistakes, please forgive them.
1. Feature overview
* Capture audio and video data
* Encode the audio and video data: video to H264, audio to AAC
* Publish the audio and video data: the encoded audio and video trans…
, skip here. However, note one problem: in non-IE browsers the session will be lost. After looking up a lot of material, the reasons are summarized as follows:
Because Uploadify uses a Flash client, it produces a UserAgent different from the browser's User-Agent.
Final solution:
Add the session parameter to the Uploadify upload parameters, as follows:
scriptData: {"session_id": ""},
Add the following code to the server-side receive page:
if (@$_REQUEST['…
H264 ES raw data is generally in NAL (Network Abstraction Layer) format, which can be used directly for file storage and network transport. Each NALU (NAL unit) consists of a header plus RBSP data.
The first step is to split the data stream into individual NALUs.
Get the nal_type of the NALU; an i_nal_type equal to 0x7 indicates that the NALU is an SPS packet. Find and parse this SPS packet, which contains very importa…
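The two steps above can be sketched as follows. The payload bytes in the example stream are made up for illustration; only the Annex-B start codes and the type field of each first header byte matter here:

```python
def split_nalus(stream: bytes):
    """Split an Annex-B byte stream on 00 00 01 / 00 00 00 01 start codes."""
    nalus, i = [], 0
    while True:
        j = stream.find(b"\x00\x00\x01", i)
        if j < 0:
            break
        start = j + 3
        k = stream.find(b"\x00\x00\x01", start)
        end = k if k >= 0 else len(stream)
        # trim a trailing 0x00 that belongs to the next 4-byte start code
        if k > 0 and stream[k - 1] == 0:
            end = k - 1
        nalus.append(stream[start:end])
        i = start
    return nalus

# Hypothetical stream: an SPS (type 7) followed by a PPS (type 8)
data = b"\x00\x00\x00\x01\x67\x42\x00\x1e" + b"\x00\x00\x00\x01\x68\xce\x38\x80"
nalus = split_nalus(data)
print([n[0] & 0x1F for n in nalus])  # [7, 8]
```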
Step by step learning, a little progress every day
FFmpeg + x264 + Qt: decoding and encoding H264
Decoding: decode an MP4 file in H264 format and save the decoded RGB frames in PPM format.
Encoding: encode the decoded RGB frames back to H264.
Code:
Decoding section:
.pro file:

TEMPLATE = app
CONFIG += console
CONFIG -= qt
SOURCES += main.cpp
INCLUDEPATH += /usr/local/include/
LIBS += -L/…
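The decoding section above saves RGB frames as PPM, a format simple enough to write by hand: an ASCII "P6" header followed by raw packed RGB24 bytes. A minimal sketch (the file name is illustrative; the Qt/FFmpeg decode itself is not shown here):

```python
def write_ppm(path: str, width: int, height: int, rgb: bytes):
    """Write raw packed RGB24 data as a binary PPM (P6) file."""
    assert len(rgb) == width * height * 3
    with open(path, "wb") as f:
        f.write(f"P6\n{width} {height}\n255\n".encode("ascii"))
        f.write(rgb)

# 2x1 image: one red pixel, one green pixel
write_ppm("frame.ppm", 2, 1, bytes([255, 0, 0, 0, 255, 0]))
```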
I. H264 Basic Concepts
1.1 NAL, slice, and frame: introduction and interrelation
NAL refers to the Network Abstraction Layer, which carries network-related information.
A slice is a portion of a picture. H264 organizes an image as a frame or as two fields, and a frame can be divided into one or several slices, which are composed of macroblocks (MB). A macroblock is the basic unit of encoding processing.
A frame can be divided into…
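As a quick worked example of the macroblock partitioning just described, a frame is covered by a whole number of 16x16 macroblocks, with edges padded up when the dimensions are not multiples of 16:

```python
import math

def macroblock_grid(width: int, height: int, mb: int = 16):
    """Number of 16x16 macroblocks needed to cover a frame (edges padded)."""
    return math.ceil(width / mb), math.ceil(height / mb)

cols, rows = macroblock_grid(1920, 1080)
print(cols, rows, cols * rows)  # 120 68 8160
```

Note that 1080 is not a multiple of 16, which is why 1080p streams are often coded at 1088 lines with a crop.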
Live555 on arm-linux: cross-compile and download to the ARM board to play H264 files. My system is Ubuntu 11.10. 1. Download the live555 source code and an H264 test file. 2. Modify config.* according to your own cross compiler (config.armeb-uclibc); my compiler is buildroot-…
1. Vocabulary conventions of this article
Macroblock: the H264 encoding base unit, 16x16 pixels (samples).
Block: a unit of 8x8 pixels (samples).
Sub-block: a unit of 4x4 pixels (samples).
2. Intra-frame luma prediction modes
In the H264 specification, a macroblock has 4 intra luma prediction modes, numbered 0-3; blocks and sub-blocks each have 9 intra luma prediction modes,
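Of the 9 intra 4x4 luma modes mentioned above, mode 2 (DC) is the simplest: when both neighbor rows are available, every predicted sample is the rounded mean of the 4 samples above and the 4 samples to the left. A minimal sketch with made-up neighbor values:

```python
def intra4x4_dc(top, left):
    """H264 intra 4x4 DC prediction (mode 2), both neighbor rows available:
    every predicted sample is the rounded mean of 4 top + 4 left samples."""
    dc = (sum(top) + sum(left) + 4) >> 3  # divide by 8 with rounding
    return [[dc] * 4 for _ in range(4)]

pred = intra4x4_dc(top=[10, 12, 14, 16], left=[10, 10, 12, 12])
print(pred[0][0])  # 12  -> (96 + 4) >> 3
```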