Source: http://www.aichengxu.com/view/37145

This article covers decoding an H264 video stream with FFmpeg on the iOS platform; friends with this need may find it a useful reference. For video files and mainstream transport protocols such as RTSP, FFmpeg provides the avformat_open_input interface, which can be called directly with a file path or URL. Reading video data, setting the decoder's initial parameters, and so on can all be done through the API. However, for
To put it simply, MKV and AVI are only container formats: they encapsulate video streams and audio streams. The container does not affect image quality; what matters is what is packed inside it. Changing the container therefore does not affect definition, subtitles, or audio.
I would like to add some basic knowledge about video; parts of it may be imprecise.
First, the video format (container) does not determine the definition. The definition of a video depends
An I macroblock uses already-decoded pixels within the current slice as the reference for intra-frame prediction.
A P macroblock uses a previously encoded picture as the reference picture for inter-frame prediction.
A B macroblock uses two reference pictures (a previous frame and a following frame) for bidirectional inter-frame prediction.
The purpose of a slice is to limit the spread and propagation of bit errors, so that encoded slices remain independent of each other.
Prediction within one slice cannot use macroblocks in other slices as reference, so that the prediction
The previous article introduced streaming H264 video over RTMP; the following explains how to stream real-time H264 video over RTSP.
The idea is to send the video stream to live555 and let live555 serve the H264 data stream live.
The video capture module sends the H264 data f
according to the RTMP protocol, the RTMP video stream can be published. To make video packaging and publishing easier, a CRTMPStream class is encapsulated. Currently only H264 video can be sent; you can send H264 data frames or H264 files directly. The interface provided by CRTMPStream is as follows.
class CRTMPStream
{
public:
    CRTMPStream(void);
    ~CRTMPStream(vo
H264 decoder source code: the H264 decoding part of FFmpeg ported to Android, with deep pruning and optimization, validated on the emulator (320x480).
The app uses the JNI architecture: the interface, file reading, and video display are done in Java, while the low-level video decoding is done in C to meet the speed requirements.
In this release, the NAL from the
Regarding the use of rtmpdump, most applications target the Windows platform. Lei Xiaohua ("Thor" -- rendered inconsistently as both "Thor" and "Raytheon" in translation) has published a series of analyses, but they are mainly Windows-based. The main task of this article is to port the "simplest librtmp example: publish H.264 (H.264 through RTMP)" project to Linux, fixing some problems and adding documentation along the way.
Before using the project, you should have an RTMP server running and a reasonable understanding of the H264
1. Learning Route
Step 1: gain a preliminary understanding of H264, and learn how H264 data frames are classified and recognized.
Step 2: work through the algorithms behind the H264 syntax; this is important for understanding H264 video frames. The H264 data definition is made up of a set of bit b
Recently I needed to serve h264 file downloads from an IIS server. After IIS was deployed, entering the URL in the browser's address bar reported that the file did not exist and it could not be downloaded, while other file types worked.
In the end, my solution was to configure the IIS server as follows:
1. Add a MIME type so that IIS supports more file types. Skip this step if .h264 already exists among the MIME types. You can also set other ty
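For reference, step 1 can also be done declaratively. Below is a minimal web.config fragment (my own sketch, not from the original post) that registers a MIME type for the .h264 extension on IIS 7+; the video/h264 MIME type is an assumption, and any binary type such as application/octet-stream would work equally well for downloads.

```xml
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Remove first so IIS does not error out if the extension is
           already mapped, then map .h264 so static files are served. -->
      <remove fileExtension=".h264" />
      <mimeMap fileExtension=".h264" mimeType="video/h264" />
    </staticContent>
  </system.webServer>
</configuration>
```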
After debugging for a while, the RTSP server can finally deliver live video. The test environment is:
① The RT5350 runs in AP mode.
② The client is a PC running VLC; Wi-Fi uses a 54 Mbps TP-Link USB NIC.
BusyBox v1.12.1 (22:09:57 CST) built-in shell (ash)
Enter 'help' for a list of built-in commands.

# free
        total   used   free  shared  buffers
Mem:    28116  18524   9592       0      836
Swap:       0      0      0
Total:  28116  18524   9592

# top
Mem: 18520K used, 9596K free, 0K shrd, 836K buff, 2840K cached
CPU:  9% usr  9% sys  0% nice  81% idle
A trial h264 video decoding program found on the Internet; it can be used to study the logical flow.
The code is as follows:
test_display_h264()
{
    in_fd = open(h264_input_file, O_RDONLY);     /* open the video file */

    fstat(in_fd, &s);                            /* get the input file size */
    file_size = s.st_size;

    in_addr = (char *)mmap(0, file_size, PROT_READ, MAP_SHARED, in_fd, 0); /* map the input file into memory */
    pp_fd = open(pp_dev_name, O_RDWR);           /* open the post-processor; why d
Today, while browsing the Internet, I happened to find the answer to a question of mine.
H.264 video types

The following media subtypes are defined for H.264 video.

Subtype              FOURCC    Description
MEDIASUBTYPE_AVC1    'AVC1'    H.264 bitstream without start codes.
MEDIASUBTYPE_H264    'H264'    H.264 bitstream with start codes.
MEDIASUBTYPE_h264    'h264'    Equivalent to
The following content is reposted from Peter Lee's write-up on the h264 bitstream structure.
H.264's bitstream structure is very different from H.263's: it no longer uses a strict hierarchical structure.
The following image, taken from the network, shows the h264 bitstream structure from another perspective:
Twelve: the h264 RTP packet timestamp
Let's take h264 as an example.
In void H264VideoRTPSink::doSpecialFrameHandling(unsigned /* fragmentationOffset */, ...), the function first checks whether this is the last packet of a frame. If it is, it sets the 'M' marker bit and then sets the timestamp. Where does this timestamp come from? That depends on who calls doSpecialFrameHandling(). After searching, it is called by MultiFr
http://www.cnblogs.com/haibindev/archive/2012/04/16/2450989.html -- implementing an RTMP server that outputs H264 live streams. RTMP (Real Time Messaging Protocol) is a common streaming-media protocol used to transmit audio and video data. Combined with Flash, it is widely used in live streaming, video-on-demand, chat, and other applications, on PC, mobile, embedded, and other platforms, and is a protocol that streaming-media developers frequently encounter. I have
types FU-A and FU-B, whose type values are 28 and 29, respectively. Recall the earlier definition of nal_unit_type: values 0~23 are for H264's own use and 24~31 are unused. When packaging into RTP, if one NALU is placed in one RTP packet, the NALU's own nal_unit_type can be used. However, when multiple NALUs need to be packed into a single RTP packet, or one NALU needs to be split across multiple RTP packets, a new type is defined to identify the packet.
Type   Packet type name
----------------------
An RTMP server for outputting H264 live streams; the RTMP background is the same as in the snippet above. I have previously written an article, "Using the RTMP protocol to send H.264-encoded and AAC-en