Build a Streaming Media Server Using FFmpeg + ffserver


FFmpeg and ffserver can be used together to implement a real-time streaming media service.


I. Understanding

There are four main components involved, and the relationships between them are straightforward once each one is understood:

1. FFmpeg

2. ffserver

3. ffserver.conf

4. feed1.ffm

1. FFmpeg is responsible for transcoding the media. It converts the source media on your server into the streaming format to be sent out.


2. ffserver is responsible for responding to streaming media requests from the client and sending streaming media data to the client.


3. ffserver.conf is the configuration file that ffserver reads at startup. In it you set the network ports and protocols, the cache file feed1.ffm (see below), and the format parameters of the stream to be sent.


4. feed1.ffm can be regarded as a cache (buffer) file for the stream data. FFmpeg sends transcoded data to ffserver; if no client has requested the stream yet, ffserver buffers that data in this file (a command-line sketch of how these four pieces fit together follows this list).
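To make the division of labor concrete, here is a minimal sketch of the whole pipeline, assuming a configuration like the one in the next section (Port 10535, a feed named feed1.ffm, a stream named test.avi) saved as /etc/ffserver.conf, and a hypothetical source file input.mp4:

# 1. Start ffserver; it reads ffserver.conf and creates /tmp/feed1.ffm
ffserver -f /etc/ffserver.conf &
# 2. Start FFmpeg; it transcodes the source and pushes the result into the feed
ffmpeg -i input.mp4 http://localhost:10535/feed1.ffm
# 3. A client pulls the published stream from ffserver
ffplay http://localhost:10535/test.avi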



II. Setting up HTTP streaming

1. Configure the ffserver.conf file (if you are new to it, refer to doc/ffserver.conf in the FFmpeg source code, which contains detailed comments).

An example configuration follows:

Port 10535
RTSPPort 5454
BindAddress 0.0.0.0
# MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 1000
CustomLog -
NoDaemon

# Real-time stream configuration (refer to the ffserver.conf under FFmpeg/test)
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 1M
ACL allow 127.0.0.1
</Feed>

<Stream test.avi>
Feed feed1.ffm
Format avi
BitExact
DctFastint
IdctSimple
VideoFrameRate 10
VideoSize 352x288
VideoBitRate 100
VideoGopSize 30
NoAudio
Preroll 10
StartSendOnKey
# MaxTime 100
</Stream>

# Existing file served instead of a real-time stream
<Stream test.flv>
File "/project/apps/ffserver/test.flv"
Format flv
</Stream>
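Optionally, a status page can be added to the same configuration. This is a standard ffserver feature (the sample doc/ffserver.conf ships with a similar block); it is shown here as an optional addition, not part of the original setup:

# Optional: built-in status page, reachable at http://<server>:10535/stat.html
<Stream stat.html>
Format status
# Restrict who may view the status page (adjust to your network)
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>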



2. How to implement playback

(1) Transmitting a real-time stream over HTTP

To stream a file from the hard disk:

ffserver -f myfile/ffmpeg0.8.9/ffserver.conf & ffmpeg -i inputfile http://localhost:10535/feed1.ffm (where "inputfile" is the source media file)
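When the source is a file rather than a live device, it is usually worth pacing the transcode at the input's native speed so the feed behaves like a live stream; a minimal sketch, assuming a hypothetical source file input.mp4 and the Port 10535 from the sample configuration:

# -re reads the input at its native frame rate instead of as fast as possible
ffmpeg -re -i input.mp4 http://localhost:10535/feed1.ffm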

To transmit a real-time stream captured from a camera:

ffserver -f myfile/ffmpeg0.8.9/ffserver.conf & ffmpeg -f video4linux2 -framerate 30 -i /dev/video0 http://127.0.0.1:10535/feed1.ffm
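If the capture command fails, it can help to first check what the camera actually offers; a small sketch, assuming /dev/video0 is the camera device (the -list_formats option belongs to FFmpeg's video4linux2 input device):

# List the pixel formats and frame sizes the camera supports
ffmpeg -f video4linux2 -list_formats all -i /dev/video0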


Start ffserver and FFmpeg, with ffserver started before FFmpeg. ffserver needs the -f parameter to specify its configuration file when it is started. After ffserver starts, feed1.ffm is created. If you open feed1.ffm you will find that content has already been written into it: you can find the keyword FFM as well as the configuration information for the stream that will be sent to clients. When feed1.ffm is later used as a buffer, this information is not overwritten; think of it as the header of the feed1.ffm file.
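To look at this header yourself, a quick sketch (assuming the feed path /tmp/feed1.ffm from the sample configuration; the exact bytes depend on the FFmpeg version):

# Dump the first bytes of the feed file; an FFM signature should appear at the start
xxd /tmp/feed1.ffm | head -n 4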


After ffserver is started, start FFmpeg. A key parameter is "http://ip:10535/feed1.ffm", where the IP address is that of the host running ffserver; if FFmpeg and ffserver run on the same system, use localhost. When FFmpeg starts, it first establishes a short-lived connection with ffserver. Through this first connection, FFmpeg obtains from ffserver the configuration of the stream that will be sent to clients and uses it as the parameters for its own encoding. FFmpeg then disconnects and establishes a second, persistent connection with ffserver, over which it sends the encoded data.


If you observe the output of ffserver, you will see two HTTP 200 entries during this period; they correspond to the two connections.
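With the NoDaemon and CustomLog - directives from the sample configuration, ffserver stays in the foreground and writes its access log to standard output, so the two connections can be watched directly; a sketch of the exchange on a single machine (paths and file names assumed):

# Terminal 1: ffserver in the foreground, access log on stdout (CustomLog -)
ffserver -f /etc/ffserver.conf
# Terminal 2: the feeder; expect two log entries for feed1.ffm as described above
ffmpeg -i inputfile http://localhost:10535/feed1.ffm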


After obtaining data from the camera, FFmpeg encodes it according to the output stream's encoding settings and sends it to ffserver. When ffserver receives the data and there is no playback request on the network yet, it writes the data into the feed1.ffm buffer. While writing, it adds some header information and splits the data into blocks of 4096 bytes (each block also has its own small structure). Once feed1.ffm reaches the size specified in ffserver.conf, new data wraps around to the beginning of the file (skipping the header) and overwrites the old data. When a playback request arrives, ffserver reads data from feed1.ffm and sends it to the client.
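A simple way to confirm this wrap-around behaviour, assuming the feed path and the FileMaxSize 1M from the sample configuration: the file should grow to roughly that limit and then stop growing while FFmpeg keeps feeding it.

# Watch the feed file; its size should plateau near FileMaxSize while streaming continues
watch -n 1 ls -lh /tmp/feed1.ffm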


(2) Transmitting local files over HTTP

ffserver -f /etc/ffserver.conf

Start ffserver with the command above, then play the stream with ffplay http://ip:port/test.flv, or enter the same URL in VLC.
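With the sample configuration above (Port 10535 and a <Stream test.flv> entry) and everything running on one machine, the playback command would look something like this:

# Play the file-backed stream defined in the sample ffserver.conf
ffplay http://localhost:10535/test.flv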


(3) Transmitting local files over RTSP

ffserver -f /etc/ffserver.conf

Start ffserver with the command above, then play the stream with ffplay rtsp://ip:port/rtsp.mpg, or enter the same URL in VLC.

Note: in testing, the FLV file could not be transmitted over RTSP.
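Using the ports from the sample configuration (RTSPPort 5454), an RTSP request for the AVI stream defined earlier would look roughly like this; whether a given container format can actually be carried over RTSP depends on the FFmpeg/ffserver version, as the note about FLV suggests:

# Request the test.avi stream over RTSP (RTSPPort 5454 in the sample config)
ffplay rtsp://localhost:5454/test.avi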


This article is from "Chen Shenggang's blog"; please keep this source: http://chenshengang.blog.51cto.com/4399161/1653493
