Using FireBreath to develop a real-time playback web plugin (Yate + SIP + FFmpeg + SDL)
When we were building this module, a walkthrough like this was exactly what we needed. Unfortunately there is very little information about FireBreath online, and we were not very familiar with C++, so we felt our way through by trial and error. In the end we got the module working, and this post records how.
First, some context: our Yate SIP server and SIP clients were already in place, including the Android SIP client and the SIP clients for Windows and Linux PCs.
With those pieces already available, building the web service's real-time playback feature on top of them was the natural next step.
I had only just come across FireBreath, and tried to drive SIP, RTP, FFmpeg, and SDL from inside it.
The problems hit along the way are scattered across other posts. For reasons I never pinned down, the RTP library simply would not work inside the plugin: it hung outright while receiving packets. In the end we received over plain UDP instead and stripped the first 12 bytes (the RTP header) off each packet ourselves.
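As a rough sketch of that workaround, the receive loop can read each UDP datagram and skip the fixed 12-byte RTP header before handing the payload on. This assumes a POSIX-style socket already bound to the negotiated media port; the function name and buffer sizes are mine, not from the plugin.

```cpp
// Sketch of the UDP workaround: read raw packets and drop the fixed
// 12-byte RTP header before handing the payload to the decoder.
// Assumes a POSIX socket already bound to the RTP port; names and
// buffer sizes here are illustrative, not taken from the plugin.
#include <sys/types.h>
#include <sys/socket.h>
#include <cstring>
#include <cstddef>

static const size_t kRtpHeaderLen = 12;   // fixed RTP header, no CSRC/extension

// Returns payload length, or -1 on error / runt packet.
ssize_t recvRtpPayload(int sock, unsigned char* payload, size_t payloadCap)
{
    unsigned char packet[2048];
    ssize_t n = recv(sock, packet, sizeof(packet), 0);
    if (n <= (ssize_t)kRtpHeaderLen)
        return -1;                         // too short to carry any media

    size_t len = (size_t)n - kRtpHeaderLen;
    if (len > payloadCap)
        len = payloadCap;
    memcpy(payload, packet + kRtpHeaderLen, len);
    return (ssize_t)len;
}
```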
Three interfaces are registered through FireBreath so the page's JavaScript can call them: SipRegister(), SipInvite(), and SipBye().
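In FireBreath this registration is done in the plugin's JSAPI class, where registerMethod()/make_method() expose C++ members to page JavaScript. The class name and method signatures below are placeholders standing in for whatever the project's generated API class actually looks like.

```cpp
// Sketch of exposing the three entry points to JavaScript with
// FireBreath's JSAPIAuto. Class and signatures are placeholders;
// the real project generated by fbgen has its own API class.
#include "JSAPIAuto.h"
#include <string>

class PlaybackPluginAPI : public FB::JSAPIAuto
{
public:
    PlaybackPluginAPI()
    {
        // Each registerMethod makes the C++ member callable from page JS.
        registerMethod("SipRegister", make_method(this, &PlaybackPluginAPI::SipRegister));
        registerMethod("SipInvite",   make_method(this, &PlaybackPluginAPI::SipInvite));
        registerMethod("SipBye",      make_method(this, &PlaybackPluginAPI::SipBye));
    }

    // Register with the Yate SIP server and start the SipMonitor thread.
    bool SipRegister(const std::string& user, const std::string& server);
    // Send INVITE to request the real-time stream.
    bool SipInvite(const std::string& target);
    // Send BYE to tear the session down.
    bool SipBye();
};
```

From the page, the methods then appear as properties of the plugin element, e.g. plugin.SipRegister("1001", "192.168.1.10") (the arguments are illustrative).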
Everything else is handled by threads started from those three calls. SipRegister starts a SipMonitor thread that receives SIP messages; when SipMonitor sees the 200 OK (i.e. the eXosip answer event), it creates the AudioRecv, VideoRecv, AudioPlay, and VideoPlay threads (a sketch of this thread layout follows the SipBye description below).
SipInvite sends the INVITE message, requesting the real-time video stream;
SipBye sends the BYE message, asking to end the video session.
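Here is a condensed sketch of that thread layout. waitForSipAnswer() is a placeholder for the eXosip event loop that blocks until the 200 OK arrives, and the four worker functions stand in for the real receive/play loops; std::thread is used only for brevity, whereas the plugin itself may use boost or platform threads.

```cpp
// Rough sketch of the thread layout: SipMonitor waits for the 200 OK,
// then spins up the four media threads. All names are placeholders.
#include <thread>
#include <atomic>

std::atomic<bool> g_running(false);

bool waitForSipAnswer();      // placeholder: pump eXosip events until 200 OK
void AudioRecvLoop();         // UDP receive -> AudioCirBuf
void VideoRecvLoop();         // UDP receive -> VideoCirBuf
void AudioPlayLoop();         // AudioCirBuf -> FFmpeg decode -> SDL audio
void VideoPlayLoop();         // VideoCirBuf -> FFmpeg decode -> SDL video

// Body of the SipMonitor thread started from SipRegister().
void SipMonitorThread()
{
    if (!waitForSipAnswer())
        return;                            // registration/invite failed

    g_running = true;
    // Once the call is answered, start the media pipeline.
    std::thread(AudioRecvLoop).detach();
    std::thread(VideoRecvLoop).detach();
    std::thread(AudioPlayLoop).detach();
    std::thread(VideoPlayLoop).detach();
}
```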
The AudioRecv and VideoRecv threads collect the audio and video data and write it into the ring buffers AudioCirBuf and VideoCirBuf;
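A minimal sketch of such a ring buffer, in the spirit of AudioCirBuf/VideoCirBuf: the Recv thread calls write(), the Play thread calls read(). This mutex-guarded version is only illustrative; the plugin's own buffer may differ in capacity and locking strategy.

```cpp
// Minimal byte ring buffer shared between one producer (Recv thread)
// and one consumer (Play thread).
#include <mutex>
#include <cstddef>

class CirBuf {
public:
    explicit CirBuf(size_t cap) : buf_(new unsigned char[cap]), cap_(cap) {}
    ~CirBuf() { delete[] buf_; }

    // Returns bytes actually written (may be less than len if full).
    size_t write(const unsigned char* data, size_t len) {
        std::lock_guard<std::mutex> lk(m_);
        size_t n = 0;
        while (n < len && size_ < cap_) {
            buf_[(head_ + size_) % cap_] = data[n++];
            ++size_;
        }
        return n;
    }

    // Returns bytes actually read (0 if empty).
    size_t read(unsigned char* out, size_t len) {
        std::lock_guard<std::mutex> lk(m_);
        size_t n = 0;
        while (n < len && size_ > 0) {
            out[n++] = buf_[head_];
            head_ = (head_ + 1) % cap_;
            --size_;
        }
        return n;
    }

private:
    unsigned char* buf_;
    size_t cap_, head_ = 0, size_ = 0;
    std::mutex m_;
};
```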
The AudioPlay and VideoPlay threads pull the data back out of AudioCirBuf and VideoCirBuf, decode it with FFmpeg, and play it through SDL.
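To make that last step concrete, here is a condensed sketch of what the VideoPlay thread does: pull bytes out of the ring buffer, decode them with libavcodec, and hand frames to SDL. It assumes an H.264 stream and FFmpeg's send/receive decode API (the exact calls depend on the FFmpeg version), reuses the CirBuf sketch above, and hides the SDL upload behind a placeholder displayFrame(), since the SDL version in use dictates that code.

```cpp
// Condensed VideoPlay loop: ring buffer -> libavcodec -> SDL.
// Codec ID, buffer sizes, and displayFrame() are placeholders.
extern "C" {
#include <libavcodec/avcodec.h>
}
#include <cstddef>
#include <thread>
#include <chrono>

extern CirBuf g_videoCirBuf;            // filled by the VideoRecv thread
void displayFrame(const AVFrame* frm);  // placeholder: upload to an SDL texture/overlay

void VideoPlayLoop()
{
    const AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext* ctx  = avcodec_alloc_context3(codec);
    avcodec_open2(ctx, codec, nullptr);

    AVCodecParserContext* parser = av_parser_init(codec->id);
    AVPacket* pkt = av_packet_alloc();
    AVFrame*  frm = av_frame_alloc();
    unsigned char inbuf[4096];

    for (;;) {
        size_t n = g_videoCirBuf.read(inbuf, sizeof(inbuf));
        if (n == 0) {                   // nothing buffered yet, back off briefly
            std::this_thread::sleep_for(std::chrono::milliseconds(10));
            continue;
        }

        unsigned char* p = inbuf;
        while (n > 0) {
            // Split the raw byte stream into whole packets.
            int used = av_parser_parse2(parser, ctx, &pkt->data, &pkt->size,
                                        p, (int)n, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
            p += used;
            n -= (size_t)used;
            if (pkt->size == 0)
                continue;

            if (avcodec_send_packet(ctx, pkt) < 0)
                continue;
            while (avcodec_receive_frame(ctx, frm) == 0)
                displayFrame(frm);      // SDL rendering happens here
        }
    }
}
```

The AudioPlay thread follows the same shape, with the audio decoder in place of H.264 and SDL's audio callback in place of the video display.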
With that, the overall framework is roughly in place.