Transferred from: http://blog.chinaunix.net/space.php?uid=10995602&do=blog&id=2918725
With the audio and video output paths covered, the next topic is audio/video synchronization. OpenCore's approach is to maintain a master clock, against which both audio and video output are aligned. Stagefright instead drives audio output through a callback function, and synchronizes video against the audio timestamps. The details follow:
(1) When the callback function drives AudioPlayer to read decoded data, AudioPlayer obtains two timestamps: mPositionTimeMediaUs and mPositionTimeRealUs.
size_t AudioPlayer::fillBuffer(void *data, size_t size) {
    ...
    mSource->read(&mInputBuffer, ...);
    mInputBuffer->meta_data()->findInt64(kKeyTime, &mPositionTimeMediaUs);
    mPositionTimeRealUs =
        ((mNumFramesPlayed + size_done / mFrameSize) * 1000000) / mSampleRate;
    ...
}
mPositionTimeMediaUs is the timestamp (kKeyTime) carried in the data itself, while mPositionTimeRealUs is the actual time at which that data is played out, computed from the number of frames played and the sample rate.
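The real-time formula above can be factored out as a small helper; the function name and parameters here are illustrative, not part of Stagefright:

```cpp
#include <cstdint>
#include <cstddef>

// Mirrors the formula in fillBuffer(): total frames played so far
// (completed frames plus those consumed in the current buffer),
// converted to microseconds via the sample rate.
int64_t framesToRealTimeUs(int64_t numFramesPlayed,
                           size_t sizeDone, size_t frameSize,
                           int64_t sampleRate) {
    return ((numFramesPlayed + (int64_t)(sizeDone / frameSize)) * 1000000)
           / sampleRate;
}
```

For example, 44100 frames played at a 44100 Hz sample rate corresponds to exactly one second (1000000 us) of real playback time.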
(2) Video in Stagefright is synchronized using the difference between the two timestamps obtained from AudioPlayer:
void AwesomePlayer::onVideoEvent() {
    ...
    mVideoSource->read(&mVideoBuffer, ...);
    mVideoBuffer->meta_data()->findInt64(kKeyTime, &timeUs);
    mAudioPlayer->getMediaTimeMapping(&realTimeUs, &mediaTimeUs);
    mTimeSourceDeltaUs = realTimeUs - mediaTimeUs;
    nowUs = ts->getRealTimeUs() - mTimeSourceDeltaUs;
    latenessUs = nowUs - timeUs;
    ...
}
AwesomePlayer obtains realTimeUs (i.e. mPositionTimeRealUs) and mediaTimeUs (i.e. mPositionTimeMediaUs) from AudioPlayer and computes their difference, mTimeSourceDeltaUs. Subtracting that delta from the time source's current real time gives nowUs, the current position on the media timeline, and latenessUs then measures how far the video frame's timestamp lags behind it.
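The mapping step can be sketched as a pure function; the name computeLatenessUs is my own, but the arithmetic follows the snippet above:

```cpp
#include <cstdint>

// Given the (realTimeUs, mediaTimeUs) pair reported by AudioPlayer,
// translate the time source's current real time onto the media
// timeline and compare it with the video frame's timestamp.
int64_t computeLatenessUs(int64_t realTimeUs, int64_t mediaTimeUs,
                          int64_t tsNowRealUs, int64_t frameTimeUs) {
    int64_t timeSourceDeltaUs = realTimeUs - mediaTimeUs;
    int64_t nowUs = tsNowRealUs - timeSourceDeltaUs;  // media-time "now"
    return nowUs - frameTimeUs;  // > 0: frame is late; < 0: frame is early
}
```

E.g. if audio reports real time 1000000 us at media time 900000 us, the delta is 100000 us; a time-source reading of 1050000 us then maps to media time 950000 us, so a frame stamped 930000 us is 20000 us late.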
(3) Finally, the video frame is scheduled according to latenessUs:
void AwesomePlayer::onVideoEvent() {
    ...
    if (latenessUs > 40000) {
        // More than 40 ms late: drop the frame and move on.
        mVideoBuffer->release();
        mVideoBuffer = NULL;
        postVideoEvent_l();
        return;
    }
    if (latenessUs < -10000) {
        // More than 10 ms early: check again in 10 ms.
        postVideoEvent_l(10000);
        return;
    }
    mVideoRenderer->render(mVideoBuffer);
    ...
}
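The scheduling policy reduces to a three-way decision; this standalone sketch (the enum and function name are hypothetical, not Stagefright API) captures the thresholds used above:

```cpp
#include <cstdint>

enum class VideoAction { Drop, WaitAndRetry, Render };

// Drop frames more than 40 ms late, wait 10 ms for frames more
// than 10 ms early, otherwise render immediately.
VideoAction scheduleVideo(int64_t latenessUs) {
    if (latenessUs > 40000)  return VideoAction::Drop;
    if (latenessUs < -10000) return VideoAction::WaitAndRetry;
    return VideoAction::Render;
}
```

The asymmetric thresholds reflect perception: a frame shown slightly early is barely noticeable, while audio/video drift beyond about 40 ms becomes visible, so late frames are sacrificed to keep video locked to the audio clock.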