Reprint: please cite the source: http://blog.csdn.net/typename (powered by [email protected])
Objective:
With the HTML5 standard now complete, there has been constant hype around HTML5. This article does not offer predictions about HTML5's future; it focuses on one of HTML5's new features, the audio tag. HTML5 adds new media tags: developers can embed sound directly with the audio tag and video directly with the video tag. WHATWG supported these tags early on, but they were only recently incorporated into the standard. Developers who use the Android WebView to load HTML5 web games may find that, on system versions below Android 4.4, the game's background music or its sound effects fail to play (often just one of the two). This is a bug in the system WebView kernel: below Android 4.4 the system WebView uses the WebKit kernel, and the WebKit kernel has problems in its audio stack. On Android 4.4 and above, game audio works normally, because starting with Android 4.4 the system WebView adopted the Chromium kernel. Let us look at the audio implementation of the Chromium kernel.

Overview:
The implementation of the audio tag in Chromium is mainly divided into the following parts:
(1) Chromium needs to implement audio playback control and resource (media data) acquisition.
(2) A component responsible for audio parsing and decoding.
(3) The bindings between the audio-related parts of HTML and JavaScript, rendering of the audio control panel, and the interfaces exposed to the web page for controlling playback state.

Chromium Audio Implementation Structure
When Blink encounters an HTML5 audio tag while parsing the web page, it creates an HTMLAudioElement object for it. HTMLAudioElement inherits from HTMLMediaElement. Within WebCore, HTMLMediaElement is the basic unit that connects the web page's JavaScript operations with the browser's platform-specific implementation. It requests the creation of a media player to perform the audio-related actions, which creates a WebMediaPlayerImpl object that manages playback control. WebMediaPlayerImpl owns a Pipeline instance, and the Pipeline carries out all audio operations. Chromium uses FFmpeg to decode the audio data stream; the decoded data is passed by the AudioRenderer over IPC to AudioRendererHost for processing. AudioRendererHost runs in the browser main process, where an AudioManager manages the data flow. In the end, playback of the web page's audio is completed through android.media.AudioManager, android.media.AudioTrack, and android.media.AudioRecord.
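To make the layering above concrete, here is a minimal, purely illustrative Java sketch. None of these classes are the real Chromium classes; the names (HtmlAudioElement, WebMediaPlayerImpl, Pipeline) are borrowed only to show how a play() call issued by page JavaScript travels down through the layers to the platform audio sink (android.media.AudioTrack on Android, represented here by a hypothetical stand-in).

```java
import java.util.ArrayList;
import java.util.List;

public class AudioPipelineSketch {

    // Stand-in for the platform sink (android.media.AudioTrack on Android).
    static class PlatformAudioSink {
        final List<String> log = new ArrayList<>();
        void start() { log.add("sink:start"); }
        void stop()  { log.add("sink:stop"); }
    }

    // Stand-in for media::Pipeline: drives the decode/render steps.
    static class Pipeline {
        private final PlatformAudioSink sink;
        private boolean running;
        Pipeline(PlatformAudioSink sink) { this.sink = sink; }
        void start() { running = true; sink.start(); }
        void stop()  { running = false; sink.stop(); }
        boolean isRunning() { return running; }
    }

    // Stand-in for WebMediaPlayerImpl: translates element calls into pipeline operations.
    static class WebMediaPlayerImpl {
        private final Pipeline pipeline;
        WebMediaPlayerImpl(Pipeline p) { this.pipeline = p; }
        void play()  { pipeline.start(); }
        void pause() { pipeline.stop(); }
    }

    // Stand-in for HTMLAudioElement: the object page JavaScript actually touches.
    static class HtmlAudioElement {
        private final WebMediaPlayerImpl player;
        HtmlAudioElement(WebMediaPlayerImpl player) { this.player = player; }
        void play()  { player.play(); }
        void pause() { player.pause(); }
    }

    public static void main(String[] args) {
        PlatformAudioSink sink = new PlatformAudioSink();
        Pipeline pipeline = new Pipeline(sink);
        HtmlAudioElement audio = new HtmlAudioElement(new WebMediaPlayerImpl(pipeline));

        audio.play();   // a JS play() ultimately starts the platform sink
        audio.pause();

        System.out.println(sink.log);             // [sink:start, sink:stop]
        System.out.println(pipeline.isRunning()); // false
    }
}
```

Note that the real path also crosses a process boundary (renderer process to browser process over IPC), which this single-process sketch deliberately omits.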
Audio-Related Code Directories
src/media/base/ — encapsulation of the audio data stream, including the audio decoder, renderer, buffer management, etc.
src/media/audio/ — thread management for the whole audio playback flow, plus platform-specific audio stream playback; on Android it ultimately uses the Android AudioManager.
src/content/browser/renderer_host/media/ — the audio-related code here is initialized on the UI thread of the browser main process, but some data operations are performed on the IO thread. The code in this directory is responsible for maintaining audio's entire lifecycle and the interaction with the platform hardware.

Why Chromium Uses FFmpeg
The FFmpeg open-source project is widely used in many video and audio players, thanks to its support for a large number of codecs and stream formats, and its frame-by-frame decoding offers good efficiency. Although FFmpeg supports segmented decoding, this requires that the source data also be encoded in segments. Chromium exploits multi-core hardware: it uses FFmpeg as the demuxing and decoding layer for audio and video, making full use of multicore hardware features and increasing the decoding rate.
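The multi-core point above can be sketched with a toy example: independent "encoded chunks" handed to a fixed thread pool, mimicking how decoding work can be spread across cores and off the main thread. decodeChunk() here is a made-up stand-in (it just doubles a number), not FFmpeg; the shape of the code, not the arithmetic, is the point.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ParallelDecodeSketch {

    // Pretend "decoding" transforms each encoded value; a real decoder
    // would turn compressed frames into PCM samples here.
    static int decodeChunk(int encoded) {
        return encoded * 2;
    }

    static List<Integer> decodeAll(List<Integer> encodedChunks, int threads)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            // Submit every chunk; each may run on a different core.
            List<Future<Integer>> futures = encodedChunks.stream()
                    .map(c -> pool.submit(() -> decodeChunk(c)))
                    .collect(Collectors.toList());
            // Collect results in submission order, so output order is stable.
            List<Integer> decoded = new ArrayList<>();
            for (Future<Integer> f : futures) decoded.add(f.get());
            return decoded;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<Integer> encoded = IntStream.rangeClosed(1, 8).boxed()
                .collect(Collectors.toList());
        System.out.println(decodeAll(encoded, 4)); // [2, 4, 6, 8, 10, 12, 14, 16]
    }
}
```

As the article notes, this only pays off when chunks can be decoded independently; if the source is not encoded in segments, the decoder must proceed frame by frame.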
Chromium Supported Audio Formats
webm, wav, x-wav, ogg, mp3, mp4, x-mp3, x-m4a, ogv, ogm, oga
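For quick checks against the list above, a tiny helper can answer whether a given MIME subtype appears in it. The set below is copied verbatim from this article, not queried from Chromium itself (in a real page you would ask the browser via HTMLMediaElement.canPlayType()).

```java
import java.util.Locale;
import java.util.Set;

public class AudioFormatSupport {
    // Subtypes listed in this article as supported by Chromium.
    private static final Set<String> SUPPORTED = Set.of(
            "webm", "wav", "x-wav", "ogg", "mp3", "mp4",
            "x-mp3", "x-m4a", "ogv", "ogm", "oga");

    static boolean isSupported(String subtype) {
        return SUPPORTED.contains(subtype.toLowerCase(Locale.ROOT));
    }

    public static void main(String[] args) {
        System.out.println(isSupported("mp3"));  // true
        System.out.println(isSupported("flac")); // false
    }
}
```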
References: http://www.chromium.org/audio-video
Chromium Audio Implementation Analysis