HTML5 Audio API Research and Learning (1)


HTML5 Audio

1. Audio Sprite

The main idea of an audio sprite is similar to a CSS sprite: several sounds are combined into a single audio file, and the segment to play is selected by setting currentTime to that segment's offset. Before the code below can run, though, the audio has to be loaded into the page.

The question is then how to tell that the sound has loaded. Mobile browsers support many of the audio element's properties poorly; after a lot of experimenting and searching online, I found that the AudioContext object can be used to manage and play sounds.

<input type= "button" value= "Load Sound" id= "button" onclick= "Loadsound ()"/>    <audio src= "Smbb.mp3" id= "Audio1" ></audio>    <br/><br/><br/>    

// The sprite element and a map describing where each sound lives inside the file
// (offsets in seconds). A button with id="play" is assumed in the markup.
var audioSprite = document.getElementById("audio1");
var audioData = {
    shake: { start: 0,  length: 10 },
    win:   { start: 15, length: 15 }
};

document.getElementById("play").addEventListener("click", function () {
    audioSprite.play();                              // start playback (needed before seeking on some mobile browsers)
    audioSprite.currentTime = audioData.shake.start; // jump to the "shake" segment
    audioSprite.play();
}, false);

// Stop the "shake" segment when its end is reached, then play "win" one second later.
var handler = function () {
    if (this.currentTime >= audioData.shake.start + audioData.shake.length) {
        this.pause();
        setTimeout(function () {
            audioSprite.removeEventListener("timeupdate", handler, false);
            audioSprite.currentTime = audioData.win.start;
            audioSprite.play();
            audioSprite.addEventListener("timeupdate", handler2, false);
        }, 1000);
    }
};

// Stop the "win" segment when its end is reached.
var handler2 = function () {
    if (this.currentTime >= audioData.win.start + audioData.win.length) {
        this.pause();
    }
};

audioSprite.addEventListener("timeupdate", handler, false);
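As for knowing when the sprite has loaded, one commonly used signal with a plain audio element is the canplaythrough media event. The sketch below is my own illustration (not from the original article), reusing the audioSprite element above, with the same caveat the article makes: mobile browsers fire media events inconsistently, which is what pushes the discussion toward AudioContext in the next section.

// canplaythrough fires when the browser thinks it can play the file to the end without stalling.
audioSprite.addEventListener("canplaythrough", function () {
    console.log("sprite loaded, duration: " + audioSprite.duration + "s");
}, false);
audioSprite.load();   // explicitly kick off loading (helps on some mobile browsers)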

2. Using AudioContext. To produce a sound, one or more sound sources need to be created and connected to the AudioContext object. The sources are not connected to it directly; instead, they are connected indirectly through one or more AudioNode objects, each of which is a module that processes the audio signal.

An AudioContext object can accept multiple audio inputs and supports building an audio graph, so a single AudioContext object is all that an audio application normally needs.

<span style= "White-space:pre" ></span>var context;        if (webkitaudiocontext) {            context = new Webkitaudiocontext ();        } else{            context = new Audiocontext ();//Standard        }
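To make the audio-graph idea concrete, here is a minimal sketch (my own, not from the original article) that routes a source through a GainNode on its way to the destination. It assumes the context created above and a decoded AudioBuffer named buffer, which section 2.1 below shows how to obtain; the half-volume setting is just for illustration.

// A minimal audio graph: source -> gain -> destination.
var source = context.createBufferSource();
source.buffer = buffer;                 // decoded audio data (see section 2.1)

var gainNode = context.createGain();    // older WebKit builds used createGainNode()
gainNode.gain.value = 0.5;              // play at half volume

source.connect(gainNode);               // the source feeds the gain node...
gainNode.connect(context.destination);  // ...which feeds the speakers
source.start(0);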
2.1 After creating the AudioContext object, we can load the audio data into an AudioBuffer object. The page has a button for loading the sound; when it is clicked, an XMLHttpRequest object fetches the audio data of the MP3 file from the server. For example:

function loadSound() {
    var request = new XMLHttpRequest();
    request.open("GET", "smbb.mp3", true);
    request.responseType = "arraybuffer";   // the audio file is binary data
    request.onload = function () {
        context.decodeAudioData(request.response, function (buffer) {
            console.log(buffer);             // the decoded AudioBuffer
        }, onError);
    };
    request.send();
}

function onError(e) {
    console.log(e);
}

In the code above, clicking the "Load Sound" button runs the loadSound function, where an XMLHttpRequest object fetches the audio data of the MP3 file from the server. Because the audio data is binary, the responseType property of the XMLHttpRequest object is set to "arraybuffer". Once the still-undecoded audio data has been received from the server, it can be decoded with the decodeAudioData method of the AudioContext object.

The decodeAudioData method of the AudioContext object takes three parameters. The first is the ArrayBuffer holding the encoded audio data that was just loaded; the second and third are functions: the callback executed when decoding succeeds and the callback executed when decoding fails.
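As a side note not covered in the article: in current browsers decodeAudioData also returns a Promise, so the success and error callbacks can be replaced with then/catch. A sketch of loadSound written that way (the name loadSoundWithPromise is mine; older WebKit/iOS versions only offer the callback form used above):

function loadSoundWithPromise() {
    var request = new XMLHttpRequest();
    request.open("GET", "smbb.mp3", true);
    request.responseType = "arraybuffer";
    request.onload = function () {
        context.decodeAudioData(request.response)
            .then(function (buffer) {
                console.log(buffer);   // the decoded AudioBuffer
            })
            .catch(function (e) {
                console.log(e);        // decoding failed
            });
    };
    request.send();
}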

2.2 When the audio has finished loading, we can create an AudioBufferSourceNode object (the sound source) with the createBufferSource method of the AudioContext object and assign the decoded AudioBuffer to its buffer property. The code looks like this:

var source = context.createBufferSource();
source.buffer = buffer;
Here buffer is the AudioBuffer passed to the success callback of decodeAudioData in the example above.

Next, we use the connect method of the AudioBufferSourceNode object to connect the sound source to the destination property of the AudioContext object, which represents the audio playback device.

source.connect(context.destination);
The connect method of the AudioBufferSourceNode object takes one parameter; here it is the destination property of the AudioContext object, that is, the audio playback device on the client machine.

Finally, play the sound with the start method of the AudioBufferSourceNode object:

source.start(0);
// Note: many older books and online articles say to use the noteOn method, but when I tested it, that method
// was undefined, so use start() instead. When testing on iOS, calling start() without an argument also failed;
// I later found that a default argument (0) has to be passed.
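Since older WebKit builds exposed the legacy noteOn method instead of start, a small defensive helper (my own sketch, not from the original article) can cover both cases:

// Play a source node on both old and new Web Audio implementations.
// start() is the standard method; noteOn() was its older WebKit name.
function playSource(source) {
    if (typeof source.start === "function") {
        source.start(0);      // standard; pass 0 explicitly for older iOS builds
    } else {
        source.noteOn(0);     // legacy WebKit fallback
    }
}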

The complete code at this stage:

<!DOCTYPE html>
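<!-- The rest of this listing is a minimal sketch assembled from the snippets
     above (loading, decoding, and playing smbb.mp3); it is an illustration,
     not necessarily the author's exact page. -->
<html>
<head>
    <meta charset="utf-8">
    <title>Web Audio demo</title>
</head>
<body>
    <input type="button" value="Load Sound" id="button" onclick="loadSound()"/>
    <script>
        // Create the AudioContext (prefixed constructor on older WebKit browsers).
        var context = window.AudioContext ? new AudioContext() : new webkitAudioContext();

        function loadSound() {
            var request = new XMLHttpRequest();
            request.open("GET", "smbb.mp3", true);
            request.responseType = "arraybuffer";
            request.onload = function () {
                context.decodeAudioData(request.response, function (buffer) {
                    playSound(buffer);               // play as soon as decoding finishes
                }, function (e) {
                    console.log(e);                  // decoding failed
                });
            };
            request.send();
        }

        function playSound(buffer) {
            var source = context.createBufferSource();
            source.buffer = buffer;                  // the decoded AudioBuffer
            source.connect(context.destination);     // route straight to the speakers
            source.start(0);                         // 0 = start immediately
        }
    </script>
</body>
</html>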
To sum up, the HTML5 Audio API is genuinely powerful and a lot of fun to research. Even so, it was new ground for our team and took quite a bit of time to figure out, and there are certainly gaps; please forgive anything I have gotten wrong. The next stage covers even more interesting topics: controlling playback rhythm, cross-mixing multiple sounds, and smooth transitions between multiple sound files...

