In the blink of an eye, three months have gone by without a blog post. Graduation season really does keep you busy, but now that I have finally graduated, the blog can't be left to gather dust. Time to write something.
I started playing with the HTML5 Audio API because of an earlier post about it on cnblogs. It looked like a lot of fun, so I studied it a bit. This article is only a record of my own learning; please point out any mistakes.
The final effect looks like the figure on the right. This is just a simple demo; readers who want a more elaborate effect are welcome to experiment further.
Demo link: please poke me!!! Select an audio file and it will start playing.
At the same time, browser support for this API is still far from universal, so think carefully before using it in a production environment.
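For example, a minimal feature check before using the API might look like this; the vendor-prefixed fallbacks are the same ones used in the full code at the end of this post:

// a minimal sketch of feature detection for the Web Audio API
var AudioContextClass = window.AudioContext ||
                        window.webkitAudioContext ||
                        window.mozAudioContext;
if (!AudioContextClass) {
    alert("Your browser does not support the Web Audio API");
}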
First of all, achieving this effect takes a few steps:
1. Get the audio file and instantiate an AudioContext object as the audio container.
2. Use FileReader to convert the audio file to an ArrayBuffer, then decode it.
3. Pass the decoded buffer data to an AudioBufferSourceNode, connect it to the audio container object, and play it.
4. Instantiate an analyser through the AnalyserNode interface.
5. Use the analyser to sample the frequency data of the playing audio.
6. Draw on the canvas according to the frequency data.
Those are the steps in broad strokes; the concrete code analysis follows.
First, define the objects we will need: the AudioContext audio container object, the canvas's 2D drawing context, and a compatibility wrapper for requestAnimationFrame.
var music = document.getElementById("music"),
    canvas = document.getElementById("cas"),
    ctx = canvas.getContext("2d");

window.AudioContext = window.AudioContext ||
                      window.webkitAudioContext ||
                      window.mozAudioContext;

window.RAF = (function () {
    return window.requestAnimationFrame ||
           window.webkitRequestAnimationFrame ||
           window.mozRequestAnimationFrame ||
           window.oRequestAnimationFrame ||
           window.msRequestAnimationFrame ||
           function (callback) { window.setTimeout(callback, 1000 / 60); };
})();

var ac = new AudioContext();
Then get the audio file. You can get it directly through an input[type=file] element, or XHR works just as well. For convenience, the demo uses an input element to obtain the audio file. The code is as follows: the onchange event gives us the audio file as music.files[0], and decoding it is what the changeBuffer method does.
music.onchange = function () {
    if (music.files.length !== 0) {
        changeBuffer(music.files[0]);
    }
};
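If you would rather load the audio over XHR, as mentioned above, a minimal sketch could look like the following; "music.mp3" is just a placeholder URL, and setting responseType to "arraybuffer" means the response can go straight to decoding without the FileReader step:

// a minimal XHR sketch; "music.mp3" is a placeholder URL
var xhr = new XMLHttpRequest();
xhr.open("GET", "music.mp3", true);
xhr.responseType = "arraybuffer";    // ask for the raw bytes
xhr.onload = function () {
    // xhr.response is already an ArrayBuffer, ready for decodeAudioData
    ac.decodeAudioData(xhr.response, function (buffer) {
        playMusic(buffer);
    });
};
xhr.send();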
Once we have the audio file, first use FileReader to convert it to an ArrayBuffer object; after it has loaded, the file content is available through e.target.result.
Then decode the file content with the decodeAudioData method of the AudioContext object. According to the official API documentation, the method takes three parameters: the first is the audio ArrayBuffer, the second is a callback invoked when decoding succeeds, and the third is a callback invoked when decoding fails.
function changeBuffer(file) {
    var fr = new FileReader();
    fr.onload = function (e) {
        var fileResult = e.target.result;
        ac.decodeAudioData(fileResult, function (buffer) {
            playMusic(buffer);
        }, function (e) {
            console.log(e);
            alert("File decoding failed");
        });
    };
    fr.readAsArrayBuffer(file);
}
When decoding succeeds, the playMusic method is called with the decoded buffer data, and at that point an AudioBufferSourceNode object is instantiated. This object has five properties: buffer, playbackRate, loop, loopStart, and loopEnd. buffer is, naturally, the audio buffer data. playbackRate is the speed at which the audio stream is rendered, with a default value of 1. loop is the playback loop property, which defaults to false; if set to true, the audio plays in a loop. loopStart and loopEnd mark the start and end of the looped section, in seconds, both defaulting to 0, and they only take effect when loop is true.
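As a small illustration of those five properties, here is a sketch with arbitrary example values; note that playbackRate is an AudioParam, so the rate is set through its value field:

// a minimal sketch of the AudioBufferSourceNode properties; values are arbitrary
var source = ac.createBufferSource();
source.buffer = buffer;            // the decoded audio data
source.playbackRate.value = 1.5;   // render the stream 1.5x faster
source.loop = true;                // loop playback
source.loopStart = 10;             // loop section starts at 10s...
source.loopEnd = 20;               // ...and ends at 20s (only used when loop is true)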
After instantiating the AudioBufferSourceNode object, give the audio an output destination, that is, connect the audio to the speakers, via connect(ac.destination).
Then create an analyser, which is done directly with the createAnalyser method. Likewise, connect the audio to the analyser with the connect method.
Once everything is ready, call the start method to begin playing the audio, then jump to the canvas drawing method, animate, to draw the spectrum.
var analyser;
function playMusic(buffer) {
    var absn = ac.createBufferSource();
    analyser = ac.createAnalyser();
    absn.connect(analyser);
    absn.connect(ac.destination);
    absn.buffer = buffer;
    absn.loop = true;
    absn.start(0);
    animate();
}
I didn't fully understand two of the lines in the next snippet at first. From the official API docs I learned that the analyser's frequencyBinCount value is half the FFT size, and from material I found online, the FFT is a fast algorithm for the discrete Fourier transform, which converts a signal into the frequency domain. Some features of a signal are hard to see in the time domain but easy to spot once transformed into the frequency domain, and the FFT can also extract a signal's spectrum. In other words, these two lines use the FFT algorithm to extract the audio's frequency information into an array, where each value represents the signal level of the audio at that frequency. From these values we can clearly see the differences between the frequencies.
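To make the half-of-FFT-size relationship concrete, here is a tiny sketch; the fftSize value of 512 is arbitrary (it must be a power of two, and the default is 2048):

// a minimal sketch: frequencyBinCount is always half of fftSize
var analyser = ac.createAnalyser();
analyser.fftSize = 512;                    // arbitrary power of two
console.log(analyser.frequencyBinCount);   // 256 frequency bins
var data = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(data);       // each entry is 0-255: the level in that bin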
Once we have the signal level at each frequency, we can use those values to draw bars of different heights and complete the simplest possible audio visualization.
function animate() {
    var array = new Uint8Array(analyser.frequencyBinCount);
    analyser.getByteFrequencyData(array);
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    for (var i = 0; i < array.length; i += 10) {
        ctx.fillRect(i, canvas.height - array[i], 10, array[i]);
        ctx.strokeStyle = "#FFF";
        ctx.strokeRect(i, canvas.height - array[i], 10, array[i]);
    }
    RAF(animate);
}
Finally, here is the complete code:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<style>
    #cas {
        position: absolute;
        left: 0; top: 0; bottom: 0; right: 0;
        margin: auto;
        border: 1px solid;
    }
</style>
</head>
<body>
<input type="file" id="music">
<canvas id="cas"></canvas>
<script>
var music = document.getElementById("music"),
    canvas = document.getElementById("cas"),
    ctx = canvas.getContext("2d");

window.AudioContext = window.AudioContext ||
                      window.webkitAudioContext ||
                      window.mozAudioContext;

window.RAF = (function () {
    return window.requestAnimationFrame ||
           window.webkitRequestAnimationFrame ||
           window.mozRequestAnimationFrame ||
           window.oRequestAnimationFrame ||
           window.msRequestAnimationFrame ||
           function (callback) { window.setTimeout(callback, 1000 / 60); };
})();

var ac = new AudioContext();

music.onchange = function () {
    if (music.files.length !== 0) {
        changeBuffer(music.files[0]);
    }
};

function changeBuffer(file) {
    var fr = new FileReader();
    fr.onload = function (e) {
        var fileResult = e.target.result;
        ac.decodeAudioData(fileResult, function (buffer) {
            playMusic(buffer);
        }, function (e) {
            console.log(e);
            alert("File decoding failed");
        });
    };
    fr.readAsArrayBuffer(file);
}

var analyser;
function playMusic(buffer) {
    var absn = ac.createBufferSource();
    analyser = ac.createAnalyser();
    absn.connect(analyser);
    absn.connect(ac.destination);
    absn.buffer = buffer;
    absn.loop = true;
    absn.start(0);
    animate();
}

function animate() {
    var array = new Uint8Array(analyser.frequencyBinCount);
    analyser.getByteFrequencyData(array);
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    for (var i = 0; i < array.length; i += 10) {
        ctx.fillRect(i, canvas.height - array[i], 10, array[i]);
        ctx.strokeStyle = "#FFF";
        ctx.strokeRect(i, canvas.height - array[i], 10, array[i]);
    }
    RAF(animate);
}
</script>
</body>
</html>