[HTML5] Web Audio API creates superb music visualization effects


HTML5 really has a lot of cool stuff, and the Web Audio API is one of the coolest. I pondered over it and built a music visualization demo. First, the demo:

Project demonstration: no more talk, click me! The source code is on GitHub; if you are interested, feel free to star or fork it — the comments in the source are quite thorough. I had read Liu Wayou's articles and source code before, but I felt they covered too much other material at once, which can make it hard for a beginner to grasp the essentials of the Web Audio API. So this article explains the Web Audio API from a beginner's perspective.

 

The Web Audio API is not the same thing as the audio tag provided by HTML5; you can search for a detailed comparison. Put simply, audio is a tag with its own GUI but fairly weak audio-processing abilities, while the Web Audio API wraps a large set of audio operations and is much more powerful. The Web Audio API is still a draft; for the latest version, click here: Latest.

Note: because the Web Audio API is still quite new, browser support is limited; check a current compatibility table before relying on it.

I. Overall process:

Generally, the process of a typical Audio application is as follows:

1. Create an audio context (AudioContext).

2. Create an audio source, e.g. from an audio tag, a file stream, or other input.

3. Create effect nodes: reverb, compression, delay, gain, analyser, and so on.

4. Choose an output for the audio, then connect source, effects, and output.

An easy way to understand this: the audio context is your whole instrument-and-hardware system, the source is your instrument, and the effect nodes are your various effect pedals; of course you also need speaker output and the cables connecting everything. For processing music, the effect nodes matter most. This project adds no other effects: it uses a single analyser node to quantify the audio signal and ties the values to graphic elements for visualization.
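The four steps above can be sketched as one small wiring function. This is a hedged sketch, not the project's actual code: the helper name `connectGraph` and the injected-context style are my own invention, but the node methods used (`createAnalyser`, `connect`, `destination`) are the standard Web Audio API.

```javascript
// Sketch of the typical flow: source -> effect node (analyser) -> output.
// The AudioContext is passed in, so the wiring logic itself has no
// browser dependency and is easy to reason about in isolation.
function connectGraph(audioCtx, source) {
  var analyser = audioCtx.createAnalyser(); // step 3: the effect node
  source.connect(analyser);                 // source -> analyser
  analyser.connect(audioCtx.destination);   // analyser -> output (step 4)
  return analyser;                          // keep a handle for later analysis
}
```

In a browser you would call it with a real context, e.g. `connectGraph(new AudioContext(), someSourceNode)`.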

 

II. Lite version code
(The lite demo's full HTML did not survive extraction; the complete single-page version is in the project source on GitHub.)

This is the most streamlined version; it should help you quickly understand the whole flow. Note, however, that this version is not compatible with mobile browsers.

 

III. Key code analysis

1. Create an audio context (AudioContext)

Creating an audio context is very simple, but for browser compatibility you need a statement that falls back to the vendor-prefixed constructor, and for browsers that do not support AudioContext at all you also need a try/catch to avoid errors.

try {
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
} catch (err) {
  alert('Your browser does not support the Web Audio API!');
}
2. Create a sound source

You can use the following methods to create a sound source:

1) Obtain it directly from an HTML video/audio element, using AudioContext.createMediaElementSource().

var source = audioCtx.createMediaElementSource(myMediaElement); // myMediaElement can be an element in the page or an audio/video object created with new

Note: the audio tag has many pitfalls on mobile browsers; search for them so you can avoid them! If you don't need to support mobile, this is the easiest approach.

2) Create a sound source from raw PCM data, using AudioContext.createBuffer(), AudioContext.createBufferSource(), and AudioContext.decodeAudioData(). The key code is as follows.

var source = audioCtx.createBufferSource(); // create an empty audio source; this is the usual approach — the decoded buffer data is later put into the source, and the source is operated on directly
// var buffer = audioCtx.createBuffer(2, 22050, 44100); // create a buffer with two channels, 22050 frames, and a 44.1 kHz sample rate
audioCtx.decodeAudioData(audioData, function (buffer) {
  source.buffer = buffer; // put the decoded data into the source
  // other operations
}, function (err) {
  alert('Failed to decode the file!'); // decoding error handling
});

To obtain the file over the network, make an asynchronous Ajax request with the response type set to 'arraybuffer'. The code is as follows:

// Use XHR to load an audio track, decodeAudioData to decode it,
// and then put the decoded buffer into the source
function getData() {
  var request = new XMLHttpRequest();   // open a request
  request.open('GET', url, true);       // request data from the url
  request.responseType = 'arraybuffer'; // set the returned data type
  request.onload = function () {
    var audioData = request.response;   // decode after the data has finished loading
    audioCtx.decodeAudioData(audioData, function (buffer) {
      source.buffer = buffer;           // put the decoded data into the source for processing
    }, function (err) {
      alert('Failed to decode the file!'); // decoding error handling
    });
  };
  request.send();
}

To read a file locally, let the user pick it with a file-type input tag and listen for the input's onchange event. Once a file is selected, you can start reading it in code; here a FileReader is used, again with the result type set to 'arraybuffer'. My project uses FileReader local reading, with mobile browsers in mind.

var audioInput = document.getElementById('uploader'); // HTML: <input type="file" id="uploader"/>
audioInput.onchange = function () {
  // the onchange event also fires when the user clicks Cancel, so only proceed if the file list is not empty
  if (audioInput.files.length !== 0) {
    files = audioInput.files[0];        // get the file selected by the user
    fr = new FileReader();              // after the file is selected, read it with FileReader
    fr.onload = function (e) {
      var fileResult = e.target.result; // the file has been read; now decode it
      audioCtx.decodeAudioData(fileResult, function (buffer) {
        source.buffer = buffer;         // put the decoded data into the source
        // move on to playback and analysis
      }, function (err) {
        alert('Failed to decode the file'); // decoding error
      });
    };
    fr.onerror = function (err) {
      alert('Failed to read the file'); // file reading error
    };
    fr.readAsArrayBuffer(files);        // likewise, read as an ArrayBuffer
  }
};
3. Create an effect node, select an output node, and connect the source, effect, and output nodes.

As mentioned above, there are many kinds of effect nodes; this article only creates an analyser node. The connection here is audio source > analyser node > output. Why not connect the source straight to the output? You can, but since the analyser node needs to do some processing, a direct source-to-output connection would leave the visualization lagging the sound slightly. In addition, a status variable is defined to track the playback state. There are two ways to write the final start call; if you are interested, check them on MDN.

var status = 0,                           // playback status: 1 = playing, 0 = stopped
    arraySize = 128,                      // we will fetch 128 frequency values
    analyser = audioCtx.createAnalyser(); // create an analyser node
source.connect(analyser);                 // connect the audio source to the analyser node
analyser.connect(audioCtx.destination);   // connect the analyser node to the output
source.start(0);                          // start the audio source
status = 1;                               // update the playback status
4. Visual plotting

To achieve the visualization effect, the analyser node performs a Fourier transform to move the signal from the time domain into the frequency domain. I will spare you the ten thousand words of theory here... if you are interested, go look at signal-processing textbooks; if not, feel free to skip it.
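For intuition about what "time domain to frequency domain" means, here is a hedged, brute-force sketch: a naive discrete Fourier transform (DFT) that turns N time-domain samples into N frequency-bin magnitudes. The real AnalyserNode uses a fast FFT internally; this O(N²) version is only for illustration, and the function name is my own.

```javascript
// Naive DFT: magnitude of each frequency bin for a block of samples.
function dftMagnitudes(samples) {
  var N = samples.length, mags = [];
  for (var k = 0; k < N; k++) {
    var re = 0, im = 0;
    for (var n = 0; n < N; n++) {
      var angle = (-2 * Math.PI * k * n) / N;
      re += samples[n] * Math.cos(angle);
      im += samples[n] * Math.sin(angle);
    }
    mags.push(Math.sqrt(re * re + im * im));
  }
  return mags;
}

// A pure sine wave that completes 4 cycles over 64 samples...
var N = 64, samples = [];
for (var n = 0; n < N; n++) samples.push(Math.sin((2 * Math.PI * 4 * n) / N));
var mags = dftMagnitudes(samples);
// ...concentrates all its energy in frequency bin 4 (plus its mirror bin).
```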

The project calls getByteFrequencyData(array) to obtain the energy at each frequency; the array's length is the number of frequency bins you receive. If you have read other material, you will often see analyser.frequencyBinCount and analyser.fftSize: fftSize is the window size of the Fast Fourier Transform (FFT) used for the frequency-domain analysis, 2048 by default, and frequencyBinCount is half of fftSize. I don't need that many values here, so I defined my own 8-bit unsigned integer array of length 128 (a sequela of Tan Haoqiang's C textbook, don't blame me). Note that each frequency value ranges from 0 to 255; if you want higher precision, AnalyserNode.getFloatFrequencyData() returns 32-bit floats. With these values in hand, we can tie them to visual elements: the height of the familiar bar spectrum, the radius of a circle, the density of lines, and so on. My project uses energy balls.
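The bookkeeping described above can be made concrete with a couple of lines. This is a hedged sketch: the mapping function `valueToRadius` is my own name for the proportion used later when sizing the energy balls, not an API call; the bin arithmetic follows the spec (frequencyBinCount is always fftSize / 2, and byte values span 0-255).

```javascript
var fftSize = 2048;                  // the AnalyserNode default
var frequencyBinCount = fftSize / 2; // 1024 usable frequency bins

// Passing a shorter array (length 128, as in this project) simply receives
// the first 128 of those bins. Turning a 0-255 byte value into an
// energy-ball radius is then a plain proportion:
function valueToRadius(byteValue, maxRadius) {
  return Math.round((byteValue / 256) * maxRadius);
}
```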

var canvas = document.getElementById('drawCanvas'),
    ctx = canvas.getContext('2d'),
    cwidth = canvas.width,
    cheight = canvas.height,
    visualizer = [],   // the shapes being visualized
    animationId = null;

var random = function (m, n) {
  return Math.round(Math.random() * (n - m) + m); // random integer between m and n
};

for (var i = 0; i < arraySize; i++) {
  var x = random(0, cwidth),
      y = random(0, cheight),
      color = 'rgba(' + random(0, 255) + ',' + random(0, 255) + ',' + random(0, 255) + ',0)'; // randomize the color
  visualizer.push({
    x: x,
    y: y,
    dy: Math.random() + 0.1, // ensure dy > 0.1
    color: color,
    radius: 30               // initial energy-ball radius
  });
}

var draw = function () {
  var array = new Uint8Array(128);      // create a frequency array
  analyser.getByteFrequencyData(array); // analyse the frequencies; results go into array, each value 0-255
  if (status === 0) {
    // reset the array: sometimes playback has finished but frequency values remain, so force them to zero
    for (var i = array.length - 1; i >= 0; i--) {
      array[i] = 0;
    }
    var allBallsToZero = true; // check whether every energy ball has shrunk away
    for (var i = visualizer.length - 1; i >= 0; i--) {
      allBallsToZero = allBallsToZero && (visualizer[i].radius < 1);
    }
    if (allBallsToZero) {
      cancelAnimationFrame(animationId); // end the animation
      return;
    }
  }
  ctx.clearRect(0, 0, cwidth, cheight);
  for (var n = 0; n < array.length; n++) {
    var s = visualizer[n];
    s.radius = Math.round(array[n] / 256 * (cwidth > cheight ? cwidth / 25 : cheight / 18)); // scale the ball to the canvas size
    var gradient = ctx.createRadialGradient(s.x, s.y, 0, s.x, s.y, s.radius); // create the energy ball's gradient
    gradient.addColorStop(0, '#fff');
    gradient.addColorStop(0.5, '#D2BEC0');
    gradient.addColorStop(0.75, s.color.substring(0, s.color.lastIndexOf(',')) + ',0.4)');
    gradient.addColorStop(1, s.color);
    ctx.fillStyle = gradient;
    ctx.beginPath();
    ctx.arc(s.x, s.y, s.radius, 0, Math.PI * 2, true); // draw an energy ball
    ctx.fill();
    s.y = s.y - 2 * s.dy; // move the energy ball up
    if (s.y <= 0 && status !== 0) {
      // the ball left through the top: reset s.y to the bottom of the canvas and randomize s.x
      s.y = cheight;
      s.x = random(0, cwidth);
    }
  }
  animationId = requestAnimationFrame(draw); // keep animating
};
5. Others

Handling the end of playback:

source.onended = function () {
  status = 0; // update the playback status
};

For responsiveness, the canvas size is adjusted in the window's onload and onresize handlers, but that is not the tricky part. The tricky part is making the energy balls follow the canvas size: since the animation is already running, the cleanest solution is to recompute dynamically inside the animation, so the cwidth and cheight reads in the code above should be moved into the drawing loop.

var canvas = document.getElementById('drawCanvas');
canvas.width = window.innerWidth
            || document.documentElement.clientWidth
            || document.body.clientWidth;
canvas.height = window.innerHeight
             || document.documentElement.clientHeight
             || document.body.clientHeight;

A small extra: grabbing an energy ball with the mouse. When the pointer enters a ball, the ball is captured and follows the pointer as it keeps moving.

// capture an energy ball
canvas.onmousemove = function (e) {
  if (status !== 0) {
    for (var n = 0; n < visualizer.length; n++) {
      var s = visualizer[n];
      if (Math.sqrt(Math.pow(s.x - e.pageX, 2) + Math.pow(s.y - e.pageY, 2)) < s.radius) {
        s.x = e.pageX;
        s.y = e.pageY;
      }
    }
  }
};

Finally, a word about something very important: code organization!

The code above was rewritten purely to highlight the flow and does not follow good conventions. For better structure, create one global object and put all of the attributes and methods above into it: a single object is easy to manage and, when debugging in Chrome, can be inspected and edited directly in the console. By convention, an object's attributes go in the constructor and its methods on the prototype, which makes future extension and inheritance easier; private methods and properties used only inside the object are prefixed with an underscore, again for the sake of encapsulation.
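Here is a hedged sketch of that organization: attributes in the constructor, methods on the prototype, and an underscore prefix for internals. The names (`MusicVisualizer`, `_initBalls`, and so on) are my own invention for illustration, not the project's actual API.

```javascript
// Constructor holds state (attributes).
function MusicVisualizer(options) {
  options = options || {};
  this.status = 0;                             // playback status
  this.arraySize = options.arraySize || 128;   // number of frequency bins used
  this.visualizer = [];                        // the energy balls
  this._initBalls();                           // underscore-prefixed internal helper
}

// Methods live on the prototype, so they are shared and easy to extend.
MusicVisualizer.prototype._initBalls = function () {
  for (var i = 0; i < this.arraySize; i++) {
    this.visualizer.push({ x: 0, y: 0, dy: Math.random() + 0.1, radius: 30 });
  }
};

MusicVisualizer.prototype.stop = function () {
  this.status = 0;
};

var mv = new MusicVisualizer({ arraySize: 64 });
```

The playback wiring and drawing code from earlier sections would then become further prototype methods on the same object.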

 

This article mainly draws on Liu Wayou's blog post and code (reference 2 below). Later I checked MDN and found a more approachable, simpler guide (reference 1), which I strongly recommend! Writing an article really isn't easy: you have to reorganize your ideas, collect material, and trim or expand code — the workload is huge. Since it wasn't easy to write, give it a like ~

Reference:

1. Visualizations with Web Audio API (official MDN guide, strongly recommended): https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API/Visualizations_with_Web_Audio_API

2. Open up your audio: an audio-visual feast with the HTML5 Audio API: http://www.cnblogs.com/Wayou/p/html5_audio_api_visualizer.html
