OfflineAudioContext

The OfflineAudioContext interface is an AudioContext interface representing an audio-processing graph built from linked-together AudioNodes. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.
Constructor
OfflineAudioContext.OfflineAudioContext()
- Creates a new OfflineAudioContext instance.
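As a sketch, the constructor accepts either three positional arguments (numberOfChannels, length, sampleRate) or a single options dictionary with those keys; the helper name below is our own, not part of the API:

```javascript
// Hypothetical helper: build a 40-second stereo offline context at 44.1 kHz.
// length is measured in sample-frames, i.e. seconds * sampleRate.
function makeOfflineContext() {
  return new OfflineAudioContext({
    numberOfChannels: 2,
    length: 40 * 44100, // 1,764,000 sample-frames
    sampleRate: 44100,
  });
}
```

The equivalent positional form is `new OfflineAudioContext(2, 40 * 44100, 44100)`.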
Properties
Also inherits properties from its parent interface, BaseAudioContext.

OfflineAudioContext.length (read only)
- An integer representing the size of the buffer in sample-frames.
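Because length counts sample-frames, the rendered duration in seconds follows directly from it and the sample rate. durationSeconds below is an illustrative helper, not part of the API:

```javascript
// Duration in seconds = length (sample-frames) / sampleRate (frames per second).
function durationSeconds(ctx) {
  return ctx.length / ctx.sampleRate;
}

// e.g. a context created with length 40 * 44100 at 44100 Hz is 40 seconds long.
```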
Event handlers

OfflineAudioContext.oncomplete
- An event handler called when processing is terminated, that is, when the complete event (of type OfflineAudioCompletionEvent) is raised, after the event-based version of OfflineAudioContext.startRendering() is used.
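A minimal sketch of this event-based pattern (the promise-based form shown later in the example is preferred in modern code):

```javascript
// Assign oncomplete before starting the render; the finished audio
// arrives on the event's renderedBuffer property.
function renderWithEventHandler(offlineCtx) {
  offlineCtx.oncomplete = function (event) {
    console.log('Rendered ' + event.renderedBuffer.length + ' sample-frames');
  };
  offlineCtx.startRendering();
}
```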
Methods
Also inherits methods from its parent interface, BaseAudioContext.

OfflineAudioContext.suspend()
- Schedules a suspension of the time progression in the audio context at the specified time and returns a promise.

OfflineAudioContext.startRendering()
- Starts rendering the audio, taking into account the current connections and the current scheduled changes. This page covers both the event-based version and the promise-based version.
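The suspend()/startRendering() pair can be sketched like this; the retune from 440 Hz to 880 Hz is an arbitrary illustration of modifying the graph while time is paused:

```javascript
// Render 2 seconds of a tone, retuning it halfway through.
// suspend(time) pauses rendering at the given context time and
// returns a promise that resolves once the context is suspended.
function renderWithSuspend() {
  var ctx = new OfflineAudioContext(1, 2 * 44100, 44100);
  var osc = ctx.createOscillator();
  osc.frequency.value = 440;
  osc.connect(ctx.destination);
  osc.start();

  // Schedule the suspension before rendering starts.
  ctx.suspend(1).then(function () {
    osc.frequency.value = 880; // change the graph while time is paused
    return ctx.resume();
  });

  return ctx.startRendering(); // resolves with the finished AudioBuffer
}
```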
Deprecated methods
OfflineAudioContext.resume()
- Resumes the progression of time in an audio context that has previously been suspended.

Note: The resume() method is still available; it is now defined on the BaseAudioContext interface (see AudioContext.resume) and thus can be accessed by both the AudioContext and OfflineAudioContext interfaces.
Events
Listen to these events using addEventListener() or by assigning an event listener to the oneventname property of this interface:

complete
- Fired when the rendering of an offline audio context is complete. Also available using the oncomplete event handler property.
Examples
In this simple example, we declare both an AudioContext and an OfflineAudioContext object. We use the AudioContext to load an audio track via XHR (BaseAudioContext.decodeAudioData), then the OfflineAudioContext to render the audio into an AudioBufferSourceNode and play the track through. After the offline audio graph is set up, you need to render it to an AudioBuffer using OfflineAudioContext.startRendering.

When the startRendering() promise resolves, rendering has completed and the output AudioBuffer is returned out of the promise.

At this point we create another audio context, create an AudioBufferSourceNode inside it, and set its buffer to be equal to the promise AudioBuffer. This is then played as part of a simple standard audio graph.

Note: For a working example, see our offline-audio-context-promise GitHub repo (see the source code too).
```js
// define online and offline audio context
var audioCtx = new AudioContext();
var offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);
var source = offlineCtx.createBufferSource();

// use XHR to load an audio track, and
// decodeAudioData to decode it and OfflineAudioContext to render it
function getData() {
  var request = new XMLHttpRequest();
  request.open('GET', 'viper.ogg', true);
  request.responseType = 'arraybuffer';
  request.onload = function() {
    var audioData = request.response;
    audioCtx.decodeAudioData(audioData, function(buffer) {
      source.buffer = buffer;
      source.connect(offlineCtx.destination);
      source.start();
      //source.loop = true;
      offlineCtx.startRendering().then(function(renderedBuffer) {
        console.log('Rendering completed successfully');
        var song = audioCtx.createBufferSource();
        song.buffer = renderedBuffer;
        song.connect(audioCtx.destination);
        play.onclick = function() {
          song.start();
        }
      }).catch(function(err) {
        console.log('Rendering failed: ' + err);
        // Note: The promise should reject when startRendering is
        // called a second time on an OfflineAudioContext
      });
    });
  }
  request.send();
}

// Run getData to start the process off
getData();
```
Specifications
Specification: Web Audio API # OfflineAudioContext
Browser compatibility
| | Chrome | Edge | Firefox | Internet Explorer | Opera | Safari | WebView Android | Chrome Android | Firefox for Android | Opera Android | Safari on iOS | Samsung Internet |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| OfflineAudioContext | 35 (25-57) | 12 | 25 | No | 22 (15-44) | 14.1 (6-14.1) | 4.4.3 (≤37-57) | 35 (25-57) | 25 | 22 (14-43) | 14.5 (6-14.5) | 3.0 (1.5-7.0) |
| OfflineAudioContext() constructor | 35 (25-57) | 12 | 53 | No | 22 (15-44) | 14.1 (6-14.1) | 4.4.3 (≤37-57) | 35 (25-57) | 53 | 22 (14-43) | 14.5 (6-14.5) | 3.0 (1.5-7.0) |
| complete_event | 25 | 12 | 25 | No | 15 | 6 | ≤37 | 25 | 25 | 14 | 6 | 1.5 |
| length | 51 | 14 | 49 | No | 38 | 14.1 | 51 | 51 | 49 | 41 | 14.5 | 5.0 |
| oncomplete | 25 | 12 | 25 | No | 15 | 6 | ≤37 | 25 | 25 | 14 | 6 | 1.5 |
| resume | 49 | 14 | No | No | 36 | 14.1 | 49 | 49 | No | 36 | 14.5 | 5.0 |
| startRendering | 25 | 12 | 25 | No | 15 | 14.1 (6-14.1) | ≤37 | 25 | 25 | 14 | 14.5 (6-14.5) | 1.5 |
| suspend | 49 | 14 | No | No | 36 | 14.1 | 49 | 49 | No | 36 | 14.5 | 5.0 |
© 2005–2021 MDN contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/OfflineAudioContext