Robert O'Callahan dd57723337 Bug 804837. Part 9: Update WebAudio implementation to integrate with AudioNodeStream. r=ehsan
This is a mega-patch that was too hard to disentangle. Here's what it does:
-- Create infrastructure around AudioNode::UpdateOutputEnded to detect
when a node can no longer produce any output. When that becomes true,
disconnect it from the AudioNode graph (sketch 1 below).
-- Have AudioNode implement JSBindingFinalized so that it can be used as
an input to UpdateOutputEnded.
-- Give every AudioNode a MediaStream, and give every connection
a MediaInputPort (sketch 2 below).
-- Actually play the audio that reaches the AudioContext's destination node.
-- Force AudioContext to use the audio sample rate defined by MediaStreamGraph.
-- Fix AudioBufferSourceNode's start and stop methods so that they can throw
and take default 'when' parameters.
-- Create an AudioNodeStream for AudioBufferSourceNode and give it an
AudioBufferSourceNodeEngine that implements the node's processing. Set
parameters for this engine in the start() and stop() methods (sketch 3 below).
-- Create AudioBuffer::GetThreadSharedChannelsForRate, which is responsible
for stealing the contents of any JS array buffers and bundling them up
into a thread-shared, read-only buffer object that can be used as part
of an AudioChunk. This method will also be responsible for resampling
and caching as necessary (sketch 4 below).
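
Sketch 1 -- the output-ended bookkeeping, as a minimal self-contained
illustration. The member names, the input/output lists and the
canProduceOutput test are simplified stand-ins, not the actual Gecko code:

    #include <algorithm>
    #include <vector>

    // Simplified stand-in for the real AudioNode; illustrates the idea only.
    class AudioNode {
    public:
      // Called when the JS wrapper for this node is garbage-collected.
      void JSBindingFinalized() {
        mJSBindingFinalized = true;
        UpdateOutputEnded();
      }

      // Decide whether this node can ever produce output again. In this
      // simplified model a node is done once its JS wrapper is gone and
      // nothing feeds it; the real check also has to consider things like
      // pending playback on source nodes.
      void UpdateOutputEnded() {
        if (mOutputEnded) {
          return; // already disconnected
        }
        bool canProduceOutput = !mJSBindingFinalized || !mInputs.empty();
        if (!canProduceOutput) {
          mOutputEnded = true;
          DisconnectFromGraph();
        }
      }

    private:
      // Remove this node from its outputs' input lists, then let each
      // downstream node re-evaluate whether it can still produce output.
      void DisconnectFromGraph() {
        for (AudioNode* output : mOutputs) {
          auto& inputs = output->mInputs;
          inputs.erase(std::remove(inputs.begin(), inputs.end(), this),
                       inputs.end());
          output->UpdateOutputEnded();
        }
        mOutputs.clear();
      }

      std::vector<AudioNode*> mInputs;
      std::vector<AudioNode*> mOutputs;
      bool mJSBindingFinalized = false;
      bool mOutputEnded = false;
    };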
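
Sketch 2 -- per-node streams and per-connection ports. Again a simplified
stand-in; the real MediaStream and MediaInputPort types belong to the
MediaStreamGraph, carry far more state, and are owned by the graph rather
than by the node:

    #include <memory>
    #include <vector>

    // Stand-in for a MediaStreamGraph stream.
    struct MediaStream {};

    // One MediaInputPort per AudioNode connection: it routes one stream's
    // output into another stream.
    struct MediaInputPort {
      MediaStream* mSource = nullptr;
      MediaStream* mDestination = nullptr;
    };

    class AudioNode {
    public:
      AudioNode() : mStream(std::make_unique<MediaStream>()) {}

      // Connecting two AudioNodes allocates a MediaInputPort that feeds the
      // source node's stream into the destination node's stream.
      void Connect(AudioNode& aDestination) {
        auto port = std::make_unique<MediaInputPort>();
        port->mSource = mStream.get();
        port->mDestination = aDestination.mStream.get();
        mOutputPorts.push_back(std::move(port));
      }

    private:
      std::unique_ptr<MediaStream> mStream;  // every node owns a stream
      std::vector<std::unique_ptr<MediaInputPort>> mOutputPorts; // one per connection
    };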
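
Sketch 3 -- start()/stop() taking a default 'when' parameter, able to throw,
and forwarding scheduling parameters to the engine. The exception type and
the SendStartToEngine/SendStopToEngine helpers are illustrative only; the
real code presumably reports errors through the bindings' error machinery
and sets the parameters on the AudioNodeStream's engine:

    #include <stdexcept>

    class AudioBufferSourceNode {
    public:
      // 'when' defaults to 0.0 so script can call start()/stop() with no
      // argument.
      void Start(double aWhen = 0.0) {
        if (mStarted) {
          // A plain exception stands in for the bindings' error reporting.
          throw std::runtime_error("start() called more than once");
        }
        mStarted = true;
        // Forward the scheduling parameter to the engine running on the
        // graph thread (details elided in this sketch).
        SendStartToEngine(aWhen);
      }

      void Stop(double aWhen = 0.0) {
        if (!mStarted) {
          throw std::runtime_error("stop() called before start()");
        }
        SendStopToEngine(aWhen);
      }

    private:
      void SendStartToEngine(double aWhen) { (void)aWhen; /* set engine parameter */ }
      void SendStopToEngine(double aWhen) { (void)aWhen; /* set engine parameter */ }
      bool mStarted = false;
    };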
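
Sketch 4 -- thread-shared channel data. A shared_ptr to an immutable vector
stands in for the real thread-shared buffer object, the ArrayBuffer-stealing
step is modeled as a plain move, and resampling is omitted:

    #include <memory>
    #include <utility>
    #include <vector>

    // One channel's samples after they have been "stolen" from the
    // JS-visible array: read-only and safe to share across threads.
    using SharedChannelBuffer = std::shared_ptr<const std::vector<float>>;

    // Stand-in for an AudioChunk: channels referencing shared, immutable data.
    struct AudioChunk {
      std::vector<SharedChannelBuffer> mChannels;
    };

    class AudioBuffer {
    public:
      explicit AudioBuffer(std::vector<std::vector<float>> aJSChannels)
        : mJSChannels(std::move(aJSChannels)) {}

      // Move the contents out of the JS-owned arrays exactly once, wrap them
      // in immutable shared buffers, and cache the result so later calls are
      // cheap. Resampling to the requested rate is where the real method
      // would do more work.
      AudioChunk GetThreadSharedChannelsForRate(/* aRate elided */) {
        if (mShared.mChannels.empty()) {
          for (auto& channel : mJSChannels) {
            mShared.mChannels.push_back(
                std::make_shared<const std::vector<float>>(std::move(channel)));
          }
          mJSChannels.clear(); // the JS arrays are now logically emptied
        }
        return mShared;
      }

    private:
      std::vector<std::vector<float>> mJSChannels; // data still owned by JS arrays
      AudioChunk mShared;                          // cached thread-shared copy
    };
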
2013-02-05 12:07:25 +13:00