Is there a way to provide the sound engine an Audio Context that has already been created elsewhere on our webpage prior to Babylon loading to avoid creating a second context?
Similarly, is there a way to specify a custom MediaStreamAudioDestinationNode as the destination for the audio engine? We are trying to use this node to grab a media stream that may be sent back out via WebRTC.
Yeah, I would be happy to. It should not be too tricky because it just impacts the initialization of the audio graph, and the final destination everything gets routed to. Everything in between on the audio graph should be able to function exactly as is.
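A rough sketch of what that initialization could look like, assuming the engine accepted an options object (the names `createAudioEngine`, `audioContext`, and `audioDestination` are purely illustrative here, not an existing Babylon.js API):

```javascript
// Hypothetical engine setup: reuse a page-level AudioContext and a
// caller-supplied destination node instead of always creating new ones.
// `audioContext` and `audioDestination` option names are illustrative.
function createAudioEngine(options = {}) {
    // Reuse a context created elsewhere on the page, or fall back to a new one.
    const audioContext = options.audioContext || new AudioContext();
    // Route everything to a caller-supplied node (e.g. a
    // MediaStreamAudioDestinationNode) instead of context.destination.
    const destination = options.audioDestination || audioContext.destination;
    const masterGain = audioContext.createGain();
    masterGain.connect(destination);
    return { audioContext, masterGain, destination };
}
```

Everything between the sources and `masterGain` would stay exactly as it is today; only the two ends of the graph change.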
One other thing that might also be a neat feature: in the past, on some of our web audio managers, we have added the ability to inject custom web audio processing functions… Something like:
var music = new BABYLON.Sound("Music", "music.wav", scene, null, {
    loop: true,
    autoplay: true,
    customProcessing: myCustomProcessor
});
where:
function myCustomProcessor(sourceNode, destinationNode) {
    // custom Web Audio API processing injected into the graph
}
This adds the ability to take full advantage of the Web Audio API to create sound effects while still utilizing the core sound engine.
Processor was probably a very poor choice of words on my part there… I should have said something more generic for injecting non-deprecated API nodes onto the graph (convolution, filters, etc.).
Another option there would be to just have a sound constructor that takes an audio node as the source.
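That node-as-source constructor could be as thin as a wrapper that connects an arbitrary AudioNode through a gain stage. A minimal sketch, assuming nothing about Babylon's internals (`NodeSound` is a hypothetical name, not part of the library):

```javascript
// Sketch of a Sound-like wrapper that accepts any AudioNode as its source.
// `NodeSound` is a hypothetical name, not an existing Babylon.js class.
class NodeSound {
    constructor(name, sourceNode, destinationNode) {
        this.name = name;
        this.sourceNode = sourceNode;
        // Every AudioNode carries a reference to its owning context.
        this.gainNode = sourceNode.context.createGain();
        this.sourceNode.connect(this.gainNode);
        this.gainNode.connect(destinationNode);
    }
    setVolume(value) {
        this.gainNode.gain.value = value;
    }
}
```

The caller could then build whatever chain they like (convolver, filters, etc.) and hand the engine only the final node of that chain.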
Our working examples are in apps that require user accounts to log in… but the basic workflow looks like this:
private playSoundEffect(src: string) {
    let audioEl = document.createElement("audio");
    audioEl.src = src;
    //...
    myAudioManager.addAudioElement(audioEl, 1, myCustomAudioEffect);
    audioEl.play();
}
private myCustomAudioEffect(sourceNode, destinationNode) {
    let myCustomEffectNode; // whatever works... maybe make it sound like it's on an intercom
    let myCustomEffectNode2; // or sound like an alien
    sourceNode.connect(myCustomEffectNode);
    myCustomEffectNode.connect(myCustomEffectNode2);
    myCustomEffectNode2.connect(destinationNode);
}
// In the audio manager
public addAudioElement(audioEl, gain, customAudioEffect) {
    let audioSourceNode = this.audioContext.createMediaElementSource(audioEl);
    let gainNode = this.audioContext.createGain();
    gainNode.gain.value = gain;
    gainNode.connect(this.audioDestinationNode);
    if (customAudioEffect == undefined) {
        audioSourceNode.connect(gainNode);
    } else {
        customAudioEffect(audioSourceNode, gainNode);
    }
}
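As a concrete example of an effect function that could be passed to `addAudioElement` above, a bandpass filter gives a rough "intercom" voice. The filter values here are just illustrative:

```javascript
// Example customAudioEffect: a bandpass filter for a rough "intercom"
// sound. The frequency/Q values are illustrative, not tuned.
function intercomEffect(sourceNode, destinationNode) {
    const ctx = sourceNode.context;
    const bandpass = ctx.createBiquadFilter();
    bandpass.type = "bandpass";
    bandpass.frequency.value = 1800; // roughly the center of the telephone band
    bandpass.Q.value = 1.5;
    sourceNode.connect(bandpass);
    bandpass.connect(destinationNode);
}
```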
OK. Just thinking that you might not even need specific code for a MediaStreamAudioDestinationNode, if the sourceNode arg for that connection feature can be null.
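For the WebRTC side, the wiring we ultimately want is roughly the following; it only assumes access to the context, the engine's final output node, and an existing RTCPeerConnection (all three passed in here, since none of them would be created by this helper):

```javascript
// Route the engine's output into a MediaStreamAudioDestinationNode and
// send the resulting stream over WebRTC. `engineOutputNode` is whatever
// node the engine currently routes to context.destination.
function routeEngineToWebRtc(audioContext, engineOutputNode, peerConnection) {
    const streamDestination = audioContext.createMediaStreamDestination();
    engineOutputNode.connect(streamDestination);
    for (const track of streamDestination.stream.getAudioTracks()) {
        peerConnection.addTrack(track, streamDestination.stream);
    }
    return streamDestination;
}
```

The engine can keep (or drop) its connection to `context.destination` independently, so local monitoring and the outbound stream need not be mutually exclusive.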