I have a server that I can communicate with over WebSockets, and it delivers streaming audio to me. I can already connect and receive the streamed audio data, and I would like to stream it into BabylonJS. I have looked through all the current forum posts, but I haven't managed to piece together a working solution yet.
Could anyone help me with some code that sets up the necessary audio sources, tracks, streams, etc., and essentially leaves me with a buffer I can append data to (as it comes in from the server) so that it plays back?
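For context, here is the kind of buffering bookkeeping I'm imagining. This is just my own sketch (StreamScheduler and its methods are names I made up, not a BabylonJS or Web Audio API); the actual audio wiring is the part I can't figure out:

```typescript
// My own sketch, not a real API: each incoming chunk gets assigned a
// start time on the audio clock so playback stays gapless.
class StreamScheduler {
  private nextStart = 0;

  // now: current audio clock time in seconds.
  // chunkDuration: decoded length of the incoming chunk in seconds.
  // Returns the time at which the chunk should start playing.
  schedule(now: number, chunkDuration: number): number {
    // Append to the end of what's already queued, unless we've
    // fallen behind real time, in which case restart from "now".
    const start = Math.max(this.nextStart, now);
    this.nextStart = start + chunkDuration;
    return start;
  }
}
```

The idea would be that each chunk from the WebSocket gets decoded, then started at `schedule(currentTime, duration)`, but I don't know how to connect something like this to a BabylonJS Sound.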
I tried adding an audio element in my HTML:
<audio id="audio"></audio>
Then over in BabylonJS:
const audio_element = document.getElementById("audio") as HTMLMediaElement;
const media_source = new MediaSource();

if (audio_element && engine.getRenderingCanvas()) {
    audio_element.src = URL.createObjectURL(media_source);
    const media_stream = engine.getRenderingCanvas()!.captureStream();
    const sound = new Sound("Orson.wav", media_stream, scene, null, { autoplay: true, streaming: true, spatialSound: false });

    media_source.onsourceopen = function () {
        const source_buffer = media_source.addSourceBuffer("audio/mpeg");
        my_audio_service.onAudioData = (data) => {
            source_buffer.appendBuffer(data);
        };
    };
}
The Sound fails to create with an error saying no tracks are attached. Also, for addSourceBuffer I initially tried to pass WAV as the MIME type, but every WAV variation I tried was rejected as unsupported.
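For reference, this is how I probed which MIME types the browser would accept (MediaSource.isTypeSupported is the real API; the probe helper is just mine):

```typescript
// MIME types I tried; isTypeSupported reports which of these
// addSourceBuffer will accept in the current browser.
const candidates = [
  "audio/wav",
  "audio/wave",
  "audio/x-wav",
  "audio/mpeg",
  'audio/webm; codecs="opus"',
];

// Filter the candidates through a support predicate. In the browser
// I pass (t) => MediaSource.isTypeSupported(t).
const probe = (isSupported: (t: string) => boolean): string[] =>
  candidates.filter(isSupported);
```

In my browser none of the WAV variants came back as supported, which matches the addSourceBuffer errors I saw.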
I feel like I must be going about this the wrong way, making it more complicated than it needs to be.