Sound does not seem to work with a MediaStream object created from a remote track (coming from an SFU).
The same MediaStream object works fine with HTML audio and video elements.
Sound does work with a local media stream coming from getUserMedia.
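For reference, here is roughly what I am doing (a minimal sketch; `remoteStream` is the MediaStream I build from the track received from the SFU, `scene` is an existing Babylon scene, and I am assuming the Sound constructor accepts the MediaStream directly, which is how I am creating it):

```js
// Plain HTML audio element: this plays the remote audio fine
const audio = document.createElement("audio");
audio.autoplay = true;
audio.srcObject = remoteStream;
document.body.appendChild(audio);

// Babylon Sound fed with the same MediaStream: this stays silent
const remoteSound = new BABYLON.Sound("remoteSound", remoteStream, scene, null, {
    autoplay: true,
    streaming: true
});
```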
Update: when the autoPlay property of the HTML audio element is set to false, my Ubuntu system makes a pop. I can hear the same pop when I create a Sound. Therefore, the problem might be that the “autoplay” option of the Sound class somehow has no effect.
This is probably because browsers no longer let you play sound without a user interaction. Try running your code behind a click event (on a button, for instance).
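Something like this, for example (a sketch; `unlockButton` and `remoteSound` are placeholders for your own button and sound):

```js
const unlockButton = document.getElementById("unlockButton");
unlockButton.addEventListener("click", () => {
    // Resume the audio context after a user gesture, then start the sound
    BABYLON.Engine.audioEngine.unlock();
    remoteSound.play();
});
```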
With audio.srcObject commented out, there is no sound.
With addSound commented out, there is sound through the HTML audio element (autoPlay = true).
With nothing commented out, there is sound, but it echoes because now the Sound object actually works. sound.setVolume() also works, and when the volume is set to 0 the echoing goes away. The same echoing also happens when autoPlay (on the audio element) is set to false.
So there is definitely something weird going on with the Sound class. It only works when an audio element is rendered with srcObject set to the stream as well. The sound echoes, as if the stream were “cloned” multiple times.
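To make the combinations above concrete, the test code looks roughly like this (a sketch; `addSound` in my code is just a wrapper around `new BABYLON.Sound(...)`):

```js
// HTML audio element fed with the remote stream
const audio = document.createElement("audio");
audio.autoplay = true;            // same echo when this is false
audio.srcObject = remoteStream;   // commenting this out -> no sound at all
document.body.appendChild(audio);

// "addSound": creating the Babylon Sound from the same stream.
// Commenting this out -> sound only through the audio element, no echo.
const remoteSound = new BABYLON.Sound("remoteSound", remoteStream, scene, null, {
    autoplay: true,
    streaming: true
});

// With both enabled the sound plays but echoes; setting the volume to 0
// silences the echo, so the Sound node itself is clearly producing audio.
remoteSound.setVolume(0);
```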
There is no way I can get a track created through WebRTC in the Playground. A local stream from getUserMedia works as expected, probably because its audio track is not produced by WebRTC (not sure about this). But then again, if there were an issue with that WebRTC track, how would the HTML audio/video elements work just fine?
I think the issue could be the integration of the WebRTC track/stream with the AudioContext/AudioNode interfaces.
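My guess (purely an assumption about the internals) is that the Sound class ends up routing the stream through the standard Web Audio path, which is exactly where Chrome has known problems with remote WebRTC tracks:

```js
// Plain Web Audio sketch of what I suspect happens under the hood (assumption)
const audioCtx = new AudioContext();
const source = audioCtx.createMediaStreamSource(remoteStream);
const gain = audioCtx.createGain();

source.connect(gain);
gain.connect(audioCtx.destination);
// In Chrome, a MediaStreamAudioSourceNode created from a remote WebRTC stream
// can stay silent unless the stream is also attached to an <audio>/<video> element.
```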
I can create a GitHub repo for this. If someone has a UNIX system, they can test it.
Without the hack, there must be an audio element with srcObject set to the stream for the sound to work. If such an audio element is present and its autoPlay is set to false, everything works as expected: no echo, and sound stop/play/setVolume all behave correctly.
Finally got some time to play around with this. A couple of issues: my Ubuntu system fails to build Babylon. Constant Firefox crashes, multiple Firefox windows created, timeouts, and builds failing at different tests (some runs passed until the 126th test and then failed). Weird random stuff. My little Surface laptop completes the build, though.
Is there any way to include dependencies in the build so that I can import and use them in index.js in ./localDev? I need socket.io to communicate with a server to create a stream. I could use the dist directory, but it would be a chore to rebuild every time.
Just an update: this issue is a known bug in Chrome; Firefox works perfectly.
In the last comment on https://bugs.chromium.org/p/chromium/issues/detail?id=121673#c96 they say that every AudioNode should be connected to AudioContext.destination. I tried that locally with no success, as have a bunch of other people. The only fix that works is to create an audio element in the Sound constructor and mute it.
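A minimal sketch of that workaround, assuming a hypothetical helper that the Sound constructor could call when it is given a MediaStream (names are mine, not the actual Babylon code):

```js
// Attach the stream to a muted, dummy audio element so Chrome keeps the
// remote WebRTC track flowing into the Web Audio graph.
function applyChromeRemoteStreamWorkaround(mediaStream) {
    const dummyAudio = document.createElement("audio");
    dummyAudio.muted = true;            // we only need it to "consume" the stream
    dummyAudio.srcObject = mediaStream;
    // play() can reject before a user gesture; safe to ignore here
    dummyAudio.play().catch(() => {});
    return dummyAudio;                  // keep a reference so it isn't garbage collected
}
```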
I am going to use the fix in my fork and see how it goes. I don't know if you want it in the source code, but if you do, let me know and I'll create a PR.