I am creating an audio manager, and I am able to play a SoundEffectInstance and pass it an AudioListener and an AudioEmitter to play a 3D sound.
It sort of works. I have a test set up with one emitter on the left and one on the right, but when played, they both sound the same. I know I must be doing something daft; can anyone shed some light on what I am doing wrong?
Here is how I have the listener and emitters set up (simplified to the relevant bits; the positions are just my test values):
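```csharp
using Microsoft.Xna.Framework;        // Vector3
using Microsoft.Xna.Framework.Audio;  // AudioListener, AudioEmitter

// Simplified sketch; positions are just placeholder left/right test values.
AudioListener listener = new AudioListener
{
    Position = Vector3.Zero,
    Forward = Vector3.Forward,   // looking down -Z
    Up = Vector3.Up
};

AudioEmitter leftEmitter = new AudioEmitter
{
    Position = new Vector3(-5f, 0f, 0f)   // to the listener's left
};

AudioEmitter rightEmitter = new AudioEmitter
{
    Position = new Vector3(5f, 0f, 0f)    // to the listener's right
};
```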
I can then call my function to play the sound effect either without an emitter and a listener, or with them. The code to call the sound effect looks roughly like this (the asset name is just an example):
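```csharp
// Simplified sketch; "MySound" is just an example asset name.
SoundEffect sound = Content.Load<SoundEffect>("MySound");

// F1: plain 2D playback, no listener/emitter.
SoundEffectInstance flat = sound.CreateInstance();
flat.Play();

// F2 / F3: 3D playback, positioned by whichever emitter is passed in.
SoundEffectInstance positioned = sound.CreateInstance();
positioned.Apply3D(listener, leftEmitter);   // or rightEmitter for F3
positioned.Play();
```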
So, hitting F1, I get the sound I expect. Hitting F2 or F3, I get the sound, but slightly quieter and at the same volume in each earpiece. I am expecting F2 to sound like it's on the left and F3 as if it's on the right, but they both sound central and a little quieter…
I have ensured the MP3 is a stereo sound file with Audacity, and playing the L and R channels in there gives me the effect I expect from my code sample.
I have noticed it is altering the sound based on distance rather than position, which is odd. If I put one emitter further from the listener it's quieter, but that's not how this should work: it should also be acting on the balance, not just the volume…
Stereo sound effects cannot work in 3D; think about it. They already contain spatial data, and the engine should not make assumptions about that data, such as only using the left channel. The 3D path (Apply3D) is designed for mono sources, which it then pans and attenuates for you, so export your file as mono.
I use 3D sounds all the time, and the only issue I have with them is that matching the sound's rate of fall-off to what seems correct in the scene can be really tricky.
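A rough sketch of the usual pattern once the source is mono (the values here are illustrative, not ones to copy): re-apply the 3D parameters whenever something moves, and tune the global fall-off with SoundEffect.DistanceScale.

```csharp
// e.g. in Initialize(). DistanceScale is global: larger values make sounds
// fall off more slowly with distance, which is the knob for the fall-off problem.
SoundEffect.DistanceScale = 50f;
SoundEffect.DopplerScale = 0.1f;   // only matters if you set Velocity on listener/emitter

// e.g. called from Update() whenever the listener or an emitter has moved,
// so the pan and attenuation keep tracking the new positions.
void Update3DSound(SoundEffectInstance instance, AudioListener listener, AudioEmitter emitter)
{
    instance.Apply3D(listener, emitter);
}
```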
Also, are you setting the rest of the parameters on AudioListener? AudioListener.Forward is important for obvious reasons.
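For example, if the listener follows your camera, something along these lines keeps the orientation in sync (the camera here is just a stand-in for however you track the view; Forward and Up need to stay perpendicular unit vectors):

```csharp
// "camera" is hypothetical - a stand-in for whatever tracks the player's view.
listener.Position = camera.Position;
listener.Forward  = camera.Forward;   // unit vector
listener.Up       = camera.Up;        // unit vector, perpendicular to Forward
```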