How can I use Resonance Audio's setPosition to define the position of an audio source on a JavaScript canvas element in pixels (x, y)?



I have a 2D canvas-based web audio game in which spatialized audio sources are positioned at specific pixel coordinates on the canvas using the Web Audio API.

So far I have successfully positioned each audio source precisely on the canvas element using the Web Audio API's PannerNode, like this:

var canvas = document.getElementById("map");
var context = canvas.getContext("2d");
var audioContext = new AudioContext();

function audioFileLoader(fileDirectory) {
    var soundObj = {};
    var panner = audioContext.createPanner();
    panner.panningModel = 'HRTF';

    // Fetch and decode the audio file
    var getSound = new XMLHttpRequest();
    soundObj.fileDirectory = fileDirectory;
    getSound.open("GET", soundObj.fileDirectory, true);
    getSound.responseType = "arraybuffer";
    getSound.onload = function() {
        audioContext.decodeAudioData(getSound.response, function(buffer) {
            soundObj.soundToPlay = buffer;
        });
    };
    getSound.send();

    // Position the source on the canvas in pixels (x, y)
    soundObj.position = function(x, y, z) {
        panner.setPosition(x, y, z);
    };

    return soundObj;
}

I am now trying to upgrade the audio spatialization to the Resonance Audio Web SDK so that I can take advantage of its arguably more advanced spatialization features.

How can I use Resonance Audio's setPosition to define the position of an audio source on the canvas element in pixels (x, y)?

I can't figure out how to convert the Resonance Audio SDK's native scale (meters) into pixel coordinates on the canvas element. I'm assuming that if I can work this out, I could also use it to define the size and position of different audio rooms in my 2D game, which would be very cool.
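For illustration, the kind of mapping I had imagined is a simple scale factor between canvas pixels and the SDK's meters; the PIXELS_PER_METER value below is just a made-up placeholder, not something taken from the SDK:

// Hypothetical scale: how many canvas pixels represent one meter in the Resonance scene
var PIXELS_PER_METER = 100;

function pixelsToMeters(xPx, yPx) {
    return {
        x: xPx / PIXELS_PER_METER,
        y: yPx / PIXELS_PER_METER
    };
}

// e.g. for a source drawn at canvas pixel (140, 150):
// var pos = pixelsToMeters(140, 150);
// source.setPosition(pos.x, pos.y, 0);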

Thanks.

So, it turns out that if you take the coordinates in pixels on the canvas and then use the same units (pixels) to position and update the listener, everything works fine. As long as you use the same units for the sources and the listener, they stay in the correct positions relative to each other and the Resonance Audio spatialization works:

// Set some global variables
var canvas = document.getElementById("map");
var context = canvas.getContext("2d");
var mouseX;
var mouseY;

// Get the mouse coordinates on the map (canvas) element
function updateCoords(event) {
    mouseX = event.offsetX;
    mouseY = event.offsetY;
}

// Mouse event handler
function moveAroundMap(event) {
    updateCoords(event);
    // mapX and mapY are assumed to be elements that display the current coordinates
    mapX.innerText = mouseX;
    mapY.innerText = mouseY;
    // Update the listener position on the canvas in pixels (x, y);
    // elevate the listener rather than lowering the sources
    resonanceAudioScene.setListenerPosition(mouseX, mouseY, -20);
}

canvas.addEventListener("mousemove", moveAroundMap, false);

// Create an AudioContext
let audioContext = new AudioContext();
// Create a (third-order Ambisonic) Resonance Audio scene and pass it
// the AudioContext.
let resonanceAudioScene = new ResonanceAudio(audioContext);
resonanceAudioScene.setAmbisonicOrder(3);
// Connect the scene’s binaural output to stereo out.
resonanceAudioScene.output.connect(audioContext.destination);
// Create an AudioElement.
let audioElement = document.createElement('audio');
// Load an audio file into the AudioElement.
audioElement.src = './samples/mono-seagulls.mp3';
audioElement.loop = true;
// Generate a MediaElementSource from the AudioElement.
let audioElementSource = audioContext.createMediaElementSource(audioElement);
// Add the MediaElementSource to the scene as an audio input source.
let source = resonanceAudioScene.createSource();
audioElementSource.connect(source.input);
// Set the source position on the canvas in pixels (x, y), using the same units as the listener.
source.setPosition(140, 150, 0);
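
One thing to note: nothing in the snippet above actually starts playback, and current browsers generally block audio until there has been a user gesture. A minimal (untested) way to kick things off would be something like:

// Start playback on the first user interaction, to satisfy autoplay policies
document.addEventListener('click', function() {
    audioContext.resume().then(function() {
        audioElement.play();
    });
}, { once: true });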

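As for the audio rooms mentioned in the question, the same consistent-units idea should extend to the scene's setRoomProperties(dimensions, materials) call. The sketch below is untested and the dimensions and material names are just placeholders; the room model is nominally specified in meters, so with pixel-sized dimensions the strength of the reflections and reverb may need some experimenting:

// Placeholder room roughly matching the canvas; match width/height/depth to
// whichever axes you are already using for setPosition()/setListenerPosition()
var roomDimensions = {
    width: canvas.width,
    height: canvas.height,
    depth: 50
};
// Wall materials, chosen from the material names the SDK supports
var roomMaterials = {
    left: 'brick-bare', right: 'brick-bare',
    front: 'brick-bare', back: 'brick-bare',
    down: 'grass', up: 'transparent'
};
resonanceAudioScene.setRoomProperties(roomDimensions, roomMaterials);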