I'm using Agora.io in Unity for screen sharing, and it works fine when two desktop PCs are involved. Now I'm trying to achieve the same thing with an Oculus Quest and a PC. The PC has a RawImage texture that should display the Oculus screen view. Unfortunately, there is no feed at all, just a black screen. Mind you, when two PCs or even an Android phone are connected, the screen view shows up and works well. It only fails when the Oculus Quest is connected. I've even granted the Oculus all the permissions needed for this, but it still doesn't work.

Edit: I know I have to change Screen.width and Screen.height to a custom RenderTexture and attach it to a camera. I did that too, but this time the output was empty even in desktop mode.
using System;
using System.Collections;
using System.Collections.Generic;
using System.Globalization;
using System.Runtime.InteropServices;
using agora_gaming_rtc;
using UnityEngine;
using UnityEngine.UI;

public class ScreenShare : MonoBehaviour {
    Texture2D mTexture;
    Rect mRect;
    [SerializeField]
    private string appId = "Your_AppID";
    [SerializeField]
    private string channelName = "agora";
    public IRtcEngine mRtcEngine;
    int i = 100;

    void Start () {
        Debug.Log ("ScreenShare Activated");
        mRtcEngine = IRtcEngine.getEngine (appId);
        // Enable logging
        mRtcEngine.SetLogFilter (LOG_FILTER.DEBUG | LOG_FILTER.INFO | LOG_FILTER.WARNING | LOG_FILTER.ERROR | LOG_FILTER.CRITICAL);
        mRtcEngine.SetParameters ("{\"rtc.log_filter\": 65535}");
        // Configure the external video source
        mRtcEngine.SetExternalVideoSource (true, false);
        // Start video mode
        mRtcEngine.EnableVideo ();
        // Allow camera output callback
        mRtcEngine.EnableVideoObserver ();
        // Join the channel
        mRtcEngine.JoinChannel (channelName, null, 0);
        // Create a rectangle the width and height of the screen
        mRect = new Rect (0, 0, Screen.width, Screen.height);
        // Create a texture the size of the rectangle just created
        mTexture = new Texture2D ((int) mRect.width, (int) mRect.height, TextureFormat.BGRA32, false);
    }

    void Update () {
        // Start the screen-share coroutine each frame
        StartCoroutine (shareScreen ());
    }

    // Screen share
    IEnumerator shareScreen () {
        yield return new WaitForEndOfFrame ();
        // Read the pixels inside the rectangle
        mTexture.ReadPixels (mRect, 0, 0);
        // Apply the pixels read from the rectangle to the texture
        mTexture.Apply ();
        // Get the raw texture data from the texture as a byte array
        byte[] bytes = mTexture.GetRawTextureData ();
        // Size of the byte buffer (not used below)
        int size = Marshal.SizeOf (bytes[0]) * bytes.Length;
        // Check whether an engine instance has already been created
        IRtcEngine rtc = IRtcEngine.QueryEngine ();
        // If the engine is present
        if (rtc != null) {
            // Create a new external video frame
            ExternalVideoFrame externalVideoFrame = new ExternalVideoFrame ();
            // Set the buffer type of the video frame
            externalVideoFrame.type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA;
            // Set the video pixel format
            externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA;
            // Apply the raw data pulled from the rectangle created earlier to the video frame
            externalVideoFrame.buffer = bytes;
            // Set the width of the video frame, in pixels
            externalVideoFrame.stride = (int) mRect.width;
            // Set the height of the video frame
            externalVideoFrame.height = (int) mRect.height;
            // Crop pixels from the sides of the frame
            externalVideoFrame.cropLeft = 10;
            externalVideoFrame.cropTop = 10;
            externalVideoFrame.cropRight = 10;
            externalVideoFrame.cropBottom = 10;
            // Rotate the video frame (0, 90, 180, or 270)
            externalVideoFrame.rotation = 180;
            // Use i as an increasing timestamp
            externalVideoFrame.timestamp = i++;
            // Push the external video frame just created
            int a = rtc.PushVideoFrame (externalVideoFrame);
            Debug.Log (" pushVideoFrame = " + a);
        }
    }
}
How are you managing the RenderTexture? Is it linked to a camera? You should assign the RenderTexture to a camera and read the data from it. Here is an example from another project so you can see how RenderTexture data is used.
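A minimal sketch of that pattern, assuming a camera and a RenderTexture assigned in the Inspector (the class and field names here are placeholders, not from the asker's project). The key step is setting `RenderTexture.active` so `ReadPixels` reads from the render texture instead of the backbuffer:

```csharp
using System.Collections;
using UnityEngine;

public class RenderTextureCapture : MonoBehaviour {
    // Camera whose output is captured into renderTexture
    public Camera captureCamera;
    public RenderTexture renderTexture;
    Texture2D readTexture;

    void Start () {
        // Route the camera's output into the render texture
        captureCamera.targetTexture = renderTexture;
        // CPU-readable texture matching the render texture's size
        readTexture = new Texture2D (renderTexture.width, renderTexture.height, TextureFormat.RGBA32, false);
    }

    IEnumerator CaptureFrame () {
        yield return new WaitForEndOfFrame ();
        // Redirect ReadPixels to the render texture instead of the screen
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = renderTexture;
        readTexture.ReadPixels (new Rect (0, 0, renderTexture.width, renderTexture.height), 0, 0);
        readTexture.Apply ();
        RenderTexture.active = previous;
        // These bytes can then be pushed as a VIDEO_PIXEL_RGBA external video frame
        byte[] bytes = readTexture.GetRawTextureData ();
    }
}
```

With this in place, the `mRect` built from `Screen.width`/`Screen.height` in the question's code is replaced by the render texture's own dimensions.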
Also note that you are following an outdated tutorial; the API changed slightly after an SDK update, so its examples are out of date as well. The pixel format should be RGBA rather than BGRA for cross-platform compatibility:
externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA;
Also switch the graphics API from Vulkan to OpenGLES3. My guess is that the ReadPixels() function does not work on the Quest 2 under the Vulkan API.
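That switch is normally done in Player Settings (uncheck Auto Graphics API for Android and reorder the list), but if you prefer to script it, a sketch using Unity's editor API (this must live in an Editor folder and only runs in the Editor):

```csharp
using UnityEditor;
using UnityEngine.Rendering;

public static class ForceGLES3 {
    [MenuItem ("Tools/Force OpenGLES3 on Android")]
    static void Apply () {
        // Disable "Auto Graphics API" so the explicit list below is used
        PlayerSettings.SetUseDefaultGraphicsAPIs (BuildTarget.Android, false);
        // Make OpenGLES3 the only (and therefore first) graphics API for Android
        PlayerSettings.SetGraphicsAPIs (BuildTarget.Android,
            new [] { GraphicsDeviceType.OpenGLES3 });
    }
}
```

Either way, rebuild the Quest APK after the change; the graphics API is baked in at build time.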