Here is my problem: I want to display a pixel buffer that I computed into an MTKView. I searched MTLTexture, MTLBuffer, and other Metal objects, but I couldn't find any way to simply display a pixel buffer. Every tutorial I've seen is about rendering 3D objects with vertex and fragment shaders.
I think the buffer has to be displayed in the draw(in:) delegate function (probably together with an MTLRenderCommandEncoder), but again, I couldn't find any information about this.
I hope I'm not asking an obvious question.
Thanks!
Welcome!
I suggest you use Core Image for rendering the content of the pixel buffer into the view. This requires the least amount of manual Metal setup.
Set up the MTKView and some required objects as follows (assuming you have a view controller and storyboard setup):
import UIKit
import MetalKit
import CoreImage

class PreviewViewController: UIViewController {

    @IBOutlet weak var metalView: MTKView!

    var device: MTLDevice!
    var commandQueue: MTLCommandQueue!
    var ciContext: CIContext!

    var pixelBuffer: CVPixelBuffer?

    override func viewDidLoad() {
        super.viewDidLoad()

        self.device = MTLCreateSystemDefaultDevice()
        self.commandQueue = self.device.makeCommandQueue()

        self.metalView.delegate = self
        self.metalView.device = self.device
        // this allows us to render into the view's drawable
        self.metalView.framebufferOnly = false

        self.ciContext = CIContext(mtlDevice: self.device)
    }

}
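In case you still need to wrap your computed pixels in a CVPixelBuffer, here is one possible way to do it. This is a hypothetical helper, not part of the answer above; the 32BGRA pixel format and the IOSurface attribute (needed so Core Image/Metal can access the buffer efficiently) are assumptions you should adapt to your data:

```swift
import CoreVideo

// Hypothetical helper: wrap computed BGRA bytes (width * height * 4 bytes)
// in a CVPixelBuffer that Core Image can read.
func makePixelBuffer(width: Int, height: Int, pixels: [UInt8]) -> CVPixelBuffer? {
    // backing the buffer with an IOSurface makes it usable by Metal / Core Image
    let attributes = [kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary]
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_32BGRA,
                                     attributes as CFDictionary, &buffer)
    guard status == kCVReturnSuccess, let buffer = buffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    let base = CVPixelBufferGetBaseAddress(buffer)!
    // copy row by row, because the buffer's rows may be padded (stride != width * 4)
    pixels.withUnsafeBytes { src in
        for row in 0..<height {
            memcpy(base + row * bytesPerRow,
                   src.baseAddress! + row * width * 4,
                   width * 4)
        }
    }
    return buffer
}
```

You would assign the result to self.pixelBuffer whenever a new frame has been computed.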
In the delegate method, you can use Core Image to transform the pixel buffer to fit the contents of the view (this is a bonus; adapt it to your use case) and render it with the CIContext:
extension PreviewViewController: MTKViewDelegate {

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let pixelBuffer = self.pixelBuffer,
              let currentDrawable = view.currentDrawable,
              let commandBuffer = self.commandQueue.makeCommandBuffer() else { return }

        // turn the pixel buffer into a CIImage so we can use Core Image for rendering into the view
        let image = CIImage(cvPixelBuffer: pixelBuffer)

        // bonus: transform the image to aspect-fit the view's bounds
        let drawableSize = view.drawableSize
        let scaleX = drawableSize.width / image.extent.width
        let scaleY = drawableSize.height / image.extent.height
        let scale = min(scaleX, scaleY)
        let scaledImage = image.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

        // center in the view
        let originX = max(drawableSize.width - scaledImage.extent.size.width, 0) / 2
        let originY = max(drawableSize.height - scaledImage.extent.size.height, 0) / 2
        let centeredImage = scaledImage.transformed(by: CGAffineTransform(translationX: originX, y: originY))

        // Create a render destination that allows to lazily fetch the target texture,
        // which allows the encoder to process all CI commands _before_ the texture is actually available.
        // This gives a nice speed boost because the CPU doesn't need to wait for the GPU to finish
        // before starting to encode the next frame.
        // Also note that we don't pass a command buffer here, because according to Apple:
        // "Rendering to a CIRenderDestination initialized with a commandBuffer requires encoding all
        // the commands to render an image into the specified buffer. This may impact system responsiveness
        // and may result in higher memory usage if the image requires many passes to render."
        let destination = CIRenderDestination(width: Int(drawableSize.width),
                                              height: Int(drawableSize.height),
                                              pixelFormat: view.colorPixelFormat,
                                              commandBuffer: nil,
                                              mtlTextureProvider: { () -> MTLTexture in
                                                  return currentDrawable.texture
                                              })

        // render into the view's drawable
        let _ = try! self.ciContext.startTask(toRender: centeredImage, to: destination)

        // present the drawable
        commandBuffer.present(currentDrawable)
        commandBuffer.commit()
    }

}
There is a slightly simpler way for rendering into the drawable texture instead of using a CIRenderDestination, but this approach is recommended if you want to achieve high frame rates (see the comments in the code).
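For reference, the simpler variant mentioned above would render synchronously into the drawable's texture via CIContext.render. A minimal sketch (it reuses the properties from the answer and skips the aspect-fit transform; the color space is an assumption):

```swift
func draw(in view: MTKView) {
    guard let pixelBuffer = self.pixelBuffer,
          let currentDrawable = view.currentDrawable,
          let commandBuffer = self.commandQueue.makeCommandBuffer() else { return }

    let image = CIImage(cvPixelBuffer: pixelBuffer)

    // render directly into the drawable's texture; in contrast to the
    // CIRenderDestination path, this encodes all Core Image work into the
    // given command buffer before the method returns
    self.ciContext.render(image,
                          to: currentDrawable.texture,
                          commandBuffer: commandBuffer,
                          bounds: CGRect(origin: .zero, size: view.drawableSize),
                          colorSpace: CGColorSpaceCreateDeviceRGB())

    commandBuffer.present(currentDrawable)
    commandBuffer.commit()
}
```

This is easier to follow, but the CPU has to wait for the texture before encoding, which is why the lazy CIRenderDestination approach tends to sustain higher frame rates.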
I think I found a solution: https://developer.apple.com/documentation/metal/creating_and_sampling_textures?language=objc. In this example, they show how to render an image to a Metal view, using just a small vertex and fragment shader pair to render the texture onto a 2D square. I'll start from there. Not sure whether there isn't a better (simpler?) way to do this, but I guess that's how Metal wants us to do it.