Render a SwiftUI view off-screen and save the view as a UIImage for sharing



I'm trying to create a share button with SwiftUI that, when pressed, shares a generated image. I've found a few tutorials that take a screenshot of the currently displayed view and convert it to a UIImage. But I want to build a view programmatically off-screen, save it to a UIImage, and then let the user share that image through a share sheet.

import SwiftUI
import SwiftyJSON
import MapKit

struct ShareRentalView: View {
    @State private var region = MKCoordinateRegion(center: CLLocationCoordinate2D(latitude: 32.786038, longitude: -117.237324), span: MKCoordinateSpan(latitudeDelta: 0.025, longitudeDelta: 0.025))
    @State var coordinates: [JSON] = []
    @State var origin: CGPoint? = nil
    @State var size: CGSize? = nil

    var body: some View {
        GeometryReader { geometry in
            VStack(spacing: 0) {
                ZStack {
                    HistoryMapView(region: region, pointsArray: $coordinates)
                        .frame(height: 300)
                }.frame(height: 300)
            }.onAppear {
                self.origin = geometry.frame(in: .global).origin
                self.size = geometry.size
            }
        }
    }

    func returnScreenShot() -> UIImage {
        return takeScreenshot(origin: self.origin.unsafelyUnwrapped, size: self.size.unsafelyUnwrapped)
    }
}

extension UIView {
    var renderedImage: UIImage {
        // Rect to capture.
        let rect = self.bounds
        // Create the bitmap context.
        UIGraphicsBeginImageContextWithOptions(rect.size, false, 0.0)
        let context: CGContext = UIGraphicsGetCurrentContext()!
        self.layer.render(in: context)
        // Get an image from the current context bitmap.
        let capturedImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
        UIGraphicsEndImageContext()
        return capturedImage
    }
}
extension View {
    func takeScreenshot(origin: CGPoint, size: CGSize) -> UIImage {
        let window = UIWindow(frame: CGRect(origin: origin, size: size))
        let hosting = UIHostingController(rootView: self)
        hosting.view.frame = window.frame
        window.addSubview(hosting.view)
        window.makeKeyAndVisible()
        return hosting.view.renderedImage
    }
}

This is my current idea in code. I've built a view, and onAppear sets the CGPoint and CGSize for the screenshot. Then there's an additional method that takes the screenshot of the view. The problem is that this view is never rendered, because I never add it to a parent view, since I don't want it to appear in front of the user. In the parent view I have:

struct HistoryCell: View {
    ...
    private var shareRental: ShareRentalView? = nil
    private var uiimage: UIImage? = nil
    ...
    init() {
        ...
        self.shareRental = ShareRentalView()
    }

    var body: some View {
        ...
        Button(action: { self.uiimage = self.shareRental?.returnScreenShot() }) {
            ...
        }
        ...
    }
}

This doesn't work, because the view I want to screenshot is never rendered. Is there a way to render it in memory or off-screen and then create an image from it? Or do I need to find another approach?
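For the sharing step itself (not shown above), the UIImage can go straight into a standard UIActivityViewController once it exists. A minimal sketch, where presentShareSheet and the key-window lookup are illustrative rather than part of the original code:

import UIKit

// Minimal sketch: present a share sheet for an already-rendered UIImage.
func presentShareSheet(for image: UIImage) {
    let activityVC = UIActivityViewController(activityItems: [image], applicationActivities: nil)
    // Find the key window's root view controller to present from (assumes a single-scene app).
    let keyWindow = UIApplication.shared.connectedScenes
        .compactMap { $0 as? UIWindowScene }
        .flatMap { $0.windows }
        .first { $0.isKeyWindow }
    keyWindow?.rootViewController?.present(activityVC, animated: true)
}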

This is what ultimately worked to take a screenshot of a view that is never displayed on screen and save it as a UIImage.

extension UIView {
    func asImage() -> UIImage {
        let format = UIGraphicsImageRendererFormat()
        format.scale = 1
        return UIGraphicsImageRenderer(size: self.layer.frame.size, format: format).image { context in
            self.drawHierarchy(in: self.layer.bounds, afterScreenUpdates: true)
        }
    }
}

extension View {
    func asImage() -> UIImage {
        let controller = UIHostingController(rootView: self)
        let size = controller.sizeThatFits(in: UIScreen.main.bounds.size)
        controller.view.bounds = CGRect(origin: .zero, size: size)
        let image = controller.view.asImage()
        return image
    }
}

Then, in the parent view:

var shareRental: ShareRentalView?

init() {
    ....
    self.shareRental = ShareRentalView()
}

var body: some View {
    Button(action: {
        let shareImage = self.shareRental?.asImage()
    }) {
        ...
    }
}

This got me almost all the way there. The map snapshot (MKMapSnapshotter) has a delay while loading, and the image creation happens too quickly, so the map is missing when the UIImage is created.

To work around the map loading delay, I created a class that builds all the UIImages up front and stores them in an array.

class MyUser: ObservableObject {
    ...
    public func buildHistoryRental() {
        self.historyRentals.removeAll()
        MapSnapshot().generateSnapshot(completion: self.snapShotRsp)
    }

    private func snapShotRsp(image: UIImage) {
        self.historyRentals.append(image)
    }
}
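For this to drive the UI, historyRentals has to be something the views can observe. A minimal sketch of the assumed declaration (the original code never shows it):

import UIKit
import Combine

class MyUser: ObservableObject {
    // Assumed declaration: publishing the array lets any SwiftUI view observing
    // MyUser refresh once the pre-built snapshot images arrive.
    @Published var historyRentals: [UIImage] = []
}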

Then I created a class that builds the snapshot images like this:

func generateSnapshot(completion: @escaping (UIImage) -> ()) {
    let mapSnapshotOptions = MKMapSnapshotter.Options()
    // Set the region of the map that is rendered (from the polyline's bounding rect).
    let polyLine = MKPolyline(coordinates: &yourCoordinates, count: yourCoordinates.count)
    let region = MKCoordinateRegion(polyLine.boundingMapRect)
    mapSnapshotOptions.region = region
    // Set the scale of the image. We'll just use the scale of the current device, which is 2x on Retina screens.
    mapSnapshotOptions.scale = UIScreen.main.scale
    // Set the size of the image output.
    mapSnapshotOptions.size = CGSize(width: IMAGE_VIEW_WIDTH, height: IMAGE_VIEW_HEIGHT)
    // Show buildings and points of interest on the snapshot.
    mapSnapshotOptions.showsBuildings = true
    mapSnapshotOptions.showsPointsOfInterest = true
    let snapshotter = MKMapSnapshotter(options: mapSnapshotOptions)
    var image: UIImage = UIImage()
    snapshotter.start(completionHandler: { (snapshot: MKMapSnapshotter.Snapshot?, error: Error?) -> Void in
        if error != nil {
            print("\(String(describing: error))")
        } else {
            image = self.drawLineOnImage(snapshot: snapshot.unsafelyUnwrapped)
        }
        completion(image)
    })
}
func drawLineOnImage(snapshot: MKMapSnapshotter.Snapshot) -> UIImage {
    let image = snapshot.image
    // For Retina screens.
    UIGraphicsBeginImageContextWithOptions(image.size, true, 0)
    // Draw the original snapshot image into the context.
    image.draw(at: CGPoint.zero)
    // Get the context for Core Graphics.
    let context = UIGraphicsGetCurrentContext()
    // Set the stroke width and color of the context.
    context!.setLineWidth(2.0)
    context!.setStrokeColor(UIColor.orange.cgColor)
    // Here is the trick:
    // We use addLine() and move() to draw the line, which is easy to understand.
    // The difficult part is that they both take CGPoint parameters, and it would be far too complex
    // to calculate those ourselves, so we use snapshot.point(for:) to save the pain.
    context!.move(to: snapshot.point(for: yourCoordinates[0]))
    for i in 0..<yourCoordinates.count {
        context!.addLine(to: snapshot.point(for: yourCoordinates[i]))
        context!.move(to: snapshot.point(for: yourCoordinates[i]))
    }
    // Apply the stroke to the context.
    context!.strokePath()
    // Get the image from the graphics context.
    let resultImage = UIGraphicsGetImageFromCurrentImageContext()
    // End the graphics context.
    UIGraphicsEndImageContext()
    return resultImage!
}

It's important to return the image through the callback. Trying to return the image directly from the function call produces a blank map.
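Concretely, the final share image can only be built inside (or after) that completion handler, once the map UIImage exists. A minimal sketch, where ShareRentalImageView is an illustrative wrapper (not from the original code) that shows the pre-rendered snapshot instead of a live map and can then be passed to the asImage() extension above:

import SwiftUI
import UIKit

// Illustrative wrapper: displays the pre-rendered map image rather than a live map,
// so View.asImage() has nothing asynchronous left to wait for.
struct ShareRentalImageView: View {
    let mapImage: UIImage

    var body: some View {
        Image(uiImage: mapImage)
            .resizable()
            .scaledToFill()
            .frame(height: 300)
            .clipped()
    }
}

// Inside the snapshot completion, once mapImage exists:
// let shareImage = ShareRentalImageView(mapImage: mapImage).asImage()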

Latest update