I am using Google's sample code for the Camera2 API (the Camera2 API sample code). I also added a rectangular overlay on top of the camera preview. When I try to navigate to the next activity to display the captured image, I get a SIGSEGV error.

I think the problem comes from the canvas. I am not sure, but some answers here suggest this could be the cause, since I close the camera before firing the intent that starts the next activity.

The canvas uses the `Preview` width and height to draw the rectangle. Could that be a problem once the camera object has been closed?
矩形.java
```java
package com.example.googlecamera2;

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Rect;
import android.util.Log;
import android.view.View;

public class Rectangle extends View {
    Paint paint = new Paint();
    Context context;
    int width;
    int height;

    public Rectangle(Context context, int width, int height) {
        super(context);
        this.context = context;
        this.width = width;
        this.height = height;
    }

    @Override
    public void onDraw(Canvas canvas) {
        paint.setColor(Color.GREEN);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(6);
        Log.e("RECT", width + " " + height);
        Rect rect = new Rect(width / 8, height / 8, 7 * width / 8, 7 * height / 8);
        canvas.drawRect(rect, paint);
    }
}
```
Here is the callback where the image gets saved:
```java
    @Override
    public void onImageAvailable(ImageReader reader) {
        mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));
        closeCameraDevice();
        startActivity(new Intent(getActivity(), DisplayImage.class).putExtra("IMAGE", mFile.getPath()));
    }
};
```
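Note that `ImageSaver` is posted to `mBackgroundHandler`, while `closeCameraDevice()` runs immediately afterwards on the listener's thread, so the camera can be closed while the image is still being read. A sketch of a reordering that sequences everything on the same handler and rules out that race (it reuses the fields and helpers from the question; whether this alone removes the SIGSEGV is untested):

```java
@Override
public void onImageAvailable(ImageReader reader) {
    // Save first, then close and navigate, all queued on the same background
    // handler, so closeCameraDevice() cannot run while ImageSaver is still
    // reading from the acquired Image.
    mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));
    mBackgroundHandler.post(() -> {
        closeCameraDevice();
        startActivity(new Intent(getActivity(), DisplayImage.class)
                .putExtra("IMAGE", mFile.getPath()));
    });
}
```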
Here is the `ImageSaver` class:
```java
private class ImageSaver implements Runnable {

    /**
     * The JPEG image
     */
    private final Image mImage;
    /**
     * The file we save the image into.
     */
    private final File mFile;

    ImageSaver(Image image, File file) {
        mImage = image;
        mFile = file;
    }

    @Override
    public void run() {
        // System.gc();
        ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
        byte[] bytes = new byte[buffer.remaining()];
        buffer.get(bytes);
        capturedImage = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);

        // Rectangle is drawn by dividing the preview in 8 dimensions and the rectangle starts
        // on 2nd dimension and ends on 7th dimension so take 6 for dividing the width and height
        int left = capturedImage.getWidth() / 8;
        int top = capturedImage.getHeight() / 8;
        int cropWidth = 6 * left;
        int cropHeight = 6 * top;

        Matrix rotationMatrix = new Matrix();
        rotationMatrix.postRotate(90);

        Log.e("Image dim", capturedImage.getWidth() + " " + capturedImage.getHeight());
        Log.e("Crop dim", left + " " + top + " " + cropWidth + " " + cropHeight);

        croppedImage = Bitmap.createBitmap(capturedImage, left, top, cropWidth, cropHeight, rotationMatrix, false);
        // Destroy the original bitmap image
        capturedImage.recycle();

        FileOutputStream output = null;
        try {
            output = new FileOutputStream(mFile);
            croppedImage.compress(Bitmap.CompressFormat.JPEG, 100, output);
            output.flush();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            mImage.close();
            if (null != output) {
                try {
                    output.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
```
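The crop arithmetic in `run()` can be sanity-checked in isolation. Using the 4128x3096 take-picture size reported in the logcat, the crop rectangle stays inside the source bitmap, so `Bitmap.createBitmap` itself should not be the out-of-bounds culprit (a plain-Java sketch with no Android dependencies; `CropMath` is my own name):

```java
public class CropMath {
    // Reproduces ImageSaver's crop arithmetic: returns {left, top, cropWidth, cropHeight}.
    static int[] cropRect(int width, int height) {
        int left = width / 8;
        int top = height / 8;
        return new int[]{left, top, 6 * left, 6 * top};
    }

    public static void main(String[] args) {
        // 4128x3096 is the take-picture size reported in the logcat below.
        int[] r = cropRect(4128, 3096);
        System.out.println(r[0] + " " + r[1] + " " + r[2] + " " + r[3]); // 516 387 3096 2322
        // Bitmap.createBitmap(src, x, y, w, h, ...) requires x + w <= src.getWidth()
        // and y + h <= src.getHeight(), which holds here:
        System.out.println(r[0] + r[2] <= 4128 && r[1] + r[3] <= 3096); // true
    }
}
```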
This is where I draw the rectangle; `openCamera()` calls this method:
```java
private void setUpCameraOutputs(int width, int height) {
    Activity activity = getActivity();
    CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
    try {
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics
                    = manager.getCameraCharacteristics(cameraId);

            // We don't use a front facing camera in this sample.
            Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) {
                continue;
            }

            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                continue;
            }

            // For still image captures, we use the largest available size.
            Size largest = Collections.max(
                    Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)),
                    new CompareSizesByArea());
            mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(),
                    ImageFormat.JPEG, /*maxImages*/1);
            mImageReader.setOnImageAvailableListener(
                    mOnImageAvailableListener, mBackgroundHandler);

            // Find out if we need to swap dimension to get the preview size relative to sensor
            // coordinate.
            int displayRotation = activity.getWindowManager().getDefaultDisplay().getRotation();
            //noinspection ConstantConditions
            mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
            boolean swappedDimensions = false;
            switch (displayRotation) {
                case Surface.ROTATION_0:
                case Surface.ROTATION_180:
                    if (mSensorOrientation == 90 || mSensorOrientation == 270) {
                        swappedDimensions = true;
                    }
                    break;
                case Surface.ROTATION_90:
                case Surface.ROTATION_270:
                    if (mSensorOrientation == 0 || mSensorOrientation == 180) {
                        swappedDimensions = true;
                    }
                    break;
                default:
                    Log.e(TAG, "Display rotation is invalid: " + displayRotation);
            }

            Point displaySize = new Point();
            activity.getWindowManager().getDefaultDisplay().getSize(displaySize);
            int rotatedPreviewWidth = width;
            int rotatedPreviewHeight = height;
            int maxPreviewWidth = displaySize.x;
            int maxPreviewHeight = displaySize.y;

            if (swappedDimensions) {
                rotatedPreviewWidth = height;
                rotatedPreviewHeight = width;
                maxPreviewWidth = displaySize.y;
                maxPreviewHeight = displaySize.x;
            }

            if (maxPreviewWidth > MAX_PREVIEW_WIDTH) {
                maxPreviewWidth = MAX_PREVIEW_WIDTH;
            }
            if (maxPreviewHeight > MAX_PREVIEW_HEIGHT) {
                maxPreviewHeight = MAX_PREVIEW_HEIGHT;
            }

            // Danger, W.R.! Attempting to use too large a preview size could exceed the camera
            // bus' bandwidth limitation, resulting in gorgeous previews but the storage of
            // garbage capture data.
            mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
                    rotatedPreviewWidth, rotatedPreviewHeight, maxPreviewWidth,
                    maxPreviewHeight, largest);

            // We fit the aspect ratio of TextureView to the size of preview we picked.
            int orientation = getResources().getConfiguration().orientation;
            if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                mTextureView.setAspectRatio(
                        mPreviewSize.getWidth(), mPreviewSize.getHeight());
            } else {
                mTextureView.setAspectRatio(
                        mPreviewSize.getHeight(), mPreviewSize.getWidth());
            }

            // Add rectangle to the view
            Rectangle rectangle = new Rectangle(getActivity(), mPreviewSize.getHeight(), mPreviewSize.getWidth());
            linearLayout.addView(rectangle);

            // Check if the flash is supported.
            Boolean available = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
            mFlashSupported = available == null ? false : available;

            mCameraId = cameraId;
            return;
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    } catch (NullPointerException e) {
        // Currently an NPE is thrown when the Camera2API is used but not supported on the
        // device this code runs.
        ErrorDialog.newInstance(getString(R.string.camera_error))
                .show(getChildFragmentManager(), FRAGMENT_DIALOG);
    }
}
```
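The orientation-swap decision in the middle of this method is pure arithmetic and can be reproduced without any Android classes. A sketch using plain degree values in place of the `Surface.ROTATION_*` constants (`SwapCheck` is my own name, not part of the sample):

```java
public class SwapCheck {
    // Preview width/height must be swapped when the display rotation and the
    // sensor orientation differ by 90 or 270 degrees, mirroring the switch in
    // setUpCameraOutputs.
    static boolean swappedDimensions(int displayRotationDegrees, int sensorOrientation) {
        switch (displayRotationDegrees) {
            case 0:
            case 180:
                return sensorOrientation == 90 || sensorOrientation == 270;
            case 90:
            case 270:
                return sensorOrientation == 0 || sensorOrientation == 180;
            default:
                return false; // invalid rotation; the sample only logs this case
        }
    }

    public static void main(String[] args) {
        // Typical phone in portrait: 0-degree display, 90-degree sensor -> swap.
        System.out.println(swappedDimensions(0, 90));  // true
        // Same device rotated to landscape: no swap needed.
        System.out.println(swappedDimensions(90, 90)); // false
    }
}
```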
Stack trace:
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_0_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_0_apk.apk@classes.dex) because non-0 exit status
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_1_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_1_apk.apk@classes.dex) because non-0 exit status
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_3_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_3_apk.apk@classes.dex) because non-0 exit status
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_4_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_4_apk.apk@classes.dex) because non-0 exit status
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_5_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_5_apk.apk@classes.dex) because non-0 exit status
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_6_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_6_apk.apk@classes.dex) because non-0 exit status
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_7_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_7_apk.apk@classes.dex) because non-0 exit status
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_8_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_8_apk.apk@classes.dex) because non-0 exit status
W/art: Failed execv(/system/bin/dex2oat --runtime-arg -classpath --runtime-arg --debuggable --instruction-set=arm --instruction-set-features=smp,div,atomic_ldrd_strd --runtime-arg -Xrelocate --boot-image=/system/framework/boot.art --runtime-arg -Xms64m --runtime-arg -Xmx512m --compiler-filter=speed --instruction-set-variant=cortex-a15 --instruction-set-features=default --dex-file=/data/app/com.example.googlecamera2-1/split_lib_slice_9_apk.apk --oat-file=/data/dalvik-cache/arm/data@app@com.example.googlecamera2-1@split_lib_slice_9_apk.apk@classes.dex) because non-0 exit status
W/System: ClassLoader referenced unknown path: /data/app/com.example.googlecamera2-1/lib/arm
I/InstantRun: starting instant run server: is main process
W/art: Before Android 4.1, method android.graphics.PorterDuffColorFilter android.support.graphics.drawable.VectorDrawableCompat.updateTintFilter(android.graphics.PorterDuffColorFilter, android.content.res.ColorStateList, android.graphics.PorterDuff$Mode) would have incorrectly overridden the package-private method in android.graphics.drawable.Drawable
D/TextView: setTypeface with style : 0
D/ViewRootImpl: #1 mView = com.android.internal.policy.PhoneWindow$DecorView{bb95551 I.E...... R.....ID 0,0-0,0}
D/OpenGLRenderer: Use EGL_SWAP_BEHAVIOR_PRESERVED: true
D/libEGL: loaded /vendor/lib/egl/libGLES_mali.so
D/libEGL: eglInitialize EGLDisplay = 0xe0f717c4
I/OpenGLRenderer: Initialized EGL, version 1.4
D/mali_winsys: new_window_surface returns 0x3000, [720x1280]-format:1
I/CameraManagerGlobal: Connecting to camera service
I/CameraManager: Using legacy camera HAL.
D/libGLESv1: DTS_GLAPI : DTS is not allowed for Package : com.example.googlecamera2
I/CameraDeviceState: Legacy camera service transitioning to state CONFIGURING
I/RequestThread-0: Configure outputs: 2 surfaces configured.
D/Camera: app passed NULL surface
I/RequestThread-0: configureOutputs - set take picture size to 4128x3096
D/libEGL: eglInitialize EGLDisplay = 0xddaff404
D/mali_winsys: new_window_surface returns 0x3000, [960x720]-format:1
I/CameraDeviceState: Legacy camera service transitioning to state IDLE
I/RequestQueue: Repeating capture request set.
W/LegacyRequestMapper: convertRequestMetadata - control.awbRegions setting is not supported, ignoring value
Only received metering rectangles with weight 0.
D/ViewRootImpl: MSG_RESIZED_REPORT: ci=Rect(0, 0 - 0, 0) vi=Rect(0, 0 - 0, 0) or=1
W/LegacyRequestMapper: convertRequestToMetadata - Ignoring android.lens.focusDistance false, only 0.0f is supported
E/RECT: 720 960
I/Timeline: Timeline: Activity_idle id: android.os.BinderProxy@370aa78 time:79076526
D/libEGL: eglInitialize EGLDisplay = 0xddaff2f4
I/CameraDeviceState: Legacy camera service transitioning to state CAPTURING
D/libEGL: eglInitialize EGLDisplay = 0xe0f71614
D/libEGL: eglInitialize EGLDisplay = 0xddaff2f4
D/libEGL: eglInitialize EGLDisplay = 0xe0f71614
D/libEGL: eglInitialize EGLDisplay = 0xddaff2f4
D/ViewRootImpl: ViewPostImeInputStage processPointer 0
D/ViewRootImpl: ViewPostImeInputStage processPointer 1
W/LegacyRequestMapper: convertRequestMetadata - control.awbRegions setting is not supported, ignoring value
Only received metering rectangles with weight 0.
convertRequestToMetadata - Ignoring android.lens.focusDistance false, only 0.0f is supported
W/LegacyRequestMapper: convertRequestMetadata - control.awbRegions setting is not supported, ignoring value
Only received metering rectangles with weight 0.
convertRequestToMetadata - Ignoring android.lens.focusDistance false, only 0.0f is supported
I/RequestQueue: Repeating capture request cancelled.
I/RequestThread-0: Flushing all pending requests.
E/RequestQueue: cancel failed: no repeating request exists.
I/CameraDeviceState: Legacy camera service transitioning to state IDLE
W/LegacyRequestMapper: convertRequestMetadata - control.awbRegions setting is not supported, ignoring value
Only received metering rectangles with weight 0.
W/LegacyRequestMapper: convertRequestToMetadata - Ignoring android.lens.focusDistance false, only 0.0f is supported
I/CameraDeviceState: Legacy camera service transitioning to state CAPTURING
I/RequestThread-0: Received jpeg.
Producing jpeg buffer...
D/ImageReader_JNI: ImageReader_lockedImageSetup: Receiving JPEG in HAL_PIXEL_FORMAT_RGBA_8888 buffer.
I/CameraDeviceState: Legacy camera service transitioning to state IDLE
D/libEGL: eglTerminate EGLDisplay = 0xddaff384
D/libEGL: eglTerminate EGLDisplay = 0xddaff49c
eglTerminate EGLDisplay = 0xddaff424
E/BufferQueueProducer: [SurfaceTexture-1-14265-1] cancelBuffer: BufferQueue has been abandoned
E/BufferQueueProducer: [SurfaceTexture-1-14265-1] cancelBuffer: BufferQueue has been abandoned
E/BufferQueueProducer: [SurfaceTexture-1-14265-1] setBufferCount: BufferQueue has been abandoned
I/Timeline: Timeline: Activity_launch_request id:com.example.googlecamera2 time:79081932
A/libc: Fatal signal 11 (SIGSEGV), code 1, fault addr 0xdba03ff8 in tid 14345 (CameraBackgroun)
Application terminated.
I don't understand what exactly causes the SIGSEGV error. It happens most of the time, but not always.
So, I finally figured it out myself. The problem was the overlay I draw on the camera preview.

When I close the camera and try to leave the activity, the preview size variables have probably been destroyed, so the canvas can no longer access the width and height it needs to draw the overlay. I just had to return early from the `Rectangle` class's `onDraw()` if the camera has been closed, and that fixed the problem.
I did this with a simple boolean variable that I pass to the `Rectangle` class. If it is set to false, the rectangle is not drawn.
Before changing activities, I simply set `cameraOn` to `false`; when `onDraw()` is called after that, it just returns without drawing the rectangle because the variable is false, and that sorted out the SIGSEGV crash.