I am using ffmpeg (through JavaCV) to capture a 30-second video. My onPreviewFrame callback is below:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    if (yuvIplimage != null && recording && rec) {
        new SaveFrame().execute(data);
    }
}
}
The SaveFrame class is below:
private class SaveFrame extends AsyncTask<byte[], Void, File> {
    long t;

    protected File doInBackground(byte[]... arg) {
        t = 1000 * (System.currentTimeMillis() - firstTime - pausedTime);
        toSaveFrames++;
        File pathCache = new File(Environment.getExternalStorageDirectory() + "/DCIM",
                (System.currentTimeMillis() / 1000L) + "_" + toSaveFrames + ".tmp");
        BufferedOutputStream bos;
        try {
            bos = new BufferedOutputStream(new FileOutputStream(pathCache));
            bos.write(arg[0]);
            bos.flush();
            bos.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
            pathCache = null;
            toSaveFrames--;
        } catch (IOException e) {
            e.printStackTrace();
            pathCache = null;
            toSaveFrames--;
        }
        return pathCache;
    }

    @Override
    protected void onPostExecute(File filename) {
        if (filename != null) {
            savedFrames++;
            tempList.add(new FileFrame(t, filename));
        }
    }
}
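FileFrame is not shown above; it is just a small holder pairing the timestamp with the cached file, roughly along these lines:

// FileFrame simply pairs a frame's timestamp with the temp file holding its bytes.
private static class FileFrame {
    final long time;   // timestamp in microseconds (1000 * elapsed ms, as above)
    final File file;   // cached raw preview bytes on disk

    FileFrame(long time, File file) {
        this.time = time;
        this.file = file;
    }
}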
Finally, I add all the frames, applying the crop and rotation:
private class AddFrame extends AsyncTask<Void, Integer, Void> {
    private int serial = 0;

    @Override
    protected Void doInBackground(Void... params) {
        for (int i = 0; i < tempList.size(); i++) {
            byte[] bytes = new byte[(int) tempList.get(i).file.length()];
            try {
                BufferedInputStream buf = new BufferedInputStream(new FileInputStream(tempList.get(i).file));
                buf.read(bytes, 0, bytes.length);
                buf.close();
                IplImage image = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
                // final int startY = 640*(480-480)/2;
                // final int lenY = 640*480;
                // yuvIplimage.getByteBuffer().put(bytes, startY, lenY);
                // final int startVU = 640*480 + 640*(480-480)/4;
                // final int lenVU = 640*480/2;
                // yuvIplimage.getByteBuffer().put(bytes, startVU, lenVU);
                if (tempList.get(i).time > recorder.getTimestamp()) {
                    recorder.setTimestamp(tempList.get(i).time);
                }
                image = cropImage(image);
                image = rotate(image, 270);
                // image = rotateImage(image);
                recorder.record(image);
                Log.i(LOG_TAG, "record " + i);
                image = null;
                serial++;
                publishProgress(serial);
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } catch (com.googlecode.javacv.FrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }
        return null;
    }

    @Override
    protected void onProgressUpdate(Integer... serial) {
        int value = serial[0];
        creatingProgress.setProgress(value);
    }

    @Override
    protected void onPostExecute(Void v) {
        creatingProgress.dismiss();
        if (recorder != null && recording) {
            recording = false;
            Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
            try {
                recorder.stop();
                recorder.release();
                finish();
                startActivity(new Intent(RecordActivity.this, AnswerViewActivity.class));
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
            recorder = null;
        }
    }
}
My crop and rotate methods are below:
private IplImage cropImage(IplImage src) {
    cvSetImageROI(src, r);
    IplImage cropped = IplImage.create(imageHeight, imageHeight, IPL_DEPTH_8U, 2);
    cvCopy(src, cropped);
    return cropped;
}

public static IplImage rotate(IplImage image, double angle) {
    IplImage copy = opencv_core.cvCloneImage(image);
    IplImage rotatedImage = opencv_core.cvCreateImage(opencv_core.cvGetSize(copy), copy.depth(), copy.nChannels());
    CvMat mapMatrix = opencv_core.cvCreateMat(2, 3, opencv_core.CV_32FC1);
    // Define mid point
    CvPoint2D32f centerPoint = new CvPoint2D32f();
    centerPoint.x(copy.width() / 2);
    centerPoint.y(copy.height() / 2);
    // Get rotation matrix
    opencv_imgproc.cv2DRotationMatrix(centerPoint, angle, 1.0, mapMatrix);
    // Rotate the image
    opencv_imgproc.cvWarpAffine(copy, rotatedImage, mapMatrix,
            opencv_imgproc.CV_INTER_CUBIC + opencv_imgproc.CV_WARP_FILL_OUTLIERS,
            opencv_core.cvScalarAll(170));
    opencv_core.cvReleaseImage(copy);
    opencv_core.cvReleaseMat(mapMatrix);
    return rotatedImage;
}
My final video is cropped and rotated, but green frames are mixed in with the colored frames. How can I fix this? I don't know much about IplImage. Some blogs mention the YUV format and say that you first need to convert the Y plane and then the UV plane. How should that be done?
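From what I understand of those blogs, the copy they describe would look roughly like this (a sketch only, assuming my 640x480 NV21 preview buffer like the commented-out offsets above; fillYuvImage is just an illustrative name, and I am not sure this alone fixes the colors):

// Sketch of the "Y first, then UV" copy the blogs describe, assuming an NV21
// preview buffer: the full-resolution Y plane comes first, followed by the
// half-size interleaved V/U plane.
private void fillYuvImage(IplImage yuvImage, byte[] nv21, int width, int height) {
    int lenY = width * height;              // luma (Y) plane
    int lenVU = width * height / 2;         // interleaved V/U plane
    java.nio.ByteBuffer buf = yuvImage.getByteBuffer();
    buf.position(0);
    buf.put(nv21, 0, lenY);                 // copy Y first ...
    buf.put(nv21, lenY, lenVU);             // ... then V/U
}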
I have modified the onPreviewFrame method of this open-source Android Touch To Record library to transpose and resize the captured frames.
I defined "yuvIplImage" as follows in my setCameraParams() method:
IplImage yuvIplImage = IplImage.create(mPreviewSize.height, mPreviewSize.width, opencv_core.IPL_DEPTH_8U, 2);
Likewise, initialize your videoRecorder object as follows, passing the height as the width and vice versa:
//call initVideoRecorder() method like this to initialize videoRecorder object of FFmpegFrameRecorder class.
initVideoRecorder(strVideoPath, mPreview.getPreviewSize().height, mPreview.getPreviewSize().width, recorderParameters);
//method implementation
public void initVideoRecorder(String videoPath, int width, int height, RecorderParameters recorderParameters) {
    Log.e(TAG, "initVideoRecorder");
    videoRecorder = new FFmpegFrameRecorder(videoPath, width, height, 1);
    videoRecorder.setFormat(recorderParameters.getVideoOutputFormat());
    videoRecorder.setSampleRate(recorderParameters.getAudioSamplingRate());
    videoRecorder.setFrameRate(recorderParameters.getVideoFrameRate());
    videoRecorder.setVideoCodec(recorderParameters.getVideoCodec());
    videoRecorder.setVideoQuality(recorderParameters.getVideoQuality());
    videoRecorder.setAudioQuality(recorderParameters.getVideoQuality());
    videoRecorder.setAudioCodec(recorderParameters.getAudioCodec());
    videoRecorder.setVideoBitrate(1000000);
    videoRecorder.setAudioBitrate(64000);
}
This is my onPreviewFrame() method:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    long frameTimeStamp = 0L;
    if (FragmentCamera.mAudioTimestamp == 0L && FragmentCamera.firstTime > 0L) {
        frameTimeStamp = 1000L * (System.currentTimeMillis() - FragmentCamera.firstTime);
    } else if (FragmentCamera.mLastAudioTimestamp == FragmentCamera.mAudioTimestamp) {
        frameTimeStamp = FragmentCamera.mAudioTimestamp + FragmentCamera.frameTime;
    } else {
        long l2 = (System.nanoTime() - FragmentCamera.mAudioTimeRecorded) / 1000L;
        frameTimeStamp = l2 + FragmentCamera.mAudioTimestamp;
        FragmentCamera.mLastAudioTimestamp = FragmentCamera.mAudioTimestamp;
    }
    synchronized (FragmentCamera.mVideoRecordLock) {
        if (FragmentCamera.recording && FragmentCamera.rec && lastSavedframe != null
                && lastSavedframe.getFrameBytesData() != null && yuvIplImage != null) {
            FragmentCamera.mVideoTimestamp += FragmentCamera.frameTime;
            if (lastSavedframe.getTimeStamp() > FragmentCamera.mVideoTimestamp) {
                FragmentCamera.mVideoTimestamp = lastSavedframe.getTimeStamp();
            }
            try {
                yuvIplImage.getByteBuffer().put(lastSavedframe.getFrameBytesData());
                IplImage bgrImage = IplImage.create(mPreviewSize.width, mPreviewSize.height, opencv_core.IPL_DEPTH_8U, 4); // In my case, mPreviewSize.width = 1280 and mPreviewSize.height = 720
                IplImage transposed = IplImage.create(mPreviewSize.height, mPreviewSize.width, yuvIplImage.depth(), 4);
                IplImage squared = IplImage.create(mPreviewSize.height, mPreviewSize.height, yuvIplImage.depth(), 4);
                int[] _temp = new int[mPreviewSize.width * mPreviewSize.height];
                // convert the NV21 preview bytes to a packed color image first (this is what avoids the green output)
                Util.YUV_NV21_TO_BGR(_temp, data, mPreviewSize.width, mPreviewSize.height);
                bgrImage.getIntBuffer().put(_temp);
                // rotate by transposing and flipping, then crop to a square
                opencv_core.cvTranspose(bgrImage, transposed);
                opencv_core.cvFlip(transposed, transposed, 1);
                opencv_core.cvSetImageROI(transposed, opencv_core.cvRect(0, 0, mPreviewSize.height, mPreviewSize.height));
                opencv_core.cvCopy(transposed, squared, null);
                opencv_core.cvResetImageROI(transposed);
                videoRecorder.setTimestamp(lastSavedframe.getTimeStamp());
                videoRecorder.record(squared);
            } catch (com.googlecode.javacv.FrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }
        lastSavedframe = new SavedFrames(data, frameTimeStamp);
    }
}
This code uses a method, "YUV_NV21_TO_BGR", which I found at this link.
Basically, this method solves what I call "the green devil problem on Android", exactly the issue you are having. I had the same problem and wasted almost 3-4 days on it. Before I added the "YUV_NV21_TO_BGR" method, when I just transposed the YuvIplImage, and more importantly used a combination of transpose and flip (with or without resizing), the resulting video had a green cast. The "YUV_NV21_TO_BGR" method saved the day. Thanks to @David Han from the Google Groups thread above.
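For reference, converters like that typically follow the standard Android NV21 decode loop. The sketch below (a hypothetical yuvNv21ToColor helper, not the actual code from the link) shows the idea; the final channel packing may need adjusting to match your 4-channel IplImage:

// Illustrative sketch only: the standard NV21 -> packed color decode loop, which is the
// kind of conversion Util.YUV_NV21_TO_BGR performs. The exact channel packing of the
// linked implementation may differ.
public static void yuvNv21ToColor(int[] out, byte[] nv21, int width, int height) {
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & nv21[yp]) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {          // NV21 stores V then U, interleaved, at half resolution
                v = (0xff & nv21[uvp++]) - 128;
                u = (0xff & nv21[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = y1192 + 1634 * v;
            int g = y1192 - 833 * v - 400 * u;
            int b = y1192 + 2066 * u;
            r = Math.max(0, Math.min(r, 262143));
            g = Math.max(0, Math.min(g, 262143));
            b = Math.max(0, Math.min(b, 262143));
            // pack one pixel per int; reorder the shifts if your buffer expects B,G,R,A
            out[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}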
You should also know that all of this processing (transpose, flip, and resize) in onPreviewFrame takes a lot of time, which severely hurts your frames per second (FPS). When I used this code inside the onPreviewFrame method, the FPS of the recorded video dropped from 30 fps to 3 fps.
I would advise against this approach. Instead, do the post-recording processing (transpose, flip, and resize) of your video file with JavaCV in an AsyncTask. I hope this helps.
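A minimal sketch of that post-processing idea, using the same old com.googlecode.javacv API as above (PostProcessTask and the file paths are my own placeholders, and audio is ignored for brevity):

// Re-open the finished file, rotate each frame off the UI thread, and write a rotated copy.
private class PostProcessTask extends AsyncTask<String, Void, Void> {
    @Override
    protected Void doInBackground(String... paths) {
        String inPath = paths[0];
        String outPath = paths[1];
        try {
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inPath);
            grabber.start();
            // width/height are swapped because every frame is transposed below
            FFmpegFrameRecorder rotatedRecorder = new FFmpegFrameRecorder(
                    outPath, grabber.getImageHeight(), grabber.getImageWidth(), 1);
            rotatedRecorder.setFormat("mp4");
            rotatedRecorder.setFrameRate(grabber.getFrameRate());
            rotatedRecorder.start();
            IplImage frame;
            while ((frame = grabber.grab()) != null) {
                IplImage rotated = IplImage.create(frame.height(), frame.width(),
                        frame.depth(), frame.nChannels());
                opencv_core.cvTranspose(frame, rotated);   // 90-degree rotation =
                opencv_core.cvFlip(rotated, rotated, 1);   // transpose + horizontal flip
                rotatedRecorder.record(rotated);
            }
            rotatedRecorder.stop();
            grabber.stop();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }
}

You would run it once recording has finished, for example: new PostProcessTask().execute(srcPath, dstPath);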
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // IplImage newImage = cvCreateImage(cvGetSize(yuvIplimage), IPL_DEPTH_8U, 1);
    if (recording) {
        videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);
        // wrap the NV21 preview bytes and convert them to BGR before any rotation
        yuvimage = IplImage.create(imageWidth, imageHeight * 3 / 2, IPL_DEPTH_8U, 1);
        yuvimage.getByteBuffer().put(data);
        rgbimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 3);
        opencv_imgproc.cvCvtColor(yuvimage, rgbimage, opencv_imgproc.CV_YUV2BGR_NV21);
        IplImage rotateimage = null;
        try {
            recorder.setTimestamp(videoTimestamp);
            // choose the flip direction for the transpose-based rotation
            int rot = 0;
            switch (degrees) {
                case 0:
                    rot = 1;
                    rotateimage = rotate(rgbimage, rot);
                    break;
                case 180:
                    rot = -1;
                    rotateimage = rotate(rgbimage, rot);
                    break;
                default:
                    rotateimage = rgbimage;
            }
            recorder.record(rotateimage);
        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }
}

IplImage rotate(IplImage IplSrc, int angle) {
    IplImage img = IplImage.create(IplSrc.height(), IplSrc.width(), IplSrc.depth(), IplSrc.nChannels());
    cvTranspose(IplSrc, img);
    cvFlip(img, img, angle);
    return img;
}
}
After a lot of searching, this is what worked for me.
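For completeness, the degrees value used in the switch above is not defined in the snippet; one common way to derive it (an assumption on my part, following the stock Camera.CameraInfo recipe) is:

// Assumed helper (not from the answer above): derives a rotation like "degrees" from the
// camera sensor orientation and the current display rotation.
public static int getRotationDegrees(Activity activity, int cameraId) {
    Camera.CameraInfo info = new Camera.CameraInfo();
    Camera.getCameraInfo(cameraId, info);
    int displayDegrees = 0;
    switch (activity.getWindowManager().getDefaultDisplay().getRotation()) {
        case Surface.ROTATION_0:   displayDegrees = 0;   break;
        case Surface.ROTATION_90:  displayDegrees = 90;  break;
        case Surface.ROTATION_180: displayDegrees = 180; break;
        case Surface.ROTATION_270: displayDegrees = 270; break;
    }
    if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        int result = (info.orientation + displayDegrees) % 360;
        return (360 - result) % 360;   // compensate for the front-camera mirror
    }
    return (info.orientation - displayDegrees + 360) % 360;
}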