How to push OpenCV images into a GStreamer pipeline for streaming via a TCPServer sink



I am trying to push images created with OpenCV into a GStreamer pipeline, in order to stream the video through a GStreamer TCPServerSink.

My GStreamer pipeline looks like this: AppSrc -> FFMpegColorSpace -> VP8Enc -> WebMMux -> TCPServerSink. The AppSrc is fed by creating OpenCV images and pushing them into the AppSrc via gst_app_src_push_buffer.
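For comparison, roughly the same pipeline can be tried from the shell with a test source substituted for the appsrc. This is my own sketch, not a command from the question; the port number 5000 is just an example:

```shell
# GStreamer 0.10 equivalent of the pipeline, with videotestsrc in place
# of the appsrc so it can be run without the Sandbox program.
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! vp8enc ! webmmux \
    ! tcpserversink port=5000
```

If this command streams correctly, the remaining elements negotiate fine and the problem is isolated to the caps the appsrc provides.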

I have written a small test program called "Sandbox" that reproduces the error GStreamer reports while the program runs.

My test system is:

  • Gentoo
  • Kernel: 3.10.0-rc3
  • OpenCV: 2.4.5
  • GStreamer: 0.10.36

I would like to understand why this error occurs, what I am doing wrong, and what a working solution might look like.

I hope you can help me with this problem.

I have set up a mini CMake project containing the code that produces the error. It consists of two files, CMakeLists.txt and Sandbox.cpp. CMakeLists.txt is the configuration describing how to build the Sandbox program contained in Sandbox.cpp. To build it, simply create a directory, e.g. "build", change into it and start an out-of-source build:

cmake ../

If the dependencies are installed, CMake now creates all the required files, so you only need to type:

make

to build the program.

####################################################################################################
# Project Information
####################################################################################################
project( Sandbox CXX )
cmake_minimum_required( VERSION 2.8 )
####################################################################################################
# G++ Options
####################################################################################################
set( CMAKE_C_COMPILER "gcc" )
set( CMAKE_CXX_COMPILER "g++" )
set( CMAKE_CXX_FLAGS "-g -O0 -std=c++11 -ggdb -Wall -W -Wunused-variable -Wunused-parameter -Wunused-function -Wunused -Woverloaded-virtual -Wwrite-strings --coverage" )
set( CMAKE_C_FLAGS "${CMAKE_CXX_FLAGS}" ) # Quoted, so the flags stay one string instead of becoming a semicolon-separated list.
####################################################################################################
# Resolve Dependencies
####################################################################################################
set( Boost_DEBUG 1 )
set( Boost_USE_MULTITHREADED 1 )
find_package( Boost 1.53 REQUIRED system thread timer )
find_package( PkgConfig )
pkg_check_modules( GSTREAMER_0_10 gstreamer-0.10 )
pkg_check_modules( GSTREAMER_0_10_APP gstreamer-app-0.10 )
include_directories( ${GSTREAMER_0_10_INCLUDE_DIRS} )
include_directories( ${GSTREAMER_0_10_APP_INCLUDE_DIRS} )
find_package( OpenCV REQUIRED )
####################################################################################################
# Project
####################################################################################################
add_executable( Sandbox Sandbox.cpp )
target_link_libraries( Sandbox ${Boost_LIBRARIES} ${GSTREAMER_0_10_LIBRARIES} ${GSTREAMER_0_10_APP_LIBRARIES} ${OpenCV_LIBS} )
set_target_properties( Sandbox PROPERTIES LINKER_LANGUAGE CXX )

The program itself looks like this:

// Standard C++ Libraries
#include <iostream>
#include <sstream>
#include <string>
// Boost Libraries
#include <boost/asio.hpp>
#include <boost/make_shared.hpp>
#include <boost/date_time/posix_time/posix_time.hpp>
#include <boost/shared_ptr.hpp>
#include <boost/thread.hpp>
// GStreamer
#include <gstreamer-0.10/gst/gst.h>
#include <gstreamer-0.10/gst/gstelement.h>
#include <gstreamer-0.10/gst/gstpipeline.h>
#include <gstreamer-0.10/gst/gstutils.h>
#include <gstreamer-0.10/gst/app/gstappsrc.h>
// OpenCV
// #include "cv.h"
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>

GMainLoop *glib_MainLoop;
unsigned int heartbeat_Intervall; ///< In Milliseconds
boost::shared_ptr<boost::asio::deadline_timer> heartbeat;
GstElement *source_OpenCV;
guint64 imagecounter;
void
GLib_MainLoop() {
    if( !g_main_loop_is_running( glib_MainLoop ) ) {
        std::cout << "Starting glib_MainLoop..." << std::endl;
        g_main_loop_run( glib_MainLoop );
        std::cout << "glib_MainLoop stopped." << std::endl;
    }
};
/// Creates an Image with a red filled Circle and the current Time displayed in it.
cv::Mat
Create_Image() {
    cv::Size size = cv::Size( 640, 480 );
    cv::Mat image = cv::Mat::zeros( size, CV_8UC3 );
    cv::Point center = cv::Point( size.width / 2, size.height / 2 );
    int thickness = -1;
    int lineType = 8;
    cv::circle( image, center, size.width / 4.0, cv::Scalar( 0, 0, 255 ), thickness, lineType );
    std::stringstream current_Time;
    boost::posix_time::time_facet *facet = new boost::posix_time::time_facet( "%Y.%m.%d %H:%M:%S.%f" );
    current_Time.imbue( std::locale( current_Time.getloc(), facet ) );
    current_Time << boost::posix_time::microsec_clock::universal_time();
    int font = cv::FONT_HERSHEY_SCRIPT_SIMPLEX;
    double fontsize = 1;
    int font_Thickness = 1;
    int baseline = 0;
    cv::Size textSize_01 = cv::getTextSize( current_Time.str(), font, fontsize, font_Thickness , &baseline );
    baseline += font_Thickness;
    cv::Point textOrg_01( ( image.cols - textSize_01.width ) / 2, ( image.rows + textSize_01.height * 2 ) / 2 );
    cv::Scalar textcolour = cv::Scalar( 0, 255, 0 );
    cv::putText( image, current_Time.str(), textOrg_01, font, fontsize, textcolour , font_Thickness , 1 );
    return image;
}
/// Creates a Graph of the created Pipeline including the contained Elements. The environment variable "GST_DEBUG_DUMP_DOT_DIR" must be set, e.g to /tmp/ to actually create the Graph.
/// Furthermore GST_DEBUG needs to be activated, e.g. with "GST_DEBUG=3".
/// So "GST_DEBUG=3 GST_DEBUG_DUMP_DOT_DIR=/tmp/" ./Sandbox would work.
/// The .dot file can be converted to a e.g. svg-Graphic with the following command (Package GraphViz): dot -Tsvg -oPipelineGraph.svg PipelineGraph.dot
void
Create_PipelineGraph( GstElement *pipeline ) {
    bool debug_active = gst_debug_is_active();
    gst_debug_set_active( 1 );
    GST_DEBUG_BIN_TO_DOT_FILE( GST_BIN( pipeline ), GST_DEBUG_GRAPH_SHOW_ALL, "PipelineGraph" );
    gst_debug_set_active( debug_active );
}
void
Push_new_Image( const boost::system::error_code &error ) {
    if( error != 0 ) {
        std::cout << "Error in Timer: " << error.message() << std::endl;
        return;
    }
    cv::Mat image = Create_Image();
    /// OpenCV stores images as BGR, so to get RGB the R and B channels need to be swapped.
    cv::cvtColor( image, image, CV_BGR2RGB );
    {
        /// How do i get the actual bpp and depth out of the cv::Mat?
        GstCaps *caps = gst_caps_new_simple( "video/x-raw-rgb", "width", G_TYPE_INT, image.cols, "height", G_TYPE_INT, image.rows, "framerate", GST_TYPE_FRACTION, 0, 1, NULL );
        g_object_set( G_OBJECT( source_OpenCV ), "caps", caps, NULL );
        gst_caps_unref( caps );
        IplImage img = image; ///< cv::Mat converts to an IplImage header without copying, so no manual allocation (and no leak) is needed.
        uchar *IMG_data = ( uchar* ) img.imageData;
        GstBuffer *buffer;
        {
            int bufferlength = image.cols * image.rows * image.channels();
            buffer = gst_buffer_new_and_alloc( bufferlength );
            /// Copy Data from OpenCV to GStreamer
            memcpy( GST_BUFFER_DATA( buffer ), IMG_data, GST_BUFFER_SIZE( buffer ) );
            GST_BUFFER_DURATION( buffer ) = gst_util_uint64_scale( 20, GST_MSECOND, 1 ); ///< Matches the 20 ms timestamp step below.
        }
        /// Setting the Metadata for the image to be pushed.
        {
            GstCaps *caps_Source = NULL;
            std::stringstream video_caps_text;
            video_caps_text << "video/x-raw-rgb,width=(int)" << image.cols << ",height=(int)" << image.rows << ",framerate=(fraction)0/1";
            caps_Source = gst_caps_from_string( video_caps_text.str().c_str() );
            if( !GST_IS_CAPS( caps_Source ) ) {
                std::cout << "Error creating Caps for OpenCV-Source, exiting...";
                exit( 1 );
            }
            gst_app_src_set_caps( GST_APP_SRC( source_OpenCV ), caps_Source );
            gst_buffer_set_caps( buffer, caps_Source );
            gst_caps_unref( caps_Source );
        }
        /// Set a continuously increasing timestamp.
        GST_BUFFER_TIMESTAMP( buffer ) = gst_util_uint64_scale( imagecounter * 20, GST_MSECOND, 1 );
        imagecounter += 1;
        /// Push Buffer into GStreamer-Pipeline
        GstFlowReturn rw;
        rw = gst_app_src_push_buffer( GST_APP_SRC( source_OpenCV ), buffer );
        if( rw != GST_FLOW_OK ) {
            std::cout << "Error push buffer to GStreamer-Pipeline, exiting...";
            exit( 1 );
        } else {
            std::cout << "GST_FLOW_OK " << "imagecounter: " << imagecounter << std::endl;
        }
    }
    /// Renew the Heartbeat-Timer
    heartbeat->expires_from_now( boost::posix_time::milliseconds( heartbeat_Intervall ) );
    heartbeat->async_wait( Push_new_Image );
}
int
main( int argc, char **argv ) {
    std::cout << "Sandbox started." << std::endl;
    /// ####################
    /// Initialise Sandbox
    /// ####################
    boost::shared_ptr<boost::asio::io_service> io_service = boost::make_shared<boost::asio::io_service>();
    boost::shared_ptr<boost::asio::io_service::work> work = boost::make_shared<boost::asio::io_service::work>( *io_service );
    boost::shared_ptr<boost::thread_group> threadgroup = boost::make_shared<boost::thread_group>();
    /// io_service callback for continuously feeding images into the GStreamer pipeline.
    /// I use it to push buffers into GStreamer whenever a new image is available, instead of being notified about a starving pipeline via GStreamer signals.
    heartbeat_Intervall = 1000; ///< In Milliseconds
    heartbeat = boost::make_shared<boost::asio::deadline_timer>( ( *( io_service.get() ) ) );
    std::cout << "Initialise GStreamer..." << std::endl;
    gst_init( &argc, &argv );
    glib_MainLoop = g_main_loop_new( NULL, 0 );
    std::cout << "Start GLib_MainLoop..." << std::endl;
    io_service->post( GLib_MainLoop );
    /// Create some Workerthreads
    for( std::size_t i = 0; i < 3; ++i )  {
        threadgroup->create_thread( boost::bind( &boost::asio::io_service::run, &( *io_service ) ) );
    }
    /// ####################
    /// Do the actual Work
    /// ####################
    GstElement *pipeline;
    GstElement *converter_FFMpegColorSpace;
    GstElement *converter_VP8_Encoder;
    GstElement *muxer_WebM;
    GstElement *sink_TCPServer;

    /// Create GStreamer Elements
    pipeline = gst_pipeline_new( "OpenCV_to_TCPServer" );
    if( !pipeline ) {
        std::cout << "Error creating Pipeline, exiting...";
        return 1;
    }
    {
        source_OpenCV = gst_element_factory_make( "appsrc", "Source_OpenCV" );
        if( !source_OpenCV ) {
            std::cout << "Error creating OpenCV-Source, exiting...";
            return 1;
        }
        gst_bin_add( GST_BIN( pipeline ), source_OpenCV );
    }
    {
        converter_FFMpegColorSpace = gst_element_factory_make( "ffmpegcolorspace", "Converter_FFMpegColorSpace" );
        if( !converter_FFMpegColorSpace ) {
            std::cout << "Error creating Converter_FFMpegColorSpace, exiting...";
            return 1;
        }
        gst_bin_add( GST_BIN( pipeline ), converter_FFMpegColorSpace );
    }
    {
        converter_VP8_Encoder = gst_element_factory_make( "vp8enc", "Converter_VP8_Encoder" );
        if( !converter_VP8_Encoder ) {
            std::cout << "Error creating Converter_VP8_Encoder, exiting...";
            return 1;
        }
        gst_bin_add( GST_BIN( pipeline ), converter_VP8_Encoder );
    }
    {
        muxer_WebM = gst_element_factory_make( "webmmux", "Muxer_WebM" );
        if( !muxer_WebM ) {
            std::cout << "Error creating Muxer_WebM, exiting...";
            return 1;
        }
        gst_bin_add( GST_BIN( pipeline ), muxer_WebM );
    }
    {
        sink_TCPServer = gst_element_factory_make( "tcpserversink", "Sink_TCPServer" );
        if( !sink_TCPServer ) {
            std::cout << "Error creating Sink_TCPServer, exiting...";
            return 1;
        }
        gst_bin_add( GST_BIN( pipeline ), sink_TCPServer );
    }

    /// Link GStreamer Elements
    if( !gst_element_link( source_OpenCV, converter_FFMpegColorSpace ) ) {
        std::cout << "Error linking source_OpenCV to converter_FFMpegColorSpace, exiting...";
        return 2;
    }
    if( !gst_element_link( converter_FFMpegColorSpace, converter_VP8_Encoder ) ) {
        std::cout << "Error linking converter_FFMpegColorSpace to converter_VP8_Encoder, exiting...";
        return 2;
    }
    if( !gst_element_link( converter_VP8_Encoder, muxer_WebM ) ) {
        std::cout << "Error linking converter_VP8_Encoder to muxer_WebM, exiting...";
        return 2;
    }
    if( !gst_element_link( muxer_WebM, sink_TCPServer ) ) {
        std::cout << "Error linking muxer_WebM to sink_TCPServer, exiting...";
        return 2;
    }

    /// Set State of the GStreamer Pipeline to Playing
    GstStateChangeReturn ret = gst_element_set_state( pipeline, GST_STATE_PLAYING );
    if( ret == GST_STATE_CHANGE_FAILURE ) {
        std::cout << "Error setting GStreamer-Pipeline to playing.";
        return 2;
    }
    Create_PipelineGraph( pipeline );

    /// Start the heartbeat that continuously creates new images
    heartbeat->expires_from_now( boost::posix_time::milliseconds( heartbeat_Intervall ) );
    heartbeat->async_wait( Push_new_Image );
    /// ####################
    /// Shutdown the Sandbox
    /// ####################
    std::cout << "Wait some Seconds before joining all Threads and shutdown the Sandbox..." << std::endl;
    boost::this_thread::sleep( boost::posix_time::seconds( 4 ) );
    std::cout << "Shutdown Sandbox..." << std::endl;
    g_main_loop_quit( glib_MainLoop );
    io_service->stop();
    while( !io_service->stopped() ) {
        boost::this_thread::sleep( boost::posix_time::seconds( 1 ) );
    }
    work.reset();
    threadgroup->join_all();
    g_main_loop_unref( glib_MainLoop );
    threadgroup.reset();
    work.reset();
    io_service.reset();
    std::cout << "Sandbox stopped" << std::endl;
}

I start the Sandbox program like this:

LC_ALL="C" GST_DEBUG=3 GST_DEBUG_DUMP_DOT_DIR=/tmp/ ./Sandbox

A graph of the current pipeline should then be created in /tmp/. The .dot file can be converted to e.g. an SVG graphic with:

dot -Tsvg -oPipelineGraph.svg PipelineGraph.dot

The error occurs directly after starting the program. These are the short messages selected via GST_DEBUG=3:

...
0:00:00.141888460 28057      0x2245d90 INFO                 basesrc gstbasesrc.c:2562:gst_base_src_loop:<Source_OpenCV> pausing after gst_pad_push() = not-negotiated
0:00:00.141924274 28057      0x2245d90 WARN                 basesrc gstbasesrc.c:2625:gst_base_src_loop:<Source_OpenCV> error: Internal data flow error.
0:00:00.141937917 28057      0x2245d90 WARN                 basesrc gstbasesrc.c:2625:gst_base_src_loop:<Source_OpenCV> error: streaming task paused, reason not-negotiated (-4)
0:00:00.141965714 28057      0x2245d90 INFO        GST_ERROR_SYSTEM gstelement.c:1964:gst_element_message_full:<Source_OpenCV> posting message: Internal data flow error.
0:00:00.141998959 28057      0x2245d90 INFO        GST_ERROR_SYSTEM gstelement.c:1987:gst_element_message_full:<Source_OpenCV> posted error message: Internal data flow error.
0:00:00.142018539 28057      0x2245d90 ERROR                 vp8enc gstvp8enc.c:1028:gst_vp8_enc_finish:<Converter_VP8_Encoder> encode returned 1 error
0:00:00.142053733 28057      0x2245d90 INFO             matroskamux matroska-mux.c:2226:gst_matroska_mux_start:<ebmlwrite0> DocType: webm, Version: 2
0:00:00.142082043 28057      0x2245d90 INFO               ebmlwrite ebml-write.c:218:gst_ebml_writer_send_new_segment_event: seeking to 0
0:00:00.142093688 28057      0x2245d90 INFO               GST_EVENT gstevent.c:606:gst_event_new_new_segment_full: creating newsegment update 0, rate 1.000000, format bytes, start 0, stop -1, position 0
...

With GST_DEBUG=4 it looks like this:

...
0:00:02.464122744 28483      0x24b8590 DEBUG               GST_CAPS gstpad.c:2925:gst_pad_get_allowed_caps:<Converter_VP8_Encoder:src> allowed caps video/x-vp8, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:02.464152446 28483      0x24b8590 DEBUG               GST_CAPS gstpad.c:2263:gst_pad_get_caps_unlocked:<Converter_VP8_Encoder:sink> pad getcaps returned video/x-raw-yuv, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(fourcc)I420
0:00:02.464174847 28483      0x24b8590 DEBUG               GST_PADS gstpad.c:2577:gst_pad_acceptcaps_default:<Converter_VP8_Encoder:sink> allowed caps video/x-raw-yuv, width=(int)[ 16, 4096 ], height=(int)[ 16, 4096 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(fourcc)I420
0:00:02.464198411 28483      0x24b8590 DEBUG               GST_PADS gstpad.c:2629:gst_pad_accept_caps:<Converter_VP8_Encoder:sink> acceptfunc returned 1
0:00:02.464210382 28483      0x24b8590 DEBUG          basetransform gstbasetransform.c:1082:gst_base_transform_find_transform:<Converter_FFMpegColorSpace> Input caps were video/x-raw-rgb, width=(int)640, height=(int)480, framerate=(fraction)0/1, and got final caps video/x-raw-yuv, width=(int)640, height=(int)480, framerate=(fraction)0/1, format=(fourcc)I420
0:00:02.464236195 28483      0x24b8590 DEBUG          basetransform gstbasetransform.c:810:gst_base_transform_configure_caps:<Converter_FFMpegColorSpace> in caps:  video/x-raw-rgb, width=(int)640, height=(int)480, framerate=(fraction)0/1
0:00:02.464252757 28483      0x24b8590 DEBUG          basetransform gstbasetransform.c:811:gst_base_transform_configure_caps:<Converter_FFMpegColorSpace> out caps: video/x-raw-yuv, width=(int)640, height=(int)480, framerate=(fraction)0/1, format=(fourcc)I420
0:00:02.464271709 28483      0x24b8590 DEBUG          basetransform gstbasetransform.c:819:gst_base_transform_configure_caps:<Converter_FFMpegColorSpace> have_same_caps: 0
0:00:02.464282177 28483      0x24b8590 DEBUG          basetransform gstbasetransform.c:2921:gst_base_transform_set_in_place:<Converter_FFMpegColorSpace> setting in_place FALSE
0:00:02.464292338 28483      0x24b8590 DEBUG          basetransform gstbasetransform.c:2860:gst_base_transform_set_passthrough:<Converter_FFMpegColorSpace> set passthrough 0
0:00:02.464302369 28483      0x24b8590 DEBUG          basetransform gstbasetransform.c:834:gst_base_transform_configure_caps:<Converter_FFMpegColorSpace> Calling set_caps method to setup caps
0:00:02.464318339 28483      0x24b8590 DEBUG       ffmpegcolorspace gstffmpegcolorspace.c:320:gst_ffmpegcsp_set_caps:<Converter_FFMpegColorSpace> could not configure context for input format
0:00:02.464329667 28483      0x24b8590 WARN           basetransform gstbasetransform.c:1311:gst_base_transform_setcaps:<Converter_FFMpegColorSpace> FAILED to configure caps <Converter_FFMpegColorSpace:src> to accept video/x-raw-yuv, width=(int)640, height=(int)480, framerate=(fraction)0/1, format=(fourcc)I420
0:00:02.464352143 28483      0x24b8590 DEBUG               GST_CAPS gstpad.c:2773:gst_pad_set_caps:<Converter_FFMpegColorSpace:sink> caps video/x-raw-rgb, width=(int)640, height=(int)480, framerate=(fraction)0/1 could not be set
0:00:02.464504648 28483      0x24b8590 INFO                 basesrc gstbasesrc.c:2562:gst_base_src_loop:<Source_OpenCV> pausing after gst_pad_push() = not-negotiated
0:00:02.464517617 28483      0x24b8590 DEBUG                basesrc gstbasesrc.c:2588:gst_base_src_loop:<Source_OpenCV> pausing task, reason not-negotiated
0:00:02.464531137 28483      0x24b8590 DEBUG               GST_PADS gstpad.c:5646:gst_pad_pause_task:<Source_OpenCV:src> pause task
0:00:02.464543705 28483      0x24b8590 DEBUG                   task gsttask.c:698:gst_task_set_state:<Source_OpenCV:src> Changing task 0x24c9000 to state 2
0:00:02.464585246 28483      0x24b8590 DEBUG              GST_EVENT gstevent.c:269:gst_event_new: creating new event 0x24b8940 eos 86
0:00:02.464618078 28483      0x24b8590 WARN                 basesrc gstbasesrc.c:2625:gst_base_src_loop:<Source_OpenCV> error: Internal data flow error.
0:00:02.464631538 28483      0x24b8590 WARN                 basesrc gstbasesrc.c:2625:gst_base_src_loop:<Source_OpenCV> error: streaming task paused, reason not-negotiated (-4)
0:00:02.464645609 28483      0x24b8590 DEBUG            GST_MESSAGE gstelement.c:1933:gst_element_message_full:<Source_OpenCV> start
0:00:02.464672918 28483      0x24b8590 INFO        GST_ERROR_SYSTEM gstelement.c:1964:gst_element_message_full:<Source_OpenCV> posting message: Internal data flow error.
0:00:02.464697059 28483      0x24b8590 DEBUG                GST_BUS gstbus.c:308:gst_bus_post:<bus0> [msg 0x2325680] posting on bus, type error, GstMessageError, gerror=(GError)NULL, debug=(string)"gstbasesrc.c(2625): gst_base_src_loop (): /GstPipeline:OpenCV_to_TCPServer/GstAppSrc:Source_OpenCV:12streaming task paused, reason not-negotiated (-4)"; from source <Source_OpenCV>
0:00:02.464727109 28483      0x24b8590 DEBUG                    bin gstbin.c:3164:gst_bin_handle_message_func:<OpenCV_to_TCPServer> [msg 0x2325680] handling child Source_OpenCV message of type error
0:00:02.464739521 28483      0x24b8590 DEBUG                    bin gstbin.c:3171:gst_bin_handle_message_func:<OpenCV_to_TCPServer> got ERROR message, unlocking state change
0:00:02.464750463 28483      0x24b8590 DEBUG                    bin gstbin.c:3441:gst_bin_handle_message_func:<OpenCV_to_TCPServer> posting message upward
0:00:02.464770307 28483      0x24b8590 DEBUG                GST_BUS gstbus.c:308:gst_bus_post:<bus1> [msg 0x2325680] posting on bus, type error, GstMessageError, gerror=(GError)NULL, debug=(string)"gstbasesrc.c(2625): gst_base_src_loop (): /GstPipeline:OpenCV_to_TCPServer/GstAppSrc:Source_OpenCV:12streaming task paused, reason not-negotiated (-4)"; from source <Source_OpenCV>
0:00:02.464792364 28483      0x24b8590 DEBUG                GST_BUS gstbus.c:338:gst_bus_post:<bus1> [msg 0x2325680] pushing on async queue
0:00:02.464804471 28483      0x24b8590 DEBUG                GST_BUS gstbus.c:343:gst_bus_post:<bus1> [msg 0x2325680] pushed on async queue
0:00:02.464824022 28483      0x24b8590 DEBUG                GST_BUS gstbus.c:334:gst_bus_post:<bus0> [msg 0x2325680] dropped
0:00:02.464835553 28483      0x24b8590 INFO        GST_ERROR_SYSTEM gstelement.c:1987:gst_element_message_full:<Source_OpenCV> posted error message: Internal data flow error.
...

GST_DEBUG=5 would exceed the size limit of this post. :)

The way I read the logs, there is a problem handing the image from Source_OpenCV to the Converter_FFMpegColorSpace element? But why can't Converter_FFMpegColorSpace handle the data? I thought I was feeding Converter_FFMpegColorSpace x-raw-rgb so that it could convert it to x-raw-yuv for the Converter_VP8_Encoder. Did I specify the pushed images in the wrong way?

If some information is missing, I would be glad to get a hint so I can add it.

Thanks in advance!

The problem here is that you are setting video/x-raw-rgb,width=640,height=480,framerate=0/1 as caps. This is incomplete: you also need to provide bpp and depth, the red, green and blue masks, the endianness, and, if this is ARGB, an alpha mask too. Best take the ffmpegcolorspace sink template caps as an example and pick the correct format from those.

Also consider using GStreamer 1.x instead of the old, unmaintained 0.10 release. There, specifying the caps is easier too: in this case it would just be video/x-raw,format=ARGB,width=640,height=480,framerate=0/1 (assuming it is ARGB and not BGRA or BGRx or something else).

IIRC OpenCV uses RGB or RGBA.

The OpenCV output format is RGB, but the vp8enc sink caps look like this:

 SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-raw-yuv
                 format: I420
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]

You need to convert RGB -> YUV. I have done yuv2rgb conversion between OpenCV and GStreamer; if you need it, I can share my code with you.
