My stream plays only first frame

Hey, I am trying to create an app that transfers a video file from one PC to another using udpsrc. Because I can't save files on the receiving PC, I have to keep the video in RAM. The video should play back after it has been fully received, and I need to be able to seek forward, jump to specific frames, and pause/play.

This is my pipeline for sending filesrc:

gst-launch-1.0 filesrc location="C:/2024_03_08_09_36_12_627_[327]/video.avi" ! decodebin ! videoconvert ! x264enc ! rtph264pay config-interval=1 pt=96 ! udpsink host=228.1.1.1 port=9000

My Receiver:

#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <iostream>
#include <vector>
#include <QLabel>
#include "receiver.h"
QLabel* receiver::m_label = nullptr;
std::vector<GstSample*> receiver::m_bufferVector;

// TODO data is saved raw, takes up a lot of memory

GstFlowReturn receiver::on_new_sample(GstAppSink *appsink, gpointer user_data) {
    // Cast user_data back to the receiver instance
    receiver* self = static_cast<receiver*>(user_data);

    // Retrieve the sample
    GstSample *sample = gst_app_sink_pull_sample(appsink);
    if (!sample) {
        return GST_FLOW_ERROR;
    }

    // Save the sample into the buffer vector of the receiver instance
    self->m_bufferVector.push_back(sample);

    // NOTE: this callback runs on the GStreamer streaming thread, not the Qt
    // GUI thread, so touching the QLabel directly here is not thread-safe
    m_label->setText(QString::fromStdString("Frames: " + std::to_string(receiver::m_bufferVector.size())));
    return GST_FLOW_OK;
}


void receiver::receive(QLabel *label) {
    gst_init(nullptr, nullptr);

    m_label = label;
    // Include an H.264 encoder before the appsink to compress frames
    std::string pipeline_str = "udpsrc address=228.1.1.1 port=9000 caps=\"application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264\" ! rtph264depay ! queue ! avdec_h264 ! videoconvert ! x264enc tune=zerolatency ! appsink name=sink";
    GstElement *pipeline = gst_parse_launch(pipeline_str.c_str(), nullptr);

    if (!pipeline) {
        std::cerr << "Failed to create pipeline" << std::endl;
        return;
    }

    // Get the appsink element
    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");

    // Setup appsink (note: drop=true with max-buffers=1 will silently
    // discard samples if the callback cannot keep up)
    gst_app_sink_set_emit_signals(GST_APP_SINK(sink), true);
    gst_app_sink_set_drop(GST_APP_SINK(sink), true);
    gst_app_sink_set_max_buffers(GST_APP_SINK(sink), 1);

    // Connect the "new-sample" signal to the callback function
    g_signal_connect(sink, "new-sample", G_CALLBACK(receiver::on_new_sample), this);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    GMainLoop *main_loop = g_main_loop_new(nullptr, FALSE);
    g_main_loop_run(main_loop);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(pipeline));
    g_main_loop_unref(main_loop);
}

My Player:

#include "player.h"
#include <gst/gst.h>

void player::play(const std::vector<GstSample*>& m_bufferVector) {
    gst_init(nullptr, nullptr);

    // Define a pipeline that uses appsrc as the source element for H.264 compressed frames
    GstElement* pipeline = gst_parse_launch("appsrc name=source ! h264parse ! avdec_h264 ! videoconvert ! autovideosink", nullptr);
    GstElement* appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "source");

    // Configure appsrc's caps according to the video data you will push into it
    g_object_set(G_OBJECT(appsrc), "caps",
                 gst_caps_new_simple("video/x-h264",
                                     "stream-format", G_TYPE_STRING, "byte-stream",
                                     "alignment", G_TYPE_STRING, "au",
                                     nullptr),
                 nullptr);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Frame duration in nanoseconds for 30 FPS
    const GstClockTime frame_duration = GST_SECOND / 30;
    GstClockTime timestamp = 0; // Start timestamp

    // Push each buffer to the appsrc element
    for (GstSample* sample : m_bufferVector) {
        GstBuffer* buffer = gst_sample_get_buffer(sample);

        if (buffer != nullptr) {
            GST_BUFFER_FLAG_SET(buffer, GST_BUFFER_FLAG_DELTA_UNIT);
            // Set buffer timestamp
            GST_BUFFER_PTS(buffer) = timestamp;
            GST_BUFFER_DURATION(buffer) = frame_duration;

            timestamp += frame_duration; // Increment timestamp for the next frame

            GstFlowReturn ret;
            g_signal_emit_by_name(appsrc, "push-buffer", buffer, &ret);
            if (ret != GST_FLOW_OK) {
                // Log the error or handle it accordingly
                break; // Exit the loop or handle the error as needed
            }
        }
    }

    // Wait for the pipeline's asynchronous state change to complete
    // (note: this does not wait for playback to finish; that would need an
    // EOS message on the bus, and appsrc is never sent end-of-stream here)
    GstStateChangeReturn state_ret = gst_element_get_state(pipeline, nullptr, nullptr, GST_CLOCK_TIME_NONE);
    // Ensure the state change was successful
    if (state_ret == GST_STATE_CHANGE_FAILURE) {
        // Handle state change failure, such as by logging or cleanup
    }

    // Clean up
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(pipeline));
}

The problem I have come to is my player plays only first frame.

Why do you decode and then immediately re-encode the H.264 video on the receiver?
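If the goal is just to keep compressed frames in RAM, the decode/re-encode stage could be dropped and the depayloaded stream stored directly; a sketch of such a receiver pipeline, reusing the caps from your code:

```
udpsrc address=228.1.1.1 port=9000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! h264parse ! appsink name=sink
```

Each sample pulled from the appsink would then be one parsed H.264 unit, which also matches the video/x-h264 caps your player sets on its appsrc (your sender already uses config-interval=1, so SPS/PPS should arrive in-band).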

If you want the receiver to be able to seek and pause over a network connection, there might be better protocols, e.g.:

  • RTSP: check out rtsp-server’s test-uri.c example. You can make a file available as a VOD resource and play it with gst-play-1.0 rtsp://... or vlc rtsp://... etc. and it should support seeking and pausing.
  • HTTP: just point an HTTP server that supports seeking (byte-range requests) at the directory containing the file; then you can play it back over HTTP, and seeking will work. You can also download the file, e.g. by using playbin or playbin3 (or uridecodebin/uridecodebin3) with the download flag set, or set ring-buffer-max-size to have it buffer content in memory.
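For the download-flag / ring-buffer variant of the HTTP option, a rough sketch (the URI and buffer size are placeholders; GST_PLAY_FLAG_DOWNLOAD is the playbin flag with value 0x80):

```cpp
#include <gst/gst.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    GstElement *playbin = gst_element_factory_make("playbin", nullptr);
    // Placeholder URI; point this at your HTTP server
    g_object_set(playbin, "uri", "http://example.com/video.avi", nullptr);

    // Enable progressive download (GST_PLAY_FLAG_DOWNLOAD = 0x80), which
    // caches the stream locally so seeking back is cheap
    guint flags = 0;
    g_object_get(playbin, "flags", &flags, nullptr);
    g_object_set(playbin, "flags", flags | 0x80, nullptr);

    // Alternatively, buffer in memory instead of a temp file
    // (useful here since files cannot be saved on the receiving PC)
    g_object_set(playbin, "ring-buffer-max-size",
                 (guint64)(16 * 1024 * 1024), nullptr);

    gst_element_set_state(playbin, GST_STATE_PLAYING);
    GMainLoop *loop = g_main_loop_new(nullptr, FALSE);
    g_main_loop_run(loop);
    return 0;
}
```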

Thank you for your answer. I have tried seeking, but I get an error when I do:

Code:

#include "player.h"
#include <gst/gst.h>

void player::play() {
    gst_init(nullptr, nullptr);

    // Assuming a frame rate of 30 fps for calculation purposes
    const int frameRate = 30;
    const int targetFrame = 500;
    const GstClockTime targetTime = (GstClockTime) (GST_SECOND * targetFrame / frameRate);

    // Define a pipeline
    GstElement* pipeline = gst_parse_launch("rtspsrc location=rtsp://127.0.0.1:8554/test latency=50 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink", nullptr);

    // Set the pipeline to the PLAYING state to start playback and allow seeking
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Seek to the specific frame (time)
    if (!gst_element_seek_simple(pipeline, GST_FORMAT_TIME,
                                 static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE), targetTime)) {
        g_printerr("Seek failed!\n");
    } else {
        // Pause after seeking
        gst_element_set_state(pipeline, GST_STATE_PAUSED);
        g_print("Playback paused at frame: %d\n", targetFrame);
    }

    // Wait for the state change to complete or a failure occurs
    GstStateChangeReturn ret = gst_element_get_state(pipeline, nullptr, nullptr, GST_CLOCK_TIME_NONE);
    if (ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr("Failed to pause the playback!\n");
    }

    // Main event loop to keep the application running while paused
    GMainLoop *loop = g_main_loop_new(nullptr, FALSE);
    g_main_loop_run(loop);

    // Clean up
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(pipeline));
    g_main_loop_unref(loop);
}

Code Server:

#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>
#include <iostream>
#include <vector>
#include <QLabel>
#include "receiver.h"

//GstFlowReturn receiver::on_new_sample(GstAppSink *appsink, gpointer user_data) {
//    receiver* self = static_cast<receiver*>(user_data);
//    GstSample *sample = gst_app_sink_pull_sample(appsink);
//    if (!sample) {
//        return GST_FLOW_ERROR;
//    }
//    self->m_bufferVector.push_back(sample);
//    m_label->setText(QString::fromStdString("Frames: " + std::to_string(self->m_bufferVector.size())));
//    return GST_FLOW_OK;
//}

void receiver::receive() {
    gst_init(nullptr, nullptr);

    // Setup the RTSP server
    GstRTSPServer *server = gst_rtsp_server_new();
    GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points(server);
    GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new();

    std::string pipeline_str = "( filesrc location=\"C:/2024_03_08_09_36_12_627_[327]/video.avi\" ! decodebin ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )";
    gst_rtsp_media_factory_set_launch(factory, pipeline_str.c_str());

    gst_rtsp_mount_points_add_factory(mounts, "/test", factory);
    g_object_unref(mounts);

    GstRTSPAddressPool *pool = gst_rtsp_address_pool_new();
    gst_rtsp_address_pool_add_range(pool, "224.3.0.0", "224.3.0.255", 5000, 5010, 1);
    gst_rtsp_media_factory_set_address_pool(factory, pool);
    g_object_unref(pool);

    gst_rtsp_server_attach(server, nullptr);

    // Display RTSP server address
    std::cout << "RTSP server started at rtsp://127.0.0.1:8554/test" << std::endl;

    // Enter GMainLoop to keep the server running
    GMainLoop *main_loop = g_main_loop_new(nullptr, FALSE);
    g_main_loop_run(main_loop);

    // Clean up (GstRTSPServer is a plain GObject, not a GstElement,
    // so it has no state to set)
    g_object_unref(server);
    g_main_loop_unref(main_loop);
}

Error:

C:\Git\framePlayer\cmake-build-release-visual-studio\framePlayer.exe
RTSP server started at rtsp://127.0.0.1:8554/test
Player START!
0:00:01.450759000  8924 00000238867B4040 WARN             d3d11device gstd3d11device.cpp:1271:gst_d3d11_device_get_video_device_handle: D3D11 call failed: 0x80004002, No such interface supported
0:00:01.484040000  8924 0000023886646310 WARN                 basesrc gstbasesrc.c:3693:gst_base_src_start_complete:<filesrc0> pad not activated yet
0:00:01.674314000  8924 00000238868D54D0 WARN               cudanvrtc gstcudanvrtc.c:148:gst_cuda_nvrtc_load_library_once: Could not open nvrtc library 'nvrtc64_90_0.dll': The specified module could not be found.
0:00:01.753704000  8924 00000238868CE9D0 WARN           basetransform gstbasetransform.c:1373:gst_base_transform_setcaps:<videoconvert1> transform could not transform video/x-raw(memory:CUDAMemory), format=(string)NV12, width=(int)2048, height=(int)750, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1 in anything we support
0:00:01.753987000  8924 00000238868CE9D0 WARN           basetransform gstbasetransform.c:1373:gst_base_transform_setcaps:<videoconvert1> transform could not transform video/x-raw(memory:CUDAMemory), format=(string)NV12, width=(int)2048, height=(int)750, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1 in anything we support
0:00:01.754279000  8924 00000238868CE9D0 WARN                GST_PADS gstpad.c:4361:gst_pad_peer_query:<decodebin0:src_0> could not send sticky events
0:00:01.767081000  8924 00000238868CE9D0 WARN              rtpsession gstrtpsession.c:2436:gst_rtp_session_chain_send_rtp_common:<rtpsession0> Can't determine running time for this packet without knowing configured latency
0:00:01.767335000  8924 00000238868CE9D0 WARN              rtpsession gstrtpsession.c:2515:gst_rtp_session_chain_send_rtp_common:<rtpsession0> Don't have a clock yet and can't determine NTP time for this packet
0:00:01.768874000  8924 0000023886646EA0 WARN               rtspmedia rtsp-media.c:4623:gst_rtsp_media_suspend: media 0000023886609470 was not prepared
0:00:01.770629000  8924 0000023886646EA0 WARN              rtspstream rtsp-stream.c:5607:gst_rtsp_stream_query_position:<GstRTSPStream@00000238863975A0> Couldn't obtain position: position query failed
0:00:06.217033000  8924 000002388528E0C0 ERROR            d3d11window gstd3d11window_win32.cpp:1244:gst_d3d11_window_win32_present:<d3d11windowwin32-0> Output window was closed
0:00:06.217471000  8924 000002388528E0C0 WARN          d3d11videosink gstd3d11videosink.cpp:1438:gst_d3d11_video_sink_show_frame:<autovideosink0-actual-sink-d3d11video> error: Output window was closed
0:00:06.248933000  8924 000002388528DCB0 WARN                 basesrc gstbasesrc.c:3132:gst_base_src_loop:<udpsrc0> error: Internal data stream error.
0:00:06.249566000  8924 000002388528DCB0 WARN                 basesrc gstbasesrc.c:3132:gst_base_src_loop:<udpsrc0> error: streaming stopped, reason error (-5)

Process finished with exit code 0

You need to wait for the pipeline to actually start before you can seek (in this case, wait until the state change to PLAYING has completed; maybe waiting for an ASYNC_DONE message on the bus would work here too, unsure).
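A minimal sketch of that ordering, assuming the same rtspsrc pipeline as in the player above (untested against your server): block on gst_element_get_state() until the transition to PLAYING has finished, then issue the seek.

```cpp
#include <gst/gst.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    GstElement *pipeline = gst_parse_launch(
        "rtspsrc location=rtsp://127.0.0.1:8554/test latency=50 ! "
        "rtph264depay ! avdec_h264 ! videoconvert ! autovideosink", nullptr);
    if (!pipeline) {
        g_printerr("Failed to create pipeline\n");
        return -1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Block until the asynchronous state change completes; seeking
    // before this point is what made the seek fail
    GstStateChangeReturn ret =
        gst_element_get_state(pipeline, nullptr, nullptr, GST_CLOCK_TIME_NONE);
    if (ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr("Pipeline failed to reach PLAYING\n");
        return -1;
    }

    // Now the seek can be performed (KEY_UNIT snaps to the nearest
    // keyframe, which is usually cheaper over RTSP than ACCURATE)
    if (!gst_element_seek_simple(pipeline, GST_FORMAT_TIME,
            static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT),
            10 * GST_SECOND)) {
        g_printerr("Seek failed!\n");
    }

    GMainLoop *loop = g_main_loop_new(nullptr, FALSE);
    g_main_loop_run(loop);
    return 0;
}
```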

Test if it works in gst-play-1.0 rtsp://... with the right/left arrow keys.