[Android] [Appsink] [Appsrc] Strange behavior when displaying the receiving raw data via appsrc


sofien rahmouni
I have an issue with strange behavior when displaying received raw H.264 data:
half of the screen is grey.

The issue only reproduces over an internet connection (on a local network it works fine).

In my code I have two pipelines:

1 - Pipeline: appsink, which takes ahcsrc as the video source and sends the
data out over a UDP socket in Java.

// Sample send pipeline (appsink)
data->pipeline = gst_parse_launch(
        "ahcsrc device=1 ! video/x-raw,format=(string)NV21 ! tee name=t "
        "t. ! queue ! autovideosink sync=false "
        "t. ! queue ! video/x-raw,width=320,height=240 ! videoconvert ! "
        "x264enc bitrate=500 speed-preset=superfast tune=zerolatency ! "
        "rtph264pay mtu=1024 ! appsink name=callback_read_buffer_sink emit-signals=true",
        &error);
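For comparison, roughly the same send path can be reproduced on a desktop with gst-launch-1.0, which helps rule out the Java socket layer. This is a sketch under stated assumptions: videotestsrc stands in for ahcsrc, udpsink stands in for the Java UDP socket, and the host/port values are placeholders, not from the original setup.

```shell
# Desktop stand-in for the Android send pipeline: videotestsrc replaces
# ahcsrc, udpsink replaces the Java UDP socket (host/port are examples).
gst-launch-1.0 videotestsrc ! video/x-raw,width=320,height=240 ! \
    videoconvert ! x264enc bitrate=500 speed-preset=superfast tune=zerolatency ! \
    rtph264pay mtu=1024 ! udpsink host=192.168.1.10 port=5000
```

If the desktop-to-desktop path shows the same grey corruption over the internet, the problem is packet loss/fragmentation rather than the app code.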
 1.1 - appsink callback:

 GstElement *testsink = NULL;
 /* We use appsink in push mode: it emits a signal when data is available
  * and we pull out the data in the signal callback. */
 testsink = gst_bin_get_by_name(GST_BIN(data->pipeline), "callback_read_buffer_sink");

 if (testsink == NULL) {
     g_print("appsink is NULL\n");
 }
 g_signal_connect(testsink, "new-sample", G_CALLBACK(cb_new_sample), NULL);

 if (error) {
     gchar *message =
             g_strdup_printf("Unable to build pipeline: %s", error->message);
     g_clear_error(&error);
     set_ui_message(message, data);
     g_free(message);
     return NULL;
 }
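The body of cb_new_sample is not shown above. A minimal sketch of what such a callback typically looks like, assuming the standard appsink "pull-sample" signal API; send_to_java_socket() is a hypothetical placeholder for the JNI call that writes the bytes to the Java UDP socket:

```c
#include <gst/gst.h>

/* Hypothetical sketch of the appsink "new-sample" callback: pull the
 * sample, map the RTP payload, and hand the bytes to the Java layer.
 * send_to_java_socket() is a placeholder, not part of the original post. */
static GstFlowReturn cb_new_sample(GstElement *sink, gpointer user_data) {
    GstSample *sample = NULL;
    g_signal_emit_by_name(sink, "pull-sample", &sample);
    if (!sample)
        return GST_FLOW_ERROR;

    GstBuffer *buffer = gst_sample_get_buffer(sample);
    GstMapInfo map;
    if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
        send_to_java_socket(map.data, map.size);  /* placeholder JNI call */
        gst_buffer_unmap(buffer, &map);
    }
    gst_sample_unref(sample);
    return GST_FLOW_OK;
}
```

Note that each pulled buffer is one RTP packet (rtph264pay mtu=1024), so the Java side should send each one as a single datagram, without coalescing or splitting.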

2 - Pipeline: appsrc, which receives the raw H.264 video data and displays it via autovideosink.

// Sample receive pipeline (appsrc)
data->pipeline = gst_parse_launch(
        "appsrc name=video_app_source is-live=true ! "
        "rtpjitterbuffer latency=10 ! rtph264depay ! h264parse ! avdec_h264 ! "
        "videoconvert ! autovideosink",
        &error);
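One thing worth noting: rtph264depay needs application/x-rtp caps upstream, and the appsrc in this pipeline has none set. A desktop stand-in for the receive side shows the caps that would normally be there; the payload and clock-rate values below are assumptions matching rtph264pay defaults, not confirmed from the original stream:

```shell
# Desktop stand-in for the receive side: udpsrc replaces appsrc + the Java
# socket. The explicit application/x-rtp caps are what rtph264depay expects;
# payload/clock-rate are assumed defaults.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,clock-rate=90000,payload=96" ! \
    rtpjitterbuffer latency=10 ! rtph264depay ! h264parse ! avdec_h264 ! \
    videoconvert ! autovideosink
```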
   2.1 - appsrc callback:
static void gst_native_receive_video_data(JNIEnv *env, jobject thiz,
                                          jbyteArray array) {
    jbyte *temp = (*env)->GetByteArrayElements(env, array, NULL);
    jsize size = (*env)->GetArrayLength(env, array);

    GstBuffer *buffer = gst_buffer_new_allocate(NULL, size, NULL);
    gst_buffer_fill(buffer, 0, temp, size);

    GstElement *element =
            gst_bin_get_by_name(GST_BIN(pCustomData->pipeline), "video_app_source");
    if (element == NULL) {
        fprintf(stdout, "%s:%d %s\n", __func__, __LINE__,
                "manually setting time: exit");
        gst_buffer_unref(buffer);
        (*env)->ReleaseByteArrayElements(env, array, temp, JNI_ABORT);
        return;
    }

    /* On the first buffer, derive a base timestamp from the pipeline clock. */
    if (basetimestamp == 0) {
        GST_OBJECT_LOCK (element);
        if (GST_ELEMENT_CLOCK (element)) {
            GstClockTime now = gst_clock_get_time (GST_ELEMENT_CLOCK (element));
            GstClockTime base_time = GST_ELEMENT_CAST (element)->base_time;

            basetimestamp = now - base_time;
            GST_BUFFER_TIMESTAMP (buffer) = basetimestamp;
        }
        GST_OBJECT_UNLOCK (element);
    }

    gst_app_src_push_buffer (GST_APP_SRC(element), buffer);
    gst_object_unref (element);

    (*env)->ReleaseByteArrayElements(env, array, temp, JNI_ABORT);
}
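Since the appsrc above pushes RTP packets without any caps set, downstream negotiation may misbehave. A sketch of setting the caps once before pushing buffers, assuming the same default RTP parameters as on the sender (payload/clock-rate values are assumptions, and appsrc_element is the element retrieved from the pipeline):

```c
#include <gst/gst.h>

/* Sketch: set RTP caps and time format on the appsrc once, before the
 * first push. The payload/clock-rate values are assumptions matching
 * rtph264pay defaults on the send side. */
GstCaps *caps = gst_caps_new_simple("application/x-rtp",
        "media", G_TYPE_STRING, "video",
        "encoding-name", G_TYPE_STRING, "H264",
        "clock-rate", G_TYPE_INT, 90000,
        "payload", G_TYPE_INT, 96,
        NULL);
g_object_set(appsrc_element, "caps", caps, "format", GST_FORMAT_TIME, NULL);
gst_caps_unref(caps);
```

With the caps set, rtpjitterbuffer and rtph264depay can negotiate properly; it is also worth checking that each received datagram is pushed as exactly one buffer, since a grey half-screen is a typical symptom of lost or truncated packets over the internet.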

Attached is a screenshot of the strange behavior of the stream.

gstreamer-devel mailing list
[hidden email]