
Receive RTP stream with gstreamer

I’m trying to stream from a Linux-based microcontroller, using GStreamer, to a Python script. This is to work around some firmware issues on the microcontroller, where it cannot open the camera directly with OpenCV/Python.

The transmit command on the GStreamer side looks like this:

gst-launch-1.0 -e -v v4l2src device=/dev/video0  ! video/x-raw,format=UYVY,width=1280,height=720,framerate=30/1 ! videoconvert ! video/x-raw,width=1280,height=720,framerate=30/1 ! avenc_mpeg4 bitrate=4000000 ! rtpmp4vpay config-interval=1 ! udpsink host="$1" port=5004

My question is, what would the “receive” command look like? Currently it’s using this:

gst-launch-1.0 udpsrc port=5004 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP4" ! rtpmp4vdepay ! glimagesink 

But that throws the error:

WARNING: erroneous pipeline: could not link udpsrc0 to rtpmp4vdepay0

It seems like I’m missing something… there’s not much documentation on how to properly write the receive side. Ultimately, I’d be putting it into Python with something like:

cap_receive = cv2.VideoCapture('gstreamer receive command script goes here' , cv2.CAP_GSTREAMER)

Any insights greatly appreciated, TIA!


Answer

If you check the output of gst-inspect-1.0 rtpmp4vdepay you will notice the following caps for the sink pad:

  SINK template: 'sink'
    Availability: Always
    Capabilities:
      application/x-rtp
                  media: video
             clock-rate: [ 1, 2147483647 ]
          encoding-name: MP4V-ES

Note that the encoding-name is actually MP4V-ES, not MP4 as in your receive caps — that caps mismatch is why udpsrc cannot be linked to rtpmp4vdepay.

On top of that, you cannot connect the RTP depacketizer directly to an image sink: you have to parse, decode, and color-convert before displaying. A decodebin can help you here if you don’t want to build that part of the pipeline by hand.
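Putting both fixes together, a receive pipeline might look like the sketch below — this is an untested assumption based on your transmit command (`avdec_mpeg4` from gst-libav is the decoder counterpart of your `avenc_mpeg4` encoder; you could substitute `decodebin` for the `avdec_mpeg4 ! videoconvert` section). The `appsink` ending is what OpenCV’s GStreamer backend expects:

```python
# Sketch of a receive pipeline for cv2.VideoCapture (element names assumed
# from the transmit side in the question; adjust to your setup).
# Key fixes vs. the original attempt:
#   1. encoding-name is MP4V-ES, matching rtpmp4vdepay's sink caps
#   2. decode (avdec_mpeg4) and color-convert before handing frames on
RECEIVE_PIPELINE = (
    'udpsrc port=5004 caps="application/x-rtp,media=(string)video,'
    'clock-rate=(int)90000,encoding-name=(string)MP4V-ES" '
    "! rtpmp4vdepay ! avdec_mpeg4 ! videoconvert "
    "! video/x-raw,format=BGR ! appsink drop=true"
)

def open_capture(pipeline: str = RECEIVE_PIPELINE):
    """Open the RTP stream via OpenCV's GStreamer backend.

    Requires an OpenCV build with GStreamer support enabled.
    """
    import cv2  # imported here so the pipeline string is usable without OpenCV
    return cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
```

For a quick standalone test outside Python, the same chain should work with `gst-launch-1.0`, replacing the `video/x-raw,format=BGR ! appsink` ending with `glimagesink`.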

User contributions licensed under: CC BY-SA