
Using a Raspberry Pi 4 as a Webcam

I wanted a way to share my video with my friends in a Discord call. Lacking a proper webcam, I needed some other way to capture myself on video. Fortunately, I happened to have a Raspberry Pi 4 and a camera module. Initially, I thought I could just connect a USB cable directly between my Raspberry Pi and my computer and have the computer recognize the Pi as a camera with some software. A quick Google search made me realize this is not such a good idea: connecting two hosts with an ordinary USB cable can damage both devices. A special cable, typically sold as a USB bridging cable, would be required for this to work, and since I am trying to avoid spending money, I needed another solution.

Enter GStreamer

GStreamer is a multimedia framework that can be used to construct pipelines for media data to flow through. To get the following pipelines to run, GStreamer must be installed on both the Pi and the host system. We'll want pretty much all of the GStreamer plugins (at least the good and the bad sets), and we'll also need a kernel module to facilitate the creation of a video loopback device. On Arch Linux, this kernel module can be installed from the v4l2loopback-dkms package. I won't go into the details of how to set up a v4l2 device here; there are other articles online that explain the process.
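As a rough sketch, the setup on Arch Linux might look something like the following (package names and module options here are assumptions from a typical install; in particular, exclusive_caps=1 is the v4l2loopback option that makes browsers and apps like Discord treat the loopback device as a capture device):

```shell
# Install GStreamer along with the good and bad plugin sets (on both machines)
sudo pacman -S gstreamer gst-plugins-good gst-plugins-bad

# Install the loopback kernel module (on the host only)
sudo pacman -S v4l2loopback-dkms

# Create a loopback device at /dev/video0; without exclusive_caps=1 some
# applications will refuse to list the device as a camera
sudo modprobe v4l2loopback video_nr=0 exclusive_caps=1 card_label="Pi Camera"
```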

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=1280,height=720,framerate=30/1 ! x264enc tune=zerolatency speed-preset=superfast ! rtph264pay ! queue ! udpsink host=<host-ip>

The above shell command constructs a GStreamer pipeline that reads raw frames from the device file /dev/video0, encodes them as H.264, packs them into RTP packets, and sends them to the specified host. While this is running on the Pi, we can run another pipeline on the host computer to listen for the incoming RTP packets and write the decoded video out to a device file.

gst-launch-1.0 udpsrc ! application/x-rtp ! queue ! rtph264depay ! decodebin ! videoconvert ! video/x-raw,format=YUY2 ! v4l2sink device=/dev/video0

Pay particular attention to the format specified in the caps filter just before the v4l2sink! I spent days trying to figure out why Discord wouldn't accept my video device. It turns out the format was erroneously negotiated as I420, and I had to explicitly specify the format as YUY2 for Discord to accept the stream. The caps filter after the v4l2src is also important, as Discord will not accept a stream with resolution greater than 720p or framerate above 30fps.
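If you run into a similar negotiation problem, gst-launch-1.0's -v flag will print the caps negotiated on every pad, which is how this kind of mismatch can be spotted. As a sketch (same receiving pipeline as above, plus v4l2-ctl from the v4l-utils package to query the device directly):

```shell
# Print negotiated caps on every pad; look for the format=... field
# in the caps printed for the v4l2sink
gst-launch-1.0 -v udpsrc ! application/x-rtp ! queue ! rtph264depay \
  ! decodebin ! videoconvert ! video/x-raw,format=YUY2 \
  ! v4l2sink device=/dev/video0

# Or ask the loopback device what format it is currently exposing
v4l2-ctl --device=/dev/video0 --get-fmt-video
```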

With both pipelines running, we can select our loopback device as the video input in Discord and share our video in the call!
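Before opening Discord, the loopback device can be previewed locally to confirm the whole chain works. A minimal sketch using GStreamer's own display sink (any player that can read V4L2 devices, such as mpv or ffplay, would also work):

```shell
# Open the loopback device like any other webcam and show it in a window
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! autovideosink
```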