GNUnet Messenger API: August 2024

Hi again,

as mentioned last time when talking about the user interface for discourses, I was looking into the implementation of video streaming this time. For this task I wasn’t really able to work around using RTP or something comparable, so I gave it another try, this time with success. The problem I had with it previously was actually related to me overlooking the GstRTPBuffer structure in the GStreamer documentation. With it I was able to calculate the actual duration of each buffer. That also allowed me to simplify the code handling video and audio streams into a single path, with only a few variables like the data rate differing between the two. Overall it worked for RTP via both the rtpL16pay and rtph264pay nodes.
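To illustrate the idea, here is a minimal sketch of mapping an RTP packet with GstRTPBuffer and deriving a buffer duration from a known data rate. The helper name and its bytes-per-second parameter are just my illustration here, not necessarily how the actual code is structured:

```c
#include <gst/gst.h>
#include <gst/rtp/rtp.h>

/* Hypothetical helper: estimate a buffer duration from the RTP payload
 * size and a known data rate given in bytes per second. */
static GstClockTime
estimate_rtp_buffer_duration (GstBuffer *buffer, guint64 datarate)
{
  GstRTPBuffer rtp = GST_RTP_BUFFER_INIT;
  GstClockTime duration = GST_CLOCK_TIME_NONE;

  if (gst_rtp_buffer_map (buffer, GST_MAP_READ, &rtp))
  {
    const guint payload_len = gst_rtp_buffer_get_payload_len (&rtp);

    if (datarate > 0)
      duration = gst_util_uint64_scale (payload_len, GST_SECOND, datarate);

    gst_rtp_buffer_unmap (&rtp);
  }

  return duration;
}
```

For raw L16 audio the data rate follows directly from the sample rate, the channel count and the sample size of 16 bits, which is roughly where those remaining per-stream variables come in.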

So I can now transfer H.264 encoded video via GStreamer without major issues. However, one thing I needed to solve was getting encoding done on many different platforms and form factors, because not every SoC or device offers hardware encoding and decoding, which would be preferable since it reduces power consumption and latency. The best compromise I could come up with was using the x264enc node from GStreamer for encoding and the avdec_h264 node for decoding. These were available on all platforms I tested, including mobile with the Librem 5, and in an all-in-one pipeline it’s possible to run them in real time without big latency issues if configured properly.
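As a rough sketch of what such an all-in-one test could look like, the following program encodes a live test source with x264enc tuned for low latency and decodes it again with avdec_h264. The concrete element properties here are only assumptions for illustration, not the application’s actual configuration:

```c
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch (
      "videotestsrc is-live=true ! videoconvert"
      " ! x264enc tune=zerolatency speed-preset=ultrafast"
      " ! rtph264pay ! rtph264depay"
      " ! avdec_h264 ! videoconvert ! autovideosink",
      &error);

  if (!pipeline)
  {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run until an error occurs or the stream ends. */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (
      bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  if (msg)
    gst_message_unref (msg);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}
```

Setting tune=zerolatency and a fast speed preset is what keeps the software encoder usable in real time on weaker CPUs, at the cost of some compression efficiency.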

That means encoding is done in software on the CPU for now, and decoding is hardware accelerated if available, which should cover most cases. There might also be arguments for selecting between multiple codecs, because AV1 for example has big advantages over H.264, and even H.265 might be a better option. But for now I wanted to implement a solution that can be utilized by most hardware combinations.

My goal was a usable 720p (1280x720) video stream at 30 Hz (or frames per second). So I decided to downscale any webcam input to this level and adjust the properties of my pipelines until I was happy with the result. Currently the bitrate is rather low for this target to reduce latency, so this might change a bit to increase overall quality. I might also still add some UI to manually configure your own streaming, but nothing drastic is planned.
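A sketch of the scaling part of such a pipeline could look like the following launch description; the source element, the sink name and the bitrate value are assumptions for illustration only:

```c
/* Downscale arbitrary camera input to 720p at 30 fps before encoding. */
static const gchar *video_sender_description =
    "pipewiresrc ! videoconvert ! videoscale ! videorate"
    " ! video/x-raw,width=1280,height=720,framerate=30/1"
    " ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=1500"
    " ! rtph264pay ! appsink name=rtp_out";
```

The bitrate property of x264enc is given in kbit/s, so that single value is the obvious knob to turn later when trading a bit of latency for overall quality.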

[Picture of the discourse dialog with an active screencast]

For screen sharing I needed to implement another feature which wasn’t already covered by existing GStreamer nodes. Because a screencast via libportal of the desktop or a specific window only generates a new frame when anything visible changes, I couldn’t rely on continuous data in the video stream to tell whether another contact is actively streaming video. So I added a virtual heartbeat that sends an empty RTP buffer every 100ms, which is dropped on the receiving side before decoding. That way the client still knows there is an active stream even when no video data arrives for an extended period of time.
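A simplified sketch of such a heartbeat could look like this; the appsrc element, the payload type and the timer setup are assumptions, since the actual implementation may wire this up differently:

```c
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <gst/rtp/rtp.h>

/* Push an RTP packet with an empty payload as a keep-alive signal. */
static gboolean
push_heartbeat (gpointer user_data)
{
  GstAppSrc *appsrc = GST_APP_SRC (user_data);

  /* Allocate an RTP packet with a payload of 0 bytes. */
  GstBuffer *buffer = gst_rtp_buffer_new_allocate (0, 0, 0);

  GstRTPBuffer rtp = GST_RTP_BUFFER_INIT;
  if (gst_rtp_buffer_map (buffer, GST_MAP_WRITE, &rtp))
  {
    gst_rtp_buffer_set_payload_type (&rtp, 96); /* assumed payload type */
    gst_rtp_buffer_unmap (&rtp);
  }

  /* The receiving side drops buffers without payload before decoding,
   * but still refreshes its "stream is alive" timeout. */
  gst_app_src_push_buffer (appsrc, buffer);
  return G_SOURCE_CONTINUE;
}

/* Somewhere during setup: send a heartbeat every 100 milliseconds. */
static void
start_heartbeat (GstElement *appsrc)
{
  g_timeout_add (100, push_heartbeat, appsrc);
}
```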

So it’s now possible to share either your screen or your webcam with others. There is one limitation for cameras I noticed, though: not all cameras in tablets and phones are properly supported or recognized by libportal or PipeWire. Because the current implementation uses a pipewiresrc node for video input, some devices might show the button to activate camera streaming as disabled even though V4L2 might be able to capture images from those sensors.
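For context, camera access goes through libportal before pipewiresrc can use it. The following is only a rough sketch of that flow with an assumed surrounding setup, not the application’s actual code:

```c
#include <libportal/portal.h>
#include <gst/gst.h>

static void
on_camera_access (GObject *source, GAsyncResult *result, gpointer user_data)
{
  XdpPortal *portal = XDP_PORTAL (source);
  GstElement *pipewiresrc = GST_ELEMENT (user_data);

  GError *error = NULL;
  if (!xdp_portal_access_camera_finish (portal, result, &error))
  {
    /* If the portal does not grant access, the UI keeps the camera
     * button disabled, even if V4L2 could reach the sensor directly. */
    g_warning ("Camera access denied: %s", error->message);
    g_clear_error (&error);
    return;
  }

  /* Hand the PipeWire remote over to the pipewiresrc node. */
  const int fd = xdp_portal_open_pipewire_remote_for_camera (portal);
  if (fd >= 0)
    g_object_set (pipewiresrc, "fd", fd, NULL);
}

static void
request_camera (XdpPortal *portal, GstElement *pipewiresrc)
{
  if (!xdp_portal_is_camera_present (portal))
    return; /* no camera known to the portal: keep the button disabled */

  xdp_portal_access_camera (portal, NULL, XDP_CAMERA_FLAG_NONE,
                            NULL, on_camera_access, pipewiresrc);
}
```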

I should also note that I’ve fixed multiple multi-threading issues with a newer solution using eventfd and semaphores to hold different threads until a critical section of instructions has completed. This works much better than my previous approach and seems much more reliable as well. Messages no longer get dropped randomly by the GUI application, and CPU usage should be lower overall because I could get rid of some idle tasks.
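The principle is roughly the following; variable names and the exact layout of the synchronization are assumptions, simplified down from what the application actually does:

```c
#include <pthread.h>
#include <semaphore.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/eventfd.h>
#include <unistd.h>

static int done_event;     /* eventfd: signaled once the critical work is done */
static sem_t section_lock; /* allows only one thread into the critical section */

static void
setup_synchronization (void)
{
  done_event = eventfd (0, 0);
  sem_init (&section_lock, 0, 1);
}

static void
enter_critical_section (void)
{
  sem_wait (&section_lock);
  /* ... modify shared state, e.g. the queue of pending messages ... */
  sem_post (&section_lock);

  /* Wake up any thread waiting for this work to be completed. */
  const uint64_t value = 1;
  if (write (done_event, &value, sizeof (value)) < 0)
    perror ("eventfd write");
}

static void
wait_for_critical_section (void)
{
  /* Blocks until a value has been written to the eventfd. */
  uint64_t value = 0;
  if (read (done_event, &value, sizeof (value)) < 0)
    perror ("eventfd read");
}
```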

Besides that there are only minor UI adjustments, and I’m really looking forward to the next release. It might take a while though, because some changes and fixes in GNUnet since the release of version 0.21.2 (which are upstream already but not yet part of a stable release) are necessary for discourses to work properly. So I expect the next stable release of the Messenger application to come with GNUnet 0.22.0, and it should contain the discourse dialog with live chat functionality.

Small addition: at the end of last month I discovered by chance that the snap package for the messenger-gtk application was able to build again, because the gnome extension for snaps finally supports its core24 base. That means it’s finally up to date again at version 0.10.0, as the Flatpak has been for quite some time.

If I work on other major changes or features, I’ll let you know here on this blog.

Kind regards,
Jacki
