
GNOME Introductory Post

I have been selected as a GSoC Intern at GNOME Foundation

I will be working as a Google Summer of Code intern at the GNOME Foundation. My project is to add Chromecast support to the already very cool GNOME Network Displays app, which currently supports Miracast. It can be installed through Flatpak as well.


Linux desktop users will be able to cast their screens to both Miracast and Chromecast devices. For Android TVs that support both, it would be wise to opt for the Chromecast path when both devices share the same router (i.e. are on the same network), since Miracast requires setting up a Wi-Fi Direct connection to function (it may be faster with lower latency, but may not be worth the trouble).

I believe this will see the light of day thanks to my very helpful and experienced mentors, Claudio Wunder and Benjamin Berg.


We started by looking into the Chromecast documentation (an adventure in itself). There are two ends to deal with: the sender and the receiver.

The receiver end

There are two options here: Styled Media Receiver and Custom Web Receiver.

The Styled Media Receiver does everything for us. It hosts the receiver application (an HTML5 app) and provides all the default styles and functionality. If we want to change the styles and/or the logo, we are free to host them on an HTTPS server, and we’ll be done.
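As a sketch, a small stylesheet hosted over HTTPS could override the default look. The class names below are the hooks the Styled Media Receiver documentation describes, but the image URLs are placeholders, not real assets:

```css
/* Hypothetical overrides for a Styled Media Receiver. */
.background {
  background: center no-repeat url("https://example.org/background.png");
}
.logo {
  background-image: url("https://example.org/logo.png");
}
.progressBar {
  border-color: rgb(0, 255, 0);
}
```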

With everything it offers, it has limited but quite adequate media support: playing video and audio files, streaming video and audio, displaying images, and opening specific apps on the TV (like YouTube).

Our main job is to “stream” our desktop to the Chromecast device. For this purpose, the Styled Media Receiver supports three streaming protocols:

  • Dynamic Adaptive Streaming over HTTP (DASH)
  • Apple’s HTTP Live Streaming (HLS)
  • Microsoft’s Smooth Streaming

We drop Smooth Streaming from consideration here: it offers no feature advantages over the other two, more widely supported protocols, apart from the fact that both it and HLS support pre-loading by default.

So we end up with two choices: DASH and HLS. HLS is more widely supported; on the other hand, DASH is codec-agnostic for both video and audio. In terms of latency, the two are said to be similar, with the latency depending on the segment duration we decide upon.

As per the HLS RFC (linked below), the EXT-X-TARGETDURATION tag in the playlist (or manifest) file accepts a decimal-integer value: the maximum allowed segment duration, with each segment’s duration rounded to the nearest integer for comparison. DASH is suspected to behave similarly, although its specification does not clearly say so. We had better find out by actually experimenting and streaming.
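To make the tag concrete, a minimal live HLS playlist with a two-second target duration might look like this (the segment file names are hypothetical; each EXTINF duration must round to at most the target duration):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.000,
segment0.ts
#EXTINF:2.000,
segment1.ts
```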

Reducing the segment duration too much can bump up the bitrate and degrade the stream quality, thanks to the keyframe added at the start of every segment, so we need to test what suits this live stream best.
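A rough back-of-the-envelope calculation shows why. Assuming (hypothetically) that every segment starts with exactly one keyframe, that a keyframe is about ten times the size of a delta frame, and the frame sizes below are made up for illustration, shorter segments mean more keyframes per second and thus a higher bitrate for the same content:

```python
def estimated_bitrate(keyframe_bits, delta_bits, fps, segment_seconds):
    """Rough average bitrate (bits/s) for a stream in which each
    segment starts with one keyframe followed by delta frames."""
    frames_per_segment = fps * segment_seconds
    segment_bits = keyframe_bits + (frames_per_segment - 1) * delta_bits
    return segment_bits / segment_seconds

# Hypothetical frame sizes: a 400 kbit keyframe, 40 kbit delta frames, 30 fps.
KEY, DELTA, FPS = 400_000, 40_000, 30

for seconds in (1, 2, 6):
    # Shorter segments -> proportionally more keyframe overhead.
    print(f"{seconds}s segments: {estimated_bitrate(KEY, DELTA, FPS, seconds):.0f} bit/s")
```

With these made-up numbers, one-second segments cost about 1.56 Mbit/s versus roughly 1.26 Mbit/s for six-second segments, purely from the extra keyframes.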

Next up, we take the discussion to the Custom Web Receiver and the conclusions from some of our tests with Chromecast streaming using the Command and Control (CaC) Tool, a sender app provided by Google for testing and debugging Web Receiver apps.

We want to test out and learn more about other protocols that are not supported out of the box on Chromecast but can work fine with a Custom Web Receiver: SRT, RTSP and WebRTC.
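For context, a Custom Web Receiver is just an HTML page loaded on the Chromecast itself. A minimal sketch using the Cast Application Framework looks like the following; it only starts the default receiver, and any custom protocol handling (e.g. feeding a WebRTC or SRT-backed source into the player) would have to be wired in before `start()` is called. It cannot run outside a Cast device:

```html
<html>
<head>
  <!-- Google's CAF v3 receiver library. -->
  <script src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
</head>
<body>
  <!-- The framework's built-in player element. -->
  <cast-media-player></cast-media-player>
  <script>
    // Obtain the receiver context and start listening for senders.
    const context = cast.framework.CastReceiverContext.getInstance();
    context.start();
  </script>
</body>
</html>
```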

Thanks to Benjamin and Claudio for all the help!

Relevant Links:

