
Jello

A WIP video client for Jellyfin.

(Planned) Features

  1. Integrate with Jellyfin
  2. HDR video playback
  3. Audio track selection
  4. Chapter selection

Libraries and frameworks used in this project

  1. iced -> primary GUI toolkit
  2. gstreamer -> primary video + audio decoding library
  3. wgpu -> renders the video frames from gstreamer inside iced

HDR

I'll try to document all my findings about HDR here.
I'm making this project mainly to learn about video, color spaces and GPU programming, so I'm obviously bound to make mistakes in either the code or my fundamental understanding of a concept. Please don't take anything in this text as absolute.

let window = ... // use winit to get a window handle, check the example in this repo
let instance = wgpu::Instance::default();
let surface = instance.create_surface(window).unwrap();
let adapter = instance
    .request_adapter(&wgpu::RequestAdapterOptions {
        power_preference: wgpu::PowerPreference::default(),
        compatible_surface: Some(&surface),
        force_fallback_adapter: false,
    })
    .await
    .context("Failed to request wgpu adapter")?;
// get_capabilities takes the adapter so it can report what this
// particular surface/adapter pair supports
let caps = surface.get_capabilities(&adapter);
println!("{:#?}", caps.formats);

This should print all the texture formats your current hardware can use for this surface.
Among these, the formats that support HDR (afaik) are:

wgpu::TextureFormat::Rgba16Float
wgpu::TextureFormat::Rgba32Float
wgpu::TextureFormat::Rgb10a2Unorm
wgpu::TextureFormat::Rgb10a2Uint // (unsure)

My display supports Rgb10a2Unorm so I'll be going forward with that texture format.

Rgb10a2Unorm is still the same size as Rgba8Unorm (32 bits per pixel), but the bits are distributed differently between the two:

Rgb10a2Unorm: R, G, B => 10 bits each (2^10 = 1024 [0..=1023])
A => 2 bits (2^2 = 4 [0..=3])

Whereas in a normal Rgba8Unorm pixel: R, G, B, A => 8 bits each (2^8 = 256 [0..=255])

For displaying video the alpha component is not really used (I don't know of any case where it is), so we can re-allocate 6 of the 8 alpha bits and give 2 extra bits each to the R, G and B components.
In the shader each component gets uniformly normalized from an integer in [0..=1023] to a float in [0..=1] so we can compute with it properly.
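The bit layout and normalization above can be sketched in plain Rust. The field order (R in the low bits, A in the top two) is an assumption for illustration; the actual in-memory order depends on the format and endianness.

```rust
// Pack four channel values into one Rgb10a2-style 32-bit pixel.
// r, g, b must fit in 10 bits [0..=1023]; a must fit in 2 bits [0..=3].
fn pack_rgb10a2(r: u32, g: u32, b: u32, a: u32) -> u32 {
    debug_assert!(r <= 1023 && g <= 1023 && b <= 1023 && a <= 3);
    (r & 0x3ff) | ((g & 0x3ff) << 10) | ((b & 0x3ff) << 20) | ((a & 0x3) << 30)
}

// What the GPU does for a Unorm format: uniformly normalize each
// integer channel to a float in [0.0, 1.0].
fn unpack_and_normalize(px: u32) -> [f32; 4] {
    let r = (px & 0x3ff) as f32 / 1023.0;
    let g = ((px >> 10) & 0x3ff) as f32 / 1023.0;
    let b = ((px >> 20) & 0x3ff) as f32 / 1023.0;
    let a = ((px >> 30) & 0x3) as f32 / 3.0;
    [r, g, b, a]
}

fn main() {
    let px = pack_rgb10a2(1023, 512, 0, 3);
    // 1023 -> 1.0, 512 -> ~0.5, 0 -> 0.0, alpha 3 -> 1.0
    println!("{:?}", unpack_and_normalize(px));
}
```

Note that 512 maps to roughly, not exactly, 0.5, because the divisor is 1023 rather than 1024.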

Videos however are generally not stored in this format, or any RGB format in general, because RGB is not as efficient for (lossy) compression as YUV formats.

Right now I don't want to deal with YUV formats, so I'll use gstreamer caps to convert the video into the Rgb10a2 format.
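As a rough sketch, a caps filter in a pipeline like the one below could ask gstreamer to hand us 10-bit RGB frames. The format name RGB10A2_LE is an assumption here; check what your gstreamer build's videoconvert actually supports (e.g. via gst-inspect-1.0 videoconvert) before relying on it.

```shell
# Hypothetical pipeline: decode a file and force 10-bit RGB output.
# fakesink stands in for whatever sink actually uploads frames to wgpu.
gst-launch-1.0 filesrc location=video.mkv ! decodebin ! videoconvert ! \
  "video/x-raw,format=RGB10A2_LE" ! fakesink
```

In code, the same constraint would be expressed as a gst::Caps with media type video/x-raw and a format field, attached to a capsfilter or appsink.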
