Compare commits

...

27 Commits

Author SHA1 Message Date
f016c56ba6 feat: Move settings to a tab based ui 2026-01-28 23:40:59 +05:30
4b5dea576f feat: Added AGENTS.md 2026-01-28 20:38:15 +05:30
e66c457b57 feat: Added BACKGROUND_COLOR to settings popup 2026-01-28 02:06:57 +05:30
76fc14c73b feat: Use a floating box for settings 2026-01-28 02:00:45 +05:30
5b4fbd5df6 feat(store): add SecretStore, ApiKey, remove Store trait 2026-01-26 21:00:56 +05:30
e7fd01c0af chore: Update cargo.lock 2026-01-20 21:52:46 +05:30
a040478069 chore: Update flake.lock 2026-01-20 21:52:46 +05:30
e5ef173473 fix(iced-video): Update the color matrices and subtract .5 from uv samples to
2026-01-15 17:25:09 +05:30
429371002b fix(iced-video): Write the conversion matrix buffer so the video actually shows up
2026-01-15 17:01:41 +05:30
335e8fdbef feat: move cuda to linux
2026-01-14 15:55:49 +05:30
9dac0b6c78 feat(iced-video): added video format to the video frame
2026-01-14 09:51:56 +05:30
uttarayan21
97a7a632d4 feat(iced-video): implement planar YUV texture support with HDR conversion matrices and update dependencies
2026-01-04 23:02:47 +05:30
uttarayan21
29390140cd feat(settings): simplify form updates and temporarily disable server toggler 2025-12-27 00:13:54 +05:30
uttarayan21
97c2b3f14c feat(settings): implement user and server form handling with update functions and UI views 2025-12-27 00:04:42 +05:30
uttarayan21
2b2e8060e7 feat(ui-iced): implement settings screen with navigation and basic UI elements
2025-12-26 21:21:58 +05:30
uttarayan21
584495453f feat: Many more improvements to video player now with a subscription 2025-12-26 19:06:40 +05:30
uttarayan21
99853167df feat(config): enable unfree packages, add CUDA toolkit
2025-12-26 10:43:15 +05:30
uttarayan21
fc9555873b refactor: move PlayFlags defaults into Playbin3 and clean up unused prelude imports
2025-12-26 10:39:00 +05:30
uttarayan21
a7ffa69326 fix(iced-video): Fix the very high ram usage
feat(playback): add GstPlayFlags for playbin and playbin3
2025-12-26 10:29:31 +05:30
uttarayan21
4ed15c97f0 feat: Add keybinds to minimal example 2025-12-25 21:43:55 +05:30
uttarayan21
a2491695b3 fix(video): try to optimize memory leaks
2025-12-25 06:28:52 +05:30
uttarayan21
5a0bdae84b fix: Try to minimize frame latency 2025-12-25 05:48:51 +05:30
uttarayan21
5d0b795ba5 feat: Added readme and forgotten id.rs
2025-12-25 02:15:43 +05:30
uttarayan21
ebe2312272 feat: Get iced-video working 2025-12-25 02:14:56 +05:30
uttarayan21
3382aebb1f feat: Added PipelineExt trait for all Children of Pipelines
2025-12-23 01:33:54 +05:30
uttarayan21
8d46bd2b85 feat: Restructure the gst parent<->child relations 2025-12-23 01:09:01 +05:30
uttarayan21
043d1e99f0 feat: Modify gst crate to add lot of more granularity 2025-12-22 13:27:30 +05:30
43 changed files with 4185 additions and 1819 deletions

AGENTS.md Normal file

@@ -0,0 +1,199 @@
# Agent Guidelines for Jello
This document provides guidelines for AI coding agents working on the Jello codebase.
## Project Overview
Jello is a WIP video client for Jellyfin written in Rust, focusing on HDR video playback using:
- **iced** - Primary GUI toolkit
- **gstreamer** - Video + audio decoding library
- **wgpu** - Rendering video from GStreamer in iced
## Build, Test, and Lint Commands
### Building
```bash
# Build in release mode
cargo build --release
cargo build -r
# Build specific workspace member
cargo build -p api
cargo build -p gst
cargo build -p ui-iced
# Run the application
cargo run --release -- -vv
just jello # Uses justfile
```
### Testing
```bash
# Run all tests in workspace
cargo test --workspace
# Run tests for a specific package
cargo test -p gst
cargo test -p api
cargo test -p iced-video
# Run a single test by name
cargo test test_appsink
cargo test -p gst test_appsink
# Run a specific test in a specific file
cargo test -p gst --test <test_file_name> <test_function_name>
# Run tests with output
cargo test -- --nocapture
cargo test -- --show-output
```
### Linting and Formatting
```bash
# Check code without building
cargo check
cargo check --workspace
# Run clippy (linter)
cargo clippy
cargo clippy --workspace
cargo clippy --workspace -- -D warnings
# Format code
cargo fmt
cargo fmt --all
# Check formatting without modifying files
cargo fmt --all -- --check
```
### Other Tools
```bash
# Check for security vulnerabilities and license compliance
cargo deny check
# Generate Jellyfin type definitions
just typegen
```
## Code Style Guidelines
### Rust Edition
- Use **Rust 2024 edition** (as specified in Cargo.toml files)
### Imports
- Use `use` statements at the top of files
- Group imports: std library, external crates, then local modules
- Use `crate::` for absolute paths within the crate
- Common pattern: create a `priv_prelude` module for internal imports
- Use `pub use` to re-export commonly used items
- Use wildcard imports (`use crate::priv_prelude::*;`) within internal modules when a prelude exists
Example:
```rust
use std::sync::Arc;
use reqwest::{Method, header::InvalidHeaderValue};
use serde::{Deserialize, Serialize};
use crate::errors::*;
```
### Naming Conventions
- **Types/Structs/Enums**: PascalCase (e.g., `JellyfinClient`, `Error`, `AppSink`)
- **Functions/Methods**: snake_case (e.g., `request_builder`, `stream_url`)
- **Variables**: snake_case (e.g., `access_token`, `device_id`)
- **Constants**: SCREAMING_SNAKE_CASE (e.g., `NEXT_ID`, `GST`)
- **Modules**: snake_case (e.g., `priv_prelude`, `error_stack`)
### Error Handling
- Use **`error-stack`** for error handling with context propagation
- Use **`thiserror`** for defining error types
- Standard error type pattern:
```rust
pub use error_stack::{Report, ResultExt};
#[derive(Debug, thiserror::Error)]
#[error("An error occurred")]
pub struct Error;
pub type Result<T, E = error_stack::Report<Error>> = core::result::Result<T, E>;
```
- Attach context to errors using `.change_context(Error)` and `.attach("description")`
- Use `#[track_caller]` on functions that may panic or error for better error messages
- Error handling example:
```rust
self.inner
.set_state(gstreamer::State::Playing)
.change_context(Error)
.attach("Failed to set pipeline to Playing state")?;
```
### Types
- Prefer explicit types over type inference when it improves clarity
- Use `impl Trait` for function parameters when appropriate (e.g., `impl AsRef<str>`)
- Use `Option<T>` and `Result<T, E>` idiomatically
- Use `Arc<T>` for shared ownership
- Use newtype patterns for semantic clarity (e.g., `ApiKey` wrapping `secrecy::SecretBox<String>`)
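A minimal sketch of the newtype idea (assuming the `secrecy` 0.10 API; the real `ApiKey` in the `store` crate may differ):
```rust
use secrecy::{ExposeSecret, SecretBox};

/// Newtype so an API key can't be passed where an arbitrary `String` is expected.
pub struct ApiKey(SecretBox<String>);

impl ApiKey {
    pub fn new(key: String) -> Self {
        Self(SecretBox::new(Box::new(key)))
    }

    /// Explicit, auditable access to the underlying secret.
    pub fn expose(&self) -> &str {
        self.0.expose_secret().as_str()
    }
}
```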
### Formatting
- Use 4 spaces for indentation
- Line length: aim for 100 characters, but not strictly enforced
- Use trailing commas in multi-line collections
- Follow standard Rust formatting conventions (enforced by `cargo fmt`)
### Documentation
- Add doc comments (`///`) for public APIs
- Use inline comments (`//`) sparingly, prefer self-documenting code
- Include examples in doc comments when helpful
### Async/Await
- Use `tokio` as the async runtime
- Mark async functions with `async` keyword
- Use `.await` for async operations
- Common pattern: `tokio::fs` for file operations
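Minimal sketch of the pattern (the config path is made up; assumes `tokio` with the `rt` and `macros` features enabled):
```rust
use std::path::Path;

/// Read a file asynchronously using tokio::fs.
async fn read_config(path: &Path) -> std::io::Result<String> {
    tokio::fs::read_to_string(path).await
}

#[tokio::main]
async fn main() -> std::io::Result<()> {
    let config = read_config(Path::new("jello.toml")).await?;
    println!("{config}");
    Ok(())
}
```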
### Module Structure
- Use `mod.rs` or inline modules as appropriate
- Keep related functionality together
- Use `pub(crate)` for internal APIs
- Re-export commonly used items at crate root
### Macros
- Custom macros used: `wrap_gst!`, `parent_child!`
Use macros to reduce boilerplate, and only in the `gst` crate
### Testing
- Place tests in the same file with `#[test]` or `#[cfg(test)]`
- Use descriptive test function names (e.g., `test_appsink`, `unique_generates_different_ids`)
- Initialize tracing in tests when needed for debugging
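Sketch of that shape (the test body is illustrative; `try_init` avoids a panic if another test already installed a global subscriber):
```rust
#[cfg(test)]
mod tests {
    #[test]
    fn example_test_with_tracing() {
        // Opt into tracing output while debugging this test.
        let _ = tracing_subscriber::fmt().with_test_writer().try_init();

        // ... exercise the code under test here ...
        assert_eq!(1 + 1, 2);
    }
}
```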
### Dependencies
- Prefer well-maintained crates from crates.io
- Use `workspace.dependencies` for shared dependencies across workspace members
- Pin versions when stability is important
### Workspace Structure
The project uses a Cargo workspace with multiple members:
- `.` - Main jello binary
- `api` - Jellyfin API client
- `gst` - GStreamer wrapper
- `ui-iced` - Iced UI implementation
- `ui-gpui` - GPUI UI implementation (optional)
- `store` - Secret/data/storage management
- `jello-types` - Shared type definitions
- `typegen` - Jellyfin type generator
- `crates/iced-video` - Custom iced video widget
- `examples/hdr-gstreamer-wgpu` - HDR example
### Project-Specific Patterns
- Use `LazyLock` for global initialization (e.g., GStreamer init)
- Use the builder pattern with method chaining (e.g., `request_builder()`)
- Use `tap` crate's `.pipe()` for functional transformations
- Prefer `BTreeMap`/`BTreeSet` over `HashMap`/`HashSet` when order matters
Prefer a functional programming style over an imperative one
When building UIs, keep the handler and view code in the same module (e.g. the settings view and settings handler in the same file)
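A compact sketch combining two of these patterns (the global and the transformation are made up for illustration; the real `GST` LazyLock initializes GStreamer):
```rust
use std::sync::LazyLock;
use tap::Pipe;

// Lazily-initialized global, the same shape as the `GST` LazyLock.
static APP_NAME: LazyLock<String> = LazyLock::new(|| {
    "jello"
        .to_uppercase()
        .pipe(|name| format!("{name} (dev build)"))
});

fn main() {
    // The first access runs the closure; later accesses reuse the value.
    println!("{}", *APP_NAME);
}
```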
## License
All code in this project is MIT licensed.

Cargo.lock generated

File diff suppressed because it is too large

@@ -9,20 +9,20 @@ members = [
"jello-types",
"gst",
"examples/hdr-gstreamer-wgpu",
"crates/iced-video",
]
[workspace.dependencies]
iced = { version = "0.14.0", features = [
"advanced",
"canvas",
"image",
"sipper",
"tokio",
"debug",
] }
iced_video_player = "0.6"
iced = { version = "0.14.0" }
gst = { version = "0.1.0", path = "gst" }
# iced_video_player = { git = "https://github.com/jazzfool/iced_video_player" }
# iced_video_player = { path = "crates/iced_video_player" }
iced_wgpu = { version = "0.14.0" }
iced-video = { version = "0.1.0", path = "crates/iced-video" }
[patch.crates-io]
iced_wgpu = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
iced_core = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
iced_renderer = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
iced_futures = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
iced = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
[package]
name = "jello"
@@ -32,6 +32,7 @@ license = "MIT"
[dependencies]
api = { version = "0.1.0", path = "api" }
bytemuck = { version = "1.24.0", features = ["derive"] }
clap = { version = "4.5", features = ["derive"] }
clap-verbosity-flag = { version = "3.0.4", features = ["tracing"] }
clap_complete = "4.5"

README.md Normal file

@@ -0,0 +1,112 @@
# Jello
A WIP video client for Jellyfin.
(Planned) Features
1. Integrate with Jellyfin
2. HDR video playback
3. Audio track selection
4. Chapter selection
Libraries and frameworks used for this:
1. iced -> primary GUI toolkit
2. gstreamer -> primary video + audio decoding library
3. wgpu -> rendering the video from gstreamer in iced
### HDR
I'll try to document all my findings about HDR here.
I'm making this project mainly to learn about videos, color spaces and GPU programming, so very obviously I'm bound to make mistakes in either the code or my fundamental understanding of a concept. Please don't take anything in this text as absolute.
```rust
let window = ... // use winit to get a window handle, check the example in this repo
let instance = wgpu::Instance::default();
let surface = instance.create_surface(window).unwrap();
let adapter = instance
.request_adapter(&wgpu::RequestAdapterOptions {
power_preference: wgpu::PowerPreference::default(),
compatible_surface: Some(&surface),
force_fallback_adapter: false,
})
.await
.context("Failed to request wgpu adapter")?;
let caps = surface.get_capabilities(&adapter);
println!("{:#?}", caps.formats);
```
This should print out all the texture formats that can be used by your current hardware.
Among these, the formats that support HDR (AFAIK) are:
```
wgpu::TextureFormat::Rgba16Float
wgpu::TextureFormat::Rgba32Float
wgpu::TextureFormat::Rgb10a2Unorm
wgpu::TextureFormat::Rgb10a2Uint // (unsure)
```
My display supports Rgb10a2Unorm so I'll be going forward with that texture format.
`Rgb10a2Unorm` is still the same size as `Rgba8Unorm`, but the bits are distributed differently in each:
`Rgb10a2Unorm`:
- R, G, B => 10 bits each (2^10 = 1024, [0..=1023])
- A => 2 bits (2^2 = 4, [0..=3])
Whereas in a normal `Rgba8Unorm` pixel:
- R, G, B, A => 8 bits each (2^8 = 256, [0..=255])
For displaying videos the alpha component is not really used (I don't know of any video that needs it), so we can re-allocate 6 bits from the alpha channel and distribute them among the R, G and B components.
In the shader the components get uniformly normalized from integers in [0..=1023] to floats in [0..=1] so we can compute with them properly.
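Tiny worked example of that normalization (illustrative only, not code from this repo):
```rust
/// How a Unorm channel is read when sampled: integer [0, 2^bits - 1] -> float [0.0, 1.0].
fn unorm(value: u32, bits: u32) -> f32 {
    value as f32 / ((1u32 << bits) - 1) as f32
}

fn main() {
    assert!((unorm(1023, 10) - 1.0).abs() < 1e-6); // max 10-bit value -> 1.0
    assert!((unorm(512, 10) - 0.5).abs() < 1e-3); // mid value -> ~0.5
    assert!((unorm(255, 8) - 1.0).abs() < 1e-6); // max 8-bit value -> 1.0
}
```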
Videos, however, are generally not stored in this or any other RGB format, because RGB is not as efficient for (lossy) compression as YUV formats.
Right now I don't want to deal with YUV formats, so I'll use gstreamer caps to convert the video into the `Rgb10a2` format.
## Pixel formats and Planes
Dated: Sun Jan 4 09:09:16 AM IST 2026
| value | count | quantile | percentage | frequency |
| --- | --- | --- | --- | --- |
| yuv420p | 1815 | 0.5067001675041876 | 50.67% | ************************************************** |
| yuv420p10le | 1572 | 0.4388609715242881 | 43.89% | ******************************************* |
| yuvj420p | 171 | 0.04773869346733668 | 4.77% | **** |
| rgba | 14 | 0.003908431044109436 | 0.39% | |
| yuvj444p | 10 | 0.0027917364600781687 | 0.28% | |
These are the pixel formats across all the videos in my media collection.
### RGBA
Pretty self-evident:
8 bits for each of R, G, B and A
Hopefully it shouldn't be too hard to make a function, or possibly a LUT, that takes RGBA data and maps it to Rgb10a2Unorm
```mermaid
packet
title RGBA
+8: "R"
+8: "G"
+8: "B"
+8: "A"
```
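A sketch of what such a mapping could look like, assuming a 10-10-10-2 packing with red in the low bits (the actual channel order of `Rgb10a2Unorm` depends on the backend's format definition, so treat this purely as an illustration):
```rust
/// Expand an 8-bit RGBA pixel into a packed 10-10-10-2 value.
/// [0, 255] is widened to [0, 1023] by replicating the top bits (v << 2 | v >> 6),
/// and the 8-bit alpha is squashed down to its top 2 bits.
fn rgba8_to_rgb10a2(r: u8, g: u8, b: u8, a: u8) -> u32 {
    let widen = |v: u8| -> u32 { ((v as u32) << 2) | ((v as u32) >> 6) };
    let a2 = (a >> 6) as u32;
    widen(r) | (widen(g) << 10) | (widen(b) << 20) | (a2 << 30)
}

fn main() {
    // White stays white: 255 widens to 1023 in every colour channel.
    assert_eq!(rgba8_to_rgb10a2(255, 255, 255, 255), 0xFFFF_FFFF);
}
```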
### YUV
[All YUV formats](https://learn.microsoft.com/en-us/windows/win32/medfound/recommended-8-bit-yuv-formats-for-video-rendering#surface-definitions)
[10 and 16 bit yuv formats](https://learn.microsoft.com/en-us/windows/win32/medfound/10-bit-and-16-bit-yuv-video-formats)
Y -> Luminance
U, V -> Chrominance
p -> planar
sp -> semi-planar
j -> full range
Planar formats store each channel in its own contiguous array, one after another.
In semi-planar formats the Y plane is separate and the U and V channels are interleaved.
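As a rough sketch of the difference (assuming 8-bit 4:2:0 with tightly packed rows; real frames usually have stride padding):
```rust
/// Byte offsets of the planes in a planar (I420-style) 4:2:0 frame:
/// Y plane first, then a U plane and a V plane at quarter size each.
fn planar_offsets(width: usize, height: usize) -> (usize, usize, usize) {
    let y = 0;
    let u = width * height;
    let v = u + (width / 2) * (height / 2);
    (y, u, v)
}

/// Byte offsets in a semi-planar (NV12-style) frame:
/// Y plane first, then a single interleaved UV plane (U0 V0 U1 V1 ...).
fn semi_planar_offsets(width: usize, height: usize) -> (usize, usize) {
    (0, width * height)
}
```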
## Chroma Subsampling


@@ -0,0 +1,29 @@
[package]
name = "iced-video"
version = "0.1.0"
edition = "2024"
[dependencies]
bytemuck = "1.24.0"
error-stack = "0.6.0"
futures-lite = "2.6.1"
gst.workspace = true
iced_core = "0.14.0"
iced_futures = "0.14.0"
iced_renderer = { version = "0.14.0", features = ["iced_wgpu"] }
iced_wgpu = { version = "0.14.0" }
thiserror = "2.0.17"
tracing = "0.1.43"
wgpu = { version = "27.0.1", features = ["vulkan"] }
[dev-dependencies]
iced.workspace = true
tracing-subscriber = { version = "0.3.22", features = ["env-filter"] }
[profile.dev]
debug = true
[profile.release]
debug = true
# [patch.crates-io]
# iced_wgpu = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }


@@ -0,0 +1,178 @@
use iced_video::{Video, VideoHandle};
pub fn main() -> iced::Result {
use tracing_subscriber::prelude::*;
tracing_subscriber::registry()
.with(
tracing_subscriber::fmt::layer()
.with_thread_ids(true)
.with_file(true),
)
.with(tracing_subscriber::EnvFilter::from_default_env())
.init();
iced::application(State::new, update, view)
.subscription(|state| {
// Foo
match &state.video {
Some(video) => video.subscription_with(state, keyboard_event),
None => keyboard_event(state),
}
})
.run()
}
fn keyboard_event(_state: &State) -> iced::Subscription<Message> {
use iced::keyboard::{Key, key::Named};
iced::keyboard::listen().map(move |event| match event {
iced::keyboard::Event::KeyPressed { key, .. } => {
let key = key.as_ref();
match key {
Key::Named(Named::Escape) | Key::Character("q") => Message::Quit,
Key::Character("f") => Message::Fullscreen,
Key::Named(Named::Space) => Message::Toggle,
_ => Message::Noop,
}
}
_ => Message::Noop,
})
}
#[derive(Debug, Clone)]
pub struct State {
video: Option<VideoHandle<Message>>,
fullscreen: bool,
}
impl State {
pub fn new() -> (Self, iced::Task<Message>) {
(
Self {
video: None,
fullscreen: false,
},
iced::Task::done(Message::Load),
)
}
}
#[derive(Debug, Clone)]
pub enum Message {
Play,
Pause,
Toggle,
Noop,
Load,
Fullscreen,
OnLoad(VideoHandle<Message>),
OnError(String),
NewFrame,
Eos,
Quit,
}
pub fn update(state: &mut State, message: Message) -> iced::Task<Message> {
match message {
Message::NewFrame => {
iced::Task::none()
}
Message::Eos => {
iced::Task::done(Message::Pause)
}
Message::Load => {
iced::Task::perform(
VideoHandle::load(
"https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c",
),
|result| match result {
Ok(video) => Message::OnLoad(video),
Err(err) => Message::OnError(format!("Error loading video: {:?}", err)),
},
).chain(iced::Task::done(Message::Play))
}
Message::OnError(err) => {
eprintln!("Error: {}", err);
iced::Task::none()
}
Message::OnLoad(video) => {
state.video = Some(video.on_new_frame(Message::NewFrame).on_end_of_stream(Message::Eos));
iced::Task::none()
}
Message::Fullscreen => {
state.fullscreen = !state.fullscreen;
let fullscreen = state.fullscreen;
let mode = if fullscreen {
iced::window::Mode::Fullscreen
} else {
iced::window::Mode::Windowed
};
iced::window::oldest().and_then(move |id| iced::window::set_mode::<Message>(id, mode))
}
Message::Play => {
state
.video
.as_ref()
.unwrap()
.source()
.play()
.expect("Failed to play video");
iced::Task::none()
}
Message::Pause => {
state
.video
.as_ref()
.unwrap()
.source()
.pause()
.expect("Failed to pause video");
iced::Task::none()
}
Message::Toggle => {
state
.video
.as_ref()
.unwrap()
.source()
.toggle()
.expect("Failed to stop video");
iced::Task::none()
}
Message::Quit => {
state
.video
.as_ref()
.unwrap()
.source()
.stop()
.expect("Failed to stop video");
std::process::exit(0);
}
Message::Noop => iced::Task::none(),
}
}
pub fn view<'a>(state: &'a State) -> iced::Element<'a, Message> {
if let None = &state.video {
return iced::widget::Column::new()
.push(iced::widget::Text::new("Press any key to load video"))
.align_x(iced::Alignment::Center)
.into();
}
let video_widget = Video::new(&state.video.as_ref().unwrap())
.width(iced::Length::Fill)
.height(iced::Length::Fill)
.content_fit(iced::ContentFit::Contain);
iced::widget::Column::new()
.push(video_widget)
.push(
iced::widget::Row::new()
.push(iced::widget::Button::new("Play").on_press(Message::Play))
.push(iced::widget::Button::new("Pause").on_press(Message::Pause))
.spacing(5)
.padding(10)
.align_y(iced::Alignment::Center),
)
.align_x(iced::Alignment::Center)
.into()
}


@@ -0,0 +1,8 @@
info:
RUST_LOG=info,wgpu_core=warn,wgpu_hal=warn cargo run --release --example minimal
# GST_DEBUG=5 RUST_LOG="" cargo run --release --example minimal
flame:
cargo flamegraph run --release --example minimal
heaptrack:
cargo build --release --example minimal
RUST_LOG="info,wgpu_hal=info" heaptrack $CARGO_TARGET_DIR/release/examples/minimal


@@ -0,0 +1,55 @@
use std::borrow;
use std::sync::atomic::{self, AtomicUsize};
static NEXT_ID: AtomicUsize = AtomicUsize::new(0);
/// The identifier of a generic widget.
#[derive(Debug, Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct Id(Internal);
impl Id {
/// Creates a new [`Id`] from a static `str`.
pub const fn new(id: &'static str) -> Self {
Self(Internal::Custom(borrow::Cow::Borrowed(id)))
}
/// Creates a unique [`Id`].
///
/// This function produces a different [`Id`] every time it is called.
pub fn unique() -> Self {
let id = NEXT_ID.fetch_add(1, atomic::Ordering::Relaxed);
Self(Internal::Unique(id))
}
}
impl From<&'static str> for Id {
fn from(value: &'static str) -> Self {
Self::new(value)
}
}
impl From<String> for Id {
fn from(value: String) -> Self {
Self(Internal::Custom(borrow::Cow::Owned(value)))
}
}
#[derive(Debug, Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
enum Internal {
Unique(usize),
Custom(borrow::Cow<'static, str>),
}
#[cfg(test)]
mod tests {
use super::Id;
#[test]
fn unique_generates_different_ids() {
let a = Id::unique();
let b = Id::unique();
assert_ne!(a, b);
}
}


@@ -0,0 +1,164 @@
pub mod id;
pub mod primitive;
pub mod source;
pub mod widget;
pub use widget::Video;
use error_stack::{Report, ResultExt};
use gst::plugins::app::AppSink;
use gst::plugins::playback::Playbin3;
use gst::plugins::videoconvertscale::VideoConvert;
#[derive(Debug, thiserror::Error)]
#[error("Iced Video Error")]
pub struct Error;
pub type Result<T, E = Report<Error>> = core::result::Result<T, E>;
use std::sync::{Arc, Mutex, atomic::AtomicBool};
mod seal {
pub trait Sealed {}
impl Sealed for super::Unknown {}
impl Sealed for super::Ready {}
}
pub trait State: seal::Sealed {
fn is_ready() -> bool {
false
}
}
#[derive(Debug, Clone)]
pub struct Unknown;
#[derive(Debug, Clone)]
pub struct Ready;
impl State for Unknown {}
impl State for Ready {
fn is_ready() -> bool {
true
}
}
/// This is the video handle that is used to control the video playback.
/// This should be kept in the application state.
#[derive(Debug, Clone)]
pub struct VideoHandle<Message, S: State = Unknown> {
id: id::Id,
pub source: source::VideoSource,
frame_ready: Arc<AtomicBool>,
on_new_frame: Option<Box<Message>>,
on_end_of_stream: Option<Box<Message>>,
on_about_to_finish: Option<Box<Message>>,
__marker: core::marker::PhantomData<S>,
}
impl<Message: Send + Sync + Clone> VideoHandle<Message, Unknown> {
pub fn new(url: impl AsRef<str>) -> Result<Self> {
let source = source::VideoSource::new(url)?;
let frame_ready = Arc::clone(&source.ready);
Ok(Self {
id: id::Id::unique(),
source: source,
on_new_frame: None,
on_end_of_stream: None,
on_about_to_finish: None,
frame_ready,
__marker: core::marker::PhantomData,
})
}
/// Creates a new video handle and waits for the metadata to be loaded.
pub async fn load(url: impl AsRef<str>) -> Result<VideoHandle<Message, Ready>> {
let handle = VideoHandle::new(url)?;
handle.wait().await
}
}
impl<Message: Send + Sync + Clone, S: State> VideoHandle<Message, S> {
pub fn id(&self) -> &id::Id {
&self.id
}
pub fn source(&self) -> &source::VideoSource {
&self.source
}
pub async fn wait(self) -> Result<VideoHandle<Message, Ready>> {
self.source.wait().await?;
Ok(self.state::<Ready>())
}
fn state<S2: State>(self) -> VideoHandle<Message, S2> {
VideoHandle {
id: self.id,
source: self.source,
on_new_frame: self.on_new_frame,
on_end_of_stream: self.on_end_of_stream,
on_about_to_finish: self.on_about_to_finish,
frame_ready: self.frame_ready,
__marker: core::marker::PhantomData,
}
}
// pub fn subscription(&self) -> iced_futures::subscription::Subscription<Message> {
// let sub = widget::VideoSubscription {
// id: self.id.clone(),
// on_end_of_stream: self.on_end_of_stream.clone(),
// on_new_frame: self.on_new_frame.clone(),
// on_about_to_finish: self.on_about_to_finish.clone(),
// bus: self.source.bus.clone(),
// };
// iced_futures::subscription::from_recipe(sub)
// }
//
// pub fn subscription_with<State>(
// &self,
// state: &State,
// f: impl FnOnce(&State) -> iced_futures::subscription::Subscription<Message> + 'static,
// ) -> iced_futures::subscription::Subscription<Message>
// where
// State: Send + Sync + 'static,
// {
// let sub = self.subscription();
// iced_futures::subscription::Subscription::batch([sub, f(state)])
// }
pub fn on_new_frame(self, message: Message) -> Self {
Self {
on_new_frame: Some(Box::new(message)),
..self
}
}
pub fn on_end_of_stream(self, message: Message) -> Self {
Self {
on_end_of_stream: Some(Box::new(message)),
..self
}
}
pub fn on_about_to_finish(self, message: Message) -> Self {
Self {
on_about_to_finish: Some(Box::new(message)),
..self
}
}
pub fn play(&self) {
self.source.play();
}
pub fn pause(&self) {
self.source.pause();
}
pub fn stop(&self) {
self.source.stop();
}
}
impl<Message: Send + Sync + Clone> VideoHandle<Message, Ready> {
pub fn format(&self) -> Result<gst::VideoFormat> {
self.source
.format()
.change_context(Error)
.attach("Failed to get video format")
}
}


@@ -0,0 +1,574 @@
use crate::id;
use gst::videoconvertscale::VideoFormat;
use iced_wgpu::primitive::Pipeline;
use iced_wgpu::wgpu;
use std::collections::BTreeMap;
use std::sync::{Arc, Mutex, atomic::AtomicBool};
#[derive(Clone, Copy, Debug, bytemuck::Zeroable, bytemuck::Pod)]
#[repr(transparent)]
pub struct ConversionMatrix {
matrix: [Vec3f; 3],
}
#[derive(Clone, Copy, Debug, bytemuck::Zeroable, bytemuck::Pod)]
#[repr(C, align(16))]
pub struct Vec3f {
data: [f32; 3],
__padding: u32,
}
impl From<[f32; 3]> for Vec3f {
fn from(value: [f32; 3]) -> Self {
Vec3f {
data: [value[0], value[1], value[2]],
__padding: 0,
}
}
}
impl Vec3f {
pub fn new(x: f32, y: f32, z: f32) -> Self {
Vec3f {
data: [x, y, z],
__padding: 0,
}
}
pub const fn from(data: [f32; 3]) -> Self {
Vec3f {
data: [data[0], data[1], data[2]],
__padding: 0,
}
}
}
// impl ConversionMatrix {
// pub fn desc() -> wgpu::VertexBufferLayout<'static> {
// wgpu::VertexBufferLayout {
// array_stride: core::mem::size_of::<ConversionMatrix>() as wgpu::BufferAddress,
// step_mode: wgpu::VertexStepMode::Vertex,
// attributes: &[
// wgpu::VertexAttribute {
// offset: 0,
// shader_location: 0,
// format: wgpu::VertexFormat::Float32x4,
// },
// wgpu::VertexAttribute {
// offset: 16,
// shader_location: 1,
// format: wgpu::VertexFormat::Float32x4,
// },
// wgpu::VertexAttribute {
// offset: 32,
// shader_location: 2,
// format: wgpu::VertexFormat::Float32x4,
// },
// wgpu::VertexAttribute {
// offset: 48,
// shader_location: 3,
// format: wgpu::VertexFormat::Float32x4,
// },
// ],
// }
// }
// }
pub const BT2020_TO_RGB: ConversionMatrix = ConversionMatrix {
matrix: [
Vec3f::from([1.0, 0.0, 1.4746]),
Vec3f::from([1.0, -0.16455, -0.5714]),
Vec3f::from([1.0, 1.8814, 0.0]),
],
};
pub const BT709_TO_RGB: ConversionMatrix = ConversionMatrix {
matrix: [
Vec3f::from([1.0, 0.0, 1.5748]),
Vec3f::from([1.0, -0.1873, -0.4681]),
Vec3f::from([1.0, 1.8556, 0.0]),
],
};
#[derive(Debug)]
pub struct VideoFrame {
pub id: id::Id,
pub size: wgpu::Extent3d,
pub ready: Arc<AtomicBool>,
pub frame: Arc<Mutex<gst::Sample>>,
pub format: VideoFormat,
}
impl iced_wgpu::Primitive for VideoFrame {
type Pipeline = VideoPipeline;
fn prepare(
&self,
pipeline: &mut Self::Pipeline,
device: &wgpu::Device,
queue: &wgpu::Queue,
bounds: &iced_wgpu::core::Rectangle,
viewport: &iced_wgpu::graphics::Viewport,
) {
let video = pipeline.videos.entry(self.id.clone()).or_insert_with(|| {
let texture = VideoTexture::new(
"iced-video-texture",
self.size,
device,
pipeline.format,
self.format,
);
let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
label: Some("iced-video-texture-bind-group"),
layout: &pipeline.bind_group_layout,
entries: &[
wgpu::BindGroupEntry {
binding: 0,
resource: wgpu::BindingResource::TextureView(&texture.y_texture()),
},
wgpu::BindGroupEntry {
binding: 1,
resource: wgpu::BindingResource::TextureView(&texture.uv_texture()),
},
wgpu::BindGroupEntry {
binding: 2,
resource: wgpu::BindingResource::Sampler(&pipeline.sampler),
},
wgpu::BindGroupEntry {
binding: 3,
resource: wgpu::BindingResource::Buffer(
texture
.conversion_matrix_buffer()
.as_entire_buffer_binding(),
),
},
],
});
let matrix = if matches!(self.format, VideoFormat::P01010le | VideoFormat::P016Le) {
BT2020_TO_RGB
} else {
BT709_TO_RGB
};
texture.write_conversion_matrix(&matrix, queue);
VideoFrameData {
id: self.id.clone(),
texture,
bind_group,
conversion_matrix: matrix,
ready: Arc::clone(&self.ready),
}
});
if self.size != video.texture.size() {
let new_texture = video
.texture
.resize("iced-video-texture-resized", self.size, device);
new_texture.write_conversion_matrix(&video.conversion_matrix, queue);
let new_bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
label: Some("iced-video-texture-bind-group"),
layout: &pipeline.bind_group_layout,
entries: &[
wgpu::BindGroupEntry {
binding: 0,
resource: wgpu::BindingResource::TextureView(&new_texture.y_texture()),
},
wgpu::BindGroupEntry {
binding: 1,
resource: wgpu::BindingResource::TextureView(&new_texture.uv_texture()),
},
wgpu::BindGroupEntry {
binding: 2,
resource: wgpu::BindingResource::Sampler(&pipeline.sampler),
},
wgpu::BindGroupEntry {
binding: 3,
resource: wgpu::BindingResource::Buffer(
video
.texture
.conversion_matrix_buffer()
.as_entire_buffer_binding(),
),
},
],
});
video.texture = new_texture;
video.bind_group = new_bind_group;
}
if video.ready.load(std::sync::atomic::Ordering::SeqCst) {
let frame = self.frame.lock().expect("BUG: Mutex poisoned");
let buffer = frame
.buffer()
.expect("BUG: Failed to get frame data from gst::Sample");
let data = buffer
.map_readable()
.expect("BUG: Failed to map gst::Buffer readable");
video.texture.write_texture(&data, queue);
drop(data);
video
.ready
.store(false, std::sync::atomic::Ordering::SeqCst);
}
}
fn render(
&self,
pipeline: &Self::Pipeline,
encoder: &mut wgpu::CommandEncoder,
target: &wgpu::TextureView,
bounds: &iced_wgpu::core::Rectangle<u32>,
) {
let Some(video) = pipeline.videos.get(&self.id) else {
return;
};
let mut render_pass = encoder.begin_render_pass(&wgpu::RenderPassDescriptor {
label: Some("iced-video-render-pass"),
color_attachments: &[Some(wgpu::RenderPassColorAttachment {
view: target,
resolve_target: None,
ops: wgpu::Operations {
load: wgpu::LoadOp::Load,
store: wgpu::StoreOp::Store,
},
depth_slice: None,
})],
depth_stencil_attachment: None,
timestamp_writes: None,
occlusion_query_set: None,
});
render_pass.set_pipeline(&pipeline.pipeline);
render_pass.set_bind_group(0, &video.bind_group, &[]);
render_pass.set_scissor_rect(
bounds.x as _,
bounds.y as _,
bounds.width as _,
bounds.height as _,
);
render_pass.draw(0..3, 0..1);
// self.ready
// .store(false, std::sync::atomic::Ordering::Relaxed);
}
}
/// NV12 or P010 are only supported in DX12 and Vulkan backends.
/// While we can use Vulkan with MoltenVK on macOS, I'd much rather use Metal directly.
/// Right now this only supports interleaved (semi-planar) UV formats.
/// For planar formats we would need 3 textures.
/// Also NV12 and P010 textures are not COPY_DST capable
/// This assumes 4:2:0 chroma subsampling (for now).
/// So for 4 Y samples there is 1 U and 1 V sample.
/// This means that the UV texture is half the width and half the height of the Y texture.
#[derive(Debug)]
pub struct VideoTexture {
y: wgpu::Texture,
uv: wgpu::Texture,
size: wgpu::Extent3d,
video_format: VideoFormat,
surface_format: wgpu::TextureFormat,
conversion_matrix_buffer: wgpu::Buffer,
}
impl VideoTexture {
pub fn size(&self) -> wgpu::Extent3d {
self.size
}
pub fn new(
label: &str,
size: wgpu::Extent3d,
device: &wgpu::Device,
surface_format: wgpu::TextureFormat,
video_format: VideoFormat,
) -> Self {
let surface_hdr = surface_format.is_wide();
let video_hdr = matches!(video_format, VideoFormat::P01010le | VideoFormat::P016Le);
if surface_hdr && !video_hdr {
tracing::warn!("Surface texture is HDR but video format is SDR");
} else if !surface_hdr && video_hdr {
tracing::warn!("Video format is HDR but surface does not support HDR");
}
let y_texture = device.create_texture(&wgpu::TextureDescriptor {
label: Some(&format!("{}-y", label)),
size: wgpu::Extent3d {
width: size.width,
height: size.height,
depth_or_array_layers: 1,
},
mip_level_count: 1,
sample_count: 1,
dimension: wgpu::TextureDimension::D2,
format: wgpu::TextureFormat::R16Unorm,
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
view_formats: &[],
});
let uv_texture = device.create_texture(&wgpu::TextureDescriptor {
label: Some(&format!("{}-uv", label)),
size: wgpu::Extent3d {
width: size.width / 2,
height: size.height / 2,
depth_or_array_layers: 1,
},
mip_level_count: 1,
sample_count: 1,
dimension: wgpu::TextureDimension::D2,
format: wgpu::TextureFormat::Rg16Unorm,
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
view_formats: &[],
});
let buffer = device.create_buffer(&wgpu::BufferDescriptor {
label: Some("iced-video-conversion-matrix-buffer"),
usage: wgpu::BufferUsages::UNIFORM | wgpu::BufferUsages::COPY_DST,
size: core::mem::size_of::<ConversionMatrix>() as wgpu::BufferAddress,
mapped_at_creation: false,
});
VideoTexture {
y: y_texture,
uv: uv_texture,
size,
surface_format,
video_format,
conversion_matrix_buffer: buffer,
}
}
// This returns the surface texture format, not the video pixel format
pub fn format(&self) -> wgpu::TextureFormat {
self.surface_format
}
pub fn y_texture(&self) -> wgpu::TextureView {
self.y.create_view(&wgpu::TextureViewDescriptor::default())
}
pub fn uv_texture(&self) -> wgpu::TextureView {
self.uv.create_view(&wgpu::TextureViewDescriptor::default())
}
pub fn resize(&self, name: &str, new_size: wgpu::Extent3d, device: &wgpu::Device) -> Self {
VideoTexture::new(name, new_size, device, self.format(), self.pixel_format())
}
pub fn pixel_format(&self) -> VideoFormat {
self.video_format
}
/// This assumes that the data is laid out correctly for the texture format.
pub fn write_texture(&self, data: &[u8], queue: &wgpu::Queue) {
let Self { y, uv, .. } = self;
let y_size = y.size();
let uv_size = uv.size();
let y_data_size = (y_size.width * y_size.height * 2) as usize;
let uv_data_size = (y_data_size / 2) as usize; // UV is interleaved
let y_data = &data[0..y_data_size];
let uv_data = &data[y_data_size..y_data_size + uv_data_size];
queue.write_texture(
wgpu::TexelCopyTextureInfo {
texture: y,
mip_level: 0,
origin: wgpu::Origin3d::ZERO,
aspect: wgpu::TextureAspect::All,
},
y_data,
wgpu::TexelCopyBufferLayout {
offset: 0,
bytes_per_row: Some(y_size.width * 2),
rows_per_image: None,
},
y_size,
);
queue.write_texture(
wgpu::TexelCopyTextureInfo {
texture: uv,
mip_level: 0,
origin: wgpu::Origin3d::ZERO,
aspect: wgpu::TextureAspect::All,
},
uv_data,
wgpu::TexelCopyBufferLayout {
offset: 0,
bytes_per_row: Some(uv_size.width * 4),
rows_per_image: None,
},
uv_size,
);
}
pub fn write_conversion_matrix(&self, matrix: &ConversionMatrix, queue: &wgpu::Queue) {
queue.write_buffer(
&self.conversion_matrix_buffer,
0,
bytemuck::bytes_of(matrix),
);
}
pub fn conversion_matrix_buffer(&self) -> &wgpu::Buffer {
&self.conversion_matrix_buffer
}
}
#[derive(Debug)]
pub struct VideoFrameData {
id: id::Id,
texture: VideoTexture,
bind_group: wgpu::BindGroup,
conversion_matrix: ConversionMatrix,
ready: Arc<AtomicBool>,
}
impl VideoFrameData {
pub fn is_hdr(&self) -> bool {
self.texture.format().is_wide()
}
}
#[derive(Debug)]
pub struct VideoPipeline {
pipeline: wgpu::RenderPipeline,
bind_group_layout: wgpu::BindGroupLayout,
sampler: wgpu::Sampler,
format: wgpu::TextureFormat,
videos: BTreeMap<id::Id, VideoFrameData>,
}
pub trait WideTextureFormatExt {
fn is_wide(&self) -> bool;
}
impl WideTextureFormatExt for wgpu::TextureFormat {
fn is_wide(&self) -> bool {
matches!(
self,
wgpu::TextureFormat::Rgba16Float
| wgpu::TextureFormat::Rgba32Float
| wgpu::TextureFormat::Rgb10a2Unorm
| wgpu::TextureFormat::Rgb10a2Uint
| wgpu::TextureFormat::P010
)
}
}
impl Pipeline for VideoPipeline {
fn new(device: &wgpu::Device, queue: &wgpu::Queue, format: wgpu::TextureFormat) -> Self
where
Self: Sized,
{
if format.is_wide() {
tracing::info!("HDR texture format detected: {:?}", format);
}
let bind_group_layout = device.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
label: Some("iced-video-texture-bind-group-layout"),
entries: &[
// y
wgpu::BindGroupLayoutEntry {
binding: 0,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Texture {
multisampled: false,
view_dimension: wgpu::TextureViewDimension::D2,
sample_type: wgpu::TextureSampleType::Float { filterable: true },
},
count: None,
},
// uv
wgpu::BindGroupLayoutEntry {
binding: 1,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Texture {
multisampled: false,
view_dimension: wgpu::TextureViewDimension::D2,
sample_type: wgpu::TextureSampleType::Float { filterable: true },
},
count: None,
},
// sampler
wgpu::BindGroupLayoutEntry {
binding: 2,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
count: None,
},
// conversion matrix
wgpu::BindGroupLayoutEntry {
binding: 3,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Buffer {
ty: wgpu::BufferBindingType::Uniform,
has_dynamic_offset: false,
min_binding_size: None,
},
count: None,
},
],
});
let shader_passthrough =
device.create_shader_module(wgpu::include_wgsl!("shaders/passthrough.wgsl"));
let render_pipeline_layout =
device.create_pipeline_layout(&wgpu::PipelineLayoutDescriptor {
label: Some("iced-video-render-pipeline-layout"),
bind_group_layouts: &[&bind_group_layout],
push_constant_ranges: &[],
});
let pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
label: Some("iced-video-render-pipeline"),
layout: Some(&render_pipeline_layout),
vertex: wgpu::VertexState {
module: &shader_passthrough,
entry_point: Some("vs_main"),
buffers: &[],
compilation_options: wgpu::PipelineCompilationOptions::default(),
},
fragment: Some(wgpu::FragmentState {
module: &shader_passthrough,
entry_point: Some("fs_main"),
targets: &[Some(wgpu::ColorTargetState {
format,
blend: Some(wgpu::BlendState::REPLACE),
write_mask: wgpu::ColorWrites::ALL,
})],
compilation_options: wgpu::PipelineCompilationOptions::default(),
}),
primitive: wgpu::PrimitiveState::default(),
depth_stencil: None,
multisample: wgpu::MultisampleState::default(),
multiview: None,
cache: None,
});
let sampler = device.create_sampler(&wgpu::SamplerDescriptor {
label: Some("iced-video-sampler"),
address_mode_u: wgpu::AddressMode::ClampToEdge,
address_mode_v: wgpu::AddressMode::ClampToEdge,
address_mode_w: wgpu::AddressMode::ClampToEdge,
mag_filter: wgpu::FilterMode::Linear,
min_filter: wgpu::FilterMode::Linear,
mipmap_filter: wgpu::FilterMode::Nearest,
..Default::default()
});
Self {
pipeline,
bind_group_layout,
sampler,
format,
videos: BTreeMap::new(),
}
}
}


@@ -0,0 +1,30 @@
struct VertexOutput {
@builtin(position) clip_position: vec4<f32>,
@location(0) tex_coords: vec2<f32>,
}
@vertex
fn vs_main(
@builtin(vertex_index) in_vertex_index: u32,
) -> VertexOutput {
var out: VertexOutput;
let uv = vec2<f32>(f32((in_vertex_index << 1u) & 2u), f32(in_vertex_index & 2u));
out.clip_position = vec4<f32>(uv * 2.0 - 1.0, 0.0, 1.0);
out.clip_position.y = -out.clip_position.y;
out.tex_coords = uv;
return out;
}
@group(0) @binding(0) var y_texture: texture_2d<f32>;
@group(0) @binding(1) var uv_texture: texture_2d<f32>;
@group(0) @binding(2) var texture_sampler: sampler;
@group(0) @binding(3) var<uniform> rgb_primaries: mat3x3<f32>;
@fragment
fn fs_main(input: VertexOutput) -> @location(0) vec4<f32> {
let y = textureSample(y_texture, texture_sampler, input.tex_coords).r;
let uv = textureSample(uv_texture, texture_sampler, input.tex_coords).rg;
let yuv = vec3f(y, uv.x - 0.5, uv.y - 0.5);
return vec4f(yuv * rgb_primaries, 1.0);
}


@@ -0,0 +1,173 @@
use crate::{Error, Result, ResultExt};
use gst::{
Bus, Gst, MessageType, MessageView, Sink, Source,
app::AppSink,
caps::{Caps, CapsType},
element::ElementExt,
pipeline::PipelineExt,
playback::{PlayFlags, Playbin3},
videoconvertscale::VideoConvert,
};
use std::sync::{Arc, Mutex, atomic::AtomicBool};
#[derive(Debug, Clone)]
pub struct VideoSource {
pub(crate) playbin: Playbin3,
pub(crate) appsink: AppSink,
pub(crate) bus: Bus,
pub(crate) ready: Arc<AtomicBool>,
pub(crate) frame: Arc<Mutex<gst::Sample>>,
pub(crate) size: std::sync::OnceLock<(i32, i32)>,
}
impl VideoSource {
/// Creates a new video source from the given URL.
/// Since this doesn't have to parse the pipeline manually, we aren't sanitizing the URL for
/// now.
pub fn new(url: impl AsRef<str>) -> Result<Self> {
Gst::new();
let mut appsink = AppSink::new("iced-video-sink").change_context(Error)?;
appsink
.drop(true)
.sync(true)
// .async_(true)
.emit_signals(true);
let playbin = Playbin3::new("iced-video")
.change_context(Error)?
.with_uri(url.as_ref())
.with_buffer_duration(core::time::Duration::from_secs(2))
.with_buffer_size(4096 * 4096 * 4 * 3)
.with_ring_buffer_max_size(4096 * 4096 * 4 * 3)
.with_flags(Playbin3::default_flags() | PlayFlags::DOWNLOAD)
.with_video_sink(&appsink);
let bus = playbin.bus().change_context(Error)?;
playbin.pause().change_context(Error)?;
let ready = Arc::new(AtomicBool::new(false));
let frame = Arc::new(Mutex::new(gst::Sample::new()));
appsink.on_new_sample({
let ready = Arc::clone(&ready);
let frame = Arc::clone(&frame);
move |appsink| {
let Ok(sample) = appsink.pull_sample() else {
tracing::error!("Failed to pull video sample from appsink despite being notified of new frame");
return Ok(());
};
{
let mut guard = frame.lock().expect("BUG: Mutex poisoned");
*guard = sample; // the previously stored sample is dropped here
ready.store(true, std::sync::atomic::Ordering::Relaxed);
}
Ok(())
}
});
Ok(Self {
playbin,
appsink,
bus,
ready,
frame,
size: std::sync::OnceLock::new(),
})
}
pub async fn wait(&self) -> Result<()> {
use futures_lite::StreamExt;
// self.bus_stream()
// .for_each(|msg: gst::Message| {
// use gst::gstreamer::prelude::*;
// match msg.view() {
// MessageView::Eos(_) => {
// tracing::info!("Video reached end of stream");
// }
// MessageView::Error(err) => {
// tracing::error!(
// "Video Error from {:?}: {} ({:?})",
// err.src().map(|s| s.path_string()),
// err.error(),
// err.debug()
// );
// }
// view => tracing::info!("Video Message: {:#?}", view),
// }
// })
// .await;
self.playbin
.wait_for_states(&[gst::State::Paused, gst::State::Playing])
.await
.change_context(Error)
.attach("Failed to wait for video initialisation")?;
Ok(())
}
pub fn format(&self) -> Result<gst::VideoFormat> {
let caps = self
.appsink
.sink("sink")
.current_caps()
.change_context(Error)?;
let format = caps
.format()
.ok_or(Error)
.attach("Failed to get video caps structure")?;
Ok(format)
}
pub fn bus_stream(&self) -> impl futures_lite::Stream<Item = gst::Message> {
self.bus.stream()
}
pub fn is_playing(&self) -> Result<bool> {
let state = self.playbin.state(None).change_context(Error)?;
Ok(state == gst::State::Playing)
}
pub fn toggle(&self) -> Result<()> {
if self.is_playing()? {
self.pause()?;
} else {
self.play()?;
}
Ok(())
}
pub fn play(&self) -> Result<()> {
self.playbin
.play()
.change_context(Error)
.attach("Failed to play video")
}
pub fn pause(&self) -> Result<()> {
self.playbin
.pause()
.change_context(Error)
.attach("Failed to pause video")
}
pub fn stop(&self) -> Result<()> {
self.playbin
.stop()
.change_context(Error)
.attach("Failed to stop video")
}
pub fn size(&self) -> Result<(i32, i32)> {
if let Some(size) = self.size.get() {
return Ok(*size);
}
let caps = self
.appsink
.sink("sink")
.current_caps()
.change_context(Error)?;
let out = caps
.width()
.and_then(|width| caps.height().map(|height| (width, height)))
.ok_or(Error)
.attach("Failed to get width, height")?;
self.size.set(out);
Ok(out)
}
}


@@ -0,0 +1,258 @@
use super::*;
use iced::Length;
use iced_core as iced;
use iced_wgpu::primitive::Renderer as PrimitiveRenderer;
use std::marker::PhantomData;
/// This is the Video widget that displays a video.
/// This should be used in the view function.
pub struct Video<'a, Message, Theme = iced::Theme, Renderer = iced_wgpu::Renderer>
where
Renderer: PrimitiveRenderer,
{
id: id::Id,
handle: &'a VideoHandle<Message, Ready>,
video_format: gst::VideoFormat,
content_fit: iced::ContentFit,
width: iced::Length,
height: iced::Length,
looping: bool,
__marker: PhantomData<(Renderer, Theme)>,
}
impl<'a, Message, Theme, Renderer> Video<'a, Message, Theme, Renderer>
where
Renderer: PrimitiveRenderer,
Message: Clone + Send + Sync,
{
pub fn new(handle: &'a VideoHandle<Message, Ready>) -> Self {
Self {
id: handle.id.clone(),
handle: &handle,
video_format: handle
.format()
.expect("Failed to get video format during widget creation"),
content_fit: iced::ContentFit::Contain,
width: Length::Shrink,
height: Length::Shrink,
looping: false,
__marker: PhantomData,
}
}
}
impl<'a, Message, Theme, Renderer> Video<'a, Message, Theme, Renderer>
where
Renderer: PrimitiveRenderer,
{
pub fn width(mut self, width: Length) -> Self {
self.width = width;
self
}
pub fn height(mut self, height: Length) -> Self {
self.height = height;
self
}
pub fn content_fit(mut self, fit: iced::ContentFit) -> Self {
self.content_fit = fit;
self
}
// pub fn on_end_of_stream(mut self, message: Message) -> Self {
// self.on_end_of_stream = Some(message);
// self
// }
//
// pub fn on_new_frame(mut self, message: Message) -> Self {
// self.on_new_frame = Some(message);
// self
// }
pub fn looping(mut self, looping: bool) -> Self {
self.looping = looping;
self
}
}
impl<Message, Theme, Renderer> iced::Widget<Message, Theme, Renderer>
for Video<'_, Message, Theme, Renderer>
where
Message: Clone + Send + Sync,
Renderer: PrimitiveRenderer,
{
fn size(&self) -> iced::Size<Length> {
iced::Size {
width: self.width,
height: self.height,
}
}
// The video player should take max space by default
fn layout(
&mut self,
_tree: &mut iced::widget::Tree,
_renderer: &Renderer,
limits: &iced::layout::Limits,
) -> iced::layout::Node {
iced::layout::Node::new(limits.max())
}
fn draw(
&self,
tree: &iced::widget::Tree,
renderer: &mut Renderer,
theme: &Theme,
style: &iced::renderer::Style,
layout: iced::Layout<'_>,
cursor: iced::mouse::Cursor,
viewport: &iced::Rectangle,
) {
if let Ok((width, height)) = self.handle.source.size() {
let video_size = iced::Size {
width: width as f32,
height: height as f32,
};
let bounds = layout.bounds();
let adjusted_fit = self.content_fit.fit(video_size, bounds.size());
let scale = iced::Vector::new(
adjusted_fit.width / video_size.width,
adjusted_fit.height / video_size.height,
);
let final_size = video_size * scale;
let position = match self.content_fit {
iced::ContentFit::None => iced::Point::new(
bounds.x + (video_size.width - adjusted_fit.width) / 2.0,
bounds.y + (video_size.height - adjusted_fit.height) / 2.0,
),
_ => iced::Point::new(
bounds.center_x() - final_size.width / 2.0,
bounds.center_y() - final_size.height / 2.0,
),
};
let drawing_bounds = iced::Rectangle::new(position, final_size);
let render = |renderer: &mut Renderer| {
renderer.draw_primitive(
drawing_bounds,
primitive::VideoFrame {
id: self.id.clone(),
size: iced_wgpu::wgpu::Extent3d {
width: width as u32,
height: height as u32,
depth_or_array_layers: 1,
},
ready: Arc::clone(&self.handle.frame_ready),
frame: Arc::clone(&self.handle.source.frame),
format: self.video_format,
},
);
};
if adjusted_fit.width > bounds.width || adjusted_fit.height > bounds.height {
renderer.with_layer(bounds, render);
} else {
render(renderer);
}
}
}
fn update(
&mut self,
_tree: &mut iced_core::widget::Tree,
event: &iced::Event,
_layout: iced_core::Layout<'_>,
_cursor: iced_core::mouse::Cursor,
_renderer: &Renderer,
_clipboard: &mut dyn iced_core::Clipboard,
shell: &mut iced_core::Shell<'_, Message>,
_viewport: &iced::Rectangle,
) {
if let iced::Event::Window(iced::window::Event::RedrawRequested(when)) = event {
if self
.handle
.frame_ready
.load(std::sync::atomic::Ordering::SeqCst)
{
shell.request_redraw();
} else {
shell.request_redraw_at(iced::window::RedrawRequest::At(
iced_core::time::Instant::now() + core::time::Duration::from_millis(16)
- when.elapsed(),
));
}
}
}
}
impl<'a, Message, Theme, Renderer> From<Video<'a, Message, Theme, Renderer>>
for iced::Element<'a, Message, Theme, Renderer>
where
Message: Send + Sync + 'a + Clone,
Theme: 'a,
Renderer: 'a + iced_wgpu::primitive::Renderer,
{
fn from(video: Video<'a, Message, Theme, Renderer>) -> Self {
Self::new(video)
}
}
#[derive(Debug, Clone)]
pub struct VideoSubscription<Message> {
pub(crate) id: id::Id,
pub(crate) on_end_of_stream: Option<Box<Message>>,
pub(crate) on_new_frame: Option<Box<Message>>,
pub(crate) on_about_to_finish: Option<Box<Message>>,
// on_subtitle_text: Option<Box<dyn Fn(Option<String>) -> Message>>,
// on_error: Option<Box<dyn Fn(&glib::Error) -> Message>>,
pub(crate) bus: gst::Bus,
}
impl<Message> VideoSubscription<Message> where Message: Clone {}
impl<Message> iced_futures::subscription::Recipe for VideoSubscription<Message>
where
Message: Clone + Send + Sync + 'static,
{
type Output = Message;
fn hash(&self, state: &mut iced_futures::subscription::Hasher) {
use std::hash::Hash;
self.id.hash(state);
}
fn stream(
self: Box<Self>,
_input: core::pin::Pin<
Box<dyn iced_futures::futures::Stream<Item = iced_futures::subscription::Event> + Send>,
>,
) -> core::pin::Pin<Box<dyn iced_futures::futures::Stream<Item = Self::Output> + Send>> {
// use iced_futures::futures::StreamExt;
use futures_lite::stream::StreamExt;
Box::pin(
self.bus
.filtered_stream(&[gst::MessageType::Eos, gst::MessageType::Element])
.filter_map({
let eos = self.on_end_of_stream.clone();
let frame = self.on_new_frame.clone();
move |message: gst::Message| match message.view() {
gst::MessageView::Eos(_) => eos.clone().map(|m| *m),
gst::MessageView::Element(element_msg) => {
let structure = element_msg.structure();
if let Some(structure) = structure {
if structure.name() == "GstVideoFrameReady" {
frame.clone().map(|m| *m)
} else {
None
}
} else {
None
}
}
_ => None,
}
}),
)
}
}


@@ -5,11 +5,11 @@ edition = "2024"
[dependencies]
# gst = { workspace = true }
wgpu = "*"
gstreamer = "*"
gstreamer-video = "*"
gstreamer-app = "*"
gstreamer-base = "*"
wgpu = "27"
gstreamer = { version = "0.24.4", features = ["v1_26"] }
gstreamer-app = { version = "0.24.4", features = ["v1_26"] }
gstreamer-base = { version = "0.24.4", features = ["v1_26"] }
gstreamer-video = { version = "0.24.4", features = ["v1_26"] }
winit = { version = "*", features = ["wayland"] }
anyhow = "*"
pollster = "0.4.0"


@@ -77,7 +77,7 @@ impl State {
.await
.context("Failed to request wgpu device")?;
let surface_caps = surface.get_capabilities(&adapter);
dbg!(&surface_caps);
tracing::info!("Caps: {:#?}", &surface_caps);
let surface_format = surface_caps
.formats
.iter()
@@ -85,6 +85,7 @@ impl State {
.find(|f| f.is_hdr_format())
.expect("HDR format not supported")
.clone();
tracing::info!("Using surface format: {:?}", surface_format);
let size = window.inner_size();
let config = wgpu::SurfaceConfiguration {
usage: wgpu::TextureUsages::RENDER_ATTACHMENT,
@@ -411,9 +412,8 @@ impl State {
},
texture.size(),
);
drop(map);
// drop(buffer);
drop(frame);
// drop(map);
// drop(frame);
Ok(())
}
@@ -426,11 +426,11 @@ impl ApplicationHandler<State> for App {
let window = Arc::new(event_loop.create_window(window_attributes).unwrap());
let monitor = event_loop
.primary_monitor()
.or_else(|| window.current_monitor());
// let monitor = event_loop
// .primary_monitor()
// .or_else(|| window.current_monitor());
// window.set_fullscreen(None);
window.set_fullscreen(Some(winit::window::Fullscreen::Borderless(monitor)));
// window.set_fullscreen(Some(winit::window::Fullscreen::Borderless(monitor)));
self.state = Some(pollster::block_on(State::new(window)).expect("Failed to block"));
}
@@ -528,7 +528,7 @@ impl Video {
gst::init()?;
use gst::prelude::*;
let pipeline = gst::parse::launch(
r##"playbin3 uri=https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c video-sink="videoconvert ! video/x-raw,format=RGB10A2_LE ! appsink name=appsink""##,
r##"playbin3 uri=https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c video-sink="videoconvert ! video/x-raw,format=RGB10A2_LE ! appsink sync=true drop=true name=appsink""##
).context("Failed to parse gst pipeline")?;
let pipeline = pipeline
.downcast::<gst::Pipeline>()
@@ -544,11 +544,11 @@ impl Video {
})?;
// appsink.set_property("max-buffers", 2u32);
// appsink.set_property("emit-signals", true);
appsink.set_callbacks(
gst_app::AppSinkCallbacks::builder()
.new_sample(|_appsink| Ok(gst::FlowSuccess::Ok))
.build(),
);
// appsink.set_callbacks(
// gst_app::AppSinkCallbacks::builder()
// .new_sample(|_appsink| Ok(gst::FlowSuccess::Ok))
// .build(),
// );
let bus = pipeline.bus().context("Failed to get gst pipeline bus")?;
pipeline.set_state(gst::State::Playing)?;

flake.lock generated

@@ -3,11 +3,11 @@
"advisory-db": {
"flake": false,
"locked": {
"lastModified": 1765811277,
"narHash": "sha256-QF/aUvQwJG/ndoRZCjb+d7xASs0ELCmpqpK8u6Se2f4=",
"lastModified": 1768679419,
"narHash": "sha256-l9rM4lXBeS2mIAJsJjVfl0UABx3S3zg5tul7bv+bn50=",
"owner": "rustsec",
"repo": "advisory-db",
"rev": "2d254c1fad2260522209e9bce2fdc93012b0627f",
"rev": "c700e1cd023ca87343cbd9217d50d47023e9adc7",
"type": "github"
},
"original": {
@@ -18,11 +18,11 @@
},
"crane": {
"locked": {
"lastModified": 1765739568,
"narHash": "sha256-gQYx35Of4UDKUjAYvmxjUEh/DdszYeTtT6MDin4loGE=",
"lastModified": 1768873933,
"narHash": "sha256-CfyzdaeLNGkyAHp3kT5vjvXhA1pVVK7nyDziYxCPsNk=",
"owner": "ipetkov",
"repo": "crane",
"rev": "67d2baff0f9f677af35db61b32b5df6863bcc075",
"rev": "0bda7e7d005ccb5522a76d11ccfbf562b71953ca",
"type": "github"
},
"original": {
@@ -34,10 +34,10 @@
"crates-io-index": {
"flake": false,
"locked": {
"lastModified": 1763363725,
"narHash": "sha256-cxr5xIKZFP45yV1ZHFTB1sHo5YGiR3FA8D9vAfDizMo=",
"lastModified": 1769614137,
"narHash": "sha256-3Td8fiv6iFVxeS0hYq3xdd10ZvUkC9INMAiQx/mECas=",
"ref": "refs/heads/master",
"rev": "0382002e816a4cbd17d8d5b172f08b848aa22ff6",
"rev": "c7e7d6394bc95555d6acd5c6783855f47d64c90d",
"shallow": true,
"type": "git",
"url": "https://github.com/rust-lang/crates.io-index"
@@ -50,7 +50,9 @@
},
"crates-nix": {
"inputs": {
"crates-io-index": "crates-io-index"
"crates-io-index": [
"crates-io-index"
]
},
"locked": {
"lastModified": 1763364255,
@@ -106,11 +108,11 @@
},
"nixpkgs": {
"locked": {
"lastModified": 1765779637,
"narHash": "sha256-KJ2wa/BLSrTqDjbfyNx70ov/HdgNBCBBSQP3BIzKnv4=",
"lastModified": 1768564909,
"narHash": "sha256-Kell/SpJYVkHWMvnhqJz/8DqQg2b6PguxVWOuadbHCc=",
"owner": "nixos",
"repo": "nixpkgs",
"rev": "1306659b587dc277866c7b69eb97e5f07864d8c4",
"rev": "e4bae1bd10c9c57b2cf517953ab70060a828ee6f",
"type": "github"
},
"original": {
@@ -124,6 +126,7 @@
"inputs": {
"advisory-db": "advisory-db",
"crane": "crane",
"crates-io-index": "crates-io-index",
"crates-nix": "crates-nix",
"flake-utils": "flake-utils",
"nix-github-actions": "nix-github-actions",
@@ -138,11 +141,11 @@
]
},
"locked": {
"lastModified": 1765852971,
"narHash": "sha256-rQdOMqfQNhcfqvh1dFIVWh09mrIWwerUJqqBdhIsf8g=",
"lastModified": 1768877311,
"narHash": "sha256-abSDl0cNr0B+YCsIDpO1SjXD9JMxE4s8EFnhLEFVovI=",
"owner": "oxalica",
"repo": "rust-overlay",
"rev": "5f98ccecc9f1bc1c19c0a350a659af1a04b3b319",
"rev": "59e4ab96304585fde3890025fd59bd2717985cc1",
"type": "github"
},
"original": {


@@ -9,7 +9,14 @@
url = "github:nix-community/nix-github-actions";
inputs.nixpkgs.follows = "nixpkgs";
};
crates-nix.url = "github:uttarayan21/crates.nix";
crates-io-index = {
url = "git+https://github.com/rust-lang/crates.io-index?shallow=1";
flake = false;
};
crates-nix = {
url = "github:uttarayan21/crates.nix";
inputs.crates-io-index.follows = "crates-io-index";
};
rust-overlay = {
url = "github:oxalica/rust-overlay";
inputs.nixpkgs.follows = "nixpkgs";
@@ -35,6 +42,7 @@
system: let
pkgs = import nixpkgs {
inherit system;
config.allowUnfree = true;
overlays = [
rust-overlay.overlays.default
];
@@ -87,6 +95,8 @@
glib
glib-networking
wrapGAppsHook4
# bzip2_1_1
# libsysprof-capture
# pcre2
@@ -100,6 +110,7 @@
++ (lib.optionals pkgs.stdenv.isLinux [
gst_all_1.gstreamermm
gst_all_1.gst-vaapi
cudatoolkit
# util-linux
# libselinux
@@ -175,34 +186,49 @@
devShells = rec {
rust-shell =
pkgs.mkShell.override {
stdenv =
if pkgs.stdenv.isLinux
then (pkgs.stdenvAdapters.useMoldLinker pkgs.clangStdenv)
else pkgs.clangStdenv;
} (commonArgs
stdenv = pkgs.clangStdenv;
# if pkgs.stdenv.isLinux
# then (pkgs.stdenvAdapters.useMoldLinker pkgs.clangStdenv)
# else pkgs.clangStdenv;
}
(commonArgs
// {
GST_PLUGIN_PATH = "/run/current-system/sw/lib/gstreamer-1.0/";
# GST_PLUGIN_PATH = "/run/current-system/sw/lib/gstreamer-1.0/";
GIO_EXTRA_MODULES = "${pkgs.glib-networking}/lib/gio/modules";
packages = with pkgs;
[
toolchainWithRustAnalyzer
cargo-nextest
bacon
cargo-audit
cargo-deny
cargo-expand
bacon
cargo-make
cargo-hack
cargo-make
cargo-nextest
cargo-outdated
lld
lldb
cargo-flamegraph
(crates.buildCrate "cargo-with" {doCheck = false;})
(crates.buildCrate "dioxus-cli" {
nativeBuildInputs = with pkgs; [pkg-config];
buildInputs = [openssl];
doCheck = false;
})
(crates.buildCrate "cargo-hot" {
nativeBuildInputs = with pkgs; [pkg-config];
buildInputs = [openssl];
})
]
++ (lib.optionals pkgs.stdenv.isDarwin [
apple-sdk_26
])
++ (lib.optionals pkgs.stdenv.isLinux [
ffmpeg
heaptrack
samply
cargo-flamegraph
perf
mold
# mold
]);
});
default = rust-shell;


@@ -1,19 +1,24 @@
[package]
name = "gst"
version = "0.1.0"
edition = "2021"
edition = "2024"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
error-stack = "0.6"
futures = "0.3.31"
futures-lite = "2.6.1"
glib = "0.21.5"
gstreamer = { version = "0.24.4", features = ["v1_18"] }
gstreamer-app = { version = "0.24.4", features = ["v1_18"] }
gstreamer-video = { version = "0.24.4", features = ["v1_18"] }
glib-sys = "0.21.5"
gstreamer = { version = "0.24.4", features = ["v1_26"] }
gstreamer-app = { version = "0.24.4", features = ["v1_26"] }
gstreamer-video = { version = "0.24.4", features = ["v1_26"] }
gstreamer-base = { version = "0.24.4", features = ["v1_26"] }
thiserror = "2.0"
tracing = { version = "0.1", features = ["log"] }
wgpu = { version = "27.0.1", default-features = false }
bitflags = "2.10.0"
[dev-dependencies]
smol = "2.0.2"
tracing-subscriber = "0.3.22"


@@ -1,41 +1,28 @@
use crate::*;
use crate::priv_prelude::*;
pub struct Bin {
inner: gstreamer::Bin,
}
wrap_gst!(Bin);
parent_child!(Element, Bin);
impl IsElement for Bin {
fn as_element(&self) -> &Element {
let element = self.inner.upcast_ref::<gstreamer::Element>();
unsafe { core::mem::transmute(element) }
}
fn into_element(self) -> Element {
Element {
inner: self.inner.into(),
}
}
}
impl Bin {
pub fn new(name: impl AsRef<str>) -> Self {
let bin = gstreamer::Bin::with_name(name.as_ref());
Bin { inner: bin }
}
pub fn add(&mut self, element: impl IsElement) -> Result<&mut Self> {
pub fn add(&mut self, element: &impl ChildOf<Element>) -> Result<&mut Self> {
self.inner
.add(&element.as_element().inner)
.add(&element.upcast_ref().inner)
.change_context(Error)
.attach("Failed to add element to bin")?;
Ok(self)
}
pub fn add_many<'a, E: IsElement + 'a>(
pub fn add_many<'a, E: ChildOf<Element> + 'a>(
&mut self,
elements: impl IntoIterator<Item = &'a E>,
) -> Result<&mut Self> {
self.inner
.add_many(elements.into_iter().map(|e| &e.as_element().inner))
.add_many(elements.into_iter().map(|e| &e.upcast_ref().inner))
.change_context(Error)
.attach("Failed to add elements to bin")?;
Ok(self)

gst/src/bus.rs Normal file

@@ -0,0 +1,27 @@
use crate::priv_prelude::*;
wrap_gst!(Bus);
impl Bus {
pub fn iter_timed(
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> gstreamer::bus::Iter<'_> {
let clocktime = match timeout.into() {
Some(dur) => gstreamer::ClockTime::try_from(dur).ok(),
None => gstreamer::ClockTime::NONE,
};
self.inner.iter_timed(clocktime)
}
pub fn stream(&self) -> gstreamer::bus::BusStream {
self.inner.stream()
}
pub fn filtered_stream<'a>(
&self,
msg_types: &'a [gstreamer::MessageType],
) -> impl futures::stream::FusedStream<Item = gstreamer::Message> + Unpin + Send + 'a {
self.inner.stream_filtered(msg_types)
}
}
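
A minimal sketch of how the filtered stream is meant to be consumed, assuming futures_lite (already a dependency in gst/Cargo.toml) and the re-exported MessageView; the loop ends on the first EOS or error.

async fn wait_for_eos(bus: &Bus) {
    use futures_lite::stream::StreamExt as _;
    // Only wake up for the two message types we care about.
    let types = [gstreamer::MessageType::Eos, gstreamer::MessageType::Error];
    let mut messages = bus.filtered_stream(&types);
    while let Some(msg) = messages.next().await {
        match msg.view() {
            gstreamer::MessageView::Eos(..) => break,
            gstreamer::MessageView::Error(err) => {
                tracing::error!("bus error: {} ({:?})", err.error(), err.debug());
                break;
            }
            _ => {}
        }
    }
}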


@@ -1,4 +1,6 @@
use crate::*;
use gstreamer::Fraction;
#[derive(Debug, Clone)]
#[repr(transparent)]
pub struct Caps {
pub(crate) inner: gstreamer::caps::Caps,
@@ -16,7 +18,6 @@ pub struct CapsBuilder {
impl CapsBuilder {
pub fn field<V: Into<glib::Value> + Send>(mut self, name: impl AsRef<str>, value: V) -> Self {
use gstreamer::prelude::*;
self.inner = self.inner.field(name.as_ref(), value);
self
}
@@ -51,3 +52,27 @@ impl CapsBuilder {
}
}
}
impl Caps {
pub fn format(&self) -> Option<gstreamer_video::VideoFormat> {
self.inner
.structure(0)
.and_then(|s| s.get::<&str>("format").ok())
.map(|s| gstreamer_video::VideoFormat::from_string(s))
}
pub fn width(&self) -> Option<i32> {
self.inner
.structure(0)
.and_then(|s| s.get::<i32>("width").ok())
}
pub fn height(&self) -> Option<i32> {
self.inner
.structure(0)
.and_then(|s| s.get::<i32>("height").ok())
}
pub fn framerate(&self) -> Option<gstreamer::Fraction> {
self.inner
.structure(0)
.and_then(|s| s.get::<Fraction>("framerate").ok())
}
}
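
These getters are what test_appsink_metadata further down relies on. A small sketch, assuming the AppSink and Pad wrappers from later in this change are in scope and the sink pad has already negotiated caps:

fn negotiated_video_size(appsink: &AppSink) -> Option<(i32, i32)> {
    // `pad` comes from ElementExt, `current_caps` from the Pad wrapper.
    let pad = appsink.pad("sink")?;
    let caps = pad.current_caps().ok()?;
    Some((caps.width()?, caps.height()?))
}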


@@ -1,52 +1,133 @@
use crate::{Error, Pad, Result, ResultExt};
#[repr(transparent)]
pub struct Element {
pub(crate) inner: gstreamer::Element,
}
use crate::priv_prelude::*;
use crate::wrap_gst;
pub trait IsElement {
fn as_element(&self) -> &Element;
fn into_element(self) -> Element;
fn pad(&self, name: &str) -> Option<Pad> {
wrap_gst!(Element, gstreamer::Element);
// pub trait IsElement {
// fn upcast_ref(&self) -> &Element;
// fn into_element(self) -> Element;
// fn pad(&self, name: &str) -> Option<Pad> {
// use gstreamer::prelude::*;
// self.upcast_ref().inner.static_pad(name).map(Pad::from)
// }
// }
// impl IsElement for Element {
// fn upcast_ref(&self) -> &Element {
// self
// }
//
// fn into_element(self) -> Element {
// self
// }
// }
impl Element {
pub fn pad(&self, name: impl AsRef<str>) -> Option<Pad> {
use gstreamer::prelude::*;
self.as_element().inner.static_pad(name).map(Pad::from)
}
}
impl IsElement for Element {
fn as_element(&self) -> &Element {
self
self.inner.static_pad(name.as_ref()).map(Pad::from)
}
fn into_element(self) -> Element {
self
}
}
pub trait Sink: IsElement {
fn sink_pad(&self) -> Pad {
pub fn bus(&self) -> Result<Bus> {
use gstreamer::prelude::*;
self.as_element()
.pad("sink")
.map(From::from)
self.inner
.bus()
.map(Bus::from)
.ok_or(Error)
.attach_with(|| format!("Failed to get bus from Element: {}", self.inner.name()))
}
}
pub trait Sink: ChildOf<Element> {
fn sink(&self, name: impl AsRef<str>) -> Pad {
self.upcast_ref()
.pad(name)
.expect("Sink element has no sink pad")
}
}
pub trait Source: IsElement {
fn source_pad(&self) -> Pad {
use gstreamer::prelude::*;
self.as_element()
.pad("src")
.map(From::from)
pub trait Source: ChildOf<Element> {
fn source(&self, name: impl AsRef<str>) -> Pad {
self.upcast_ref()
.pad(name)
.expect("Source element has no src pad")
}
fn link<S: Sink>(&self, sink: &S) -> Result<()> {
fn link<S: Sink>(&self, sink: &S) -> Result<Bin>
where
Self: Sized,
{
use gstreamer::prelude::*;
self.as_element()
.inner
.link(&sink.as_element().inner)
if let Ok(bin) = self.upcast_ref().inner.clone().downcast::<gstreamer::Bin>() {
bin.add(&sink.upcast_ref().inner)
.change_context(Error)
.attach("Failed to link source to sink")
.attach("Failed to add sink to bin")?;
self.upcast_ref()
.inner
.link(&sink.upcast_ref().inner)
.change_context(Error)
.attach("Failed to link elements")?;
Ok(Bin::from(bin))
} else {
let bin = gstreamer::Bin::builder()
.name(format!(
"{}-link-{}",
self.upcast_ref().inner.name(),
sink.upcast_ref().inner.name()
))
.build();
bin.add(&self.upcast_ref().inner)
.change_context(Error)
.attach("Failed to add source to bin")?;
bin.add(&sink.upcast_ref().inner)
.change_context(Error)
.attach("Failed to add sink to bin")?;
self.upcast_ref()
.inner
.link(&sink.upcast_ref().inner)
.change_context(Error)
.attach("Failed to link elements")?;
if let Some(sink_pad) = self.upcast_ref().pad("sink") {
let ghost_pad = Pad::ghost(&sink_pad)?;
bin.add_pad(&ghost_pad.inner)
.change_context(Error)
.attach("Failed to add src pad to bin")?;
ghost_pad.activate(true)?;
}
Ok(From::from(bin))
}
}
// fn link_pad<S: Sink>(&self, sink: &S, src_pad_name: &str, sink_pad_name: &str) -> Result<()> {
// use gstreamer::prelude::*;
// let src_pad = self
// .upcast_ref()
// .pad(src_pad_name)
// .ok_or(Error)
// .attach("Source pad not found")?;
// let sink_pad = sink
// .upcast_ref()
// .pad(sink_pad_name)
// .ok_or(Error)
// .attach("Sink pad not found")?;
// src_pad
// .inner
// .link(&sink_pad.inner)
// .change_context(Error)
// .attach("Failed to link source pad to sink pad")?;
// Ok(())
// }
}
pub trait ElementExt: ChildOf<Element> + Sync {
#[track_caller]
fn bus(&self) -> Result<Bus> {
self.upcast_ref().bus()
}
#[track_caller]
fn pad(&self, name: impl AsRef<str>) -> Option<Pad> {
self.upcast_ref().pad(name)
}
}
impl<T: ChildOf<Element> + Sync> ElementExt for T {}
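
A sketch of the new Source::link flow under the ChildOf bound: linking a converter into an appsink hands back a Bin that can later be set as playbin's video-sink, which is what the appsink tests further down do. Element names here are arbitrary.

fn build_video_sink() -> Result<Bin> {
    let convert = plugins::videoconvertscale::VideoConvert::new("example-convert")?;
    let appsink = plugins::app::AppSink::new("example-appsink")?;
    // `link` wraps both elements in a new Bin and ghost-pads the converter's sink pad.
    convert.link(&appsink)
}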


@@ -1,24 +1,51 @@
pub mod bin;
pub mod bus;
pub mod caps;
pub mod element;
pub mod errors;
pub mod pad;
pub mod pipeline;
pub mod plugins;
// pub mod playbin3;
// pub mod videoconvert;
#[macro_use]
pub mod wrapper;
pub mod sample;
pub use bin::*;
pub use bus::*;
pub use caps::*;
pub use element::*;
pub use gstreamer;
#[doc(inline)]
pub use gstreamer::{Message, MessageType, MessageView, State};
pub use gstreamer_video::VideoFormat;
pub use pad::*;
pub use pipeline::*;
pub use plugins::*;
// pub use playbin3::*;
// pub use videoconvert::*;
pub use sample::*;
pub(crate) mod priv_prelude {
pub use crate::errors::*;
pub use crate::wrapper::*;
pub use crate::*;
pub use gstreamer::prelude::ElementExt as _;
pub use gstreamer::prelude::*;
#[track_caller]
pub fn duration_to_clocktime(
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<Option<gstreamer::ClockTime>> {
match timeout.into() {
Some(dur) => {
let clocktime = gstreamer::ClockTime::try_from(dur)
.change_context(Error)
.attach("Failed to convert duration to ClockTime")?;
Ok(Some(clocktime))
}
None => Ok(gstreamer::ClockTime::NONE),
}
}
}
use errors::*;
use gstreamer::prelude::*;
use std::sync::Arc;
static GST: std::sync::LazyLock<std::sync::Arc<Gst>> = std::sync::LazyLock::new(|| {
gstreamer::init().expect("Failed to initialize GStreamer");
std::sync::Arc::new(Gst {
@@ -26,7 +53,6 @@ static GST: std::sync::LazyLock<std::sync::Arc<Gst>> = std::sync::LazyLock::new(
})
});
/// This should be a global singleton
pub struct Gst {
__private: core::marker::PhantomData<()>,
}
@@ -35,283 +61,4 @@ impl Gst {
pub fn new() -> Arc<Self> {
Arc::clone(&GST)
}
pub fn pipeline_from_str(&self, s: &str) -> Result<Pipeline> {
let pipeline = gstreamer::parse::launch(s).change_context(Error)?;
let pipeline = pipeline.downcast::<gstreamer::Pipeline>();
let pipeline = match pipeline {
Err(_e) => return Err(Error).attach("Failed to downcast to Pipeline"),
Ok(p) => p,
};
Ok(Pipeline { inner: pipeline })
}
}
pub struct Pipeline {
inner: gstreamer::Pipeline,
}
impl core::fmt::Debug for Pipeline {
fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
f.debug_struct("Pipeline")
.field("pipeline", &self.inner)
// .field("state", &self.pipeline.state(gstreamer::ClockTime::NONE))
.finish()
}
}
impl Drop for Pipeline {
fn drop(&mut self) {
let _ = self.inner.set_state(gstreamer::State::Null);
}
}
impl Pipeline {
pub fn bus(&self) -> Result<Bus> {
let bus = self
.inner
.bus()
.ok_or(Error)
.attach("Failed to get bus from pipeline")?;
Ok(Bus { bus })
}
pub fn play(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Playing)
.change_context(Error)
.attach("Failed to set pipeline to Playing state")?;
Ok(())
}
pub fn pause(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Paused)
.change_context(Error)
.attach("Failed to set pipeline to Paused state")?;
Ok(())
}
pub fn ready(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Ready)
.change_context(Error)
.attach("Failed to set pipeline to Paused state")?;
Ok(())
}
pub unsafe fn set_state(
&self,
state: gstreamer::State,
) -> Result<gstreamer::StateChangeSuccess> {
let result = self
.inner
.set_state(state)
.change_context(Error)
.attach("Failed to set pipeline state")?;
Ok(result)
}
}
pub struct Bus {
bus: gstreamer::Bus,
}
impl Bus {
pub fn iter_timed(
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> gstreamer::bus::Iter<'_> {
let clocktime = match timeout.into() {
Some(dur) => gstreamer::ClockTime::try_from(dur).ok(),
None => gstreamer::ClockTime::NONE,
};
self.bus.iter_timed(clocktime)
}
pub fn stream(&self) -> gstreamer::bus::BusStream {
self.bus.stream()
}
}
pub struct Playbin3Builder {
uri: Option<String>,
video_sink: Option<Element>,
audio_sink: Option<Element>,
text_sink: Option<Element>,
}
#[test]
fn gst_parse_pipeline() {
let gst = Gst::new();
let pipeline = gst
.pipeline_from_str("videotestsrc ! autovideosink")
.expect("Failed to create pipeline");
println!("{:?}", pipeline);
}
#[test]
fn gst_parse_invalid_pipeline() {
let gst = Gst::new();
let result = gst.pipeline_from_str("invalidpipeline");
assert!(result.is_err());
}
#[test]
fn gst_play_pipeline() {
let gst = Gst::new();
let pipeline = gst
.pipeline_from_str("videotestsrc ! autovideosink")
.expect("Failed to create pipeline");
let bus = pipeline.bus().expect("Failed to get bus from pipeline");
pipeline
.play()
.expect("Unable to set the pipeline to the `Playing` state");
for msg in bus.iter_timed(None) {
use gstreamer::MessageView;
match msg.view() {
MessageView::Eos(..) => break,
MessageView::Error(err) => {
eprintln!(
"Error from {:?}: {} ({:?})",
err.src().map(|s| s.path_string()),
err.error(),
err.debug()
);
break;
}
_ => (),
}
}
}
#[test]
#[ignore]
fn gstreamer_unwrapped() {
gstreamer::init();
let uri = "https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm";
let pipeline = gstreamer::parse::launch(&format!("playbin uri={}", uri)).unwrap();
use gstreamer::prelude::*;
pipeline.set_state(gstreamer::State::Playing).unwrap();
let bus = pipeline.bus().unwrap();
for msg in bus.iter_timed(gstreamer::ClockTime::NONE) {
use gstreamer::MessageView;
match msg.view() {
MessageView::Eos(..) => break,
MessageView::Error(err) => {
eprintln!(
"Error from {:?}: {} ({:?})",
err.src().map(|s| s.path_string()),
err.error(),
err.debug()
);
break;
}
_ => (),
}
}
pipeline.set_state(gstreamer::State::Null).unwrap();
}
#[test]
fn test_appsink() {
let gst = Gst::new();
let pipeline = gst
.pipeline_from_str(
"videotestsrc ! videoconvert | capsfilter name=video-filter ! appsink name=video-sink",
)
.expect("Failed to create pipeline");
// let video_sink = pipeline.
let bus = pipeline.bus().expect("Failed to get bus from pipeline");
let sink = pipeline
.inner
.by_name("video-sink")
.expect("Sink not found")
.downcast::<gstreamer_app::AppSink>()
.expect("Failed to downcast to AppSink");
let capsfilter = pipeline
.inner
.by_name("video-filter")
.expect("Capsfilter not found");
let caps = gstreamer::Caps::builder("video/x-raw")
.field("format", "RGBA")
.build();
capsfilter.set_property("caps", &caps);
sink.set_callbacks(
gstreamer_app::AppSinkCallbacks::builder()
.new_sample(|sink| {
// foo
Ok(gstreamer::FlowSuccess::Ok)
})
.build(),
);
pipeline
.play()
.expect("Unable to set the pipeline to the `Playing` state");
for msg in bus.iter_timed(None) {
use gstreamer::MessageView;
match msg.view() {
MessageView::Eos(..) => break,
MessageView::Error(err) => {
eprintln!(
"Error from {:?}: {} ({:?})",
err.src().map(|s| s.path_string()),
err.error(),
err.debug()
);
break;
}
_ => (),
}
}
}
#[test]
fn gst_test_manual_pipeline() {
use gstreamer as gst;
use gstreamer::prelude::*;
// Initialize GStreamer
gst::init().unwrap();
// Create a new pipeline
let pipeline = gst::Pipeline::new();
// Create elements for the pipeline
let src = gst::ElementFactory::make("videotestsrc").build().unwrap();
let sink = gst::ElementFactory::make("autovideosink").build().unwrap();
// Add elements to the pipeline
pipeline.add_many(&[&src, &sink]).unwrap();
// Link elements together
src.link(&sink).unwrap();
// Set the pipeline to the playing state
pipeline.set_state(gst::State::Playing).unwrap();
// Start the main event loop
// let main_loop = glib::MainLoop::new(None, false);
// main_loop.run();
// Shut down the pipeline and GStreamer
let bus = pipeline.bus().unwrap();
let messages = bus.iter_timed(gst::ClockTime::NONE);
for msg in messages {
dbg!(msg);
}
pipeline.set_state(gst::State::Null).unwrap();
}


@@ -1,17 +1,9 @@
use crate::*;
/// Pads are link points between elements
#[repr(transparent)]
pub struct Pad {
pub(crate) inner: gstreamer::Pad,
}
use crate::priv_prelude::*;
impl From<gstreamer::Pad> for Pad {
fn from(inner: gstreamer::Pad) -> Self {
Pad { inner }
}
}
wrap_gst!(Pad, gstreamer::Pad);
impl Pad {
#[track_caller]
pub fn ghost(target: &Pad) -> Result<Pad> {
let ghost_pad = gstreamer::GhostPad::with_target(&target.inner)
.change_context(Error)
@@ -20,6 +12,28 @@ impl Pad {
inner: ghost_pad.upcast(),
})
}
#[track_caller]
pub fn link(&self, peer: &Pad) -> Result<()> {
use gstreamer::prelude::*;
self.inner
.link(&peer.inner)
.change_context(Error)
.attach("Failed to link pads")?;
Ok(())
}
#[track_caller]
pub fn current_caps(&self) -> Result<Caps> {
let caps = self
.inner
.current_caps()
.ok_or(Error)
.attach("Failed to get pad caps")?;
Ok(Caps { inner: caps })
}
#[track_caller]
pub fn activate(&self, activate: bool) -> Result<()> {
use gstreamer::prelude::*;
self.inner

gst/src/pipeline.rs Normal file

@@ -0,0 +1,211 @@
use crate::priv_prelude::*;
wrap_gst!(Pipeline);
parent_child!(Element, Pipeline);
parent_child!(Bin, Pipeline);
impl Drop for Pipeline {
fn drop(&mut self) {
let _ = self.inner.set_state(gstreamer::State::Null);
}
}
impl Pipeline {
#[track_caller]
pub fn bus(&self) -> Result<Bus> {
let bus = self
.inner
.bus()
.ok_or(Error)
.attach("Failed to get bus from pipeline")?;
Ok(Bus::from_gst(bus))
}
/// Get the state
pub fn state(
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<gstreamer::State> {
let (result, current, _pending) = self.inner.state(duration_to_clocktime(timeout)?);
result.change_context(Error).attach("Failed to get state")?;
Ok(current)
}
pub fn play(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Playing)
.change_context(Error)
.attach("Failed to set pipeline to Playing state")?;
Ok(())
}
pub fn pause(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Paused)
.change_context(Error)
.attach("Failed to set pipeline to Paused state")?;
Ok(())
}
pub fn ready(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Ready)
.change_context(Error)
.attach("Failed to set pipeline to Ready state")?;
Ok(())
}
pub fn stop(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Null)
.change_context(Error)
.attach("Failed to set pipeline to Null state")?;
Ok(())
}
pub fn set_state(&self, state: gstreamer::State) -> Result<gstreamer::StateChangeSuccess> {
let result = self
.inner
.set_state(state)
.change_context(Error)
.attach("Failed to set pipeline state")?;
Ok(result)
}
pub async fn wait_for(&self, state: gstreamer::State) -> Result<()> {
let current_state = self.state(core::time::Duration::ZERO)?;
if current_state == state {
Ok(())
} else {
// use futures::stream::StreamExt;
use futures_lite::stream::StreamExt as _;
self.bus()?
.filtered_stream(&[MessageType::StateChanged])
.find(|message: &gstreamer::Message| {
let view = message.view();
if let gstreamer::MessageView::StateChanged(changed) = view {
changed.current() == state
&& changed.src().is_some_and(|s| s == &self.inner)
} else {
false
}
})
.await;
Ok(())
}
}
pub async fn wait_for_states(&self, states: impl AsRef<[gstreamer::State]>) -> Result<()> {
let current_state = self.state(core::time::Duration::ZERO)?;
let states = states.as_ref();
if states.contains(&current_state) {
Ok(())
} else {
use futures_lite::stream::StreamExt as _;
self.bus()?
.filtered_stream(&[MessageType::StateChanged])
.find(|message: &gstreamer::Message| {
let view = message.view();
if let gstreamer::MessageView::StateChanged(changed) = view {
states.contains(&changed.current())
&& changed.src().is_some_and(|s| s == &self.inner)
} else {
false
}
})
.await;
Ok(())
}
}
pub async fn wait_for_message<'a, F2>(
&self,
filter: Option<&'a [gstreamer::MessageType]>,
filter_fn: F2,
) -> Result<gstreamer::Message>
where
F2: Fn(&gstreamer::Message) -> bool + Send + 'a,
{
use futures_lite::stream::StreamExt as _;
match filter {
Some(filter) => {
let message = self.bus()?.filtered_stream(filter).find(filter_fn).await;
match message {
Some(msg) => Ok(msg),
None => {
Err(Error).attach("Failed to find message matching the provided filter")
}
}
}
None => {
let message = self.bus()?.stream().find(filter_fn).await;
match message {
Some(msg) => Ok(msg),
None => {
Err(Error).attach("Failed to find message matching the provided filter")
}
}
}
}
}
}
pub trait PipelineExt: ChildOf<Pipeline> + Sync {
// #[track_caller]
// fn bus(&self) -> Result<Bus> {
// self.upcast_ref().bus()
// }
#[track_caller]
fn play(&self) -> Result<()> {
self.upcast_ref().play()
}
#[track_caller]
fn pause(&self) -> Result<()> {
self.upcast_ref().pause()
}
#[track_caller]
fn ready(&self) -> Result<()> {
self.upcast_ref().ready()
}
#[track_caller]
fn stop(&self) -> Result<()> {
self.upcast_ref().stop()
}
#[track_caller]
fn set_state(&self, state: gstreamer::State) -> Result<gstreamer::StateChangeSuccess> {
self.upcast_ref().set_state(state)
}
#[track_caller]
fn state(&self, timeout: impl Into<Option<core::time::Duration>>) -> Result<gstreamer::State> {
self.upcast_ref().state(timeout)
}
fn wait_for(
&self,
state: gstreamer::State,
) -> impl std::future::Future<Output = Result<()>> + Send {
self.upcast_ref().wait_for(state)
}
fn wait_for_states(
&self,
states: impl AsRef<[gstreamer::State]> + Send,
) -> impl std::future::Future<Output = Result<()>> + Send {
self.upcast_ref().wait_for_states(states)
}
fn wait_for_message<'a, F2>(
&self,
filter: Option<&'a [gstreamer::MessageType]>,
filter_fn: F2,
) -> impl std::future::Future<Output = Result<gstreamer::Message>> + Send
where
F2: Fn(&gstreamer::Message) -> bool + Send + 'a,
{
self.upcast_ref().wait_for_message(filter, filter_fn)
}
}
impl<T: ChildOf<Pipeline> + Sync> PipelineExt for T {}
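
The async helpers compose; as a hedged sketch, waiting for the first buffering message with wait_for_message looks roughly like this (any ChildOf<Pipeline> such as Playbin3 below gets the same method through PipelineExt):

async fn wait_for_buffering(pipeline: &Pipeline) -> Result<gstreamer::Message> {
    pipeline
        .wait_for_message(Some(&[MessageType::Buffering]), |msg| {
            matches!(msg.view(), gstreamer::MessageView::Buffering(_))
        })
        .await
}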


@@ -1,29 +1,83 @@
use crate::*;
use crate::priv_prelude::*;
pub struct AppSink {
inner: gstreamer::Element,
#[doc(inline)]
pub use gstreamer_app::AppSinkCallbacks;
wrap_gst!(AppSink, gstreamer::Element);
parent_child!(Element, AppSink);
pub struct AppSinkBuilder {
inner: AppSink,
callbacks: Option<gstreamer_app::app_sink::AppSinkCallbacksBuilder>,
}
impl IsElement for AppSink {
fn as_element(&self) -> &Element {
unsafe { core::mem::transmute(&self.inner) }
impl AppSinkBuilder {
pub fn on_new_sample<F>(mut self, mut f: F) -> Self
where
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
{
let mut callbacks_builder = self
.callbacks
.take()
.unwrap_or_else(gstreamer_app::app_sink::AppSinkCallbacks::builder);
callbacks_builder = callbacks_builder.new_sample(move |appsink| {
use glib::object::Cast;
let element = appsink.upcast_ref::<gstreamer::Element>();
let appsink = AppSink::from_gst_ref(element);
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
.unwrap_or(Err(gstreamer::FlowError::Error))
.map(|_| gstreamer::FlowSuccess::Ok)
});
self.callbacks = Some(callbacks_builder);
self
}
fn into_element(self) -> Element {
Element { inner: self.inner }
pub fn on_new_preroll<F>(mut self, mut f: F) -> Self
where
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
{
let mut callbacks_builder = self
.callbacks
.take()
.unwrap_or_else(gstreamer_app::app_sink::AppSinkCallbacks::builder);
callbacks_builder = callbacks_builder.new_preroll(move |appsink| {
use glib::object::Cast;
let element = appsink.upcast_ref::<gstreamer::Element>();
let appsink = AppSink::from_gst_ref(element);
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
.unwrap_or(Err(gstreamer::FlowError::Error))
.map(|_| gstreamer::FlowSuccess::Ok)
});
self.callbacks = Some(callbacks_builder);
self
}
pub fn build(self) -> AppSink {
let AppSinkBuilder { inner, callbacks } = self;
if let Some(callbacks) = callbacks {
inner.appsink().set_callbacks(callbacks.build());
}
inner
}
}
impl Sink for AppSink {}
impl AppSink {
pub fn builder(name: impl AsRef<str>) -> AppSinkBuilder {
let inner = AppSink::new(name).expect("Failed to create AppSink");
AppSinkBuilder {
inner,
callbacks: None,
}
}
fn appsink(&self) -> &gstreamer_app::AppSink {
self.inner
.downcast_ref::<gstreamer_app::AppSink>()
.expect("Failed to downcast to AppSink")
}
pub fn new(name: impl AsRef<str>) -> Result<Self> {
use gstreamer::prelude::*;
let inner = gstreamer::ElementFactory::make("appsink")
.name(name.as_ref())
.build()
@@ -32,19 +86,55 @@ impl AppSink {
Ok(AppSink { inner })
}
pub fn with_caps(mut self, caps: &gstreamer::Caps) -> Self {
use gstreamer::prelude::*;
// self.inner.set_caps(Some(caps));
pub fn emit_signals(&mut self, emit: bool) -> &mut Self {
self.inner.set_property("emit-signals", emit);
self
}
pub fn set_callbacks(&self, callbacks: gstreamer_app::AppSinkCallbacks) -> Result<()> {
self.appsink().set_callbacks(callbacks);
Ok(())
pub fn async_(&mut self, async_: bool) -> &mut Self {
self.inner.set_property("async", async_);
self
}
pub fn pull_sample(&self, timeout: impl Into<Option<core::time::Duration>>) -> Result<Sample> {
use gstreamer::prelude::*;
pub fn sync(&mut self, sync: bool) -> &mut Self {
self.inner.set_property("sync", sync);
self
}
pub fn drop(&mut self, drop: bool) -> &mut Self {
self.inner.set_property("drop", drop);
self
}
pub fn caps(&mut self, caps: Caps) -> &mut Self {
self.inner.set_property("caps", caps.inner);
self
}
pub fn callbacks(&mut self, callbacks: gstreamer_app::AppSinkCallbacks) -> &mut Self {
self.appsink().set_callbacks(callbacks);
self
}
pub fn on_new_sample<F>(&mut self, mut f: F) -> &mut Self
where
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
{
self.emit_signals(true).callbacks(
AppSinkCallbacks::builder()
.new_sample(move |appsink| {
use glib::object::Cast;
let element = appsink.upcast_ref::<gstreamer::Element>();
let appsink = AppSink::from_gst_ref(element);
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
.unwrap_or(Err(gstreamer::FlowError::Error))
.map(|_| gstreamer::FlowSuccess::Ok)
})
.build(),
)
}
pub fn pull_sample(&self) -> Result<Sample> {
self.appsink()
.pull_sample()
.change_context(Error)
@@ -55,15 +145,13 @@ impl AppSink {
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<Option<Sample>> {
use gstreamer::prelude::*;
Ok(self
.appsink()
.try_pull_sample(duration_to_clocktime(timeout)?)
.map(From::from))
}
pub fn pull_preroll(&self, timeout: impl Into<Option<core::time::Duration>>) -> Result<Sample> {
use gstreamer::prelude::*;
pub fn pull_preroll(&self) -> Result<Sample> {
self.appsink()
.pull_preroll()
.change_context(Error)
@@ -75,7 +163,6 @@ impl AppSink {
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<Option<Sample>> {
use gstreamer::prelude::*;
Ok(self
.appsink()
.try_pull_preroll(duration_to_clocktime(timeout)?)
@@ -83,26 +170,109 @@ impl AppSink {
}
}
fn duration_to_clocktime(
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<Option<gstreamer::ClockTime>> {
match timeout.into() {
Some(dur) => {
let clocktime = gstreamer::ClockTime::try_from(dur)
.change_context(Error)
.attach("Failed to convert duration to ClockTime")?;
Ok(Some(clocktime))
#[test]
fn test_appsink() {
use gstreamer::prelude::*;
use tracing_subscriber::prelude::*;
tracing_subscriber::registry()
.with(
tracing_subscriber::fmt::layer()
.with_thread_ids(true)
.with_file(true),
)
.init();
tracing::info!("Linking videoconvert to appsink");
Gst::new();
let playbin3 = playback::Playbin3::new("pppppppppppppppppppppppppppppp").unwrap().with_uri("https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c");
let video_convert = plugins::videoconvertscale::VideoConvert::new("vcvcvcvcvcvcvcvcvcvcvcvcvc")
.expect("Create videoconvert");
let mut appsink = app::AppSink::new("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa").expect("Create appsink");
appsink.caps(
Caps::builder(CapsType::Video)
.field("format", "RGB")
.build(),
);
let video_sink = video_convert
.link(&appsink)
.expect("Link videoconvert to appsink");
let playbin3 = playbin3.with_video_sink(&video_sink);
playbin3.play().expect("Play video");
let bus = playbin3.bus().unwrap();
for msg in bus.iter_timed(None) {
match msg.view() {
gstreamer::MessageView::Eos(..) => {
tracing::info!("End of stream reached");
break;
}
None => Ok(None),
gstreamer::MessageView::Error(err) => {
tracing::error!(
"Error from {:?}: {} ({:?})",
err.src().map(|s| s.path_string()),
err.error(),
err.debug()
);
break;
}
gstreamer::MessageView::StateChanged(state) => {
eprintln!(
"State changed from {:?} to \x1b[33m{:?}\x1b[0m for {:?}",
state.old(),
state.current(),
state.src().map(|s| s.path_string())
);
}
_ => {}
}
// tracing::info!("{:#?}", &msg.view());
}
// std::thread::sleep(std::time::Duration::from_secs(5));
}
pub struct Sample {
inner: gstreamer::Sample,
}
#[test]
fn test_appsink_metadata() {
use tracing_subscriber::prelude::*;
tracing_subscriber::registry()
.with(
tracing_subscriber::fmt::layer()
.with_thread_ids(true)
.with_file(true),
)
.init();
impl From<gstreamer::Sample> for Sample {
fn from(inner: gstreamer::Sample) -> Self {
Sample { inner }
}
crate::Gst::new();
let url = "https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c";
let videoconvert = crate::plugins::videoconvertscale::VideoConvert::new("iced-video-convert")
// .unwrap();
// .with_output_format(gst::plugins::videoconvertscale::VideoFormat::Rgba)
.unwrap();
let appsink = crate::plugins::app::AppSink::new("iced-video-sink")
.unwrap()
.with_async(true)
.with_sync(true);
let video_sink = videoconvert.link(&appsink).unwrap();
let playbin = crate::plugins::playback::Playbin3::new("iced-video")
.unwrap()
.with_uri(url)
.with_video_sink(&video_sink);
playbin.pause().unwrap();
smol::block_on(async {
playbin.wait_for(gstreamer::State::Paused).await.unwrap();
});
// std::thread::sleep(core::time::Duration::from_secs(1));
let pad = appsink.pad("sink").unwrap();
let caps = pad.current_caps().unwrap();
let format = caps.format();
let height = caps.height();
let width = caps.width();
let framerate = caps.framerate();
dbg!(&format, height, width, framerate);
dbg!(&caps);
}


@@ -1,25 +1,13 @@
use crate::*;
use crate::priv_prelude::*;
#[repr(transparent)]
pub struct AutoVideoSink {
inner: gstreamer::Element,
}
impl IsElement for AutoVideoSink {
fn as_element(&self) -> &Element {
unsafe { core::mem::transmute(&self.inner) }
}
fn into_element(self) -> Element {
Element { inner: self.inner }
}
}
wrap_gst!(AutoVideoSink, gstreamer::Element);
parent_child!(Element, AutoVideoSink);
parent_child!(Bin, AutoVideoSink, downcast);
impl Sink for AutoVideoSink {}
impl AutoVideoSink {
pub fn new(name: impl AsRef<str>) -> Result<Self> {
use gstreamer::prelude::*;
let element = gstreamer::ElementFactory::make("autovideosink")
.name(name.as_ref())
.build()


@@ -1,2 +1,71 @@
pub mod playbin3;
pub use playbin3::*;
pub mod playbin;
pub use playbin::*;
bitflags::bitflags! {
/// Extra flags to configure the behaviour of the sinks.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct PlayFlags: u32 {
/// Render the video stream
const VIDEO = (1 << 0);
/// Render the audio stream
const AUDIO = (1 << 1);
/// Render subtitles
const TEXT = (1 << 2);
/// Render visualisation when no video is present
const VIS = (1 << 3);
/// Use software volume
const SOFT_VOLUME = (1 << 4);
/// Only use native audio formats
const NATIVE_AUDIO = (1 << 5);
/// Only use native video formats
const NATIVE_VIDEO = (1 << 6);
/// Attempt progressive download buffering
const DOWNLOAD = (1 << 7);
/// Buffer demuxed/parsed data
const BUFFERING = (1 << 8);
/// Deinterlace video if necessary
const DEINTERLACE = (1 << 9);
/// Use software color balance
const SOFT_COLORBALANCE = (1 << 10);
/// Force audio/video filter(s) to be applied
const FORCE_FILTERS = (1 << 11);
/// Force only software-based decoders (no effect for playbin3)
const FORCE_SW_DECODERS = (1 << 12);
}
}
const _: () = {
use glib::types::StaticType;
impl glib::types::StaticType for PlayFlags {
#[inline]
#[doc(alias = "gst_play_flags_get_type")]
fn static_type() -> glib::Type {
glib::Type::from_name("GstPlayFlags").expect("GstPlayFlags type not found")
}
}
impl glib::value::ToValue for PlayFlags {
#[inline]
fn to_value(&self) -> glib::Value {
let value = self.bits().to_value();
value
.transform_with_type(Self::static_type())
.expect("Failed to transform PlayFlags(u32) to GstPlayFlags")
}
#[inline]
fn value_type(&self) -> glib::Type {
Self::static_type()
}
}
impl From<PlayFlags> for glib::Value {
#[inline]
fn from(v: PlayFlags) -> Self {
// skip_assert_initialized!();
glib::value::ToValue::to_value(&v)
}
}
};
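
Since PlayFlags is a plain bitflags type, composing a flag set is just bitwise OR, and the ToValue impl above lets the result go straight into playbin's "flags" property. A quick sanity-check sketch:

#[test]
fn play_flags_compose() {
    let flags = PlayFlags::VIDEO | PlayFlags::AUDIO | PlayFlags::BUFFERING;
    assert!(flags.contains(PlayFlags::AUDIO));
    assert!(!flags.contains(PlayFlags::DOWNLOAD));
}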


@@ -0,0 +1,82 @@
use crate::priv_prelude::*;
wrap_gst!(Playbin, gstreamer::Element);
parent_child!(Element, Playbin);
parent_child!(Pipeline, Playbin, downcast);
parent_child!(Bin, Playbin, downcast);
impl Drop for Playbin {
fn drop(&mut self) {
self.set_state(gstreamer::State::Null).ok();
}
}
impl Playbin {
pub fn new(name: impl AsRef<str>) -> Result<Self> {
gstreamer::ElementFactory::make("playbin3")
.name(name.as_ref())
.build()
.map(|element| Playbin { inner: element })
.change_context(Error)
}
pub fn with_uri(self, uri: impl AsRef<str>) -> Self {
self.inner.set_property("uri", uri.as_ref());
self
}
pub fn with_buffer_duration(self, duration: impl Into<Option<core::time::Duration>>) -> Self {
let duration = match duration.into() {
Some(dur) => dur.as_secs() as i64,
None => -1,
};
self.inner.set_property("buffer-duration", duration);
self
}
pub fn with_buffer_size(self, size: impl Into<Option<u32>>) -> Self {
let size = match size.into() {
Some(size) => size as i32,
None => -1,
};
self.inner.set_property("buffer-size", size);
self
}
/// Sets the maximum size of the ring buffer in bytes.
pub fn with_ring_buffer_max_size(self, size: u64) -> Self {
self.inner.set_property("ring-buffer-max-size", size);
self
}
pub fn with_video_sink(self, video_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("video-sink", &video_sink.upcast_ref().inner);
self
}
pub fn with_text_sink(self, text_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("text-sink", &text_sink.upcast_ref().inner);
self
}
pub fn with_audio_sink(self, audio_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("audio-sink", &audio_sink.upcast_ref().inner);
self
}
pub fn set_volume(&self, volume: f64) {
self.inner.set_property("volume", volume.clamp(1.0, 100.0))
}
pub fn get_volume(&self) -> f64 {
self.inner.property::<f64>("volume")
}
pub fn with_flags(self, flags: playback::PlayFlags) -> Self {
self.inner.set_property("flags", flags);
self
}
}


@@ -1,22 +1,19 @@
use crate::*;
pub struct Playbin3 {
inner: gstreamer::Element,
}
use crate::priv_prelude::*;
use playback::PlayFlags;
wrap_gst!(Playbin3, gstreamer::Element);
parent_child!(Element, Playbin3);
parent_child!(Pipeline, Playbin3, downcast);
parent_child!(Bin, Playbin3, downcast);
impl Drop for Playbin3 {
fn drop(&mut self) {
let _ = self
.inner
.set_state(gstreamer::State::Null)
.inspect_err(|e| {
tracing::error!("Failed to set playbin3 to Null state on drop: {:?}", e)
});
self.set_state(gstreamer::State::Null).ok();
}
}
impl Playbin3 {
pub fn new(name: impl AsRef<str>) -> Result<Self> {
use gstreamer::prelude::*;
gstreamer::ElementFactory::make("playbin3")
.name(name.as_ref())
.build()
@@ -25,101 +22,74 @@ impl Playbin3 {
}
pub fn with_uri(self, uri: impl AsRef<str>) -> Self {
use gstreamer::prelude::*;
self.inner.set_property("uri", uri.as_ref());
self
}
pub fn with_video_sink(self, video_sink: &impl IsElement) -> Self {
use gstreamer::prelude::*;
self.inner
.set_property("video-sink", &video_sink.as_element().inner);
pub fn with_buffer_duration(self, duration: impl Into<Option<core::time::Duration>>) -> Self {
let duration = match duration.into() {
Some(dur) => dur.as_secs() as i64,
None => -1,
};
self.inner.set_property("buffer-duration", duration);
self
}
pub fn with_text_sink(self, text_sink: &impl IsElement) -> Self {
use gstreamer::prelude::*;
self.inner
.set_property("text-sink", &text_sink.as_element().inner);
pub fn with_buffer_size(self, size: impl Into<Option<u32>>) -> Self {
let size = match size.into() {
Some(size) => size as i32,
None => -1,
};
self.inner.set_property("buffer-size", size);
self
}
pub fn with_audio_sink(self, audio_sink: &impl IsElement) -> Self {
use gstreamer::prelude::*;
/// Sets the maximum size of the ring buffer in bytes.
pub fn with_ring_buffer_max_size(self, size: u64) -> Self {
self.inner.set_property("ring-buffer-max-size", size);
self
}
pub fn with_video_sink(self, video_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("audio-sink", &audio_sink.as_element().inner);
.set_property("video-sink", &video_sink.upcast_ref().inner);
self
}
pub fn with_text_sink(self, text_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("text-sink", &text_sink.upcast_ref().inner);
self
}
pub fn with_audio_sink(self, audio_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("audio-sink", &audio_sink.upcast_ref().inner);
self
}
pub fn set_volume(&self, volume: f64) {
use gstreamer::prelude::*;
self.inner.set_property("volume", volume.clamp(1.0, 100.0))
}
pub fn get_volume(&self) -> f64 {
use gstreamer::prelude::*;
self.inner.property::<f64>("volume")
}
pub fn play(&self) -> Result<()> {
use gstreamer::prelude::*;
self.inner
.set_state(gstreamer::State::Playing)
.change_context(Error)
.attach("Failed to set playbin3 to Playing state")?;
Ok(())
}
pub fn bus(&self) -> Result<Bus> {
let bus = self
.inner
.bus()
.ok_or(Error)
.attach("Failed to get bus from playbin3")?;
Ok(Bus { bus })
pub fn with_flags(self, flags: playback::PlayFlags) -> Self {
self.inner.set_property("flags", flags);
self
}
}
#[test]
fn test_playbin3() {
use gstreamer::prelude::*;
use tracing_subscriber::prelude::*;
tracing_subscriber::registry()
.with(
tracing_subscriber::fmt::layer()
.with_thread_ids(true)
.with_file(true),
)
.init();
tracing::info!("Linking videoconvert to appsink");
gstreamer::init().unwrap();
let playbin3 = Playbin3::new("test_playbin3").unwrap().with_uri("https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c");
// let mut video_sink = Bin::new("wgpu_video_sink");
//
// let video_convert = plugins::videoconvertscale::VideoConvert::new("wgpu_video_convert")
// .expect("Create videoconvert");
// let appsink = AppSink::new("test_appsink").expect("Create appsink");
let appsink = plugins::autodetect::AutoVideoSink::new("test_autodetect_video_sink")
.expect("Create autodetect video sink");
// video_convert
// .link(&appsink)
// .expect("Link videoconvert to appsink");
//
// let sink_pad = video_convert.sink_pad();
// let sink_pad = Pad::ghost(&sink_pad).expect("Create ghost pad from videoconvert src");
// video_sink
// .add(appsink)
// .expect("Add appsink to video sink")
// .add(video_convert)
// .expect("Add videoconvert to video sink")
// .add_pad(&sink_pad)
// .expect("Add ghost pad to video sink");
// sink_pad.activate(true).expect("Activate ghost pad");
let playbin3 = playbin3.with_video_sink(&appsink);
playbin3.play().unwrap();
let bus = playbin3.bus().unwrap();
for msg in bus.iter_timed(None) {
tracing::info!("{:#?}", &msg.view());
impl Playbin3 {
pub fn default_flags() -> PlayFlags {
PlayFlags::SOFT_COLORBALANCE
| PlayFlags::DEINTERLACE
| PlayFlags::BUFFERING
| PlayFlags::SOFT_VOLUME
| PlayFlags::TEXT
| PlayFlags::AUDIO
| PlayFlags::VIDEO
}
// std::thread::sleep(std::time::Duration::from_secs(5));
}
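
Putting the pieces together roughly the way the appsink tests above do, but with the default flag set plus progressive download; the URI is a placeholder and error handling is collapsed into `?`.

fn make_player(video_sink: &Bin) -> Result<Playbin3> {
    let playbin = Playbin3::new("example-playbin")?
        .with_uri("https://example.com/video.mkv")
        .with_flags(Playbin3::default_flags() | PlayFlags::DOWNLOAD)
        .with_video_sink(video_sink);
    // `play` comes from PipelineExt, since Playbin3 is ChildOf<Pipeline>.
    playbin.play()?;
    Ok(playbin)
}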


@@ -1,28 +1,15 @@
use crate::*;
use crate::priv_prelude::*;
#[doc(inline)]
pub use gstreamer_video::VideoFormat;
#[repr(transparent)]
pub struct VideoConvert {
inner: gstreamer::Element,
}
impl IsElement for VideoConvert {
fn as_element(&self) -> &Element {
unsafe { core::mem::transmute(&self.inner) }
}
fn into_element(self) -> Element {
Element { inner: self.inner }
}
}
wrap_gst!(VideoConvert, gstreamer::Element);
parent_child!(Element, VideoConvert);
impl Sink for VideoConvert {}
impl Source for VideoConvert {}
impl VideoConvert {
pub fn new(name: impl AsRef<str>) -> Result<Self> {
use gstreamer::prelude::*;
let element = gstreamer::ElementFactory::make("videoconvert")
.name(name.as_ref())
.build()

gst/src/sample.rs Normal file

@@ -0,0 +1,37 @@
impl From<gstreamer::Sample> for Sample {
fn from(inner: gstreamer::Sample) -> Self {
Sample { inner }
}
}
#[repr(transparent)]
#[derive(Debug, Clone)]
pub struct Sample {
pub inner: gstreamer::Sample,
}
use gstreamer::BufferRef;
impl Sample {
#[doc(alias = "empty")]
pub fn new() -> Self {
Self {
inner: gstreamer::Sample::builder().build(),
}
}
pub fn buffer(&self) -> Option<&BufferRef> {
self.inner.buffer()
}
pub fn caps(&self) -> Option<&gstreamer::CapsRef> {
self.inner.caps()
}
pub fn info(&self) -> Option<&gstreamer::StructureRef> {
self.inner.info()
}
// pub fn set_buffer(&mut self) {
// self.inner.set_buffer(None);
// }
}
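
For completeness, a sketch of getting at the raw bytes of a pulled sample; map_readable is the stock gstreamer-rs way to borrow a buffer's memory, and the copy keeps the sketch simple.

fn frame_bytes(sample: &Sample) -> Option<Vec<u8>> {
    let buffer = sample.buffer()?;
    // Map the buffer read-only and copy the payload out.
    let map = buffer.map_readable().ok()?;
    Some(map.as_slice().to_vec())
}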


@@ -1,3 +1,2 @@
// pub fn copy_sample_to_texture() {
//
// }

gst/src/wrapper.rs Normal file

@@ -0,0 +1,145 @@
pub trait GstWrapper {
type GstType: glib::prelude::ObjectType;
fn from_gst(gst: Self::GstType) -> Self;
// fn into_gst(self) -> Self::GstType;
fn as_gst_ref(&self) -> &Self::GstType;
fn from_gst_ref(gst: &Self::GstType) -> &Self;
}
#[macro_export]
macro_rules! wrap_gst {
($name:ident) => {
$crate::wrap_gst!($name, gstreamer::$name);
};
($name:ident, $inner:ty) => {
$crate::wrap_gst!(core $name, $inner);
$crate::wrap_gst!($name, $inner, into_inner);
};
($name:ident, $inner:ty, skip_inner) => {
$crate::wrap_gst!(core $name, $inner);
};
(core $name:ident, $inner:ty) => {
#[derive(Debug, Clone)]
#[repr(transparent)]
pub struct $name {
pub(crate) inner: $inner,
}
// impl From<$name> for $inner {
// fn from(wrapper: $name) -> Self {
// wrapper.into_inner()
// }
// }
impl $name {
pub fn into_inner(self) -> $inner {
self.inner.clone()
}
}
impl $crate::wrapper::GstWrapper for $name {
type GstType = $inner;
fn from_gst(gst: Self::GstType) -> Self {
Self { inner: gst }
}
// fn into_gst(self) -> Self::GstType {
// self.inner.clone()
// }
fn as_gst_ref(&self) -> &Self::GstType {
&self.inner
}
fn from_gst_ref(gst: &Self::GstType) -> &Self {
unsafe { &*(gst as *const Self::GstType as *const Self) }
}
}
impl ChildOf<$name> for $name {
fn upcast_ref(&self) -> &$name {
self
}
}
};
($name:ident, $inner:ty, into_inner) => {
impl From<$inner> for $name {
fn from(inner: $inner) -> Self {
Self { inner }
}
}
};
}
/// A trait for types that can be upcasted to type T.
pub trait ChildOf<T> {
fn upcast_ref(&self) -> &T;
}
#[macro_export]
macro_rules! parent_child {
($parent:ty, $child:ty) => {
impl ChildOf<$parent> for $child
where
$child: GstWrapper,
$parent: GstWrapper,
{
fn upcast_ref(&self) -> &$parent {
let upcasted = self.inner.upcast_ref::<<$parent as GstWrapper>::GstType>();
unsafe { &*(upcasted as *const <$parent as GstWrapper>::GstType as *const $parent) }
}
}
};
($parent:ty, $child:ty, downcast) => {
impl ChildOf<$parent> for $child
where
$child: GstWrapper,
$parent: GstWrapper,
{
fn upcast_ref(&self) -> &$parent {
let downcasted = self
.inner
.downcast_ref::<<$parent as GstWrapper>::GstType>()
.expect(
format!(
"BUG: Failed to downcast GStreamer type from child {} to parent {}",
stringify!($child),
stringify!($parent)
)
.as_str(),
);
unsafe {
&*(downcasted as *const <$parent as GstWrapper>::GstType as *const $parent)
}
}
}
}; // ($parent:ty, $child:ty, deref) => {
// $crate::parent_child!($parent, $child);
// $crate::parent_child!($parent, $child, __deref);
// };
//
// ($parent:ty, $child:ty, downcast, deref) => {
// $crate::parent_child!($parent, $child, downcast);
// $crate::parent_child!($parent, $child, __deref);
// };
// ($parent:ty, $child:ty, deref, downcast) => {
// $crate::parent_child!($parent, $child, downcast);
// $crate::parent_child!($parent, $child, __deref);
// };
//
// ($parent:ty, $child:ty, __deref) => {
// impl core::ops::Deref for $child
// where
// $child: GstWrapper,
// $parent: GstWrapper,
// {
// type Target = $parent;
//
// fn deref(&self) -> &Self::Target {
// self.upcast_ref()
// }
// }
// };
}
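
For reference, this is how the two macros are combined elsewhere in the crate (mirroring gst/src/pipeline.rs above): wrap_gst! emits the newtype plus its GstWrapper impl, and parent_child! layers the ChildOf upcasts on top.

wrap_gst!(Pipeline);              // newtype over gstreamer::Pipeline
parent_child!(Element, Pipeline); // &Pipeline upcasts to &Element
parent_child!(Bin, Pipeline);     // ...and to &Bin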


@@ -1,3 +1,7 @@
jello:
cargo r -r -- -vv
# iced-video:
# cd crates/iced-video && cargo run --release --example minimal
typegen:
@echo "Generating jellyfin type definitions..."
cd typegen && cargo run
@@ -8,5 +12,7 @@ hdrtest:
GST_DEBUG=3 gst-launch-1.0 playbin3 uri=https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c video-sink="videoconvert ! video/x-raw,format=(string)RGB10A2_LE ! fakesink"
codec:
GST_DEBUG=3 gst-discoverer-1.0 -v https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c
GST_DEBUG=3 gst-discoverer-1.0 https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c
ffprobe:
ffprobe -v error -show_format -show_streams "https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c" | grep pix_fmt


@@ -14,31 +14,3 @@ fn main() -> Result<()> {
ui_iced::ui().change_context(Error)?;
Ok(())
}
// #[tokio::main]
// pub async fn main() -> Result<()> {
// dotenvy::dotenv()
// .change_context(Error)
// .inspect_err(|err| {
// eprintln!("Failed to load .env file: {}", err);
// })
// .ok();
// let config = JellyfinConfig::new(
// std::env::var("JELLYFIN_USERNAME").change_context(Error)?,
// std::env::var("JELLYFIN_PASSWORD").change_context(Error)?,
// std::env::var("JELLYFIN_SERVER_URL").change_context(Error)?,
// "jello".to_string(),
// );
// let mut jellyfin = api::JellyfinClient::new(config);
// jellyfin
// .authenticate_with_cached_token(".session")
// .await
// .change_context(Error)?;
//
// #[cfg(feature = "iced")]
// ui_iced::ui(jellyfin);
// #[cfg(feature = "gpui")]
// ui_gpui::ui(jellyfin);
//
// Ok(())
// }


@@ -4,10 +4,9 @@ version = "0.1.0"
edition = "2024"
[dependencies]
bson = { version = "3.1.0", features = ["serde"] }
futures = "0.3.31"
parking_lot = "0.12.5"
redb = { version = "3.1.0", features = ["uuid"] }
secrecy = "0.10.3"
serde = "1.0.228"
tokio = { version = "1.48.0", features = ["rt"] }
uuid = "1.18.1"
uuid = { version = "1.18.1", features = ["v4"] }


@@ -1,10 +1,10 @@
pub mod redb;
pub mod sqlite;
pub mod toml;
use std::collections::BTreeMap;
pub trait Store {
fn image(&self, id: &str) -> Option<Vec<u8>>;
fn save_image(&mut self, id: &str, data: &[u8]);
use uuid::Uuid;
pub struct ApiKey {
inner: secrecy::SecretBox<String>,
}
pub struct SecretStore {
api_keys: BTreeMap<Uuid, ApiKey>,
}
pub struct Settings {}
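
A hedged sketch of what constructing and reading an ApiKey could look like with secrecy 0.10 as pinned above; the `new` and `expose` helpers are illustrative names, not part of this commit.

use secrecy::{ExposeSecret, SecretBox};

impl ApiKey {
    pub fn new(key: String) -> Self {
        // SecretBox zeroizes the string when the key is dropped.
        Self { inner: SecretBox::new(Box::new(key)) }
    }
    pub fn expose(&self) -> &str {
        self.inner.expose_secret()
    }
}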


@@ -1,225 +0,0 @@
use std::{
borrow::Borrow,
collections::VecDeque,
marker::PhantomData,
path::Path,
sync::{Arc, RwLock, atomic::AtomicBool},
};
use futures::task::AtomicWaker;
use redb::{Error, Key, ReadableDatabase, TableDefinition, Value};
use serde::{Serialize, de::DeserializeOwned};
const USERS: TableDefinition<uuid::Uuid, Vec<u8>> = TableDefinition::new("users");
const SERVERS: TableDefinition<uuid::Uuid, Vec<u8>> = TableDefinition::new("servers");
const SETTINGS: TableDefinition<uuid::Uuid, Vec<u8>> = TableDefinition::new("settings");
#[derive(Debug)]
pub struct TableInner<T> {
db: Arc<T>,
}
impl<T> Clone for TableInner<T> {
fn clone(&self) -> Self {
Self {
db: Arc::clone(&self.db),
}
}
}
impl<T> TableInner<T> {
fn new(db: Arc<T>) -> Self {
Self { db }
}
}
impl TableInner<DatabaseHandle> {
async fn get<'a, K: Key, V: Serialize + DeserializeOwned>(
&self,
table: TableDefinition<'static, K, Vec<u8>>,
key: impl Borrow<K::SelfType<'a>>,
) -> Result<Option<V>> {
let db: &redb::Database = &self.db.as_ref().database;
let db_reader = db.begin_read()?;
let table = db_reader.open_table(table)?;
table
.get(key)?
.map(|value| bson::deserialize_from_slice(&value.value()))
.transpose()
.map_err(|e| redb::Error::Io(std::io::Error::other(e)))
}
async fn insert<
'a,
'b,
K: Key + Send + Sync,
V: Serialize + DeserializeOwned + Send + Sync + 'a,
>(
&'b self,
table: TableDefinition<'static, K, Vec<u8>>,
key: impl Borrow<K::SelfType<'a>> + Send + 'b,
value: V,
) -> Result<Option<V>> {
let db: &redb::Database = &self.db.as_ref().database;
// self.db
// .writing
// .store(true, std::sync::atomic::Ordering::SeqCst);
// let out = tokio::task::spawn_blocking(move || -> Result<Option<V>>
let out = tokio::task::spawn_blocking(|| -> Result<Option<V>> {
let db_writer = db.begin_write()?;
let out = {
let mut table = db_writer.open_table(table)?;
let serialized_value = bson::serialize_to_vec(&value)
.map_err(|e| redb::Error::Io(std::io::Error::other(e)))?;
let previous = table.insert(key, &serialized_value)?;
let out = previous
.map(|value| bson::deserialize_from_slice(&value.value()))
.transpose()
.map_err(|e| redb::Error::Io(std::io::Error::other(e)));
out
};
db_writer.commit()?;
out
})
.await
.expect("Task panicked");
out
}
}
// impl<K: Key, V: Serialize + DeserializeOwned> Table<K, V> for TableInner {
// async fn get(&self, key: K) -> Result<Option<Value>> {}
// async fn insert(&self, key: K, value: V) -> Result<Option<Value>> {}
// async fn modify(&self, key: K, v: FnOnce(V) -> V) -> Result<bool> {}
// async fn remove(&self, key: K) -> Result<Option<Value>> {}
// }
#[derive(Debug)]
pub struct Users<T>(TableInner<T>);
impl<T> Clone for Users<T> {
fn clone(&self) -> Self {
Self(self.0.clone())
}
}
impl<T> Users<T> {
const TABLE: TableDefinition<'static, uuid::Uuid, Vec<u8>> = USERS;
}
#[derive(Debug)]
pub struct Servers<T>(TableInner<T>);
impl<T> Clone for Servers<T> {
fn clone(&self) -> Self {
Self(self.0.clone())
}
}
impl<T> Servers<T> {
const TABLE: TableDefinition<'static, uuid::Uuid, Vec<u8>> = SERVERS;
}
#[derive(Debug)]
pub struct Settings<T>(TableInner<T>);
impl<T> Clone for Settings<T> {
fn clone(&self) -> Self {
Self(self.0.clone())
}
}
impl<T> Settings<T> {
const TABLE: TableDefinition<'static, uuid::Uuid, Vec<u8>> = SETTINGS;
}
#[derive(Debug, Clone)]
pub struct Database {
users: Users<DatabaseHandle>,
servers: Servers<DatabaseHandle>,
settings: Settings<DatabaseHandle>,
handle: Arc<DatabaseHandle>,
}
#[derive(Debug)]
pub struct DatabaseHandle {
database: redb::Database,
writing: AtomicBool,
wakers: RwLock<VecDeque<AtomicWaker>>,
}
#[derive(Debug)]
pub struct DatabaseWriterGuard<'a> {
handle: &'a DatabaseHandle,
dropper: Arc<AtomicBool>,
}
// impl Drop for DatabaseWriterGuard<'_> {
// fn drop(&mut self) {
// self.handle
// .writing
// .store(false, std::sync::atomic::Ordering::SeqCst);
// let is_panicking = std::thread::panicking();
// let Ok(writer) = self.handle.wakers.write() else {
// if is_panicking {
// return;
// } else {
// panic!("Wakers lock poisoned");
// }
// }
// if let Some(waker) = (self.handle.wakers.write()).pop() {
// waker.wake();
// };
// // let mut wakers = self.handle.wakers.write().expect();
// // if let Some(waker) = self.handle.wakers.write().expect("Wakers lock poisoned").pop_front() {
// // waker.wake();
// // }
// // while let Some(waker) = wakers.pop_front() {
// // waker.wake();
// // }
// }
// }
type Result<O, E = redb::Error> = core::result::Result<O, E>;
pub trait Table<K: Key> {
fn insert<V: Serialize + DeserializeOwned>(
&self,
key: K,
value: V,
) -> impl Future<Output = Result<Option<V>>> + Send;
fn modify<V: Serialize + DeserializeOwned, O: Serialize + DeserializeOwned>(
&self,
key: K,
v: impl FnOnce(V) -> O,
) -> impl Future<Output = Result<bool>> + Send;
fn remove<V: Serialize + DeserializeOwned>(
&self,
key: K,
) -> impl Future<Output = Result<Option<V>>> + Send;
fn get<V: Serialize + DeserializeOwned>(
&self,
key: K,
) -> impl Future<Output = Result<Option<V>>> + Send;
}
impl Database {
pub fn create(path: impl AsRef<Path>) -> Result<Self, Error> {
let writing = AtomicBool::new(false);
let wakers = RwLock::new(VecDeque::new());
let db = redb::Database::create(path)?;
let db = Arc::new(DatabaseHandle {
database: db,
writing,
wakers,
});
let table_inner = TableInner::new(Arc::clone(&db));
let users = Users(table_inner.clone());
let servers = Servers(table_inner.clone());
let settings = Settings(table_inner.clone());
Ok(Self {
servers,
users,
settings,
handle: db,
})
}
}


@@ -9,9 +9,22 @@ api = { version = "0.1.0", path = "../api" }
blurhash = "0.2.3"
bytes = "1.11.0"
gpui_util = "0.2.2"
iced = { workspace = true }
iced_video_player = { workspace = true }
reqwest = "0.12.24"
iced = { workspace = true, features = [
"advanced",
"canvas",
"image",
"sipper",
"tokio",
"debug",
"hot",
], default-features = true }
iced-video = { workspace = true }
iced_aw = "0.13.0"
iced_wgpu = "0.14.0"
iced_winit = "0.14.0"
reqwest = "0.13"
tap = "1.0.1"
toml = "0.9.8"
tracing = "0.1.41"


@@ -2,8 +2,9 @@ mod settings;
mod video;
mod shared_string;
use iced_video_player::{Video, VideoPlayer};
use iced_video::{Ready, Video, VideoHandle};
use shared_string::SharedString;
use tap::Pipe as _;
use std::sync::Arc;
@@ -25,6 +26,8 @@ pub struct ItemCache {
pub tree: BTreeMap<Option<uuid::Uuid>, BTreeSet<uuid::Uuid>>,
}
const BACKGROUND_COLOR: iced::Color = iced::Color::from_rgba8(30, 30, 30, 0.7);
impl ItemCache {
pub fn insert(&mut self, parent: impl Into<Option<uuid::Uuid>>, item: Item) {
let parent = parent.into();
@@ -140,7 +143,7 @@ struct State {
screen: Screen,
settings: settings::SettingsState,
is_authenticated: bool,
video: Option<Arc<Video>>,
video: Option<Arc<VideoHandle<Message, Ready>>>,
}
impl State {
@@ -155,8 +158,6 @@ impl State {
query: None,
screen: Screen::Home,
settings: settings::SettingsState::default(),
// username_input: String::new(),
// password_input: String::new(),
is_authenticated: false,
video: None,
}
@@ -172,24 +173,14 @@ pub enum Message {
OpenItem(Option<uuid::Uuid>),
LoadedItem(Option<uuid::Uuid>, Vec<Item>),
Error(String),
SetToken(String),
Back,
Home,
// Login {
// username: String,
// password: String,
// config: api::JellyfinConfig,
// },
// LoginSuccess(String),
// LoadedClient(api::JellyfinClient, bool),
// Logout,
Video(video::VideoMessage),
}
fn update(state: &mut State, message: Message) -> Task<Message> {
// if let Some(client) = state.jellyfin_client.clone() {
match message {
Message::Settings(msg) => settings::update(&mut state.settings, msg),
Message::Settings(msg) => settings::update(state, msg),
Message::OpenItem(id) => {
if let Some(client) = state.jellyfin_client.clone() {
use api::jellyfin::BaseItemKind::*;
@@ -250,15 +241,6 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
state.messages.push(err);
Task::none()
}
Message::SetToken(token) => {
tracing::info!("Authenticated with token: {}", token);
state
.jellyfin_client
.as_mut()
.map(|mut client| client.set_token(token));
state.is_authenticated = true;
Task::none()
}
Message::Back => {
state.current = state.history.pop().unwrap_or(None);
Task::none()
@@ -269,7 +251,6 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
}
Message::SearchQueryChanged(query) => {
state.query = Some(query);
// Handle search query change
Task::none()
}
Message::Search => {
@@ -294,9 +275,29 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
}
fn view(state: &State) -> Element<'_, Message> {
let content = home(state);
match state.screen {
Screen::Settings => settings::settings(state),
Screen::Home | _ => home(state),
Screen::Settings => {
let settings = settings::settings(state);
let settings = container(settings)
.width(Length::FillPortion(4))
.height(Length::FillPortion(4))
.style(container::rounded_box)
.pipe(mouse_area)
.on_press(Message::Refresh)
.pipe(|c| iced::widget::column![space::vertical(), c, space::vertical()])
.pipe(container)
.width(Length::Fill)
.height(Length::Fill)
.align_y(Alignment::Center)
.align_x(Alignment::Center)
.style(|_| container::background(BACKGROUND_COLOR))
.padding(50)
.pipe(mouse_area)
.on_press(Message::Settings(settings::SettingsMessage::Close));
stack![content, settings].into()
}
Screen::Home | _ => content,
}
}
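
The new `Screen::Settings` branch above combines three iced building blocks: a translucent full-window container as a backdrop, `mouse_area` so clicking the backdrop closes the popup, and `stack!` to layer the result over the home view. A stripped-down sketch of that overlay pattern, using only calls that already appear in this diff (`Msg::CloseOverlay` is a placeholder message, not one of the app's):

use iced::widget::{container, mouse_area, text};
use iced::{Alignment, Color, Element, Length};

#[derive(Debug, Clone)]
enum Msg {
    CloseOverlay,
}

// Layer a dimmed, click-to-close backdrop with a centered panel over `content`.
fn overlay(content: Element<'_, Msg>) -> Element<'_, Msg> {
    let panel = container(text("Settings"))
        .width(Length::FillPortion(4))
        .height(Length::FillPortion(4))
        .style(container::rounded_box);

    let backdrop = mouse_area(
        container(panel)
            .width(Length::Fill)
            .height(Length::Fill)
            .align_x(Alignment::Center)
            .align_y(Alignment::Center)
            .style(|_| container::background(Color::from_rgba8(30, 30, 30, 0.7))),
    )
    .on_press(Msg::CloseOverlay);

    iced::widget::stack![content, backdrop].into()
}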
@@ -311,18 +312,16 @@ fn body(state: &State) -> Element<'_, Message> {
if let Some(ref video) = state.video {
video::player(video)
} else {
scrollable(
container(
Grid::with_children(state.cache.items_of(state.current).into_iter().map(card))
.fluid(400)
.spacing(50),
)
.spacing(50)
.pipe(container)
.padding(50)
.align_x(Alignment::Center)
// .align_y(Alignment::Center)
.height(Length::Fill)
.width(Length::Fill),
)
.width(Length::Fill)
.pipe(scrollable)
.height(Length::Fill)
.into()
}
@@ -330,19 +329,17 @@ fn body(state: &State) -> Element<'_, Message> {
fn header(state: &State) -> Element<'_, Message> {
row([
container(
Button::new(
Text::new(
text(
state
.jellyfin_client
.as_ref()
.map(|c| c.config.server_url.as_str())
.unwrap_or("No Server"),
)
.align_x(Alignment::Start),
)
.on_press(Message::Home),
)
.align_x(Alignment::Start)
.pipe(button)
.on_press(Message::Home)
.pipe(container)
.padding(10)
.width(Length::Fill)
.height(Length::Fill)
@@ -351,7 +348,6 @@ fn header(state: &State) -> Element<'_, Message> {
.style(container::rounded_box)
.into(),
search(state),
container(
row([
button("Refresh").on_press(Message::Refresh).into(),
button("Settings")
@@ -361,8 +357,8 @@ fn header(state: &State) -> Element<'_, Message> {
.on_press(Message::Video(video::VideoMessage::Test))
.into(),
])
.spacing(10),
)
.spacing(10)
.pipe(container)
.padding(10)
.width(Length::Fill)
.height(Length::Fill)
@@ -378,14 +374,13 @@ fn header(state: &State) -> Element<'_, Message> {
}
fn search(state: &State) -> Element<'_, Message> {
container(
TextInput::new("Search...", state.query.as_deref().unwrap_or_default())
.padding(10)
.size(16)
.width(Length::Fill)
.on_input(Message::SearchQueryChanged)
.on_submit(Message::Search),
)
.on_submit(Message::Search)
.pipe(container)
.padding(10)
.width(Length::Fill)
.height(Length::Shrink)

View File

@@ -2,16 +2,26 @@ use crate::*;
use iced::Element;
pub fn settings(state: &State) -> Element<'_, Message> {
empty()
screens::settings(state)
}
pub fn update(_state: &mut SettingsState, message: SettingsMessage) -> Task<Message> {
pub fn update(state: &mut State, message: SettingsMessage) -> Task<Message> {
match message {
SettingsMessage::Open => {}
SettingsMessage::Close => {}
SettingsMessage::Open => {
tracing::trace!("Opening settings");
state.screen = Screen::Settings;
}
SettingsMessage::Close => {
tracing::trace!("Closing settings");
state.screen = Screen::Home;
}
SettingsMessage::Select(screen) => {
tracing::trace!("Switching settings screen to {:?}", screen);
state.settings.screen = screen;
}
SettingsMessage::User(user) => state.settings.login_form.update(user),
SettingsMessage::Server(server) => state.settings.server_form.update(server),
}
Task::none()
}
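
The updated `update` above now takes the whole `State`, which lets the settings handler switch `state.screen` directly instead of only mutating `SettingsState`. A reduced sketch of that nested-message delegation, with placeholder names (`settings_open` and the two-variant enums are illustrative only):

use iced::Task;

#[derive(Debug, Clone)]
enum Message {
    Settings(SettingsMessage),
}

#[derive(Debug, Clone)]
enum SettingsMessage {
    Open,
    Close,
}

#[derive(Default)]
struct State {
    settings_open: bool,
}

// The top-level update delegates whole sub-enums to focused handlers.
fn update(state: &mut State, message: Message) -> Task<Message> {
    match message {
        Message::Settings(msg) => settings_update(state, msg),
    }
}

fn settings_update(state: &mut State, message: SettingsMessage) -> Task<Message> {
    match message {
        SettingsMessage::Open => state.settings_open = true,
        SettingsMessage::Close => state.settings_open = false,
    }
    Task::none()
}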
@@ -32,9 +42,31 @@ pub enum SettingsMessage {
Open,
Close,
Select(SettingsScreen),
User(UserMessage),
Server(ServerMessage),
}
#[derive(Debug, Clone, Default)]
#[derive(Debug, Clone)]
pub enum UserMessage {
Add,
UsernameChanged(String),
PasswordChanged(String),
// Edit(uuid::Uuid),
// Delete(uuid::Uuid),
Clear,
}
#[derive(Debug, Clone)]
pub enum ServerMessage {
Add,
NameChanged(String),
UrlChanged(String),
// Edit(uuid::Uuid),
// Delete(uuid::Uuid),
Clear,
}
#[derive(Debug, Clone, Default, PartialEq, Eq)]
pub enum SettingsScreen {
#[default]
Main,
@@ -58,25 +90,207 @@ pub struct UserItem {
#[derive(Debug, Clone, Default)]
pub struct LoginForm {
username: Option<String>,
password: Option<String>,
username: String,
password: String,
}
impl LoginForm {
pub fn update(&mut self, message: UserMessage) {
match message {
UserMessage::UsernameChanged(data) => {
self.username = data;
}
UserMessage::PasswordChanged(data) => {
self.password = data;
}
UserMessage::Add => {
// Handle adding user
}
UserMessage::Clear => {
self.username.clear();
self.password.clear();
}
}
}
pub fn view(&self) -> Element<'_, Message> {
iced::widget::column![
text("Login Form"),
text_input("Enter Username", &self.username).on_input(|data| {
Message::Settings(SettingsMessage::User(UserMessage::UsernameChanged(data)))
}),
text_input("Enter Password", &self.password)
.secure(true)
.on_input(|data| {
Message::Settings(SettingsMessage::User(UserMessage::PasswordChanged(data)))
}),
row![
button(text("Add User")).on_press_maybe(self.validate()),
button(text("Cancel"))
.on_press(Message::Settings(SettingsMessage::User(UserMessage::Clear))),
]
.spacing(10),
]
.spacing(10)
.padding([10, 0])
.into()
}
pub fn validate(&self) -> Option<Message> {
(!self.username.is_empty() && !self.password.is_empty())
.then(|| Message::Settings(SettingsMessage::User(UserMessage::Add)))
}
}
#[derive(Debug, Clone, Default)]
pub struct ServerForm {
name: Option<String>,
url: Option<String>,
name: String,
url: String,
}
impl ServerForm {
pub fn update(&mut self, message: ServerMessage) {
match message {
ServerMessage::NameChanged(data) => {
self.name = data;
}
ServerMessage::UrlChanged(data) => {
self.url = data;
}
ServerMessage::Add => {
// Handle adding server
}
ServerMessage::Clear => {
self.name.clear();
self.url.clear();
}
}
}
pub fn view(&self) -> Element<'_, Message> {
iced::widget::column![
text("Add New Server"),
text_input("Enter server name", &self.name).on_input(|data| {
Message::Settings(SettingsMessage::Server(ServerMessage::NameChanged(data)))
}),
text_input("Enter server URL", &self.url).on_input(|data| {
Message::Settings(SettingsMessage::Server(ServerMessage::UrlChanged(data)))
}),
row![
button(text("Add Server")).on_press_maybe(self.validate()),
button(text("Cancel")).on_press(Message::Settings(SettingsMessage::Server(
ServerMessage::Clear
))),
]
.spacing(10),
]
.spacing(10)
.padding([10, 0])
.into()
}
pub fn validate(&self) -> Option<Message> {
(!self.name.is_empty() && !self.url.is_empty())
.then(|| Message::Settings(SettingsMessage::Server(ServerMessage::Add)))
}
}
mod screens {
use iced_aw::Tabs;
use super::*;
pub fn settings(state: &State) -> Element<'_, Message> {
Tabs::new(|f| Message::Settings(SettingsMessage::Select(f)))
.push(
SettingsScreen::Main,
iced_aw::TabLabel::Text("General".into()),
main(state),
)
.push(
SettingsScreen::Servers,
iced_aw::TabLabel::Text("Servers".into()),
server(state),
)
.push(
SettingsScreen::Users,
iced_aw::TabLabel::Text("Users".into()),
user(state),
)
.set_active_tab(&state.settings.screen)
.into()
}
pub fn settings_screen(state: &State) -> Element<'_, Message> {
container(match state.settings.screen {
SettingsScreen::Main => main(state),
SettingsScreen::Servers => server(state),
SettingsScreen::Users => user(state),
})
.width(Length::FillPortion(10))
.height(Length::Fill)
.style(|theme| container::background(theme.extended_palette().background.base.color))
.pipe(container)
.padding(10)
.style(|theme| container::background(theme.extended_palette().secondary.base.color))
.width(Length::FillPortion(10))
.into()
}
pub fn settings_list(state: &State) -> Element<'_, Message> {
column(
[
button(center_text("General")).on_press(Message::Settings(
SettingsMessage::Select(SettingsScreen::Main),
)),
button(center_text("Servers")).on_press(Message::Settings(
SettingsMessage::Select(SettingsScreen::Servers),
)),
button(center_text("Users")).on_press(Message::Settings(SettingsMessage::Select(
SettingsScreen::Users,
))),
]
.map(|p| p.clip(true).width(Length::Fill).into()),
)
.width(Length::FillPortion(2))
.spacing(10)
.padding(10)
.pipe(scrollable)
.into()
}
pub fn main(state: &State) -> Element<'_, Message> {
empty()
Column::new()
.push(text("Main Settings"))
.push(toggler(true).label("HDR"))
.spacing(20)
.padding(20)
.pipe(container)
.into()
}
pub fn server(state: &State) -> Element<'_, Message> {
empty()
Column::new()
.push(text("Server Settings"))
.push(state.settings.server_form.view())
.spacing(20)
.padding(20)
.pipe(container)
.into()
}
pub fn user(state: &State) -> Element<'_, Message> {
empty()
Column::new()
.push(text("User Settings"))
.push(state.settings.login_form.view())
.spacing(20)
.padding(20)
.pipe(container)
.into()
}
}
pub fn center_text(content: &str) -> Element<'_, Message> {
text(content)
.align_x(Alignment::Center)
.width(Length::Fill)
.into()
}
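
Both forms above share the same submit idiom: `validate()` returns `Some(message)` only when every field is non-empty, and `on_press_maybe` leaves the button disabled otherwise. A minimal sketch of that idiom in isolation (the `Form` and `Msg` names are placeholders, not the crate's types):

use iced::widget::{button, text_input};
use iced::Element;

#[derive(Debug, Clone)]
enum Msg {
    NameChanged(String),
    Submit,
}

#[derive(Default)]
struct Form {
    name: String,
}

impl Form {
    // Only yields a message when the form is actually submittable.
    fn validate(&self) -> Option<Msg> {
        (!self.name.is_empty()).then_some(Msg::Submit)
    }

    fn view(&self) -> Element<'_, Msg> {
        iced::widget::column![
            text_input("Name", &self.name).on_input(Msg::NameChanged),
            // `on_press_maybe(None)` renders the button in its disabled state.
            button("Submit").on_press_maybe(self.validate()),
        ]
        .spacing(10)
        .into()
    }
}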

View File

@@ -3,6 +3,7 @@ use super::*;
pub enum VideoMessage {
EndOfStream,
Open(url::Url),
Loaded(VideoHandle<Message, Ready>),
Pause,
Play,
Seek(f64),
@@ -17,34 +18,26 @@ pub fn update(state: &mut State, message: VideoMessage) -> Task<Message> {
Task::none()
}
VideoMessage::Open(url) => {
match Video::new(&url)
.inspect_err(|err| {
tracing::error!("Failed to play video at {}: {:?}", url, err);
Task::perform(VideoHandle::load(url.clone()), move |result| match result {
Ok(video) => Message::Video(VideoMessage::Loaded(video)),
Err(err) => Message::Error(format!("Error opening video at {}: {:?}", url, err)),
})
.inspect(|video| {
tracing::error!("Framerate is {}", video.framerate());
})
.map(Arc::new)
{
Ok(video) => {
state.video = Some(video);
Task::none()
}
Err(err) => Task::done(Message::Error(format!(
"Error opening video at {}: {:?}",
url, err
))),
}
VideoMessage::Loaded(video) => {
state.video = Some(Arc::new(
video.on_end_of_stream(Message::Video(VideoMessage::EndOfStream)),
));
Task::done(VideoMessage::Play).map(Message::Video)
}
VideoMessage::Pause => {
if let Some(video) = state.video.as_mut().and_then(Arc::get_mut) {
video.set_paused(true);
if let Some(ref video) = state.video {
video.pause();
}
Task::none()
}
VideoMessage::Play => {
if let Some(video) = state.video.as_mut().and_then(Arc::get_mut) {
video.set_paused(false);
if let Some(ref video) = state.video {
video.play();
}
Task::none()
}
@@ -55,28 +48,26 @@ pub fn update(state: &mut State, message: VideoMessage) -> Task<Message> {
Task::none()
}
VideoMessage::Stop => {
state.video.as_ref().map(|video| {
video.stop();
});
state.video = None;
Task::none()
}
VideoMessage::Test => {
let url = url::Url::parse(
// "file:///home/servius/Projects/jello/crates/iced_video_player/.media/test.mp4",
"https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm",
// "https://www.youtube.com/watch?v=QbUUaXGA3C4",
)
let url = url::Url::parse("https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c")
.expect("Impossible: Failed to parse hardcoded URL");
Task::done(Message::Video(VideoMessage::Open(url)))
Task::done(VideoMessage::Open(url)).map(Message::Video)
}
}
}
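
The reworked `Open` arm above replaces the old blocking `Video::new` with `Task::perform`, which drives `VideoHandle::load` on iced's async executor and folds its `Result` into a message. A reduced sketch of that flow, with `Media` and `load_media` as hypothetical stand-ins for the real iced-video types:

use iced::Task;

#[derive(Debug, Clone)]
enum Msg {
    Loaded(Media),
    Failed(String),
}

// Placeholder for the loaded video handle.
#[derive(Debug, Clone)]
struct Media;

// Placeholder async loader standing in for `VideoHandle::load`.
async fn load_media(url: url::Url) -> Result<Media, String> {
    let _ = url;
    Ok(Media)
}

fn open(url: url::Url) -> Task<Msg> {
    // Run the future on iced's executor and fold its Result into a message.
    Task::perform(load_media(url), |result| match result {
        Ok(media) => Msg::Loaded(media),
        Err(err) => Msg::Failed(err),
    })
}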
pub fn player(video: &Video) -> Element<'_, Message> {
pub fn player(video: &VideoHandle<Message, Ready>) -> Element<'_, Message> {
container(
VideoPlayer::new(video)
Video::new(video)
.width(Length::Fill)
.height(Length::Fill)
.content_fit(iced::ContentFit::Contain)
.on_end_of_stream(Message::Video(VideoMessage::EndOfStream)),
.content_fit(iced::ContentFit::Contain),
)
.style(|_| container::background(iced::Color::BLACK))
.width(Length::Fill)