Compare commits

...

44 Commits

Author SHA1 Message Date
d509fb7813 feat: Update cargo.lock 2026-01-29 00:45:46 +05:30
dcbb5a127b feat: Move settings to a tab based ui 2026-01-29 00:40:12 +05:30
e66c457b57 feat: Added BACKGROUND_COLOR to settings popup 2026-01-28 02:06:57 +05:30
76fc14c73b feat: Use a floating box for settings 2026-01-28 02:00:45 +05:30
5b4fbd5df6 feat(store): add SecretStore, ApiKey, remove Store trait 2026-01-26 21:00:56 +05:30
e7fd01c0af chore: Update cargo.lock 2026-01-20 21:52:46 +05:30
a040478069 chore: Update flake.lock 2026-01-20 21:52:46 +05:30
e5ef173473 fix(iced-video): Update the color matrices and subtract .5 from uv samples to 2026-01-15 17:25:09 +05:30
429371002b fix(iced-video): Write the conversion matrix buffer so the video actually shows up 2026-01-15 17:01:41 +05:30
335e8fdbef feat: move cuda to linux 2026-01-14 15:55:49 +05:30
9dac0b6c78 feat(iced-video): added video format to the video frame 2026-01-14 09:51:56 +05:30
uttarayan21 97a7a632d4 feat(iced-video): implement planar YUV texture support with HDR conversion matrices and update dependencies 2026-01-04 23:02:47 +05:30
uttarayan21 29390140cd feat(settings): simplify form updates and temporarily disable server toggler 2025-12-27 00:13:54 +05:30
uttarayan21 97c2b3f14c feat(settings): implement user and server form handling with update functions and UI views 2025-12-27 00:04:42 +05:30
uttarayan21 2b2e8060e7 feat(ui-iced): implement settings screen with navigation and basic UI elements 2025-12-26 21:21:58 +05:30
uttarayan21 584495453f feat: Many more improvements to video player now with a subscription 2025-12-26 19:06:40 +05:30
uttarayan21 99853167df feat(config): enable unfree packages, add CUDA toolkit 2025-12-26 10:43:15 +05:30
uttarayan21 fc9555873b refactor: move PlayFlags defaults into Playbin3 and clean up unused prelude imports 2025-12-26 10:39:00 +05:30
uttarayan21 a7ffa69326 fix(iced-video): Fix the very high ram usage
feat(playback): add GstPlayFlags for playbin and playbin3
2025-12-26 10:29:31 +05:30
uttarayan21 4ed15c97f0 feat: Add keybinds to minimal example 2025-12-25 21:43:55 +05:30
uttarayan21 a2491695b3 fix(video): try to optimize memory leaks 2025-12-25 06:28:52 +05:30
uttarayan21 5a0bdae84b fix: Try to minimize frame latency 2025-12-25 05:48:51 +05:30
uttarayan21 5d0b795ba5 feat: Added readme and forgotten id.rs 2025-12-25 02:15:43 +05:30
uttarayan21 ebe2312272 feat: Get iced-video working 2025-12-25 02:14:56 +05:30
uttarayan21 3382aebb1f feat: Added PipelineExt trait for all Children of Pipelines 2025-12-23 01:33:54 +05:30
uttarayan21 8d46bd2b85 feat: Restructure the gst parent<->child relations 2025-12-23 01:09:01 +05:30
uttarayan21 043d1e99f0 feat: Modify gst crate to add lot of more granularity 2025-12-22 13:27:30 +05:30
uttarayan21 d42ef3b550 feat(gst): enhance GStreamer integration with new modules and improved API
This commit introduces significant enhancements to the GStreamer integration by:
- Adding new modules for bins, caps, elements, pads, and plugins
- Implementing a more ergonomic API with helper methods like play(), pause(), ready()
- Adding support for various GStreamer plugins including app, autodetect, playback, and videoconvertscale
- Improving error handling with better context attachment
- Updating dependencies to latest versions including gstreamer-video 0.24.4
- Refactoring existing code to use modern Rust patterns and features
2025-12-17 23:35:05 +05:30
uttarayan21 21cbaff610 feat(gst): implement Playbin3 wrapper with basic playback functionality 2025-12-17 14:08:17 +05:30
uttarayan21 a0bda88246 feat(gst): add glib dependency and update video texture handling 2025-12-17 14:07:53 +05:30
uttarayan21 ccae03d105 feat: enable proper detection of hdr texture 2025-12-16 19:14:20 +05:30
uttarayan21 232c0f4d53 chore: Update .gitignore file 2025-12-16 14:50:49 +05:30
uttarayan21 5cec7821d0 feat: Move perf to linux only packages and update Cargo.lock file 2025-12-16 14:50:00 +05:30
uttarayan21 c2fdedf05a feat(examples): update package name and add perf ignore rules 2025-12-16 02:26:54 +05:30
uttarayan21 7003002b69 chore: remove unused rust-analyzer target files and fix compilation errors in jello-test example 2025-12-16 02:25:36 +05:30
uttarayan21 c675c29be3 chore(gst): Remove flake files and configurations from gst 2025-12-16 02:25:09 +05:30
uttarayan21 7f9152e8fd feat(gst): Added gst a high level wrapper over gstreamer
chore(example): Added hdr-gstreamer-wgpu example
chore(license): Added MIT license to all crates
2025-12-16 02:23:30 +05:30
uttarayan21 6cc83ba655 chore: remove iced_video_player crate and its dependencies 2025-12-15 17:59:40 +05:30
uttarayan21 253d27c176 feat: Update iced_video_player to master 2025-12-13 03:40:12 +05:30
uttarayan21 c7afcd3f0d fix: remove debug statements from video playback initialization 2025-12-09 23:56:20 +05:30
uttarayan21 d75a2fb7e4 feat(ui): comment out gpui ui code and improve iced ui logic 2025-12-09 23:46:00 +05:30
uttarayan21 73fcf9bad1 feat: add jello-types crate and update dependencies with backtrace support 2025-12-09 23:28:51 +05:30
uttarayan21 05ae9ff570 feat(store): add database storage with redb and bson support
This commit introduces a new `store` crate that provides database functionality using redb for storage and bson for serialization. It includes tables for users, servers, and settings, along with async operations for getting, inserting, modifying, and removing data. The store supports UUID keys and integrates with the existing Jellyfin client authentication flow.

The changes also include:
- Adding new dependencies to Cargo.lock for bitvec, bson, deranged, funty, num-conv, powerfmt, radium, serde_bytes, simdutf8, time, and wyz
- Updating Cargo.toml to include the new store crate in workspace members
- Modifying ui-iced to use the new database initialization flow with config loading from TOML
- Adding a settings module to ui-iced with UI components for managing server and user configuration
- Implementing secret string handling for sensitive data like passwords
- Updating API client to support pre-authenticated clients with cached tokens
2025-11-26 16:15:41 +05:30
uttarayan21 ca1fd2e977 feat: Update the api crate 2025-11-25 18:48:13 +05:30
79 changed files with 7379 additions and 9035 deletions

Cargo.lock (generated, 1826 changed lines): file diff suppressed because it is too large.

@@ -5,19 +5,24 @@ members = [
     "typegen",
     "ui-gpui",
     "ui-iced",
-    "crates/iced_video_player",
+    "store",
+    "jello-types",
+    "gst",
+    "examples/hdr-gstreamer-wgpu",
+    "crates/iced-video",
 ]

 [workspace.dependencies]
-iced = { git = "https://github.com/iced-rs/iced", features = [
-    "advanced",
-    "canvas",
-    "image",
-    "sipper",
-    "tokio",
-    "debug",
-] }
-iced_wgpu = { git = "https://github.com/iced-rs/iced" }
-iced_video_player = { path = "crates/iced_video_player" }
+iced = { version = "0.14.0" }
+gst = { version = "0.1.0", path = "gst" }
+iced_wgpu = { version = "0.14.0" }
+iced-video = { version = "0.1.0", path = "crates/iced-video" }
+
+[patch.crates-io]
+iced_wgpu = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
+iced_core = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
+iced_renderer = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
+iced_futures = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
+iced = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }

 [package]
 name = "jello"
@@ -27,8 +32,11 @@ license = "MIT"
 [dependencies]
 api = { version = "0.1.0", path = "api" }
+bytemuck = { version = "1.24.0", features = ["derive"] }
 clap = { version = "4.5", features = ["derive"] }
+clap-verbosity-flag = { version = "3.0.4", features = ["tracing"] }
 clap_complete = "4.5"
+color-backtrace = "0.7.2"
 dotenvy = "0.15.7"
 error-stack = "0.6"
 thiserror = "2.0"

README.md (new file, 112 lines):

@@ -0,0 +1,112 @@
# Jello
A WIP video client for jellyfin.
(Planned) Features
1. Integrate with jellyfin
2. HDR video playback
3. Audio Track selection
4. Chapter selection
Libraries and frameworks used for this
1. iced -> primary gui toolkit
2. gstreamer -> primary video + audio decoding library
3. wgpu -> rendering the video from gstreamer in iced
### HDR
I'll try to document all my findings about HDR here.
I'm making this project to mainly learn about videos, color-spaces and gpu programming. And so very obviously I'm bound to make mistakes in either the code or the fundamental understanding of a concept. Please don't take anything in this text as absolute.
```rust
let window = ... // use winit to get a window handle, check the example in this repo
let instance = wgpu::Instance::default();
let surface = instance.create_surface(window).unwrap();
let adapter = instance
.request_adapter(&wgpu::RequestAdapterOptions {
power_preference: wgpu::PowerPreference::default(),
compatible_surface: Some(&surface),
force_fallback_adapter: false,
})
.await
.context("Failed to request wgpu adapter")?;
let caps = surface.get_capabilities(&adapter);
println!("{:#?}", caps.formats);
```
This should print out all the texture formats that can be used by your current hardware
Among these the formats that support hdr (afaik) are
```
wgpu::TextureFormat::Rgba16Float
wgpu::TextureFormat::Rgba32Float
wgpu::TextureFormat::Rgb10a2Unorm
wgpu::TextureFormat::Rgb10a2Uint // (unsure)
```
My display supports Rgb10a2Unorm so I'll be going forward with that texture format.
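Not every machine will expose `Rgb10a2Unorm` though, so picking the surface format needs a fallback. A minimal sketch (reusing `caps` from the snippet above; not code from this repo):
```rust
// Prefer an HDR-capable surface format when the hardware exposes one,
// otherwise fall back to the first (preferred) format of the surface.
let surface_format = caps
    .formats
    .iter()
    .copied()
    .find(|format| {
        matches!(
            format,
            wgpu::TextureFormat::Rgba16Float | wgpu::TextureFormat::Rgb10a2Unorm
        )
    })
    .unwrap_or(caps.formats[0]);
```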
`Rgb10a2Unorm` is still the same size as `Rgba8Unorm`, but the bits are distributed differently in each:
`Rgb10a2Unorm`:
R, G, B => 10 bits each (2^10 = 1024 [0..=1023])
A => 2 bits (2^2 = 4 [0..=3])
Whereas in a normal pixel format,
`Rgba8Unorm`:
R, G, B, A => 8 bits each (2^8 = 256 [0..=255])
For displaying videos the alpha component is not really used (I don't know of any use for it), so we can re-allocate the 6 extra bits from the alpha channel and put them in the r, g and b components.
In the shader the components get uniformly normalized from the [0..=1023] integer range to [0..=1] in float so we can compute with them properly.
Videos however are generally not stored in this format or any rgb format in general because it is not as efficient for (lossy) compression as YUV formats.
Right now I don't want to deal with yuv formats so I'll use gstreamer caps to convert the video into the `Rgb10a2Unorm` format.
## Pixel formats and Planes
Dated: Sun Jan 4 09:09:16 AM IST 2026
| value | count | quantile | percentage | frequency |
| --- | --- | --- | --- | --- |
| yuv420p | 1815 | 0.5067001675041876 | 50.67% | ************************************************** |
| yuv420p10le | 1572 | 0.4388609715242881 | 43.89% | ******************************************* |
| yuvj420p | 171 | 0.04773869346733668 | 4.77% | **** |
| rgba | 14 | 0.003908431044109436 | 0.39% | |
| yuvj444p | 10 | 0.0027917364600781687 | 0.28% | |
For all of my media collection these are the pixel formats for all the videos
### RGBA
Pretty self evident:
8 bits for each of R, G, B and A.
Hopefully it shouldn't be too hard to make a function (or possibly a LUT) that takes data from rgba and maps it to Rgb10a2Unorm; see the sketch after the diagram below.
```mermaid
packet
title RGBA
+8: "R"
+8: "G"
+8: "B"
+8: "A"
```
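A first stab at that mapping could look like this (a hypothetical helper, not code from this repo; it assumes R sits in the lowest bits of the packed texel):
```rust
/// Pack one RGBA8 pixel into a Rgb10a2Unorm texel (hypothetical sketch).
fn rgba8_to_rgb10a2(r: u8, g: u8, b: u8, a: u8) -> u32 {
    // Widen 8 -> 10 bits by bit replication, a close approximation
    // of round(x * 1023 / 255).
    let widen = |x: u8| ((x as u32) << 2) | ((x as u32) >> 6);
    let alpha = (a as u32) >> 6; // keep only the top 2 bits of alpha
    widen(r) | (widen(g) << 10) | (widen(b) << 20) | (alpha << 30)
}
```
A real implementation (or a LUT) would work on whole rows at a time, but the bit layout is the interesting part.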
### YUV
[All YUV formats](https://learn.microsoft.com/en-us/windows/win32/medfound/recommended-8-bit-yuv-formats-for-video-rendering#surface-definitions)
[10 and 16 bit yuv formats](https://learn.microsoft.com/en-us/windows/win32/medfound/10-bit-and-16-bit-yuv-video-formats)
Y -> Luminance
U,V -> Chrominance
p -> Planar
sp -> semi planar
j -> full range
planar formats have each of the channels in a contiguous array one after another
in semi-planar formats the y channel is separate and uv channels are interleaved
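To make the layouts concrete, here is what that means for plane sizes (a sketch assuming 8 bits per sample and 4:2:0 subsampling, ignoring row stride/padding; not code from this repo):
```rust
/// Plane sizes in bytes for an 8-bit 4:2:0 frame (hypothetical sketch).
/// yuv420p (planar) stores three planes: Y, then U, then V.
/// nv12 (semi-planar) stores two: Y, then U and V interleaved in one plane.
fn yuv420_plane_sizes(width: usize, height: usize) -> (usize, usize, usize) {
    let y = width * height; // full-resolution luma plane
    let chroma = (width / 2) * (height / 2); // each chroma plane is quarter size
    (y, chroma, chroma) // nv12 fuses the two chroma planes into one interleaved plane
}
```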
## Chroma Subsampling

@@ -2,6 +2,7 @@
 name = "api"
 version = "0.1.0"
 edition = "2024"
+license = "MIT"

 [dependencies]
 bytes = "1.11.0"

@@ -4,7 +4,7 @@ pub async fn main() {
     let config = std::fs::read_to_string("config.toml").expect("Config.toml");
     let config: JellyfinConfig = toml::from_str(&config).expect("Failed to parse config.toml");
-    let mut jellyfin = JellyfinClient::new(config);
+    let mut jellyfin = JellyfinClient::new_with_config(config);
     jellyfin
         .authenticate_with_cached_token(".session")
         .await

@@ -3,7 +3,7 @@ pub mod jellyfin;
 use std::sync::Arc;

 use ::tap::*;
-use reqwest::Method;
+use reqwest::{Method, header::InvalidHeaderValue};
 use serde::{Deserialize, Serialize};

 #[derive(thiserror::Error, Debug)]
@@ -15,6 +15,8 @@ pub enum JellyfinApiError {
     #[error("IO error: {0}")]
     IoError(#[from] std::io::Error),
     #[error("Unknown Jellyfin API error")]
+    InvalidHeader(#[from] InvalidHeaderValue),
+    #[error("Unknown Jellyfin API error")]
     Unknown,
 }
@@ -28,7 +30,49 @@ pub struct JellyfinClient {
 }

 impl JellyfinClient {
-    pub fn new(config: JellyfinConfig) -> Self {
+    pub async fn authenticate(
+        username: impl AsRef<str>,
+        password: impl AsRef<str>,
+        config: JellyfinConfig,
+    ) -> Result<Self> {
+        let url = format!("{}/Users/AuthenticateByName", config.server_url);
+        let client = reqwest::Client::new();
+        let token = client
+            .post(url)
+            .json(&jellyfin::AuthenticateUserByName {
+                username: Some(username.as_ref().to_string()),
+                pw: Some(password.as_ref().to_string()),
+            })
+            .send()
+            .await?
+            .error_for_status()?
+            .json::<jellyfin::AuthenticationResult>()
+            .await?
+            .access_token
+            .ok_or_else(|| std::io::Error::other("No field access_token in auth response"))?;
+        Self::pre_authenticated(token, config)
+    }
+
+    pub fn pre_authenticated(token: impl AsRef<str>, config: JellyfinConfig) -> Result<Self> {
+        let auth_header = core::iter::once((
+            reqwest::header::HeaderName::from_static("x-emby-authorization"),
+            reqwest::header::HeaderValue::from_str(&format!(
+                "MediaBrowser Client=\"{}\", Device=\"{}\", DeviceId=\"{}\", Version=\"{}\"",
+                config.client_name, config.device_name, config.device_id, config.version
+            ))?,
+        ))
+        .collect();
+        let client = reqwest::Client::builder()
+            .default_headers(auth_header)
+            .build()?;
+        Ok(Self {
+            client,
+            access_token: Some(token.as_ref().to_string().into()),
+            config: Arc::new(config),
+        })
+    }
+
+    pub fn new_with_config(config: JellyfinConfig) -> Self {
         JellyfinClient {
             client: reqwest::Client::new(),
             access_token: None,
@@ -119,45 +163,6 @@ impl JellyfinClient {
         Ok(out)
     }

-    pub async fn authenticate(&mut self) -> Result<jellyfin::AuthenticationResult> {
-        let auth_result: jellyfin::AuthenticationResult = self
-            .post(
-                "Users/AuthenticateByName",
-                &jellyfin::AuthenticateUserByName {
-                    username: Some(self.config.username.clone()),
-                    pw: Some(self.config.password.clone()),
-                },
-            )
-            .await?;
-        self.access_token = auth_result.access_token.clone().map(Into::into);
-        Ok(auth_result)
-    }
-
-    pub async fn authenticate_with_cached_token(
-        &mut self,
-        path: impl AsRef<std::path::Path>,
-    ) -> Result<String> {
-        let path = path.as_ref();
-        if let Ok(token) = self
-            .load_token(path)
-            .await
-            .inspect_err(|err| tracing::warn!("Failed to load cached token: {}", err))
-        {
-            tracing::info!("Authenticating with cached token from {:?}", path);
-            self.access_token = Some(token.clone().into());
-            Ok(token)
-        } else {
-            tracing::info!("No cached token found at {:?}, authenticating...", path);
-            let token = self
-                .authenticate()
-                .await?
-                .access_token
-                .ok_or_else(|| JellyfinApiError::Unknown)?;
-            self.save_token(path).await?;
-            Ok(token)
-        }
-    }
-
     pub async fn raw_items(&self) -> Result<jellyfin::BaseItemDtoQueryResult> {
         let text = &self
             .request_builder(Method::GET, "Items")
@@ -250,53 +255,16 @@ impl JellyfinClient {
             "{}/Videos/{}/stream?static=true",
             self.config.server_url.as_str(),
             item,
-            // item,
         );
         Ok(url::Url::parse(&stream_url).expect("Failed to parse stream URL"))
     }
 }

-// pub trait Item {
-//     fn id(&self) -> &str;
-//     fn name(&self) -> &str;
-//     fn type_(&self) -> jellyfin::BaseItemKind;
-//     fn media_type(&self) -> &str;
-// }
-
 #[derive(Debug, Serialize, Deserialize, Clone)]
 pub struct JellyfinConfig {
-    pub username: String,
-    pub password: String,
     pub server_url: iref::IriBuf,
     pub device_id: String,
+    pub device_name: String,
+    pub client_name: String,
+    pub version: String,
 }
-
-impl JellyfinConfig {
-    pub fn new(
-        username: String,
-        password: String,
-        server_url: impl AsRef<str>,
-        device_id: String,
-    ) -> Self {
-        JellyfinConfig {
-            username,
-            password,
-            server_url: iref::IriBuf::new(server_url.as_ref().into())
-                .expect("Failed to parse server URL"),
-            device_id,
-        }
-    }
-}
-
-#[test]
-fn test_client_authenticate() {
-    let config = JellyfinConfig {
-        username: "servius".to_string(),
-        password: "nfz6yqr_NZD1nxk!faj".to_string(),
-        server_url: iref::IriBuf::new("https://jellyfin.tsuba.darksailor.dev".into()).unwrap(),
-        device_id: "testdeviceid".to_string(),
-    };
-    let mut client = JellyfinClient::new(config);
-    let auth_result = tokio_test::block_on(client.authenticate());
-    assert!(auth_result.is_ok());
-}


@@ -0,0 +1,29 @@
[package]
name = "iced-video"
version = "0.1.0"
edition = "2024"
[dependencies]
bytemuck = "1.24.0"
error-stack = "0.6.0"
futures-lite = "2.6.1"
gst.workspace = true
iced_core = "0.14.0"
iced_futures = "0.14.0"
iced_renderer = { version = "0.14.0", features = ["iced_wgpu"] }
iced_wgpu = { version = "0.14.0" }
thiserror = "2.0.17"
tracing = "0.1.43"
wgpu = { version = "27.0.1", features = ["vulkan"] }
[dev-dependencies]
iced.workspace = true
tracing-subscriber = { version = "0.3.22", features = ["env-filter"] }
[profile.dev]
debug = true
[profile.release]
debug = true
# [patch.crates-io]
# iced_wgpu = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }


@@ -0,0 +1,178 @@
use iced_video::{Video, VideoHandle};
pub fn main() -> iced::Result {
use tracing_subscriber::prelude::*;
tracing_subscriber::registry()
.with(
tracing_subscriber::fmt::layer()
.with_thread_ids(true)
.with_file(true),
)
.with(tracing_subscriber::EnvFilter::from_default_env())
.init();
iced::application(State::new, update, view)
.subscription(|state| {
// Foo
match &state.video {
Some(video) => video.subscription_with(state, keyboard_event),
None => keyboard_event(state),
}
})
.run()
}
fn keyboard_event(_state: &State) -> iced::Subscription<Message> {
use iced::keyboard::{Key, key::Named};
iced::keyboard::listen().map(move |event| match event {
iced::keyboard::Event::KeyPressed { key, .. } => {
let key = key.as_ref();
match key {
Key::Named(Named::Escape) | Key::Character("q") => Message::Quit,
Key::Character("f") => Message::Fullscreen,
Key::Named(Named::Space) => Message::Toggle,
_ => Message::Noop,
}
}
_ => Message::Noop,
})
}
#[derive(Debug, Clone)]
pub struct State {
video: Option<VideoHandle<Message>>,
fullscreen: bool,
}
impl State {
pub fn new() -> (Self, iced::Task<Message>) {
(
Self {
video: None,
fullscreen: false,
},
iced::Task::done(Message::Load),
)
}
}
#[derive(Debug, Clone)]
pub enum Message {
Play,
Pause,
Toggle,
Noop,
Load,
Fullscreen,
OnLoad(VideoHandle<Message>),
OnError(String),
NewFrame,
Eos,
Quit,
}
pub fn update(state: &mut State, message: Message) -> iced::Task<Message> {
match message {
Message::NewFrame => {
iced::Task::none()
}
Message::Eos => {
iced::Task::done(Message::Pause)
}
Message::Load => {
iced::Task::perform(
VideoHandle::load(
"https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c",
),
|result| match result {
Ok(video) => Message::OnLoad(video),
Err(err) => Message::OnError(format!("Error loading video: {:?}", err)),
},
).chain(iced::Task::done(Message::Play))
}
Message::OnError(err) => {
eprintln!("Error: {}", err);
iced::Task::none()
}
Message::OnLoad(video) => {
state.video = Some(video.on_new_frame(Message::NewFrame).on_end_of_stream(Message::Eos));
iced::Task::none()
}
Message::Fullscreen => {
state.fullscreen = !state.fullscreen;
let fullscreen = state.fullscreen;
let mode = if fullscreen {
iced::window::Mode::Fullscreen
} else {
iced::window::Mode::Windowed
};
iced::window::oldest().and_then(move |id| iced::window::set_mode::<Message>(id, mode))
}
Message::Play => {
state
.video
.as_ref()
.unwrap()
.source()
.play()
.expect("Failed to play video");
iced::Task::none()
}
Message::Pause => {
state
.video
.as_ref()
.unwrap()
.source()
.pause()
.expect("Failed to pause video");
iced::Task::none()
}
Message::Toggle => {
state
.video
.as_ref()
.unwrap()
.source()
.toggle()
.expect("Failed to stop video");
iced::Task::none()
}
Message::Quit => {
state
.video
.as_ref()
.unwrap()
.source()
.stop()
.expect("Failed to stop video");
std::process::exit(0);
}
Message::Noop => iced::Task::none(),
}
}
pub fn view<'a>(state: &'a State) -> iced::Element<'a, Message> {
if let None = &state.video {
return iced::widget::Column::new()
.push(iced::widget::Text::new("Press any key to load video"))
.align_x(iced::Alignment::Center)
.into();
}
let video_widget = Video::new(&state.video.as_ref().unwrap())
.width(iced::Length::Fill)
.height(iced::Length::Fill)
.content_fit(iced::ContentFit::Contain);
iced::widget::Column::new()
.push(video_widget)
.push(
iced::widget::Row::new()
.push(iced::widget::Button::new("Play").on_press(Message::Play))
.push(iced::widget::Button::new("Pause").on_press(Message::Pause))
.spacing(5)
.padding(10)
.align_y(iced::Alignment::Center),
)
.align_x(iced::Alignment::Center)
.into()
}


@@ -0,0 +1,8 @@
info:
RUST_LOG=info,wgpu_core=warn,wgpu_hal=warn cargo run --release --example minimal
# GST_DEBUG=5 RUST_LOG="" cargo run --release --example minimal
flame:
cargo flamegraph run --release --example minimal
heaptrack:
cargo build --release --example minimal
RUST_LOG="info,wgpu_hal=info" heaptrack $CARGO_TARGET_DIR/release/examples/minimal


@@ -0,0 +1,55 @@
use std::borrow;
use std::sync::atomic::{self, AtomicUsize};
static NEXT_ID: AtomicUsize = AtomicUsize::new(0);
/// The identifier of a generic widget.
#[derive(Debug, Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct Id(Internal);
impl Id {
/// Creates a new [`Id`] from a static `str`.
pub const fn new(id: &'static str) -> Self {
Self(Internal::Custom(borrow::Cow::Borrowed(id)))
}
/// Creates a unique [`Id`].
///
/// This function produces a different [`Id`] every time it is called.
pub fn unique() -> Self {
let id = NEXT_ID.fetch_add(1, atomic::Ordering::Relaxed);
Self(Internal::Unique(id))
}
}
impl From<&'static str> for Id {
fn from(value: &'static str) -> Self {
Self::new(value)
}
}
impl From<String> for Id {
fn from(value: String) -> Self {
Self(Internal::Custom(borrow::Cow::Owned(value)))
}
}
#[derive(Debug, Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
enum Internal {
Unique(usize),
Custom(borrow::Cow<'static, str>),
}
#[cfg(test)]
mod tests {
use super::Id;
#[test]
fn unique_generates_different_ids() {
let a = Id::unique();
let b = Id::unique();
assert_ne!(a, b);
}
}


@@ -0,0 +1,164 @@
pub mod id;
pub mod primitive;
pub mod source;
pub mod widget;
pub use widget::Video;
use error_stack::{Report, ResultExt};
use gst::plugins::app::AppSink;
use gst::plugins::playback::Playbin3;
use gst::plugins::videoconvertscale::VideoConvert;
#[derive(Debug, thiserror::Error)]
#[error("Iced Video Error")]
pub struct Error;
pub type Result<T, E = Report<Error>> = core::result::Result<T, E>;
use std::sync::{Arc, Mutex, atomic::AtomicBool};
mod seal {
pub trait Sealed {}
impl Sealed for super::Unknown {}
impl Sealed for super::Ready {}
}
pub trait State: seal::Sealed {
fn is_ready() -> bool {
false
}
}
#[derive(Debug, Clone)]
pub struct Unknown;
#[derive(Debug, Clone)]
pub struct Ready;
impl State for Unknown {}
impl State for Ready {
fn is_ready() -> bool {
true
}
}
/// This is the video handle that is used to control the video playback.
/// This should be kept in the application state.
#[derive(Debug, Clone)]
pub struct VideoHandle<Message, S: State = Unknown> {
id: id::Id,
pub source: source::VideoSource,
frame_ready: Arc<AtomicBool>,
on_new_frame: Option<Box<Message>>,
on_end_of_stream: Option<Box<Message>>,
on_about_to_finish: Option<Box<Message>>,
__marker: core::marker::PhantomData<S>,
}
impl<Message: Send + Sync + Clone> VideoHandle<Message, Unknown> {
pub fn new(url: impl AsRef<str>) -> Result<Self> {
let source = source::VideoSource::new(url)?;
let frame_ready = Arc::clone(&source.ready);
Ok(Self {
id: id::Id::unique(),
source: source,
on_new_frame: None,
on_end_of_stream: None,
on_about_to_finish: None,
frame_ready,
__marker: core::marker::PhantomData,
})
}
/// Creates a new video handle and waits for the metadata to be loaded.
pub async fn load(url: impl AsRef<str>) -> Result<VideoHandle<Message, Ready>> {
let handle = VideoHandle::new(url)?;
handle.wait().await
}
}
impl<Message: Send + Sync + Clone, S: State> VideoHandle<Message, S> {
pub fn id(&self) -> &id::Id {
&self.id
}
pub fn source(&self) -> &source::VideoSource {
&self.source
}
pub async fn wait(self) -> Result<VideoHandle<Message, Ready>> {
self.source.wait().await?;
Ok(self.state::<Ready>())
}
fn state<S2: State>(self) -> VideoHandle<Message, S2> {
VideoHandle {
id: self.id,
source: self.source,
on_new_frame: self.on_new_frame,
on_end_of_stream: self.on_end_of_stream,
on_about_to_finish: self.on_about_to_finish,
frame_ready: self.frame_ready,
__marker: core::marker::PhantomData,
}
}
// pub fn subscription(&self) -> iced_futures::subscription::Subscription<Message> {
// let sub = widget::VideoSubscription {
// id: self.id.clone(),
// on_end_of_stream: self.on_end_of_stream.clone(),
// on_new_frame: self.on_new_frame.clone(),
// on_about_to_finish: self.on_about_to_finish.clone(),
// bus: self.source.bus.clone(),
// };
// iced_futures::subscription::from_recipe(sub)
// }
//
// pub fn subscription_with<State>(
// &self,
// state: &State,
// f: impl FnOnce(&State) -> iced_futures::subscription::Subscription<Message> + 'static,
// ) -> iced_futures::subscription::Subscription<Message>
// where
// State: Send + Sync + 'static,
// {
// let sub = self.subscription();
// iced_futures::subscription::Subscription::batch([sub, f(state)])
// }
pub fn on_new_frame(self, message: Message) -> Self {
Self {
on_new_frame: Some(Box::new(message)),
..self
}
}
pub fn on_end_of_stream(self, message: Message) -> Self {
Self {
on_end_of_stream: Some(Box::new(message)),
..self
}
}
pub fn on_about_to_finish(self, message: Message) -> Self {
Self {
on_about_to_finish: Some(Box::new(message)),
..self
}
}
pub fn play(&self) {
self.source.play();
}
pub fn pause(&self) {
self.source.pause();
}
pub fn stop(&self) {
self.source.stop();
}
}
impl<Message: Send + Sync + Clone> VideoHandle<Message, Ready> {
pub fn format(&self) -> Result<gst::VideoFormat> {
self.source
.format()
.change_context(Error)
.attach("Failed to get video format")
}
}


@@ -0,0 +1,574 @@
use crate::id;
use gst::videoconvertscale::VideoFormat;
use iced_wgpu::primitive::Pipeline;
use iced_wgpu::wgpu;
use std::collections::BTreeMap;
use std::sync::{Arc, Mutex, atomic::AtomicBool};
#[derive(Clone, Copy, Debug, bytemuck::Zeroable, bytemuck::Pod)]
#[repr(transparent)]
pub struct ConversionMatrix {
matrix: [Vec3f; 3],
}
#[derive(Clone, Copy, Debug, bytemuck::Zeroable, bytemuck::Pod)]
#[repr(C, align(16))]
pub struct Vec3f {
data: [f32; 3],
__padding: u32,
}
impl From<[f32; 3]> for Vec3f {
fn from(value: [f32; 3]) -> Self {
Vec3f {
data: [value[0], value[1], value[2]],
__padding: 0,
}
}
}
impl Vec3f {
pub fn new(x: f32, y: f32, z: f32) -> Self {
Vec3f {
data: [x, y, z],
__padding: 0,
}
}
pub const fn from(data: [f32; 3]) -> Self {
Vec3f {
data: [data[0], data[1], data[2]],
__padding: 0,
}
}
}
// impl ConversionMatrix {
// pub fn desc() -> wgpu::VertexBufferLayout<'static> {
// wgpu::VertexBufferLayout {
// array_stride: core::mem::size_of::<ConversionMatrix>() as wgpu::BufferAddress,
// step_mode: wgpu::VertexStepMode::Vertex,
// attributes: &[
// wgpu::VertexAttribute {
// offset: 0,
// shader_location: 0,
// format: wgpu::VertexFormat::Float32x4,
// },
// wgpu::VertexAttribute {
// offset: 16,
// shader_location: 1,
// format: wgpu::VertexFormat::Float32x4,
// },
// wgpu::VertexAttribute {
// offset: 32,
// shader_location: 2,
// format: wgpu::VertexFormat::Float32x4,
// },
// wgpu::VertexAttribute {
// offset: 48,
// shader_location: 3,
// format: wgpu::VertexFormat::Float32x4,
// },
// ],
// }
// }
// }
pub const BT2020_TO_RGB: ConversionMatrix = ConversionMatrix {
matrix: [
Vec3f::from([1.0, 0.0, 1.4746]),
Vec3f::from([1.0, -0.16455, -0.5714]),
Vec3f::from([1.0, 1.8814, 0.0]),
],
};
pub const BT709_TO_RGB: ConversionMatrix = ConversionMatrix {
matrix: [
Vec3f::from([1.0, 0.0, 1.5748]),
Vec3f::from([1.0, -0.1873, -0.4681]),
Vec3f::from([1.0, 1.8556, 0.0]),
],
};
#[derive(Debug)]
pub struct VideoFrame {
pub id: id::Id,
pub size: wgpu::Extent3d,
pub ready: Arc<AtomicBool>,
pub frame: Arc<Mutex<gst::Sample>>,
pub format: VideoFormat,
}
impl iced_wgpu::Primitive for VideoFrame {
type Pipeline = VideoPipeline;
fn prepare(
&self,
pipeline: &mut Self::Pipeline,
device: &wgpu::Device,
queue: &wgpu::Queue,
bounds: &iced_wgpu::core::Rectangle,
viewport: &iced_wgpu::graphics::Viewport,
) {
let video = pipeline.videos.entry(self.id.clone()).or_insert_with(|| {
let texture = VideoTexture::new(
"iced-video-texture",
self.size,
device,
pipeline.format,
self.format,
);
let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
label: Some("iced-video-texture-bind-group"),
layout: &pipeline.bind_group_layout,
entries: &[
wgpu::BindGroupEntry {
binding: 0,
resource: wgpu::BindingResource::TextureView(&texture.y_texture()),
},
wgpu::BindGroupEntry {
binding: 1,
resource: wgpu::BindingResource::TextureView(&texture.uv_texture()),
},
wgpu::BindGroupEntry {
binding: 2,
resource: wgpu::BindingResource::Sampler(&pipeline.sampler),
},
wgpu::BindGroupEntry {
binding: 3,
resource: wgpu::BindingResource::Buffer(
texture
.conversion_matrix_buffer()
.as_entire_buffer_binding(),
),
},
],
});
let matrix = if matches!(self.format, VideoFormat::P01010le | VideoFormat::P016Le) {
BT2020_TO_RGB
} else {
BT709_TO_RGB
};
texture.write_conversion_matrix(&matrix, queue);
VideoFrameData {
id: self.id.clone(),
texture,
bind_group,
conversion_matrix: matrix,
ready: Arc::clone(&self.ready),
}
});
if self.size != video.texture.size() {
let new_texture = video
.texture
.resize("iced-video-texture-resized", self.size, device);
new_texture.write_conversion_matrix(&video.conversion_matrix, queue);
let new_bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
label: Some("iced-video-texture-bind-group"),
layout: &pipeline.bind_group_layout,
entries: &[
wgpu::BindGroupEntry {
binding: 0,
resource: wgpu::BindingResource::TextureView(&new_texture.y_texture()),
},
wgpu::BindGroupEntry {
binding: 1,
resource: wgpu::BindingResource::TextureView(&new_texture.uv_texture()),
},
wgpu::BindGroupEntry {
binding: 2,
resource: wgpu::BindingResource::Sampler(&pipeline.sampler),
},
wgpu::BindGroupEntry {
binding: 3,
resource: wgpu::BindingResource::Buffer(
video
.texture
.conversion_matrix_buffer()
.as_entire_buffer_binding(),
),
},
],
});
video.texture = new_texture;
video.bind_group = new_bind_group;
}
if video.ready.load(std::sync::atomic::Ordering::SeqCst) {
let frame = self.frame.lock().expect("BUG: Mutex poisoned");
let buffer = frame
.buffer()
.expect("BUG: Failed to get frame data from gst::Sample");
let data = buffer
.map_readable()
.expect("BUG: Failed to map gst::Buffer readable");
video.texture.write_texture(&data, queue);
drop(data);
video
.ready
.store(false, std::sync::atomic::Ordering::SeqCst);
}
}
fn render(
&self,
pipeline: &Self::Pipeline,
encoder: &mut wgpu::CommandEncoder,
target: &wgpu::TextureView,
bounds: &iced_wgpu::core::Rectangle<u32>,
) {
let Some(video) = pipeline.videos.get(&self.id) else {
return;
};
let mut render_pass = encoder.begin_render_pass(&wgpu::RenderPassDescriptor {
label: Some("iced-video-render-pass"),
color_attachments: &[Some(wgpu::RenderPassColorAttachment {
view: target,
resolve_target: None,
ops: wgpu::Operations {
load: wgpu::LoadOp::Load,
store: wgpu::StoreOp::Store,
},
depth_slice: None,
})],
depth_stencil_attachment: None,
timestamp_writes: None,
occlusion_query_set: None,
});
render_pass.set_pipeline(&pipeline.pipeline);
render_pass.set_bind_group(0, &video.bind_group, &[]);
render_pass.set_scissor_rect(
bounds.x as _,
bounds.y as _,
bounds.width as _,
bounds.height as _,
);
render_pass.draw(0..3, 0..1);
// self.ready
// .store(false, std::sync::atomic::Ordering::Relaxed);
}
}
/// NV12 or P010 are only supported in DX12 and Vulkan backends.
/// While we can use vulkan with moltenvk on macos, I'd much rather use metal directly
/// Right now only supports interleaved UV formats.
/// For planar formats we would need 3 textures.
/// Also NV12 and P010 textures are not COPY_DST capable
/// This assumes 4:2:0 chroma subsampling (for now).
/// So for 4 Y samples there is 1 U and 1 V sample.
/// This means that the UV texture is half the width and half the height of the Y texture.
#[derive(Debug)]
pub struct VideoTexture {
y: wgpu::Texture,
uv: wgpu::Texture,
size: wgpu::Extent3d,
video_format: VideoFormat,
surface_format: wgpu::TextureFormat,
conversion_matrix_buffer: wgpu::Buffer,
}
impl VideoTexture {
pub fn size(&self) -> wgpu::Extent3d {
self.size
}
pub fn new(
label: &str,
size: wgpu::Extent3d,
device: &wgpu::Device,
surface_format: wgpu::TextureFormat,
video_format: VideoFormat,
) -> Self {
let surface_hdr = surface_format.is_wide();
let video_hdr = matches!(video_format, VideoFormat::P01010le | VideoFormat::P016Le);
if surface_hdr && !video_hdr {
tracing::warn!("Surface texture is HDR but video format is SDR");
} else if !surface_hdr && video_hdr {
tracing::warn!("Video format is HDR but surface does not support HDR");
}
let y_texture = device.create_texture(&wgpu::TextureDescriptor {
label: Some(&format!("{}-y", label)),
size: wgpu::Extent3d {
width: size.width,
height: size.height,
depth_or_array_layers: 1,
},
mip_level_count: 1,
sample_count: 1,
dimension: wgpu::TextureDimension::D2,
format: wgpu::TextureFormat::R16Unorm,
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
view_formats: &[],
});
let uv_texture = device.create_texture(&wgpu::TextureDescriptor {
label: Some(&format!("{}-uv", label)),
size: wgpu::Extent3d {
width: size.width / 2,
height: size.height / 2,
depth_or_array_layers: 1,
},
mip_level_count: 1,
sample_count: 1,
dimension: wgpu::TextureDimension::D2,
format: wgpu::TextureFormat::Rg16Unorm,
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
view_formats: &[],
});
let buffer = device.create_buffer(&wgpu::BufferDescriptor {
label: Some("iced-video-conversion-matrix-buffer"),
usage: wgpu::BufferUsages::UNIFORM | wgpu::BufferUsages::COPY_DST,
size: core::mem::size_of::<ConversionMatrix>() as wgpu::BufferAddress,
mapped_at_creation: false,
});
VideoTexture {
y: y_texture,
uv: uv_texture,
size,
surface_format,
video_format,
conversion_matrix_buffer: buffer,
}
}
// This returns the surface texture format, not the video pixel format
pub fn format(&self) -> wgpu::TextureFormat {
self.surface_format
}
pub fn y_texture(&self) -> wgpu::TextureView {
self.y.create_view(&wgpu::TextureViewDescriptor::default())
}
pub fn uv_texture(&self) -> wgpu::TextureView {
self.uv.create_view(&wgpu::TextureViewDescriptor::default())
}
pub fn resize(&self, name: &str, new_size: wgpu::Extent3d, device: &wgpu::Device) -> Self {
VideoTexture::new(name, new_size, device, self.format(), self.pixel_format())
}
pub fn pixel_format(&self) -> VideoFormat {
self.video_format
}
/// This assumes that the data is laid out correctly for the texture format.
pub fn write_texture(&self, data: &[u8], queue: &wgpu::Queue) {
let Self { y, uv, .. } = self;
let y_size = y.size();
let uv_size = uv.size();
let y_data_size = (y_size.width * y_size.height * 2) as usize;
let uv_data_size = (y_data_size / 2) as usize; // UV is interleaved
let y_data = &data[0..y_data_size];
let uv_data = &data[y_data_size..y_data_size + uv_data_size];
queue.write_texture(
wgpu::TexelCopyTextureInfo {
texture: y,
mip_level: 0,
origin: wgpu::Origin3d::ZERO,
aspect: wgpu::TextureAspect::All,
},
y_data,
wgpu::TexelCopyBufferLayout {
offset: 0,
bytes_per_row: Some(y_size.width * 2),
rows_per_image: None,
},
y_size,
);
queue.write_texture(
wgpu::TexelCopyTextureInfo {
texture: uv,
mip_level: 0,
origin: wgpu::Origin3d::ZERO,
aspect: wgpu::TextureAspect::All,
},
uv_data,
wgpu::TexelCopyBufferLayout {
offset: 0,
bytes_per_row: Some(uv_size.width * 4),
rows_per_image: None,
},
uv_size,
);
}
pub fn write_conversion_matrix(&self, matrix: &ConversionMatrix, queue: &wgpu::Queue) {
queue.write_buffer(
&self.conversion_matrix_buffer,
0,
bytemuck::bytes_of(matrix),
);
}
pub fn conversion_matrix_buffer(&self) -> &wgpu::Buffer {
&self.conversion_matrix_buffer
}
}
#[derive(Debug)]
pub struct VideoFrameData {
id: id::Id,
texture: VideoTexture,
bind_group: wgpu::BindGroup,
conversion_matrix: ConversionMatrix,
ready: Arc<AtomicBool>,
}
impl VideoFrameData {
pub fn is_hdr(&self) -> bool {
self.texture.format().is_wide()
}
}
#[derive(Debug)]
pub struct VideoPipeline {
pipeline: wgpu::RenderPipeline,
bind_group_layout: wgpu::BindGroupLayout,
sampler: wgpu::Sampler,
format: wgpu::TextureFormat,
videos: BTreeMap<id::Id, VideoFrameData>,
}
pub trait WideTextureFormatExt {
fn is_wide(&self) -> bool;
}
impl WideTextureFormatExt for wgpu::TextureFormat {
fn is_wide(&self) -> bool {
matches!(
self,
wgpu::TextureFormat::Rgba16Float
| wgpu::TextureFormat::Rgba32Float
| wgpu::TextureFormat::Rgb10a2Unorm
| wgpu::TextureFormat::Rgb10a2Uint
| wgpu::TextureFormat::P010
)
}
}
impl Pipeline for VideoPipeline {
fn new(device: &wgpu::Device, queue: &wgpu::Queue, format: wgpu::TextureFormat) -> Self
where
Self: Sized,
{
if format.is_wide() {
tracing::info!("HDR texture format detected: {:?}", format);
}
let bind_group_layout = device.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
label: Some("iced-video-texture-bind-group-layout"),
entries: &[
// y
wgpu::BindGroupLayoutEntry {
binding: 0,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Texture {
multisampled: false,
view_dimension: wgpu::TextureViewDimension::D2,
sample_type: wgpu::TextureSampleType::Float { filterable: true },
},
count: None,
},
// uv
wgpu::BindGroupLayoutEntry {
binding: 1,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Texture {
multisampled: false,
view_dimension: wgpu::TextureViewDimension::D2,
sample_type: wgpu::TextureSampleType::Float { filterable: true },
},
count: None,
},
// sampler
wgpu::BindGroupLayoutEntry {
binding: 2,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
count: None,
},
// conversion matrix
wgpu::BindGroupLayoutEntry {
binding: 3,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Buffer {
ty: wgpu::BufferBindingType::Uniform,
has_dynamic_offset: false,
min_binding_size: None,
},
count: None,
},
],
});
let shader_passthrough =
device.create_shader_module(wgpu::include_wgsl!("shaders/passthrough.wgsl"));
let render_pipeline_layout =
device.create_pipeline_layout(&wgpu::PipelineLayoutDescriptor {
label: Some("iced-video-render-pipeline-layout"),
bind_group_layouts: &[&bind_group_layout],
push_constant_ranges: &[],
});
let pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
label: Some("iced-video-render-pipeline"),
layout: Some(&render_pipeline_layout),
vertex: wgpu::VertexState {
module: &shader_passthrough,
entry_point: Some("vs_main"),
buffers: &[],
compilation_options: wgpu::PipelineCompilationOptions::default(),
},
fragment: Some(wgpu::FragmentState {
module: &shader_passthrough,
entry_point: Some("fs_main"),
targets: &[Some(wgpu::ColorTargetState {
format,
blend: Some(wgpu::BlendState::REPLACE),
write_mask: wgpu::ColorWrites::ALL,
})],
compilation_options: wgpu::PipelineCompilationOptions::default(),
}),
primitive: wgpu::PrimitiveState::default(),
depth_stencil: None,
multisample: wgpu::MultisampleState::default(),
multiview: None,
cache: None,
});
let sampler = device.create_sampler(&wgpu::SamplerDescriptor {
label: Some("iced-video-sampler"),
address_mode_u: wgpu::AddressMode::ClampToEdge,
address_mode_v: wgpu::AddressMode::ClampToEdge,
address_mode_w: wgpu::AddressMode::ClampToEdge,
mag_filter: wgpu::FilterMode::Linear,
min_filter: wgpu::FilterMode::Linear,
mipmap_filter: wgpu::FilterMode::Nearest,
..Default::default()
});
Self {
pipeline,
bind_group_layout,
sampler,
format,
videos: BTreeMap::new(),
}
}
}


@@ -0,0 +1,30 @@
struct VertexOutput {
@builtin(position) clip_position: vec4<f32>,
@location(0) tex_coords: vec2<f32>,
}
@vertex
fn vs_main(
@builtin(vertex_index) in_vertex_index: u32,
) -> VertexOutput {
var out: VertexOutput;
let uv = vec2<f32>(f32((in_vertex_index << 1u) & 2u), f32(in_vertex_index & 2u));
out.clip_position = vec4<f32>(uv * 2.0 - 1.0, 0.0, 1.0);
out.clip_position.y = -out.clip_position.y;
out.tex_coords = uv;
return out;
}
@group(0) @binding(0) var y_texture: texture_2d<f32>;
@group(0) @binding(1) var uv_texture: texture_2d<f32>;
@group(0) @binding(2) var texture_sampler: sampler;
@group(0) @binding(3) var<uniform> rgb_primaries: mat3x3<f32>;
@fragment
fn fs_main(input: VertexOutput) -> @location(0) vec4<f32> {
let y = textureSample(y_texture, texture_sampler, input.tex_coords).r;
let uv = textureSample(uv_texture, texture_sampler, input.tex_coords).rg;
let yuv = vec3f(y, uv.x - 0.5, uv.y - 0.5);
return vec4f(yuv * rgb_primaries, 1.0);
}


@@ -0,0 +1,173 @@
use crate::{Error, Result, ResultExt};
use gst::{
Bus, Gst, MessageType, MessageView, Sink, Source,
app::AppSink,
caps::{Caps, CapsType},
element::ElementExt,
pipeline::PipelineExt,
playback::{PlayFlags, Playbin3},
videoconvertscale::VideoConvert,
};
use std::sync::{Arc, Mutex, atomic::AtomicBool};
#[derive(Debug, Clone)]
pub struct VideoSource {
pub(crate) playbin: Playbin3,
pub(crate) appsink: AppSink,
pub(crate) bus: Bus,
pub(crate) ready: Arc<AtomicBool>,
pub(crate) frame: Arc<Mutex<gst::Sample>>,
pub(crate) size: std::sync::OnceLock<(i32, i32)>,
}
impl VideoSource {
/// Creates a new video source from the given URL.
/// Since this doesn't have to parse the pipeline manually, we aren't sanitizing the URL for
/// now.
pub fn new(url: impl AsRef<str>) -> Result<Self> {
Gst::new();
let mut appsink = AppSink::new("iced-video-sink").change_context(Error)?;
appsink
.drop(true)
.sync(true)
// .async_(true)
.emit_signals(true);
let playbin = Playbin3::new("iced-video")
.change_context(Error)?
.with_uri(url.as_ref())
.with_buffer_duration(core::time::Duration::from_secs(2))
.with_buffer_size(4096 * 4096 * 4 * 3)
.with_ring_buffer_max_size(4096 * 4096 * 4 * 3)
.with_flags(Playbin3::default_flags() | PlayFlags::DOWNLOAD)
.with_video_sink(&appsink);
let bus = playbin.bus().change_context(Error)?;
playbin.pause().change_context(Error)?;
let ready = Arc::new(AtomicBool::new(false));
let frame = Arc::new(Mutex::new(gst::Sample::new()));
appsink.on_new_sample({
let ready = Arc::clone(&ready);
let frame = Arc::clone(&frame);
move |appsink| {
let Ok(sample) = appsink.pull_sample() else {
tracing::error!("Failed to pull video sample from appsink despite being notified of new frame");
return Ok(());
};
{
let mut guard = frame.lock().expect("BUG: Mutex poisoned");
core::mem::replace(&mut *guard, sample);
ready.store(true, std::sync::atomic::Ordering::Relaxed);
}
Ok(())
}
});
Ok(Self {
playbin,
appsink,
bus,
ready,
frame,
size: std::sync::OnceLock::new(),
})
}
pub async fn wait(&self) -> Result<()> {
use futures_lite::StreamExt;
// self.bus_stream()
// .for_each(|msg: gst::Message| {
// use gst::gstreamer::prelude::*;
// match msg.view() {
// MessageView::Eos(_) => {
// tracing::info!("Video reached end of stream");
// }
// MessageView::Error(err) => {
// tracing::error!(
// "Video Error from {:?}: {} ({:?})",
// err.src().map(|s| s.path_string()),
// err.error(),
// err.debug()
// );
// }
// view => tracing::info!("Video Message: {:#?}", view),
// }
// })
// .await;
self.playbin
.wait_for_states(&[gst::State::Paused, gst::State::Playing])
.await
.change_context(Error)
.attach("Failed to wait for video initialisation")?;
Ok(())
}
pub fn format(&self) -> Result<gst::VideoFormat> {
let caps = self
.appsink
.sink("sink")
.current_caps()
.change_context(Error)?;
let format = caps
.format()
.ok_or(Error)
.attach("Failed to get video caps structure")?;
Ok(format)
}
pub fn bus_stream(&self) -> impl futures_lite::Stream<Item = gst::Message> {
self.bus.stream()
}
pub fn is_playing(&self) -> Result<bool> {
let state = self.playbin.state(None).change_context(Error)?;
Ok(state == gst::State::Playing)
}
pub fn toggle(&self) -> Result<()> {
if self.is_playing()? {
self.pause()?;
} else {
self.play()?;
}
Ok(())
}
pub fn play(&self) -> Result<()> {
self.playbin
.play()
.change_context(Error)
.attach("Failed to play video")
}
pub fn pause(&self) -> Result<()> {
self.playbin
.pause()
.change_context(Error)
.attach("Failed to pause video")
}
pub fn stop(&self) -> Result<()> {
self.playbin
.stop()
.change_context(Error)
.attach("Failed to stop video")
}
pub fn size(&self) -> Result<(i32, i32)> {
if let Some(size) = self.size.get() {
return Ok(*size);
}
let caps = self
.appsink
.sink("sink")
.current_caps()
.change_context(Error)?;
let out = caps
.width()
.and_then(|width| caps.height().map(|height| (width, height)))
.ok_or(Error)
.attach("Failed to get width, height")?;
self.size.set(out);
Ok(out)
}
}


@@ -0,0 +1,258 @@
use super::*;
use iced::Length;
use iced_core as iced;
use iced_wgpu::primitive::Renderer as PrimitiveRenderer;
use std::marker::PhantomData;
/// This is the Video widget that displays a video.
/// This should be used in the view function.
pub struct Video<'a, Message, Theme = iced::Theme, Renderer = iced_wgpu::Renderer>
where
Renderer: PrimitiveRenderer,
{
id: id::Id,
handle: &'a VideoHandle<Message, Ready>,
video_format: gst::VideoFormat,
content_fit: iced::ContentFit,
width: iced::Length,
height: iced::Length,
looping: bool,
__marker: PhantomData<(Renderer, Theme)>,
}
impl<'a, Message, Theme, Renderer> Video<'a, Message, Theme, Renderer>
where
Renderer: PrimitiveRenderer,
Message: Clone + Send + Sync,
{
pub fn new(handle: &'a VideoHandle<Message, Ready>) -> Self {
Self {
id: handle.id.clone(),
handle: &handle,
video_format: handle
.format()
.expect("Failed to get video format during widget creation"),
content_fit: iced::ContentFit::Contain,
width: Length::Shrink,
height: Length::Shrink,
looping: false,
__marker: PhantomData,
}
}
}
impl<'a, Message, Theme, Renderer> Video<'a, Message, Theme, Renderer>
where
Renderer: PrimitiveRenderer,
{
pub fn width(mut self, width: Length) -> Self {
self.width = width;
self
}
pub fn height(mut self, height: Length) -> Self {
self.height = height;
self
}
pub fn content_fit(mut self, fit: iced::ContentFit) -> Self {
self.content_fit = fit;
self
}
// pub fn on_end_of_stream(mut self, message: Message) -> Self {
// self.on_end_of_stream = Some(message);
// self
// }
//
// pub fn on_new_frame(mut self, message: Message) -> Self {
// self.on_new_frame = Some(message);
// self
// }
pub fn looping(mut self, looping: bool) -> Self {
self.looping = looping;
self
}
}
impl<Message, Theme, Renderer> iced::Widget<Message, Theme, Renderer>
for Video<'_, Message, Theme, Renderer>
where
Message: Clone + Send + Sync,
Renderer: PrimitiveRenderer,
{
fn size(&self) -> iced::Size<Length> {
iced::Size {
width: self.width,
height: self.height,
}
}
// The video player should take max space by default
fn layout(
&mut self,
_tree: &mut iced::widget::Tree,
_renderer: &Renderer,
limits: &iced::layout::Limits,
) -> iced::layout::Node {
iced::layout::Node::new(limits.max())
}
fn draw(
&self,
tree: &iced::widget::Tree,
renderer: &mut Renderer,
theme: &Theme,
style: &iced::renderer::Style,
layout: iced::Layout<'_>,
cursor: iced::mouse::Cursor,
viewport: &iced::Rectangle,
) {
if let Ok((width, height)) = self.handle.source.size() {
let video_size = iced::Size {
width: width as f32,
height: height as f32,
};
let bounds = layout.bounds();
let adjusted_fit = self.content_fit.fit(video_size, bounds.size());
let scale = iced::Vector::new(
adjusted_fit.width / video_size.width,
adjusted_fit.height / video_size.height,
);
let final_size = video_size * scale;
let position = match self.content_fit {
iced::ContentFit::None => iced::Point::new(
bounds.x + (video_size.width - adjusted_fit.width) / 2.0,
bounds.y + (video_size.height - adjusted_fit.height) / 2.0,
),
_ => iced::Point::new(
bounds.center_x() - final_size.width / 2.0,
bounds.center_y() - final_size.height / 2.0,
),
};
let drawing_bounds = iced::Rectangle::new(position, final_size);
let render = |renderer: &mut Renderer| {
renderer.draw_primitive(
drawing_bounds,
primitive::VideoFrame {
id: self.id.clone(),
size: iced_wgpu::wgpu::Extent3d {
width: width as u32,
height: height as u32,
depth_or_array_layers: 1,
},
ready: Arc::clone(&self.handle.frame_ready),
frame: Arc::clone(&self.handle.source.frame),
format: self.video_format,
},
);
};
if adjusted_fit.width > bounds.width || adjusted_fit.height > bounds.height {
renderer.with_layer(bounds, render);
} else {
render(renderer);
}
}
}
fn update(
&mut self,
_tree: &mut iced_core::widget::Tree,
event: &iced::Event,
_layout: iced_core::Layout<'_>,
_cursor: iced_core::mouse::Cursor,
_renderer: &Renderer,
_clipboard: &mut dyn iced_core::Clipboard,
shell: &mut iced_core::Shell<'_, Message>,
_viewport: &iced::Rectangle,
) {
if let iced::Event::Window(iced::window::Event::RedrawRequested(when)) = event {
if self
.handle
.frame_ready
.load(std::sync::atomic::Ordering::SeqCst)
{
shell.request_redraw();
} else {
shell.request_redraw_at(iced::window::RedrawRequest::At(
iced_core::time::Instant::now() + core::time::Duration::from_millis(16)
- when.elapsed(),
));
}
}
}
}
impl<'a, Message, Theme, Renderer> From<Video<'a, Message, Theme, Renderer>>
for iced::Element<'a, Message, Theme, Renderer>
where
Message: Send + Sync + 'a + Clone,
Theme: 'a,
Renderer: 'a + iced_wgpu::primitive::Renderer,
{
fn from(video: Video<'a, Message, Theme, Renderer>) -> Self {
Self::new(video)
}
}
#[derive(Debug, Clone)]
pub struct VideoSubscription<Message> {
pub(crate) id: id::Id,
pub(crate) on_end_of_stream: Option<Box<Message>>,
pub(crate) on_new_frame: Option<Box<Message>>,
pub(crate) on_about_to_finish: Option<Box<Message>>,
// on_subtitle_text: Option<Box<dyn Fn(Option<String>) -> Message>>,
// on_error: Option<Box<dyn Fn(&glib::Error) -> Message>>,
pub(crate) bus: gst::Bus,
}
impl<Message> VideoSubscription<Message> where Message: Clone {}
impl<Message> iced_futures::subscription::Recipe for VideoSubscription<Message>
where
Message: Clone + Send + Sync + 'static,
{
type Output = Message;
fn hash(&self, state: &mut iced_futures::subscription::Hasher) {
use std::hash::Hash;
self.id.hash(state);
}
fn stream(
self: Box<Self>,
_input: core::pin::Pin<
Box<dyn iced_futures::futures::Stream<Item = iced_futures::subscription::Event> + Send>,
>,
) -> core::pin::Pin<Box<dyn iced_futures::futures::Stream<Item = Self::Output> + Send>> {
// use iced_futures::futures::StreamExt;
use futures_lite::stream::StreamExt;
Box::pin(
self.bus
.filtered_stream(&[gst::MessageType::Eos, gst::MessageType::Element])
.filter_map({
let eos = self.on_end_of_stream.clone();
let frame = self.on_new_frame.clone();
move |message: gst::Message| match message.view() {
gst::MessageView::Eos(_) => eos.clone().map(|m| *m),
gst::MessageView::Element(element_msg) => {
let structure = element_msg.structure();
if let Some(structure) = structure {
if structure.name() == "GstVideoFrameReady" {
frame.clone().map(|m| *m)
} else {
None
}
} else {
None
}
}
_ => None,
}
}),
)
}
}

File diff suppressed because it is too large.


@@ -1,63 +0,0 @@
[package]
name = "iced_video_player"
description = "A convenient video player widget for Iced"
homepage = "https://github.com/jazzfool/iced_video_player"
repository = "https://github.com/jazzfool/iced_video_player"
readme = "README.md"
keywords = ["gui", "iced", "video"]
categories = ["gui", "multimedia"]
version = "0.6.0"
authors = ["jazzfool"]
edition = "2021"
resolver = "2"
license = "MIT OR Apache-2.0"
exclude = [".media/test.mp4"]
[dependencies]
iced = { git = "https://github.com/iced-rs/iced", features = [
"image",
"advanced",
"wgpu",
] }
iced_wgpu = { git = "https://github.com/iced-rs/iced" }
gstreamer = "0.23"
gstreamer-app = "0.23" # appsink
gstreamer-base = "0.23" # basesrc
glib = "0.20" # gobject traits and error type
log = "0.4"
thiserror = "1"
url = "2" # media uri
[package.metadata.nix]
systems = ["x86_64-linux"]
app = true
build = true
runtimeLibs = [
"vulkan-loader",
"wayland",
"wayland-protocols",
"libxkbcommon",
"xorg.libX11",
"xorg.libXrandr",
"xorg.libXi",
"gst_all_1.gstreamer",
"gst_all_1.gstreamermm",
"gst_all_1.gst-plugins-bad",
"gst_all_1.gst-plugins-ugly",
"gst_all_1.gst-plugins-good",
"gst_all_1.gst-plugins-base",
]
buildInputs = [
"libxkbcommon",
"gst_all_1.gstreamer",
"gst_all_1.gstreamermm",
"gst_all_1.gst-plugins-bad",
"gst_all_1.gst-plugins-ugly",
"gst_all_1.gst-plugins-good",
"gst_all_1.gst-plugins-base",
]
[package.metadata.docs.rs]
rustc-args = ["--cfg", "docsrs"]
rustdoc-args = ["--cfg", "docsrs"]
targets = ["wasm32-unknown-unknown"]


@@ -1,176 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS


@@ -1,23 +0,0 @@
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.


@@ -1,64 +0,0 @@
# Iced Video Player Widget
Composable component, built on the excellent GStreamer library, for playing videos in any Iced application.
<img src=".media/screenshot.png" width="50%" />
## Overview
In general, this supports anything that [`gstreamer/playbin`](https://gstreamer.freedesktop.org/documentation/playback/playbin.html?gi-language=c) supports.
Features:
- Load video files from any file path **or URL** (supports streaming over a network).
- Video buffering when streaming on a network.
- Audio support.
- Programmatic control.
- Can capture thumbnails from a set of timestamps (see the sketch after the usage example below).
- Good performance (i.e., comparable to other video players). GStreamer (with the right plugins) will perform hardware-accelerated decoding, and the color space (YUV to RGB) is converted on the GPU whilst rendering the frame.
Limitations (hopefully to be fixed):
- GStreamer is a bit annoying to set up on Windows.
The player **does not** come with any surrounding GUI controls, but they should be quite easy to implement should you need them.
See the "minimal" example for a demonstration of how to implement pausing, looping, and seeking.
## Example Usage
```rust
use iced_video_player::{Video, VideoPlayer};
fn main() -> iced::Result {
iced::run(App::update, App::view)
}
struct App {
video: Video,
}
impl Default for App {
fn default() -> Self {
App {
video: Video::new(&url::Url::parse("file:///C:/my_video.mp4").unwrap()).unwrap(),
}
}
}
impl App {
fn update(&mut self, _message: ()) {}
fn view(&self) -> iced::Element<()> {
VideoPlayer::new(&self.video).into()
}
}
```
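Thumbnail capture (mentioned in the feature list above) goes through the same `Video` handle. A brief sketch, assuming `video` is a mutable `Video` like the one constructed above and that the caller can propagate `iced_video_player::Error` with `?`:

```rust
use iced_video_player::Position;
use std::num::NonZeroU8;
use std::time::Duration;

// Capture thumbnails at 1 s, 30 s, and 60 s, downscaled 4x in each dimension.
// Best called right at the start of playback, before the position shifts.
let thumbnails = video.thumbnails(
    [
        Position::Time(Duration::from_secs(1)),
        Position::Time(Duration::from_secs(30)),
        Position::Time(Duration::from_secs(60)),
    ],
    NonZeroU8::new(4).unwrap(),
)?;
// Each entry is an `iced::widget::image::Handle`, ready for an `image` widget.
```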
## Building
Follow the [GStreamer build instructions](https://github.com/sdroege/gstreamer-rs#installation). This should be able to compile on MSVC, MinGW, Linux, and macOS.
## License
Licensed under either
- [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- [MIT](http://opensource.org/licenses/MIT)
at your option.


@@ -1,139 +0,0 @@
use iced::{
widget::{Button, Column, Container, Row, Slider, Text},
Element,
};
use iced_video_player::{Video, VideoPlayer};
use std::time::Duration;
fn main() -> iced::Result {
iced::run(App::update, App::view)
}
#[derive(Clone, Debug)]
enum Message {
TogglePause,
ToggleLoop,
Seek(f64),
SeekRelease,
EndOfStream,
NewFrame,
}
struct App {
video: Video,
position: f64,
dragging: bool,
}
impl Default for App {
fn default() -> Self {
App {
video: Video::new(
&url::Url::parse("https://jellyfin.tsuba.darksailor.dev/Videos/1d7e2012-e17d-edbb-25c3-2dbcc803d6b6/stream?static=true")
.expect("Failed to parse URL"),
)
.expect("Failed to create video"),
position: 0.0,
dragging: false,
}
}
}
impl App {
fn update(&mut self, message: Message) {
match message {
Message::TogglePause => {
self.video.set_paused(!self.video.paused());
}
Message::ToggleLoop => {
self.video.set_looping(!self.video.looping());
}
Message::Seek(secs) => {
self.dragging = true;
self.video.set_paused(true);
self.position = secs;
}
Message::SeekRelease => {
self.dragging = false;
self.video
.seek(Duration::from_secs_f64(self.position), false)
.expect("seek");
self.video.set_paused(false);
}
Message::EndOfStream => {
println!("end of stream");
}
Message::NewFrame => {
if !self.dragging {
self.position = self.video.position().as_secs_f64();
}
}
}
}
fn view(&self) -> Element<Message> {
Column::new()
.push(
Container::new(
VideoPlayer::new(&self.video)
.width(iced::Length::Fill)
.height(iced::Length::Fill)
.content_fit(iced::ContentFit::Contain)
.on_end_of_stream(Message::EndOfStream)
.on_new_frame(Message::NewFrame),
)
.align_x(iced::Alignment::Center)
.align_y(iced::Alignment::Center)
.width(iced::Length::Fill)
.height(iced::Length::Fill),
)
.push(
Container::new(
Slider::new(
0.0..=self.video.duration().as_secs_f64(),
self.position,
Message::Seek,
)
.step(0.1)
.on_release(Message::SeekRelease),
)
.padding(iced::Padding::new(5.0).left(10.0).right(10.0)),
)
.push(
Row::new()
.spacing(5)
.align_y(iced::alignment::Vertical::Center)
.padding(iced::Padding::new(10.0).top(0.0))
.push(
Button::new(Text::new(if self.video.paused() {
"Play"
} else {
"Pause"
}))
.width(80.0)
.on_press(Message::TogglePause),
)
.push(
Button::new(Text::new(if self.video.looping() {
"Disable Loop"
} else {
"Enable Loop"
}))
.width(120.0)
.on_press(Message::ToggleLoop),
)
.push(
Text::new(format!(
"{}:{:02}s / {}:{:02}s",
self.position as u64 / 60,
self.position as u64 % 60,
self.video.duration().as_secs() / 60,
self.video.duration().as_secs() % 60,
))
.width(iced::Length::Fill)
.align_x(iced::alignment::Horizontal::Right),
),
)
.into()
}
}


@@ -1,275 +0,0 @@
{
"nodes": {
"crane": {
"flake": false,
"locked": {
"lastModified": 1758758545,
"narHash": "sha256-NU5WaEdfwF6i8faJ2Yh+jcK9vVFrofLcwlD/mP65JrI=",
"owner": "ipetkov",
"repo": "crane",
"rev": "95d528a5f54eaba0d12102249ce42f4d01f4e364",
"type": "github"
},
"original": {
"owner": "ipetkov",
"ref": "v0.21.1",
"repo": "crane",
"type": "github"
}
},
"dream2nix": {
"inputs": {
"nixpkgs": [
"nixCargoIntegration",
"nixpkgs"
],
"purescript-overlay": "purescript-overlay",
"pyproject-nix": "pyproject-nix"
},
"locked": {
"lastModified": 1763413832,
"narHash": "sha256-dkqBwDXiv8MPoFyIvOuC4bVubAP+TlVZUkVMB78TTSg=",
"owner": "nix-community",
"repo": "dream2nix",
"rev": "5658fba3a0b6b7d5cb0460b949651f64f644a743",
"type": "github"
},
"original": {
"owner": "nix-community",
"repo": "dream2nix",
"type": "github"
}
},
"flake-compat": {
"flake": false,
"locked": {
"lastModified": 1696426674,
"narHash": "sha256-kvjfFW7WAETZlt09AgDn1MrtKzP7t90Vf7vypd3OL1U=",
"owner": "edolstra",
"repo": "flake-compat",
"rev": "0f9255e01c2351cc7d116c072cb317785dd33b33",
"type": "github"
},
"original": {
"owner": "edolstra",
"repo": "flake-compat",
"type": "github"
}
},
"flakeCompat": {
"flake": false,
"locked": {
"lastModified": 1761588595,
"narHash": "sha256-XKUZz9zewJNUj46b4AJdiRZJAvSZ0Dqj2BNfXvFlJC4=",
"owner": "edolstra",
"repo": "flake-compat",
"rev": "f387cd2afec9419c8ee37694406ca490c3f34ee5",
"type": "github"
},
"original": {
"owner": "edolstra",
"repo": "flake-compat",
"type": "github"
}
},
"mk-naked-shell": {
"flake": false,
"locked": {
"lastModified": 1681286841,
"narHash": "sha256-3XlJrwlR0nBiREnuogoa5i1b4+w/XPe0z8bbrJASw0g=",
"owner": "90-008",
"repo": "mk-naked-shell",
"rev": "7612f828dd6f22b7fb332cc69440e839d7ffe6bd",
"type": "github"
},
"original": {
"owner": "90-008",
"repo": "mk-naked-shell",
"type": "github"
}
},
"nixCargoIntegration": {
"inputs": {
"crane": "crane",
"dream2nix": "dream2nix",
"mk-naked-shell": "mk-naked-shell",
"nixpkgs": [
"nixpkgs"
],
"parts": "parts",
"rust-overlay": "rust-overlay",
"treefmt": "treefmt"
},
"locked": {
"lastModified": 1763619566,
"narHash": "sha256-92rSHIwh5qTXjcktVEWyKu5EPB3/7UdgjgjtWZ5ET6w=",
"owner": "yusdacra",
"repo": "nix-cargo-integration",
"rev": "ac45d8c0d6876e6547d62bc729654c7b9a79c760",
"type": "github"
},
"original": {
"owner": "yusdacra",
"repo": "nix-cargo-integration",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1763421233,
"narHash": "sha256-Stk9ZYRkGrnnpyJ4eqt9eQtdFWRRIvMxpNRf4sIegnw=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "89c2b2330e733d6cdb5eae7b899326930c2c0648",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"parts": {
"inputs": {
"nixpkgs-lib": [
"nixCargoIntegration",
"nixpkgs"
]
},
"locked": {
"lastModified": 1762980239,
"narHash": "sha256-8oNVE8TrD19ulHinjaqONf9QWCKK+w4url56cdStMpM=",
"owner": "hercules-ci",
"repo": "flake-parts",
"rev": "52a2caecc898d0b46b2b905f058ccc5081f842da",
"type": "github"
},
"original": {
"owner": "hercules-ci",
"repo": "flake-parts",
"type": "github"
}
},
"purescript-overlay": {
"inputs": {
"flake-compat": "flake-compat",
"nixpkgs": [
"nixCargoIntegration",
"dream2nix",
"nixpkgs"
],
"slimlock": "slimlock"
},
"locked": {
"lastModified": 1728546539,
"narHash": "sha256-Sws7w0tlnjD+Bjck1nv29NjC5DbL6nH5auL9Ex9Iz2A=",
"owner": "thomashoneyman",
"repo": "purescript-overlay",
"rev": "4ad4c15d07bd899d7346b331f377606631eb0ee4",
"type": "github"
},
"original": {
"owner": "thomashoneyman",
"repo": "purescript-overlay",
"type": "github"
}
},
"pyproject-nix": {
"inputs": {
"nixpkgs": [
"nixCargoIntegration",
"dream2nix",
"nixpkgs"
]
},
"locked": {
"lastModified": 1752481895,
"narHash": "sha256-luVj97hIMpCbwhx3hWiRwjP2YvljWy8FM+4W9njDhLA=",
"owner": "pyproject-nix",
"repo": "pyproject.nix",
"rev": "16ee295c25107a94e59a7fc7f2e5322851781162",
"type": "github"
},
"original": {
"owner": "pyproject-nix",
"repo": "pyproject.nix",
"type": "github"
}
},
"root": {
"inputs": {
"flakeCompat": "flakeCompat",
"nixCargoIntegration": "nixCargoIntegration",
"nixpkgs": "nixpkgs"
}
},
"rust-overlay": {
"inputs": {
"nixpkgs": [
"nixCargoIntegration",
"nixpkgs"
]
},
"locked": {
"lastModified": 1763606317,
"narHash": "sha256-lsq4Urmb9Iyg2zyg2yG6oMQk9yuaoIgy+jgvYM4guxA=",
"owner": "oxalica",
"repo": "rust-overlay",
"rev": "a5615abaf30cfaef2e32f1ff9bd5ca94e2911371",
"type": "github"
},
"original": {
"owner": "oxalica",
"repo": "rust-overlay",
"type": "github"
}
},
"slimlock": {
"inputs": {
"nixpkgs": [
"nixCargoIntegration",
"dream2nix",
"purescript-overlay",
"nixpkgs"
]
},
"locked": {
"lastModified": 1688756706,
"narHash": "sha256-xzkkMv3neJJJ89zo3o2ojp7nFeaZc2G0fYwNXNJRFlo=",
"owner": "thomashoneyman",
"repo": "slimlock",
"rev": "cf72723f59e2340d24881fd7bf61cb113b4c407c",
"type": "github"
},
"original": {
"owner": "thomashoneyman",
"repo": "slimlock",
"type": "github"
}
},
"treefmt": {
"inputs": {
"nixpkgs": [
"nixCargoIntegration",
"nixpkgs"
]
},
"locked": {
"lastModified": 1762938485,
"narHash": "sha256-AlEObg0syDl+Spi4LsZIBrjw+snSVU4T8MOeuZJUJjM=",
"owner": "numtide",
"repo": "treefmt-nix",
"rev": "5b4ee75aeefd1e2d5a1cc43cf6ba65eba75e83e4",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "treefmt-nix",
"type": "github"
}
}
},
"root": "root",
"version": 7
}


@@ -1,38 +0,0 @@
{
inputs = {
flakeCompat = {
url = "github:edolstra/flake-compat";
flake = false;
};
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
nixCargoIntegration = {
url = "github:yusdacra/nix-cargo-integration";
inputs.nixpkgs.follows = "nixpkgs";
};
};
outputs = inputs: let
pkgs = import inputs.nixpkgs {
system = "x86_64-linux";
};
in {
devShells."x86_64-linux".default = pkgs.mkShell {
# "GST_PLUGIN_PATH" = "${pkgs.gst_all_1.gstreamer}:${pkgs.gst_all_1.gst-plugins-bad}:${pkgs.gst_all_1.gst-plugins-ugly}:${pkgs.gst_all_1.gst-plugins-good}:${pkgs.gst_all_1.gst-plugins-base}";
buildInputs = with pkgs; [
gst_all_1.gstreamer
gst_all_1.gst-plugins-bad
gst_all_1.gst-plugins-ugly
gst_all_1.gst-plugins-good
gst_all_1.gst-plugins-base
libxkbcommon
wayland
rustup
];
nativeBuildInputs = with pkgs; [
pkg-config
wayland
];
packages = with pkgs; [wayland];
};
};
}


@@ -1,12 +0,0 @@
# Flake's devShell for non-flake-enabled nix instances
(import
(
let lock = builtins.fromJSON (builtins.readFile ./flake.lock);
in
fetchTarball {
url =
"https://github.com/edolstra/flake-compat/archive/${lock.nodes.flakeCompat.locked.rev}.tar.gz";
sha256 = lock.nodes.flakeCompat.locked.narHash;
}
)
{ src = ./.; }).shellNix.default


@@ -1,76 +0,0 @@
//! # Iced Video Player
//!
//! A convenient video player widget for Iced.
//!
//! To get started, load a video from a URI (e.g., a file path prefixed with `file:///`) using [`Video::new`](crate::Video::new),
//! then use it like any other Iced widget in your `view` function by creating a [`VideoPlayer`].
//!
//! Example:
//! ```rust
//! use iced_video_player::{Video, VideoPlayer};
//!
//! fn main() -> iced::Result {
//!     iced::run(App::update, App::view)
//! }
//!
//! struct App {
//! video: Video,
//! }
//!
//! impl Default for App {
//! fn default() -> Self {
//! App {
//! video: Video::new(&url::Url::parse("file:///C:/my_video.mp4").unwrap()).unwrap(),
//! }
//! }
//! }
//!
//! impl App {
//!     fn update(&mut self, _message: ()) {}
//!
//!     fn view(&self) -> iced::Element<()> {
//!         VideoPlayer::new(&self.video).into()
//!     }
//! }
//! ```
//!
//! You can programmatically control the video (e.g., seek, pause, loop, grab thumbnails) by accessing various methods on [`Video`].
mod pipeline;
mod video;
mod video_player;
use gstreamer as gst;
use thiserror::Error;
pub use video::Position;
pub use video::Video;
pub use video_player::VideoPlayer;
#[derive(Debug, Error)]
pub enum Error {
#[error("{0}")]
Glib(#[from] glib::Error),
#[error("{0}")]
Bool(#[from] glib::BoolError),
#[error("failed to get the gstreamer bus")]
Bus,
#[error("failed to get AppSink element with name='{0}' from gstreamer pipeline")]
AppSink(String),
#[error("{0}")]
StateChange(#[from] gst::StateChangeError),
#[error("failed to cast gstreamer element")]
Cast,
#[error("{0}")]
Io(#[from] std::io::Error),
#[error("invalid URI")]
Uri,
#[error("failed to get media capabilities")]
Caps,
#[error("failed to query media duration or position")]
Duration,
#[error("failed to sync with playback")]
Sync,
#[error("failed to lock internal sync primitive")]
Lock,
#[error("invalid framerate: {0}")]
Framerate(f64),
}


@@ -1,469 +0,0 @@
use crate::video::Frame;
use iced_wgpu::primitive::Primitive;
use iced_wgpu::wgpu;
use std::{
collections::{btree_map::Entry, BTreeMap},
num::NonZero,
sync::{
atomic::{AtomicBool, AtomicUsize, Ordering},
Arc, Mutex,
},
};
#[repr(C)]
struct Uniforms {
rect: [f32; 4],
// because wgpu min_uniform_buffer_offset_alignment
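// (16-byte `rect` + 240 bytes of padding = 256 bytes, wgpu's default
// `min_uniform_buffer_offset_alignment`, so instance N sits at offset N * 256.)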
_pad: [u8; 240],
}
struct VideoEntry {
texture_y: wgpu::Texture,
texture_uv: wgpu::Texture,
instances: wgpu::Buffer,
bg0: wgpu::BindGroup,
alive: Arc<AtomicBool>,
prepare_index: AtomicUsize,
render_index: AtomicUsize,
}
pub(crate) struct VideoPipeline {
pipeline: wgpu::RenderPipeline,
bg0_layout: wgpu::BindGroupLayout,
sampler: wgpu::Sampler,
videos: BTreeMap<u64, VideoEntry>,
}
impl VideoPipeline {
fn new(device: &wgpu::Device, format: wgpu::TextureFormat) -> Self {
let shader = device.create_shader_module(wgpu::ShaderModuleDescriptor {
label: Some("iced_video_player shader"),
source: wgpu::ShaderSource::Wgsl(include_str!("shader.wgsl").into()),
});
let bg0_layout = device.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
label: Some("iced_video_player bind group 0 layout"),
entries: &[
wgpu::BindGroupLayoutEntry {
binding: 0,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Texture {
sample_type: wgpu::TextureSampleType::Float { filterable: true },
view_dimension: wgpu::TextureViewDimension::D2,
multisampled: false,
},
count: None,
},
wgpu::BindGroupLayoutEntry {
binding: 1,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Texture {
sample_type: wgpu::TextureSampleType::Float { filterable: true },
view_dimension: wgpu::TextureViewDimension::D2,
multisampled: false,
},
count: None,
},
wgpu::BindGroupLayoutEntry {
binding: 2,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
count: None,
},
wgpu::BindGroupLayoutEntry {
binding: 3,
visibility: wgpu::ShaderStages::VERTEX,
ty: wgpu::BindingType::Buffer {
ty: wgpu::BufferBindingType::Uniform,
has_dynamic_offset: true,
min_binding_size: None,
},
count: None,
},
],
});
let layout = device.create_pipeline_layout(&wgpu::PipelineLayoutDescriptor {
label: Some("iced_video_player pipeline layout"),
bind_group_layouts: &[&bg0_layout],
push_constant_ranges: &[],
});
let pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
label: Some("iced_video_player pipeline"),
layout: Some(&layout),
vertex: wgpu::VertexState {
module: &shader,
entry_point: Some("vs_main"),
buffers: &[],
compilation_options: wgpu::PipelineCompilationOptions::default(),
},
primitive: wgpu::PrimitiveState::default(),
depth_stencil: None,
multisample: wgpu::MultisampleState {
count: 1,
mask: !0,
alpha_to_coverage_enabled: false,
},
fragment: Some(wgpu::FragmentState {
module: &shader,
entry_point: Some("fs_main"),
targets: &[Some(wgpu::ColorTargetState {
format,
blend: None,
write_mask: wgpu::ColorWrites::ALL,
})],
compilation_options: wgpu::PipelineCompilationOptions::default(),
}),
multiview: None,
cache: None,
});
let sampler = device.create_sampler(&wgpu::SamplerDescriptor {
label: Some("iced_video_player sampler"),
address_mode_u: wgpu::AddressMode::ClampToEdge,
address_mode_v: wgpu::AddressMode::ClampToEdge,
address_mode_w: wgpu::AddressMode::ClampToEdge,
mag_filter: wgpu::FilterMode::Linear,
min_filter: wgpu::FilterMode::Linear,
mipmap_filter: wgpu::FilterMode::Nearest,
lod_min_clamp: 0.0,
lod_max_clamp: 1.0,
compare: None,
anisotropy_clamp: 1,
border_color: None,
});
VideoPipeline {
pipeline,
bg0_layout,
sampler,
videos: BTreeMap::new(),
}
}
fn upload(
&mut self,
device: &wgpu::Device,
queue: &wgpu::Queue,
video_id: u64,
alive: &Arc<AtomicBool>,
(width, height): (u32, u32),
frame: &[u8],
) {
if let Entry::Vacant(entry) = self.videos.entry(video_id) {
let texture_y = device.create_texture(&wgpu::TextureDescriptor {
label: Some("iced_video_player texture"),
size: wgpu::Extent3d {
width,
height,
depth_or_array_layers: 1,
},
mip_level_count: 1,
sample_count: 1,
dimension: wgpu::TextureDimension::D2,
format: wgpu::TextureFormat::R8Unorm,
usage: wgpu::TextureUsages::COPY_DST | wgpu::TextureUsages::TEXTURE_BINDING,
view_formats: &[],
});
let texture_uv = device.create_texture(&wgpu::TextureDescriptor {
label: Some("iced_video_player texture"),
size: wgpu::Extent3d {
width: width / 2,
height: height / 2,
depth_or_array_layers: 1,
},
mip_level_count: 1,
sample_count: 1,
dimension: wgpu::TextureDimension::D2,
format: wgpu::TextureFormat::Rg8Unorm,
usage: wgpu::TextureUsages::COPY_DST | wgpu::TextureUsages::TEXTURE_BINDING,
view_formats: &[],
});
let view_y = texture_y.create_view(&wgpu::TextureViewDescriptor {
label: Some("iced_video_player texture view"),
format: None,
dimension: None,
aspect: wgpu::TextureAspect::All,
base_mip_level: 0,
mip_level_count: None,
base_array_layer: 0,
array_layer_count: None,
usage: Some(wgpu::TextureUsages::empty()),
});
let view_uv = texture_uv.create_view(&wgpu::TextureViewDescriptor {
label: Some("iced_video_player texture view"),
format: None,
dimension: None,
aspect: wgpu::TextureAspect::All,
base_mip_level: 0,
mip_level_count: None,
base_array_layer: 0,
array_layer_count: None,
usage: Some(wgpu::TextureUsages::empty()),
});
let instances = device.create_buffer(&wgpu::BufferDescriptor {
label: Some("iced_video_player uniform buffer"),
size: 256 * std::mem::size_of::<Uniforms>() as u64, // max 256 video players per frame
usage: wgpu::BufferUsages::COPY_DST | wgpu::BufferUsages::UNIFORM,
mapped_at_creation: false,
});
let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
label: Some("iced_video_player bind group"),
layout: &self.bg0_layout,
entries: &[
wgpu::BindGroupEntry {
binding: 0,
resource: wgpu::BindingResource::TextureView(&view_y),
},
wgpu::BindGroupEntry {
binding: 1,
resource: wgpu::BindingResource::TextureView(&view_uv),
},
wgpu::BindGroupEntry {
binding: 2,
resource: wgpu::BindingResource::Sampler(&self.sampler),
},
wgpu::BindGroupEntry {
binding: 3,
resource: wgpu::BindingResource::Buffer(wgpu::BufferBinding {
buffer: &instances,
offset: 0,
size: Some(NonZero::new(std::mem::size_of::<Uniforms>() as _).unwrap()),
}),
},
],
});
entry.insert(VideoEntry {
texture_y,
texture_uv,
instances,
bg0: bind_group,
alive: Arc::clone(alive),
prepare_index: AtomicUsize::new(0),
render_index: AtomicUsize::new(0),
});
}
let VideoEntry {
texture_y,
texture_uv,
..
} = self.videos.get(&video_id).unwrap();
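// NV12 layout: a full-resolution Y plane followed by a half-resolution
// interleaved UV plane, uploaded to the R8 and Rg8 textures respectively.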
queue.write_texture(
wgpu::TexelCopyTextureInfo {
texture: texture_y,
mip_level: 0,
origin: wgpu::Origin3d::ZERO,
aspect: wgpu::TextureAspect::All,
},
&frame[..(width * height) as usize],
wgpu::TexelCopyBufferLayout {
offset: 0,
bytes_per_row: Some(width),
rows_per_image: Some(height),
},
wgpu::Extent3d {
width,
height,
depth_or_array_layers: 1,
},
);
queue.write_texture(
wgpu::TexelCopyTextureInfo {
texture: texture_uv,
mip_level: 0,
origin: wgpu::Origin3d::ZERO,
aspect: wgpu::TextureAspect::All,
},
&frame[(width * height) as usize..],
wgpu::TexelCopyBufferLayout {
offset: 0,
bytes_per_row: Some(width),
rows_per_image: Some(height / 2),
},
wgpu::Extent3d {
width: width / 2,
height: height / 2,
depth_or_array_layers: 1,
},
);
}
fn cleanup(&mut self) {
let ids: Vec<_> = self
.videos
.iter()
.filter_map(|(id, entry)| (!entry.alive.load(Ordering::SeqCst)).then_some(*id))
.collect();
for id in ids {
if let Some(video) = self.videos.remove(&id) {
video.texture_y.destroy();
video.texture_uv.destroy();
video.instances.destroy();
}
}
}
fn prepare(&mut self, queue: &wgpu::Queue, video_id: u64, bounds: &iced::Rectangle) {
if let Some(video) = self.videos.get_mut(&video_id) {
let uniforms = Uniforms {
rect: [
bounds.x,
bounds.y,
bounds.x + bounds.width,
bounds.y + bounds.height,
],
_pad: [0; 240],
};
queue.write_buffer(
&video.instances,
(video.prepare_index.load(Ordering::Relaxed) * std::mem::size_of::<Uniforms>())
as u64,
unsafe {
std::slice::from_raw_parts(
&uniforms as *const _ as *const u8,
std::mem::size_of::<Uniforms>(),
)
},
);
video.prepare_index.fetch_add(1, Ordering::Relaxed);
video.render_index.store(0, Ordering::Relaxed);
}
self.cleanup();
}
fn draw(
&self,
target: &wgpu::TextureView,
encoder: &mut wgpu::CommandEncoder,
clip: &iced::Rectangle<u32>,
video_id: u64,
) {
if let Some(video) = self.videos.get(&video_id) {
let mut pass = encoder.begin_render_pass(&wgpu::RenderPassDescriptor {
label: Some("iced_video_player render pass"),
color_attachments: &[Some(wgpu::RenderPassColorAttachment {
view: target,
resolve_target: None,
ops: wgpu::Operations {
load: wgpu::LoadOp::Load,
store: wgpu::StoreOp::Store,
},
depth_slice: None,
})],
depth_stencil_attachment: None,
timestamp_writes: None,
occlusion_query_set: None,
});
pass.set_pipeline(&self.pipeline);
pass.set_bind_group(
0,
&video.bg0,
&[
(video.render_index.load(Ordering::Relaxed) * std::mem::size_of::<Uniforms>())
as u32,
],
);
pass.set_scissor_rect(clip.x as _, clip.y as _, clip.width as _, clip.height as _);
pass.draw(0..6, 0..1);
video.prepare_index.store(0, Ordering::Relaxed);
video.render_index.fetch_add(1, Ordering::Relaxed);
}
}
}
#[derive(Debug, Clone)]
pub(crate) struct VideoPrimitive {
video_id: u64,
alive: Arc<AtomicBool>,
frame: Arc<Mutex<Frame>>,
size: (u32, u32),
upload_frame: bool,
}
impl VideoPrimitive {
pub fn new(
video_id: u64,
alive: Arc<AtomicBool>,
frame: Arc<Mutex<Frame>>,
size: (u32, u32),
upload_frame: bool,
) -> Self {
VideoPrimitive {
video_id,
alive,
frame,
size,
upload_frame,
}
}
}
impl Primitive for VideoPrimitive {
type Renderer = VideoPipeline;
fn initialize(
&self,
device: &wgpu::Device,
_queue: &wgpu::Queue,
format: wgpu::TextureFormat,
) -> Self::Renderer {
VideoPipeline::new(device, format)
}
fn prepare(
&self,
renderer: &mut Self::Renderer,
device: &wgpu::Device,
queue: &wgpu::Queue,
bounds: &iced::Rectangle,
viewport: &iced_wgpu::graphics::Viewport,
) {
if self.upload_frame {
if let Some(readable) = self.frame.lock().expect("lock frame mutex").readable() {
renderer.upload(
device,
queue,
self.video_id,
&self.alive,
self.size,
readable.as_slice(),
);
}
}
renderer.prepare(
queue,
self.video_id,
&(*bounds
* iced::Transformation::orthographic(
viewport.logical_size().width as _,
viewport.logical_size().height as _,
)),
);
}
fn render(
&self,
renderer: &Self::Renderer,
encoder: &mut wgpu::CommandEncoder,
target: &wgpu::TextureView,
clip_bounds: &iced::Rectangle<u32>,
) {
renderer.draw(target, encoder, clip_bounds, self.video_id);
}
}


@@ -1,61 +0,0 @@
struct VertexOutput {
@builtin(position) position: vec4<f32>,
@location(0) uv: vec2<f32>,
}
struct Uniforms {
rect: vec4<f32>,
}
@group(0) @binding(0)
var tex_y: texture_2d<f32>;
@group(0) @binding(1)
var tex_uv: texture_2d<f32>;
@group(0) @binding(2)
var s: sampler;
@group(0) @binding(3)
var<uniform> uniforms: Uniforms;
@vertex
fn vs_main(@builtin(vertex_index) in_vertex_index: u32) -> VertexOutput {
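// uniforms.rect holds the quad corners already mapped to clip space by the
// CPU-side orthographic transform; expand it into two triangles, with the
// zw components carrying the texture coordinates.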
var quad = array<vec4<f32>, 6>(
vec4<f32>(uniforms.rect.xy, 0.0, 0.0),
vec4<f32>(uniforms.rect.zy, 1.0, 0.0),
vec4<f32>(uniforms.rect.xw, 0.0, 1.0),
vec4<f32>(uniforms.rect.zy, 1.0, 0.0),
vec4<f32>(uniforms.rect.zw, 1.0, 1.0),
vec4<f32>(uniforms.rect.xw, 0.0, 1.0),
);
var out: VertexOutput;
out.uv = quad[in_vertex_index].zw;
out.position = vec4<f32>(quad[in_vertex_index].xy, 1.0, 1.0);
return out;
}
@fragment
fn fs_main(in: VertexOutput) -> @location(0) vec4<f32> {
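// BT.601 limited-range YUV -> RGB; the select() below then applies the
// sRGB-to-linear transfer function to the result.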
let yuv2r = vec3<f32>(1.164, 0.0, 1.596);
let yuv2g = vec3<f32>(1.164, -0.391, -0.813);
let yuv2b = vec3<f32>(1.164, 2.018, 0.0);
var yuv = vec3<f32>(0.0);
yuv.x = textureSample(tex_y, s, in.uv).r - 0.0625;
yuv.y = textureSample(tex_uv, s, in.uv).r - 0.5;
yuv.z = textureSample(tex_uv, s, in.uv).g - 0.5;
var rgb = vec3<f32>(0.0);
rgb.x = dot(yuv, yuv2r);
rgb.y = dot(yuv, yuv2g);
rgb.z = dot(yuv, yuv2b);
let threshold = rgb <= vec3<f32>(0.04045);
let hi = pow((rgb + vec3<f32>(0.055)) / vec3<f32>(1.055), vec3<f32>(2.4));
let lo = rgb * vec3<f32>(1.0 / 12.92);
rgb = select(hi, lo, threshold);
return vec4<f32>(rgb, 1.0);
}


@@ -1,662 +0,0 @@
use crate::Error;
use gstreamer as gst;
use gstreamer_app as gst_app;
use gstreamer_app::prelude::*;
use iced::widget::image as img;
use std::num::NonZeroU8;
use std::ops::{Deref, DerefMut};
use std::sync::atomic::{AtomicBool, AtomicU64, Ordering};
use std::sync::{Arc, Mutex, RwLock};
use std::time::{Duration, Instant};
/// Position in the media.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub enum Position {
/// Position based on time.
///
/// Not the most accurate format for videos.
Time(Duration),
/// Position based on nth frame.
Frame(u64),
}
impl From<Position> for gst::GenericFormattedValue {
fn from(pos: Position) -> Self {
match pos {
Position::Time(t) => gst::ClockTime::from_nseconds(t.as_nanos() as _).into(),
Position::Frame(f) => gst::format::Default::from_u64(f).into(),
}
}
}
impl From<Duration> for Position {
fn from(t: Duration) -> Self {
Position::Time(t)
}
}
impl From<u64> for Position {
fn from(f: u64) -> Self {
Position::Frame(f)
}
}
#[derive(Debug)]
pub(crate) struct Frame(gst::Sample);
impl Frame {
pub fn empty() -> Self {
Self(gst::Sample::builder().build())
}
pub fn readable(&self) -> Option<gst::BufferMap<'_, gst::buffer::Readable>> {
self.0.buffer().and_then(|x| x.map_readable().ok())
}
}
#[derive(Debug)]
pub(crate) struct Internal {
pub(crate) id: u64,
pub(crate) bus: gst::Bus,
pub(crate) source: gst::Pipeline,
pub(crate) alive: Arc<AtomicBool>,
pub(crate) worker: Option<std::thread::JoinHandle<()>>,
pub(crate) width: i32,
pub(crate) height: i32,
pub(crate) framerate: f64,
pub(crate) duration: Duration,
pub(crate) speed: f64,
pub(crate) sync_av: bool,
pub(crate) frame: Arc<Mutex<Frame>>,
pub(crate) upload_frame: Arc<AtomicBool>,
pub(crate) last_frame_time: Arc<Mutex<Instant>>,
pub(crate) looping: bool,
pub(crate) is_eos: bool,
pub(crate) restart_stream: bool,
pub(crate) sync_av_avg: u64,
pub(crate) sync_av_counter: u64,
pub(crate) subtitle_text: Arc<Mutex<Option<String>>>,
pub(crate) upload_text: Arc<AtomicBool>,
}
impl Internal {
pub(crate) fn seek(&self, position: impl Into<Position>, accurate: bool) -> Result<(), Error> {
let position = position.into();
// gstreamer complains if the start & end value types aren't the same
match &position {
Position::Time(_) => self.source.seek(
self.speed,
gst::SeekFlags::FLUSH
| if accurate {
gst::SeekFlags::ACCURATE
} else {
gst::SeekFlags::empty()
},
gst::SeekType::Set,
gst::GenericFormattedValue::from(position),
gst::SeekType::Set,
gst::ClockTime::NONE,
)?,
Position::Frame(_) => self.source.seek(
self.speed,
gst::SeekFlags::FLUSH
| if accurate {
gst::SeekFlags::ACCURATE
} else {
gst::SeekFlags::empty()
},
gst::SeekType::Set,
gst::GenericFormattedValue::from(position),
gst::SeekType::Set,
gst::format::Default::NONE,
)?,
};
*self.subtitle_text.lock().expect("lock subtitle_text") = None;
self.upload_text.store(true, Ordering::SeqCst);
Ok(())
}
pub(crate) fn set_speed(&mut self, speed: f64) -> Result<(), Error> {
let Some(position) = self.source.query_position::<gst::ClockTime>() else {
return Err(Error::Caps);
};
if speed > 0.0 {
self.source.seek(
speed,
gst::SeekFlags::FLUSH | gst::SeekFlags::ACCURATE,
gst::SeekType::Set,
position,
gst::SeekType::End,
gst::ClockTime::from_seconds(0),
)?;
} else {
self.source.seek(
speed,
gst::SeekFlags::FLUSH | gst::SeekFlags::ACCURATE,
gst::SeekType::Set,
gst::ClockTime::from_seconds(0),
gst::SeekType::Set,
position,
)?;
}
self.speed = speed;
Ok(())
}
pub(crate) fn restart_stream(&mut self) -> Result<(), Error> {
self.is_eos = false;
self.set_paused(false);
self.seek(0, false)?;
Ok(())
}
pub(crate) fn set_paused(&mut self, paused: bool) {
self.source
.set_state(if paused {
gst::State::Paused
} else {
gst::State::Playing
})
.unwrap(/* state was changed in ctor; state errors caught there */);
// Set restart_stream flag to make the stream restart on the next Message::NextFrame
if self.is_eos && !paused {
self.restart_stream = true;
}
}
pub(crate) fn paused(&self) -> bool {
self.source.state(gst::ClockTime::ZERO).1 == gst::State::Paused
}
/// Syncs audio with video when there is (inevitably) latency presenting the frame.
pub(crate) fn set_av_offset(&mut self, offset: Duration) {
if self.sync_av {
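// Incremental mean of the observed offsets (nanoseconds):
// avg_n = avg_{n-1} * (n - 1) / n + offset / n.
// Every 128 samples, push the negated average into the pipeline's `av-offset`.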
self.sync_av_counter += 1;
self.sync_av_avg = self.sync_av_avg * (self.sync_av_counter - 1) / self.sync_av_counter
+ offset.as_nanos() as u64 / self.sync_av_counter;
if self.sync_av_counter % 128 == 0 {
self.source
.set_property("av-offset", -(self.sync_av_avg as i64));
}
}
}
}
/// A multimedia video loaded from a URI (e.g., a local file path or HTTP stream).
#[derive(Debug)]
pub struct Video(pub(crate) RwLock<Internal>);
impl Drop for Video {
fn drop(&mut self) {
let inner = self.0.get_mut().expect("failed to lock");
inner
.source
.set_state(gst::State::Null)
.expect("failed to set state");
inner.alive.store(false, Ordering::SeqCst);
if let Some(worker) = inner.worker.take() {
if let Err(err) = worker.join() {
match err.downcast_ref::<String>() {
Some(e) => log::error!("Video thread panicked: {e}"),
None => log::error!("Video thread panicked with unknown reason"),
}
}
}
}
}
impl Video {
/// Create a new video that loads its media from the given `uri`.
/// Note that live sources will report the duration to be zero.
pub fn new(uri: &url::Url) -> Result<Self, Error> {
gst::init()?;
let pipeline = format!("playbin uri=\"{}\" text-sink=\"appsink name=iced_text sync=true drop=true\" video-sink=\"videoscale ! videoconvert ! appsink name=iced_video drop=true caps=video/x-raw,format=NV12,pixel-aspect-ratio=1/1\"", uri.as_str());
let pipeline = gst::parse::launch(pipeline.as_ref())?
.downcast::<gst::Pipeline>()
.map_err(|_| Error::Cast)?;
let video_sink: gst::Element = pipeline.property("video-sink");
let pad = video_sink.pads().first().cloned().unwrap();
let pad = pad.dynamic_cast::<gst::GhostPad>().unwrap();
let bin = pad
.parent_element()
.unwrap()
.downcast::<gst::Bin>()
.unwrap();
let video_sink = bin.by_name("iced_video").unwrap();
let video_sink = video_sink.downcast::<gst_app::AppSink>().unwrap();
let text_sink: gst::Element = pipeline.property("text-sink");
let text_sink = text_sink.downcast::<gst_app::AppSink>().unwrap();
Self::from_gst_pipeline(pipeline, video_sink, Some(text_sink))
}
/// Creates a new video based on an existing GStreamer pipeline and appsink.
/// Expects an `appsink` plugin with `caps=video/x-raw,format=NV12`.
///
/// An optional `text_sink` can be provided, which enables subtitle messages
/// to be emitted.
///
/// **Note:** Many functions of [`Video`] assume a `playbin` pipeline.
/// Non-`playbin` pipelines given here may not have full functionality.
pub fn from_gst_pipeline(
pipeline: gst::Pipeline,
video_sink: gst_app::AppSink,
text_sink: Option<gst_app::AppSink>,
) -> Result<Self, Error> {
gst::init()?;
static NEXT_ID: AtomicU64 = AtomicU64::new(0);
let id = NEXT_ID.fetch_add(1, Ordering::SeqCst);
// We need to ensure we stop the pipeline if we hit an error,
// or else there may be audio left playing in the background.
macro_rules! cleanup {
($expr:expr) => {
$expr.map_err(|e| {
let _ = pipeline.set_state(gst::State::Null);
e
})
};
}
let pad = video_sink.pads().first().cloned().unwrap();
cleanup!(pipeline.set_state(gst::State::Playing))?;
// wait for up to 5 seconds until the decoder gets the source capabilities
cleanup!(pipeline.state(gst::ClockTime::from_seconds(5)).0)?;
// extract resolution and framerate
// TODO(jazzfool): maybe we want to extract some other information too?
let caps = cleanup!(pad.current_caps().ok_or(Error::Caps))?;
let s = cleanup!(caps.structure(0).ok_or(Error::Caps))?;
let width = cleanup!(s.get::<i32>("width").map_err(|_| Error::Caps))?;
let height = cleanup!(s.get::<i32>("height").map_err(|_| Error::Caps))?;
// round the width up to the next multiple of 4
let width = ((width + 4 - 1) / 4) * 4;
let framerate = cleanup!(s.get::<gst::Fraction>("framerate").map_err(|_| Error::Caps))?;
let framerate = framerate.numer() as f64 / framerate.denom() as f64;
if framerate.is_nan()
|| framerate.is_infinite()
|| framerate < 0.0
|| framerate.abs() < f64::EPSILON
{
let _ = pipeline.set_state(gst::State::Null);
return Err(Error::Framerate(framerate));
}
let duration = Duration::from_nanos(
pipeline
.query_duration::<gst::ClockTime>()
.map(|duration| duration.nseconds())
.unwrap_or(0),
);
let sync_av = pipeline.has_property("av-offset", None);
// NV12 = 12bpp
let frame = Arc::new(Mutex::new(Frame::empty()));
let upload_frame = Arc::new(AtomicBool::new(false));
let alive = Arc::new(AtomicBool::new(true));
let last_frame_time = Arc::new(Mutex::new(Instant::now()));
let frame_ref = Arc::clone(&frame);
let upload_frame_ref = Arc::clone(&upload_frame);
let alive_ref = Arc::clone(&alive);
let last_frame_time_ref = Arc::clone(&last_frame_time);
let subtitle_text = Arc::new(Mutex::new(None));
let upload_text = Arc::new(AtomicBool::new(false));
let subtitle_text_ref = Arc::clone(&subtitle_text);
let upload_text_ref = Arc::clone(&upload_text);
let pipeline_ref = pipeline.clone();
let worker = std::thread::spawn(move || {
let mut clear_subtitles_at = None;
while alive_ref.load(Ordering::Acquire) {
if let Err(gst::FlowError::Error) = (|| -> Result<(), gst::FlowError> {
let sample =
if pipeline_ref.state(gst::ClockTime::ZERO).1 != gst::State::Playing {
video_sink
.try_pull_preroll(gst::ClockTime::from_mseconds(16))
.ok_or(gst::FlowError::Eos)?
} else {
video_sink
.try_pull_sample(gst::ClockTime::from_mseconds(16))
.ok_or(gst::FlowError::Eos)?
};
*last_frame_time_ref
.lock()
.map_err(|_| gst::FlowError::Error)? = Instant::now();
let frame_segment = sample.segment().cloned().ok_or(gst::FlowError::Error)?;
let buffer = sample.buffer().ok_or(gst::FlowError::Error)?;
let frame_pts = buffer.pts().ok_or(gst::FlowError::Error)?;
let frame_duration = buffer.duration().ok_or(gst::FlowError::Error)?;
{
let mut frame_guard =
frame_ref.lock().map_err(|_| gst::FlowError::Error)?;
*frame_guard = Frame(sample);
}
upload_frame_ref.swap(true, Ordering::SeqCst);
if let Some(at) = clear_subtitles_at {
if frame_pts >= at {
*subtitle_text_ref
.lock()
.map_err(|_| gst::FlowError::Error)? = None;
upload_text_ref.store(true, Ordering::SeqCst);
clear_subtitles_at = None;
}
}
let text = text_sink
.as_ref()
.and_then(|sink| sink.try_pull_sample(gst::ClockTime::from_seconds(0)));
if let Some(text) = text {
let text_segment = text.segment().ok_or(gst::FlowError::Error)?;
let text = text.buffer().ok_or(gst::FlowError::Error)?;
let text_pts = text.pts().ok_or(gst::FlowError::Error)?;
let text_duration = text.duration().ok_or(gst::FlowError::Error)?;
let frame_running_time = frame_segment.to_running_time(frame_pts).value();
let frame_running_time_end = frame_segment
.to_running_time(frame_pts + frame_duration)
.value();
let text_running_time = text_segment.to_running_time(text_pts).value();
let text_running_time_end = text_segment
.to_running_time(text_pts + text_duration)
.value();
// see gst-plugins-base/ext/pango/gstbasetextoverlay.c (gst_base_text_overlay_video_chain)
// as an example of how to correctly synchronize the text+video segments
if text_running_time_end > frame_running_time
&& frame_running_time_end > text_running_time
{
let duration = text.duration().unwrap_or(gst::ClockTime::ZERO);
let map = text.map_readable().map_err(|_| gst::FlowError::Error)?;
let text = std::str::from_utf8(map.as_slice())
.map_err(|_| gst::FlowError::Error)?
.to_string();
*subtitle_text_ref
.lock()
.map_err(|_| gst::FlowError::Error)? = Some(text);
upload_text_ref.store(true, Ordering::SeqCst);
clear_subtitles_at = Some(text_pts + duration);
}
}
Ok(())
})() {
log::error!("error pulling frame");
}
}
});
Ok(Video(RwLock::new(Internal {
id,
bus: pipeline.bus().unwrap(),
source: pipeline,
alive,
worker: Some(worker),
width,
height,
framerate,
duration,
speed: 1.0,
sync_av,
frame,
upload_frame,
last_frame_time,
looping: false,
is_eos: false,
restart_stream: false,
sync_av_avg: 0,
sync_av_counter: 0,
subtitle_text,
upload_text,
})))
}
pub(crate) fn read(&self) -> impl Deref<Target = Internal> + '_ {
self.0.read().expect("lock")
}
pub(crate) fn write(&self) -> impl DerefMut<Target = Internal> + '_ {
self.0.write().expect("lock")
}
pub(crate) fn get_mut(&mut self) -> impl DerefMut<Target = Internal> + '_ {
self.0.get_mut().expect("lock")
}
/// Get the size/resolution of the video as `(width, height)`.
pub fn size(&self) -> (i32, i32) {
(self.read().width, self.read().height)
}
/// Get the framerate of the video as frames per second.
pub fn framerate(&self) -> f64 {
self.read().framerate
}
/// Set the volume multiplier of the audio.
/// `0.0` = 0% volume, `1.0` = 100% volume.
///
/// This uses a linear scale, for example `0.5` is perceived as half as loud.
pub fn set_volume(&mut self, volume: f64) {
self.get_mut().source.set_property("volume", volume);
self.set_muted(self.muted()); // for some reason gstreamer unmutes when changing volume?
}
/// Get the volume multiplier of the audio.
pub fn volume(&self) -> f64 {
self.read().source.property("volume")
}
/// Set if the audio is muted or not, without changing the volume.
pub fn set_muted(&mut self, muted: bool) {
self.get_mut().source.set_property("mute", muted);
}
/// Get if the audio is muted or not.
pub fn muted(&self) -> bool {
self.read().source.property("mute")
}
/// Get if the stream ended or not.
pub fn eos(&self) -> bool {
self.read().is_eos
}
/// Get if the media will loop or not.
pub fn looping(&self) -> bool {
self.read().looping
}
/// Set if the media will loop or not.
pub fn set_looping(&mut self, looping: bool) {
self.get_mut().looping = looping;
}
/// Set if the media is paused or not.
pub fn set_paused(&mut self, paused: bool) {
self.get_mut().set_paused(paused)
}
/// Get if the media is paused or not.
pub fn paused(&self) -> bool {
self.read().paused()
}
/// Jumps to a specific position in the media.
/// Passing `true` to the `accurate` parameter will result in more accurate seeking,
/// however, it is also slower. For most seeks (e.g., scrubbing) this is not needed.
pub fn seek(&mut self, position: impl Into<Position>, accurate: bool) -> Result<(), Error> {
self.get_mut().seek(position, accurate)
}
/// Set the playback speed of the media.
/// The default speed is `1.0`.
pub fn set_speed(&mut self, speed: f64) -> Result<(), Error> {
self.get_mut().set_speed(speed)
}
/// Get the current playback speed.
pub fn speed(&self) -> f64 {
self.read().speed
}
/// Get the current playback position in time.
pub fn position(&self) -> Duration {
Duration::from_nanos(
self.read()
.source
.query_position::<gst::ClockTime>()
.map_or(0, |pos| pos.nseconds()),
)
}
/// Get the media duration.
pub fn duration(&self) -> Duration {
self.read().duration
}
/// Restarts a stream; seeks to the first frame and unpauses, sets the `eos` flag to false.
pub fn restart_stream(&mut self) -> Result<(), Error> {
self.get_mut().restart_stream()
}
/// Set the subtitle URL to display.
pub fn set_subtitle_url(&mut self, url: &url::Url) -> Result<(), Error> {
let paused = self.paused();
let mut inner = self.get_mut();
inner.source.set_state(gst::State::Ready)?;
inner.source.set_property("suburi", url.as_str());
inner.set_paused(paused);
Ok(())
}
/// Get the current subtitle URL.
pub fn subtitle_url(&self) -> Option<url::Url> {
url::Url::parse(
&self
.read()
.source
.property::<Option<String>>("current-suburi")?,
)
.ok()
}
/// Get the underlying GStreamer pipeline.
pub fn pipeline(&self) -> gst::Pipeline {
self.read().source.clone()
}
/// Generates a list of thumbnails based on a set of positions in the media, downscaled by a given factor.
///
/// This is a slow operation, but it only needs to be called once per video instance.
/// It's best to call this at the very start of playback; otherwise, the playback position may shift.
pub fn thumbnails<I>(
&mut self,
positions: I,
downscale: NonZeroU8,
) -> Result<Vec<img::Handle>, Error>
where
I: IntoIterator<Item = Position>,
{
let downscale = u8::from(downscale) as u32;
let paused = self.paused();
let muted = self.muted();
let pos = self.position();
self.set_paused(false);
self.set_muted(true);
let out = {
let inner = self.read();
let width = inner.width;
let height = inner.height;
positions
.into_iter()
.map(|pos| {
inner.seek(pos, true)?;
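// Busy-wait until the worker thread publishes the frame for this seek.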
inner.upload_frame.store(false, Ordering::SeqCst);
while !inner.upload_frame.load(Ordering::SeqCst) {
std::hint::spin_loop();
}
let frame_guard = inner.frame.lock().map_err(|_| Error::Lock)?;
let frame = frame_guard.readable().ok_or(Error::Lock)?;
Ok(img::Handle::from_rgba(
inner.width as u32 / downscale,
inner.height as u32 / downscale,
yuv_to_rgba(frame.as_slice(), width as _, height as _, downscale),
))
})
.collect()
};
self.set_paused(paused);
self.set_muted(muted);
self.seek(pos, true)?;
out
}
}
fn yuv_to_rgba(yuv: &[u8], width: u32, height: u32, downscale: u32) -> Vec<u8> {
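// CPU-side NV12 -> RGBA conversion (BT.601 limited range), used only for
// thumbnail generation; normal playback does this conversion on the GPU.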
let uv_start = width * height;
let mut rgba = vec![];
for y in 0..height / downscale {
for x in 0..width / downscale {
let x_src = x * downscale;
let y_src = y * downscale;
let uv_i = uv_start + width * (y_src / 2) + x_src / 2 * 2;
let y = yuv[(y_src * width + x_src) as usize] as f32;
let u = yuv[uv_i as usize] as f32;
let v = yuv[(uv_i + 1) as usize] as f32;
let r = 1.164 * (y - 16.0) + 1.596 * (v - 128.0);
let g = 1.164 * (y - 16.0) - 0.813 * (v - 128.0) - 0.391 * (u - 128.0);
let b = 1.164 * (y - 16.0) + 2.018 * (u - 128.0);
rgba.push(r as u8);
rgba.push(g as u8);
rgba.push(b as u8);
rgba.push(0xFF);
}
}
rgba
}


@@ -1,305 +0,0 @@
use crate::{pipeline::VideoPrimitive, video::Video};
use gstreamer as gst;
use iced::{
advanced::{self, layout, widget, Widget},
Element,
};
use iced_wgpu::primitive::Renderer as PrimitiveRenderer;
use log::error;
use std::{marker::PhantomData, sync::atomic::Ordering};
use std::{sync::Arc, time::Instant};
/// Video player widget which displays the current frame of a [`Video`](crate::Video).
pub struct VideoPlayer<'a, Message, Theme = iced::Theme, Renderer = iced::Renderer>
where
Renderer: PrimitiveRenderer,
{
video: &'a Video,
content_fit: iced::ContentFit,
width: iced::Length,
height: iced::Length,
on_end_of_stream: Option<Message>,
on_new_frame: Option<Message>,
on_subtitle_text: Option<Box<dyn Fn(Option<String>) -> Message + 'a>>,
on_error: Option<Box<dyn Fn(&glib::Error) -> Message + 'a>>,
_phantom: PhantomData<(Theme, Renderer)>,
}
impl<'a, Message, Theme, Renderer> VideoPlayer<'a, Message, Theme, Renderer>
where
Renderer: PrimitiveRenderer,
{
/// Creates a new video player widget for a given video.
pub fn new(video: &'a Video) -> Self {
VideoPlayer {
video,
content_fit: iced::ContentFit::default(),
width: iced::Length::Shrink,
height: iced::Length::Shrink,
on_end_of_stream: None,
on_new_frame: None,
on_subtitle_text: None,
on_error: None,
_phantom: Default::default(),
}
}
/// Sets the width of the `VideoPlayer` boundaries.
pub fn width(self, width: impl Into<iced::Length>) -> Self {
VideoPlayer {
width: width.into(),
..self
}
}
/// Sets the height of the `VideoPlayer` boundaries.
pub fn height(self, height: impl Into<iced::Length>) -> Self {
VideoPlayer {
height: height.into(),
..self
}
}
/// Sets the `ContentFit` of the `VideoPlayer`.
pub fn content_fit(self, content_fit: iced::ContentFit) -> Self {
VideoPlayer {
content_fit,
..self
}
}
/// Message to send when the video reaches the end of stream (i.e., the video ends).
pub fn on_end_of_stream(self, on_end_of_stream: Message) -> Self {
VideoPlayer {
on_end_of_stream: Some(on_end_of_stream),
..self
}
}
/// Message to send when the video receives a new frame.
pub fn on_new_frame(self, on_new_frame: Message) -> Self {
VideoPlayer {
on_new_frame: Some(on_new_frame),
..self
}
}
/// Message to send when the subtitle text changes.
pub fn on_subtitle_text<F>(self, on_subtitle_text: F) -> Self
where
F: 'a + Fn(Option<String>) -> Message,
{
VideoPlayer {
on_subtitle_text: Some(Box::new(on_subtitle_text)),
..self
}
}
/// Message to send when the video playback encounters an error.
pub fn on_error<F>(self, on_error: F) -> Self
where
F: 'a + Fn(&glib::Error) -> Message,
{
VideoPlayer {
on_error: Some(Box::new(on_error)),
..self
}
}
}
impl<Message, Theme, Renderer> Widget<Message, Theme, Renderer>
for VideoPlayer<'_, Message, Theme, Renderer>
where
Message: Clone,
Renderer: PrimitiveRenderer,
{
fn size(&self) -> iced::Size<iced::Length> {
iced::Size {
width: iced::Length::Shrink,
height: iced::Length::Shrink,
}
}
fn layout(
&mut self,
_tree: &mut widget::Tree,
_renderer: &Renderer,
limits: &layout::Limits,
) -> layout::Node {
let (video_width, video_height) = self.video.size();
// based on `Image::layout`
let image_size = iced::Size::new(video_width as f32, video_height as f32);
let raw_size = limits.resolve(self.width, self.height, image_size);
let full_size = self.content_fit.fit(image_size, raw_size);
let final_size = iced::Size {
width: match self.width {
iced::Length::Shrink => f32::min(raw_size.width, full_size.width),
_ => raw_size.width,
},
height: match self.height {
iced::Length::Shrink => f32::min(raw_size.height, full_size.height),
_ => raw_size.height,
},
};
layout::Node::new(final_size)
}
fn draw(
&self,
_tree: &widget::Tree,
renderer: &mut Renderer,
_theme: &Theme,
_style: &advanced::renderer::Style,
layout: advanced::Layout<'_>,
_cursor: advanced::mouse::Cursor,
_viewport: &iced::Rectangle,
) {
let mut inner = self.video.write();
// bounds based on `Image::draw`
let image_size = iced::Size::new(inner.width as f32, inner.height as f32);
let bounds = layout.bounds();
let adjusted_fit = self.content_fit.fit(image_size, bounds.size());
let scale = iced::Vector::new(
adjusted_fit.width / image_size.width,
adjusted_fit.height / image_size.height,
);
let final_size = image_size * scale;
let position = match self.content_fit {
iced::ContentFit::None => iced::Point::new(
bounds.x + (image_size.width - adjusted_fit.width) / 2.0,
bounds.y + (image_size.height - adjusted_fit.height) / 2.0,
),
_ => iced::Point::new(
bounds.center_x() - final_size.width / 2.0,
bounds.center_y() - final_size.height / 2.0,
),
};
let drawing_bounds = iced::Rectangle::new(position, final_size);
let upload_frame = inner.upload_frame.swap(false, Ordering::SeqCst);
if upload_frame {
let last_frame_time = inner
.last_frame_time
.lock()
.map(|time| *time)
.unwrap_or_else(|_| Instant::now());
inner.set_av_offset(Instant::now() - last_frame_time);
}
let render = |renderer: &mut Renderer| {
renderer.draw_primitive(
drawing_bounds,
VideoPrimitive::new(
inner.id,
Arc::clone(&inner.alive),
Arc::clone(&inner.frame),
(inner.width as _, inner.height as _),
upload_frame,
),
);
};
if adjusted_fit.width > bounds.width || adjusted_fit.height > bounds.height {
renderer.with_layer(bounds, render);
} else {
render(renderer);
}
}
fn update(
&mut self,
_state: &mut widget::Tree,
event: &iced::Event,
_layout: advanced::Layout<'_>,
_cursor: advanced::mouse::Cursor,
_renderer: &Renderer,
_clipboard: &mut dyn advanced::Clipboard,
shell: &mut advanced::Shell<'_, Message>,
_viewport: &iced::Rectangle,
) {
let mut inner = self.video.write();
if let iced::Event::Window(iced::window::Event::RedrawRequested(_)) = event {
if inner.restart_stream || (!inner.is_eos && !inner.paused()) {
let mut restart_stream = false;
if inner.restart_stream {
restart_stream = true;
// Set flag to false to avoid potentially multiple seeks
inner.restart_stream = false;
}
let mut eos_pause = false;
while let Some(msg) = inner
.bus
.pop_filtered(&[gst::MessageType::Error, gst::MessageType::Eos])
{
match msg.view() {
gst::MessageView::Error(err) => {
error!("bus returned an error: {err}");
if let Some(ref on_error) = self.on_error {
shell.publish(on_error(&err.error()))
};
}
gst::MessageView::Eos(_eos) => {
if let Some(on_end_of_stream) = self.on_end_of_stream.clone() {
shell.publish(on_end_of_stream);
}
if inner.looping {
restart_stream = true;
} else {
eos_pause = true;
}
}
_ => {}
}
}
// Don't run eos_pause if restart_stream is true; fixes "pausing" after restarting a stream
if restart_stream {
if let Err(err) = inner.restart_stream() {
error!("cannot restart stream (can't seek): {err:#?}");
}
} else if eos_pause {
inner.is_eos = true;
inner.set_paused(true);
}
if inner.upload_frame.load(Ordering::SeqCst) {
if let Some(on_new_frame) = self.on_new_frame.clone() {
shell.publish(on_new_frame);
}
}
if let Some(on_subtitle_text) = &self.on_subtitle_text {
if inner.upload_text.swap(false, Ordering::SeqCst) {
if let Ok(text) = inner.subtitle_text.try_lock() {
shell.publish(on_subtitle_text(text.clone()));
}
}
}
shell.request_redraw();
} else {
shell.request_redraw();
}
}
}
}
impl<'a, Message, Theme, Renderer> From<VideoPlayer<'a, Message, Theme, Renderer>>
for Element<'a, Message, Theme, Renderer>
where
Message: 'a + Clone,
Theme: 'a,
Renderer: 'a + PrimitiveRenderer,
{
fn from(video_player: VideoPlayer<'a, Message, Theme, Renderer>) -> Self {
Self::new(video_player)
}
}
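Reviewer note: for anyone tracking this removal downstream, the deleted widget was driven entirely through its builder-style setters; a minimal sketch of a view function that used it (the Message variants here are hypothetical):

fn view(video: &Video) -> iced::Element<'_, Message> {
    VideoPlayer::new(video)
        .width(iced::Length::Fill)
        .height(iced::Length::Fill)
        .content_fit(iced::ContentFit::Contain)
        .on_end_of_stream(Message::EndOfStream) // hypothetical variant
        .on_error(|e| Message::PlayerError(e.to_string())) // hypothetical variant
        .into()
}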

View File

@@ -92,7 +92,15 @@ allow = [
     "MIT",
     "Apache-2.0",
     "Unicode-3.0",
-    #"Apache-2.0 WITH LLVM-exception",
+    "BSD-2-Clause",
+    "BSD-3-Clause",
+    "Apache-2.0 WITH LLVM-exception",
+    "Zlib",
+    "ISC",
+    "NCSA",
+    "CC0-1.0",
+    "BSL-1.0",
+    # "LGPL",
 ]
 # The confidence threshold for detecting a license from license text.
 # The higher the value, the more closely the license text must be to the

View File

@@ -0,0 +1,2 @@
perf*
target/

View File

@@ -0,0 +1,20 @@
[package]
name = "hdr-gstreamer-wgpu"
version = "0.1.0"
edition = "2024"
[dependencies]
# gst = { workspace = true }
wgpu = "27"
gstreamer = { version = "0.24.4", features = ["v1_26"] }
gstreamer-app = { version = "0.24.4", features = ["v1_26"] }
gstreamer-base = { version = "0.24.4", features = ["v1_26"] }
gstreamer-video = { version = "0.24.4", features = ["v1_26"] }
winit = { version = "*", features = ["wayland"] }
anyhow = "*"
pollster = "0.4.0"
tracing = { version = "0.1.43", features = ["log"] }
tracing-subscriber = "0.3.22"
[profile.release]
debug = true

View File

@@ -0,0 +1,592 @@
use std::sync::Arc;
use gstreamer as gst;
use gstreamer_app as gst_app;
use anyhow::{Context, Result};
use winit::{
application::ApplicationHandler,
event::*,
event_loop::{ActiveEventLoop, EventLoop},
keyboard::*,
window::Window,
};
pub struct App {
state: Option<State>,
}
impl App {
pub fn new() -> Self {
Self { state: None }
}
}
pub trait HdrTextureFormatExt {
fn is_hdr_format(&self) -> bool;
}
impl HdrTextureFormatExt for wgpu::TextureFormat {
fn is_hdr_format(&self) -> bool {
matches!(
self,
wgpu::TextureFormat::Rgba16Float
| wgpu::TextureFormat::Rgba32Float
| wgpu::TextureFormat::Rgb10a2Unorm
)
}
}
pub struct State {
window: Arc<Window>,
gst: Video,
surface: wgpu::Surface<'static>,
video_texture: wgpu::Texture,
device: wgpu::Device,
queue: wgpu::Queue,
config: wgpu::SurfaceConfiguration,
pipeline: wgpu::RenderPipeline,
bind_group: wgpu::BindGroup,
is_surface_initialized: bool,
}
impl State {
async fn new(window: Arc<Window>) -> Result<State> {
let instance = wgpu::Instance::default();
let surface = instance
.create_surface(window.clone())
.context("Failed to create wgpu surface")?;
let adapter = instance
.request_adapter(&wgpu::RequestAdapterOptions {
power_preference: wgpu::PowerPreference::HighPerformance,
compatible_surface: Some(&surface),
force_fallback_adapter: false,
})
.await
.context("Failed to request wgpu adapter")?;
let (device, queue) = adapter
.request_device(&wgpu::DeviceDescriptor {
label: None,
required_features: wgpu::Features::empty(),
required_limits: wgpu::Limits::default(),
memory_hints: wgpu::MemoryHints::default(),
..Default::default()
})
.await
.context("Failed to request wgpu device")?;
let surface_caps = surface.get_capabilities(&adapter);
tracing::info!("Caps: {:#?}", &surface_caps);
let surface_format = surface_caps
.formats
.iter()
.rev() // float HDR formats are listed first; reverse so find() prefers the packed 10-bit format
.find(|f| f.is_hdr_format())
.expect("HDR format not supported")
.clone();
tracing::info!("Using surface format: {:?}", surface_format);
let size = window.inner_size();
let config = wgpu::SurfaceConfiguration {
usage: wgpu::TextureUsages::RENDER_ATTACHMENT,
format: surface_format,
width: size.width,
height: size.height,
present_mode: surface_caps.present_modes[0],
alpha_mode: surface_caps.alpha_modes[0],
view_formats: vec![],
desired_maximum_frame_latency: 2, // allow the presentation queue up to 2 frames ahead
};
surface.configure(&device, &config);
let shader = device.create_shader_module(wgpu::include_wgsl!("shader.wgsl"));
let texture_bind_group_layout =
device.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
label: Some("texture_bind_group_layout"),
entries: &[
wgpu::BindGroupLayoutEntry {
binding: 0,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Texture {
multisampled: false,
view_dimension: wgpu::TextureViewDimension::D2,
sample_type: wgpu::TextureSampleType::Float { filterable: true },
},
count: None,
},
wgpu::BindGroupLayoutEntry {
binding: 1,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
count: None,
},
],
});
let render_pipeline_layout =
device.create_pipeline_layout(&wgpu::PipelineLayoutDescriptor {
label: Some("Jello Render Pipeline Layout"),
bind_group_layouts: &[&texture_bind_group_layout],
push_constant_ranges: &[],
});
let render_pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
label: Some("Jello Render Pipeline"),
layout: Some(&render_pipeline_layout),
vertex: wgpu::VertexState {
module: &shader,
entry_point: Some("vs_main"),
buffers: &[],
compilation_options: wgpu::PipelineCompilationOptions::default(),
},
fragment: Some(wgpu::FragmentState {
module: &shader,
entry_point: Some("fs_main"),
compilation_options: wgpu::PipelineCompilationOptions::default(),
targets: &[Some(wgpu::ColorTargetState {
format: surface_format,
blend: Some(wgpu::BlendState::REPLACE),
write_mask: wgpu::ColorWrites::ALL,
})],
}),
primitive: wgpu::PrimitiveState::default(),
depth_stencil: None,
multisample: wgpu::MultisampleState {
count: 1,
mask: !0,
alpha_to_coverage_enabled: false,
},
multiview: None,
cache: None,
});
let texture_size = wgpu::Extent3d {
width: size.width,
height: size.height,
depth_or_array_layers: 1,
};
let video_texture = device.create_texture(&wgpu::TextureDescriptor {
size: texture_size,
mip_level_count: 1,
sample_count: 1,
dimension: wgpu::TextureDimension::D2,
format: surface_format,
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
label: Some("Jello Video Texture"),
view_formats: &[],
});
// TODO: Use a better sampler
let sampler = device.create_sampler(&wgpu::SamplerDescriptor {
label: Some("texture_sampler"),
address_mode_u: wgpu::AddressMode::ClampToEdge,
address_mode_v: wgpu::AddressMode::ClampToEdge,
address_mode_w: wgpu::AddressMode::ClampToEdge,
mag_filter: wgpu::FilterMode::Linear,
min_filter: wgpu::FilterMode::Linear,
mipmap_filter: wgpu::FilterMode::Nearest,
..Default::default()
});
let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
layout: &texture_bind_group_layout,
entries: &[
wgpu::BindGroupEntry {
binding: 0,
resource: wgpu::BindingResource::TextureView(
&video_texture.create_view(&wgpu::TextureViewDescriptor::default()),
),
},
wgpu::BindGroupEntry {
binding: 1,
resource: wgpu::BindingResource::Sampler(&sampler),
},
],
label: Some("Jello Texture Bind Group"),
});
let gst = Video::new().context("Failed to create Video")?;
std::thread::sleep(std::time::Duration::from_secs(10));
// surface.configure(&device, &config);
Ok(Self {
window,
gst,
surface,
video_texture,
device,
queue,
config,
is_surface_initialized: true,
bind_group,
pipeline: render_pipeline,
})
}
// async fn next_frame(&mut self)
fn resize(&mut self, width: u32, height: u32) {
if width > 0 && height > 0 {
self.config.width = width;
self.config.height = height;
self.surface.configure(&self.device, &self.config);
self.is_surface_initialized = true;
}
}
fn render(&mut self) -> Result<(), wgpu::SurfaceError> {
if !self.is_surface_initialized {
return Ok(());
}
self.gst.poll();
self.copy_next_frame_to_texture()
.inspect_err(|e| {
tracing::error!("Failed to copy video frame to texture: {e:?}");
})
.map_err(|_| wgpu::SurfaceError::Lost)?;
let output = match self.surface.get_current_texture() {
Ok(output) => output,
Err(wgpu::SurfaceError::Lost) => {
self.surface.configure(&self.device, &self.config);
return Ok(());
}
Err(e) => return Err(e),
};
let view = output
.texture
.create_view(&wgpu::TextureViewDescriptor::default());
let mut encoder = self
.device
.create_command_encoder(&wgpu::CommandEncoderDescriptor {
label: Some("Jello Render Encoder"),
});
let mut render_pass = encoder.begin_render_pass(&wgpu::RenderPassDescriptor {
label: Some("Jello Render Pass"),
color_attachments: &[Some(wgpu::RenderPassColorAttachment {
view: &view,
resolve_target: None,
ops: wgpu::Operations {
load: wgpu::LoadOp::Clear(wgpu::Color {
r: 0.1,
g: 0.2,
b: 0.3,
a: 1.0,
}),
store: wgpu::StoreOp::Store,
},
depth_slice: None,
})],
depth_stencil_attachment: None,
occlusion_query_set: None,
timestamp_writes: None,
});
render_pass.set_pipeline(&self.pipeline);
render_pass.set_bind_group(0, &self.bind_group, &[]);
render_pass.draw(0..3, 0..1);
drop(render_pass);
self.queue.submit(std::iter::once(encoder.finish()));
output.present();
self.window.request_redraw();
Ok(())
}
pub fn copy_next_frame_to_texture(&mut self) -> Result<()> {
let frame = self
.gst
.appsink
.try_pull_sample(gst::ClockTime::NONE)
.context("Failed to pull sample from appsink")?;
let caps = frame.caps().context("Failed to get caps from sample")?;
let size = caps
.structure(0)
.context("Failed to get structure from caps")?;
let width = size
.get::<i32>("width")
.context("Failed to get width from caps")? as u32;
let height = size
.get::<i32>("height")
.context("Failed to get height from caps")? as u32;
let texture_size = self.video_texture.size();
if texture_size.width != width || texture_size.height != height {
tracing::info!(
"Resizing video texture from {}x{} to {}x{}",
texture_size.width,
texture_size.height,
width,
height
);
self.video_texture = self.device.create_texture(&wgpu::TextureDescriptor {
size: wgpu::Extent3d {
width,
height,
depth_or_array_layers: 1,
},
mip_level_count: 1,
sample_count: 1,
dimension: wgpu::TextureDimension::D2,
format: self.config.format,
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
label: Some("Jello Video Texture"),
view_formats: &[],
});
let texture_bind_group_layout =
self.device
.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
label: Some("texture_bind_group_layout"),
entries: &[
wgpu::BindGroupLayoutEntry {
binding: 0,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Texture {
multisampled: false,
view_dimension: wgpu::TextureViewDimension::D2,
sample_type: wgpu::TextureSampleType::Float {
filterable: true,
},
},
count: None,
},
wgpu::BindGroupLayoutEntry {
binding: 1,
visibility: wgpu::ShaderStages::FRAGMENT,
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
count: None,
},
],
});
let sampler = self.device.create_sampler(&wgpu::SamplerDescriptor {
label: Some("texture_sampler"),
address_mode_u: wgpu::AddressMode::ClampToEdge,
address_mode_v: wgpu::AddressMode::ClampToEdge,
address_mode_w: wgpu::AddressMode::ClampToEdge,
mag_filter: wgpu::FilterMode::Linear,
min_filter: wgpu::FilterMode::Linear,
mipmap_filter: wgpu::FilterMode::Nearest,
..Default::default()
});
self.bind_group = self.device.create_bind_group(&wgpu::BindGroupDescriptor {
layout: &texture_bind_group_layout,
entries: &[
wgpu::BindGroupEntry {
binding: 0,
resource: wgpu::BindingResource::TextureView(
&self
.video_texture
.create_view(&wgpu::TextureViewDescriptor::default()),
),
},
wgpu::BindGroupEntry {
binding: 1,
resource: wgpu::BindingResource::Sampler(&sampler),
},
],
label: Some("Jello Texture Bind Group"),
});
}
let texture = &self.video_texture;
let buffer = frame.buffer().context("Failed to get buffer from sample")?;
let map = buffer
.map_readable()
.context("Failed to map buffer readable")?;
self.queue.write_texture(
wgpu::TexelCopyTextureInfo {
texture,
mip_level: 0,
origin: wgpu::Origin3d::ZERO,
aspect: wgpu::TextureAspect::All,
},
&map,
wgpu::TexelCopyBufferLayout {
offset: 0,
bytes_per_row: Some(4 * width),
rows_per_image: Some(height),
},
texture.size(),
);
// drop(map);
// drop(frame);
Ok(())
}
}
impl ApplicationHandler<State> for App {
fn resumed(&mut self, event_loop: &ActiveEventLoop) {
#[allow(unused_mut)]
let mut window_attributes = Window::default_attributes();
let window = Arc::new(event_loop.create_window(window_attributes).unwrap());
// let monitor = event_loop
// .primary_monitor()
// .or_else(|| window.current_monitor());
// window.set_fullscreen(None);
// window.set_fullscreen(Some(winit::window::Fullscreen::Borderless(monitor)));
self.state = Some(pollster::block_on(State::new(window)).expect("Failed to create State"));
}
fn user_event(&mut self, _event_loop: &ActiveEventLoop, event: State) {
self.state = Some(event);
}
fn about_to_wait(&mut self, _event_loop: &ActiveEventLoop) {
let state = match &mut self.state {
Some(state) => state,
None => return,
};
state.window.request_redraw();
}
fn window_event(
&mut self,
event_loop: &ActiveEventLoop,
_window_id: winit::window::WindowId,
event: WindowEvent,
) {
let state = match &mut self.state {
Some(canvas) => canvas,
None => return,
};
match event {
WindowEvent::CloseRequested => event_loop.exit(),
WindowEvent::Resized(size) => {
tracing::info!("Window resized to {size:?}");
state.resize(size.width, size.height)
}
WindowEvent::RedrawRequested => {
// if state.gst.poll() {
// event_loop.exit();
// return;
// }
match state.render() {
Ok(_) => {}
// Reconfigure the surface if lost
Err(wgpu::SurfaceError::Lost | wgpu::SurfaceError::Outdated) => {
let size = state.window.inner_size();
tracing::info!("Reconfiguring surface to {size:?}");
state.resize(size.width, size.height);
}
// The system is out of memory, we should probably quit
Err(wgpu::SurfaceError::OutOfMemory) => event_loop.exit(),
// All other errors (Outdated, Timeout) should be resolved by the next frame
Err(e) => {
tracing::error!("Failed to render frame: {e:?}");
}
}
}
// WindowEvent::AboutToWait => {
// state.window.request_redraw();
// }
WindowEvent::KeyboardInput {
event:
KeyEvent {
physical_key: PhysicalKey::Code(code),
state,
..
},
..
} => match (code, state.is_pressed()) {
(KeyCode::Escape, true) => event_loop.exit(),
(KeyCode::KeyQ, true) => event_loop.exit(),
_ => {}
},
_ => {}
}
}
}
pub fn main() -> anyhow::Result<()> {
tracing_subscriber::fmt::init();
let event_loop = EventLoop::with_user_event().build()?;
let mut app = App::new();
event_loop.run_app(&mut app)?;
Ok(())
}
pub struct Video {
pipeline: gst::Pipeline,
bus: gst::Bus,
appsink: gst_app::AppSink,
}
impl Video {
pub fn new() -> Result<Self> {
gst::init()?;
use gst::prelude::*;
let pipeline = gst::parse::launch(
r##"playbin3 uri=https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c video-sink="videoconvert ! video/x-raw,format=RGB10A2_LE ! appsink sync=true drop=true name=appsink""##
).context("Failed to parse gst pipeline")?;
let pipeline = pipeline
.downcast::<gst::Pipeline>()
.map_err(|_| anyhow::anyhow!("Failed to downcast gst element to Pipeline"))?;
let video_sink = pipeline.property::<gst::Bin>("video-sink");
let appsink = video_sink
.by_name("appsink")
.context("Failed to get appsink from video-sink")?
.downcast::<gst_app::AppSink>()
.map_err(|_| {
anyhow::anyhow!("Failed to downcast video-sink appsink to gst_app::AppSink")
})?;
// appsink.set_property("max-buffers", 2u32);
// appsink.set_property("emit-signals", true);
// appsink.set_callbacks(
// gst_app::AppSinkCallbacks::builder()
// .new_sample(|_appsink| Ok(gst::FlowSuccess::Ok))
// .build(),
// );
let bus = pipeline.bus().context("Failed to get gst pipeline bus")?;
pipeline.set_state(gst::State::Playing)?;
pipeline
.state(gst::ClockTime::from_seconds(5))
.0
.context("Failed to wait for pipeline")?;
Ok(Self {
pipeline,
bus,
appsink,
})
}
pub fn poll(&mut self) -> bool {
use gst::prelude::*;
for msg in self.bus.iter_timed(gst::ClockTime::ZERO) {
use gst::MessageView;
match msg.view() {
MessageView::Eos(..) => {
tracing::info!("End of stream");
self.pipeline.set_state(gst::State::Null).ok();
return true;
}
MessageView::Error(err) => {
tracing::error!(
"Error from {:?}: {} ({:?})",
err.src().map(|s| s.path_string()),
err.error(),
err.debug()
);
self.pipeline.set_state(gst::State::Null).ok();
return true;
}
_ => {}
}
}
false
}
}
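Reviewer note: the pipeline caps pin the appsink output to RGB10A2_LE, a packed 32-bit-per-pixel layout (10 bits each for R/G/B plus 2 alpha bits), which is why the upload above can use the same row stride as RGBA8; a hedged sketch of the layout arithmetic (the helper name is illustrative):

fn rgb10a2_copy_layout(width: u32, height: u32) -> wgpu::TexelCopyBufferLayout {
    wgpu::TexelCopyBufferLayout {
        offset: 0,
        bytes_per_row: Some(4 * width), // 10 + 10 + 10 + 2 = 32 bits = 4 bytes per packed pixel
        rows_per_image: Some(height),
    }
}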

View File

@@ -0,0 +1,31 @@
// Vertex shader
struct VertexOutput {
@builtin(position) clip_position: vec4<f32>,
@location(0) tex_coords: vec2<f32>,
};
@vertex
fn vs_main(
@builtin(vertex_index) in_vertex_index: u32,
) -> VertexOutput {
var out: VertexOutput;
let uv = vec2<f32>(f32((in_vertex_index << 1u) & 2u), f32(in_vertex_index & 2u));
out.clip_position = vec4<f32>(uv * 2.0 - 1.0, 0.0, 1.0);
out.clip_position.y = -out.clip_position.y;
out.tex_coords = uv;
return out;
}
// Fragment shader
@group(0) @binding(0)
var t_diffuse: texture_2d<f32>;
@group(0) @binding(1)
var s_diffuse: sampler;
@fragment
fn fs_main(in: VertexOutput) -> @location(0) vec4<f32> {
return textureSample(t_diffuse, s_diffuse, in.tex_coords);
}
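Reviewer note: vs_main is the standard bufferless fullscreen-triangle trick; evaluating the bit twiddling for the three vertex indices shows why a single oversized triangle covers the viewport (a plain Rust re-derivation, not part of the patch):

fn main() {
    for i in 0u32..3 {
        let uv = (((i << 1) & 2) as f32, (i & 2) as f32); // (0,0), (2,0), (0,2)
        let clip = (uv.0 * 2.0 - 1.0, uv.1 * 2.0 - 1.0);
        // Clip positions (-1,-1), (3,-1), (-1,3): the triangle encloses the whole
        // [-1,1] square, so the fragment stage samples every screen pixel exactly once.
        println!("v{i}: uv={uv:?} clip={clip:?}");
    }
}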

flake.lock generated
View File

@@ -3,11 +3,11 @@
"advisory-db": { "advisory-db": {
"flake": false, "flake": false,
"locked": { "locked": {
"lastModified": 1763456551, "lastModified": 1768679419,
"narHash": "sha256-z5NogiOp+1r7Fd39jVFN0kT3aXUef8sYkuBsrAUNB5g=", "narHash": "sha256-l9rM4lXBeS2mIAJsJjVfl0UABx3S3zg5tul7bv+bn50=",
"owner": "rustsec", "owner": "rustsec",
"repo": "advisory-db", "repo": "advisory-db",
"rev": "6799e5dea99315eb8de85c6084fd99892b4a25d0", "rev": "c700e1cd023ca87343cbd9217d50d47023e9adc7",
"type": "github" "type": "github"
}, },
"original": { "original": {
@@ -18,11 +18,11 @@
}, },
"crane": { "crane": {
"locked": { "locked": {
"lastModified": 1762538466, "lastModified": 1768873933,
"narHash": "sha256-8zrIPl6J+wLm9MH5ksHcW7BUHo7jSNOu0/hA0ohOOaM=", "narHash": "sha256-CfyzdaeLNGkyAHp3kT5vjvXhA1pVVK7nyDziYxCPsNk=",
"owner": "ipetkov", "owner": "ipetkov",
"repo": "crane", "repo": "crane",
"rev": "0cea393fffb39575c46b7a0318386467272182fe", "rev": "0bda7e7d005ccb5522a76d11ccfbf562b71953ca",
"type": "github" "type": "github"
}, },
"original": { "original": {
@@ -34,10 +34,10 @@
"crates-io-index": { "crates-io-index": {
"flake": false, "flake": false,
"locked": { "locked": {
"lastModified": 1763363725, "lastModified": 1769614137,
"narHash": "sha256-cxr5xIKZFP45yV1ZHFTB1sHo5YGiR3FA8D9vAfDizMo=", "narHash": "sha256-3Td8fiv6iFVxeS0hYq3xdd10ZvUkC9INMAiQx/mECas=",
"ref": "refs/heads/master", "ref": "refs/heads/master",
"rev": "0382002e816a4cbd17d8d5b172f08b848aa22ff6", "rev": "c7e7d6394bc95555d6acd5c6783855f47d64c90d",
"shallow": true, "shallow": true,
"type": "git", "type": "git",
"url": "https://github.com/rust-lang/crates.io-index" "url": "https://github.com/rust-lang/crates.io-index"
@@ -50,7 +50,9 @@
}, },
"crates-nix": { "crates-nix": {
"inputs": { "inputs": {
"crates-io-index": "crates-io-index" "crates-io-index": [
"crates-io-index"
]
}, },
"locked": { "locked": {
"lastModified": 1763364255, "lastModified": 1763364255,
@@ -106,11 +108,11 @@
}, },
"nixpkgs": { "nixpkgs": {
"locked": { "locked": {
"lastModified": 1763283776, "lastModified": 1768564909,
"narHash": "sha256-Y7TDFPK4GlqrKrivOcsHG8xSGqQx3A6c+i7novT85Uk=", "narHash": "sha256-Kell/SpJYVkHWMvnhqJz/8DqQg2b6PguxVWOuadbHCc=",
"owner": "nixos", "owner": "nixos",
"repo": "nixpkgs", "repo": "nixpkgs",
"rev": "50a96edd8d0db6cc8db57dab6bb6d6ee1f3dc49a", "rev": "e4bae1bd10c9c57b2cf517953ab70060a828ee6f",
"type": "github" "type": "github"
}, },
"original": { "original": {
@@ -124,6 +126,7 @@
"inputs": { "inputs": {
"advisory-db": "advisory-db", "advisory-db": "advisory-db",
"crane": "crane", "crane": "crane",
"crates-io-index": "crates-io-index",
"crates-nix": "crates-nix", "crates-nix": "crates-nix",
"flake-utils": "flake-utils", "flake-utils": "flake-utils",
"nix-github-actions": "nix-github-actions", "nix-github-actions": "nix-github-actions",
@@ -138,11 +141,11 @@
] ]
}, },
"locked": { "locked": {
"lastModified": 1763433504, "lastModified": 1768877311,
"narHash": "sha256-cVid5UNpk88sPYHkLAA5aZEHOFQXSB/2L1vl18Aq7IM=", "narHash": "sha256-abSDl0cNr0B+YCsIDpO1SjXD9JMxE4s8EFnhLEFVovI=",
"owner": "oxalica", "owner": "oxalica",
"repo": "rust-overlay", "repo": "rust-overlay",
"rev": "42ce16c6d8318a654d53f047c9400b7d902d6e61", "rev": "59e4ab96304585fde3890025fd59bd2717985cc1",
"type": "github" "type": "github"
}, },
"original": { "original": {

View File

@@ -9,7 +9,14 @@
      url = "github:nix-community/nix-github-actions";
      inputs.nixpkgs.follows = "nixpkgs";
    };
-    crates-nix.url = "github:uttarayan21/crates.nix";
+    crates-io-index = {
+      url = "git+https://github.com/rust-lang/crates.io-index?shallow=1";
+      flake = false;
+    };
+    crates-nix = {
+      url = "github:uttarayan21/crates.nix";
+      inputs.crates-io-index.follows = "crates-io-index";
+    };
    rust-overlay = {
      url = "github:oxalica/rust-overlay";
      inputs.nixpkgs.follows = "nixpkgs";
@@ -35,6 +42,7 @@
      system: let
        pkgs = import nixpkgs {
          inherit system;
+          config.allowUnfree = true;
          overlays = [
            rust-overlay.overlays.default
          ];
@@ -56,7 +64,7 @@
      src = let
        filterBySuffix = path: exts: lib.any (ext: lib.hasSuffix ext path) exts;
-        sourceFilters = path: type: (craneLib.filterCargoSources path type) || filterBySuffix path [".c" ".h" ".hpp" ".cpp" ".cc"];
+        sourceFilters = path: type: (craneLib.filterCargoSources path type) || filterBySuffix path [".c" ".h" ".hpp" ".cpp" ".cc" "wgsl"];
      in
        lib.cleanSourceWith {
          filter = sourceFilters;
@@ -70,36 +78,52 @@
      nativeBuildInputs = with pkgs; [
        pkg-config
      ];
+      # LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath [pkgs.wayland];
      LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath buildInputs;
+      # SYSTEM_DEPS_LINK = "static";
+      # PKG_CONFIG_ALL_STATIC = "1";
      buildInputs = with pkgs;
        [
-          gst_all_1.gst-editing-services
          gst_all_1.gst-libav
-          gst_all_1.gst-plugins-bad
          gst_all_1.gst-plugins-base
          gst_all_1.gst-plugins-good
-          gst_all_1.gst-plugins-rs
+          gst_all_1.gst-plugins-bad
          gst_all_1.gst-plugins-ugly
-          gst_all_1.gst-rtsp-server
+          gst_all_1.gst-plugins-rs
          gst_all_1.gstreamer
+          glib
+          glib-networking
+          wrapGAppsHook4
+          # bzip2_1_1
+          # libsysprof-capture
+          # pcre2
+          # libunwind
+          # elfutils
+          # zstd
          openssl
          vulkan-loader
-          glib
        ]
        ++ (lib.optionals pkgs.stdenv.isLinux [
          gst_all_1.gstreamermm
          gst_all_1.gst-vaapi
+          cudatoolkit
+          # util-linux
+          # libselinux
+          # libsepol
          alsa-lib-with-plugins
          libxkbcommon
          udev
          wayland
          wayland-protocols
-          xorg.libX11
-          xorg.libXi
-          xorg.libXrandr
+          # xorg.libX11
+          # xorg.libXi
+          # xorg.libXrandr
        ])
        ++ (lib.optionals pkgs.stdenv.isDarwin [
          libiconv
@@ -159,35 +183,55 @@
          default = pkg;
        };
-        devShells = {
-          default =
+        devShells = rec {
+          rust-shell =
            pkgs.mkShell.override {
-              stdenv =
-                if pkgs.stdenv.isLinux
-                then (pkgs.stdenvAdapters.useMoldLinker pkgs.clangStdenv)
-                else pkgs.clangStdenv;
-            } (commonArgs
+              stdenv = pkgs.clangStdenv;
+              # if pkgs.stdenv.isLinux
+              # then (pkgs.stdenvAdapters.useMoldLinker pkgs.clangStdenv)
+              # else pkgs.clangStdenv;
+            }
+            (commonArgs
              // {
+                # GST_PLUGIN_PATH = "/run/current-system/sw/lib/gstreamer-1.0/";
+                GIO_EXTRA_MODULES = "${pkgs.glib-networking}/lib/gio/modules";
                packages = with pkgs;
                  [
                    toolchainWithRustAnalyzer
-                    cargo-nextest
-                    cargo-audit
+                    bacon
                    cargo-deny
                    cargo-expand
-                    bacon
-                    cargo-make
                    cargo-hack
+                    cargo-make
+                    cargo-nextest
                    cargo-outdated
                    lld
                    lldb
+                    (crates.buildCrate "cargo-with" {doCheck = false;})
+                    (crates.buildCrate "dioxus-cli" {
+                      nativeBuildInputs = with pkgs; [pkg-config];
+                      buildInputs = [openssl];
+                      doCheck = false;
+                    })
+                    (crates.buildCrate "cargo-hot" {
+                      nativeBuildInputs = with pkgs; [pkg-config];
+                      buildInputs = [openssl];
+                    })
                  ]
                  ++ (lib.optionals pkgs.stdenv.isDarwin [
                    apple-sdk_26
                  ])
                  ++ (lib.optionals pkgs.stdenv.isLinux [
-                    mold
+                    ffmpeg
+                    heaptrack
+                    samply
+                    cargo-flamegraph
+                    perf
+                    # mold
                  ]);
              });
+          default = rust-shell;
        };
      }
    )

gst/.github/workflows/build.yaml vendored Normal file
View File

@@ -0,0 +1,62 @@
name: build
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
env:
CARGO_TERM_COLOR: always
jobs:
checks-matrix:
runs-on: ubuntu-latest
outputs:
matrix: ${{ steps.set-matrix.outputs.matrix }}
steps:
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- id: set-matrix
name: Generate Nix Matrix
run: |
set -Eeu
matrix="$(nix eval --json '.#githubActions.matrix')"
echo "matrix=$matrix" >> "$GITHUB_OUTPUT"
checks-build:
needs: checks-matrix
runs-on: ${{ matrix.os }}
strategy:
matrix: ${{fromJSON(needs.checks-matrix.outputs.matrix)}}
steps:
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- run: nix build -L '.#${{ matrix.attr }}'
codecov:
runs-on: ubuntu-latest
permissions:
id-token: "write"
contents: "read"
steps:
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- name: Run codecov
run: nix build .#checks.x86_64-linux.hello-llvm-cov
- name: Upload coverage reports to Codecov
uses: codecov/codecov-action@v4.0.1
with:
flags: unittests
name: codecov-hello
fail_ci_if_error: true
token: ${{ secrets.CODECOV_TOKEN }}
files: ./result
verbose: true

gst/.github/workflows/docs.yaml vendored Normal file
View File

@@ -0,0 +1,38 @@
name: docs
on:
push:
branches: [ master ]
env:
CARGO_TERM_COLOR: always
jobs:
docs:
runs-on: ubuntu-latest
permissions:
id-token: "write"
contents: "read"
pages: "write"
steps:
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- uses: DeterminateSystems/flake-checker-action@main
- name: Generate docs
run: nix build .#checks.x86_64-linux.hello-docs
- name: Setup Pages
uses: actions/configure-pages@v5
- name: Upload artifact
uses: actions/upload-pages-artifact@v3
with:
path: result/share/doc
- name: Deploy to gh-pages
id: deployment
uses: actions/deploy-pages@v4

View File

@@ -1,3 +1,3 @@
/result
/target /target
.direnv .direnv
.media

gst/Cargo.lock generated Normal file

File diff suppressed because it is too large

gst/Cargo.toml Normal file
View File

@@ -0,0 +1,24 @@
[package]
name = "gst"
version = "0.1.0"
edition = "2024"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
error-stack = "0.6"
futures = "0.3.31"
futures-lite = "2.6.1"
glib = "0.21.5"
glib-sys = "0.21.5"
gstreamer = { version = "0.24.4", features = ["v1_26"] }
gstreamer-app = { version = "0.24.4", features = ["v1_26"] }
gstreamer-video = { version = "0.24.4", features = ["v1_26"] }
gstreamer-base = { version = "0.24.4", features = ["v1_26"] }
thiserror = "2.0"
tracing = { version = "0.1", features = ["log"] }
bitflags = "2.10.0"
[dev-dependencies]
smol = "2.0.2"
tracing-subscriber = "0.3.22"

gst/src/bin.rs Normal file
View File

@@ -0,0 +1,38 @@
use crate::priv_prelude::*;
wrap_gst!(Bin);
parent_child!(Element, Bin);
impl Bin {
pub fn new(name: impl AsRef<str>) -> Self {
let bin = gstreamer::Bin::with_name(name.as_ref());
Bin { inner: bin }
}
pub fn add(&mut self, element: &impl ChildOf<Element>) -> Result<&mut Self> {
self.inner
.add(&element.upcast_ref().inner)
.change_context(Error)
.attach("Failed to add element to bin")?;
Ok(self)
}
pub fn add_many<'a, E: ChildOf<Element> + 'a>(
&mut self,
elements: impl IntoIterator<Item = &'a E>,
) -> Result<&mut Self> {
self.inner
.add_many(elements.into_iter().map(|e| &e.upcast_ref().inner))
.change_context(Error)
.attach("Failed to add elements to bin")?;
Ok(self)
}
pub fn add_pad(&mut self, pad: &Pad) -> Result<&mut Self> {
self.inner
.add_pad(&pad.inner)
.change_context(Error)
.attach("Failed to add pad to bin")?;
Ok(self)
}
}

gst/src/bus.rs Normal file
View File

@@ -0,0 +1,27 @@
use crate::priv_prelude::*;
wrap_gst!(Bus);
impl Bus {
pub fn iter_timed(
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> gstreamer::bus::Iter<'_> {
let clocktime = match timeout.into() {
Some(dur) => gstreamer::ClockTime::try_from(dur).ok(),
None => gstreamer::ClockTime::NONE,
};
self.inner.iter_timed(clocktime)
}
pub fn stream(&self) -> gstreamer::bus::BusStream {
self.inner.stream()
}
pub fn filtered_stream<'a>(
&self,
msg_types: &'a [gstreamer::MessageType],
) -> impl futures::stream::FusedStream<Item = gstreamer::Message> + Unpin + Send + 'a {
self.inner.stream_filtered(msg_types)
}
}
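Reviewer note: filtered_stream is the async counterpart to iter_timed; a sketch of waiting for a terminal message on any executor (names mirror this crate's API; the pipeline argument is assumed to be in scope):

fn wait_for_terminal_message(pipeline: &Pipeline) -> Result<()> {
    use futures_lite::stream::StreamExt as _;
    let bus = pipeline.bus()?;
    smol::block_on(async {
        // Only Error and Eos messages are delivered to this stream.
        let mut messages = bus.filtered_stream(&[MessageType::Error, MessageType::Eos]);
        if let Some(msg) = messages.next().await {
            tracing::info!("terminal bus message: {:?}", msg.view());
        }
    });
    Ok(())
}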

gst/src/caps.rs Normal file
View File

@@ -0,0 +1,78 @@
use gstreamer::Fraction;
#[derive(Debug, Clone)]
#[repr(transparent)]
pub struct Caps {
pub(crate) inner: gstreamer::caps::Caps,
}
impl Caps {
pub fn builder(cs: CapsType) -> CapsBuilder {
CapsBuilder::new(cs)
}
}
pub struct CapsBuilder {
inner: gstreamer::caps::Builder<gstreamer::caps::NoFeature>,
}
impl CapsBuilder {
pub fn field<V: Into<glib::Value> + Send>(mut self, name: impl AsRef<str>, value: V) -> Self {
self.inner = self.inner.field(name.as_ref(), value);
self
}
pub fn build(self) -> Caps {
Caps {
inner: self.inner.build(),
}
}
}
pub enum CapsType {
Video,
Audio,
Text,
}
impl CapsType {
pub fn as_str(&self) -> &str {
match self {
CapsType::Video => "video/x-raw",
CapsType::Audio => "audio/x-raw",
CapsType::Text => "text/x-raw",
}
}
}
impl CapsBuilder {
pub fn new(cs: CapsType) -> Self {
CapsBuilder {
inner: gstreamer::Caps::builder(cs.as_str()),
}
}
}
impl Caps {
pub fn format(&self) -> Option<gstreamer_video::VideoFormat> {
self.inner
.structure(0)
.and_then(|s| s.get::<&str>("format").ok())
.map(|s| gstreamer_video::VideoFormat::from_string(s))
}
pub fn width(&self) -> Option<i32> {
self.inner
.structure(0)
.and_then(|s| s.get::<i32>("width").ok())
}
pub fn height(&self) -> Option<i32> {
self.inner
.structure(0)
.and_then(|s| s.get::<i32>("height").ok())
}
pub fn framerate(&self) -> Option<gstreamer::Fraction> {
self.inner
.structure(0)
.and_then(|s| s.get::<Fraction>("framerate").ok())
}
}
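Reviewer note: a quick round-trip sketch of the caps builder and accessors above (the field values are illustrative):

fn caps_round_trip() {
    let caps = Caps::builder(CapsType::Video)
        .field("format", "RGBA")
        .field("width", 1920i32)
        .field("height", 1080i32)
        .build();
    // The accessors read the same fields back out of structure(0).
    assert_eq!(caps.width(), Some(1920));
    assert_eq!(caps.height(), Some(1080));
}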

gst/src/element.rs Normal file
View File

@@ -0,0 +1,133 @@
use crate::priv_prelude::*;
use crate::wrap_gst;
wrap_gst!(Element, gstreamer::Element);
// pub trait IsElement {
// fn upcast_ref(&self) -> &Element;
// fn into_element(self) -> Element;
// fn pad(&self, name: &str) -> Option<Pad> {
// use gstreamer::prelude::*;
// self.upcast_ref().inner.static_pad(name).map(Pad::from)
// }
// }
// impl IsElement for Element {
// fn upcast_ref(&self) -> &Element {
// self
// }
//
// fn into_element(self) -> Element {
// self
// }
// }
impl Element {
pub fn pad(&self, name: impl AsRef<str>) -> Option<Pad> {
use gstreamer::prelude::*;
self.inner.static_pad(name.as_ref()).map(Pad::from)
}
pub fn bus(&self) -> Result<Bus> {
use gstreamer::prelude::*;
self.inner
.bus()
.map(Bus::from)
.ok_or(Error)
.attach_with(|| format!("Failed to get bus from Element: {}", self.inner.name()))
}
}
pub trait Sink: ChildOf<Element> {
fn sink(&self, name: impl AsRef<str>) -> Pad {
self.upcast_ref()
.pad(name)
.expect("Sink element has no sink pad")
}
}
pub trait Source: ChildOf<Element> {
fn source(&self, name: impl AsRef<str>) -> Pad {
self.upcast_ref()
.pad(name)
.expect("Source element has no src pad")
}
fn link<S: Sink>(&self, sink: &S) -> Result<Bin>
where
Self: Sized,
{
use gstreamer::prelude::*;
if let Ok(bin) = self.upcast_ref().inner.clone().downcast::<gstreamer::Bin>() {
bin.add(&sink.upcast_ref().inner)
.change_context(Error)
.attach("Failed to add sink to bin")?;
self.upcast_ref()
.inner
.link(&sink.upcast_ref().inner)
.change_context(Error)
.attach("Failed to link elements")?;
Ok(Bin::from(bin))
} else {
let bin = gstreamer::Bin::builder()
.name(format!(
"{}-link-{}",
self.upcast_ref().inner.name(),
sink.upcast_ref().inner.name()
))
.build();
bin.add(&self.upcast_ref().inner)
.change_context(Error)
.attach("Failed to add source to bin")?;
bin.add(&sink.upcast_ref().inner)
.change_context(Error)
.attach("Failed to add sink to bin")?;
self.upcast_ref()
.inner
.link(&sink.upcast_ref().inner)
.change_context(Error)
.attach("Failed to link elements")?;
if let Some(sink_pad) = self.upcast_ref().pad("sink") {
let ghost_pad = Pad::ghost(&sink_pad)?;
bin.add_pad(&ghost_pad.inner)
.change_context(Error)
.attach("Failed to add src pad to bin")?;
ghost_pad.activate(true)?;
}
Ok(From::from(bin))
}
}
// fn link_pad<S: Sink>(&self, sink: &S, src_pad_name: &str, sink_pad_name: &str) -> Result<()> {
// use gstreamer::prelude::*;
// let src_pad = self
// .upcast_ref()
// .pad(src_pad_name)
// .ok_or(Error)
// .attach("Source pad not found")?;
// let sink_pad = sink
// .upcast_ref()
// .pad(sink_pad_name)
// .ok_or(Error)
// .attach("Sink pad not found")?;
// src_pad
// .inner
// .link(&sink_pad.inner)
// .change_context(Error)
// .attach("Failed to link source pad to sink pad")?;
// Ok(())
// }
}
pub trait ElementExt: ChildOf<Element> + Sync {
#[track_caller]
fn bus(&self) -> Result<Bus> {
self.upcast_ref().bus()
}
#[track_caller]
fn pad(&self, name: impl AsRef<str>) -> Option<Pad> {
self.upcast_ref().pad(name)
}
}
impl<T: ChildOf<Element> + Sync> ElementExt for T {}
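Reviewer note: Source::link is the piece that lets two wrapped elements fuse into a reusable Bin with a ghosted sink pad; the appsink tests further down use it exactly like this (sketch with illustrative element names):

fn build_video_sink() -> Result<Bin> {
    let convert = VideoConvert::new("example-convert")?;
    let appsink = AppSink::new("example-sink")?;
    // Returns a Bin containing both elements, with videoconvert's sink pad
    // ghosted so the whole bin can be handed to playbin3 as a video-sink.
    convert.link(&appsink)
}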

gst/src/errors.rs Normal file
View File

@@ -0,0 +1,7 @@
pub use error_stack::{Report, ResultExt};
#[derive(Debug, thiserror::Error)]
#[error("An error occurred")]
pub struct Error;
pub type Result<T, E = error_stack::Report<Error>> = core::result::Result<T, E>;
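Reviewer note: pairing a single opaque Error with error_stack::Report means each call site attaches its own context; the idiom used throughout the crate looks like this (a sketch, assuming the crate's prelude is in scope):

fn make_element(factory: &str) -> Result<gstreamer::Element> {
    use error_stack::ResultExt as _;
    gstreamer::ElementFactory::make(factory)
        .build()
        .change_context(Error)
        .attach(format!("Failed to create {factory} element"))
}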

gst/src/lib.rs Normal file
View File

@@ -0,0 +1,64 @@
pub mod bin;
pub mod bus;
pub mod caps;
pub mod element;
pub mod errors;
pub mod pad;
pub mod pipeline;
pub mod plugins;
#[macro_use]
pub mod wrapper;
pub mod sample;
pub use bin::*;
pub use bus::*;
pub use caps::*;
pub use element::*;
pub use gstreamer;
#[doc(inline)]
pub use gstreamer::{Message, MessageType, MessageView, State};
pub use gstreamer_video::VideoFormat;
pub use pad::*;
pub use pipeline::*;
pub use plugins::*;
pub use sample::*;
pub(crate) mod priv_prelude {
pub use crate::errors::*;
pub use crate::wrapper::*;
pub use crate::*;
pub use gstreamer::prelude::ElementExt as _;
pub use gstreamer::prelude::*;
#[track_caller]
pub fn duration_to_clocktime(
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<Option<gstreamer::ClockTime>> {
match timeout.into() {
Some(dur) => {
let clocktime = gstreamer::ClockTime::try_from(dur)
.change_context(Error)
.attach("Failed to convert duration to ClockTime")?;
Ok(Some(clocktime))
}
None => Ok(gstreamer::ClockTime::NONE),
}
}
}
use std::sync::Arc;
static GST: std::sync::LazyLock<std::sync::Arc<Gst>> = std::sync::LazyLock::new(|| {
gstreamer::init().expect("Failed to initialize GStreamer");
std::sync::Arc::new(Gst {
__private: core::marker::PhantomData,
})
});
pub struct Gst {
__private: core::marker::PhantomData<()>,
}
impl Gst {
pub fn new() -> Arc<Self> {
Arc::clone(&GST)
}
}
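Reviewer note: routing gstreamer::init() through a LazyLock'd handle makes initialization idempotent and thread-safe; callers only need to grab a handle before touching any element (sketch):

fn init_once() -> Result<()> {
    let _gst = Gst::new(); // first call runs gstreamer::init(); later calls just clone the Arc
    let _sink = plugins::app::AppSink::new("example-sink")?;
    Ok(())
}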

gst/src/pad.rs Normal file
View File

@@ -0,0 +1,45 @@
use crate::priv_prelude::*;
wrap_gst!(Pad, gstreamer::Pad);
impl Pad {
#[track_caller]
pub fn ghost(target: &Pad) -> Result<Pad> {
let ghost_pad = gstreamer::GhostPad::with_target(&target.inner)
.change_context(Error)
.attach("Failed to create ghost pad")?;
Ok(Pad {
inner: ghost_pad.upcast(),
})
}
#[track_caller]
pub fn link(&self, peer: &Pad) -> Result<()> {
use gstreamer::prelude::*;
self.inner
.link(&peer.inner)
.change_context(Error)
.attach("Failed to link pads")?;
Ok(())
}
#[track_caller]
pub fn current_caps(&self) -> Result<Caps> {
let caps = self
.inner
.current_caps()
.ok_or(Error)
.attach("Failed to get pad caps")?;
Ok(Caps { inner: caps })
}
#[track_caller]
pub fn activate(&self, activate: bool) -> Result<()> {
use gstreamer::prelude::*;
self.inner
.set_active(activate)
.change_context(Error)
.attach("Failed to set_active pad")?;
Ok(())
}
}

gst/src/pipeline.rs Normal file
View File

@@ -0,0 +1,211 @@
use crate::priv_prelude::*;
wrap_gst!(Pipeline);
parent_child!(Element, Pipeline);
parent_child!(Bin, Pipeline);
impl Drop for Pipeline {
fn drop(&mut self) {
let _ = self.inner.set_state(gstreamer::State::Null);
}
}
impl Pipeline {
#[track_caller]
pub fn bus(&self) -> Result<Bus> {
let bus = self
.inner
.bus()
.ok_or(Error)
.attach("Failed to get bus from pipeline")?;
Ok(Bus::from_gst(bus))
}
/// Get the state
pub fn state(
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<gstreamer::State> {
let (result, current, _pending) = self.inner.state(duration_to_clocktime(timeout)?);
result.change_context(Error).attach("Failed to get state")?;
Ok(current)
}
pub fn play(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Playing)
.change_context(Error)
.attach("Failed to set pipeline to Playing state")?;
Ok(())
}
pub fn pause(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Paused)
.change_context(Error)
.attach("Failed to set pipeline to Paused state")?;
Ok(())
}
pub fn ready(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Ready)
.change_context(Error)
.attach("Failed to set pipeline to Ready state")?;
Ok(())
}
pub fn stop(&self) -> Result<()> {
self.inner
.set_state(gstreamer::State::Null)
.change_context(Error)
.attach("Failed to set pipeline to Null state")?;
Ok(())
}
pub fn set_state(&self, state: gstreamer::State) -> Result<gstreamer::StateChangeSuccess> {
let result = self
.inner
.set_state(state)
.change_context(Error)
.attach("Failed to set pipeline state")?;
Ok(result)
}
pub async fn wait_for(&self, state: gstreamer::State) -> Result<()> {
let current_state = self.state(core::time::Duration::ZERO)?;
if current_state == state {
Ok(())
} else {
// use futures::stream::StreamExt;
use futures_lite::stream::StreamExt as _;
self.bus()?
.filtered_stream(&[MessageType::StateChanged])
.find(|message: &gstreamer::Message| {
let view = message.view();
if let gstreamer::MessageView::StateChanged(changed) = view {
changed.current() == state
&& changed.src().is_some_and(|s| s == &self.inner)
} else {
false
}
})
.await;
Ok(())
}
}
pub async fn wait_for_states(&self, states: impl AsRef<[gstreamer::State]>) -> Result<()> {
let current_state = self.state(core::time::Duration::ZERO)?;
let states = states.as_ref();
if states.contains(&current_state) {
Ok(())
} else {
use futures_lite::stream::StreamExt as _;
self.bus()?
.filtered_stream(&[MessageType::StateChanged])
.find(|message: &gstreamer::Message| {
let view = message.view();
if let gstreamer::MessageView::StateChanged(changed) = view {
states.contains(&changed.current())
&& changed.src().is_some_and(|s| s == &self.inner)
} else {
false
}
})
.await;
Ok(())
}
}
pub async fn wait_for_message<'a, F2>(
&self,
filter: Option<&'a [gstreamer::MessageType]>,
filter_fn: F2,
) -> Result<gstreamer::Message>
where
F2: Fn(&gstreamer::Message) -> bool + Send + 'a,
{
use futures_lite::stream::StreamExt as _;
match filter {
Some(filter) => {
let message = self.bus()?.filtered_stream(filter).find(filter_fn).await;
match message {
Some(msg) => Ok(msg),
None => {
Err(Error).attach("Failed to find message matching the provided filter")
}
}
}
None => {
let message = self.bus()?.stream().find(filter_fn).await;
match message {
Some(msg) => Ok(msg),
None => {
Err(Error).attach("Failed to find message matching the provided filter")
}
}
}
}
}
}
pub trait PipelineExt: ChildOf<Pipeline> + Sync {
// #[track_caller]
// fn bus(&self) -> Result<Bus> {
// self.upcast_ref().bus()
// }
#[track_caller]
fn play(&self) -> Result<()> {
self.upcast_ref().play()
}
#[track_caller]
fn pause(&self) -> Result<()> {
self.upcast_ref().pause()
}
#[track_caller]
fn ready(&self) -> Result<()> {
self.upcast_ref().ready()
}
#[track_caller]
fn stop(&self) -> Result<()> {
self.upcast_ref().stop()
}
#[track_caller]
fn set_state(&self, state: gstreamer::State) -> Result<gstreamer::StateChangeSuccess> {
self.upcast_ref().set_state(state)
}
#[track_caller]
fn state(&self, timeout: impl Into<Option<core::time::Duration>>) -> Result<gstreamer::State> {
self.upcast_ref().state(timeout)
}
fn wait_for(
&self,
state: gstreamer::State,
) -> impl std::future::Future<Output = Result<()>> + Send {
self.upcast_ref().wait_for(state)
}
fn wait_for_states(
&self,
states: impl AsRef<[gstreamer::State]> + Send,
) -> impl std::future::Future<Output = Result<()>> + Send {
self.upcast_ref().wait_for_states(states)
}
fn wait_for_message<'a, F2>(
&self,
filter: Option<&'a [gstreamer::MessageType]>,
filter_fn: F2,
) -> impl std::future::Future<Output = Result<gstreamer::Message>> + Send
where
F2: Fn(&gstreamer::Message) -> bool + Send + 'a,
{
self.upcast_ref().wait_for_message(filter, filter_fn)
}
}
impl<T: ChildOf<Pipeline> + Sync> PipelineExt for T {}
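Reviewer note: the wait_for helpers are executor-agnostic futures; a sketch of blocking on preroll the way the appsink metadata test below does (the URI is a placeholder):

fn preroll() -> Result<()> {
    let playbin = Playbin3::new("example")?.with_uri("file:///tmp/example.mkv");
    playbin.pause()?;
    // Resolves once the pipeline's own StateChanged message reports Paused.
    smol::block_on(playbin.wait_for(gstreamer::State::Paused))?;
    Ok(())
}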

gst/src/plugins.rs Normal file
View File

@@ -0,0 +1,4 @@
pub mod app;
pub mod autodetect;
pub mod playback;
pub mod videoconvertscale;

gst/src/plugins/app.rs Normal file
View File

@@ -0,0 +1,2 @@
pub mod appsink;
pub use appsink::*;

View File

@@ -0,0 +1,278 @@
use crate::priv_prelude::*;
#[doc(inline)]
pub use gstreamer_app::AppSinkCallbacks;
wrap_gst!(AppSink, gstreamer::Element);
parent_child!(Element, AppSink);
pub struct AppSinkBuilder {
inner: AppSink,
callbacks: Option<gstreamer_app::app_sink::AppSinkCallbacksBuilder>,
}
impl AppSinkBuilder {
pub fn on_new_sample<F>(mut self, mut f: F) -> Self
where
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
{
let mut callbacks_builder = self
.callbacks
.take()
.unwrap_or_else(gstreamer_app::app_sink::AppSinkCallbacks::builder);
callbacks_builder = callbacks_builder.new_sample(move |appsink| {
use glib::object::Cast;
let element = appsink.upcast_ref::<gstreamer::Element>();
let appsink = AppSink::from_gst_ref(element);
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
.unwrap_or(Err(gstreamer::FlowError::Error))
.map(|_| gstreamer::FlowSuccess::Ok)
});
self.callbacks = Some(callbacks_builder);
self
}
pub fn on_new_preroll<F>(mut self, mut f: F) -> Self
where
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
{
let mut callbacks_builder = self
.callbacks
.take()
.unwrap_or_else(gstreamer_app::app_sink::AppSinkCallbacks::builder);
callbacks_builder = callbacks_builder.new_preroll(move |appsink| {
use glib::object::Cast;
let element = appsink.upcast_ref::<gstreamer::Element>();
let appsink = AppSink::from_gst_ref(element);
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
.unwrap_or(Err(gstreamer::FlowError::Error))
.map(|_| gstreamer::FlowSuccess::Ok)
});
self.callbacks = Some(callbacks_builder);
self
}
pub fn build(self) -> AppSink {
let AppSinkBuilder { inner, callbacks } = self;
if let Some(callbacks) = callbacks {
inner.appsink().set_callbacks(callbacks.build());
}
inner
}
}
impl Sink for AppSink {}
impl AppSink {
pub fn builder(name: impl AsRef<str>) -> AppSinkBuilder {
let inner = AppSink::new(name).expect("Failed to create AppSink");
AppSinkBuilder {
inner,
callbacks: None,
}
}
fn appsink(&self) -> &gstreamer_app::AppSink {
self.inner
.downcast_ref::<gstreamer_app::AppSink>()
.expect("Failed to downcast to AppSink")
}
pub fn new(name: impl AsRef<str>) -> Result<Self> {
let inner = gstreamer::ElementFactory::make("appsink")
.name(name.as_ref())
.build()
.change_context(Error)
.attach("Failed to create appsink element")?;
Ok(AppSink { inner })
}
pub fn emit_signals(&mut self, emit: bool) -> &mut Self {
self.inner.set_property("emit-signals", emit);
self
}
pub fn async_(&mut self, async_: bool) -> &mut Self {
self.inner.set_property("async", async_);
self
}
pub fn sync(&mut self, sync: bool) -> &mut Self {
self.inner.set_property("sync", sync);
self
}
pub fn drop(&mut self, drop: bool) -> &mut Self {
self.inner.set_property("drop", drop);
self
}
pub fn caps(&mut self, caps: Caps) -> &mut Self {
self.inner.set_property("caps", caps.inner);
self
}
pub fn callbacks(&mut self, callbacks: gstreamer_app::AppSinkCallbacks) -> &mut Self {
self.appsink().set_callbacks(callbacks);
self
}
pub fn on_new_sample<F>(&mut self, mut f: F) -> &mut Self
where
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
{
self.emit_signals(true).callbacks(
AppSinkCallbacks::builder()
.new_sample(move |appsink| {
use glib::object::Cast;
let element = appsink.upcast_ref::<gstreamer::Element>();
let appsink = AppSink::from_gst_ref(element);
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
.unwrap_or(Err(gstreamer::FlowError::Error))
.map(|_| gstreamer::FlowSuccess::Ok)
})
.build(),
)
}
pub fn pull_sample(&self) -> Result<Sample> {
self.appsink()
.pull_sample()
.change_context(Error)
.attach("Failed to pull sample from AppSink")
.map(Sample::from)
}
pub fn try_pull_sample(
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<Option<Sample>> {
Ok(self
.appsink()
.try_pull_sample(duration_to_clocktime(timeout)?)
.map(From::from))
}
pub fn pull_preroll(&self) -> Result<Sample> {
self.appsink()
.pull_preroll()
.change_context(Error)
.attach("Failed to pull preroll sample from AppSink")
.map(Sample::from)
}
pub fn try_pull_preroll(
&self,
timeout: impl Into<Option<core::time::Duration>>,
) -> Result<Option<Sample>> {
Ok(self
.appsink()
.try_pull_preroll(duration_to_clocktime(timeout)?)
.map(From::from))
}
}
#[test]
fn test_appsink() {
use gstreamer::prelude::*;
use tracing_subscriber::prelude::*;
tracing_subscriber::registry()
.with(
tracing_subscriber::fmt::layer()
.with_thread_ids(true)
.with_file(true),
)
.init();
tracing::info!("Linking videoconvert to appsink");
Gst::new();
let playbin3 = playback::Playbin3::new("pppppppppppppppppppppppppppppp").unwrap().with_uri("https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c");
let video_convert = plugins::videoconvertscale::VideoConvert::new("vcvcvcvcvcvcvcvcvcvcvcvcvc")
.expect("Create videoconvert");
let mut appsink = app::AppSink::new("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa").expect("Create appsink");
appsink.caps(
Caps::builder(CapsType::Video)
.field("format", "RGB")
.build(),
);
let video_sink = video_convert
.link(&appsink)
.expect("Link videoconvert to appsink");
let playbin3 = playbin3.with_video_sink(&video_sink);
playbin3.play().expect("Play video");
let bus = playbin3.bus().unwrap();
for msg in bus.iter_timed(None) {
match msg.view() {
gstreamer::MessageView::Eos(..) => {
tracing::info!("End of stream reached");
break;
}
gstreamer::MessageView::Error(err) => {
tracing::error!(
"Error from {:?}: {} ({:?})",
err.src().map(|s| s.path_string()),
err.error(),
err.debug()
);
break;
}
gstreamer::MessageView::StateChanged(state) => {
eprintln!(
"State changed from {:?} to \x1b[33m{:?}\x1b[0m for {:?}",
state.old(),
state.current(),
state.src().map(|s| s.path_string())
);
}
_ => {}
}
// tracing::info!("{:#?}", &msg.view());
}
// std::thread::sleep(std::time::Duration::from_secs(5));
}
#[test]
fn test_appsink_metadata() {
use tracing_subscriber::prelude::*;
tracing_subscriber::registry()
.with(
tracing_subscriber::fmt::layer()
.with_thread_ids(true)
.with_file(true),
)
.init();
crate::Gst::new();
let url = "https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c";
let videoconvert = crate::plugins::videoconvertscale::VideoConvert::new("iced-video-convert")
// .unwrap();
// .with_output_format(gst::plugins::videoconvertscale::VideoFormat::Rgba)
.unwrap();
let appsink = crate::plugins::app::AppSink::new("iced-video-sink")
.unwrap()
.with_async(true)
.with_sync(true);
let video_sink = videoconvert.link(&appsink).unwrap();
let playbin = crate::plugins::playback::Playbin3::new("iced-video")
.unwrap()
.with_uri(url)
.with_video_sink(&video_sink);
playbin.pause().unwrap();
smol::block_on(async {
playbin.wait_for(gstreamer::State::Paused).await.unwrap();
});
// std::thread::sleep(core::time::Duration::from_secs(1));
let pad = appsink.pad("sink").unwrap();
let caps = pad.current_caps().unwrap();
let format = caps.format();
let height = caps.height();
let width = caps.width();
let framerate = caps.framerate();
dbg!(&format, height, width, framerate);
dbg!(&caps);
}

View File

@@ -0,0 +1,2 @@
pub mod autovideosink;
pub use autovideosink::*;

View File

@@ -0,0 +1,18 @@
use crate::priv_prelude::*;
wrap_gst!(AutoVideoSink, gstreamer::Element);
parent_child!(Element, AutoVideoSink);
parent_child!(Bin, AutoVideoSink, downcast);
impl Sink for AutoVideoSink {}
impl AutoVideoSink {
pub fn new(name: impl AsRef<str>) -> Result<Self> {
let element = gstreamer::ElementFactory::make("autovideosink")
.name(name.as_ref())
.build()
.change_context(Error)
.attach("Failed to create autovideosink element")?;
Ok(AutoVideoSink { inner: element })
}
}

View File

@@ -0,0 +1,71 @@
pub mod playbin3;
pub use playbin3::*;
pub mod playbin;
pub use playbin::*;
bitflags::bitflags! {
/// Extra flags to configure the behaviour of the sinks.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct PlayFlags: u32 {
/// Render the video stream
const VIDEO = (1 << 0);
/// Render the audio stream
const AUDIO = (1 << 1);
/// Render subtitles
const TEXT = (1 << 2);
/// Render visualisation when no video is present
const VIS = (1 << 3);
/// Use software volume
const SOFT_VOLUME = (1 << 4);
/// Only use native audio formats
const NATIVE_AUDIO = (1 << 5);
/// Only use native video formats
const NATIVE_VIDEO = (1 << 6);
/// Attempt progressive download buffering
const DOWNLOAD = (1 << 7);
/// Buffer demuxed/parsed data
const BUFFERING = (1 << 8);
/// Deinterlace video if necessary
const DEINTERLACE = (1 << 9);
/// Use software color balance
const SOFT_COLORBALANCE = (1 << 10);
/// Force audio/video filter(s) to be applied
const FORCE_FILTERS = (1 << 11);
/// Force only software-based decoders (no effect for playbin3)
const FORCE_SW_DECODERS = (1 << 12);
}
}
const _: () = {
use glib::types::StaticType;
impl glib::types::StaticType for PlayFlags {
#[inline]
#[doc(alias = "gst_play_flags_get_type")]
fn static_type() -> glib::Type {
glib::Type::from_name("GstPlayFlags").expect("GstPlayFlags type not found")
}
}
impl glib::value::ToValue for PlayFlags {
#[inline]
fn to_value(&self) -> glib::Value {
let value = self.bits().to_value();
value
.transform_with_type(Self::static_type())
.expect("Failed to transform PlayFlags(u32) to GstPlayFlags")
}
#[inline]
fn value_type(&self) -> glib::Type {
Self::static_type()
}
}
impl From<PlayFlags> for glib::Value {
#[inline]
fn from(v: PlayFlags) -> Self {
// skip_assert_initialized!();
glib::value::ToValue::to_value(&v)
}
}
};
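Reviewer note: since PlayFlags round-trips through the GstPlayFlags GType via ToValue, flag sets compose with ordinary bitflags operators before being handed to an element (sketch):

fn compose_flags() {
    let flags = PlayFlags::VIDEO | PlayFlags::AUDIO | PlayFlags::TEXT;
    assert!(flags.contains(PlayFlags::AUDIO));
    // The GType lookup inside to_value() requires GStreamer (and the playback
    // plugin that registers GstPlayFlags) to be initialized first.
    let _value: glib::Value = flags.into();
}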

View File

@@ -0,0 +1,82 @@
use crate::priv_prelude::*;
wrap_gst!(Playbin, gstreamer::Element);
parent_child!(Element, Playbin);
parent_child!(Pipeline, Playbin, downcast);
parent_child!(Bin, Playbin, downcast);
impl Drop for Playbin {
fn drop(&mut self) {
self.set_state(gstreamer::State::Null).ok();
}
}
impl Playbin {
pub fn new(name: impl AsRef<str>) -> Result<Self> {
gstreamer::ElementFactory::make("playbin")
.name(name.as_ref())
.build()
.map(|element| Playbin { inner: element })
.change_context(Error)
}
pub fn with_uri(self, uri: impl AsRef<str>) -> Self {
self.inner.set_property("uri", uri.as_ref());
self
}
pub fn with_buffer_duration(self, duration: impl Into<Option<core::time::Duration>>) -> Self {
let duration = match duration.into() {
Some(dur) => dur.as_secs() as i64,
None => -1,
};
self.inner.set_property("buffer-duration", duration);
self
}
pub fn with_buffer_size(self, size: impl Into<Option<u32>>) -> Self {
let size = match size.into() {
Some(size) => size as i32,
None => -1,
};
self.inner.set_property("buffer-size", size);
self
}
/// Sets the maximum size of the ring buffer in bytes.
pub fn with_ring_buffer_max_size(self, size: u64) -> Self {
self.inner.set_property("ring-buffer-max-size", size);
self
}
pub fn with_video_sink(self, video_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("video-sink", &video_sink.upcast_ref().inner);
self
}
pub fn with_text_sink(self, text_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("text-sink", &text_sink.upcast_ref().inner);
self
}
pub fn with_audio_sink(self, audio_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("audio-sink", &audio_sink.upcast_ref().inner);
self
}
pub fn set_volume(&self, volume: f64) {
self.inner.set_property("volume", volume.clamp(1.0, 100.0))
}
pub fn get_volume(&self) -> f64 {
self.inner.property::<f64>("volume")
}
pub fn with_flags(self, flags: playback::PlayFlags) -> Self {
self.inner.set_property("flags", flags);
self
}
}

View File

@@ -0,0 +1,95 @@
use crate::priv_prelude::*;
use playback::PlayFlags;
wrap_gst!(Playbin3, gstreamer::Element);
parent_child!(Element, Playbin3);
parent_child!(Pipeline, Playbin3, downcast);
parent_child!(Bin, Playbin3, downcast);
impl Drop for Playbin3 {
fn drop(&mut self) {
self.set_state(gstreamer::State::Null).ok();
}
}
impl Playbin3 {
pub fn new(name: impl AsRef<str>) -> Result<Self> {
gstreamer::ElementFactory::make("playbin3")
.name(name.as_ref())
.build()
.map(|element| Playbin3 { inner: element })
.change_context(Error)
}
pub fn with_uri(self, uri: impl AsRef<str>) -> Self {
self.inner.set_property("uri", uri.as_ref());
self
}
pub fn with_buffer_duration(self, duration: impl Into<Option<core::time::Duration>>) -> Self {
let duration = match duration.into() {
// the "buffer-duration" property expects nanoseconds; -1 means "use the default"
Some(dur) => dur.as_nanos() as i64,
None => -1,
};
self.inner.set_property("buffer-duration", duration);
self
}
pub fn with_buffer_size(self, size: impl Into<Option<u32>>) -> Self {
let size = match size.into() {
Some(size) => size as i32,
None => -1,
};
self.inner.set_property("buffer-size", size);
self
}
/// Sets the maximum size of the ring buffer in bytes.
pub fn with_ring_buffer_max_size(self, size: u64) -> Self {
self.inner.set_property("ring-buffer-max-size", size);
self
}
pub fn with_video_sink(self, video_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("video-sink", &video_sink.upcast_ref().inner);
self
}
pub fn with_text_sink(self, text_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("text-sink", &text_sink.upcast_ref().inner);
self
}
pub fn with_audio_sink(self, audio_sink: &impl ChildOf<Element>) -> Self {
self.inner
.set_property("audio-sink", &audio_sink.upcast_ref().inner);
self
}
pub fn set_volume(&self, volume: f64) {
self.inner.set_property("volume", volume.clamp(1.0, 100.0))
}
pub fn get_volume(&self) -> f64 {
self.inner.property::<f64>("volume")
}
pub fn with_flags(self, flags: playback::PlayFlags) -> Self {
self.inner.set_property("flags", flags);
self
}
}
impl Playbin3 {
pub fn default_flags() -> PlayFlags {
PlayFlags::SOFT_COLORBALANCE
| PlayFlags::DEINTERLACE
| PlayFlags::BUFFERING
| PlayFlags::SOFT_VOLUME
| PlayFlags::TEXT
| PlayFlags::AUDIO
| PlayFlags::VIDEO
}
}
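A sketch of pairing `default_flags` with the builder; subtracting a flag shows how a caller might opt out, since bitflags implements `Sub` as set difference. The URI is made up.

fn player_without_subtitles() -> Result<Playbin3> {
    Ok(Playbin3::new("player")?
        .with_uri("https://example.com/show.mkv")
        // everything from default_flags() except subtitle rendering
        .with_flags(Playbin3::default_flags() - PlayFlags::TEXT))
}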

View File

@@ -0,0 +1,2 @@
pub mod videoconvert;
pub use videoconvert::*;

View File

@@ -0,0 +1,36 @@
use crate::priv_prelude::*;
#[doc(inline)]
pub use gstreamer_video::VideoFormat;
wrap_gst!(VideoConvert, gstreamer::Element);
parent_child!(Element, VideoConvert);
impl Sink for VideoConvert {}
impl Source for VideoConvert {}
impl VideoConvert {
pub fn new(name: impl AsRef<str>) -> Result<Self> {
let element = gstreamer::ElementFactory::make("videoconvert")
.name(name.as_ref())
.build()
.change_context(Error)
.attach("Failed to create videoconvert element")?;
Ok(VideoConvert { inner: element })
}
// pub fn with_caps(mut self, caps: &gstreamer::Caps) -> Self {
// use gstreamer::prelude::*;
// self.inner.set_property("caps", caps);
// self
// }
pub fn with_output_format(self, format: VideoFormat) -> Result<Self> {
use gstreamer::prelude::*;
let caps = Caps::builder(CapsType::Video)
.field("format", format.to_str())
.build();
self.inner.set_property("caps", &caps.inner);
// .change_context(Error)
// .attach("Failed to set output format on videoconvert")?;
Ok(self)
}
}
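One caveat worth flagging: the stock GStreamer videoconvert element does not advertise a `caps` property, so the `set_property("caps", ...)` call above would likely abort at runtime. The conventional route is a separate capsfilter; a sketch in raw gstreamer-rs terms (the helper name is hypothetical, the element names are standard):

use gstreamer::prelude::*;

fn format_filter(format: VideoFormat) -> Option<gstreamer::Element> {
    let filter = gstreamer::ElementFactory::make("capsfilter").build().ok()?;
    let caps = gstreamer::Caps::builder("video/x-raw")
        .field("format", format.to_str())
        .build();
    filter.set_property("caps", &caps); // capsfilter does expose a `caps` property
    Some(filter) // link as: upstream ! videoconvert ! capsfilter ! downstream
}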

37
gst/src/sample.rs Normal file
View File

@@ -0,0 +1,37 @@
impl From<gstreamer::Sample> for Sample {
fn from(inner: gstreamer::Sample) -> Self {
Sample { inner }
}
}
#[repr(transparent)]
#[derive(Debug, Clone)]
pub struct Sample {
pub inner: gstreamer::Sample,
}
use gstreamer::BufferRef;
impl Sample {
#[doc(alias = "empty")]
pub fn new() -> Self {
Self {
inner: gstreamer::Sample::builder().build(),
}
}
pub fn buffer(&self) -> Option<&BufferRef> {
self.inner.buffer()
}
pub fn caps(&self) -> Option<&gstreamer::CapsRef> {
self.inner.caps()
}
pub fn info(&self) -> Option<&gstreamer::StructureRef> {
self.inner.info()
}
// pub fn set_buffer(&mut self) {
// self.inner.set_buffer(None);
// }
}
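A hedged sketch of draining bytes out of a Sample, for instance one pulled from an appsink; `map_readable` is the standard gstreamer accessor, and the resulting map derefs to `&[u8]`:

fn frame_bytes(sample: &Sample) -> Option<Vec<u8>> {
    let buffer = sample.buffer()?;
    // fails only when the buffer cannot be mapped for reading
    let map = buffer.map_readable().ok()?;
    Some(map.to_vec())
}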

2
gst/src/wgpu.rs Normal file
View File

@@ -0,0 +1,2 @@

145
gst/src/wrapper.rs Normal file
View File

@@ -0,0 +1,145 @@
pub trait GstWrapper {
type GstType: glib::prelude::ObjectType;
fn from_gst(gst: Self::GstType) -> Self;
// fn into_gst(self) -> Self::GstType;
fn as_gst_ref(&self) -> &Self::GstType;
fn from_gst_ref(gst: &Self::GstType) -> &Self;
}
#[macro_export]
macro_rules! wrap_gst {
($name:ident) => {
$crate::wrap_gst!($name, gstreamer::$name);
};
($name:ident, $inner:ty) => {
$crate::wrap_gst!(core $name, $inner);
$crate::wrap_gst!($name, $inner, into_inner);
};
($name:ident, $inner:ty, skip_inner) => {
$crate::wrap_gst!(core $name, $inner);
};
(core $name:ident, $inner:ty) => {
#[derive(Debug, Clone)]
#[repr(transparent)]
pub struct $name {
pub(crate) inner: $inner,
}
// impl From<$name> for $inner {
// fn from(wrapper: $name) -> Self {
// wrapper.into_inner()
// }
// }
impl $name {
/// Returns a clone of the wrapped object; `self` cannot be moved out of
/// because some wrappers implement `Drop`, and GStreamer objects are
/// refcounted, so the clone only bumps a refcount.
pub fn into_inner(self) -> $inner {
self.inner.clone()
}
}
impl $crate::wrapper::GstWrapper for $name {
type GstType = $inner;
fn from_gst(gst: Self::GstType) -> Self {
Self { inner: gst }
}
// fn into_gst(self) -> Self::GstType {
// self.inner.clone()
// }
fn as_gst_ref(&self) -> &Self::GstType {
&self.inner
}
fn from_gst_ref(gst: &Self::GstType) -> &Self {
// SAFETY: `$name` is #[repr(transparent)] over `$inner`, so this
// reference cast changes neither layout nor lifetime.
unsafe { &*(gst as *const Self::GstType as *const Self) }
}
}
impl ChildOf<$name> for $name {
fn upcast_ref(&self) -> &$name {
self
}
}
};
($name:ident, $inner:ty, into_inner) => {
impl From<$inner> for $name {
fn from(inner: $inner) -> Self {
Self { inner }
}
}
};
}
/// A trait for types that can be upcasted to type T.
pub trait ChildOf<T> {
fn upcast_ref(&self) -> &T;
}
#[macro_export]
macro_rules! parent_child {
($parent:ty, $child:ty) => {
impl ChildOf<$parent> for $child
where
$child: GstWrapper,
$parent: GstWrapper,
{
fn upcast_ref(&self) -> &$parent {
let upcasted = self.inner.upcast_ref::<<$parent as GstWrapper>::GstType>();
unsafe { &*(upcasted as *const <$parent as GstWrapper>::GstType as *const $parent) }
}
}
};
($parent:ty, $child:ty, downcast) => {
impl ChildOf<$parent> for $child
where
$child: GstWrapper,
$parent: GstWrapper,
{
fn upcast_ref(&self) -> &$parent {
let downcasted = self
.inner
.downcast_ref::<<$parent as GstWrapper>::GstType>()
.unwrap_or_else(|| {
panic!(
"BUG: Failed to downcast GStreamer type from child {} to parent {}",
stringify!($child),
stringify!($parent)
)
});
unsafe {
&*(downcasted as *const <$parent as GstWrapper>::GstType as *const $parent)
}
}
}
}; // ($parent:ty, $child:ty, deref) => {
// $crate::parent_child!($parent, $child);
// $crate::parent_child!($parent, $child, __deref);
// };
//
// ($parent:ty, $child:ty, downcast, deref) => {
// $crate::parent_child!($parent, $child, downcast);
// $crate::parent_child!($parent, $child, __deref);
// };
// ($parent:ty, $child:ty, deref, downcast) => {
// $crate::parent_child!($parent, $child, downcast);
// $crate::parent_child!($parent, $child, __deref);
// };
//
// ($parent:ty, $child:ty, __deref) => {
// impl core::ops::Deref for $child
// where
// $child: GstWrapper,
// $parent: GstWrapper,
// {
// type Target = $parent;
//
// fn deref(&self) -> &Self::Target {
// self.upcast_ref()
// }
// }
// };
}
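A hypothetical expansion showing how the two macros compose for a new wrapper; `Queue` does not exist in this diff, and the crate's `Element` wrapper from the prelude is assumed:

wrap_gst!(Queue, gstreamer::Element);
parent_child!(Element, Queue);

fn as_element(queue: &Queue) -> &Element {
    // Free by construction: both wrappers are #[repr(transparent)] over
    // glib types, so upcast_ref is a pointer cast, not an allocation.
    queue.upcast_ref()
}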

8
jello-types/Cargo.toml Normal file
View File

@@ -0,0 +1,8 @@
[package]
name = "jello-types"
version = "0.1.0"
edition = "2024"
[dependencies]
serde = { version = "1.0.228", features = ["derive"] }
uuid = { version = "1.18.1", features = ["serde"] }

6
jello-types/src/lib.rs Normal file
View File

@@ -0,0 +1,6 @@
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize)]
pub struct User {
id: uuid::Uuid,
name: Option<String>,
primary_image_tag: Option<String>,
}
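Since the fields are private, a round-trip is easiest to show as an in-crate test; a sketch assuming a serde_json dev-dependency, which is not in the manifest above:

#[cfg(test)]
mod tests {
    use super::User;

    #[test]
    fn user_round_trips() {
        // Field names serialize as declared (snake_case) and None becomes null.
        let json = r#"{"id":"67e55044-10b1-426f-9247-bb680e5fe0c8","name":"alice","primary_image_tag":null}"#;
        let user: User = serde_json::from_str(json).expect("valid user JSON");
        assert_eq!(serde_json::to_string(&user).expect("serializes"), json);
    }
}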

View File

@@ -1,6 +1,18 @@
jello:
cargo r -r -- -vv
# iced-video:
# cd crates/iced-video && cargo run --release --example minimal
typegen:
@echo "Generating jellyfin type definitions..."
cd typegen && cargo run
cp typegen/jellyfin.rs api/src/jellyfin.rs
rm typegen/jellyfin.rs
hdrtest:
GST_DEBUG=3 gst-launch-1.0 playbin3 uri=https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c video-sink="videoconvert ! video/x-raw,format=(string)RGB10A2_LE ! fakesink"
codec:
GST_DEBUG=3 gst-discoverer-1.0 https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c
ffprobe:
ffprobe -v error -show_format -show_streams "https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c" | grep pix_fmt

View File

@@ -1,36 +1,38 @@
#[derive(Debug, clap::Parser)]
pub struct Cli {
- #[clap(subcommand)]
- pub cmd: SubCommand,
+ // #[clap(subcommand)]
+ // pub cmd: SubCommand,
+ #[command(flatten)]
+ pub verbosity: clap_verbosity_flag::Verbosity,
}
- #[derive(Debug, clap::Subcommand)]
- pub enum SubCommand {
- #[clap(name = "add")]
- Add(Add),
- #[clap(name = "list")]
- List(List),
- #[clap(name = "completions")]
- Completions { shell: clap_complete::Shell },
- }
- #[derive(Debug, clap::Args)]
- pub struct Add {
- #[clap(short, long)]
- pub name: String,
- }
- #[derive(Debug, clap::Args)]
- pub struct List {}
- impl Cli {
- pub fn completions(shell: clap_complete::Shell) {
- let mut command = <Cli as clap::CommandFactory>::command();
- clap_complete::generate(
- shell,
- &mut command,
- env!("CARGO_BIN_NAME"),
- &mut std::io::stdout(),
- );
- }
- }
+ // #[derive(Debug, clap::Subcommand)]
+ // pub enum SubCommand {
+ // #[clap(name = "add")]
+ // Add(Add),
+ // #[clap(name = "list")]
+ // List(List),
+ // #[clap(name = "completions")]
+ // Completions { shell: clap_complete::Shell },
+ // }
+ //
+ // #[derive(Debug, clap::Args)]
+ // pub struct Add {
+ // #[clap(short, long)]
+ // pub name: String,
+ // }
+ //
+ // #[derive(Debug, clap::Args)]
+ // pub struct List {}
+ //
+ // impl Cli {
+ // pub fn completions(shell: clap_complete::Shell) {
+ // let mut command = <Cli as clap::CommandFactory>::command();
+ // clap_complete::generate(
+ // shell,
+ // &mut command,
+ // env!("CARGO_BIN_NAME"),
+ // &mut std::io::stdout(),
+ // );
+ // }
+ // }

View File

@@ -1,52 +1,16 @@
+ mod cli;
mod errors;
use api::JellyfinConfig;
use errors::*;
- fn jellyfin_config_try() -> Result<JellyfinConfig> {
- let file = std::fs::read("config.toml").change_context(Error)?;
- let config: JellyfinConfig = toml::from_slice(&file)
- .change_context(Error)
- .attach("Failed to parse Jellyfin Config")?;
- Ok(config)
- }
- fn jellyfin_config() -> JellyfinConfig {
- jellyfin_config_try().unwrap_or_else(|err| {
- eprintln!("Error loading Jellyfin configuration: {:?}", err);
- std::process::exit(1);
- })
- }
fn main() -> Result<()> {
- tracing_subscriber::fmt::init();
- ui_iced::ui(jellyfin_config).change_context(Error)?;
+ color_backtrace::install();
+ let args = <cli::Cli as clap::Parser>::parse();
+ tracing_subscriber::fmt()
+ .with_max_level(args.verbosity)
+ .with_file(true)
+ .with_line_number(true)
+ .init();
+ ui_iced::ui().change_context(Error)?;
Ok(())
}
- // #[tokio::main]
- // pub async fn main() -> Result<()> {
- // dotenvy::dotenv()
- // .change_context(Error)
- // .inspect_err(|err| {
- // eprintln!("Failed to load .env file: {}", err);
- // })
- // .ok();
- // let config = JellyfinConfig::new(
- // std::env::var("JELLYFIN_USERNAME").change_context(Error)?,
- // std::env::var("JELLYFIN_PASSWORD").change_context(Error)?,
- // std::env::var("JELLYFIN_SERVER_URL").change_context(Error)?,
- // "jello".to_string(),
- // );
- // let mut jellyfin = api::JellyfinClient::new(config);
- // jellyfin
- // .authenticate_with_cached_token(".session")
- // .await
- // .change_context(Error)?;
- //
- // #[cfg(feature = "iced")]
- // ui_iced::ui(jellyfin);
- // #[cfg(feature = "gpui")]
- // ui_gpui::ui(jellyfin);
- //
- // Ok(())
- // }

14
store/Cargo.toml Normal file
View File

@@ -0,0 +1,14 @@
[package]
name = "store"
version = "0.1.0"
edition = "2024"
[dependencies]
bson = { version = "3.1.0", features = ["serde"] }
futures = "0.3.31"
parking_lot = "0.12.5"
redb = { version = "3.1.0", features = ["uuid"] }
secrecy = "0.10.3"
serde = "1.0.228"
tokio = { version = "1.48.0", features = ["rt"] }
uuid = { version = "1.18.1", features = ["v4"] }

10
store/src/lib.rs Normal file
View File

@@ -0,0 +1,10 @@
use std::collections::BTreeMap;
use uuid::Uuid;
pub struct ApiKey {
inner: secrecy::SecretBox<String>,
}
pub struct SecretStore {
api_keys: BTreeMap<Uuid, ApiKey>,
}
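A minimal sketch of the constructors these two types will need. `SecretBox::new` and `ExposeSecret` are secrecy 0.10 API; the method names here are assumptions, not part of the diff:

use secrecy::ExposeSecret;

impl ApiKey {
    pub fn new(key: String) -> Self {
        Self { inner: secrecy::SecretBox::new(Box::new(key)) }
    }

    /// Confines the plaintext to a single closure instead of handing it out.
    pub fn with_exposed<R>(&self, f: impl FnOnce(&str) -> R) -> R {
        f(self.inner.expose_secret().as_str())
    }
}

impl SecretStore {
    pub fn insert(&mut self, id: Uuid, key: ApiKey) -> Option<ApiKey> {
        self.api_keys.insert(id, key)
    }
}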

225
store/src/redb.rs Normal file
View File

@@ -0,0 +1,225 @@
// use std::{
// borrow::Borrow,
// collections::VecDeque,
// marker::PhantomData,
// path::Path,
// sync::{Arc, RwLock, atomic::AtomicBool},
// };
//
// use futures::task::AtomicWaker;
// use redb::{Error, Key, ReadableDatabase, TableDefinition, Value};
// use serde::{Serialize, de::DeserializeOwned};
//
// const USERS: TableDefinition<uuid::Uuid, Vec<u8>> = TableDefinition::new("users");
// const SERVERS: TableDefinition<uuid::Uuid, Vec<u8>> = TableDefinition::new("servers");
// const SETTINGS: TableDefinition<uuid::Uuid, Vec<u8>> = TableDefinition::new("settings");
//
// #[derive(Debug)]
// pub struct TableInner<T> {
// db: Arc<T>,
// }
//
// impl<T> Clone for TableInner<T> {
// fn clone(&self) -> Self {
// Self {
// db: Arc::clone(&self.db),
// }
// }
// }
//
// impl<T> TableInner<T> {
// fn new(db: Arc<T>) -> Self {
// Self { db }
// }
// }
//
// impl TableInner<DatabaseHandle> {
// async fn get<'a, K: Key, V: Serialize + DeserializeOwned>(
// &self,
// table: TableDefinition<'static, K, Vec<u8>>,
// key: impl Borrow<K::SelfType<'a>>,
// ) -> Result<Option<V>> {
// let db: &redb::Database = &self.db.as_ref().database;
// let db_reader = db.begin_read()?;
// let table = db_reader.open_table(table)?;
// table
// .get(key)?
// .map(|value| bson::deserialize_from_slice(&value.value()))
// .transpose()
// .map_err(|e| redb::Error::Io(std::io::Error::other(e)))
// }
//
// async fn insert<
// 'a,
// 'b,
// K: Key + Send + Sync,
// V: Serialize + DeserializeOwned + Send + Sync + 'a,
// >(
// &'b self,
// table: TableDefinition<'static, K, Vec<u8>>,
// key: impl Borrow<K::SelfType<'a>> + Send + 'b,
// value: V,
// ) -> Result<Option<V>> {
// let db: &redb::Database = &self.db.as_ref().database;
// // self.db
// // .writing
// // .store(true, std::sync::atomic::Ordering::SeqCst);
//
// // let out = tokio::task::spawn_blocking(move || -> Result<Option<V>>
//
// let out = tokio::task::spawn_blocking(|| -> Result<Option<V>> {
// let db_writer = db.begin_write()?;
// let out = {
// let mut table = db_writer.open_table(table)?;
// let serialized_value = bson::serialize_to_vec(&value)
// .map_err(|e| redb::Error::Io(std::io::Error::other(e)))?;
// let previous = table.insert(key, &serialized_value)?;
// let out = previous
// .map(|value| bson::deserialize_from_slice(&value.value()))
// .transpose()
// .map_err(|e| redb::Error::Io(std::io::Error::other(e)));
// out
// };
// db_writer.commit()?;
// out
// })
// .await
// .expect("Task panicked");
//
// out
// }
// }
//
// // impl<K: Key, V: Serialize + DeserializeOwned> Table<K, V> for TableInner {
// // async fn get(&self, key: K) -> Result<Option<Value>> {}
// // async fn insert(&self, key: K, value: V) -> Result<Option<Value>> {}
// // async fn modify(&self, key: K, v: FnOnce(V) -> V) -> Result<bool> {}
// // async fn remove(&self, key: K) -> Result<Option<Value>> {}
// // }
//
// #[derive(Debug)]
// pub struct Users<T>(TableInner<T>);
//
// impl<T> Clone for Users<T> {
// fn clone(&self) -> Self {
// Self(self.0.clone())
// }
// }
// impl<T> Users<T> {
// const TABLE: TableDefinition<'static, uuid::Uuid, Vec<u8>> = USERS;
// }
//
// #[derive(Debug)]
// pub struct Servers<T>(TableInner<T>);
// impl<T> Clone for Servers<T> {
// fn clone(&self) -> Self {
// Self(self.0.clone())
// }
// }
// impl<T> Servers<T> {
// const TABLE: TableDefinition<'static, uuid::Uuid, Vec<u8>> = SERVERS;
// }
//
// #[derive(Debug)]
// pub struct Settings<T>(TableInner<T>);
// impl<T> Clone for Settings<T> {
// fn clone(&self) -> Self {
// Self(self.0.clone())
// }
// }
// impl<T> Settings<T> {
// const TABLE: TableDefinition<'static, uuid::Uuid, Vec<u8>> = SETTINGS;
// }
//
// #[derive(Debug, Clone)]
// pub struct Database {
// users: Users<DatabaseHandle>,
// servers: Servers<DatabaseHandle>,
// settings: Settings<DatabaseHandle>,
// handle: Arc<DatabaseHandle>,
// }
//
// #[derive(Debug)]
// pub struct DatabaseHandle {
// database: redb::Database,
// writing: AtomicBool,
// wakers: RwLock<VecDeque<AtomicWaker>>,
// }
//
// #[derive(Debug)]
// pub struct DatabaseWriterGuard<'a> {
// handle: &'a DatabaseHandle,
// dropper: Arc<AtomicBool>,
// }
//
// // impl Drop for DatabaseWriterGuard<'_> {
// // fn drop(&mut self) {
// // self.handle
// // .writing
// // .store(false, std::sync::atomic::Ordering::SeqCst);
// // let is_panicking = std::thread::panicking();
// // let Ok(writer) = self.handle.wakers.write() else {
// // if is_panicking {
// // return;
// // } else {
// // panic!("Wakers lock poisoned");
// // }
// // }
// // if let Some(waker) = (self.handle.wakers.write()).pop() {
// // waker.wake();
// // };
// // // let mut wakers = self.handle.wakers.write().expect();
// // // if let Some(waker) = self.handle.wakers.write().expect("Wakers lock poisoned").pop_front() {
// // // waker.wake();
// // // }
// // // while let Some(waker) = wakers.pop_front() {
// // // waker.wake();
// // // }
// // }
// // }
//
// type Result<O, E = redb::Error> = core::result::Result<O, E>;
//
// pub trait Table<K: Key> {
// fn insert<V: Serialize + DeserializeOwned>(
// &self,
// key: K,
// value: V,
// ) -> impl Future<Output = Result<Option<V>>> + Send;
// fn modify<V: Serialize + DeserializeOwned, O: Serialize + DeserializeOwned>(
// &self,
// key: K,
// v: impl FnOnce(V) -> O,
// ) -> impl Future<Output = Result<bool>> + Send;
// fn remove<V: Serialize + DeserializeOwned>(
// &self,
// key: K,
// ) -> impl Future<Output = Result<Option<V>>> + Send;
// fn get<V: Serialize + DeserializeOwned>(
// &self,
// key: K,
// ) -> impl Future<Output = Result<Option<V>>> + Send;
// }
//
// impl Database {
// pub fn create(path: impl AsRef<Path>) -> Result<Self, Error> {
// let writing = AtomicBool::new(false);
// let wakers = RwLock::new(VecDeque::new());
// let db = redb::Database::create(path)?;
// let db = Arc::new(DatabaseHandle {
// database: db,
// writing,
// wakers,
// });
// let table_inner = TableInner::new(Arc::clone(&db));
// let users = Users(table_inner.clone());
// let servers = Servers(table_inner.clone());
// let settings = Settings(table_inner.clone());
// Ok(Self {
// servers,
// users,
// settings,
// handle: db,
// })
// }
// }
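For reference, the synchronous core that the commented draft wraps in spawn_blocking boils down to one write transaction per insert, with values serialized through bson. A sketch using an illustrative table; bson and the redb uuid feature are already in Cargo.toml:

use redb::{Database, TableDefinition};

const ITEMS: TableDefinition<uuid::Uuid, Vec<u8>> = TableDefinition::new("items");

fn insert_blocking<V: serde::Serialize>(
    db: &Database,
    key: uuid::Uuid,
    value: &V,
) -> Result<(), redb::Error> {
    let txn = db.begin_write()?;
    {
        // the table borrow must end before commit, hence the inner scope
        let mut table = txn.open_table(ITEMS)?;
        let bytes = bson::serialize_to_vec(value)
            .map_err(|e| redb::Error::Io(std::io::Error::other(e)))?;
        table.insert(key, &bytes)?;
    }
    txn.commit()?;
    Ok(())
}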

1
store/src/sqlite.rs Normal file
View File

@@ -0,0 +1 @@

1
store/src/toml.rs Normal file
View File

@@ -0,0 +1 @@

View File

@@ -1,262 +1,262 @@
// use ::tap::*;
//
// use std::{collections::BTreeMap, sync::Arc};
//
// use gpui::{
// App, Application, Bounds, ClickEvent, Context, ImageId, ImageSource, RenderImage, Resource,
// SharedString, Window, WindowBounds, WindowOptions, actions, div, prelude::*, px, rgb, size,
// };
//
// #[derive(Clone, Debug)]
// pub struct AppState {
// pub title: SharedString,
// pub items: BTreeMap<SharedString, Item>,
// pub item_ids: BTreeMap<usize, SharedString>,
// pub current_item: Option<SharedString>,
// pub errors: Vec<String>,
// pub jellyfin_client: api::JellyfinClient,
// }
//
// #[derive(Clone, Debug)]
// pub struct Item {
// pub id: SharedString,
// pub name: SharedString,
// pub item_type: SharedString,
// pub media_type: SharedString,
// }
//
// impl Render for AppState {
// fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
// div()
// .flex()
// .flex_col()
// .size_full()
// .justify_center()
// .text_color(rgb(0xffffff))
// .child(Self::header())
// .child(Self::body(self, window, cx))
// .child(Self::footer())
// }
// }
//
// actions!(jello_actions, [OpenItem, OnLoadItem, MouseDownEvent]);
//
// impl AppState {
// fn new(title: impl AsRef<str>, jellyfin_client: api::JellyfinClient) -> Self {
// AppState {
// title: SharedString::new(title.as_ref()),
// items: BTreeMap::new(),
// item_ids: BTreeMap::new(),
// current_item: None,
// errors: Vec::new(),
// jellyfin_client,
// }
// }
//
// // fn on_mouse_down(
// //     &mut self,
// //     event: &MouseDownEvent,
// //     window: &mut Window,
// //     cx: &mut Context<Self>,
// // ) {
// //     // Handle mouse down event
// // }
//
// fn load_item(id: usize) -> impl Fn(&mut Self, &ClickEvent, &mut Window, &mut Context<Self>) {
// move |state: &mut Self, event: &ClickEvent, window: &mut Window, cx: &mut Context<Self>| {
// let item_id = id;
// cx.spawn(async move |entity, app| {
// tracing::info!("Loading item with id: {}", item_id);
// });
// }
// }
//
// fn hover_item(id: usize) -> impl Fn(&mut Self, &bool, &mut Window, &mut Context<Self>) {
// move |state: &mut Self, item: &bool, window: &mut Window, cx: &mut Context<Self>| {
// dbg!("Hovering over item: {:?}", id);
// }
// }
//
// fn header() -> impl IntoElement {
// div()
// .flex()
// .flex_row()
// .w_full()
// .justify_end()
// .h_20()
// .border_10()
// .bg(rgb(0x333333))
// .child(Self::button("Refresh"))
// }
//
// fn footer() -> impl IntoElement {
// div().flex().flex_row().w_full().h_20().bg(rgb(0x333333))
// }
//
// fn body(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
// div()
// .flex()
// .flex_row()
// .size_full()
// .child(Self::content(self, window, cx))
// .child(Self::sidebar(self, window, cx))
// }
//
// fn button(label: &str) -> impl IntoElement {
// div()
// .flex()
// .justify_center()
// .items_center()
// .bg(rgb(0xff00ff))
// .text_color(rgb(0xffffff))
// .border_5()
// .rounded_lg()
// .child(label.to_string())
// }
//
// fn content(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
// div()
// .debug_below()
// .w_3_4()
// // .flex()
// // .flex_wrap()
// .bg(rgb(0x111111))
// .justify_start()
// .items_start()
// .overflow_hidden()
// .child(
// div()
// .size_full()
// .flex()
// .flex_wrap()
// .justify_start()
// .items_start()
// .content_start()
// .gap_y_10()
// .gap_x_10()
// .border_t_10()
// .p_5()
// .child(Self::card(cx, 1))
// .child(Self::card(cx, 2))
// .child(Self::card(cx, 3))
// .child(Self::card(cx, 4))
// .child(Self::card(cx, 5))
// .child(Self::card(cx, 6))
// .child(Self::card(cx, 7))
// .child(Self::card(cx, 8))
// .child(Self::card(cx, 9)),
// )
// }
//
// fn sidebar(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
// div()
// .flex()
// .flex_col()
// .w_1_4()
// .min_w_1_6()
// .bg(rgb(0x222222))
// .child(div().size_full().bg(gpui::yellow()))
// }
//
// fn card(cx: &mut Context<AppState>, number: usize) -> impl IntoElement {
// div()
// .id(number)
// .on_click(cx.listener(Self::load_item(number)))
// .on_hover(cx.listener(Self::hover_item(number)))
// .flex()
// .flex_col()
// .w_48()
// .h_64()
// .p_10()
// .bg(rgb(0xff00ff))
// .rounded_lg()
// }
// }
//
// pub fn ui(jellyfin_client: api::JellyfinClient) {
// Application::new().run(|cx: &mut App| {
// let bounds = Bounds::centered(None, size(px(500.0), px(500.0)), cx);
// cx.open_window(
// WindowOptions {
// window_bounds: Some(WindowBounds::Windowed(bounds)),
// ..Default::default()
// },
// |_, cx| cx.new(|_| AppState::new("Jello Media Browser", jellyfin_client)),
// )
// .expect("Failed to open window");
// })
// }
//
// #[derive(Clone, Debug)]
// pub struct Card {
// pub id: usize,
// pub title: SharedString,
// pub description: SharedString,
// pub image: SharedString,
// pub image_blurhash: BlurHash,
// pub media_type: SharedString,
// pub loading: bool,
// }
//
// impl Render for Card {
// fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
// div()
// .id(self.id)
// .flex()
// .flex_col()
// .w_48()
// .h_64()
// .p_10()
// .bg(rgb(0xff00ff))
// .rounded_lg()
// .pipe(|card| {
// if self.loading {
// card.child(self.image_blurhash.clone())
// } else {
// card.child(gpui::img(self.image.clone()))
// }
// })
// }
// }
//
// #[derive(Clone, Debug)]
// pub struct BlurHash {
// pub id: ImageId,
// pub data: Arc<RenderImage>,
// }
//
// impl BlurHash {
// pub fn new(
// data: impl AsRef<str>,
// width: u32,
// height: u32,
// punch: f32,
// ) -> Result<Self, error_stack::Report<crate::Error>> {
// use error_stack::ResultExt;
// let decoded =
// blurhash::decode(data.as_ref(), width, height, punch).change_context(crate::Error)?;
// let buffer = image::RgbaImage::from_raw(width, height, decoded)
// .ok_or(crate::Error)
// .attach("Failed to convert")?;
// let frame = image::Frame::new(buffer);
// let render_image = RenderImage::new([frame]);
// Ok(Self {
// id: render_image.id,
// data: Arc::from(render_image),
// })
// }
// }
//
// impl Render for BlurHash {
// fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
// gpui::img(ImageSource::Render(self.data.clone()))
// }
// }
//
// impl IntoElement for BlurHash {
// type Element = gpui::Img;
//
// fn into_element(self) -> Self::Element {
// gpui::img(ImageSource::Render(self.data.clone()))
// }
// }

View File

@@ -2,16 +2,31 @@
name = "ui-iced" name = "ui-iced"
version = "0.1.0" version = "0.1.0"
edition = "2024" edition = "2024"
license = "MIT"
[dependencies] [dependencies]
api = { version = "0.1.0", path = "../api" } api = { version = "0.1.0", path = "../api" }
blurhash = "0.2.3" blurhash = "0.2.3"
bytes = "1.11.0" bytes = "1.11.0"
gpui_util = "0.2.2" gpui_util = "0.2.2"
iced = { workspace = true } iced = { workspace = true, features = [
iced_video_player = { workspace = true } "advanced",
reqwest = "0.12.24" "canvas",
"image",
"sipper",
"tokio",
"debug",
"hot",
], default-features = true }
iced-video = { workspace = true }
iced_aw = "0.13.0"
iced_wgpu = "0.14.0"
iced_winit = "0.14.0"
reqwest = "0.13"
tap = "1.0.1" tap = "1.0.1"
toml = "0.9.8"
tracing = "0.1.41" tracing = "0.1.41"
url = "2.5.7" url = "2.5.7"
uuid = "1.18.1" uuid = "1.18.1"

View File

@@ -1,15 +1,20 @@
+ mod settings;
+ mod video;
mod shared_string;
- use iced_video_player::{Video, VideoPlayer};
+ use iced_video::{Ready, Video, VideoHandle};
use shared_string::SharedString;
+ use tap::Pipe as _;
use std::sync::Arc;
mod blur_hash;
use blur_hash::BlurHash;
mod preview;
- use preview::Preview;
+ // use preview::Preview;
- use iced::{Alignment, Element, Length, Shadow, Task, widget::*};
+ use iced::{Alignment, Element, Length, Task, widget::*};
use std::collections::{BTreeMap, BTreeSet};
#[derive(Debug, Clone)]
@@ -21,6 +26,8 @@ pub struct ItemCache {
pub tree: BTreeMap<Option<uuid::Uuid>, BTreeSet<uuid::Uuid>>,
}
+ const BACKGROUND_COLOR: iced::Color = iced::Color::from_rgba8(30, 30, 30, 0.7);
impl ItemCache {
pub fn insert(&mut self, parent: impl Into<Option<uuid::Uuid>>, item: Item) {
let parent = parent.into();
@@ -102,37 +109,55 @@ pub enum Screen {
User,
Video,
}
+ #[derive(Debug, Clone)]
+ pub struct Config {
+ pub server_url: Option<String>,
+ pub device_id: Option<String>,
+ pub device_name: Option<String>,
+ pub client_name: Option<String>,
+ pub version: Option<String>,
+ }
+ impl Default for Config {
+ fn default() -> Self {
+ Config {
+ server_url: Some("http://localhost:8096".to_string()),
+ device_id: Some("jello-iced".to_string()),
+ device_name: Some("Jello Iced".to_string()),
+ client_name: Some("Jello".to_string()),
+ version: Some("0.1.0".to_string()),
+ }
+ }
+ }
#[derive(Debug, Clone)]
struct State {
loading: Option<Loading>,
current: Option<uuid::Uuid>,
cache: ItemCache,
- jellyfin_client: api::JellyfinClient,
+ jellyfin_client: Option<api::JellyfinClient>,
messages: Vec<String>,
history: Vec<Option<uuid::Uuid>>,
query: Option<String>,
screen: Screen,
- // Login form state
- username_input: String,
- password_input: String,
+ settings: settings::SettingsState,
is_authenticated: bool,
- // Video
- video: Option<Arc<Video>>,
+ video: Option<Arc<VideoHandle<Message, Ready>>>,
}
impl State {
- pub fn new(jellyfin_client: api::JellyfinClient) -> Self {
+ pub fn new() -> Self {
State {
loading: None,
current: None,
cache: ItemCache::default(),
- jellyfin_client,
+ jellyfin_client: None,
messages: Vec::new(),
history: Vec::new(),
query: None,
screen: Screen::Home,
- username_input: String::new(),
- password_input: String::new(),
+ settings: settings::SettingsState::default(),
is_authenticated: false,
video: None,
}
@@ -141,109 +166,23 @@ impl State {
#[derive(Debug, Clone)]
pub enum Message {
- OpenSettings,
- CloseSettings,
+ Settings(settings::SettingsMessage),
Refresh,
Search,
SearchQueryChanged(String),
OpenItem(Option<uuid::Uuid>),
LoadedItem(Option<uuid::Uuid>, Vec<Item>),
Error(String),
- SetToken(String),
Back,
Home,
- // Login-related messages
- UsernameChanged(String),
- PasswordChanged(String),
- Login,
- LoginSuccess(String),
- Logout,
- Video(VideoMessage),
+ Video(video::VideoMessage),
}
- #[derive(Debug, Clone)]
- pub enum VideoMessage {
- EndOfStream,
- Open(url::Url),
- Pause,
- Play,
- Seek(f64),
- Stop,
- Test,
- }
fn update(state: &mut State, message: Message) -> Task<Message> {
match message {
- Message::OpenSettings => {
- state.screen = Screen::Settings;
- Task::none()
- }
- Message::CloseSettings => {
- state.screen = Screen::Home;
- Task::none()
- }
- Message::UsernameChanged(username) => {
- state.username_input = username;
- Task::none()
- }
- Message::PasswordChanged(password) => {
- state.password_input = password;
- Task::none()
- }
- Message::Login => {
- let username = state.username_input.clone();
- let password = state.password_input.clone();
- // Update the client config with the new credentials
- let mut config = (*state.jellyfin_client.config).clone();
- config.username = username;
- config.password = password;
- Task::perform(
- async move {
- let mut client = api::JellyfinClient::new(config);
- client.authenticate().await
- },
- |result| match result {
- Ok(auth_result) => {
- if let Some(token) = auth_result.access_token {
- Message::LoginSuccess(token)
- } else {
- Message::Error("Authentication failed: No token received".to_string())
- }
- }
- Err(e) => Message::Error(format!("Login failed: {}", e)),
- },
- )
- }
- Message::LoginSuccess(token) => {
- state.jellyfin_client.set_token(token.clone());
- state.is_authenticated = true;
- state.password_input.clear();
- state.messages.push("Login successful!".to_string());
- state.screen = Screen::Home;
- // Save token and refresh items
- let client = state.jellyfin_client.clone();
- Task::perform(
- async move {
- let _ = client.save_token(".session").await;
- },
- |_| Message::Refresh,
- )
- }
- Message::Logout => {
- state.is_authenticated = false;
- state.jellyfin_client.set_token("");
- state.cache = ItemCache::default();
- state.current = None;
- state.username_input.clear();
- state.password_input.clear();
- state.messages.push("Logged out successfully".to_string());
- Task::none()
- }
+ Message::Settings(msg) => settings::update(state, msg),
Message::OpenItem(id) => {
- let client = state.jellyfin_client.clone();
+ if let Some(client) = state.jellyfin_client.clone() {
use api::jellyfin::BaseItemKind::*;
if let Some(cached) = id.as_ref().and_then(|id| state.cache.get(id))
&& matches!(cached._type, Video | Movie | Episode)
@@ -251,7 +190,7 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
let url = client
.stream_url(id.expect("ID exists"))
.expect("Failed to get stream URL");
- Task::done(Message::Video(VideoMessage::Open(url)))
+ Task::done(Message::Video(video::VideoMessage::Open(url)))
} else {
Task::perform(
async move {
@@ -267,6 +206,9 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
},
)
}
+ } else {
+ Task::none()
+ }
}
Message::LoadedItem(id, items) => {
state.cache.extend(id, items);
@@ -275,8 +217,7 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
Task::none()
}
Message::Refresh => {
- // Handle refresh logic
- let client = state.jellyfin_client.clone();
+ if let Some(client) = state.jellyfin_client.clone() {
let current = state.current;
Task::perform(
async move {
@@ -291,18 +232,15 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
Ok(items) => Message::LoadedItem(msg, items),
},
)
+ } else {
+ Task::none()
+ }
}
Message::Error(err) => {
tracing::error!("Error: {}", err);
state.messages.push(err);
Task::none()
}
- Message::SetToken(token) => {
- tracing::info!("Authenticated with token: {}", token);
- state.jellyfin_client.set_token(token);
- state.is_authenticated = true;
- Task::none()
- }
Message::Back => {
state.current = state.history.pop().unwrap_or(None);
Task::none()
@@ -313,12 +251,12 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
}
Message::SearchQueryChanged(query) => {
state.query = Some(query);
- // Handle search query change
Task::none()
}
Message::Search => {
// Handle search action
- let client = state.jellyfin_client.clone();
+ // let client = state.jellyfin_client.clone();
+ if let Some(client) = state.jellyfin_client.clone() {
let query = state.query.clone().unwrap_or_default();
Task::perform(async move { client.search(query).await }, |r| match r {
Err(e) => Message::Error(format!("Search failed: {}", e)),
@@ -327,65 +265,39 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
Message::LoadedItem(None, items)
}
})
- }
+ } else {
+ Task::none()
+ }
+ }
- Message::Video(msg) => match msg {
- VideoMessage::EndOfStream => {
- state.video = None;
- Task::none()
- }
- VideoMessage::Open(url) => {
- state.video = Video::new(&url)
- .inspect_err(|err| {
- tracing::error!("Failed to play video at {}: {:?}", url, err);
- })
- .ok()
- .map(Arc::new);
- Task::none()
- }
- VideoMessage::Pause => {
- if let Some(video) = state.video.as_mut().and_then(Arc::get_mut) {
- video.set_paused(true);
- }
- Task::none()
- }
- VideoMessage::Play => {
- if let Some(video) = state.video.as_mut().and_then(Arc::get_mut) {
- video.set_paused(false);
- }
- Task::none()
- }
- VideoMessage::Seek(position) => {
- // if let Some(ref video) = state.video {
- // // video.seek(position, true);
- // }
- Task::none()
- }
- VideoMessage::Stop => {
- state.video = None;
- Task::none()
- }
- VideoMessage::Test => {
- let url = url::Url::parse(
- // "file:///home/servius/Projects/jello/crates/iced_video_player/.media/test.mp4",
- "https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm",
- )
- .unwrap();
- state.video = Video::new(&url)
- .inspect_err(|err| {
- dbg!(err);
- })
- .ok()
- .map(Arc::new);
- Task::none()
- }
- },
+ Message::Video(msg) => video::update(state, msg),
+ _ => todo!(),
}
}
fn view(state: &State) -> Element<'_, Message> { fn view(state: &State) -> Element<'_, Message> {
let content = home(state);
match state.screen { match state.screen {
Screen::Settings => settings(state), Screen::Settings => {
Screen::Home | _ => home(state), let settings = settings::settings(state);
let settings = container(settings)
.width(Length::FillPortion(4))
.height(Length::FillPortion(4))
.style(container::rounded_box)
.pipe(mouse_area)
.on_press(Message::Refresh)
.pipe(|c| iced::widget::column![space::vertical(), c, space::vertical()])
.pipe(container)
.width(Length::Fill)
.width(Length::Fill)
.align_y(Alignment::Center)
.align_x(Alignment::Center)
.style(|_| container::background(BACKGROUND_COLOR))
.padding(50)
.pipe(mouse_area)
.on_press(Message::Settings(settings::SettingsMessage::Close));
stack![content, settings].into()
}
Screen::Home | _ => content,
} }
} }
@@ -396,38 +308,20 @@ fn home(state: &State) -> Element<'_, Message> {
.into()
}
- fn player(video: &Video) -> Element<'_, Message> {
- container(
- VideoPlayer::new(video)
- .width(Length::Fill)
- .height(Length::Fill)
- .content_fit(iced::ContentFit::Contain)
- .on_end_of_stream(Message::Video(VideoMessage::EndOfStream)),
- )
- .style(|_| container::background(iced::Color::BLACK))
- .width(Length::Fill)
- .height(Length::Fill)
- .align_x(Alignment::Center)
- .align_y(Alignment::Center)
- .into()
- }
fn body(state: &State) -> Element<'_, Message> {
if let Some(ref video) = state.video {
- player(video)
+ video::player(video)
} else {
- scrollable(
- container(
Grid::with_children(state.cache.items_of(state.current).into_iter().map(card))
.fluid(400)
- .spacing(50),
- )
+ .spacing(50)
+ .pipe(container)
.padding(50)
.align_x(Alignment::Center)
// .align_y(Alignment::Center)
.height(Length::Fill)
- .width(Length::Fill),
- )
+ .width(Length::Fill)
+ .pipe(scrollable)
.height(Length::Fill)
.into()
}
@@ -435,13 +329,17 @@ fn body(state: &State) -> Element<'_, Message> {
fn header(state: &State) -> Element<'_, Message> { fn header(state: &State) -> Element<'_, Message> {
row([ row([
container( text(
Button::new( state
Text::new(state.jellyfin_client.config.server_url.as_str()) .jellyfin_client
.align_x(Alignment::Start), .as_ref()
) .map(|c| c.config.server_url.as_str())
.on_press(Message::Home), .unwrap_or("No Server"),
) )
.align_x(Alignment::Start)
.pipe(button)
.on_press(Message::Home)
.pipe(container)
.padding(10) .padding(10)
.width(Length::Fill) .width(Length::Fill)
.height(Length::Fill) .height(Length::Fill)
@@ -450,16 +348,17 @@ fn header(state: &State) -> Element<'_, Message> {
.style(container::rounded_box)
.into(),
search(state),
- container(
row([
button("Refresh").on_press(Message::Refresh).into(),
- button("Settings").on_press(Message::OpenSettings).into(),
+ button("Settings")
+ .on_press(Message::Settings(settings::SettingsMessage::Open))
+ .into(),
button("TestVideo")
- .on_press(Message::Video(VideoMessage::Test))
+ .on_press(Message::Video(video::VideoMessage::Test))
.into(),
])
- .spacing(10),
- )
+ .spacing(10)
+ .pipe(container)
.padding(10)
.width(Length::Fill)
.height(Length::Fill)
@@ -475,14 +374,13 @@ fn header(state: &State) -> Element<'_, Message> {
}
fn search(state: &State) -> Element<'_, Message> {
- container(
TextInput::new("Search...", state.query.as_deref().unwrap_or_default())
.padding(10)
.size(16)
.width(Length::Fill)
.on_input(Message::SearchQueryChanged)
- .on_submit(Message::Search),
- )
+ .on_submit(Message::Search)
+ .pipe(container)
.padding(10)
.width(Length::Fill)
.height(Length::Shrink)
@@ -508,123 +406,6 @@ fn footer(state: &State) -> Element<'_, Message> {
.into()
}
- fn settings(state: &State) -> Element<'_, Message> {
- let content = if state.is_authenticated {
- // Authenticated view - show user info and logout
- column([
- Text::new("Settings").size(32).into(),
- container(
- column([
- Text::new("Account").size(24).into(),
- Text::new("Server URL").size(14).into(),
- Text::new(state.jellyfin_client.config.server_url.as_str())
- .size(12)
- .into(),
- container(Text::new("Status: Logged In").size(14))
- .padding(10)
- .width(Length::Fill)
- .into(),
- container(
- row([
- Button::new(Text::new("Logout"))
- .padding(10)
- .on_press(Message::Logout)
- .into(),
- Button::new(Text::new("Close"))
- .padding(10)
- .on_press(Message::CloseSettings)
- .into(),
- ])
- .spacing(10),
- )
- .padding(10)
- .width(Length::Fill)
- .into(),
- ])
- .spacing(10)
- .max_width(400)
- .align_x(Alignment::Center),
- )
- .padding(20)
- .width(Length::Fill)
- .align_x(Alignment::Center)
- .style(container::rounded_box)
- .into(),
- ])
- .spacing(20)
- .padding(50)
- .align_x(Alignment::Center)
- } else {
- // Not authenticated view - show login form
- column([
- Text::new("Settings").size(32).into(),
- container(
- column([
- Text::new("Login to Jellyfin").size(24).into(),
- Text::new("Server URL").size(14).into(),
- Text::new(state.jellyfin_client.config.server_url.as_str())
- .size(12)
- .into(),
- container(
- TextInput::new("Username", &state.username_input)
- .padding(10)
- .size(16)
- .on_input(Message::UsernameChanged),
- )
- .padding(10)
- .width(Length::Fill)
- .into(),
- container(
- TextInput::new("Password", &state.password_input)
- .padding(10)
- .size(16)
- .secure(true)
- .on_input(Message::PasswordChanged)
- .on_submit(Message::Login),
- )
- .padding(10)
- .width(Length::Fill)
- .into(),
- container(
- row([
- Button::new(Text::new("Login"))
- .padding(10)
- .on_press(Message::Login)
- .into(),
- Button::new(Text::new("Cancel"))
- .padding(10)
- .on_press(Message::CloseSettings)
- .into(),
- ])
- .spacing(10),
- )
- .padding(10)
- .width(Length::Fill)
- .into(),
- ])
- .spacing(10)
- .max_width(400)
- .align_x(Alignment::Center),
- )
- .padding(20)
- .width(Length::Fill)
- .align_x(Alignment::Center)
- .style(container::rounded_box)
- .into(),
- ])
- .spacing(20)
- .padding(50)
- .align_x(Alignment::Center)
- };
- container(content)
- .width(Length::Fill)
- .height(Length::Fill)
- .align_x(Alignment::Center)
- .align_y(Alignment::Center)
- .into()
- }
fn card(item: &Item) -> Element<'_, Message> {
let name = item
.name
@@ -660,25 +441,54 @@ fn card(item: &Item) -> Element<'_, Message> {
.into()
}
- // fn video(url: &str
- fn init(config: impl Fn() -> api::JellyfinConfig + 'static) -> impl Fn() -> (State, Task<Message>) {
- move || {
- let mut jellyfin = api::JellyfinClient::new(config());
+ fn init() -> (State, Task<Message>) {
+ // Create a default config for initial state
+ // let default_config = api::JellyfinConfig {
+ // server_url: "http://localhost:8096".parse().expect("Valid URL"),
+ // device_id: "jello-iced".to_string(),
+ // device_name: "Jello Iced".to_string(),
+ // client_name: "Jello".to_string(),
+ // version: "0.1.0".to_string(),
+ // };
+ // let default_client = api::JellyfinClient::new_with_config(default_config);
(
- State::new(jellyfin.clone()),
+ State::new(),
Task::perform(
- async move { jellyfin.authenticate_with_cached_token(".session").await },
- |token| match token {
- Ok(token) => Message::SetToken(token),
- Err(e) => Message::Error(format!("Authentication failed: {}", e)),
+ async move {
+ let config_str = std::fs::read_to_string("config.toml")
+ .map_err(|e| api::JellyfinApiError::IoError(e))?;
+ let config: api::JellyfinConfig = toml::from_str(&config_str).map_err(|e| {
+ api::JellyfinApiError::IoError(std::io::Error::new(
+ std::io::ErrorKind::InvalidData,
+ e,
+ ))
+ })?;
+ // Try to load cached token and authenticate
+ match std::fs::read_to_string(".session") {
+ Ok(token) => {
+ let client = api::JellyfinClient::pre_authenticated(token.trim(), config)?;
+ Ok((client, true))
+ }
+ Err(_) => {
+ // No cached token, create unauthenticated client
+ let client = api::JellyfinClient::new_with_config(config);
+ Ok((client, false))
+ }
+ }
+ },
+ |result: Result<_, api::JellyfinApiError>| match result {
+ // Ok((client, is_authenticated)) => Message::LoadedClient(client, is_authenticated),
+ Err(e) => Message::Error(format!("Initialization failed: {}", e)),
+ _ => Message::Error("Login Unimplemented".to_string()),
},
)
.chain(Task::done(Message::Refresh)),
)
}
- }
- pub fn ui(config: impl Fn() -> api::JellyfinConfig + 'static) -> iced::Result {
- iced::application(init(config), update, view).run()
+ pub fn ui() -> iced::Result {
+ iced::application(init, update, view).run()
}
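The refactor above leans heavily on tap::Pipe (imported at the top of the file) to keep widget chains linear instead of nesting constructors, so container(scrollable(grid)) becomes grid.pipe(scrollable).pipe(container) and the modifiers read in the order they apply. The idea in miniature:

fn plus_three(x: i32) -> i32 {
    // .pipe(f) is just f(self), usable anywhere in a method chain
    x.pipe(|x| x + 3)
}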

296
ui-iced/src/settings.rs Normal file
View File

@@ -0,0 +1,296 @@
use crate::*;
use iced::Element;
pub fn settings(state: &State) -> Element<'_, Message> {
screens::settings(state)
}
pub fn update(state: &mut State, message: SettingsMessage) -> Task<Message> {
match message {
SettingsMessage::Open => {
tracing::trace!("Opening settings");
state.screen = Screen::Settings;
}
SettingsMessage::Close => {
tracing::trace!("Closing settings");
state.screen = Screen::Home;
}
SettingsMessage::Select(screen) => {
tracing::trace!("Switching settings screen to {:?}", screen);
state.settings.screen = screen;
}
SettingsMessage::User(user) => state.settings.login_form.update(user),
SettingsMessage::Server(server) => state.settings.server_form.update(server),
}
Task::none()
}
pub fn empty() -> Element<'static, Message> {
column([]).into()
}
#[derive(Debug, Clone, Default)]
pub struct SettingsState {
login_form: LoginForm,
server_form: ServerForm,
screen: SettingsScreen,
}
#[derive(Debug, Clone)]
pub enum SettingsMessage {
Open,
Close,
Select(SettingsScreen),
User(UserMessage),
Server(ServerMessage),
}
#[derive(Debug, Clone)]
pub enum UserMessage {
Add,
UsernameChanged(String),
PasswordChanged(String),
// Edit(uuid::Uuid),
// Delete(uuid::Uuid),
Clear,
}
#[derive(Debug, Clone)]
pub enum ServerMessage {
Add,
NameChanged(String),
UrlChanged(String),
// Edit(uuid::Uuid),
// Delete(uuid::Uuid),
Clear,
}
#[derive(Debug, Clone, Default, PartialEq, Eq)]
pub enum SettingsScreen {
#[default]
Main,
Users,
Servers,
}
#[derive(Debug, Clone)]
pub struct ServerItem {
pub id: uuid::Uuid,
pub name: SharedString,
pub url: SharedString,
pub users: Vec<uuid::Uuid>,
}
#[derive(Debug, Clone)]
pub struct UserItem {
pub id: uuid::Uuid,
pub name: SharedString,
}
#[derive(Debug, Clone, Default)]
pub struct LoginForm {
username: String,
password: String,
}
impl LoginForm {
pub fn update(&mut self, message: UserMessage) {
match message {
UserMessage::UsernameChanged(data) => {
self.username = data;
}
UserMessage::PasswordChanged(data) => {
self.password = data;
}
UserMessage::Add => {
// Handle adding user
}
UserMessage::Clear => {
self.username.clear();
self.password.clear();
}
}
}
pub fn view(&self) -> Element<'_, Message> {
iced::widget::column![
text("Login Form"),
text_input("Enter Username", &self.username).on_input(|data| {
Message::Settings(SettingsMessage::User(UserMessage::UsernameChanged(data)))
}),
text_input("Enter Password", &self.password)
.secure(true)
.on_input(|data| {
Message::Settings(SettingsMessage::User(UserMessage::PasswordChanged(data)))
}),
row![
button(text("Add User")).on_press_maybe(self.validate()),
button(text("Cancel"))
.on_press(Message::Settings(SettingsMessage::User(UserMessage::Clear))),
]
.spacing(10),
]
.spacing(10)
.padding([10, 0])
.into()
}
pub fn validate(&self) -> Option<Message> {
(!self.username.is_empty() && !self.password.is_empty())
.then(|| Message::Settings(SettingsMessage::User(UserMessage::Add)))
}
}
#[derive(Debug, Clone, Default)]
pub struct ServerForm {
name: String,
url: String,
}
impl ServerForm {
pub fn update(&mut self, message: ServerMessage) {
match message {
ServerMessage::NameChanged(data) => {
self.name = data;
}
ServerMessage::UrlChanged(data) => {
self.url = data;
}
ServerMessage::Add => {
// Handle adding server
}
ServerMessage::Clear => {
self.name.clear();
self.url.clear();
}
}
}
pub fn view(&self) -> Element<'_, Message> {
iced::widget::column![
text("Add New Server"),
text_input("Enter server name", &self.name).on_input(|data| {
Message::Settings(SettingsMessage::Server(ServerMessage::NameChanged(data)))
}),
text_input("Enter server URL", &self.url).on_input(|data| {
Message::Settings(SettingsMessage::Server(ServerMessage::UrlChanged(data)))
}),
row![
button(text("Add Server")).on_press_maybe(self.validate()),
button(text("Cancel")).on_press(Message::Settings(SettingsMessage::Server(
ServerMessage::Clear
))),
]
.spacing(10),
]
.spacing(10)
.padding([10, 0])
.into()
}
pub fn validate(&self) -> Option<Message> {
(!self.name.is_empty() && !self.url.is_empty())
.then(|| Message::Settings(SettingsMessage::Server(ServerMessage::Add)))
}
}
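The validate() -> Option<Message> pattern pairs with on_press_maybe: None leaves the button disabled, so an invalid form simply cannot submit. A reusable sketch under that assumption (the helper name is hypothetical):

fn submit_button(label: &str, on_valid: Option<Message>) -> Element<'_, Message> {
    // Some(msg) yields an enabled button that emits msg; None greys it out.
    button(text(label)).on_press_maybe(on_valid).into()
}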
mod screens {
use iced_aw::Tabs;
use super::*;
pub fn settings(state: &State) -> Element<'_, Message> {
Tabs::new(|f| Message::Settings(SettingsMessage::Select(f)))
.push(
SettingsScreen::Main,
iced_aw::TabLabel::Text("General".into()),
main(state),
)
.push(
SettingsScreen::Servers,
iced_aw::TabLabel::Text("Servers".into()),
server(state),
)
.push(
SettingsScreen::Users,
iced_aw::TabLabel::Text("Users".into()),
user(state),
)
.set_active_tab(&state.settings.screen)
.into()
}
pub fn settings_screen(state: &State) -> Element<'_, Message> {
container(match state.settings.screen {
SettingsScreen::Main => main(state),
SettingsScreen::Servers => server(state),
SettingsScreen::Users => user(state),
})
.width(Length::FillPortion(10))
.height(Length::Fill)
.style(|theme| container::background(theme.extended_palette().background.base.color))
.pipe(container)
.padding(10)
.style(|theme| container::background(theme.extended_palette().secondary.base.color))
.width(Length::FillPortion(10))
.into()
}
pub fn settings_list(state: &State) -> Element<'_, Message> {
column(
[
button(center_text("General")).on_press(Message::Settings(
SettingsMessage::Select(SettingsScreen::Main),
)),
button(center_text("Servers")).on_press(Message::Settings(
SettingsMessage::Select(SettingsScreen::Servers),
)),
button(center_text("Users")).on_press(Message::Settings(SettingsMessage::Select(
SettingsScreen::Users,
))),
]
.map(|p| p.clip(true).width(Length::Fill).into()),
)
.width(Length::FillPortion(2))
.spacing(10)
.padding(10)
.pipe(scrollable)
.into()
}
pub fn main(state: &State) -> Element<'_, Message> {
Column::new()
.push(text("Main Settings"))
.push(toggler(true).label("HDR"))
.spacing(20)
.padding(20)
.pipe(container)
.into()
}
pub fn server(state: &State) -> Element<'_, Message> {
Column::new()
.push(text("Server Settings"))
.push(state.settings.server_form.view())
.spacing(20)
.padding(20)
.pipe(container)
.into()
}
pub fn user(state: &State) -> Element<'_, Message> {
Column::new()
.push(text("User Settings"))
.push(state.settings.login_form.view())
.spacing(20)
.padding(20)
.pipe(container)
.into()
}
}
pub fn center_text(content: &str) -> Element<'_, Message> {
text(content)
.align_x(Alignment::Center)
.width(Length::Fill)
.into()
}

View File

@@ -49,6 +49,21 @@ impl std::ops::Deref for SharedString {
}
}
+ #[derive(Clone, PartialEq, Eq, Hash)]
+ pub struct SecretSharedString(ArcCow<'static, str>);
+ impl core::fmt::Debug for SecretSharedString {
+ fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
+ f.write_str("(..secret..)")
+ }
+ }
+ impl From<String> for SecretSharedString {
+ fn from(s: String) -> Self {
+ Self(ArcCow::Owned(Arc::from(s)))
+ }
+ }
#[derive(Debug, PartialEq, Eq, Hash)]
pub enum ArcCow<'a, T: ?Sized> {
Borrowed(&'a T),
@@ -66,3 +81,9 @@
}
}
}
+ impl<'a, T> From<&'a T> for ArcCow<'a, T> {
+ fn from(value: &'a T) -> Self {
+ ArcCow::Borrowed(value)
+ }
+ }
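The hand-written Debug impl is the point of the type: a secret can sit inside an otherwise-derived Debug struct without leaking into logs. A quick hypothetical check:

#[cfg(test)]
mod secret_tests {
    use super::SecretSharedString;

    #[test]
    fn debug_output_is_redacted() {
        let secret = SecretSharedString::from("hunter2".to_string());
        // The blanket Debug derive on containers will print this placeholder.
        assert_eq!(format!("{:?}", secret), "(..secret..)");
    }
}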

78
ui-iced/src/video.rs Normal file
View File

@@ -0,0 +1,78 @@
use super::*;
#[derive(Debug, Clone)]
pub enum VideoMessage {
EndOfStream,
Open(url::Url),
Loaded(VideoHandle<Message, Ready>),
Pause,
Play,
Seek(f64),
Stop,
Test,
}
pub fn update(state: &mut State, message: VideoMessage) -> Task<Message> {
match message {
VideoMessage::EndOfStream => {
state.video = None;
Task::none()
}
VideoMessage::Open(url) => {
Task::perform(VideoHandle::load(url.clone()), move |result| match result {
Ok(video) => Message::Video(VideoMessage::Loaded(video)),
Err(err) => Message::Error(format!("Error opening video at {}: {:?}", url, err)),
})
}
VideoMessage::Loaded(video) => {
state.video = Some(Arc::new(
video.on_end_of_stream(Message::Video(VideoMessage::EndOfStream)),
));
Task::done(VideoMessage::Play).map(Message::Video)
}
VideoMessage::Pause => {
if let Some(ref video) = state.video {
video.pause();
}
Task::none()
}
VideoMessage::Play => {
if let Some(ref video) = state.video {
video.play();
}
Task::none()
}
VideoMessage::Seek(position) => {
// if let Some(ref video) = state.video {
// // video.seek(position, true);
// }
Task::none()
}
VideoMessage::Stop => {
if let Some(video) = state.video.as_ref() {
video.stop();
}
state.video = None;
Task::none()
}
VideoMessage::Test => {
let url = url::Url::parse("https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c")
.expect("Impossible: Failed to parse hardcoded URL");
Task::done(VideoMessage::Open(url)).map(Message::Video)
}
}
}
pub fn player(video: &VideoHandle<Message, Ready>) -> Element<'_, Message> {
container(
Video::new(video)
.width(Length::Fill)
.height(Length::Fill)
.content_fit(iced::ContentFit::Contain),
)
.style(|_| container::background(iced::Color::BLACK))
.width(Length::Fill)
.height(Length::Fill)
.align_x(Alignment::Center)
.align_y(Alignment::Center)
.into()
}