Compare commits: 77fe7b6bb4...sloppy

44 Commits

| SHA1 |
| --- |
| 0bc0fd8103 |
| dcbb5a127b |
| e66c457b57 |
| 76fc14c73b |
| 5b4fbd5df6 |
| e7fd01c0af |
| a040478069 |
| e5ef173473 |
| 429371002b |
| 335e8fdbef |
| 9dac0b6c78 |
| 97a7a632d4 |
| 29390140cd |
| 97c2b3f14c |
| 2b2e8060e7 |
| 584495453f |
| 99853167df |
| fc9555873b |
| a7ffa69326 |
| 4ed15c97f0 |
| a2491695b3 |
| 5a0bdae84b |
| 5d0b795ba5 |
| ebe2312272 |
| 3382aebb1f |
| 8d46bd2b85 |
| 043d1e99f0 |
| d42ef3b550 |
| 21cbaff610 |
| a0bda88246 |
| ccae03d105 |
| 232c0f4d53 |
| 5cec7821d0 |
| c2fdedf05a |
| 7003002b69 |
| c675c29be3 |
| 7f9152e8fd |
| 6cc83ba655 |
| 253d27c176 |
| c7afcd3f0d |
| d75a2fb7e4 |
| 73fcf9bad1 |
| 05ae9ff570 |
| ca1fd2e977 |
AGENTS.md (new file, 199 lines)
@@ -0,0 +1,199 @@
# Agent Guidelines for Jello

This document provides guidelines for AI coding agents working on the Jello codebase.

## Project Overview

Jello is a WIP video client for Jellyfin written in Rust, focusing on HDR video playback, using:

- **iced** - Primary GUI toolkit
- **gstreamer** - Video + audio decoding library
- **wgpu** - Rendering video from GStreamer in iced

## Build, Test, and Lint Commands

### Building

```bash
# Build in release mode
cargo build --release
cargo build -r

# Build specific workspace member
cargo build -p api
cargo build -p gst
cargo build -p ui-iced

# Run the application
cargo run --release -- -vv
just jello # Uses justfile
```

### Testing

```bash
# Run all tests in workspace
cargo test --workspace

# Run tests for a specific package
cargo test -p gst
cargo test -p api
cargo test -p iced-video

# Run a single test by name
cargo test test_appsink
cargo test -p gst test_appsink

# Run a specific test in a specific file
cargo test -p gst --test <test_file_name> <test_function_name>

# Run tests with output
cargo test -- --nocapture
cargo test -- --show-output
```

### Linting and Formatting

```bash
# Check code without building
cargo check
cargo check --workspace

# Run clippy (linter)
cargo clippy
cargo clippy --workspace
cargo clippy --workspace -- -D warnings

# Format code
cargo fmt
cargo fmt --all

# Check formatting without modifying files
cargo fmt --all -- --check
```

### Other Tools

```bash
# Check for security vulnerabilities and license compliance
cargo deny check

# Generate Jellyfin type definitions
just typegen
```

## Code Style Guidelines

### Rust Edition

- Use **Rust 2024 edition** (as specified in Cargo.toml files)

### Imports

- Use `use` statements at the top of files
- Group imports: std library, external crates, then local modules
- Use `crate::` for absolute paths within the crate
- Common pattern: create a `priv_prelude` module for internal imports
- Use `pub use` to re-export commonly used items
- Use wildcard imports (`use crate::priv_prelude::*;`) within internal modules when a prelude exists

Example:

```rust
use std::sync::Arc;

use reqwest::{Method, header::InvalidHeaderValue};
use serde::{Deserialize, Serialize};

use crate::errors::*;
```

### Naming Conventions

- **Types/Structs/Enums**: PascalCase (e.g., `JellyfinClient`, `Error`, `AppSink`)
- **Functions/Methods**: snake_case (e.g., `request_builder`, `stream_url`)
- **Variables**: snake_case (e.g., `access_token`, `device_id`)
- **Constants**: SCREAMING_SNAKE_CASE (e.g., `NEXT_ID`, `GST`)
- **Modules**: snake_case (e.g., `priv_prelude`, `error_stack`)
### Error Handling

- Use **`error-stack`** for error handling with context propagation
- Use **`thiserror`** for defining error types
- Standard error type pattern:

```rust
pub use error_stack::{Report, ResultExt};

#[derive(Debug, thiserror::Error)]
#[error("An error occurred")]
pub struct Error;

pub type Result<T, E = error_stack::Report<Error>> = core::result::Result<T, E>;
```

- Attach context to errors using `.change_context(Error)` and `.attach("description")`
- Use `#[track_caller]` on functions that may panic or error for better error messages
- Error handling example:

```rust
self.inner
    .set_state(gstreamer::State::Playing)
    .change_context(Error)
    .attach("Failed to set pipeline to Playing state")?;
```

### Types

- Prefer explicit types over type inference when it improves clarity
- Use `impl Trait` for function parameters when appropriate (e.g., `impl AsRef<str>`)
- Use `Option<T>` and `Result<T, E>` idiomatically
- Use `Arc<T>` for shared ownership
- Use newtype patterns for semantic clarity (e.g., `ApiKey` wrapping `secrecy::SecretBox<String>`); see the sketch below
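
A minimal sketch of the newtype and `impl AsRef<str>` points above (the `ApiKey` shown here is illustrative and assumes the `secrecy` 0.10 API, not code copied from the repo):

```rust
use secrecy::{ExposeSecret, SecretBox};

/// Hypothetical newtype wrapping a secret token for semantic clarity.
pub struct ApiKey(SecretBox<String>);

impl ApiKey {
    /// `impl AsRef<str>` keeps the constructor flexible (&str, String, ...).
    pub fn new(token: impl AsRef<str>) -> Self {
        Self(SecretBox::new(Box::new(token.as_ref().to_owned())))
    }

    /// Expose the inner secret only where it is actually needed.
    pub fn expose(&self) -> &str {
        self.0.expose_secret()
    }
}
```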

### Formatting

- Use 4 spaces for indentation
- Line length: aim for 100 characters, but not strictly enforced
- Use trailing commas in multi-line collections
- Follow standard Rust formatting conventions (enforced by `cargo fmt`)

### Documentation

- Add doc comments (`///`) for public APIs
- Use inline comments (`//`) sparingly; prefer self-documenting code
- Include examples in doc comments when helpful, as in the example below
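
For instance (an illustrative doc comment; the signature is made up for this guide, not copied from the codebase):

```rust
/// Builds the streaming URL for a library item.
///
/// # Examples
///
/// ```
/// let url = stream_url("https://example.org", "abc123");
/// assert!(url.ends_with("/Videos/abc123/stream?static=true"));
/// ```
pub fn stream_url(server: &str, item_id: &str) -> String {
    format!("{server}/Videos/{item_id}/stream?static=true")
}
```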

### Async/Await

- Use `tokio` as the async runtime
- Mark async functions with the `async` keyword
- Use `.await` for async operations
- Common pattern: `tokio::fs` for file operations (sketched below)
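
A small sketch of the `tokio::fs` pattern, reusing the standard error type from above (the path handling and function name are illustrative):

```rust
use crate::{Error, Result, ResultExt};

/// Reads a cached session token from disk without blocking the runtime.
pub async fn load_token(path: impl AsRef<std::path::Path>) -> Result<String> {
    tokio::fs::read_to_string(path.as_ref())
        .await
        .change_context(Error)
        .attach("Failed to read cached token")
}
```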

### Module Structure

- Use `mod.rs` or inline modules as appropriate
- Keep related functionality together
- Use `pub(crate)` for internal APIs
- Re-export commonly used items at the crate root (see the sketch below)
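
A sketch of how the `priv_prelude` and crate-root re-exports mentioned above can fit together (module contents are hypothetical, not the actual crate layout):

```rust
// lib.rs (illustrative layout)
pub mod errors {
    pub use error_stack::{Report, ResultExt};

    #[derive(Debug, thiserror::Error)]
    #[error("An error occurred")]
    pub struct Error;

    pub type Result<T, E = Report<Error>> = core::result::Result<T, E>;
}

/// Internal prelude: one place for the imports every private module needs.
pub(crate) mod priv_prelude {
    pub use crate::errors::*;
    pub use std::sync::Arc;
}

// Re-export commonly used items at the crate root.
pub use errors::{Error, Result};

mod some_internal_module {
    use crate::priv_prelude::*;

    pub(crate) fn shared(value: u32) -> Arc<u32> {
        Arc::new(value)
    }
}
```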

### Macros

- Custom macros used: `wrap_gst!`, `parent_child!`
- Use macros to reduce boilerplate, and only in the `gst` crate

### Testing

- Place tests in the same file with `#[test]` or `#[cfg(test)]`
- Use descriptive test function names (e.g., `test_appsink`, `unique_generates_different_ids`)
- Initialize tracing in tests when needed for debugging (see the example below)
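
For example, a test module following these points (the `init_tracing` helper is hypothetical; `tracing-subscriber` with the `env-filter` feature is assumed, and the test mirrors the `Id` test in `crates/iced-video`):

```rust
#[cfg(test)]
mod tests {
    use super::*;

    /// Opt-in tracing output while debugging a test:
    /// `RUST_LOG=debug cargo test -p iced-video unique_generates_different_ids -- --nocapture`
    fn init_tracing() {
        let _ = tracing_subscriber::fmt()
            .with_env_filter(tracing_subscriber::EnvFilter::from_default_env())
            .try_init();
    }

    #[test]
    fn unique_generates_different_ids() {
        init_tracing();
        assert_ne!(Id::unique(), Id::unique());
    }
}
```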

### Dependencies

- Prefer well-maintained crates from crates.io
- Use `workspace.dependencies` for shared dependencies across workspace members
- Pin versions when stability is important

### Workspace Structure

The project uses a Cargo workspace with multiple members:

- `.` - Main jello binary
- `api` - Jellyfin API client
- `gst` - GStreamer wrapper
- `ui-iced` - Iced UI implementation
- `ui-gpui` - GPUI UI implementation (optional)
- `store` - Secret/data/storage management
- `jello-types` - Shared type definitions
- `typegen` - Jellyfin type generator
- `crates/iced-video` - Custom iced video widget
- `examples/hdr-gstreamer-wgpu` - HDR example

### Project-Specific Patterns

- Use `LazyLock` for global initialization (e.g., GStreamer init)
- Use the builder pattern with method chaining (e.g., `request_builder()`)
- Use the `tap` crate's `.pipe()` for functional transformations
- Prefer `BTreeMap`/`BTreeSet` over `HashMap`/`HashSet` when order matters
- Prefer a functional programming style over an imperative one
- When building UIs, keep the handler and view code in the same module (e.g. the settings view and settings handler in the same file)
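
A compact sketch combining `LazyLock` and `.pipe()` (the `GST` static and URL helper are illustrative, not actual project code):

```rust
use std::sync::LazyLock;
use tap::Pipe;

/// Global, one-time GStreamer initialisation (illustrative).
static GST: LazyLock<()> = LazyLock::new(|| {
    gstreamer::init().expect("Failed to initialise GStreamer");
});

fn stream_path(server: &str, item_id: &str) -> String {
    LazyLock::force(&GST);
    // `.pipe()` keeps the transformation chain functional instead of imperative.
    format!("{server}/Videos/{item_id}/stream")
        .pipe(|url| format!("{url}?static=true"))
}
```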

## License

All code in this project is MIT licensed.
Cargo.lock (generated, 1697 lines)
File diff suppressed because it is too large
Cargo.toml (30 lines changed)
@@ -5,19 +5,24 @@ members = [
    "typegen",
    "ui-gpui",
    "ui-iced",
    "crates/iced_video_player",
    "store",
    "jello-types",
    "gst",
    "examples/hdr-gstreamer-wgpu",
    "crates/iced-video",
]

[workspace.dependencies]
iced = { git = "https://github.com/iced-rs/iced", features = [
    "advanced",
    "canvas",
    "image",
    "sipper",
    "tokio",
    "debug",
] }
iced_wgpu = { git = "https://github.com/iced-rs/iced" }
iced_video_player = { path = "crates/iced_video_player" }
iced = { version = "0.14.0" }
gst = { version = "0.1.0", path = "gst" }
iced_wgpu = { version = "0.14.0" }
iced-video = { version = "0.1.0", path = "crates/iced-video" }

[patch.crates-io]
iced_wgpu = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
iced_core = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
iced_renderer = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
iced_futures = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
iced = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }

[package]
name = "jello"

@@ -27,8 +32,11 @@ license = "MIT"

[dependencies]
api = { version = "0.1.0", path = "api" }
bytemuck = { version = "1.24.0", features = ["derive"] }
clap = { version = "4.5", features = ["derive"] }
clap-verbosity-flag = { version = "3.0.4", features = ["tracing"] }
clap_complete = "4.5"
color-backtrace = "0.7.2"
dotenvy = "0.15.7"
error-stack = "0.6"
thiserror = "2.0"
README.md (new file, 112 lines)
@@ -0,0 +1,112 @@
# Jello

A WIP video client for Jellyfin.

(Planned) Features

1. Integrate with Jellyfin
2. HDR video playback
3. Audio Track selection
4. Chapter selection

Libraries and frameworks used for this:
1. iced -> primary GUI toolkit
2. gstreamer -> primary video + audio decoding library
3. wgpu -> rendering the video from GStreamer in iced


### HDR
I'll try to document all my findings about HDR here.
I'm making this project mainly to learn about videos, color spaces and GPU programming, so I'm bound to make mistakes in either the code or my fundamental understanding of a concept. Please don't take anything in this text as absolute.

```rust
let window = ...; // use winit to get a window handle, check the example in this repo
let instance = wgpu::Instance::default();
let surface = instance.create_surface(window).unwrap();
let adapter = instance
    .request_adapter(&wgpu::RequestAdapterOptions {
        power_preference: wgpu::PowerPreference::default(),
        compatible_surface: Some(&surface),
        force_fallback_adapter: false,
    })
    .await
    .context("Failed to request wgpu adapter")?;
let caps = surface.get_capabilities(&adapter);
println!("{:#?}", caps.formats);
```

This should print out all the texture formats that can be used by your current hardware.
Among these, the formats that support HDR (afaik) are:

```
wgpu::TextureFormat::Rgba16Float
wgpu::TextureFormat::Rgba32Float
wgpu::TextureFormat::Rgb10a2Unorm
wgpu::TextureFormat::Rgb10a2Uint // (unsure)
```
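
Building on that, a fragment in the same spirit as the snippet above (my own sketch, not repo code) that picks the first HDR-capable format from the surface capabilities and falls back to the surface default:

```rust
// `caps` is the wgpu::SurfaceCapabilities queried above.
let hdr_format = caps
    .formats
    .iter()
    .copied()
    .find(|format| {
        matches!(
            format,
            wgpu::TextureFormat::Rgba16Float | wgpu::TextureFormat::Rgb10a2Unorm
        )
    })
    // Fall back to whatever the surface prefers if nothing wide is available.
    .unwrap_or(caps.formats[0]);
println!("using surface format {hdr_format:?}");
```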

My display supports Rgb10a2Unorm so I'll be going forward with that texture format.

`Rgb10a2Unorm` is still the same size as `Rgba8Unorm`, but the data is represented differently in each of them.

`Rgb10a2Unorm`:
R, G, B => 10 bits each (2^10 = 1024 [0..=1023])
A => 2 bits (2^2 = 4 [0..=3])

Whereas in a normal pixel,
`Rgba8Unorm`:
R, G, B, A => 8 bits each (2^8 = 256 [0..=255])


For displaying videos the alpha component is not really used (I don't know of any case where it is), so we can re-allocate the 6 bits from the alpha channel and put them into the R, G and B components.
In the shader the components get uniformly normalized from integer [0..=1023] to float [0..=1] so we can compute them properly.

Videos, however, are generally not stored in this format, or any RGB format in general, because it is not as efficient for (lossy) compression as YUV formats.

Right now I don't want to deal with YUV formats, so I'll use GStreamer caps to convert the video into the `Rgb10a2` format.


## Pixel formats and Planes
Dated: Sun Jan 4 09:09:16 AM IST 2026

| value | count | quantile | percentage | frequency |
| --- | --- | --- | --- | --- |
| yuv420p | 1815 | 0.5067001675041876 | 50.67% | ************************************************** |
| yuv420p10le | 1572 | 0.4388609715242881 | 43.89% | ******************************************* |
| yuvj420p | 171 | 0.04773869346733668 | 4.77% | **** |
| rgba | 14 | 0.003908431044109436 | 0.39% | |
| yuvj444p | 10 | 0.0027917364600781687 | 0.28% | |

For all of my media collection these are the pixel formats for all the videos.

### RGBA
Pretty self-evident:
8 bits for each of R, G, B and A.
Hopefully it shouldn't be too hard to make a function, or possibly a LUT, that takes data from RGBA and maps it to Rgb10a2Unorm (sketched after the diagram below).

```mermaid
packet
title RGBA
+8: "R"
+8: "G"
+8: "B"
+8: "A"
```
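
A rough sketch of that per-pixel mapping (my own illustration; a real implementation would likely use a LUT or do this on the GPU):

```rust
/// Expands an 8-bit RGBA pixel to the Rgb10a2Unorm bit layout:
/// R, G, B get 10 bits each, A gets the remaining 2 bits.
fn rgba8_to_rgb10a2(r: u8, g: u8, b: u8, a: u8) -> u32 {
    // Rescale 0..=255 to 0..=1023 with rounding.
    let expand = |c: u8| ((c as u32 * 1023 + 127) / 255) & 0x3FF;
    let a2 = (a as u32 >> 6) & 0x3; // keep the top 2 bits of alpha
    expand(r) | (expand(g) << 10) | (expand(b) << 20) | (a2 << 30)
}

fn main() {
    // Opaque mid-grey: 128/255 maps to 514/1023 in each colour channel.
    println!("{:#010x}", rgba8_to_rgb10a2(128, 128, 128, 255));
}
```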


### YUV
[All YUV formats](https://learn.microsoft.com/en-us/windows/win32/medfound/recommended-8-bit-yuv-formats-for-video-rendering#surface-definitions)
[10 and 16 bit yuv formats](https://learn.microsoft.com/en-us/windows/win32/medfound/10-bit-and-16-bit-yuv-video-formats)

Y -> Luminance
U, V -> Chrominance

p -> planar
sp -> semi-planar

j -> full range

Planar formats have each of the channels in a contiguous array, one after another.
In semi-planar formats the Y channel is separate and the U and V channels are interleaved.
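
To make the planar vs. semi-planar layout concrete, here's a small sketch (mine, not repo code) of where the planes start in a tightly packed 8-bit 4:2:0 semi-planar frame such as NV12:

```rust
/// (offset, size) in bytes of the Y plane and the interleaved UV plane.
fn nv12_planes(width: usize, height: usize) -> ((usize, usize), (usize, usize)) {
    let y_size = width * height; // one byte per Y sample
    let uv_size = y_size / 2;    // one U + one V byte per 2x2 block of Y samples
    ((0, y_size), (y_size, uv_size))
}

fn main() {
    // 1920x1080: Y plane is 2_073_600 bytes, the UV plane follows with 1_036_800 bytes.
    println!("{:?}", nv12_planes(1920, 1080));
}
```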


## Chroma Subsampling

api/Cargo.toml
@@ -2,6 +2,7 @@
name = "api"
version = "0.1.0"
edition = "2024"
license = "MIT"

[dependencies]
bytes = "1.11.0"
@@ -4,7 +4,7 @@ pub async fn main() {
    let config = std::fs::read_to_string("config.toml").expect("Config.toml");
    let config: JellyfinConfig = toml::from_str(&config).expect("Failed to parse config.toml");

    let mut jellyfin = JellyfinClient::new(config);
    let mut jellyfin = JellyfinClient::new_with_config(config);
    jellyfin
        .authenticate_with_cached_token(".session")
        .await
api/src/lib.rs (130 lines changed)
@@ -3,7 +3,7 @@ pub mod jellyfin;
use std::sync::Arc;

use ::tap::*;
use reqwest::Method;
use reqwest::{Method, header::InvalidHeaderValue};
use serde::{Deserialize, Serialize};

#[derive(thiserror::Error, Debug)]
@@ -15,6 +15,8 @@ pub enum JellyfinApiError {
    #[error("IO error: {0}")]
    IoError(#[from] std::io::Error),
    #[error("Unknown Jellyfin API error")]
    InvalidHeader(#[from] InvalidHeaderValue),
    #[error("Unknown Jellyfin API error")]
    Unknown,
}

@@ -28,7 +30,49 @@ pub struct JellyfinClient {
}

impl JellyfinClient {
    pub fn new(config: JellyfinConfig) -> Self {
    pub async fn authenticate(
        username: impl AsRef<str>,
        password: impl AsRef<str>,
        config: JellyfinConfig,
    ) -> Result<Self> {
        let url = format!("{}/Users/AuthenticateByName", config.server_url);
        let client = reqwest::Client::new();
        let token = client
            .post(url)
            .json(&jellyfin::AuthenticateUserByName {
                username: Some(username.as_ref().to_string()),
                pw: Some(password.as_ref().to_string()),
            })
            .send()
            .await?
            .error_for_status()?
            .json::<jellyfin::AuthenticationResult>()
            .await?
            .access_token
            .ok_or_else(|| std::io::Error::other("No field access_token in auth response"))?;
        Self::pre_authenticated(token, config)
    }

    pub fn pre_authenticated(token: impl AsRef<str>, config: JellyfinConfig) -> Result<Self> {
        let auth_header = core::iter::once((
            reqwest::header::HeaderName::from_static("x-emby-authorization"),
            reqwest::header::HeaderValue::from_str(&format!(
                "MediaBrowser Client=\"{}\", Device=\"{}\", DeviceId=\"{}\", Version=\"{}\"",
                config.client_name, config.device_name, config.device_id, config.version
            ))?,
        ))
        .collect();
        let client = reqwest::Client::builder()
            .default_headers(auth_header)
            .build()?;
        Ok(Self {
            client,
            access_token: Some(token.as_ref().to_string().into()),
            config: Arc::new(config),
        })
    }

    pub fn new_with_config(config: JellyfinConfig) -> Self {
        JellyfinClient {
            client: reqwest::Client::new(),
            access_token: None,
@@ -119,45 +163,6 @@ impl JellyfinClient {
        Ok(out)
    }

    pub async fn authenticate(&mut self) -> Result<jellyfin::AuthenticationResult> {
        let auth_result: jellyfin::AuthenticationResult = self
            .post(
                "Users/AuthenticateByName",
                &jellyfin::AuthenticateUserByName {
                    username: Some(self.config.username.clone()),
                    pw: Some(self.config.password.clone()),
                },
            )
            .await?;
        self.access_token = auth_result.access_token.clone().map(Into::into);
        Ok(auth_result)
    }

    pub async fn authenticate_with_cached_token(
        &mut self,
        path: impl AsRef<std::path::Path>,
    ) -> Result<String> {
        let path = path.as_ref();
        if let Ok(token) = self
            .load_token(path)
            .await
            .inspect_err(|err| tracing::warn!("Failed to load cached token: {}", err))
        {
            tracing::info!("Authenticating with cached token from {:?}", path);
            self.access_token = Some(token.clone().into());
            Ok(token)
        } else {
            tracing::info!("No cached token found at {:?}, authenticating...", path);
            let token = self
                .authenticate()
                .await?
                .access_token
                .ok_or_else(|| JellyfinApiError::Unknown)?;
            self.save_token(path).await?;
            Ok(token)
        }
    }

    pub async fn raw_items(&self) -> Result<jellyfin::BaseItemDtoQueryResult> {
        let text = &self
            .request_builder(Method::GET, "Items")
@@ -250,53 +255,16 @@ impl JellyfinClient {
            "{}/Videos/{}/stream?static=true",
            self.config.server_url.as_str(),
            item,
            // item,
        );
        Ok(url::Url::parse(&stream_url).expect("Failed to parse stream URL"))
    }
}

// pub trait Item {
//     fn id(&self) -> &str;
//     fn name(&self) -> &str;
//     fn type_(&self) -> jellyfin::BaseItemKind;
//     fn media_type(&self) -> &str;
// }

#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct JellyfinConfig {
    pub username: String,
    pub password: String,
    pub server_url: iref::IriBuf,
    pub device_id: String,
}

impl JellyfinConfig {
    pub fn new(
        username: String,
        password: String,
        server_url: impl AsRef<str>,
        device_id: String,
    ) -> Self {
        JellyfinConfig {
            username,
            password,
            server_url: iref::IriBuf::new(server_url.as_ref().into())
                .expect("Failed to parse server URL"),
            device_id,
        }
    }
}

#[test]
fn test_client_authenticate() {
    let config = JellyfinConfig {
        username: "servius".to_string(),
        password: "nfz6yqr_NZD1nxk!faj".to_string(),
        server_url: iref::IriBuf::new("https://jellyfin.tsuba.darksailor.dev".into()).unwrap(),
        device_id: "testdeviceid".to_string(),
    };
    let mut client = JellyfinClient::new(config);
    let auth_result = tokio_test::block_on(client.authenticate());
    assert!(auth_result.is_ok());
    pub device_name: String,
    pub client_name: String,
    pub version: String,
}
crates/iced-video/Cargo.toml (new file, 29 lines)
@@ -0,0 +1,29 @@
[package]
name = "iced-video"
version = "0.1.0"
edition = "2024"

[dependencies]
bytemuck = "1.24.0"
error-stack = "0.6.0"
futures-lite = "2.6.1"
gst.workspace = true
iced_core = "0.14.0"
iced_futures = "0.14.0"
iced_renderer = { version = "0.14.0", features = ["iced_wgpu"] }
iced_wgpu = { version = "0.14.0" }
thiserror = "2.0.17"
tracing = "0.1.43"
wgpu = { version = "27.0.1", features = ["vulkan"] }

[dev-dependencies]
iced.workspace = true
tracing-subscriber = { version = "0.3.22", features = ["env-filter"] }

[profile.dev]
debug = true
[profile.release]
debug = true

# [patch.crates-io]
# iced_wgpu = { git = "https://github.com/uttarayan21/iced", branch = "0.14" }
crates/iced-video/examples/minimal.rs (new file, 178 lines)
@@ -0,0 +1,178 @@
|
||||
use iced_video::{Video, VideoHandle};
|
||||
|
||||
pub fn main() -> iced::Result {
|
||||
use tracing_subscriber::prelude::*;
|
||||
tracing_subscriber::registry()
|
||||
.with(
|
||||
tracing_subscriber::fmt::layer()
|
||||
.with_thread_ids(true)
|
||||
.with_file(true),
|
||||
)
|
||||
.with(tracing_subscriber::EnvFilter::from_default_env())
|
||||
.init();
|
||||
iced::application(State::new, update, view)
|
||||
.subscription(|state| {
|
||||
// Foo
|
||||
match &state.video {
|
||||
Some(video) => video.subscription_with(state, keyboard_event),
|
||||
None => keyboard_event(state),
|
||||
}
|
||||
})
|
||||
.run()
|
||||
}
|
||||
|
||||
fn keyboard_event(_state: &State) -> iced::Subscription<Message> {
|
||||
use iced::keyboard::{Key, key::Named};
|
||||
iced::keyboard::listen().map(move |event| match event {
|
||||
iced::keyboard::Event::KeyPressed { key, .. } => {
|
||||
let key = key.as_ref();
|
||||
match key {
|
||||
Key::Named(Named::Escape) | Key::Character("q") => Message::Quit,
|
||||
Key::Character("f") => Message::Fullscreen,
|
||||
Key::Named(Named::Space) => Message::Toggle,
|
||||
_ => Message::Noop,
|
||||
}
|
||||
}
|
||||
_ => Message::Noop,
|
||||
})
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct State {
|
||||
video: Option<VideoHandle<Message>>,
|
||||
fullscreen: bool,
|
||||
}
|
||||
|
||||
impl State {
|
||||
pub fn new() -> (Self, iced::Task<Message>) {
|
||||
(
|
||||
Self {
|
||||
video: None,
|
||||
fullscreen: false,
|
||||
},
|
||||
iced::Task::done(Message::Load),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum Message {
|
||||
Play,
|
||||
Pause,
|
||||
Toggle,
|
||||
Noop,
|
||||
Load,
|
||||
Fullscreen,
|
||||
OnLoad(VideoHandle<Message>),
|
||||
OnError(String),
|
||||
NewFrame,
|
||||
Eos,
|
||||
Quit,
|
||||
}
|
||||
|
||||
pub fn update(state: &mut State, message: Message) -> iced::Task<Message> {
|
||||
match message {
|
||||
Message::NewFrame => {
|
||||
iced::Task::none()
|
||||
}
|
||||
Message::Eos => {
|
||||
iced::Task::done(Message::Pause)
|
||||
}
|
||||
Message::Load => {
|
||||
iced::Task::perform(
|
||||
VideoHandle::load(
|
||||
"https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c",
|
||||
),
|
||||
|result| match result {
|
||||
Ok(video) => Message::OnLoad(video),
|
||||
Err(err) => Message::OnError(format!("Error loading video: {:?}", err)),
|
||||
},
|
||||
).chain(iced::Task::done(Message::Play))
|
||||
}
|
||||
Message::OnError(err) => {
|
||||
eprintln!("Error: {}", err);
|
||||
iced::Task::none()
|
||||
}
|
||||
Message::OnLoad(video) => {
|
||||
state.video = Some(video.on_new_frame(Message::NewFrame).on_end_of_stream(Message::Eos));
|
||||
iced::Task::none()
|
||||
}
|
||||
Message::Fullscreen => {
|
||||
state.fullscreen = !state.fullscreen;
|
||||
let fullscreen = state.fullscreen;
|
||||
let mode = if fullscreen {
|
||||
iced::window::Mode::Fullscreen
|
||||
} else {
|
||||
iced::window::Mode::Windowed
|
||||
};
|
||||
iced::window::oldest().and_then(move |id| iced::window::set_mode::<Message>(id, mode))
|
||||
}
|
||||
Message::Play => {
|
||||
state
|
||||
.video
|
||||
.as_ref()
|
||||
.unwrap()
|
||||
.source()
|
||||
.play()
|
||||
.expect("Failed to play video");
|
||||
iced::Task::none()
|
||||
}
|
||||
Message::Pause => {
|
||||
state
|
||||
.video
|
||||
.as_ref()
|
||||
.unwrap()
|
||||
.source()
|
||||
.pause()
|
||||
.expect("Failed to pause video");
|
||||
iced::Task::none()
|
||||
}
|
||||
Message::Toggle => {
|
||||
state
|
||||
.video
|
||||
.as_ref()
|
||||
.unwrap()
|
||||
.source()
|
||||
.toggle()
|
||||
.expect("Failed to stop video");
|
||||
iced::Task::none()
|
||||
}
|
||||
Message::Quit => {
|
||||
state
|
||||
.video
|
||||
.as_ref()
|
||||
.unwrap()
|
||||
.source()
|
||||
.stop()
|
||||
.expect("Failed to stop video");
|
||||
std::process::exit(0);
|
||||
}
|
||||
Message::Noop => iced::Task::none(),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn view<'a>(state: &'a State) -> iced::Element<'a, Message> {
|
||||
if let None = &state.video {
|
||||
return iced::widget::Column::new()
|
||||
.push(iced::widget::Text::new("Press any key to load video"))
|
||||
.align_x(iced::Alignment::Center)
|
||||
.into();
|
||||
}
|
||||
let video_widget = Video::new(&state.video.as_ref().unwrap())
|
||||
.width(iced::Length::Fill)
|
||||
.height(iced::Length::Fill)
|
||||
.content_fit(iced::ContentFit::Contain);
|
||||
|
||||
iced::widget::Column::new()
|
||||
.push(video_widget)
|
||||
.push(
|
||||
iced::widget::Row::new()
|
||||
.push(iced::widget::Button::new("Play").on_press(Message::Play))
|
||||
.push(iced::widget::Button::new("Pause").on_press(Message::Pause))
|
||||
.spacing(5)
|
||||
.padding(10)
|
||||
.align_y(iced::Alignment::Center),
|
||||
)
|
||||
.align_x(iced::Alignment::Center)
|
||||
.into()
|
||||
}
|
||||
crates/iced-video/justfile (new file, 8 lines)
@@ -0,0 +1,8 @@
info:
    RUST_LOG=info,wgpu_core=warn,wgpu_hal=warn cargo run --release --example minimal
    # GST_DEBUG=5 RUST_LOG="" cargo run --release --example minimal
flame:
    cargo flamegraph run --release --example minimal
heaptrack:
    cargo build --release --example minimal
    RUST_LOG="info,wgpu_hal=info" heaptrack $CARGO_TARGET_DIR/release/examples/minimal
crates/iced-video/src/id.rs (new file, 55 lines)
@@ -0,0 +1,55 @@
use std::borrow;
use std::sync::atomic::{self, AtomicUsize};

static NEXT_ID: AtomicUsize = AtomicUsize::new(0);

/// The identifier of a generic widget.
#[derive(Debug, Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
pub struct Id(Internal);

impl Id {
    /// Creates a new [`Id`] from a static `str`.
    pub const fn new(id: &'static str) -> Self {
        Self(Internal::Custom(borrow::Cow::Borrowed(id)))
    }

    /// Creates a unique [`Id`].
    ///
    /// This function produces a different [`Id`] every time it is called.
    pub fn unique() -> Self {
        let id = NEXT_ID.fetch_add(1, atomic::Ordering::Relaxed);

        Self(Internal::Unique(id))
    }
}

impl From<&'static str> for Id {
    fn from(value: &'static str) -> Self {
        Self::new(value)
    }
}

impl From<String> for Id {
    fn from(value: String) -> Self {
        Self(Internal::Custom(borrow::Cow::Owned(value)))
    }
}

#[derive(Debug, Clone, PartialEq, Eq, Hash, PartialOrd, Ord)]
enum Internal {
    Unique(usize),
    Custom(borrow::Cow<'static, str>),
}

#[cfg(test)]
mod tests {
    use super::Id;

    #[test]
    fn unique_generates_different_ids() {
        let a = Id::unique();
        let b = Id::unique();

        assert_ne!(a, b);
    }
}
crates/iced-video/src/lib.rs (new file, 164 lines)
@@ -0,0 +1,164 @@
|
||||
pub mod id;
|
||||
pub mod primitive;
|
||||
pub mod source;
|
||||
pub mod widget;
|
||||
pub use widget::Video;
|
||||
|
||||
use error_stack::{Report, ResultExt};
|
||||
|
||||
use gst::plugins::app::AppSink;
|
||||
use gst::plugins::playback::Playbin3;
|
||||
use gst::plugins::videoconvertscale::VideoConvert;
|
||||
|
||||
#[derive(Debug, thiserror::Error)]
|
||||
#[error("Iced Video Error")]
|
||||
pub struct Error;
|
||||
pub type Result<T, E = Report<Error>> = core::result::Result<T, E>;
|
||||
|
||||
use std::sync::{Arc, Mutex, atomic::AtomicBool};
|
||||
|
||||
mod seal {
|
||||
pub trait Sealed {}
|
||||
impl Sealed for super::Unknown {}
|
||||
impl Sealed for super::Ready {}
|
||||
}
|
||||
|
||||
pub trait State: seal::Sealed {
|
||||
fn is_ready() -> bool {
|
||||
false
|
||||
}
|
||||
}
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Unknown;
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Ready;
|
||||
impl State for Unknown {}
|
||||
impl State for Ready {
|
||||
fn is_ready() -> bool {
|
||||
true
|
||||
}
|
||||
}
|
||||
|
||||
/// This is the video handle that is used to control the video playback.
|
||||
/// This should be keps in the application state.
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct VideoHandle<Message, S: State = Unknown> {
|
||||
id: id::Id,
|
||||
pub source: source::VideoSource,
|
||||
frame_ready: Arc<AtomicBool>,
|
||||
on_new_frame: Option<Box<Message>>,
|
||||
on_end_of_stream: Option<Box<Message>>,
|
||||
on_about_to_finish: Option<Box<Message>>,
|
||||
__marker: core::marker::PhantomData<S>,
|
||||
}
|
||||
|
||||
impl<Message: Send + Sync + Clone> VideoHandle<Message, Unknown> {
|
||||
pub fn new(url: impl AsRef<str>) -> Result<Self> {
|
||||
let source = source::VideoSource::new(url)?;
|
||||
let frame_ready = Arc::clone(&source.ready);
|
||||
Ok(Self {
|
||||
id: id::Id::unique(),
|
||||
source: source,
|
||||
on_new_frame: None,
|
||||
on_end_of_stream: None,
|
||||
on_about_to_finish: None,
|
||||
frame_ready,
|
||||
__marker: core::marker::PhantomData,
|
||||
})
|
||||
}
|
||||
|
||||
/// Creates a new video handle and waits for the metadata to be loaded.
|
||||
pub async fn load(url: impl AsRef<str>) -> Result<VideoHandle<Message, Ready>> {
|
||||
let handle = VideoHandle::new(url)?;
|
||||
handle.wait().await
|
||||
}
|
||||
}
|
||||
impl<Message: Send + Sync + Clone, S: State> VideoHandle<Message, S> {
|
||||
pub fn id(&self) -> &id::Id {
|
||||
&self.id
|
||||
}
|
||||
|
||||
pub fn source(&self) -> &source::VideoSource {
|
||||
&self.source
|
||||
}
|
||||
|
||||
pub async fn wait(self) -> Result<VideoHandle<Message, Ready>> {
|
||||
self.source.wait().await?;
|
||||
Ok(self.state::<Ready>())
|
||||
}
|
||||
|
||||
fn state<S2: State>(self) -> VideoHandle<Message, S2> {
|
||||
VideoHandle {
|
||||
id: self.id,
|
||||
source: self.source,
|
||||
on_new_frame: self.on_new_frame,
|
||||
on_end_of_stream: self.on_end_of_stream,
|
||||
on_about_to_finish: self.on_about_to_finish,
|
||||
frame_ready: self.frame_ready,
|
||||
__marker: core::marker::PhantomData,
|
||||
}
|
||||
}
|
||||
|
||||
// pub fn subscription(&self) -> iced_futures::subscription::Subscription<Message> {
|
||||
// let sub = widget::VideoSubscription {
|
||||
// id: self.id.clone(),
|
||||
// on_end_of_stream: self.on_end_of_stream.clone(),
|
||||
// on_new_frame: self.on_new_frame.clone(),
|
||||
// on_about_to_finish: self.on_about_to_finish.clone(),
|
||||
// bus: self.source.bus.clone(),
|
||||
// };
|
||||
// iced_futures::subscription::from_recipe(sub)
|
||||
// }
|
||||
//
|
||||
// pub fn subscription_with<State>(
|
||||
// &self,
|
||||
// state: &State,
|
||||
// f: impl FnOnce(&State) -> iced_futures::subscription::Subscription<Message> + 'static,
|
||||
// ) -> iced_futures::subscription::Subscription<Message>
|
||||
// where
|
||||
// State: Send + Sync + 'static,
|
||||
// {
|
||||
// let sub = self.subscription();
|
||||
// iced_futures::subscription::Subscription::batch([sub, f(state)])
|
||||
// }
|
||||
|
||||
pub fn on_new_frame(self, message: Message) -> Self {
|
||||
Self {
|
||||
on_new_frame: Some(Box::new(message)),
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
pub fn on_end_of_stream(self, message: Message) -> Self {
|
||||
Self {
|
||||
on_end_of_stream: Some(Box::new(message)),
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
pub fn on_about_to_finish(self, message: Message) -> Self {
|
||||
Self {
|
||||
on_about_to_finish: Some(Box::new(message)),
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
pub fn play(&self) {
|
||||
self.source.play();
|
||||
}
|
||||
pub fn pause(&self) {
|
||||
self.source.pause();
|
||||
}
|
||||
pub fn stop(&self) {
|
||||
self.source.stop();
|
||||
}
|
||||
}
|
||||
|
||||
impl<Message: Send + Sync + Clone> VideoHandle<Message, Ready> {
|
||||
pub fn format(&self) -> Result<gst::VideoFormat> {
|
||||
self.source
|
||||
.format()
|
||||
.change_context(Error)
|
||||
.attach("Failed to get video format")
|
||||
}
|
||||
}
|
||||
crates/iced-video/src/primitive.rs (new file, 574 lines)
@@ -0,0 +1,574 @@
|
||||
use crate::id;
|
||||
use gst::videoconvertscale::VideoFormat;
|
||||
use iced_wgpu::primitive::Pipeline;
|
||||
use iced_wgpu::wgpu;
|
||||
use std::collections::BTreeMap;
|
||||
use std::sync::{Arc, Mutex, atomic::AtomicBool};
|
||||
|
||||
#[derive(Clone, Copy, Debug, bytemuck::Zeroable, bytemuck::Pod)]
|
||||
#[repr(transparent)]
|
||||
pub struct ConversionMatrix {
|
||||
matrix: [Vec3f; 3],
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug, bytemuck::Zeroable, bytemuck::Pod)]
|
||||
#[repr(C, align(16))]
|
||||
pub struct Vec3f {
|
||||
data: [f32; 3],
|
||||
__padding: u32,
|
||||
}
|
||||
|
||||
impl From<[f32; 3]> for Vec3f {
|
||||
fn from(value: [f32; 3]) -> Self {
|
||||
Vec3f {
|
||||
data: [value[0], value[1], value[2]],
|
||||
__padding: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Vec3f {
|
||||
pub fn new(x: f32, y: f32, z: f32) -> Self {
|
||||
Vec3f {
|
||||
data: [x, y, z],
|
||||
__padding: 0,
|
||||
}
|
||||
}
|
||||
pub const fn from(data: [f32; 3]) -> Self {
|
||||
Vec3f {
|
||||
data: [data[0], data[1], data[2]],
|
||||
__padding: 0,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// impl ConversionMatrix {
|
||||
// pub fn desc() -> wgpu::VertexBufferLayout<'static> {
|
||||
// wgpu::VertexBufferLayout {
|
||||
// array_stride: core::mem::size_of::<ConversionMatrix>() as wgpu::BufferAddress,
|
||||
// step_mode: wgpu::VertexStepMode::Vertex,
|
||||
// attributes: &[
|
||||
// wgpu::VertexAttribute {
|
||||
// offset: 0,
|
||||
// shader_location: 0,
|
||||
// format: wgpu::VertexFormat::Float32x4,
|
||||
// },
|
||||
// wgpu::VertexAttribute {
|
||||
// offset: 16,
|
||||
// shader_location: 1,
|
||||
// format: wgpu::VertexFormat::Float32x4,
|
||||
// },
|
||||
// wgpu::VertexAttribute {
|
||||
// offset: 32,
|
||||
// shader_location: 2,
|
||||
// format: wgpu::VertexFormat::Float32x4,
|
||||
// },
|
||||
// wgpu::VertexAttribute {
|
||||
// offset: 48,
|
||||
// shader_location: 3,
|
||||
// format: wgpu::VertexFormat::Float32x4,
|
||||
// },
|
||||
// ],
|
||||
// }
|
||||
// }
|
||||
// }
|
||||
|
||||
pub const BT2020_TO_RGB: ConversionMatrix = ConversionMatrix {
|
||||
matrix: [
|
||||
Vec3f::from([1.0, 0.0, 1.4746]),
|
||||
Vec3f::from([1.0, -0.16455, -0.5714]),
|
||||
Vec3f::from([1.0, 1.8814, 0.0]),
|
||||
],
|
||||
};
|
||||
|
||||
pub const BT709_TO_RGB: ConversionMatrix = ConversionMatrix {
|
||||
matrix: [
|
||||
Vec3f::from([1.0, 0.0, 1.5748]),
|
||||
Vec3f::from([1.0, -0.1873, -0.4681]),
|
||||
Vec3f::from([1.0, 1.8556, 0.0]),
|
||||
],
|
||||
};
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct VideoFrame {
|
||||
pub id: id::Id,
|
||||
pub size: wgpu::Extent3d,
|
||||
pub ready: Arc<AtomicBool>,
|
||||
pub frame: Arc<Mutex<gst::Sample>>,
|
||||
pub format: VideoFormat,
|
||||
}
|
||||
|
||||
impl iced_wgpu::Primitive for VideoFrame {
|
||||
type Pipeline = VideoPipeline;
|
||||
|
||||
fn prepare(
|
||||
&self,
|
||||
pipeline: &mut Self::Pipeline,
|
||||
device: &wgpu::Device,
|
||||
queue: &wgpu::Queue,
|
||||
bounds: &iced_wgpu::core::Rectangle,
|
||||
viewport: &iced_wgpu::graphics::Viewport,
|
||||
) {
|
||||
let video = pipeline.videos.entry(self.id.clone()).or_insert_with(|| {
|
||||
let texture = VideoTexture::new(
|
||||
"iced-video-texture",
|
||||
self.size,
|
||||
device,
|
||||
pipeline.format,
|
||||
self.format,
|
||||
);
|
||||
|
||||
let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
|
||||
label: Some("iced-video-texture-bind-group"),
|
||||
layout: &pipeline.bind_group_layout,
|
||||
entries: &[
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 0,
|
||||
resource: wgpu::BindingResource::TextureView(&texture.y_texture()),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 1,
|
||||
resource: wgpu::BindingResource::TextureView(&texture.uv_texture()),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 2,
|
||||
resource: wgpu::BindingResource::Sampler(&pipeline.sampler),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 3,
|
||||
resource: wgpu::BindingResource::Buffer(
|
||||
texture
|
||||
.conversion_matrix_buffer()
|
||||
.as_entire_buffer_binding(),
|
||||
),
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
let matrix = if matches!(self.format, VideoFormat::P01010le | VideoFormat::P016Le) {
|
||||
BT2020_TO_RGB
|
||||
} else {
|
||||
BT709_TO_RGB
|
||||
};
|
||||
|
||||
texture.write_conversion_matrix(&matrix, queue);
|
||||
|
||||
VideoFrameData {
|
||||
id: self.id.clone(),
|
||||
texture,
|
||||
bind_group,
|
||||
conversion_matrix: matrix,
|
||||
ready: Arc::clone(&self.ready),
|
||||
}
|
||||
});
|
||||
if self.size != video.texture.size() {
|
||||
let new_texture = video
|
||||
.texture
|
||||
.resize("iced-video-texture-resized", self.size, device);
|
||||
|
||||
new_texture.write_conversion_matrix(&video.conversion_matrix, queue);
|
||||
|
||||
let new_bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
|
||||
label: Some("iced-video-texture-bind-group"),
|
||||
layout: &pipeline.bind_group_layout,
|
||||
entries: &[
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 0,
|
||||
resource: wgpu::BindingResource::TextureView(&new_texture.y_texture()),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 1,
|
||||
resource: wgpu::BindingResource::TextureView(&new_texture.uv_texture()),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 2,
|
||||
resource: wgpu::BindingResource::Sampler(&pipeline.sampler),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 3,
|
||||
resource: wgpu::BindingResource::Buffer(
|
||||
video
|
||||
.texture
|
||||
.conversion_matrix_buffer()
|
||||
.as_entire_buffer_binding(),
|
||||
),
|
||||
},
|
||||
],
|
||||
});
|
||||
video.texture = new_texture;
|
||||
video.bind_group = new_bind_group;
|
||||
}
|
||||
if video.ready.load(std::sync::atomic::Ordering::SeqCst) {
|
||||
let frame = self.frame.lock().expect("BUG: Mutex poisoned");
|
||||
let buffer = frame
|
||||
.buffer()
|
||||
.expect("BUG: Failed to get frame data from gst::Sample");
|
||||
|
||||
let data = buffer
|
||||
.map_readable()
|
||||
.expect("BUG: Failed to map gst::Buffer readable");
|
||||
|
||||
video.texture.write_texture(&data, queue);
|
||||
|
||||
drop(data);
|
||||
video
|
||||
.ready
|
||||
.store(false, std::sync::atomic::Ordering::SeqCst);
|
||||
}
|
||||
}
|
||||
|
||||
fn render(
|
||||
&self,
|
||||
pipeline: &Self::Pipeline,
|
||||
encoder: &mut wgpu::CommandEncoder,
|
||||
target: &wgpu::TextureView,
|
||||
bounds: &iced_wgpu::core::Rectangle<u32>,
|
||||
) {
|
||||
let Some(video) = pipeline.videos.get(&self.id) else {
|
||||
return;
|
||||
};
|
||||
|
||||
let mut render_pass = encoder.begin_render_pass(&wgpu::RenderPassDescriptor {
|
||||
label: Some("iced-video-render-pass"),
|
||||
color_attachments: &[Some(wgpu::RenderPassColorAttachment {
|
||||
view: target,
|
||||
resolve_target: None,
|
||||
ops: wgpu::Operations {
|
||||
load: wgpu::LoadOp::Load,
|
||||
store: wgpu::StoreOp::Store,
|
||||
},
|
||||
depth_slice: None,
|
||||
})],
|
||||
depth_stencil_attachment: None,
|
||||
timestamp_writes: None,
|
||||
occlusion_query_set: None,
|
||||
});
|
||||
|
||||
render_pass.set_pipeline(&pipeline.pipeline);
|
||||
render_pass.set_bind_group(0, &video.bind_group, &[]);
|
||||
render_pass.set_scissor_rect(
|
||||
bounds.x as _,
|
||||
bounds.y as _,
|
||||
bounds.width as _,
|
||||
bounds.height as _,
|
||||
);
|
||||
render_pass.draw(0..3, 0..1);
|
||||
// self.ready
|
||||
// .store(false, std::sync::atomic::Ordering::Relaxed);
|
||||
}
|
||||
}
|
||||
|
||||
/// NV12 or P010 are only supported in DX12 and Vulkan backends.
|
||||
/// While we can use vulkan with moltenvk on macos, I'd much rather use metal directly
|
||||
/// Right now only supports interleaved UV formats.
|
||||
/// For planar formats we would need 3 textures.
|
||||
/// Also NV12 and P010 textures are not COPY_DST capable
|
||||
/// This assumes 4:2:0 chroma subsampling (for now).
|
||||
/// So for 4 Y samples there is 1 U and 1 V sample.
|
||||
/// This means that the UV texture is half the width and half the height of the Y texture.
|
||||
#[derive(Debug)]
|
||||
pub struct VideoTexture {
|
||||
y: wgpu::Texture,
|
||||
uv: wgpu::Texture,
|
||||
size: wgpu::Extent3d,
|
||||
video_format: VideoFormat,
|
||||
surface_format: wgpu::TextureFormat,
|
||||
conversion_matrix_buffer: wgpu::Buffer,
|
||||
}
|
||||
|
||||
impl VideoTexture {
|
||||
pub fn size(&self) -> wgpu::Extent3d {
|
||||
self.size
|
||||
}
|
||||
|
||||
pub fn new(
|
||||
label: &str,
|
||||
size: wgpu::Extent3d,
|
||||
device: &wgpu::Device,
|
||||
surface_format: wgpu::TextureFormat,
|
||||
video_format: VideoFormat,
|
||||
) -> Self {
|
||||
let surface_hdr = surface_format.is_wide();
|
||||
let video_hdr = matches!(video_format, VideoFormat::P01010le | VideoFormat::P016Le);
|
||||
|
||||
if surface_hdr && !video_hdr {
|
||||
tracing::warn!("Surface texture is HDR but video format is SDR");
|
||||
} else if !surface_hdr && video_hdr {
|
||||
tracing::warn!("Video format is HDR but surface does not support HDR");
|
||||
}
|
||||
|
||||
let y_texture = device.create_texture(&wgpu::TextureDescriptor {
|
||||
label: Some(&format!("{}-y", label)),
|
||||
size: wgpu::Extent3d {
|
||||
width: size.width,
|
||||
height: size.height,
|
||||
depth_or_array_layers: 1,
|
||||
},
|
||||
mip_level_count: 1,
|
||||
sample_count: 1,
|
||||
dimension: wgpu::TextureDimension::D2,
|
||||
format: wgpu::TextureFormat::R16Unorm,
|
||||
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
|
||||
view_formats: &[],
|
||||
});
|
||||
let uv_texture = device.create_texture(&wgpu::TextureDescriptor {
|
||||
label: Some(&format!("{}-uv", label)),
|
||||
size: wgpu::Extent3d {
|
||||
width: size.width / 2,
|
||||
height: size.height / 2,
|
||||
depth_or_array_layers: 1,
|
||||
},
|
||||
mip_level_count: 1,
|
||||
sample_count: 1,
|
||||
dimension: wgpu::TextureDimension::D2,
|
||||
format: wgpu::TextureFormat::Rg16Unorm,
|
||||
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
|
||||
view_formats: &[],
|
||||
});
|
||||
|
||||
let buffer = device.create_buffer(&wgpu::BufferDescriptor {
|
||||
label: Some("iced-video-conversion-matrix-buffer"),
|
||||
usage: wgpu::BufferUsages::UNIFORM | wgpu::BufferUsages::COPY_DST,
|
||||
size: core::mem::size_of::<ConversionMatrix>() as wgpu::BufferAddress,
|
||||
mapped_at_creation: false,
|
||||
});
|
||||
|
||||
VideoTexture {
|
||||
y: y_texture,
|
||||
uv: uv_texture,
|
||||
size,
|
||||
surface_format,
|
||||
video_format,
|
||||
conversion_matrix_buffer: buffer,
|
||||
}
|
||||
}
|
||||
|
||||
// This return the surface texture format, not the video pixel format
|
||||
pub fn format(&self) -> wgpu::TextureFormat {
|
||||
self.surface_format
|
||||
}
|
||||
|
||||
pub fn y_texture(&self) -> wgpu::TextureView {
|
||||
self.y.create_view(&wgpu::TextureViewDescriptor::default())
|
||||
}
|
||||
|
||||
pub fn uv_texture(&self) -> wgpu::TextureView {
|
||||
self.uv.create_view(&wgpu::TextureViewDescriptor::default())
|
||||
}
|
||||
|
||||
pub fn resize(&self, name: &str, new_size: wgpu::Extent3d, device: &wgpu::Device) -> Self {
|
||||
VideoTexture::new(name, new_size, device, self.format(), self.pixel_format())
|
||||
}
|
||||
|
||||
pub fn pixel_format(&self) -> VideoFormat {
|
||||
self.video_format
|
||||
}
|
||||
|
||||
/// This assumes that the data is laid out correctly for the texture format.
|
||||
pub fn write_texture(&self, data: &[u8], queue: &wgpu::Queue) {
|
||||
let Self { y, uv, .. } = self;
|
||||
let y_size = y.size();
|
||||
let uv_size = uv.size();
|
||||
|
||||
let y_data_size = (y_size.width * y_size.height * 2) as usize;
|
||||
let uv_data_size = (y_data_size / 2) as usize; // UV is interleaved
|
||||
|
||||
let y_data = &data[0..y_data_size];
|
||||
let uv_data = &data[y_data_size..y_data_size + uv_data_size];
|
||||
|
||||
queue.write_texture(
|
||||
wgpu::TexelCopyTextureInfo {
|
||||
texture: y,
|
||||
mip_level: 0,
|
||||
origin: wgpu::Origin3d::ZERO,
|
||||
aspect: wgpu::TextureAspect::All,
|
||||
},
|
||||
y_data,
|
||||
wgpu::TexelCopyBufferLayout {
|
||||
offset: 0,
|
||||
bytes_per_row: Some(y_size.width * 2),
|
||||
rows_per_image: None,
|
||||
},
|
||||
y_size,
|
||||
);
|
||||
|
||||
queue.write_texture(
|
||||
wgpu::TexelCopyTextureInfo {
|
||||
texture: uv,
|
||||
mip_level: 0,
|
||||
origin: wgpu::Origin3d::ZERO,
|
||||
aspect: wgpu::TextureAspect::All,
|
||||
},
|
||||
uv_data,
|
||||
wgpu::TexelCopyBufferLayout {
|
||||
offset: 0,
|
||||
bytes_per_row: Some(uv_size.width * 4),
|
||||
rows_per_image: None,
|
||||
},
|
||||
uv_size,
|
||||
);
|
||||
}
|
||||
|
||||
pub fn write_conversion_matrix(&self, matrix: &ConversionMatrix, queue: &wgpu::Queue) {
|
||||
queue.write_buffer(
|
||||
&self.conversion_matrix_buffer,
|
||||
0,
|
||||
bytemuck::bytes_of(matrix),
|
||||
);
|
||||
}
|
||||
|
||||
pub fn conversion_matrix_buffer(&self) -> &wgpu::Buffer {
|
||||
&self.conversion_matrix_buffer
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct VideoFrameData {
|
||||
id: id::Id,
|
||||
texture: VideoTexture,
|
||||
bind_group: wgpu::BindGroup,
|
||||
conversion_matrix: ConversionMatrix,
|
||||
ready: Arc<AtomicBool>,
|
||||
}
|
||||
|
||||
impl VideoFrameData {
|
||||
pub fn is_hdr(&self) -> bool {
|
||||
self.texture.format().is_wide()
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct VideoPipeline {
|
||||
pipeline: wgpu::RenderPipeline,
|
||||
bind_group_layout: wgpu::BindGroupLayout,
|
||||
sampler: wgpu::Sampler,
|
||||
format: wgpu::TextureFormat,
|
||||
videos: BTreeMap<id::Id, VideoFrameData>,
|
||||
}
|
||||
|
||||
pub trait WideTextureFormatExt {
|
||||
fn is_wide(&self) -> bool;
|
||||
}
|
||||
|
||||
impl WideTextureFormatExt for wgpu::TextureFormat {
|
||||
fn is_wide(&self) -> bool {
|
||||
matches!(
|
||||
self,
|
||||
wgpu::TextureFormat::Rgba16Float
|
||||
| wgpu::TextureFormat::Rgba32Float
|
||||
| wgpu::TextureFormat::Rgb10a2Unorm
|
||||
| wgpu::TextureFormat::Rgb10a2Uint
|
||||
| wgpu::TextureFormat::P010
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
impl Pipeline for VideoPipeline {
|
||||
fn new(device: &wgpu::Device, queue: &wgpu::Queue, format: wgpu::TextureFormat) -> Self
|
||||
where
|
||||
Self: Sized,
|
||||
{
|
||||
if format.is_wide() {
|
||||
tracing::info!("HDR texture format detected: {:?}", format);
|
||||
}
|
||||
|
||||
let bind_group_layout = device.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
|
||||
label: Some("iced-video-texture-bind-group-layout"),
|
||||
entries: &[
|
||||
// y
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 0,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Texture {
|
||||
multisampled: false,
|
||||
view_dimension: wgpu::TextureViewDimension::D2,
|
||||
sample_type: wgpu::TextureSampleType::Float { filterable: true },
|
||||
},
|
||||
count: None,
|
||||
},
|
||||
// uv
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 1,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Texture {
|
||||
multisampled: false,
|
||||
view_dimension: wgpu::TextureViewDimension::D2,
|
||||
sample_type: wgpu::TextureSampleType::Float { filterable: true },
|
||||
},
|
||||
count: None,
|
||||
},
|
||||
// sampler
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 2,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
|
||||
count: None,
|
||||
},
|
||||
// conversion matrix
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 3,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Buffer {
|
||||
ty: wgpu::BufferBindingType::Uniform,
|
||||
has_dynamic_offset: false,
|
||||
min_binding_size: None,
|
||||
},
|
||||
count: None,
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
let shader_passthrough =
|
||||
device.create_shader_module(wgpu::include_wgsl!("shaders/passthrough.wgsl"));
|
||||
let render_pipeline_layout =
|
||||
device.create_pipeline_layout(&wgpu::PipelineLayoutDescriptor {
|
||||
label: Some("iced-video-render-pipeline-layout"),
|
||||
bind_group_layouts: &[&bind_group_layout],
|
||||
push_constant_ranges: &[],
|
||||
});
|
||||
let pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
|
||||
label: Some("iced-video-render-pipeline"),
|
||||
layout: Some(&render_pipeline_layout),
|
||||
vertex: wgpu::VertexState {
|
||||
module: &shader_passthrough,
|
||||
entry_point: Some("vs_main"),
|
||||
buffers: &[],
|
||||
compilation_options: wgpu::PipelineCompilationOptions::default(),
|
||||
},
|
||||
fragment: Some(wgpu::FragmentState {
|
||||
module: &shader_passthrough,
|
||||
entry_point: Some("fs_main"),
|
||||
targets: &[Some(wgpu::ColorTargetState {
|
||||
format,
|
||||
blend: Some(wgpu::BlendState::REPLACE),
|
||||
write_mask: wgpu::ColorWrites::ALL,
|
||||
})],
|
||||
compilation_options: wgpu::PipelineCompilationOptions::default(),
|
||||
}),
|
||||
primitive: wgpu::PrimitiveState::default(),
|
||||
depth_stencil: None,
|
||||
multisample: wgpu::MultisampleState::default(),
|
||||
multiview: None,
|
||||
cache: None,
|
||||
});
|
||||
|
||||
let sampler = device.create_sampler(&wgpu::SamplerDescriptor {
|
||||
label: Some("iced-video-sampler"),
|
||||
address_mode_u: wgpu::AddressMode::ClampToEdge,
|
||||
address_mode_v: wgpu::AddressMode::ClampToEdge,
|
||||
address_mode_w: wgpu::AddressMode::ClampToEdge,
|
||||
mag_filter: wgpu::FilterMode::Linear,
|
||||
min_filter: wgpu::FilterMode::Linear,
|
||||
mipmap_filter: wgpu::FilterMode::Nearest,
|
||||
..Default::default()
|
||||
});
|
||||
|
||||
Self {
|
||||
pipeline,
|
||||
bind_group_layout,
|
||||
sampler,
|
||||
format,
|
||||
videos: BTreeMap::new(),
|
||||
}
|
||||
}
|
||||
}
|
||||
crates/iced-video/src/shaders/passthrough.wgsl (new file, 30 lines)
@@ -0,0 +1,30 @@
struct VertexOutput {
    @builtin(position) clip_position: vec4<f32>,
    @location(0) tex_coords: vec2<f32>,
}

@vertex
fn vs_main(
    @builtin(vertex_index) in_vertex_index: u32,
) -> VertexOutput {
    var out: VertexOutput;
    let uv = vec2<f32>(f32((in_vertex_index << 1u) & 2u), f32(in_vertex_index & 2u));
    out.clip_position = vec4<f32>(uv * 2.0 - 1.0, 0.0, 1.0);
    out.clip_position.y = -out.clip_position.y;
    out.tex_coords = uv;
    return out;
}


@group(0) @binding(0) var y_texture: texture_2d<f32>;
@group(0) @binding(1) var uv_texture: texture_2d<f32>;
@group(0) @binding(2) var texture_sampler: sampler;
@group(0) @binding(3) var<uniform> rgb_primaries: mat3x3<f32>;

@fragment
fn fs_main(input: VertexOutput) -> @location(0) vec4<f32> {
    let y = textureSample(y_texture, texture_sampler, input.tex_coords).r;
    let uv = textureSample(uv_texture, texture_sampler, input.tex_coords).rg;
    let yuv = vec3f(y, uv.x - 0.5, uv.y - 0.5);
    return vec4f(yuv * rgb_primaries, 1.0);
}
crates/iced-video/src/source.rs (new file, 173 lines)
@@ -0,0 +1,173 @@
use crate::{Error, Result, ResultExt};
use gst::{
    Bus, Gst, MessageType, MessageView, Sink, Source,
    app::AppSink,
    caps::{Caps, CapsType},
    element::ElementExt,
    pipeline::PipelineExt,
    playback::{PlayFlags, Playbin3},
    videoconvertscale::VideoConvert,
};
use std::sync::{Arc, Mutex, atomic::AtomicBool};

#[derive(Debug, Clone)]
pub struct VideoSource {
    pub(crate) playbin: Playbin3,
    pub(crate) appsink: AppSink,
    pub(crate) bus: Bus,
    pub(crate) ready: Arc<AtomicBool>,
    pub(crate) frame: Arc<Mutex<gst::Sample>>,
    pub(crate) size: std::sync::OnceLock<(i32, i32)>,
}

impl VideoSource {
    /// Creates a new video source from the given URL.
    /// Since this doesn't have to parse the pipeline manually, we aren't sanitizing the URL for
    /// now.
    pub fn new(url: impl AsRef<str>) -> Result<Self> {
        Gst::new();
        let mut appsink = AppSink::new("iced-video-sink").change_context(Error)?;
        appsink
            .drop(true)
            .sync(true)
            // .async_(true)
            .emit_signals(true);
        let playbin = Playbin3::new("iced-video")
            .change_context(Error)?
            .with_uri(url.as_ref())
            .with_buffer_duration(core::time::Duration::from_secs(2))
            .with_buffer_size(4096 * 4096 * 4 * 3)
            .with_ring_buffer_max_size(4096 * 4096 * 4 * 3)
            .with_flags(Playbin3::default_flags() | PlayFlags::DOWNLOAD)
            .with_video_sink(&appsink);
        let bus = playbin.bus().change_context(Error)?;
        playbin.pause().change_context(Error)?;
        let ready = Arc::new(AtomicBool::new(false));
        let frame = Arc::new(Mutex::new(gst::Sample::new()));

        appsink.on_new_sample({
            let ready = Arc::clone(&ready);
            let frame = Arc::clone(&frame);
            move |appsink| {
                let Ok(sample) = appsink.pull_sample() else {
                    tracing::error!("Failed to pull video sample from appsink despite being notified of new frame");
                    return Ok(());
                };
                {
                    let mut guard = frame.lock().expect("BUG: Mutex poisoned");
                    *guard = sample;
                    ready.store(true, std::sync::atomic::Ordering::Relaxed);
                }
                Ok(())
            }
        });

        Ok(Self {
            playbin,
            appsink,
            bus,
            ready,
            frame,
            size: std::sync::OnceLock::new(),
        })
    }

    pub async fn wait(&self) -> Result<()> {
        use futures_lite::StreamExt;
        // self.bus_stream()
        //     .for_each(|msg: gst::Message| {
        //         use gst::gstreamer::prelude::*;
        //         match msg.view() {
        //             MessageView::Eos(_) => {
        //                 tracing::info!("Video reached end of stream");
        //             }
        //             MessageView::Error(err) => {
        //                 tracing::error!(
        //                     "Video Error from {:?}: {} ({:?})",
        //                     err.src().map(|s| s.path_string()),
        //                     err.error(),
        //                     err.debug()
        //                 );
        //             }
        //             view => tracing::info!("Video Message: {:#?}", view),
        //         }
        //     })
        //     .await;
        self.playbin
            .wait_for_states(&[gst::State::Paused, gst::State::Playing])
            .await
            .change_context(Error)
            .attach("Failed to wait for video initialisation")?;
        Ok(())
    }

    pub fn format(&self) -> Result<gst::VideoFormat> {
        let caps = self
            .appsink
            .sink("sink")
            .current_caps()
            .change_context(Error)?;
        let format = caps
            .format()
            .ok_or(Error)
            .attach("Failed to get video caps structure")?;
        Ok(format)
    }

    pub fn bus_stream(&self) -> impl futures_lite::Stream<Item = gst::Message> {
        self.bus.stream()
    }

    pub fn is_playing(&self) -> Result<bool> {
        let state = self.playbin.state(None).change_context(Error)?;
        Ok(state == gst::State::Playing)
    }

    pub fn toggle(&self) -> Result<()> {
        if self.is_playing()? {
            self.pause()?;
        } else {
            self.play()?;
        }
        Ok(())
    }

    pub fn play(&self) -> Result<()> {
        self.playbin
            .play()
            .change_context(Error)
            .attach("Failed to play video")
    }

    pub fn pause(&self) -> Result<()> {
        self.playbin
            .pause()
            .change_context(Error)
            .attach("Failed to pause video")
    }

    pub fn stop(&self) -> Result<()> {
        self.playbin
            .stop()
            .change_context(Error)
            .attach("Failed to stop video")
    }

    pub fn size(&self) -> Result<(i32, i32)> {
        if let Some(size) = self.size.get() {
            return Ok(*size);
        }
        let caps = self
            .appsink
            .sink("sink")
            .current_caps()
            .change_context(Error)?;
        let out = caps
            .width()
            .and_then(|width| caps.height().map(|height| (width, height)))
            .ok_or(Error)
            .attach("Failed to get width, height")?;
        let _ = self.size.set(out);
        Ok(out)
    }
}
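Everything in this file is plumbing around GStreamer's playbin3: `new` pre-rolls the pipeline in the paused state with an appsink installed as the video sink, the `on_new_sample` callback stashes the latest `gst::Sample` behind a mutex and flips the `ready` flag, and the remaining methods are thin state-change and caps helpers. A minimal sketch of driving `VideoSource` outside the widget, assuming the import path and stream URL shown here (both placeholders; `futures_lite` is already used by this module):

```rust
// Sketch only: the module path and URL are assumptions, not taken from the diff.
use iced_video::source::VideoSource;

fn main() {
    let source = VideoSource::new("https://example.org/stream.mp4")
        .expect("failed to build the playbin3 pipeline");

    // `new` only pre-rolls to PAUSED; block until the pipeline reports
    // Paused/Playing before asking for negotiated caps.
    futures_lite::future::block_on(source.wait()).expect("pipeline never pre-rolled");

    let (width, height) = source.size().expect("caps carry no size");
    println!("negotiated {width}x{height}, format {:?}", source.format());

    source.play().expect("failed to start playback");
    // ... hand `source.frame` / `source.ready` to the renderer ...
    source.stop().expect("failed to stop playback");
}
```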
258
crates/iced-video/src/widget.rs
Normal file
@@ -0,0 +1,258 @@
use super::*;
use iced::Length;
use iced_core as iced;
use iced_wgpu::primitive::Renderer as PrimitiveRenderer;
use std::marker::PhantomData;

/// The `Video` widget displays the current frame of a video.
/// It is meant to be created in the application's `view` function.
pub struct Video<'a, Message, Theme = iced::Theme, Renderer = iced_wgpu::Renderer>
where
    Renderer: PrimitiveRenderer,
{
    id: id::Id,
    handle: &'a VideoHandle<Message, Ready>,
    video_format: gst::VideoFormat,
    content_fit: iced::ContentFit,
    width: iced::Length,
    height: iced::Length,
    looping: bool,
    __marker: PhantomData<(Renderer, Theme)>,
}

impl<'a, Message, Theme, Renderer> Video<'a, Message, Theme, Renderer>
where
    Renderer: PrimitiveRenderer,
    Message: Clone + Send + Sync,
{
    pub fn new(handle: &'a VideoHandle<Message, Ready>) -> Self {
        Self {
            id: handle.id.clone(),
            handle,
            video_format: handle
                .format()
                .expect("Failed to get video format during widget creation"),
            content_fit: iced::ContentFit::Contain,
            width: Length::Shrink,
            height: Length::Shrink,
            looping: false,
            __marker: PhantomData,
        }
    }
}

impl<'a, Message, Theme, Renderer> Video<'a, Message, Theme, Renderer>
where
    Renderer: PrimitiveRenderer,
{
    pub fn width(mut self, width: Length) -> Self {
        self.width = width;
        self
    }

    pub fn height(mut self, height: Length) -> Self {
        self.height = height;
        self
    }

    pub fn content_fit(mut self, fit: iced::ContentFit) -> Self {
        self.content_fit = fit;
        self
    }

    // pub fn on_end_of_stream(mut self, message: Message) -> Self {
    //     self.on_end_of_stream = Some(message);
    //     self
    // }
    //
    // pub fn on_new_frame(mut self, message: Message) -> Self {
    //     self.on_new_frame = Some(message);
    //     self
    // }

    pub fn looping(mut self, looping: bool) -> Self {
        self.looping = looping;
        self
    }
}

impl<Message, Theme, Renderer> iced::Widget<Message, Theme, Renderer>
    for Video<'_, Message, Theme, Renderer>
where
    Message: Clone + Send + Sync,
    Renderer: PrimitiveRenderer,
{
    fn size(&self) -> iced::Size<Length> {
        iced::Size {
            width: self.width,
            height: self.height,
        }
    }

    // The video player should take max space by default
    fn layout(
        &mut self,
        _tree: &mut iced::widget::Tree,
        _renderer: &Renderer,
        limits: &iced::layout::Limits,
    ) -> iced::layout::Node {
        iced::layout::Node::new(limits.max())
    }

    fn draw(
        &self,
        tree: &iced::widget::Tree,
        renderer: &mut Renderer,
        theme: &Theme,
        style: &iced::renderer::Style,
        layout: iced::Layout<'_>,
        cursor: iced::mouse::Cursor,
        viewport: &iced::Rectangle,
    ) {
        if let Ok((width, height)) = self.handle.source.size() {
            let video_size = iced::Size {
                width: width as f32,
                height: height as f32,
            };
            let bounds = layout.bounds();
            let adjusted_fit = self.content_fit.fit(video_size, bounds.size());
            let scale = iced::Vector::new(
                adjusted_fit.width / video_size.width,
                adjusted_fit.height / video_size.height,
            );
            let final_size = video_size * scale;
            let position = match self.content_fit {
                iced::ContentFit::None => iced::Point::new(
                    bounds.x + (video_size.width - adjusted_fit.width) / 2.0,
                    bounds.y + (video_size.height - adjusted_fit.height) / 2.0,
                ),
                _ => iced::Point::new(
                    bounds.center_x() - final_size.width / 2.0,
                    bounds.center_y() - final_size.height / 2.0,
                ),
            };

            let drawing_bounds = iced::Rectangle::new(position, final_size);

            let render = |renderer: &mut Renderer| {
                renderer.draw_primitive(
                    drawing_bounds,
                    primitive::VideoFrame {
                        id: self.id.clone(),
                        size: iced_wgpu::wgpu::Extent3d {
                            width: width as u32,
                            height: height as u32,
                            depth_or_array_layers: 1,
                        },
                        ready: Arc::clone(&self.handle.frame_ready),
                        frame: Arc::clone(&self.handle.source.frame),
                        format: self.video_format,
                    },
                );
            };

            if adjusted_fit.width > bounds.width || adjusted_fit.height > bounds.height {
                renderer.with_layer(bounds, render);
            } else {
                render(renderer);
            }
        }
    }

    fn update(
        &mut self,
        _tree: &mut iced_core::widget::Tree,
        event: &iced::Event,
        _layout: iced_core::Layout<'_>,
        _cursor: iced_core::mouse::Cursor,
        _renderer: &Renderer,
        _clipboard: &mut dyn iced_core::Clipboard,
        shell: &mut iced_core::Shell<'_, Message>,
        _viewport: &iced::Rectangle,
    ) {
        if let iced::Event::Window(iced::window::Event::RedrawRequested(when)) = event {
            if self
                .handle
                .frame_ready
                .load(std::sync::atomic::Ordering::SeqCst)
            {
                shell.request_redraw();
            } else {
                shell.request_redraw_at(iced::window::RedrawRequest::At(
                    iced_core::time::Instant::now() + core::time::Duration::from_millis(16)
                        - when.elapsed(),
                ));
            }
        }
    }
}

impl<'a, Message, Theme, Renderer> From<Video<'a, Message, Theme, Renderer>>
    for iced::Element<'a, Message, Theme, Renderer>
where
    Message: Send + Sync + 'a + Clone,
    Theme: 'a,
    Renderer: 'a + iced_wgpu::primitive::Renderer,
{
    fn from(video: Video<'a, Message, Theme, Renderer>) -> Self {
        Self::new(video)
    }
}

#[derive(Debug, Clone)]
pub struct VideoSubscription<Message> {
    pub(crate) id: id::Id,
    pub(crate) on_end_of_stream: Option<Box<Message>>,
    pub(crate) on_new_frame: Option<Box<Message>>,
    pub(crate) on_about_to_finish: Option<Box<Message>>,
    // on_subtitle_text: Option<Box<dyn Fn(Option<String>) -> Message>>,
    // on_error: Option<Box<dyn Fn(&glib::Error) -> Message>>,
    pub(crate) bus: gst::Bus,
}

impl<Message> VideoSubscription<Message> where Message: Clone {}

impl<Message> iced_futures::subscription::Recipe for VideoSubscription<Message>
where
    Message: Clone + Send + Sync + 'static,
{
    type Output = Message;
    fn hash(&self, state: &mut iced_futures::subscription::Hasher) {
        use std::hash::Hash;

        self.id.hash(state);
    }

    fn stream(
        self: Box<Self>,
        _input: core::pin::Pin<
            Box<dyn iced_futures::futures::Stream<Item = iced_futures::subscription::Event> + Send>,
        >,
    ) -> core::pin::Pin<Box<dyn iced_futures::futures::Stream<Item = Self::Output> + Send>> {
        // use iced_futures::futures::StreamExt;
        use futures_lite::stream::StreamExt;
        Box::pin(
            self.bus
                .filtered_stream(&[gst::MessageType::Eos, gst::MessageType::Element])
                .filter_map({
                    let eos = self.on_end_of_stream.clone();
                    let frame = self.on_new_frame.clone();
                    move |message: gst::Message| match message.view() {
                        gst::MessageView::Eos(_) => eos.clone().map(|m| *m),
                        gst::MessageView::Element(element_msg) => {
                            let structure = element_msg.structure();
                            if let Some(structure) = structure {
                                if structure.name() == "GstVideoFrameReady" {
                                    frame.clone().map(|m| *m)
                                } else {
                                    None
                                }
                            } else {
                                None
                            }
                        }
                        _ => None,
                    }
                }),
        )
    }
}
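The widget itself is display-only: `draw` hands the shared frame to the wgpu primitive, `update` re-requests a redraw roughly every 16 ms until the appsink flags a new frame, and `VideoSubscription` turns bus messages (EOS, `GstVideoFrameReady`) into application messages. A rough sketch of wiring the widget into a view, assuming the application already holds a ready `VideoHandle` (its construction lives elsewhere in the crate and is not part of this hunk) and renders with `iced_wgpu`:

```rust
// Sketch only: `Player`, `PlayerMessage`, and the stored handle are placeholders;
// only the builder methods defined above are used.
struct Player {
    video: VideoHandle<PlayerMessage, Ready>,
}

#[derive(Debug, Clone)]
enum PlayerMessage {
    EndOfStream,
    NewFrame,
}

impl Player {
    fn view(&self) -> iced_core::Element<'_, PlayerMessage, iced_core::Theme, iced_wgpu::Renderer> {
        Video::new(&self.video)
            .width(iced_core::Length::Fill)
            .height(iced_core::Length::Fill)
            .content_fit(iced_core::ContentFit::Contain)
            .into() // uses the From<Video<..>> impl above
    }
}
```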
5217
crates/iced_video_player/Cargo.lock
generated
File diff suppressed because it is too large
@@ -1,63 +0,0 @@
|
||||
[package]
|
||||
name = "iced_video_player"
|
||||
description = "A convenient video player widget for Iced"
|
||||
homepage = "https://github.com/jazzfool/iced_video_player"
|
||||
repository = "https://github.com/jazzfool/iced_video_player"
|
||||
readme = "README.md"
|
||||
keywords = ["gui", "iced", "video"]
|
||||
categories = ["gui", "multimedia"]
|
||||
version = "0.6.0"
|
||||
authors = ["jazzfool"]
|
||||
edition = "2021"
|
||||
resolver = "2"
|
||||
license = "MIT OR Apache-2.0"
|
||||
exclude = [".media/test.mp4"]
|
||||
|
||||
[dependencies]
|
||||
iced = { git = "https://github.com/iced-rs/iced", features = [
|
||||
"image",
|
||||
"advanced",
|
||||
"wgpu",
|
||||
] }
|
||||
iced_wgpu = { git = "https://github.com/iced-rs/iced" }
|
||||
gstreamer = "0.23"
|
||||
gstreamer-app = "0.23" # appsink
|
||||
gstreamer-base = "0.23" # basesrc
|
||||
glib = "0.20" # gobject traits and error type
|
||||
log = "0.4"
|
||||
thiserror = "1"
|
||||
url = "2" # media uri
|
||||
|
||||
[package.metadata.nix]
|
||||
systems = ["x86_64-linux"]
|
||||
app = true
|
||||
build = true
|
||||
runtimeLibs = [
|
||||
"vulkan-loader",
|
||||
"wayland",
|
||||
"wayland-protocols",
|
||||
"libxkbcommon",
|
||||
"xorg.libX11",
|
||||
"xorg.libXrandr",
|
||||
"xorg.libXi",
|
||||
"gst_all_1.gstreamer",
|
||||
"gst_all_1.gstreamermm",
|
||||
"gst_all_1.gst-plugins-bad",
|
||||
"gst_all_1.gst-plugins-ugly",
|
||||
"gst_all_1.gst-plugins-good",
|
||||
"gst_all_1.gst-plugins-base",
|
||||
]
|
||||
buildInputs = [
|
||||
"libxkbcommon",
|
||||
"gst_all_1.gstreamer",
|
||||
"gst_all_1.gstreamermm",
|
||||
"gst_all_1.gst-plugins-bad",
|
||||
"gst_all_1.gst-plugins-ugly",
|
||||
"gst_all_1.gst-plugins-good",
|
||||
"gst_all_1.gst-plugins-base",
|
||||
]
|
||||
|
||||
[package.metadata.docs.rs]
|
||||
rustc-args = ["--cfg", "docsrs"]
|
||||
rustdoc-args = ["--cfg", "docsrs"]
|
||||
targets = ["wasm32-unknown-unknown"]
|
||||
@@ -1,176 +0,0 @@
|
||||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
@@ -1,23 +0,0 @@
|
||||
Permission is hereby granted, free of charge, to any
|
||||
person obtaining a copy of this software and associated
|
||||
documentation files (the "Software"), to deal in the
|
||||
Software without restriction, including without
|
||||
limitation the rights to use, copy, modify, merge,
|
||||
publish, distribute, sublicense, and/or sell copies of
|
||||
the Software, and to permit persons to whom the Software
|
||||
is furnished to do so, subject to the following
|
||||
conditions:
|
||||
|
||||
The above copyright notice and this permission notice
|
||||
shall be included in all copies or substantial portions
|
||||
of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
|
||||
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
|
||||
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
|
||||
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
|
||||
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
|
||||
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
|
||||
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
|
||||
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
|
||||
DEALINGS IN THE SOFTWARE.
|
||||
@@ -1,64 +0,0 @@
|
||||
# Iced Video Player Widget
|
||||
|
||||
Composable component to play videos in any Iced application built on the excellent GStreamer library.
|
||||
|
||||
<img src=".media/screenshot.png" width="50%" />
|
||||
|
||||
## Overview
|
||||
|
||||
In general, this supports anything that [`gstreamer/playbin`](https://gstreamer.freedesktop.org/documentation/playback/playbin.html?gi-language=c) supports.
|
||||
|
||||
Features:
|
||||
- Load video files from any file path **or URL** (support for streaming over network).
|
||||
- Video buffering when streaming on a network.
|
||||
- Audio support.
|
||||
- Programmatic control.
|
||||
- Can capture thumbnails from a set of timestamps.
|
||||
- Good performance (i.e., comparable to other video players). GStreamer (with the right plugins) will perform hardware-accelerated decoding, and the color space (YUV to RGB) is converted on the GPU whilst rendering the frame.
|
||||
|
||||
Limitations (hopefully to be fixed):
|
||||
- GStreamer is a bit annoying to set up on Windows.
|
||||
|
||||
The player **does not** come with any surrounding GUI controls, but they should be quite easy to implement should you need them.
|
||||
See the "minimal" example for a demonstration on how you could implement pausing, looping, and seeking.
|
||||
|
||||
## Example Usage
|
||||
|
||||
```rust
|
||||
use iced_video_player::{Video, VideoPlayer};
|
||||
|
||||
fn main() -> iced::Result {
|
||||
iced::run("Video Player", (), App::view)
|
||||
}
|
||||
|
||||
struct App {
|
||||
video: Video,
|
||||
}
|
||||
|
||||
impl Default for App {
|
||||
fn default() -> Self {
|
||||
App {
|
||||
video: Video::new(&url::Url::parse("file:///C:/my_video.mp4").unwrap()).unwrap(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl App {
|
||||
fn view(&self) -> iced::Element<()> {
|
||||
VideoPlayer::new(&self.video).into()
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Building
|
||||
|
||||
Follow the [GStreamer build instructions](https://github.com/sdroege/gstreamer-rs#installation). This should be able to compile on MSVC, MinGW, Linux, and MacOS.
|
||||
|
||||
## License
|
||||
|
||||
Licensed under either
|
||||
|
||||
- [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
|
||||
- [MIT](http://opensource.org/licenses/MIT)
|
||||
|
||||
at your option.
|
||||
@@ -1,139 +0,0 @@
|
||||
use iced::{
|
||||
widget::{Button, Column, Container, Row, Slider, Text},
|
||||
Element,
|
||||
};
|
||||
use iced_video_player::{Video, VideoPlayer};
|
||||
use std::time::Duration;
|
||||
|
||||
fn main() -> iced::Result {
|
||||
iced::run(App::update, App::view)
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug)]
|
||||
enum Message {
|
||||
TogglePause,
|
||||
ToggleLoop,
|
||||
Seek(f64),
|
||||
SeekRelease,
|
||||
EndOfStream,
|
||||
NewFrame,
|
||||
}
|
||||
|
||||
struct App {
|
||||
video: Video,
|
||||
position: f64,
|
||||
dragging: bool,
|
||||
}
|
||||
|
||||
impl Default for App {
|
||||
fn default() -> Self {
|
||||
App {
|
||||
video: Video::new(
|
||||
&url::Url::parse("https://jellyfin.tsuba.darksailor.dev/Videos/1d7e2012-e17d-edbb-25c3-2dbcc803d6b6/stream?static=true")
|
||||
.expect("Failed to parse URL"),
|
||||
)
|
||||
.expect("Failed to create video"),
|
||||
position: 0.0,
|
||||
dragging: false,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl App {
|
||||
fn update(&mut self, message: Message) {
|
||||
match message {
|
||||
Message::TogglePause => {
|
||||
self.video.set_paused(!self.video.paused());
|
||||
}
|
||||
Message::ToggleLoop => {
|
||||
self.video.set_looping(!self.video.looping());
|
||||
}
|
||||
Message::Seek(secs) => {
|
||||
self.dragging = true;
|
||||
self.video.set_paused(true);
|
||||
self.position = secs;
|
||||
}
|
||||
Message::SeekRelease => {
|
||||
self.dragging = false;
|
||||
self.video
|
||||
.seek(Duration::from_secs_f64(self.position), false)
|
||||
.expect("seek");
|
||||
self.video.set_paused(false);
|
||||
}
|
||||
Message::EndOfStream => {
|
||||
println!("end of stream");
|
||||
}
|
||||
Message::NewFrame => {
|
||||
if !self.dragging {
|
||||
self.position = self.video.position().as_secs_f64();
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn view(&self) -> Element<Message> {
|
||||
Column::new()
|
||||
.push(
|
||||
Container::new(
|
||||
VideoPlayer::new(&self.video)
|
||||
.width(iced::Length::Fill)
|
||||
.height(iced::Length::Fill)
|
||||
.content_fit(iced::ContentFit::Contain)
|
||||
.on_end_of_stream(Message::EndOfStream)
|
||||
.on_new_frame(Message::NewFrame),
|
||||
)
|
||||
.align_x(iced::Alignment::Center)
|
||||
.align_y(iced::Alignment::Center)
|
||||
.width(iced::Length::Fill)
|
||||
.height(iced::Length::Fill),
|
||||
)
|
||||
.push(
|
||||
Container::new(
|
||||
Slider::new(
|
||||
0.0..=self.video.duration().as_secs_f64(),
|
||||
self.position,
|
||||
Message::Seek,
|
||||
)
|
||||
.step(0.1)
|
||||
.on_release(Message::SeekRelease),
|
||||
)
|
||||
.padding(iced::Padding::new(5.0).left(10.0).right(10.0)),
|
||||
)
|
||||
.push(
|
||||
Row::new()
|
||||
.spacing(5)
|
||||
.align_y(iced::alignment::Vertical::Center)
|
||||
.padding(iced::Padding::new(10.0).top(0.0))
|
||||
.push(
|
||||
Button::new(Text::new(if self.video.paused() {
|
||||
"Play"
|
||||
} else {
|
||||
"Pause"
|
||||
}))
|
||||
.width(80.0)
|
||||
.on_press(Message::TogglePause),
|
||||
)
|
||||
.push(
|
||||
Button::new(Text::new(if self.video.looping() {
|
||||
"Disable Loop"
|
||||
} else {
|
||||
"Enable Loop"
|
||||
}))
|
||||
.width(120.0)
|
||||
.on_press(Message::ToggleLoop),
|
||||
)
|
||||
.push(
|
||||
Text::new(format!(
|
||||
"{}:{:02}s / {}:{:02}s",
|
||||
self.position as u64 / 60,
|
||||
self.position as u64 % 60,
|
||||
self.video.duration().as_secs() / 60,
|
||||
self.video.duration().as_secs() % 60,
|
||||
))
|
||||
.width(iced::Length::Fill)
|
||||
.align_x(iced::alignment::Horizontal::Right),
|
||||
),
|
||||
)
|
||||
.into()
|
||||
}
|
||||
}
|
||||
275
crates/iced_video_player/flake.lock
generated
@@ -1,275 +0,0 @@
|
||||
{
|
||||
"nodes": {
|
||||
"crane": {
|
||||
"flake": false,
|
||||
"locked": {
|
||||
"lastModified": 1758758545,
|
||||
"narHash": "sha256-NU5WaEdfwF6i8faJ2Yh+jcK9vVFrofLcwlD/mP65JrI=",
|
||||
"owner": "ipetkov",
|
||||
"repo": "crane",
|
||||
"rev": "95d528a5f54eaba0d12102249ce42f4d01f4e364",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "ipetkov",
|
||||
"ref": "v0.21.1",
|
||||
"repo": "crane",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"dream2nix": {
|
||||
"inputs": {
|
||||
"nixpkgs": [
|
||||
"nixCargoIntegration",
|
||||
"nixpkgs"
|
||||
],
|
||||
"purescript-overlay": "purescript-overlay",
|
||||
"pyproject-nix": "pyproject-nix"
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1763413832,
|
||||
"narHash": "sha256-dkqBwDXiv8MPoFyIvOuC4bVubAP+TlVZUkVMB78TTSg=",
|
||||
"owner": "nix-community",
|
||||
"repo": "dream2nix",
|
||||
"rev": "5658fba3a0b6b7d5cb0460b949651f64f644a743",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "nix-community",
|
||||
"repo": "dream2nix",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"flake-compat": {
|
||||
"flake": false,
|
||||
"locked": {
|
||||
"lastModified": 1696426674,
|
||||
"narHash": "sha256-kvjfFW7WAETZlt09AgDn1MrtKzP7t90Vf7vypd3OL1U=",
|
||||
"owner": "edolstra",
|
||||
"repo": "flake-compat",
|
||||
"rev": "0f9255e01c2351cc7d116c072cb317785dd33b33",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "edolstra",
|
||||
"repo": "flake-compat",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"flakeCompat": {
|
||||
"flake": false,
|
||||
"locked": {
|
||||
"lastModified": 1761588595,
|
||||
"narHash": "sha256-XKUZz9zewJNUj46b4AJdiRZJAvSZ0Dqj2BNfXvFlJC4=",
|
||||
"owner": "edolstra",
|
||||
"repo": "flake-compat",
|
||||
"rev": "f387cd2afec9419c8ee37694406ca490c3f34ee5",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "edolstra",
|
||||
"repo": "flake-compat",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"mk-naked-shell": {
|
||||
"flake": false,
|
||||
"locked": {
|
||||
"lastModified": 1681286841,
|
||||
"narHash": "sha256-3XlJrwlR0nBiREnuogoa5i1b4+w/XPe0z8bbrJASw0g=",
|
||||
"owner": "90-008",
|
||||
"repo": "mk-naked-shell",
|
||||
"rev": "7612f828dd6f22b7fb332cc69440e839d7ffe6bd",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "90-008",
|
||||
"repo": "mk-naked-shell",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"nixCargoIntegration": {
|
||||
"inputs": {
|
||||
"crane": "crane",
|
||||
"dream2nix": "dream2nix",
|
||||
"mk-naked-shell": "mk-naked-shell",
|
||||
"nixpkgs": [
|
||||
"nixpkgs"
|
||||
],
|
||||
"parts": "parts",
|
||||
"rust-overlay": "rust-overlay",
|
||||
"treefmt": "treefmt"
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1763619566,
|
||||
"narHash": "sha256-92rSHIwh5qTXjcktVEWyKu5EPB3/7UdgjgjtWZ5ET6w=",
|
||||
"owner": "yusdacra",
|
||||
"repo": "nix-cargo-integration",
|
||||
"rev": "ac45d8c0d6876e6547d62bc729654c7b9a79c760",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "yusdacra",
|
||||
"repo": "nix-cargo-integration",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"nixpkgs": {
|
||||
"locked": {
|
||||
"lastModified": 1763421233,
|
||||
"narHash": "sha256-Stk9ZYRkGrnnpyJ4eqt9eQtdFWRRIvMxpNRf4sIegnw=",
|
||||
"owner": "NixOS",
|
||||
"repo": "nixpkgs",
|
||||
"rev": "89c2b2330e733d6cdb5eae7b899326930c2c0648",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "NixOS",
|
||||
"ref": "nixos-unstable",
|
||||
"repo": "nixpkgs",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"parts": {
|
||||
"inputs": {
|
||||
"nixpkgs-lib": [
|
||||
"nixCargoIntegration",
|
||||
"nixpkgs"
|
||||
]
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1762980239,
|
||||
"narHash": "sha256-8oNVE8TrD19ulHinjaqONf9QWCKK+w4url56cdStMpM=",
|
||||
"owner": "hercules-ci",
|
||||
"repo": "flake-parts",
|
||||
"rev": "52a2caecc898d0b46b2b905f058ccc5081f842da",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "hercules-ci",
|
||||
"repo": "flake-parts",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"purescript-overlay": {
|
||||
"inputs": {
|
||||
"flake-compat": "flake-compat",
|
||||
"nixpkgs": [
|
||||
"nixCargoIntegration",
|
||||
"dream2nix",
|
||||
"nixpkgs"
|
||||
],
|
||||
"slimlock": "slimlock"
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1728546539,
|
||||
"narHash": "sha256-Sws7w0tlnjD+Bjck1nv29NjC5DbL6nH5auL9Ex9Iz2A=",
|
||||
"owner": "thomashoneyman",
|
||||
"repo": "purescript-overlay",
|
||||
"rev": "4ad4c15d07bd899d7346b331f377606631eb0ee4",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "thomashoneyman",
|
||||
"repo": "purescript-overlay",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"pyproject-nix": {
|
||||
"inputs": {
|
||||
"nixpkgs": [
|
||||
"nixCargoIntegration",
|
||||
"dream2nix",
|
||||
"nixpkgs"
|
||||
]
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1752481895,
|
||||
"narHash": "sha256-luVj97hIMpCbwhx3hWiRwjP2YvljWy8FM+4W9njDhLA=",
|
||||
"owner": "pyproject-nix",
|
||||
"repo": "pyproject.nix",
|
||||
"rev": "16ee295c25107a94e59a7fc7f2e5322851781162",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "pyproject-nix",
|
||||
"repo": "pyproject.nix",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"root": {
|
||||
"inputs": {
|
||||
"flakeCompat": "flakeCompat",
|
||||
"nixCargoIntegration": "nixCargoIntegration",
|
||||
"nixpkgs": "nixpkgs"
|
||||
}
|
||||
},
|
||||
"rust-overlay": {
|
||||
"inputs": {
|
||||
"nixpkgs": [
|
||||
"nixCargoIntegration",
|
||||
"nixpkgs"
|
||||
]
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1763606317,
|
||||
"narHash": "sha256-lsq4Urmb9Iyg2zyg2yG6oMQk9yuaoIgy+jgvYM4guxA=",
|
||||
"owner": "oxalica",
|
||||
"repo": "rust-overlay",
|
||||
"rev": "a5615abaf30cfaef2e32f1ff9bd5ca94e2911371",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "oxalica",
|
||||
"repo": "rust-overlay",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"slimlock": {
|
||||
"inputs": {
|
||||
"nixpkgs": [
|
||||
"nixCargoIntegration",
|
||||
"dream2nix",
|
||||
"purescript-overlay",
|
||||
"nixpkgs"
|
||||
]
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1688756706,
|
||||
"narHash": "sha256-xzkkMv3neJJJ89zo3o2ojp7nFeaZc2G0fYwNXNJRFlo=",
|
||||
"owner": "thomashoneyman",
|
||||
"repo": "slimlock",
|
||||
"rev": "cf72723f59e2340d24881fd7bf61cb113b4c407c",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "thomashoneyman",
|
||||
"repo": "slimlock",
|
||||
"type": "github"
|
||||
}
|
||||
},
|
||||
"treefmt": {
|
||||
"inputs": {
|
||||
"nixpkgs": [
|
||||
"nixCargoIntegration",
|
||||
"nixpkgs"
|
||||
]
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1762938485,
|
||||
"narHash": "sha256-AlEObg0syDl+Spi4LsZIBrjw+snSVU4T8MOeuZJUJjM=",
|
||||
"owner": "numtide",
|
||||
"repo": "treefmt-nix",
|
||||
"rev": "5b4ee75aeefd1e2d5a1cc43cf6ba65eba75e83e4",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
"owner": "numtide",
|
||||
"repo": "treefmt-nix",
|
||||
"type": "github"
|
||||
}
|
||||
}
|
||||
},
|
||||
"root": "root",
|
||||
"version": 7
|
||||
}
|
||||
@@ -1,38 +0,0 @@
|
||||
{
|
||||
inputs = {
|
||||
flakeCompat = {
|
||||
url = "github:edolstra/flake-compat";
|
||||
flake = false;
|
||||
};
|
||||
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
|
||||
nixCargoIntegration = {
|
||||
url = "github:yusdacra/nix-cargo-integration";
|
||||
inputs.nixpkgs.follows = "nixpkgs";
|
||||
};
|
||||
};
|
||||
|
||||
outputs = inputs: let
|
||||
pkgs = import inputs.nixpkgs {
|
||||
system = "x86_64-linux";
|
||||
};
|
||||
in {
|
||||
devShells."x86_64-linux".default = pkgs.mkShell {
|
||||
# "GST_PLUGIN_PATH" = "${pkgs.gst_all_1.gstreamer}:${pkgs.gst_all_1.gst-plugins-bad}:${pkgs.gst_all_1.gst-plugins-ugly}:${pkgs.gst_all_1.gst-plugins-good}:${pkgs.gst_all_1.gst-plugins-base}";
|
||||
buildInputs = with pkgs; [
|
||||
gst_all_1.gstreamer
|
||||
gst_all_1.gst-plugins-bad
|
||||
gst_all_1.gst-plugins-ugly
|
||||
gst_all_1.gst-plugins-good
|
||||
gst_all_1.gst-plugins-base
|
||||
libxkbcommon
|
||||
wayland
|
||||
rustup
|
||||
];
|
||||
nativeBuildInputs = with pkgs; [
|
||||
pkg-config
|
||||
wayland
|
||||
];
|
||||
packages = with pkgs; [wayland];
|
||||
};
|
||||
};
|
||||
}
|
||||
@@ -1,12 +0,0 @@
|
||||
# Flake's devShell for non-flake-enabled nix instances
|
||||
(import
|
||||
(
|
||||
let lock = builtins.fromJSON (builtins.readFile ./flake.lock);
|
||||
in
|
||||
fetchTarball {
|
||||
url =
|
||||
"https://github.com/edolstra/flake-compat/archive/${lock.nodes.flakeCompat.locked.rev}.tar.gz";
|
||||
sha256 = lock.nodes.flakeCompat.locked.narHash;
|
||||
}
|
||||
)
|
||||
{ src = ./.; }).shellNix.default
|
||||
@@ -1,76 +0,0 @@
|
||||
//! # Iced Video Player
|
||||
//!
|
||||
//! A convenient video player widget for Iced.
|
||||
//!
|
||||
//! To get started, load a video from a URI (e.g., a file path prefixed with `file:///`) using [`Video::new`](crate::Video::new),
|
||||
//! then use it like any other Iced widget in your `view` function by creating a [`VideoPlayer`].
|
||||
//!
|
||||
//! Example:
|
||||
//! ```rust
|
||||
//! use iced_video_player::{Video, VideoPlayer};
|
||||
//!
|
||||
//! fn main() -> iced::Result {
|
||||
//! iced::run("Video Player", (), App::view)
|
||||
//! }
|
||||
//!
|
||||
//! struct App {
|
||||
//! video: Video,
|
||||
//! }
|
||||
//!
|
||||
//! impl Default for App {
|
||||
//! fn default() -> Self {
|
||||
//! App {
|
||||
//! video: Video::new(&url::Url::parse("file:///C:/my_video.mp4").unwrap()).unwrap(),
|
||||
//! }
|
||||
//! }
|
||||
//! }
|
||||
//!
|
||||
//! impl App {
|
||||
//! fn view(&self) -> iced::Element<()> {
|
||||
//! VideoPlayer::new(&self.video).into()
|
||||
//! }
|
||||
//! }
|
||||
//! ```
|
||||
//!
|
||||
//! You can programmatically control the video (e.g., seek, pause, loop, grab thumbnails) by accessing various methods on [`Video`].
|
||||
|
||||
mod pipeline;
|
||||
mod video;
|
||||
mod video_player;
|
||||
|
||||
use gstreamer as gst;
|
||||
use thiserror::Error;
|
||||
|
||||
pub use video::Position;
|
||||
pub use video::Video;
|
||||
pub use video_player::VideoPlayer;
|
||||
|
||||
#[derive(Debug, Error)]
|
||||
pub enum Error {
|
||||
#[error("{0}")]
|
||||
Glib(#[from] glib::Error),
|
||||
#[error("{0}")]
|
||||
Bool(#[from] glib::BoolError),
|
||||
#[error("failed to get the gstreamer bus")]
|
||||
Bus,
|
||||
#[error("failed to get AppSink element with name='{0}' from gstreamer pipeline")]
|
||||
AppSink(String),
|
||||
#[error("{0}")]
|
||||
StateChange(#[from] gst::StateChangeError),
|
||||
#[error("failed to cast gstreamer element")]
|
||||
Cast,
|
||||
#[error("{0}")]
|
||||
Io(#[from] std::io::Error),
|
||||
#[error("invalid URI")]
|
||||
Uri,
|
||||
#[error("failed to get media capabilities")]
|
||||
Caps,
|
||||
#[error("failed to query media duration or position")]
|
||||
Duration,
|
||||
#[error("failed to sync with playback")]
|
||||
Sync,
|
||||
#[error("failed to lock internal sync primitive")]
|
||||
Lock,
|
||||
#[error("invalid framerate: {0}")]
|
||||
Framerate(f64),
|
||||
}
|
||||
@@ -1,469 +0,0 @@
|
||||
use crate::video::Frame;
|
||||
use iced_wgpu::primitive::Primitive;
|
||||
use iced_wgpu::wgpu;
|
||||
use std::{
|
||||
collections::{btree_map::Entry, BTreeMap},
|
||||
num::NonZero,
|
||||
sync::{
|
||||
atomic::{AtomicBool, AtomicUsize, Ordering},
|
||||
Arc, Mutex,
|
||||
},
|
||||
};
|
||||
|
||||
#[repr(C)]
|
||||
struct Uniforms {
|
||||
rect: [f32; 4],
|
||||
// because wgpu min_uniform_buffer_offset_alignment
|
||||
_pad: [u8; 240],
|
||||
}
|
||||
|
||||
struct VideoEntry {
|
||||
texture_y: wgpu::Texture,
|
||||
texture_uv: wgpu::Texture,
|
||||
instances: wgpu::Buffer,
|
||||
bg0: wgpu::BindGroup,
|
||||
alive: Arc<AtomicBool>,
|
||||
|
||||
prepare_index: AtomicUsize,
|
||||
render_index: AtomicUsize,
|
||||
}
|
||||
|
||||
pub(crate) struct VideoPipeline {
|
||||
pipeline: wgpu::RenderPipeline,
|
||||
bg0_layout: wgpu::BindGroupLayout,
|
||||
sampler: wgpu::Sampler,
|
||||
videos: BTreeMap<u64, VideoEntry>,
|
||||
}
|
||||
|
||||
impl VideoPipeline {
|
||||
fn new(device: &wgpu::Device, format: wgpu::TextureFormat) -> Self {
|
||||
let shader = device.create_shader_module(wgpu::ShaderModuleDescriptor {
|
||||
label: Some("iced_video_player shader"),
|
||||
source: wgpu::ShaderSource::Wgsl(include_str!("shader.wgsl").into()),
|
||||
});
|
||||
|
||||
let bg0_layout = device.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
|
||||
label: Some("iced_video_player bind group 0 layout"),
|
||||
entries: &[
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 0,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Texture {
|
||||
sample_type: wgpu::TextureSampleType::Float { filterable: true },
|
||||
view_dimension: wgpu::TextureViewDimension::D2,
|
||||
multisampled: false,
|
||||
},
|
||||
count: None,
|
||||
},
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 1,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Texture {
|
||||
sample_type: wgpu::TextureSampleType::Float { filterable: true },
|
||||
view_dimension: wgpu::TextureViewDimension::D2,
|
||||
multisampled: false,
|
||||
},
|
||||
count: None,
|
||||
},
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 2,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
|
||||
count: None,
|
||||
},
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 3,
|
||||
visibility: wgpu::ShaderStages::VERTEX,
|
||||
ty: wgpu::BindingType::Buffer {
|
||||
ty: wgpu::BufferBindingType::Uniform,
|
||||
has_dynamic_offset: true,
|
||||
min_binding_size: None,
|
||||
},
|
||||
count: None,
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
let layout = device.create_pipeline_layout(&wgpu::PipelineLayoutDescriptor {
|
||||
label: Some("iced_video_player pipeline layout"),
|
||||
bind_group_layouts: &[&bg0_layout],
|
||||
push_constant_ranges: &[],
|
||||
});
|
||||
|
||||
let pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
|
||||
label: Some("iced_video_player pipeline"),
|
||||
layout: Some(&layout),
|
||||
vertex: wgpu::VertexState {
|
||||
module: &shader,
|
||||
entry_point: Some("vs_main"),
|
||||
buffers: &[],
|
||||
compilation_options: wgpu::PipelineCompilationOptions::default(),
|
||||
},
|
||||
primitive: wgpu::PrimitiveState::default(),
|
||||
depth_stencil: None,
|
||||
multisample: wgpu::MultisampleState {
|
||||
count: 1,
|
||||
mask: !0,
|
||||
alpha_to_coverage_enabled: false,
|
||||
},
|
||||
fragment: Some(wgpu::FragmentState {
|
||||
module: &shader,
|
||||
entry_point: Some("fs_main"),
|
||||
targets: &[Some(wgpu::ColorTargetState {
|
||||
format,
|
||||
blend: None,
|
||||
write_mask: wgpu::ColorWrites::ALL,
|
||||
})],
|
||||
compilation_options: wgpu::PipelineCompilationOptions::default(),
|
||||
}),
|
||||
multiview: None,
|
||||
cache: None,
|
||||
});
|
||||
|
||||
let sampler = device.create_sampler(&wgpu::SamplerDescriptor {
|
||||
label: Some("iced_video_player sampler"),
|
||||
address_mode_u: wgpu::AddressMode::ClampToEdge,
|
||||
address_mode_v: wgpu::AddressMode::ClampToEdge,
|
||||
address_mode_w: wgpu::AddressMode::ClampToEdge,
|
||||
mag_filter: wgpu::FilterMode::Linear,
|
||||
min_filter: wgpu::FilterMode::Linear,
|
||||
mipmap_filter: wgpu::FilterMode::Nearest,
|
||||
lod_min_clamp: 0.0,
|
||||
lod_max_clamp: 1.0,
|
||||
compare: None,
|
||||
anisotropy_clamp: 1,
|
||||
border_color: None,
|
||||
});
|
||||
|
||||
VideoPipeline {
|
||||
pipeline,
|
||||
bg0_layout,
|
||||
sampler,
|
||||
videos: BTreeMap::new(),
|
||||
}
|
||||
}
|
||||
|
||||
fn upload(
|
||||
&mut self,
|
||||
device: &wgpu::Device,
|
||||
queue: &wgpu::Queue,
|
||||
video_id: u64,
|
||||
alive: &Arc<AtomicBool>,
|
||||
(width, height): (u32, u32),
|
||||
frame: &[u8],
|
||||
) {
|
||||
if let Entry::Vacant(entry) = self.videos.entry(video_id) {
|
||||
let texture_y = device.create_texture(&wgpu::TextureDescriptor {
|
||||
label: Some("iced_video_player texture"),
|
||||
size: wgpu::Extent3d {
|
||||
width,
|
||||
height,
|
||||
depth_or_array_layers: 1,
|
||||
},
|
||||
mip_level_count: 1,
|
||||
sample_count: 1,
|
||||
dimension: wgpu::TextureDimension::D2,
|
||||
format: wgpu::TextureFormat::R8Unorm,
|
||||
usage: wgpu::TextureUsages::COPY_DST | wgpu::TextureUsages::TEXTURE_BINDING,
|
||||
view_formats: &[],
|
||||
});
|
||||
|
||||
let texture_uv = device.create_texture(&wgpu::TextureDescriptor {
|
||||
label: Some("iced_video_player texture"),
|
||||
size: wgpu::Extent3d {
|
||||
width: width / 2,
|
||||
height: height / 2,
|
||||
depth_or_array_layers: 1,
|
||||
},
|
||||
mip_level_count: 1,
|
||||
sample_count: 1,
|
||||
dimension: wgpu::TextureDimension::D2,
|
||||
format: wgpu::TextureFormat::Rg8Unorm,
|
||||
usage: wgpu::TextureUsages::COPY_DST | wgpu::TextureUsages::TEXTURE_BINDING,
|
||||
view_formats: &[],
|
||||
});
|
||||
|
||||
let view_y = texture_y.create_view(&wgpu::TextureViewDescriptor {
|
||||
label: Some("iced_video_player texture view"),
|
||||
format: None,
|
||||
dimension: None,
|
||||
aspect: wgpu::TextureAspect::All,
|
||||
base_mip_level: 0,
|
||||
mip_level_count: None,
|
||||
base_array_layer: 0,
|
||||
array_layer_count: None,
|
||||
usage: Some(wgpu::TextureUsages::empty()),
|
||||
});
|
||||
|
||||
let view_uv = texture_uv.create_view(&wgpu::TextureViewDescriptor {
|
||||
label: Some("iced_video_player texture view"),
|
||||
format: None,
|
||||
dimension: None,
|
||||
aspect: wgpu::TextureAspect::All,
|
||||
base_mip_level: 0,
|
||||
mip_level_count: None,
|
||||
base_array_layer: 0,
|
||||
array_layer_count: None,
|
||||
usage: Some(wgpu::TextureUsages::empty()),
|
||||
});
|
||||
|
||||
let instances = device.create_buffer(&wgpu::BufferDescriptor {
|
||||
label: Some("iced_video_player uniform buffer"),
|
||||
size: 256 * std::mem::size_of::<Uniforms>() as u64, // max 256 video players per frame
|
||||
usage: wgpu::BufferUsages::COPY_DST | wgpu::BufferUsages::UNIFORM,
|
||||
mapped_at_creation: false,
|
||||
});
|
||||
|
||||
let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
|
||||
label: Some("iced_video_player bind group"),
|
||||
layout: &self.bg0_layout,
|
||||
entries: &[
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 0,
|
||||
resource: wgpu::BindingResource::TextureView(&view_y),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 1,
|
||||
resource: wgpu::BindingResource::TextureView(&view_uv),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 2,
|
||||
resource: wgpu::BindingResource::Sampler(&self.sampler),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 3,
|
||||
resource: wgpu::BindingResource::Buffer(wgpu::BufferBinding {
|
||||
buffer: &instances,
|
||||
offset: 0,
|
||||
size: Some(NonZero::new(std::mem::size_of::<Uniforms>() as _).unwrap()),
|
||||
}),
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
entry.insert(VideoEntry {
|
||||
texture_y,
|
||||
texture_uv,
|
||||
instances,
|
||||
bg0: bind_group,
|
||||
alive: Arc::clone(alive),
|
||||
|
||||
prepare_index: AtomicUsize::new(0),
|
||||
render_index: AtomicUsize::new(0),
|
||||
});
|
||||
}
|
||||
|
||||
let VideoEntry {
|
||||
texture_y,
|
||||
texture_uv,
|
||||
..
|
||||
} = self.videos.get(&video_id).unwrap();
|
||||
|
||||
queue.write_texture(
|
||||
wgpu::TexelCopyTextureInfo {
|
||||
texture: texture_y,
|
||||
mip_level: 0,
|
||||
origin: wgpu::Origin3d::ZERO,
|
||||
aspect: wgpu::TextureAspect::All,
|
||||
},
|
||||
&frame[..(width * height) as usize],
|
||||
wgpu::TexelCopyBufferLayout {
|
||||
offset: 0,
|
||||
bytes_per_row: Some(width),
|
||||
rows_per_image: Some(height),
|
||||
},
|
||||
wgpu::Extent3d {
|
||||
width,
|
||||
height,
|
||||
depth_or_array_layers: 1,
|
||||
},
|
||||
);
|
||||
|
||||
queue.write_texture(
|
||||
wgpu::TexelCopyTextureInfo {
|
||||
texture: texture_uv,
|
||||
mip_level: 0,
|
||||
origin: wgpu::Origin3d::ZERO,
|
||||
aspect: wgpu::TextureAspect::All,
|
||||
},
|
||||
&frame[(width * height) as usize..],
|
||||
wgpu::TexelCopyBufferLayout {
|
||||
offset: 0,
|
||||
bytes_per_row: Some(width),
|
||||
rows_per_image: Some(height / 2),
|
||||
},
|
||||
wgpu::Extent3d {
|
||||
width: width / 2,
|
||||
height: height / 2,
|
||||
depth_or_array_layers: 1,
|
||||
},
|
||||
);
|
||||
}
|
||||
|
||||
fn cleanup(&mut self) {
|
||||
let ids: Vec<_> = self
|
||||
.videos
|
||||
.iter()
|
||||
.filter_map(|(id, entry)| (!entry.alive.load(Ordering::SeqCst)).then_some(*id))
|
||||
.collect();
|
||||
for id in ids {
|
||||
if let Some(video) = self.videos.remove(&id) {
|
||||
video.texture_y.destroy();
|
||||
video.texture_uv.destroy();
|
||||
video.instances.destroy();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn prepare(&mut self, queue: &wgpu::Queue, video_id: u64, bounds: &iced::Rectangle) {
|
||||
if let Some(video) = self.videos.get_mut(&video_id) {
|
||||
let uniforms = Uniforms {
|
||||
rect: [
|
||||
bounds.x,
|
||||
bounds.y,
|
||||
bounds.x + bounds.width,
|
||||
bounds.y + bounds.height,
|
||||
],
|
||||
_pad: [0; 240],
|
||||
};
|
||||
queue.write_buffer(
|
||||
&video.instances,
|
||||
(video.prepare_index.load(Ordering::Relaxed) * std::mem::size_of::<Uniforms>())
|
||||
as u64,
|
||||
unsafe {
|
||||
std::slice::from_raw_parts(
|
||||
&uniforms as *const _ as *const u8,
|
||||
std::mem::size_of::<Uniforms>(),
|
||||
)
|
||||
},
|
||||
);
|
||||
video.prepare_index.fetch_add(1, Ordering::Relaxed);
|
||||
video.render_index.store(0, Ordering::Relaxed);
|
||||
}
|
||||
|
||||
self.cleanup();
|
||||
}
|
||||
|
||||
fn draw(
|
||||
&self,
|
||||
target: &wgpu::TextureView,
|
||||
encoder: &mut wgpu::CommandEncoder,
|
||||
clip: &iced::Rectangle<u32>,
|
||||
video_id: u64,
|
||||
) {
|
||||
if let Some(video) = self.videos.get(&video_id) {
|
||||
let mut pass = encoder.begin_render_pass(&wgpu::RenderPassDescriptor {
|
||||
label: Some("iced_video_player render pass"),
|
||||
color_attachments: &[Some(wgpu::RenderPassColorAttachment {
|
||||
view: target,
|
||||
resolve_target: None,
|
||||
ops: wgpu::Operations {
|
||||
load: wgpu::LoadOp::Load,
|
||||
store: wgpu::StoreOp::Store,
|
||||
},
|
||||
depth_slice: None,
|
||||
})],
|
||||
depth_stencil_attachment: None,
|
||||
timestamp_writes: None,
|
||||
occlusion_query_set: None,
|
||||
});
|
||||
|
||||
pass.set_pipeline(&self.pipeline);
|
||||
pass.set_bind_group(
|
||||
0,
|
||||
&video.bg0,
|
||||
&[
|
||||
(video.render_index.load(Ordering::Relaxed) * std::mem::size_of::<Uniforms>())
|
||||
as u32,
|
||||
],
|
||||
);
|
||||
pass.set_scissor_rect(clip.x as _, clip.y as _, clip.width as _, clip.height as _);
|
||||
pass.draw(0..6, 0..1);
|
||||
|
||||
video.prepare_index.store(0, Ordering::Relaxed);
|
||||
video.render_index.fetch_add(1, Ordering::Relaxed);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub(crate) struct VideoPrimitive {
|
||||
video_id: u64,
|
||||
alive: Arc<AtomicBool>,
|
||||
frame: Arc<Mutex<Frame>>,
|
||||
size: (u32, u32),
|
||||
upload_frame: bool,
|
||||
}
|
||||
|
||||
impl VideoPrimitive {
|
||||
pub fn new(
|
||||
video_id: u64,
|
||||
alive: Arc<AtomicBool>,
|
||||
frame: Arc<Mutex<Frame>>,
|
||||
size: (u32, u32),
|
||||
upload_frame: bool,
|
||||
) -> Self {
|
||||
VideoPrimitive {
|
||||
video_id,
|
||||
alive,
|
||||
frame,
|
||||
size,
|
||||
upload_frame,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Primitive for VideoPrimitive {
|
||||
type Renderer = VideoPipeline;
|
||||
|
||||
fn initialize(
|
||||
&self,
|
||||
device: &wgpu::Device,
|
||||
_queue: &wgpu::Queue,
|
||||
format: wgpu::TextureFormat,
|
||||
) -> Self::Renderer {
|
||||
VideoPipeline::new(device, format)
|
||||
}
|
||||
|
||||
fn prepare(
|
||||
&self,
|
||||
renderer: &mut Self::Renderer,
|
||||
device: &wgpu::Device,
|
||||
queue: &wgpu::Queue,
|
||||
bounds: &iced::Rectangle,
|
||||
viewport: &iced_wgpu::graphics::Viewport,
|
||||
) {
|
||||
if self.upload_frame {
|
||||
if let Some(readable) = self.frame.lock().expect("lock frame mutex").readable() {
|
||||
renderer.upload(
|
||||
device,
|
||||
queue,
|
||||
self.video_id,
|
||||
&self.alive,
|
||||
self.size,
|
||||
readable.as_slice(),
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
renderer.prepare(
|
||||
queue,
|
||||
self.video_id,
|
||||
&(*bounds
|
||||
* iced::Transformation::orthographic(
|
||||
viewport.logical_size().width as _,
|
||||
viewport.logical_size().height as _,
|
||||
)),
|
||||
);
|
||||
}
|
||||
|
||||
fn render(
|
||||
&self,
|
||||
renderer: &Self::Renderer,
|
||||
encoder: &mut wgpu::CommandEncoder,
|
||||
target: &wgpu::TextureView,
|
||||
clip_bounds: &iced::Rectangle<u32>,
|
||||
) {
|
||||
renderer.draw(target, encoder, clip_bounds, self.video_id);
|
||||
}
|
||||
}
|
||||
@@ -1,61 +0,0 @@
|
||||
struct VertexOutput {
|
||||
@builtin(position) position: vec4<f32>,
|
||||
@location(0) uv: vec2<f32>,
|
||||
}
|
||||
|
||||
struct Uniforms {
|
||||
rect: vec4<f32>,
|
||||
}
|
||||
|
||||
@group(0) @binding(0)
|
||||
var tex_y: texture_2d<f32>;
|
||||
|
||||
@group(0) @binding(1)
|
||||
var tex_uv: texture_2d<f32>;
|
||||
|
||||
@group(0) @binding(2)
|
||||
var s: sampler;
|
||||
|
||||
@group(0) @binding(3)
|
||||
var<uniform> uniforms: Uniforms;
|
||||
|
||||
@vertex
|
||||
fn vs_main(@builtin(vertex_index) in_vertex_index: u32) -> VertexOutput {
|
||||
var quad = array<vec4<f32>, 6>(
|
||||
vec4<f32>(uniforms.rect.xy, 0.0, 0.0),
|
||||
vec4<f32>(uniforms.rect.zy, 1.0, 0.0),
|
||||
vec4<f32>(uniforms.rect.xw, 0.0, 1.0),
|
||||
vec4<f32>(uniforms.rect.zy, 1.0, 0.0),
|
||||
vec4<f32>(uniforms.rect.zw, 1.0, 1.0),
|
||||
vec4<f32>(uniforms.rect.xw, 0.0, 1.0),
|
||||
);
|
||||
|
||||
var out: VertexOutput;
|
||||
out.uv = quad[in_vertex_index].zw;
|
||||
out.position = vec4<f32>(quad[in_vertex_index].xy, 1.0, 1.0);
|
||||
return out;
|
||||
}
|
||||
|
||||
@fragment
|
||||
fn fs_main(in: VertexOutput) -> @location(0) vec4<f32> {
|
||||
let yuv2r = vec3<f32>(1.164, 0.0, 1.596);
|
||||
let yuv2g = vec3<f32>(1.164, -0.391, -0.813);
|
||||
let yuv2b = vec3<f32>(1.164, 2.018, 0.0);
|
||||
|
||||
var yuv = vec3<f32>(0.0);
|
||||
yuv.x = textureSample(tex_y, s, in.uv).r - 0.0625;
|
||||
yuv.y = textureSample(tex_uv, s, in.uv).r - 0.5;
|
||||
yuv.z = textureSample(tex_uv, s, in.uv).g - 0.5;
|
||||
|
||||
var rgb = vec3<f32>(0.0);
|
||||
rgb.x = dot(yuv, yuv2r);
|
||||
rgb.y = dot(yuv, yuv2g);
|
||||
rgb.z = dot(yuv, yuv2b);
|
||||
|
||||
let threshold = rgb <= vec3<f32>(0.04045);
|
||||
let hi = pow((rgb + vec3<f32>(0.055)) / vec3<f32>(1.055), vec3<f32>(2.4));
|
||||
let lo = rgb * vec3<f32>(1.0 / 12.92);
|
||||
rgb = select(hi, lo, threshold);
|
||||
|
||||
return vec4<f32>(rgb, 1.0);
|
||||
}
|
||||
@@ -1,662 +0,0 @@
|
||||
use crate::Error;
|
||||
use gstreamer as gst;
|
||||
use gstreamer_app as gst_app;
|
||||
use gstreamer_app::prelude::*;
|
||||
use iced::widget::image as img;
|
||||
use std::num::NonZeroU8;
|
||||
use std::ops::{Deref, DerefMut};
|
||||
use std::sync::atomic::{AtomicBool, AtomicU64, Ordering};
|
||||
use std::sync::{Arc, Mutex, RwLock};
|
||||
use std::time::{Duration, Instant};
|
||||
|
||||
/// Position in the media.
|
||||
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)]
|
||||
pub enum Position {
|
||||
/// Position based on time.
|
||||
///
|
||||
/// Not the most accurate format for videos.
|
||||
Time(Duration),
|
||||
/// Position based on nth frame.
|
||||
Frame(u64),
|
||||
}
|
||||
|
||||
impl From<Position> for gst::GenericFormattedValue {
|
||||
fn from(pos: Position) -> Self {
|
||||
match pos {
|
||||
Position::Time(t) => gst::ClockTime::from_nseconds(t.as_nanos() as _).into(),
|
||||
Position::Frame(f) => gst::format::Default::from_u64(f).into(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl From<Duration> for Position {
|
||||
fn from(t: Duration) -> Self {
|
||||
Position::Time(t)
|
||||
}
|
||||
}
|
||||
|
||||
impl From<u64> for Position {
|
||||
fn from(f: u64) -> Self {
|
||||
Position::Frame(f)
|
||||
}
|
||||
}
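For reference, both conversions above can be exercised like this (a tiny sketch assuming `Position` from this module is in scope):

```rust
use std::time::Duration;

// Both forms below go through the `From` impls above.
fn position_examples() {
    let by_time: Position = Duration::from_secs(90).into(); // Position::Time(90 s)
    let by_frame: Position = 2160u64.into(); // Position::Frame(2160)

    assert!(matches!(by_time, Position::Time(_)));
    assert!(matches!(by_frame, Position::Frame(2160)));
}
```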
|
||||
|
||||
#[derive(Debug)]
|
||||
pub(crate) struct Frame(gst::Sample);
|
||||
|
||||
impl Frame {
|
||||
pub fn empty() -> Self {
|
||||
Self(gst::Sample::builder().build())
|
||||
}
|
||||
|
||||
pub fn readable(&self) -> Option<gst::BufferMap<'_, gst::buffer::Readable>> {
|
||||
self.0.buffer().and_then(|x| x.map_readable().ok())
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug)]
|
||||
pub(crate) struct Internal {
|
||||
pub(crate) id: u64,
|
||||
|
||||
pub(crate) bus: gst::Bus,
|
||||
pub(crate) source: gst::Pipeline,
|
||||
pub(crate) alive: Arc<AtomicBool>,
|
||||
pub(crate) worker: Option<std::thread::JoinHandle<()>>,
|
||||
|
||||
pub(crate) width: i32,
|
||||
pub(crate) height: i32,
|
||||
pub(crate) framerate: f64,
|
||||
pub(crate) duration: Duration,
|
||||
pub(crate) speed: f64,
|
||||
pub(crate) sync_av: bool,
|
||||
|
||||
pub(crate) frame: Arc<Mutex<Frame>>,
|
||||
pub(crate) upload_frame: Arc<AtomicBool>,
|
||||
pub(crate) last_frame_time: Arc<Mutex<Instant>>,
|
||||
pub(crate) looping: bool,
|
||||
pub(crate) is_eos: bool,
|
||||
pub(crate) restart_stream: bool,
|
||||
pub(crate) sync_av_avg: u64,
|
||||
pub(crate) sync_av_counter: u64,
|
||||
|
||||
pub(crate) subtitle_text: Arc<Mutex<Option<String>>>,
|
||||
pub(crate) upload_text: Arc<AtomicBool>,
|
||||
}
|
||||
|
||||
impl Internal {
|
||||
pub(crate) fn seek(&self, position: impl Into<Position>, accurate: bool) -> Result<(), Error> {
|
||||
let position = position.into();
|
||||
|
||||
// gstreamer complains if the start & end value types aren't the same
|
||||
match &position {
|
||||
Position::Time(_) => self.source.seek(
|
||||
self.speed,
|
||||
gst::SeekFlags::FLUSH
|
||||
| if accurate {
|
||||
gst::SeekFlags::ACCURATE
|
||||
} else {
|
||||
gst::SeekFlags::empty()
|
||||
},
|
||||
gst::SeekType::Set,
|
||||
gst::GenericFormattedValue::from(position),
|
||||
gst::SeekType::Set,
|
||||
gst::ClockTime::NONE,
|
||||
)?,
|
||||
Position::Frame(_) => self.source.seek(
|
||||
self.speed,
|
||||
gst::SeekFlags::FLUSH
|
||||
| if accurate {
|
||||
gst::SeekFlags::ACCURATE
|
||||
} else {
|
||||
gst::SeekFlags::empty()
|
||||
},
|
||||
gst::SeekType::Set,
|
||||
gst::GenericFormattedValue::from(position),
|
||||
gst::SeekType::Set,
|
||||
gst::format::Default::NONE,
|
||||
)?,
|
||||
};
|
||||
|
||||
*self.subtitle_text.lock().expect("lock subtitle_text") = None;
|
||||
self.upload_text.store(true, Ordering::SeqCst);
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub(crate) fn set_speed(&mut self, speed: f64) -> Result<(), Error> {
|
||||
let Some(position) = self.source.query_position::<gst::ClockTime>() else {
|
||||
return Err(Error::Caps);
|
||||
};
|
||||
if speed > 0.0 {
|
||||
self.source.seek(
|
||||
speed,
|
||||
gst::SeekFlags::FLUSH | gst::SeekFlags::ACCURATE,
|
||||
gst::SeekType::Set,
|
||||
position,
|
||||
gst::SeekType::End,
|
||||
gst::ClockTime::from_seconds(0),
|
||||
)?;
|
||||
} else {
|
||||
self.source.seek(
|
||||
speed,
|
||||
gst::SeekFlags::FLUSH | gst::SeekFlags::ACCURATE,
|
||||
gst::SeekType::Set,
|
||||
gst::ClockTime::from_seconds(0),
|
||||
gst::SeekType::Set,
|
||||
position,
|
||||
)?;
|
||||
}
|
||||
self.speed = speed;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub(crate) fn restart_stream(&mut self) -> Result<(), Error> {
|
||||
self.is_eos = false;
|
||||
self.set_paused(false);
|
||||
self.seek(0, false)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub(crate) fn set_paused(&mut self, paused: bool) {
|
||||
self.source
|
||||
.set_state(if paused {
|
||||
gst::State::Paused
|
||||
} else {
|
||||
gst::State::Playing
|
||||
})
|
||||
.unwrap(/* state was changed in ctor; state errors caught there */);
|
||||
|
||||
// Set restart_stream flag to make the stream restart on the next Message::NextFrame
|
||||
if self.is_eos && !paused {
|
||||
self.restart_stream = true;
|
||||
}
|
||||
}
|
||||
|
||||
pub(crate) fn paused(&self) -> bool {
|
||||
self.source.state(gst::ClockTime::ZERO).1 == gst::State::Paused
|
||||
}
|
||||
|
||||
/// Syncs audio with video to compensate for the (inevitable) latency in presenting the frame.
|
||||
pub(crate) fn set_av_offset(&mut self, offset: Duration) {
|
||||
if self.sync_av {
|
||||
self.sync_av_counter += 1;
|
||||
self.sync_av_avg = self.sync_av_avg * (self.sync_av_counter - 1) / self.sync_av_counter
|
||||
+ offset.as_nanos() as u64 / self.sync_av_counter;
|
||||
if self.sync_av_counter % 128 == 0 {
|
||||
self.source
|
||||
.set_property("av-offset", -(self.sync_av_avg as i64));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
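The `set_av_offset` method above keeps a cumulative running average of the frame presentation latency and, every 128 frames, pushes the negated average to `playbin`'s `av-offset` property. A standalone sketch of the same integer running-average update (names and sample values are illustrative; units are nanoseconds):

```rust
// Mirrors the arithmetic in `set_av_offset`: avg = avg * (n - 1) / n + sample / n.
// Integer division rounds down, which is fine at nanosecond resolution.
fn update_running_average(avg: u64, count: u64, sample_ns: u64) -> (u64, u64) {
    let count = count + 1;
    let avg = avg * (count - 1) / count + sample_ns / count;
    (avg, count)
}

fn main() {
    let (mut avg, mut count) = (0u64, 0u64);
    for sample in [8_000_000u64, 12_000_000, 10_000_000] {
        (avg, count) = update_running_average(avg, count, sample);
    }
    // Three samples around 10 ms average out to just under 10 ms.
    println!("average presentation latency: {avg} ns over {count} frames");
}
```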
|
||||
|
||||
/// A multimedia video loaded from a URI (e.g., a local file path or HTTP stream).
|
||||
#[derive(Debug)]
|
||||
pub struct Video(pub(crate) RwLock<Internal>);
|
||||
|
||||
impl Drop for Video {
|
||||
fn drop(&mut self) {
|
||||
let inner = self.0.get_mut().expect("failed to lock");
|
||||
|
||||
inner
|
||||
.source
|
||||
.set_state(gst::State::Null)
|
||||
.expect("failed to set state");
|
||||
|
||||
inner.alive.store(false, Ordering::SeqCst);
|
||||
if let Some(worker) = inner.worker.take() {
|
||||
if let Err(err) = worker.join() {
|
||||
match err.downcast_ref::<String>() {
|
||||
Some(e) => log::error!("Video thread panicked: {e}"),
|
||||
None => log::error!("Video thread panicked with unknown reason"),
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Video {
|
||||
/// Create a new video player from a given video which loads from `uri`.
|
||||
/// Note that live sources will report the duration to be zero.
|
||||
pub fn new(uri: &url::Url) -> Result<Self, Error> {
|
||||
gst::init()?;
|
||||
|
||||
let pipeline = format!("playbin uri=\"{}\" text-sink=\"appsink name=iced_text sync=true drop=true\" video-sink=\"videoscale ! videoconvert ! appsink name=iced_video drop=true caps=video/x-raw,format=NV12,pixel-aspect-ratio=1/1\"", uri.as_str());
|
||||
let pipeline = gst::parse::launch(pipeline.as_ref())?
|
||||
.downcast::<gst::Pipeline>()
|
||||
.map_err(|_| Error::Cast)?;
|
||||
|
||||
let video_sink: gst::Element = pipeline.property("video-sink");
|
||||
let pad = video_sink.pads().first().cloned().unwrap();
|
||||
let pad = pad.dynamic_cast::<gst::GhostPad>().unwrap();
|
||||
let bin = pad
|
||||
.parent_element()
|
||||
.unwrap()
|
||||
.downcast::<gst::Bin>()
|
||||
.unwrap();
|
||||
let video_sink = bin.by_name("iced_video").unwrap();
|
||||
let video_sink = video_sink.downcast::<gst_app::AppSink>().unwrap();
|
||||
|
||||
let text_sink: gst::Element = pipeline.property("text-sink");
|
||||
let text_sink = text_sink.downcast::<gst_app::AppSink>().unwrap();
|
||||
|
||||
Self::from_gst_pipeline(pipeline, video_sink, Some(text_sink))
|
||||
}
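A minimal usage sketch for `Video::new` (assumes the `url` crate and this module's `Video` are in scope; the file path is hypothetical):

```rust
fn open_example() {
    // Hypothetical local file; any URI that playbin understands works here.
    let uri = url::Url::parse("file:///tmp/example.mkv").expect("valid URI");
    let video = Video::new(&uri).expect("failed to open video");

    let (width, height) = video.size();
    println!(
        "{}x{} @ {:.2} fps, duration {:?}",
        width,
        height,
        video.framerate(),
        video.duration()
    );
}
```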
|
||||
|
||||
/// Creates a new video based on an existing GStreamer pipeline and appsink.
|
||||
/// Expects an `appsink` plugin with `caps=video/x-raw,format=NV12`.
|
||||
///
|
||||
/// An optional `text_sink` can be provided, which enables subtitle messages
|
||||
/// to be emitted.
|
||||
///
|
||||
/// **Note:** Many functions of [`Video`] assume a `playbin` pipeline.
|
||||
/// Non-`playbin` pipelines given here may not have full functionality.
|
||||
pub fn from_gst_pipeline(
|
||||
pipeline: gst::Pipeline,
|
||||
video_sink: gst_app::AppSink,
|
||||
text_sink: Option<gst_app::AppSink>,
|
||||
) -> Result<Self, Error> {
|
||||
gst::init()?;
|
||||
static NEXT_ID: AtomicU64 = AtomicU64::new(0);
|
||||
let id = NEXT_ID.fetch_add(1, Ordering::SeqCst);
|
||||
|
||||
// We need to ensure we stop the pipeline if we hit an error,
|
||||
// or else there may be audio left playing in the background.
|
||||
macro_rules! cleanup {
|
||||
($expr:expr) => {
|
||||
$expr.map_err(|e| {
|
||||
let _ = pipeline.set_state(gst::State::Null);
|
||||
e
|
||||
})
|
||||
};
|
||||
}
|
||||
|
||||
let pad = video_sink.pads().first().cloned().unwrap();
|
||||
|
||||
dbg!(&pad);
|
||||
dbg!(&pipeline);
|
||||
cleanup!(pipeline.set_state(gst::State::Playing))?;
|
||||
|
||||
// wait for up to 5 seconds until the decoder gets the source capabilities
|
||||
cleanup!(pipeline.state(gst::ClockTime::from_seconds(5)).0)?;
|
||||
|
||||
// extract resolution and framerate
|
||||
// TODO(jazzfool): maybe we want to extract some other information too?
|
||||
let caps = cleanup!(pad.current_caps().ok_or(Error::Caps))?;
|
||||
let s = cleanup!(caps.structure(0).ok_or(Error::Caps))?;
|
||||
let width = cleanup!(s.get::<i32>("width").map_err(|_| Error::Caps))?;
|
||||
let height = cleanup!(s.get::<i32>("height").map_err(|_| Error::Caps))?;
|
||||
// the width should be a multiple of 4; round it up
|
||||
let width = ((width + 4 - 1) / 4) * 4;
|
||||
let framerate = cleanup!(s.get::<gst::Fraction>("framerate").map_err(|_| Error::Caps))?;
|
||||
let framerate = framerate.numer() as f64 / framerate.denom() as f64;
|
||||
|
||||
if framerate.is_nan()
|
||||
|| framerate.is_infinite()
|
||||
|| framerate < 0.0
|
||||
|| framerate.abs() < f64::EPSILON
|
||||
{
|
||||
let _ = pipeline.set_state(gst::State::Null);
|
||||
return Err(Error::Framerate(framerate));
|
||||
}
|
||||
|
||||
let duration = Duration::from_nanos(
|
||||
pipeline
|
||||
.query_duration::<gst::ClockTime>()
|
||||
.map(|duration| duration.nseconds())
|
||||
.unwrap_or(0),
|
||||
);
|
||||
|
||||
let sync_av = pipeline.has_property("av-offset", None);
|
||||
|
||||
// NV12 = 12bpp
|
||||
let frame = Arc::new(Mutex::new(Frame::empty()));
|
||||
let upload_frame = Arc::new(AtomicBool::new(false));
|
||||
let alive = Arc::new(AtomicBool::new(true));
|
||||
let last_frame_time = Arc::new(Mutex::new(Instant::now()));
|
||||
|
||||
let frame_ref = Arc::clone(&frame);
|
||||
let upload_frame_ref = Arc::clone(&upload_frame);
|
||||
let alive_ref = Arc::clone(&alive);
|
||||
let last_frame_time_ref = Arc::clone(&last_frame_time);
|
||||
|
||||
let subtitle_text = Arc::new(Mutex::new(None));
|
||||
let upload_text = Arc::new(AtomicBool::new(false));
|
||||
let subtitle_text_ref = Arc::clone(&subtitle_text);
|
||||
let upload_text_ref = Arc::clone(&upload_text);
|
||||
|
||||
let pipeline_ref = pipeline.clone();
|
||||
|
||||
let worker = std::thread::spawn(move || {
|
||||
let mut clear_subtitles_at = None;
|
||||
|
||||
while alive_ref.load(Ordering::Acquire) {
|
||||
if let Err(gst::FlowError::Error) = (|| -> Result<(), gst::FlowError> {
|
||||
let sample =
|
||||
if pipeline_ref.state(gst::ClockTime::ZERO).1 != gst::State::Playing {
|
||||
video_sink
|
||||
.try_pull_preroll(gst::ClockTime::from_mseconds(16))
|
||||
.ok_or(gst::FlowError::Eos)?
|
||||
} else {
|
||||
video_sink
|
||||
.try_pull_sample(gst::ClockTime::from_mseconds(16))
|
||||
.ok_or(gst::FlowError::Eos)?
|
||||
};
|
||||
|
||||
*last_frame_time_ref
|
||||
.lock()
|
||||
.map_err(|_| gst::FlowError::Error)? = Instant::now();
|
||||
|
||||
let frame_segment = sample.segment().cloned().ok_or(gst::FlowError::Error)?;
|
||||
let buffer = sample.buffer().ok_or(gst::FlowError::Error)?;
|
||||
let frame_pts = buffer.pts().ok_or(gst::FlowError::Error)?;
|
||||
let frame_duration = buffer.duration().ok_or(gst::FlowError::Error)?;
|
||||
{
|
||||
let mut frame_guard =
|
||||
frame_ref.lock().map_err(|_| gst::FlowError::Error)?;
|
||||
*frame_guard = Frame(sample);
|
||||
}
|
||||
|
||||
upload_frame_ref.swap(true, Ordering::SeqCst);
|
||||
|
||||
if let Some(at) = clear_subtitles_at {
|
||||
if frame_pts >= at {
|
||||
*subtitle_text_ref
|
||||
.lock()
|
||||
.map_err(|_| gst::FlowError::Error)? = None;
|
||||
upload_text_ref.store(true, Ordering::SeqCst);
|
||||
clear_subtitles_at = None;
|
||||
}
|
||||
}
|
||||
|
||||
let text = text_sink
|
||||
.as_ref()
|
||||
.and_then(|sink| sink.try_pull_sample(gst::ClockTime::from_seconds(0)));
|
||||
if let Some(text) = text {
|
||||
let text_segment = text.segment().ok_or(gst::FlowError::Error)?;
|
||||
let text = text.buffer().ok_or(gst::FlowError::Error)?;
|
||||
let text_pts = text.pts().ok_or(gst::FlowError::Error)?;
|
||||
let text_duration = text.duration().ok_or(gst::FlowError::Error)?;
|
||||
|
||||
let frame_running_time = frame_segment.to_running_time(frame_pts).value();
|
||||
let frame_running_time_end = frame_segment
|
||||
.to_running_time(frame_pts + frame_duration)
|
||||
.value();
|
||||
|
||||
let text_running_time = text_segment.to_running_time(text_pts).value();
|
||||
let text_running_time_end = text_segment
|
||||
.to_running_time(text_pts + text_duration)
|
||||
.value();
|
||||
|
||||
// see gst-plugins-base/ext/pango/gstbasetextoverlay.c (gst_base_text_overlay_video_chain)
|
||||
// as an example of how to correctly synchronize the text+video segments
|
||||
if text_running_time_end > frame_running_time
|
||||
&& frame_running_time_end > text_running_time
|
||||
{
|
||||
let duration = text.duration().unwrap_or(gst::ClockTime::ZERO);
|
||||
let map = text.map_readable().map_err(|_| gst::FlowError::Error)?;
|
||||
|
||||
let text = std::str::from_utf8(map.as_slice())
|
||||
.map_err(|_| gst::FlowError::Error)?
|
||||
.to_string();
|
||||
*subtitle_text_ref
|
||||
.lock()
|
||||
.map_err(|_| gst::FlowError::Error)? = Some(text);
|
||||
upload_text_ref.store(true, Ordering::SeqCst);
|
||||
|
||||
clear_subtitles_at = Some(text_pts + duration);
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
})() {
|
||||
log::error!("error pulling frame");
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
Ok(Video(RwLock::new(Internal {
|
||||
id,
|
||||
|
||||
bus: pipeline.bus().unwrap(),
|
||||
source: pipeline,
|
||||
alive,
|
||||
worker: Some(worker),
|
||||
|
||||
width,
|
||||
height,
|
||||
framerate,
|
||||
duration,
|
||||
speed: 1.0,
|
||||
sync_av,
|
||||
|
||||
frame,
|
||||
upload_frame,
|
||||
last_frame_time,
|
||||
looping: false,
|
||||
is_eos: false,
|
||||
restart_stream: false,
|
||||
sync_av_avg: 0,
|
||||
sync_av_counter: 0,
|
||||
|
||||
subtitle_text,
|
||||
upload_text,
|
||||
})))
|
||||
}
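A hedged sketch of handing a custom, non-`playbin` pipeline to `from_gst_pipeline`. The element names and NV12 caps follow the expectations documented on the method; `Video`, `Error`, and the error-variant reuse here are simplifications, and, as noted above, `Video` methods that assume `playbin` will not apply:

```rust
use gstreamer as gst;
use gstreamer_app as gst_app;
use gstreamer_app::prelude::*;

fn test_pattern_video() -> Result<Video, Error> {
    gst::init()?;

    // Simple test-source pipeline ending in an NV12 appsink.
    let pipeline = gst::parse::launch(
        "videotestsrc ! videoconvert ! appsink name=iced_video \
         caps=video/x-raw,format=NV12,pixel-aspect-ratio=1/1",
    )?
    .downcast::<gst::Pipeline>()
    .map_err(|_| Error::Cast)?;

    let video_sink = pipeline
        .by_name("iced_video")
        .ok_or(Error::Cast)? // reusing Error::Cast here is a simplification
        .downcast::<gst_app::AppSink>()
        .map_err(|_| Error::Cast)?;

    // No text sink, so subtitle messages are never emitted.
    Video::from_gst_pipeline(pipeline, video_sink, None)
}
```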
|
||||
|
||||
pub(crate) fn read(&self) -> impl Deref<Target = Internal> + '_ {
|
||||
self.0.read().expect("lock")
|
||||
}
|
||||
|
||||
pub(crate) fn write(&self) -> impl DerefMut<Target = Internal> + '_ {
|
||||
self.0.write().expect("lock")
|
||||
}
|
||||
|
||||
pub(crate) fn get_mut(&mut self) -> impl DerefMut<Target = Internal> + '_ {
|
||||
self.0.get_mut().expect("lock")
|
||||
}
|
||||
|
||||
/// Get the size/resolution of the video as `(width, height)`.
|
||||
pub fn size(&self) -> (i32, i32) {
|
||||
(self.read().width, self.read().height)
|
||||
}
|
||||
|
||||
/// Get the framerate of the video as frames per second.
|
||||
pub fn framerate(&self) -> f64 {
|
||||
self.read().framerate
|
||||
}
|
||||
|
||||
/// Set the volume multiplier of the audio.
|
||||
/// `0.0` = 0% volume, `1.0` = 100% volume.
|
||||
///
|
||||
/// This uses a linear scale, for example `0.5` is perceived as half as loud.
|
||||
pub fn set_volume(&mut self, volume: f64) {
|
||||
self.get_mut().source.set_property("volume", volume);
|
||||
self.set_muted(self.muted()); // for some reason gstreamer unmutes when changing volume?
|
||||
}
|
||||
|
||||
/// Get the volume multiplier of the audio.
|
||||
pub fn volume(&self) -> f64 {
|
||||
self.read().source.property("volume")
|
||||
}
|
||||
|
||||
/// Set if the audio is muted or not, without changing the volume.
|
||||
pub fn set_muted(&mut self, muted: bool) {
|
||||
self.get_mut().source.set_property("mute", muted);
|
||||
}
|
||||
|
||||
/// Get if the audio is muted or not.
|
||||
pub fn muted(&self) -> bool {
|
||||
self.read().source.property("mute")
|
||||
}
|
||||
|
||||
/// Get if the stream ended or not.
|
||||
pub fn eos(&self) -> bool {
|
||||
self.read().is_eos
|
||||
}
|
||||
|
||||
/// Get if the media will loop or not.
|
||||
pub fn looping(&self) -> bool {
|
||||
self.read().looping
|
||||
}
|
||||
|
||||
/// Set if the media will loop or not.
|
||||
pub fn set_looping(&mut self, looping: bool) {
|
||||
self.get_mut().looping = looping;
|
||||
}
|
||||
|
||||
/// Set if the media is paused or not.
|
||||
pub fn set_paused(&mut self, paused: bool) {
|
||||
self.get_mut().set_paused(paused)
|
||||
}
|
||||
|
||||
/// Get if the media is paused or not.
|
||||
pub fn paused(&self) -> bool {
|
||||
self.read().paused()
|
||||
}
|
||||
|
||||
/// Jumps to a specific position in the media.
|
||||
/// Passing `true` to the `accurate` parameter will result in more accurate seeking,
|
||||
/// but it is also slower. For most seeks (e.g., scrubbing) this is not needed.
|
||||
pub fn seek(&mut self, position: impl Into<Position>, accurate: bool) -> Result<(), Error> {
|
||||
self.get_mut().seek(position, accurate)
|
||||
}
|
||||
|
||||
/// Set the playback speed of the media.
|
||||
/// The default speed is `1.0`.
|
||||
pub fn set_speed(&mut self, speed: f64) -> Result<(), Error> {
|
||||
self.get_mut().set_speed(speed)
|
||||
}
|
||||
|
||||
/// Get the current playback speed.
|
||||
pub fn speed(&self) -> f64 {
|
||||
self.read().speed
|
||||
}
|
||||
|
||||
/// Get the current playback position in time.
|
||||
pub fn position(&self) -> Duration {
|
||||
Duration::from_nanos(
|
||||
self.read()
|
||||
.source
|
||||
.query_position::<gst::ClockTime>()
|
||||
.map_or(0, |pos| pos.nseconds()),
|
||||
)
|
||||
}
|
||||
|
||||
/// Get the media duration.
|
||||
pub fn duration(&self) -> Duration {
|
||||
self.read().duration
|
||||
}
|
||||
|
||||
/// Restarts a stream; seeks to the first frame, unpauses, and resets the `eos` flag to false.
|
||||
pub fn restart_stream(&mut self) -> Result<(), Error> {
|
||||
self.get_mut().restart_stream()
|
||||
}
|
||||
|
||||
/// Set the subtitle URL to display.
|
||||
pub fn set_subtitle_url(&mut self, url: &url::Url) -> Result<(), Error> {
|
||||
let paused = self.paused();
|
||||
let mut inner = self.get_mut();
|
||||
inner.source.set_state(gst::State::Ready)?;
|
||||
inner.source.set_property("suburi", url.as_str());
|
||||
inner.set_paused(paused);
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Get the current subtitle URL.
|
||||
pub fn subtitle_url(&self) -> Option<url::Url> {
|
||||
url::Url::parse(
|
||||
&self
|
||||
.read()
|
||||
.source
|
||||
.property::<Option<String>>("current-suburi")?,
|
||||
)
|
||||
.ok()
|
||||
}
|
||||
|
||||
/// Get the underlying GStreamer pipeline.
|
||||
pub fn pipeline(&self) -> gst::Pipeline {
|
||||
self.read().source.clone()
|
||||
}
|
||||
|
||||
/// Generates a list of thumbnails based on a set of positions in the media, downscaled by a given factor.
|
||||
///
|
||||
/// Slow; only needs to be called once for each instance.
|
||||
/// It's best to call this at the very start of playback, otherwise the position may shift.
|
||||
pub fn thumbnails<I>(
|
||||
&mut self,
|
||||
positions: I,
|
||||
downscale: NonZeroU8,
|
||||
) -> Result<Vec<img::Handle>, Error>
|
||||
where
|
||||
I: IntoIterator<Item = Position>,
|
||||
{
|
||||
let downscale = u8::from(downscale) as u32;
|
||||
|
||||
let paused = self.paused();
|
||||
let muted = self.muted();
|
||||
let pos = self.position();
|
||||
|
||||
self.set_paused(false);
|
||||
self.set_muted(true);
|
||||
|
||||
let out = {
|
||||
let inner = self.read();
|
||||
let width = inner.width;
|
||||
let height = inner.height;
|
||||
positions
|
||||
.into_iter()
|
||||
.map(|pos| {
|
||||
inner.seek(pos, true)?;
|
||||
inner.upload_frame.store(false, Ordering::SeqCst);
|
||||
while !inner.upload_frame.load(Ordering::SeqCst) {
|
||||
std::hint::spin_loop();
|
||||
}
|
||||
let frame_guard = inner.frame.lock().map_err(|_| Error::Lock)?;
|
||||
let frame = frame_guard.readable().ok_or(Error::Lock)?;
|
||||
|
||||
Ok(img::Handle::from_rgba(
|
||||
inner.width as u32 / downscale,
|
||||
inner.height as u32 / downscale,
|
||||
yuv_to_rgba(frame.as_slice(), width as _, height as _, downscale),
|
||||
))
|
||||
})
|
||||
.collect()
|
||||
};
|
||||
|
||||
self.set_paused(paused);
|
||||
self.set_muted(muted);
|
||||
self.seek(pos, true)?;
|
||||
|
||||
out
|
||||
}
|
||||
}
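A small sketch of the `thumbnails` API above: three evenly spaced thumbnails at half resolution. It assumes this module's `Video`, `Position`, `Error`, and `img` alias are in scope; the positions and downscale factor are arbitrary example values.

```rust
use std::num::NonZeroU8;

fn grab_thumbnails(video: &mut Video) -> Result<Vec<img::Handle>, Error> {
    let total = video.duration();
    // Sample at 1/4, 2/4 and 3/4 of the reported duration.
    let positions = (1..=3u32).map(|i| Position::Time(total * i / 4));
    video.thumbnails(positions, NonZeroU8::new(2).expect("non-zero downscale"))
}
```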
|
||||
|
||||
fn yuv_to_rgba(yuv: &[u8], width: u32, height: u32, downscale: u32) -> Vec<u8> {
|
||||
let uv_start = width * height;
|
||||
let mut rgba = vec![];
|
||||
|
||||
for y in 0..height / downscale {
|
||||
for x in 0..width / downscale {
|
||||
let x_src = x * downscale;
|
||||
let y_src = y * downscale;
|
||||
|
||||
let uv_i = uv_start + width * (y_src / 2) + x_src / 2 * 2;
|
||||
|
||||
let y = yuv[(y_src * width + x_src) as usize] as f32;
|
||||
let u = yuv[uv_i as usize] as f32;
|
||||
let v = yuv[(uv_i + 1) as usize] as f32;
|
||||
|
||||
let r = 1.164 * (y - 16.0) + 1.596 * (v - 128.0);
|
||||
let g = 1.164 * (y - 16.0) - 0.813 * (v - 128.0) - 0.391 * (u - 128.0);
|
||||
let b = 1.164 * (y - 16.0) + 2.018 * (u - 128.0);
|
||||
|
||||
rgba.push(r as u8);
|
||||
rgba.push(g as u8);
|
||||
rgba.push(b as u8);
|
||||
rgba.push(0xFF);
|
||||
}
|
||||
}
|
||||
|
||||
rgba
|
||||
}
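As a sanity check on the BT.601 limited-range constants used above: video white (Y=235, U=V=128) should land close to RGB 255 and video black (Y=16) at 0. A standalone check (like the function above, it relies on Rust's saturating float-to-int casts instead of explicit clamping):

```rust
// Single-pixel version of the conversion in `yuv_to_rgba`.
fn yuv_pixel_to_rgb(y: f32, u: f32, v: f32) -> (u8, u8, u8) {
    let r = 1.164 * (y - 16.0) + 1.596 * (v - 128.0);
    let g = 1.164 * (y - 16.0) - 0.813 * (v - 128.0) - 0.391 * (u - 128.0);
    let b = 1.164 * (y - 16.0) + 2.018 * (u - 128.0);
    (r as u8, g as u8, b as u8) // `as u8` saturates, so out-of-range values clamp to 0..=255
}

fn main() {
    assert_eq!(yuv_pixel_to_rgb(235.0, 128.0, 128.0), (254, 254, 254)); // ~white
    assert_eq!(yuv_pixel_to_rgb(16.0, 128.0, 128.0), (0, 0, 0)); // black
}
```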
|
||||
@@ -1,305 +0,0 @@
|
||||
use crate::{pipeline::VideoPrimitive, video::Video};
|
||||
use gstreamer as gst;
|
||||
use iced::{
|
||||
advanced::{self, layout, widget, Widget},
|
||||
Element,
|
||||
};
|
||||
use iced_wgpu::primitive::Renderer as PrimitiveRenderer;
|
||||
use log::error;
|
||||
use std::{marker::PhantomData, sync::atomic::Ordering};
|
||||
use std::{sync::Arc, time::Instant};
|
||||
|
||||
/// Video player widget which displays the current frame of a [`Video`](crate::Video).
|
||||
pub struct VideoPlayer<'a, Message, Theme = iced::Theme, Renderer = iced::Renderer>
|
||||
where
|
||||
Renderer: PrimitiveRenderer,
|
||||
{
|
||||
video: &'a Video,
|
||||
content_fit: iced::ContentFit,
|
||||
width: iced::Length,
|
||||
height: iced::Length,
|
||||
on_end_of_stream: Option<Message>,
|
||||
on_new_frame: Option<Message>,
|
||||
on_subtitle_text: Option<Box<dyn Fn(Option<String>) -> Message + 'a>>,
|
||||
on_error: Option<Box<dyn Fn(&glib::Error) -> Message + 'a>>,
|
||||
_phantom: PhantomData<(Theme, Renderer)>,
|
||||
}
|
||||
|
||||
impl<'a, Message, Theme, Renderer> VideoPlayer<'a, Message, Theme, Renderer>
|
||||
where
|
||||
Renderer: PrimitiveRenderer,
|
||||
{
|
||||
/// Creates a new video player widget for a given video.
|
||||
pub fn new(video: &'a Video) -> Self {
|
||||
VideoPlayer {
|
||||
video,
|
||||
content_fit: iced::ContentFit::default(),
|
||||
width: iced::Length::Shrink,
|
||||
height: iced::Length::Shrink,
|
||||
on_end_of_stream: None,
|
||||
on_new_frame: None,
|
||||
on_subtitle_text: None,
|
||||
on_error: None,
|
||||
_phantom: Default::default(),
|
||||
}
|
||||
}
|
||||
|
||||
/// Sets the width of the `VideoPlayer` boundaries.
|
||||
pub fn width(self, width: impl Into<iced::Length>) -> Self {
|
||||
VideoPlayer {
|
||||
width: width.into(),
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
/// Sets the height of the `VideoPlayer` boundaries.
|
||||
pub fn height(self, height: impl Into<iced::Length>) -> Self {
|
||||
VideoPlayer {
|
||||
height: height.into(),
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
/// Sets the `ContentFit` of the `VideoPlayer`.
|
||||
pub fn content_fit(self, content_fit: iced::ContentFit) -> Self {
|
||||
VideoPlayer {
|
||||
content_fit,
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
/// Message to send when the video reaches the end of stream (i.e., the video ends).
|
||||
pub fn on_end_of_stream(self, on_end_of_stream: Message) -> Self {
|
||||
VideoPlayer {
|
||||
on_end_of_stream: Some(on_end_of_stream),
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
/// Message to send when the video receives a new frame.
|
||||
pub fn on_new_frame(self, on_new_frame: Message) -> Self {
|
||||
VideoPlayer {
|
||||
on_new_frame: Some(on_new_frame),
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
/// Message to send when the subtitle text changes (or is cleared).
|
||||
pub fn on_subtitle_text<F>(self, on_subtitle_text: F) -> Self
|
||||
where
|
||||
F: 'a + Fn(Option<String>) -> Message,
|
||||
{
|
||||
VideoPlayer {
|
||||
on_subtitle_text: Some(Box::new(on_subtitle_text)),
|
||||
..self
|
||||
}
|
||||
}
|
||||
|
||||
/// Message to send when the video playback encounters an error.
|
||||
pub fn on_error<F>(self, on_error: F) -> Self
|
||||
where
|
||||
F: 'a + Fn(&glib::Error) -> Message,
|
||||
{
|
||||
VideoPlayer {
|
||||
on_error: Some(Box::new(on_error)),
|
||||
..self
|
||||
}
|
||||
}
|
||||
}
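A minimal sketch of wiring the widget into an iced view. The `Message` enum and `player_view` helper are hypothetical, and it assumes the default `iced::Renderer` satisfies the widget's `PrimitiveRenderer` bound (which the type defaults above are built around):

```rust
#[derive(Clone)]
enum Message {
    EndOfStream,
    NewFrame,
    Subtitle(Option<String>),
}

fn player_view(video: &Video) -> iced::Element<'_, Message> {
    VideoPlayer::new(video)
        .width(iced::Length::Fill)
        .height(iced::Length::Fill)
        .content_fit(iced::ContentFit::Contain)
        .on_end_of_stream(Message::EndOfStream)
        .on_new_frame(Message::NewFrame)
        .on_subtitle_text(Message::Subtitle)
        .into()
}
```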
|
||||
|
||||
impl<Message, Theme, Renderer> Widget<Message, Theme, Renderer>
|
||||
for VideoPlayer<'_, Message, Theme, Renderer>
|
||||
where
|
||||
Message: Clone,
|
||||
Renderer: PrimitiveRenderer,
|
||||
{
|
||||
fn size(&self) -> iced::Size<iced::Length> {
|
||||
iced::Size {
|
||||
width: iced::Length::Shrink,
|
||||
height: iced::Length::Shrink,
|
||||
}
|
||||
}
|
||||
|
||||
fn layout(
|
||||
&mut self,
|
||||
_tree: &mut widget::Tree,
|
||||
_renderer: &Renderer,
|
||||
limits: &layout::Limits,
|
||||
) -> layout::Node {
|
||||
let (video_width, video_height) = self.video.size();
|
||||
|
||||
// based on `Image::layout`
|
||||
let image_size = iced::Size::new(video_width as f32, video_height as f32);
|
||||
let raw_size = limits.resolve(self.width, self.height, image_size);
|
||||
let full_size = self.content_fit.fit(image_size, raw_size);
|
||||
let final_size = iced::Size {
|
||||
width: match self.width {
|
||||
iced::Length::Shrink => f32::min(raw_size.width, full_size.width),
|
||||
_ => raw_size.width,
|
||||
},
|
||||
height: match self.height {
|
||||
iced::Length::Shrink => f32::min(raw_size.height, full_size.height),
|
||||
_ => raw_size.height,
|
||||
},
|
||||
};
|
||||
|
||||
layout::Node::new(final_size)
|
||||
}
|
||||
|
||||
fn draw(
|
||||
&self,
|
||||
_tree: &widget::Tree,
|
||||
renderer: &mut Renderer,
|
||||
_theme: &Theme,
|
||||
_style: &advanced::renderer::Style,
|
||||
layout: advanced::Layout<'_>,
|
||||
_cursor: advanced::mouse::Cursor,
|
||||
_viewport: &iced::Rectangle,
|
||||
) {
|
||||
let mut inner = self.video.write();
|
||||
|
||||
// bounds based on `Image::draw`
|
||||
let image_size = iced::Size::new(inner.width as f32, inner.height as f32);
|
||||
let bounds = layout.bounds();
|
||||
let adjusted_fit = self.content_fit.fit(image_size, bounds.size());
|
||||
let scale = iced::Vector::new(
|
||||
adjusted_fit.width / image_size.width,
|
||||
adjusted_fit.height / image_size.height,
|
||||
);
|
||||
let final_size = image_size * scale;
|
||||
|
||||
let position = match self.content_fit {
|
||||
iced::ContentFit::None => iced::Point::new(
|
||||
bounds.x + (image_size.width - adjusted_fit.width) / 2.0,
|
||||
bounds.y + (image_size.height - adjusted_fit.height) / 2.0,
|
||||
),
|
||||
_ => iced::Point::new(
|
||||
bounds.center_x() - final_size.width / 2.0,
|
||||
bounds.center_y() - final_size.height / 2.0,
|
||||
),
|
||||
};
|
||||
|
||||
let drawing_bounds = iced::Rectangle::new(position, final_size);
|
||||
|
||||
let upload_frame = inner.upload_frame.swap(false, Ordering::SeqCst);
|
||||
|
||||
if upload_frame {
|
||||
let last_frame_time = inner
|
||||
.last_frame_time
|
||||
.lock()
|
||||
.map(|time| *time)
|
||||
.unwrap_or_else(|_| Instant::now());
|
||||
inner.set_av_offset(Instant::now() - last_frame_time);
|
||||
}
|
||||
|
||||
let render = |renderer: &mut Renderer| {
|
||||
renderer.draw_primitive(
|
||||
drawing_bounds,
|
||||
VideoPrimitive::new(
|
||||
inner.id,
|
||||
Arc::clone(&inner.alive),
|
||||
Arc::clone(&inner.frame),
|
||||
(inner.width as _, inner.height as _),
|
||||
upload_frame,
|
||||
),
|
||||
);
|
||||
};
|
||||
|
||||
if adjusted_fit.width > bounds.width || adjusted_fit.height > bounds.height {
|
||||
renderer.with_layer(bounds, render);
|
||||
} else {
|
||||
render(renderer);
|
||||
}
|
||||
}
|
||||
|
||||
fn update(
|
||||
&mut self,
|
||||
_state: &mut widget::Tree,
|
||||
event: &iced::Event,
|
||||
_layout: advanced::Layout<'_>,
|
||||
_cursor: advanced::mouse::Cursor,
|
||||
_renderer: &Renderer,
|
||||
_clipboard: &mut dyn advanced::Clipboard,
|
||||
shell: &mut advanced::Shell<'_, Message>,
|
||||
_viewport: &iced::Rectangle,
|
||||
) {
|
||||
let mut inner = self.video.write();
|
||||
|
||||
if let iced::Event::Window(iced::window::Event::RedrawRequested(_)) = event {
|
||||
if inner.restart_stream || (!inner.is_eos && !inner.paused()) {
|
||||
let mut restart_stream = false;
|
||||
if inner.restart_stream {
|
||||
restart_stream = true;
|
||||
// Set flag to false to avoid potentially multiple seeks
|
||||
inner.restart_stream = false;
|
||||
}
|
||||
let mut eos_pause = false;
|
||||
|
||||
while let Some(msg) = inner
|
||||
.bus
|
||||
.pop_filtered(&[gst::MessageType::Error, gst::MessageType::Eos])
|
||||
{
|
||||
match msg.view() {
|
||||
gst::MessageView::Error(err) => {
|
||||
error!("bus returned an error: {err}");
|
||||
if let Some(ref on_error) = self.on_error {
|
||||
shell.publish(on_error(&err.error()))
|
||||
};
|
||||
}
|
||||
gst::MessageView::Eos(_eos) => {
|
||||
if let Some(on_end_of_stream) = self.on_end_of_stream.clone() {
|
||||
shell.publish(on_end_of_stream);
|
||||
}
|
||||
if inner.looping {
|
||||
restart_stream = true;
|
||||
} else {
|
||||
eos_pause = true;
|
||||
}
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
|
||||
// Don't run eos_pause if restart_stream is true; fixes "pausing" after restarting a stream
|
||||
if restart_stream {
|
||||
if let Err(err) = inner.restart_stream() {
|
||||
error!("cannot restart stream (can't seek): {err:#?}");
|
||||
}
|
||||
} else if eos_pause {
|
||||
inner.is_eos = true;
|
||||
inner.set_paused(true);
|
||||
}
|
||||
|
||||
if inner.upload_frame.load(Ordering::SeqCst) {
|
||||
if let Some(on_new_frame) = self.on_new_frame.clone() {
|
||||
shell.publish(on_new_frame);
|
||||
}
|
||||
}
|
||||
|
||||
if let Some(on_subtitle_text) = &self.on_subtitle_text {
|
||||
if inner.upload_text.swap(false, Ordering::SeqCst) {
|
||||
if let Ok(text) = inner.subtitle_text.try_lock() {
|
||||
shell.publish(on_subtitle_text(text.clone()));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
shell.request_redraw();
|
||||
} else {
|
||||
shell.request_redraw();
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, Message, Theme, Renderer> From<VideoPlayer<'a, Message, Theme, Renderer>>
|
||||
for Element<'a, Message, Theme, Renderer>
|
||||
where
|
||||
Message: 'a + Clone,
|
||||
Theme: 'a,
|
||||
Renderer: 'a + PrimitiveRenderer,
|
||||
{
|
||||
fn from(video_player: VideoPlayer<'a, Message, Theme, Renderer>) -> Self {
|
||||
Self::new(video_player)
|
||||
}
|
||||
}
|
||||
10 deny.toml
@@ -92,7 +92,15 @@ allow = [
|
||||
"MIT",
|
||||
"Apache-2.0",
|
||||
"Unicode-3.0",
|
||||
#"Apache-2.0 WITH LLVM-exception",
|
||||
"BSD-2-Clause",
|
||||
"BSD-3-Clause",
|
||||
"Apache-2.0 WITH LLVM-exception",
|
||||
"Zlib",
|
||||
"ISC",
|
||||
"NCSA",
|
||||
"CC0-1.0",
|
||||
"BSL-1.0",
|
||||
# "LGPL",
|
||||
]
|
||||
# The confidence threshold for detecting a license from license text.
|
||||
# The higher the value, the more closely the license text must be to the
|
||||
|
||||
2 examples/hdr-gstreamer-wgpu/.gitignore vendored Normal file
@@ -0,0 +1,2 @@
|
||||
perf*
|
||||
target/
|
||||
20 examples/hdr-gstreamer-wgpu/Cargo.toml Normal file
@@ -0,0 +1,20 @@
|
||||
[package]
|
||||
name = "hdr-gstreamer-wgpu"
|
||||
version = "0.1.0"
|
||||
edition = "2024"
|
||||
|
||||
[dependencies]
|
||||
# gst = { workspace = true }
|
||||
wgpu = "27"
|
||||
gstreamer = { version = "0.24.4", features = ["v1_26"] }
|
||||
gstreamer-app = { version = "0.24.4", features = ["v1_26"] }
|
||||
gstreamer-base = { version = "0.24.4", features = ["v1_26"] }
|
||||
gstreamer-video = { version = "0.24.4", features = ["v1_26"] }
|
||||
winit = { version = "*", features = ["wayland"] }
|
||||
anyhow = "*"
|
||||
pollster = "0.4.0"
|
||||
tracing = { version = "0.1.43", features = ["log"] }
|
||||
tracing-subscriber = "0.3.22"
|
||||
|
||||
[profile.release]
|
||||
debug = true
|
||||
0 examples/hdr-gstreamer-wgpu/src/cli.rs Normal file
592 examples/hdr-gstreamer-wgpu/src/main.rs Normal file
@@ -0,0 +1,592 @@
|
||||
use std::sync::Arc;
|
||||
|
||||
use gstreamer as gst;
|
||||
use gstreamer_app as gst_app;
|
||||
|
||||
use anyhow::{Context, Result};
|
||||
use winit::{
|
||||
application::ApplicationHandler,
|
||||
event::*,
|
||||
event_loop::{ActiveEventLoop, EventLoop},
|
||||
keyboard::*,
|
||||
window::Window,
|
||||
};
|
||||
|
||||
pub struct App {
|
||||
state: Option<State>,
|
||||
}
|
||||
|
||||
impl App {
|
||||
pub fn new() -> Self {
|
||||
Self { state: None }
|
||||
}
|
||||
}
|
||||
|
||||
pub trait HdrTextureFormatExt {
|
||||
fn is_hdr_format(&self) -> bool;
|
||||
}
|
||||
|
||||
impl HdrTextureFormatExt for wgpu::TextureFormat {
|
||||
fn is_hdr_format(&self) -> bool {
|
||||
matches!(
|
||||
self,
|
||||
wgpu::TextureFormat::Rgba16Float
|
||||
| wgpu::TextureFormat::Rgba32Float
|
||||
| wgpu::TextureFormat::Rgb10a2Unorm
|
||||
)
|
||||
}
|
||||
}
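The surface-format selection in `State::new` below is essentially this lookup; a minimal sketch using the helper trait above (assuming it is in scope):

```rust
// Returns the first HDR-capable format advertised by the surface, if any.
fn first_hdr_format(caps: &wgpu::SurfaceCapabilities) -> Option<wgpu::TextureFormat> {
    caps.formats.iter().copied().find(|f| f.is_hdr_format())
}
```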
|
||||
|
||||
pub struct State {
|
||||
window: Arc<Window>,
|
||||
gst: Video,
|
||||
surface: wgpu::Surface<'static>,
|
||||
video_texture: wgpu::Texture,
|
||||
device: wgpu::Device,
|
||||
queue: wgpu::Queue,
|
||||
config: wgpu::SurfaceConfiguration,
|
||||
pipeline: wgpu::RenderPipeline,
|
||||
bind_group: wgpu::BindGroup,
|
||||
is_surface_initialized: bool,
|
||||
}
|
||||
|
||||
impl State {
|
||||
async fn new(window: Arc<Window>) -> Result<State> {
|
||||
let instance = wgpu::Instance::default();
|
||||
let surface = instance
|
||||
.create_surface(window.clone())
|
||||
.context("Failed to create wgpu surface")?;
|
||||
|
||||
let adapter = instance
|
||||
.request_adapter(&wgpu::RequestAdapterOptions {
|
||||
power_preference: wgpu::PowerPreference::HighPerformance,
|
||||
compatible_surface: Some(&surface),
|
||||
force_fallback_adapter: false,
|
||||
})
|
||||
.await
|
||||
.context("Failed to request wgpu adapter")?;
|
||||
|
||||
let (device, queue) = adapter
|
||||
.request_device(&wgpu::DeviceDescriptor {
|
||||
label: None,
|
||||
required_features: wgpu::Features::empty(),
|
||||
required_limits: wgpu::Limits::default(),
|
||||
memory_hints: wgpu::MemoryHints::default(),
|
||||
..Default::default()
|
||||
})
|
||||
.await
|
||||
.context("Failed to request wgpu device")?;
|
||||
let surface_caps = surface.get_capabilities(&adapter);
|
||||
tracing::info!("Caps: {:#?}", &surface_caps);
|
||||
let surface_format = surface_caps
|
||||
.formats
|
||||
.iter()
|
||||
.rev() // the float formats come first; reverse so the 10-bit format (matching the RGB10A2_LE caps below) is picked
|
||||
.find(|f| f.is_hdr_format())
|
||||
.expect("HDR format not supported")
|
||||
.clone();
|
||||
tracing::info!("Using surface format: {:?}", surface_format);
|
||||
let size = window.inner_size();
|
||||
let config = wgpu::SurfaceConfiguration {
|
||||
usage: wgpu::TextureUsages::RENDER_ATTACHMENT,
|
||||
format: surface_format,
|
||||
width: size.width,
|
||||
height: size.height,
|
||||
present_mode: surface_caps.present_modes[0],
|
||||
alpha_mode: surface_caps.alpha_modes[0],
|
||||
view_formats: vec![],
|
||||
desired_maximum_frame_latency: 2, // allow the CPU to run up to 2 frames ahead of presentation
|
||||
};
|
||||
surface.configure(&device, &config);
|
||||
|
||||
let shader = device.create_shader_module(wgpu::include_wgsl!("shader.wgsl"));
|
||||
|
||||
let texture_bind_group_layout =
|
||||
device.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
|
||||
label: Some("texture_bind_group_layout"),
|
||||
entries: &[
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 0,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Texture {
|
||||
multisampled: false,
|
||||
view_dimension: wgpu::TextureViewDimension::D2,
|
||||
sample_type: wgpu::TextureSampleType::Float { filterable: true },
|
||||
},
|
||||
count: None,
|
||||
},
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 1,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
|
||||
count: None,
|
||||
},
|
||||
],
|
||||
});
|
||||
|
||||
let render_pipeline_layout =
|
||||
device.create_pipeline_layout(&wgpu::PipelineLayoutDescriptor {
|
||||
label: Some("Jello Render Pipeline Layout"),
|
||||
bind_group_layouts: &[&texture_bind_group_layout],
|
||||
push_constant_ranges: &[],
|
||||
});
|
||||
|
||||
let render_pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
|
||||
label: Some("Jello Render Pipeline"),
|
||||
layout: Some(&render_pipeline_layout),
|
||||
vertex: wgpu::VertexState {
|
||||
module: &shader,
|
||||
entry_point: Some("vs_main"),
|
||||
buffers: &[],
|
||||
compilation_options: wgpu::PipelineCompilationOptions::default(),
|
||||
},
|
||||
fragment: Some(wgpu::FragmentState {
|
||||
module: &shader,
|
||||
entry_point: Some("fs_main"),
|
||||
compilation_options: wgpu::PipelineCompilationOptions::default(),
|
||||
targets: &[Some(wgpu::ColorTargetState {
|
||||
format: surface_format,
|
||||
blend: Some(wgpu::BlendState::REPLACE),
|
||||
write_mask: wgpu::ColorWrites::ALL,
|
||||
})],
|
||||
}),
|
||||
primitive: wgpu::PrimitiveState::default(),
|
||||
depth_stencil: None,
|
||||
multisample: wgpu::MultisampleState {
|
||||
count: 1,
|
||||
mask: !0,
|
||||
alpha_to_coverage_enabled: false,
|
||||
},
|
||||
multiview: None,
|
||||
cache: None,
|
||||
});
|
||||
|
||||
let texture_size = wgpu::Extent3d {
|
||||
width: size.width,
|
||||
height: size.height,
|
||||
depth_or_array_layers: 1,
|
||||
};
|
||||
let video_texture = device.create_texture(&wgpu::TextureDescriptor {
|
||||
size: texture_size,
|
||||
mip_level_count: 1,
|
||||
sample_count: 1,
|
||||
dimension: wgpu::TextureDimension::D2,
|
||||
format: surface_format,
|
||||
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
|
||||
label: Some("Jello Video Texture"),
|
||||
view_formats: &[],
|
||||
});
|
||||
|
||||
// TODO: Use a better sampler
|
||||
let sampler = device.create_sampler(&wgpu::SamplerDescriptor {
|
||||
label: Some("texture_sampler"),
|
||||
address_mode_u: wgpu::AddressMode::ClampToEdge,
|
||||
address_mode_v: wgpu::AddressMode::ClampToEdge,
|
||||
address_mode_w: wgpu::AddressMode::ClampToEdge,
|
||||
mag_filter: wgpu::FilterMode::Linear,
|
||||
min_filter: wgpu::FilterMode::Linear,
|
||||
mipmap_filter: wgpu::FilterMode::Nearest,
|
||||
..Default::default()
|
||||
});
|
||||
|
||||
let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
|
||||
layout: &texture_bind_group_layout,
|
||||
entries: &[
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 0,
|
||||
resource: wgpu::BindingResource::TextureView(
|
||||
&video_texture.create_view(&wgpu::TextureViewDescriptor::default()),
|
||||
),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 1,
|
||||
resource: wgpu::BindingResource::Sampler(&sampler),
|
||||
},
|
||||
],
|
||||
label: Some("Jello Texture Bind Group"),
|
||||
});
|
||||
let gst = Video::new().context("Failed to create Video")?;
|
||||
// Example-only hack: block for 10 seconds after building the pipeline, presumably to let it preroll/buffer before the first render.
std::thread::sleep(std::time::Duration::from_secs(10));
|
||||
// surface.configure(&device, &config);
|
||||
|
||||
Ok(Self {
|
||||
window,
|
||||
gst,
|
||||
surface,
|
||||
video_texture,
|
||||
device,
|
||||
queue,
|
||||
config,
|
||||
is_surface_initialized: true,
|
||||
bind_group,
|
||||
pipeline: render_pipeline,
|
||||
})
|
||||
}
|
||||
|
||||
// async fn next_frame(&mut self)
|
||||
|
||||
fn resize(&mut self, width: u32, height: u32) {
|
||||
if width > 0 && height > 0 {
|
||||
self.config.width = width;
|
||||
self.config.height = height;
|
||||
self.surface.configure(&self.device, &self.config);
|
||||
self.is_surface_initialized = true;
|
||||
}
|
||||
}
|
||||
|
||||
fn render(&mut self) -> Result<(), wgpu::SurfaceError> {
|
||||
if !self.is_surface_initialized {
|
||||
return Ok(());
|
||||
}
|
||||
|
||||
self.gst.poll();
|
||||
|
||||
self.copy_next_frame_to_texture()
|
||||
.inspect_err(|e| {
|
||||
tracing::error!("Failed to copy video frame to texture: {e:?}");
|
||||
})
|
||||
.map_err(|_| wgpu::SurfaceError::Lost)?;
|
||||
|
||||
let output = match self.surface.get_current_texture() {
|
||||
Ok(output) => output,
|
||||
Err(wgpu::SurfaceError::Lost) => {
|
||||
self.surface.configure(&self.device, &self.config);
|
||||
return Ok(());
|
||||
}
|
||||
Err(e) => return Err(e),
|
||||
};
|
||||
let view = output
|
||||
.texture
|
||||
.create_view(&wgpu::TextureViewDescriptor::default());
|
||||
let mut encoder = self
|
||||
.device
|
||||
.create_command_encoder(&wgpu::CommandEncoderDescriptor {
|
||||
label: Some("Jello Render Encoder"),
|
||||
});
|
||||
|
||||
let mut render_pass = encoder.begin_render_pass(&wgpu::RenderPassDescriptor {
|
||||
label: Some("Jello Render Pass"),
|
||||
color_attachments: &[Some(wgpu::RenderPassColorAttachment {
|
||||
view: &view,
|
||||
resolve_target: None,
|
||||
ops: wgpu::Operations {
|
||||
load: wgpu::LoadOp::Clear(wgpu::Color {
|
||||
r: 0.1,
|
||||
g: 0.2,
|
||||
b: 0.3,
|
||||
a: 1.0,
|
||||
}),
|
||||
store: wgpu::StoreOp::Store,
|
||||
},
|
||||
depth_slice: None,
|
||||
})],
|
||||
depth_stencil_attachment: None,
|
||||
occlusion_query_set: None,
|
||||
timestamp_writes: None,
|
||||
});
|
||||
|
||||
render_pass.set_pipeline(&self.pipeline);
|
||||
render_pass.set_bind_group(0, &self.bind_group, &[]);
|
||||
render_pass.draw(0..3, 0..1); // draw the fullscreen triangle: 3 vertices, 1 instance
|
||||
drop(render_pass);
|
||||
|
||||
self.queue.submit(std::iter::once(encoder.finish()));
|
||||
output.present();
|
||||
self.window.request_redraw();
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub fn copy_next_frame_to_texture(&mut self) -> Result<()> {
|
||||
let frame = self
|
||||
.gst
|
||||
.appsink
|
||||
.try_pull_sample(gst::ClockTime::NONE)
|
||||
.context("Failed to pull sample from appsink")?;
|
||||
|
||||
let caps = frame.caps().context("Failed to get caps from sample")?;
|
||||
let size = caps
|
||||
.structure(0)
|
||||
.context("Failed to get structure from caps")?;
|
||||
let width = size
|
||||
.get::<i32>("width")
|
||||
.context("Failed to get width from caps")? as u32;
|
||||
let height = size
|
||||
.get::<i32>("height")
|
||||
.context("Failed to get height from caps")? as u32;
|
||||
|
||||
let texture_size = self.video_texture.size();
|
||||
if texture_size.width != width || texture_size.height != height {
|
||||
tracing::info!(
|
||||
"Resizing video texture from {}x{} to {}x{}",
|
||||
texture_size.width,
|
||||
texture_size.height,
|
||||
width,
|
||||
height
|
||||
);
|
||||
self.video_texture = self.device.create_texture(&wgpu::TextureDescriptor {
|
||||
size: wgpu::Extent3d {
|
||||
width: width as u32,
|
||||
height: height as u32,
|
||||
depth_or_array_layers: 1,
|
||||
},
|
||||
mip_level_count: 1,
|
||||
sample_count: 1,
|
||||
dimension: wgpu::TextureDimension::D2,
|
||||
format: self.config.format,
|
||||
usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
|
||||
label: Some("Jello Video Texture"),
|
||||
view_formats: &[],
|
||||
});
|
||||
let texture_bind_group_layout =
|
||||
self.device
|
||||
.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
|
||||
label: Some("texture_bind_group_layout"),
|
||||
entries: &[
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 0,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Texture {
|
||||
multisampled: false,
|
||||
view_dimension: wgpu::TextureViewDimension::D2,
|
||||
sample_type: wgpu::TextureSampleType::Float {
|
||||
filterable: true,
|
||||
},
|
||||
},
|
||||
count: None,
|
||||
},
|
||||
wgpu::BindGroupLayoutEntry {
|
||||
binding: 1,
|
||||
visibility: wgpu::ShaderStages::FRAGMENT,
|
||||
ty: wgpu::BindingType::Sampler(wgpu::SamplerBindingType::Filtering),
|
||||
count: None,
|
||||
},
|
||||
],
|
||||
});
|
||||
let sampler = self.device.create_sampler(&wgpu::SamplerDescriptor {
|
||||
label: Some("texture_sampler"),
|
||||
address_mode_u: wgpu::AddressMode::ClampToEdge,
|
||||
address_mode_v: wgpu::AddressMode::ClampToEdge,
|
||||
address_mode_w: wgpu::AddressMode::ClampToEdge,
|
||||
mag_filter: wgpu::FilterMode::Linear,
|
||||
min_filter: wgpu::FilterMode::Linear,
|
||||
mipmap_filter: wgpu::FilterMode::Nearest,
|
||||
..Default::default()
|
||||
});
|
||||
self.bind_group = self.device.create_bind_group(&wgpu::BindGroupDescriptor {
|
||||
layout: &texture_bind_group_layout,
|
||||
entries: &[
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 0,
|
||||
resource: wgpu::BindingResource::TextureView(
|
||||
&self
|
||||
.video_texture
|
||||
.create_view(&wgpu::TextureViewDescriptor::default()),
|
||||
),
|
||||
},
|
||||
wgpu::BindGroupEntry {
|
||||
binding: 1,
|
||||
resource: wgpu::BindingResource::Sampler(&sampler),
|
||||
},
|
||||
],
|
||||
label: Some("Jello Texture Bind Group"),
|
||||
});
|
||||
}
|
||||
let texture = &self.video_texture;
|
||||
|
||||
let buffer = frame.buffer().context("Failed to get buffer from sample")?;
|
||||
let map = buffer
|
||||
.map_readable()
|
||||
.context("Failed to map buffer readable")?;
|
||||
self.queue.write_texture(
|
||||
wgpu::TexelCopyTextureInfo {
|
||||
texture: &texture,
|
||||
mip_level: 0,
|
||||
origin: wgpu::Origin3d::ZERO,
|
||||
aspect: wgpu::TextureAspect::All,
|
||||
},
|
||||
&map,
|
||||
wgpu::TexelCopyBufferLayout {
|
||||
offset: 0,
|
||||
bytes_per_row: Some(4 * width as u32),
|
||||
rows_per_image: Some(height as u32),
|
||||
},
|
||||
texture.size(),
|
||||
);
|
||||
// drop(map);
|
||||
// drop(frame);
|
||||
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
impl ApplicationHandler<State> for App {
|
||||
fn resumed(&mut self, event_loop: &ActiveEventLoop) {
|
||||
#[allow(unused_mut)]
|
||||
let mut window_attributes = Window::default_attributes();
|
||||
|
||||
let window = Arc::new(event_loop.create_window(window_attributes).unwrap());
|
||||
|
||||
// let monitor = event_loop
|
||||
// .primary_monitor()
|
||||
// .or_else(|| window.current_monitor());
|
||||
// window.set_fullscreen(None);
|
||||
// window.set_fullscreen(Some(winit::window::Fullscreen::Borderless(monitor)));
|
||||
self.state = Some(pollster::block_on(State::new(window)).expect("Failed to block"));
|
||||
}
|
||||
|
||||
fn user_event(&mut self, _event_loop: &ActiveEventLoop, event: State) {
|
||||
self.state = Some(event);
|
||||
}
|
||||
|
||||
fn about_to_wait(&mut self, _event_loop: &ActiveEventLoop) {
|
||||
let state = match &mut self.state {
|
||||
Some(state) => state,
|
||||
None => return,
|
||||
};
|
||||
|
||||
state.window.request_redraw();
|
||||
}
|
||||
|
||||
fn window_event(
|
||||
&mut self,
|
||||
event_loop: &ActiveEventLoop,
|
||||
_window_id: winit::window::WindowId,
|
||||
event: WindowEvent,
|
||||
) {
|
||||
let state = match &mut self.state {
|
||||
Some(canvas) => canvas,
|
||||
None => return,
|
||||
};
|
||||
|
||||
match event {
|
||||
WindowEvent::CloseRequested => event_loop.exit(),
|
||||
WindowEvent::Resized(size) => {
|
||||
tracing::info!("Window resized to {size:?}");
|
||||
state.resize(size.width, size.height)
|
||||
}
|
||||
WindowEvent::RedrawRequested => {
|
||||
// if state.gst.poll() {
|
||||
// event_loop.exit();
|
||||
// return;
|
||||
// }
|
||||
|
||||
match state.render() {
|
||||
Ok(_) => {}
|
||||
// Reconfigure the surface if it is lost or outdated
|
||||
Err(wgpu::SurfaceError::Lost | wgpu::SurfaceError::Outdated) => {
|
||||
let size = state.window.inner_size();
|
||||
tracing::info!("Reconfiguring surface to {size:?}");
|
||||
state.resize(size.width, size.height);
|
||||
}
|
||||
// The system is out of memory; we should probably quit
|
||||
Err(wgpu::SurfaceError::OutOfMemory) => event_loop.exit(),
|
||||
// Other errors (e.g. Timeout) should be resolved by the next frame
|
||||
Err(e) => {
|
||||
tracing::error!("Failed to render frame: {e:?}");
|
||||
}
|
||||
}
|
||||
}
|
||||
// WindowEvent::AboutToWait => {
|
||||
// state.window.request_redraw();
|
||||
// }
|
||||
WindowEvent::KeyboardInput {
|
||||
event:
|
||||
KeyEvent {
|
||||
physical_key: PhysicalKey::Code(code),
|
||||
state,
|
||||
..
|
||||
},
|
||||
..
|
||||
} => match (code, state.is_pressed()) {
|
||||
(KeyCode::Escape, true) => event_loop.exit(),
|
||||
(KeyCode::KeyQ, true) => event_loop.exit(),
|
||||
_ => {}
|
||||
},
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn main() -> anyhow::Result<()> {
|
||||
tracing_subscriber::fmt::init();
|
||||
|
||||
let event_loop = EventLoop::with_user_event().build()?;
|
||||
let mut app = App::new();
|
||||
event_loop.run_app(&mut app)?;
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub struct Video {
|
||||
pipeline: gst::Pipeline,
|
||||
bus: gst::Bus,
|
||||
appsink: gst_app::AppSink,
|
||||
}
|
||||
|
||||
impl Video {
|
||||
pub fn new() -> Result<Self> {
|
||||
gst::init()?;
|
||||
use gst::prelude::*;
|
||||
let pipeline = gst::parse::launch(
|
||||
r##"playbin3 uri=https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c video-sink="videoconvert ! video/x-raw,format=RGB10A2_LE ! appsink sync=true drop=true name=appsink""##
|
||||
).context("Failed to parse gst pipeline")?;
|
||||
let pipeline = pipeline
|
||||
.downcast::<gst::Pipeline>()
|
||||
.map_err(|_| anyhow::anyhow!("Failed to downcast gst element to Pipeline"))?;
|
||||
|
||||
let video_sink = pipeline.property::<gst::Bin>("video-sink");
|
||||
let appsink = video_sink
|
||||
.by_name("appsink")
|
||||
.context("Failed to get appsink from video-sink")?
|
||||
.downcast::<gst_app::AppSink>()
|
||||
.map_err(|_| {
|
||||
anyhow::anyhow!("Failed to downcast video-sink appsink to gst_app::AppSink")
|
||||
})?;
|
||||
// appsink.set_property("max-buffers", 2u32);
|
||||
// appsink.set_property("emit-signals", true);
|
||||
// appsink.set_callbacks(
|
||||
// gst_app::AppSinkCallbacks::builder()
|
||||
// .new_sample(|_appsink| Ok(gst::FlowSuccess::Ok))
|
||||
// .build(),
|
||||
// );
|
||||
|
||||
let bus = pipeline.bus().context("Failed to get gst pipeline bus")?;
|
||||
pipeline.set_state(gst::State::Playing)?;
|
||||
pipeline
|
||||
.state(gst::ClockTime::from_seconds(5))
|
||||
.0
|
||||
.context("Failed to wait for pipeline")?;
|
||||
Ok(Self {
|
||||
pipeline,
|
||||
bus,
|
||||
appsink,
|
||||
})
|
||||
}
|
||||
|
||||
pub fn poll(&mut self) -> bool {
|
||||
use gst::prelude::*;
|
||||
for msg in self.bus.iter_timed(gst::ClockTime::ZERO) {
|
||||
use gst::MessageView;
|
||||
|
||||
match msg.view() {
|
||||
MessageView::Eos(..) => {
|
||||
tracing::info!("End of stream");
|
||||
self.pipeline.set_state(gst::State::Null).ok();
|
||||
return true;
|
||||
}
|
||||
MessageView::Error(err) => {
|
||||
tracing::error!(
|
||||
"Error from {:?}: {} ({:?})",
|
||||
err.src().map(|s| s.path_string()),
|
||||
err.error(),
|
||||
err.debug()
|
||||
);
|
||||
self.pipeline.set_state(gst::State::Null).ok();
|
||||
return true;
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
false
|
||||
}
|
||||
}
|
||||
31 examples/hdr-gstreamer-wgpu/src/shader.wgsl Normal file
@@ -0,0 +1,31 @@
|
||||
// Vertex shader
|
||||
|
||||
struct VertexOutput {
|
||||
@builtin(position) clip_position: vec4<f32>,
|
||||
@location(0) tex_coords: vec2<f32>,
|
||||
};
|
||||
|
||||
@vertex
|
||||
fn vs_main(
|
||||
@builtin(vertex_index) in_vertex_index: u32,
|
||||
) -> VertexOutput {
|
||||
var out: VertexOutput;
|
||||
// Fullscreen-triangle trick: vertex indices 0, 1, 2 map to UVs (0,0), (2,0), (0,2),
// so one oversized triangle covers the whole viewport without a vertex buffer.
let uv = vec2<f32>(f32((in_vertex_index << 1u) & 2u), f32(in_vertex_index & 2u));
|
||||
out.clip_position = vec4<f32>(uv * 2.0 - 1.0, 0.0, 1.0);
|
||||
out.clip_position.y = -out.clip_position.y;
|
||||
out.tex_coords = uv;
|
||||
return out;
|
||||
}
|
||||
|
||||
// Fragment shader
|
||||
|
||||
@group(0) @binding(0)
|
||||
var t_diffuse: texture_2d<f32>;
|
||||
@group(0) @binding(1)
|
||||
var s_diffuse: sampler;
|
||||
|
||||
@fragment
|
||||
fn fs_main(in: VertexOutput) -> @location(0) vec4<f32> {
|
||||
return textureSample(t_diffuse, s_diffuse, in.tex_coords);
|
||||
}
|
||||
|
||||
35 flake.lock generated
@@ -3,11 +3,11 @@
|
||||
"advisory-db": {
|
||||
"flake": false,
|
||||
"locked": {
|
||||
"lastModified": 1763456551,
|
||||
"narHash": "sha256-z5NogiOp+1r7Fd39jVFN0kT3aXUef8sYkuBsrAUNB5g=",
|
||||
"lastModified": 1768679419,
|
||||
"narHash": "sha256-l9rM4lXBeS2mIAJsJjVfl0UABx3S3zg5tul7bv+bn50=",
|
||||
"owner": "rustsec",
|
||||
"repo": "advisory-db",
|
||||
"rev": "6799e5dea99315eb8de85c6084fd99892b4a25d0",
|
||||
"rev": "c700e1cd023ca87343cbd9217d50d47023e9adc7",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
@@ -18,11 +18,11 @@
|
||||
},
|
||||
"crane": {
|
||||
"locked": {
|
||||
"lastModified": 1762538466,
|
||||
"narHash": "sha256-8zrIPl6J+wLm9MH5ksHcW7BUHo7jSNOu0/hA0ohOOaM=",
|
||||
"lastModified": 1768873933,
|
||||
"narHash": "sha256-CfyzdaeLNGkyAHp3kT5vjvXhA1pVVK7nyDziYxCPsNk=",
|
||||
"owner": "ipetkov",
|
||||
"repo": "crane",
|
||||
"rev": "0cea393fffb39575c46b7a0318386467272182fe",
|
||||
"rev": "0bda7e7d005ccb5522a76d11ccfbf562b71953ca",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
@@ -34,10 +34,10 @@
|
||||
"crates-io-index": {
|
||||
"flake": false,
|
||||
"locked": {
|
||||
"lastModified": 1763363725,
|
||||
"narHash": "sha256-cxr5xIKZFP45yV1ZHFTB1sHo5YGiR3FA8D9vAfDizMo=",
|
||||
"lastModified": 1769614137,
|
||||
"narHash": "sha256-3Td8fiv6iFVxeS0hYq3xdd10ZvUkC9INMAiQx/mECas=",
|
||||
"ref": "refs/heads/master",
|
||||
"rev": "0382002e816a4cbd17d8d5b172f08b848aa22ff6",
|
||||
"rev": "c7e7d6394bc95555d6acd5c6783855f47d64c90d",
|
||||
"shallow": true,
|
||||
"type": "git",
|
||||
"url": "https://github.com/rust-lang/crates.io-index"
|
||||
@@ -50,7 +50,9 @@
|
||||
},
|
||||
"crates-nix": {
|
||||
"inputs": {
|
||||
"crates-io-index": "crates-io-index"
|
||||
"crates-io-index": [
|
||||
"crates-io-index"
|
||||
]
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1763364255,
|
||||
@@ -106,11 +108,11 @@
|
||||
},
|
||||
"nixpkgs": {
|
||||
"locked": {
|
||||
"lastModified": 1763283776,
|
||||
"narHash": "sha256-Y7TDFPK4GlqrKrivOcsHG8xSGqQx3A6c+i7novT85Uk=",
|
||||
"lastModified": 1768564909,
|
||||
"narHash": "sha256-Kell/SpJYVkHWMvnhqJz/8DqQg2b6PguxVWOuadbHCc=",
|
||||
"owner": "nixos",
|
||||
"repo": "nixpkgs",
|
||||
"rev": "50a96edd8d0db6cc8db57dab6bb6d6ee1f3dc49a",
|
||||
"rev": "e4bae1bd10c9c57b2cf517953ab70060a828ee6f",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
@@ -124,6 +126,7 @@
|
||||
"inputs": {
|
||||
"advisory-db": "advisory-db",
|
||||
"crane": "crane",
|
||||
"crates-io-index": "crates-io-index",
|
||||
"crates-nix": "crates-nix",
|
||||
"flake-utils": "flake-utils",
|
||||
"nix-github-actions": "nix-github-actions",
|
||||
@@ -138,11 +141,11 @@
|
||||
]
|
||||
},
|
||||
"locked": {
|
||||
"lastModified": 1763433504,
|
||||
"narHash": "sha256-cVid5UNpk88sPYHkLAA5aZEHOFQXSB/2L1vl18Aq7IM=",
|
||||
"lastModified": 1768877311,
|
||||
"narHash": "sha256-abSDl0cNr0B+YCsIDpO1SjXD9JMxE4s8EFnhLEFVovI=",
|
||||
"owner": "oxalica",
|
||||
"repo": "rust-overlay",
|
||||
"rev": "42ce16c6d8318a654d53f047c9400b7d902d6e61",
|
||||
"rev": "59e4ab96304585fde3890025fd59bd2717985cc1",
|
||||
"type": "github"
|
||||
},
|
||||
"original": {
|
||||
|
||||
86 flake.nix
@@ -9,7 +9,14 @@
|
||||
url = "github:nix-community/nix-github-actions";
|
||||
inputs.nixpkgs.follows = "nixpkgs";
|
||||
};
|
||||
crates-nix.url = "github:uttarayan21/crates.nix";
|
||||
crates-io-index = {
|
||||
url = "git+https://github.com/rust-lang/crates.io-index?shallow=1";
|
||||
flake = false;
|
||||
};
|
||||
crates-nix = {
|
||||
url = "github:uttarayan21/crates.nix";
|
||||
inputs.crates-io-index.follows = "crates-io-index";
|
||||
};
|
||||
rust-overlay = {
|
||||
url = "github:oxalica/rust-overlay";
|
||||
inputs.nixpkgs.follows = "nixpkgs";
|
||||
@@ -35,6 +42,7 @@
|
||||
system: let
|
||||
pkgs = import nixpkgs {
|
||||
inherit system;
|
||||
config.allowUnfree = true;
|
||||
overlays = [
|
||||
rust-overlay.overlays.default
|
||||
];
|
||||
@@ -56,7 +64,7 @@
|
||||
|
||||
src = let
|
||||
filterBySuffix = path: exts: lib.any (ext: lib.hasSuffix ext path) exts;
|
||||
sourceFilters = path: type: (craneLib.filterCargoSources path type) || filterBySuffix path [".c" ".h" ".hpp" ".cpp" ".cc"];
|
||||
sourceFilters = path: type: (craneLib.filterCargoSources path type) || filterBySuffix path [".c" ".h" ".hpp" ".cpp" ".cc" "wgsl"];
|
||||
in
|
||||
lib.cleanSourceWith {
|
||||
filter = sourceFilters;
|
||||
@@ -70,36 +78,52 @@
|
||||
nativeBuildInputs = with pkgs; [
|
||||
pkg-config
|
||||
];
|
||||
# LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath [pkgs.wayland];
|
||||
LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath buildInputs;
|
||||
# SYSTEM_DEPS_LINK = "static";
|
||||
# PKG_CONFIG_ALL_STATIC = "1";
|
||||
|
||||
buildInputs = with pkgs;
|
||||
[
|
||||
gst_all_1.gst-editing-services
|
||||
gst_all_1.gst-libav
|
||||
gst_all_1.gst-plugins-bad
|
||||
gst_all_1.gst-plugins-base
|
||||
gst_all_1.gst-plugins-good
|
||||
gst_all_1.gst-plugins-rs
|
||||
gst_all_1.gst-plugins-bad
|
||||
gst_all_1.gst-plugins-ugly
|
||||
gst_all_1.gst-rtsp-server
|
||||
gst_all_1.gst-plugins-rs
|
||||
gst_all_1.gstreamer
|
||||
glib
|
||||
glib-networking
|
||||
|
||||
wrapGAppsHook4
|
||||
|
||||
# bzip2_1_1
|
||||
# libsysprof-capture
|
||||
# pcre2
|
||||
# libunwind
|
||||
# elfutils
|
||||
# zstd
|
||||
|
||||
openssl
|
||||
vulkan-loader
|
||||
glib
|
||||
]
|
||||
++ (lib.optionals pkgs.stdenv.isLinux [
|
||||
gst_all_1.gstreamermm
|
||||
gst_all_1.gst-vaapi
|
||||
cudatoolkit
|
||||
|
||||
# util-linux
|
||||
# libselinux
|
||||
# libsepol
|
||||
|
||||
alsa-lib-with-plugins
|
||||
libxkbcommon
|
||||
udev
|
||||
wayland
|
||||
wayland-protocols
|
||||
xorg.libX11
|
||||
xorg.libXi
|
||||
xorg.libXrandr
|
||||
# xorg.libX11
|
||||
# xorg.libXi
|
||||
# xorg.libXrandr
|
||||
])
|
||||
++ (lib.optionals pkgs.stdenv.isDarwin [
|
||||
libiconv
|
||||
@@ -159,35 +183,55 @@
|
||||
default = pkg;
|
||||
};
|
||||
|
||||
devShells = {
|
||||
default =
|
||||
devShells = rec {
|
||||
rust-shell =
|
||||
pkgs.mkShell.override {
|
||||
stdenv =
|
||||
if pkgs.stdenv.isLinux
|
||||
then (pkgs.stdenvAdapters.useMoldLinker pkgs.clangStdenv)
|
||||
else pkgs.clangStdenv;
|
||||
} (commonArgs
|
||||
stdenv = pkgs.clangStdenv;
|
||||
# if pkgs.stdenv.isLinux
|
||||
# then (pkgs.stdenvAdapters.useMoldLinker pkgs.clangStdenv)
|
||||
# else pkgs.clangStdenv;
|
||||
}
|
||||
(commonArgs
|
||||
// {
|
||||
# GST_PLUGIN_PATH = "/run/current-system/sw/lib/gstreamer-1.0/";
|
||||
GIO_EXTRA_MODULES = "${pkgs.glib-networking}/lib/gio/modules";
|
||||
packages = with pkgs;
|
||||
[
|
||||
toolchainWithRustAnalyzer
|
||||
cargo-nextest
|
||||
bacon
|
||||
cargo-audit
|
||||
cargo-deny
|
||||
cargo-expand
|
||||
bacon
|
||||
cargo-make
|
||||
cargo-hack
|
||||
cargo-make
|
||||
cargo-nextest
|
||||
cargo-outdated
|
||||
lld
|
||||
lldb
|
||||
(crates.buildCrate "cargo-with" {doCheck = false;})
|
||||
(crates.buildCrate "dioxus-cli" {
|
||||
nativeBuildInputs = with pkgs; [pkg-config];
|
||||
buildInputs = [openssl];
|
||||
doCheck = false;
|
||||
})
|
||||
(crates.buildCrate "cargo-hot" {
|
||||
nativeBuildInputs = with pkgs; [pkg-config];
|
||||
buildInputs = [openssl];
|
||||
})
|
||||
]
|
||||
++ (lib.optionals pkgs.stdenv.isDarwin [
|
||||
apple-sdk_26
|
||||
])
|
||||
++ (lib.optionals pkgs.stdenv.isLinux [
|
||||
mold
|
||||
ffmpeg
|
||||
heaptrack
|
||||
samply
|
||||
cargo-flamegraph
|
||||
perf
|
||||
# mold
|
||||
]);
|
||||
});
|
||||
default = rust-shell;
|
||||
};
|
||||
}
|
||||
)
|
||||
|
||||
62
gst/.github/workflows/build.yaml
vendored
Normal file
@@ -0,0 +1,62 @@
|
||||
name: build
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [ master ]
|
||||
pull_request:
|
||||
branches: [ master ]
|
||||
|
||||
env:
|
||||
CARGO_TERM_COLOR: always
|
||||
|
||||
jobs:
|
||||
checks-matrix:
|
||||
runs-on: ubuntu-latest
|
||||
outputs:
|
||||
matrix: ${{ steps.set-matrix.outputs.matrix }}
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: DeterminateSystems/nix-installer-action@main
|
||||
- uses: DeterminateSystems/magic-nix-cache-action@main
|
||||
- id: set-matrix
|
||||
name: Generate Nix Matrix
|
||||
run: |
|
||||
set -Eeu
|
||||
matrix="$(nix eval --json '.#githubActions.matrix')"
|
||||
echo "matrix=$matrix" >> "$GITHUB_OUTPUT"
|
||||
|
||||
checks-build:
|
||||
needs: checks-matrix
|
||||
runs-on: ${{ matrix.os }}
|
||||
strategy:
|
||||
matrix: ${{fromJSON(needs.checks-matrix.outputs.matrix)}}
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: DeterminateSystems/nix-installer-action@main
|
||||
- uses: DeterminateSystems/magic-nix-cache-action@main
|
||||
- run: nix build -L '.#${{ matrix.attr }}'
|
||||
|
||||
codecov:
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
id-token: "write"
|
||||
contents: "read"
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: DeterminateSystems/nix-installer-action@main
|
||||
- uses: DeterminateSystems/magic-nix-cache-action@main
|
||||
|
||||
- name: Run codecov
|
||||
run: nix build .#checks.x86_64-linux.hello-llvm-cov
|
||||
|
||||
- name: Upload coverage reports to Codecov
|
||||
uses: codecov/codecov-action@v4.0.1
|
||||
with:
|
||||
flags: unittests
|
||||
name: codecov-hello
|
||||
fail_ci_if_error: true
|
||||
token: ${{ secrets.CODECOV_TOKEN }}
|
||||
files: ./result
|
||||
verbose: true
|
||||
|
||||
38
gst/.github/workflows/docs.yaml
vendored
Normal file
@@ -0,0 +1,38 @@
|
||||
name: docs
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [ master ]
|
||||
|
||||
env:
|
||||
CARGO_TERM_COLOR: always
|
||||
|
||||
jobs:
|
||||
docs:
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
id-token: "write"
|
||||
contents: "read"
|
||||
pages: "write"
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: DeterminateSystems/nix-installer-action@main
|
||||
- uses: DeterminateSystems/magic-nix-cache-action@main
|
||||
- uses: DeterminateSystems/flake-checker-action@main
|
||||
|
||||
- name: Generate docs
|
||||
run: nix build .#checks.x86_64-linux.hello-docs
|
||||
|
||||
- name: Setup Pages
|
||||
uses: actions/configure-pages@v5
|
||||
|
||||
- name: Upload artifact
|
||||
uses: actions/upload-pages-artifact@v3
|
||||
with:
|
||||
path: result/share/doc
|
||||
|
||||
- name: Deploy to gh-pages
|
||||
id: deployment
|
||||
uses: actions/deploy-pages@v4
|
||||
|
||||
@@ -1,3 +1,3 @@
|
||||
/result
|
||||
/target
|
||||
.direnv
|
||||
.media
|
||||
1040
gst/Cargo.lock
generated
Normal file
File diff suppressed because it is too large
24
gst/Cargo.toml
Normal file
@@ -0,0 +1,24 @@
|
||||
[package]
|
||||
name = "gst"
|
||||
version = "0.1.0"
|
||||
edition = "2024"
|
||||
|
||||
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
|
||||
|
||||
[dependencies]
|
||||
error-stack = "0.6"
|
||||
futures = "0.3.31"
|
||||
futures-lite = "2.6.1"
|
||||
glib = "0.21.5"
|
||||
glib-sys = "0.21.5"
|
||||
gstreamer = { version = "0.24.4", features = ["v1_26"] }
|
||||
gstreamer-app = { version = "0.24.4", features = ["v1_26"] }
|
||||
gstreamer-video = { version = "0.24.4", features = ["v1_26"] }
|
||||
gstreamer-base = { version = "0.24.4", features = ["v1_26"] }
|
||||
thiserror = "2.0"
|
||||
tracing = { version = "0.1", features = ["log"] }
|
||||
bitflags = "2.10.0"
|
||||
|
||||
[dev-dependencies]
|
||||
smol = "2.0.2"
|
||||
tracing-subscriber = "0.3.22"
|
||||
38
gst/src/bin.rs
Normal file
@@ -0,0 +1,38 @@
|
||||
use crate::priv_prelude::*;
|
||||
|
||||
wrap_gst!(Bin);
|
||||
parent_child!(Element, Bin);
|
||||
|
||||
impl Bin {
|
||||
pub fn new(name: impl AsRef<str>) -> Self {
|
||||
let bin = gstreamer::Bin::with_name(name.as_ref());
|
||||
Bin { inner: bin }
|
||||
}
|
||||
|
||||
pub fn add(&mut self, element: &impl ChildOf<Element>) -> Result<&mut Self> {
|
||||
self.inner
|
||||
.add(&element.upcast_ref().inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to add element to bin")?;
|
||||
Ok(self)
|
||||
}
|
||||
|
||||
pub fn add_many<'a, E: ChildOf<Element> + 'a>(
|
||||
&mut self,
|
||||
elements: impl IntoIterator<Item = &'a E>,
|
||||
) -> Result<&mut Self> {
|
||||
self.inner
|
||||
.add_many(elements.into_iter().map(|e| &e.upcast_ref().inner))
|
||||
.change_context(Error)
|
||||
.attach("Failed to add elements to bin")?;
|
||||
Ok(self)
|
||||
}
|
||||
|
||||
pub fn add_pad(&mut self, pad: &Pad) -> Result<&mut Self> {
|
||||
self.inner
|
||||
.add_pad(&pad.inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to add pad to bin")?;
|
||||
Ok(self)
|
||||
}
|
||||
}
|
||||
27
gst/src/bus.rs
Normal file
@@ -0,0 +1,27 @@
|
||||
use crate::priv_prelude::*;
|
||||
|
||||
wrap_gst!(Bus);
|
||||
|
||||
impl Bus {
|
||||
pub fn iter_timed(
|
||||
&self,
|
||||
timeout: impl Into<Option<core::time::Duration>>,
|
||||
) -> gstreamer::bus::Iter<'_> {
|
||||
let clocktime = match timeout.into() {
|
||||
Some(dur) => gstreamer::ClockTime::try_from(dur).ok(),
|
||||
None => gstreamer::ClockTime::NONE,
|
||||
};
|
||||
self.inner.iter_timed(clocktime)
|
||||
}
|
||||
|
||||
pub fn stream(&self) -> gstreamer::bus::BusStream {
|
||||
self.inner.stream()
|
||||
}
|
||||
|
||||
pub fn filtered_stream<'a>(
|
||||
&self,
|
||||
msg_types: &'a [gstreamer::MessageType],
|
||||
) -> impl futures::stream::FusedStream<Item = gstreamer::Message> + Unpin + Send + 'a {
|
||||
self.inner.stream_filtered(msg_types)
|
||||
}
|
||||
}
|
||||
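Usage sketch for the `Bus` wrapper above (`gst/src/bus.rs`): draining a pipeline's bus asynchronously until the first EOS or error message arrives. This assumes the `gst` crate from this diff plus `smol` and `futures-lite` (both already dev-dependencies); the media URI is a placeholder.

```rust
use futures_lite::stream::StreamExt as _;
use gst::plugins::playback::Playbin3;
use gst::{ElementExt as _, Gst, MessageType, PipelineExt as _};

fn main() -> gst::errors::Result<()> {
    Gst::new(); // one-time GStreamer initialisation
    let playbin = Playbin3::new("bus-example")?
        .with_uri("file:///tmp/example.mkv"); // placeholder URI
    playbin.play()?;

    let bus = playbin.bus()?;
    let filter = [MessageType::Eos, MessageType::Error];
    smol::block_on(async {
        // `filtered_stream` yields only the requested message types.
        let mut messages = bus.filtered_stream(&filter);
        if let Some(msg) = messages.next().await {
            println!("stopping on bus message: {:?}", msg.view());
        }
    });
    playbin.stop()
}
```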
78
gst/src/caps.rs
Normal file
@@ -0,0 +1,78 @@
|
||||
use gstreamer::Fraction;
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
#[repr(transparent)]
|
||||
pub struct Caps {
|
||||
pub(crate) inner: gstreamer::caps::Caps,
|
||||
}
|
||||
|
||||
impl Caps {
|
||||
pub fn builder(cs: CapsType) -> CapsBuilder {
|
||||
CapsBuilder::new(cs)
|
||||
}
|
||||
}
|
||||
|
||||
pub struct CapsBuilder {
|
||||
inner: gstreamer::caps::Builder<gstreamer::caps::NoFeature>,
|
||||
}
|
||||
|
||||
impl CapsBuilder {
|
||||
pub fn field<V: Into<glib::Value> + Send>(mut self, name: impl AsRef<str>, value: V) -> Self {
|
||||
self.inner = self.inner.field(name.as_ref(), value);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn build(self) -> Caps {
|
||||
Caps {
|
||||
inner: self.inner.build(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub enum CapsType {
|
||||
Video,
|
||||
Audio,
|
||||
Text,
|
||||
}
|
||||
|
||||
impl CapsType {
|
||||
pub fn as_str(&self) -> &str {
|
||||
match self {
|
||||
CapsType::Video => "video/x-raw",
|
||||
CapsType::Audio => "audio/x-raw",
|
||||
CapsType::Text => "text/x-raw",
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl CapsBuilder {
|
||||
pub fn new(cs: CapsType) -> Self {
|
||||
CapsBuilder {
|
||||
inner: gstreamer::Caps::builder(cs.as_str()),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl Caps {
|
||||
pub fn format(&self) -> Option<gstreamer_video::VideoFormat> {
|
||||
self.inner
|
||||
.structure(0)
|
||||
.and_then(|s| s.get::<&str>("format").ok())
|
||||
.map(|s| gstreamer_video::VideoFormat::from_string(s))
|
||||
}
|
||||
pub fn width(&self) -> Option<i32> {
|
||||
self.inner
|
||||
.structure(0)
|
||||
.and_then(|s| s.get::<i32>("width").ok())
|
||||
}
|
||||
pub fn height(&self) -> Option<i32> {
|
||||
self.inner
|
||||
.structure(0)
|
||||
.and_then(|s| s.get::<i32>("height").ok())
|
||||
}
|
||||
pub fn framerate(&self) -> Option<gstreamer::Fraction> {
|
||||
self.inner
|
||||
.structure(0)
|
||||
.and_then(|s| s.get::<Fraction>("framerate").ok())
|
||||
}
|
||||
}
|
||||
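A minimal sketch of the `Caps` builder above: constructing a fixed raw-video caps filter and reading the fields back out with the accessors. The format string and dimensions are illustrative.

```rust
use gst::{Caps, CapsType, Gst};

fn rgba_caps(width: i32, height: i32) -> Caps {
    Caps::builder(CapsType::Video)
        .field("format", "RGBA")
        .field("width", width)
        .field("height", height)
        .build()
}

fn main() {
    Gst::new(); // caps construction requires GStreamer to be initialised
    let caps = rgba_caps(1920, 1080);
    assert_eq!(caps.width(), Some(1920));
    assert_eq!(caps.height(), Some(1080));
}
```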
133
gst/src/element.rs
Normal file
@@ -0,0 +1,133 @@
|
||||
use crate::priv_prelude::*;
|
||||
use crate::wrap_gst;
|
||||
|
||||
wrap_gst!(Element, gstreamer::Element);
|
||||
|
||||
// pub trait IsElement {
|
||||
// fn upcast_ref(&self) -> &Element;
|
||||
// fn into_element(self) -> Element;
|
||||
// fn pad(&self, name: &str) -> Option<Pad> {
|
||||
// use gstreamer::prelude::*;
|
||||
// self.upcast_ref().inner.static_pad(name).map(Pad::from)
|
||||
// }
|
||||
// }
|
||||
|
||||
// impl IsElement for Element {
|
||||
// fn upcast_ref(&self) -> &Element {
|
||||
// self
|
||||
// }
|
||||
//
|
||||
// fn into_element(self) -> Element {
|
||||
// self
|
||||
// }
|
||||
// }
|
||||
|
||||
impl Element {
|
||||
pub fn pad(&self, name: impl AsRef<str>) -> Option<Pad> {
|
||||
use gstreamer::prelude::*;
|
||||
self.inner.static_pad(name.as_ref()).map(Pad::from)
|
||||
}
|
||||
|
||||
pub fn bus(&self) -> Result<Bus> {
|
||||
use gstreamer::prelude::*;
|
||||
self.inner
|
||||
.bus()
|
||||
.map(Bus::from)
|
||||
.ok_or(Error)
|
||||
.attach_with(|| format!("Failed to get bus from Element: {}", self.inner.name()))
|
||||
}
|
||||
}
|
||||
|
||||
pub trait Sink: ChildOf<Element> {
|
||||
fn sink(&self, name: impl AsRef<str>) -> Pad {
|
||||
self.upcast_ref()
|
||||
.pad(name)
|
||||
.expect("Sink element has no sink pad")
|
||||
}
|
||||
}
|
||||
pub trait Source: ChildOf<Element> {
|
||||
fn source(&self, name: impl AsRef<str>) -> Pad {
|
||||
self.upcast_ref()
|
||||
.pad(name)
|
||||
.expect("Source element has no src pad")
|
||||
}
|
||||
|
||||
fn link<S: Sink>(&self, sink: &S) -> Result<Bin>
|
||||
where
|
||||
Self: Sized,
|
||||
{
|
||||
use gstreamer::prelude::*;
|
||||
if let Ok(bin) = self.upcast_ref().inner.clone().downcast::<gstreamer::Bin>() {
|
||||
bin.add(&sink.upcast_ref().inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to add sink to bin")?;
|
||||
self.upcast_ref()
|
||||
.inner
|
||||
.link(&sink.upcast_ref().inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to link elements")?;
|
||||
Ok(Bin::from(bin))
|
||||
} else {
|
||||
let bin = gstreamer::Bin::builder()
|
||||
.name(format!(
|
||||
"{}-link-{}",
|
||||
self.upcast_ref().inner.name(),
|
||||
sink.upcast_ref().inner.name()
|
||||
))
|
||||
.build();
|
||||
bin.add(&self.upcast_ref().inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to add source to bin")?;
|
||||
bin.add(&sink.upcast_ref().inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to add sink to bin")?;
|
||||
self.upcast_ref()
|
||||
.inner
|
||||
.link(&sink.upcast_ref().inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to link elements")?;
|
||||
if let Some(sink_pad) = self.upcast_ref().pad("sink") {
|
||||
let ghost_pad = Pad::ghost(&sink_pad)?;
|
||||
bin.add_pad(&ghost_pad.inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to add src pad to bin")?;
|
||||
ghost_pad.activate(true)?;
|
||||
}
|
||||
Ok(From::from(bin))
|
||||
}
|
||||
}
|
||||
|
||||
// fn link_pad<S: Sink>(&self, sink: &S, src_pad_name: &str, sink_pad_name: &str) -> Result<()> {
|
||||
// use gstreamer::prelude::*;
|
||||
// let src_pad = self
|
||||
// .upcast_ref()
|
||||
// .pad(src_pad_name)
|
||||
// .ok_or(Error)
|
||||
// .attach("Source pad not found")?;
|
||||
// let sink_pad = sink
|
||||
// .upcast_ref()
|
||||
// .pad(sink_pad_name)
|
||||
// .ok_or(Error)
|
||||
// .attach("Sink pad not found")?;
|
||||
// src_pad
|
||||
// .inner
|
||||
// .link(&sink_pad.inner)
|
||||
// .change_context(Error)
|
||||
// .attach("Failed to link source pad to sink pad")?;
|
||||
// Ok(())
|
||||
// }
|
||||
}
|
||||
|
||||
pub trait ElementExt: ChildOf<Element> + Sync {
|
||||
#[track_caller]
|
||||
fn bus(&self) -> Result<Bus> {
|
||||
self.upcast_ref().bus()
|
||||
}
|
||||
|
||||
#[track_caller]
|
||||
fn pad(&self, name: impl AsRef<str>) -> Option<Pad> {
|
||||
self.upcast_ref().pad(name)
|
||||
}
|
||||
}
|
||||
|
||||
impl<T: ChildOf<Element> + Sync> ElementExt for T {}
|
||||
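Sketch of the `Source::link` helper above: `VideoConvert` implements both `Source` and `Sink`, `AppSink` implements `Sink`, so the two can be linked into a single `Bin`. Element names are arbitrary.

```rust
use gst::plugins::{app::AppSink, videoconvertscale::VideoConvert};
use gst::{Gst, Source as _};

fn main() -> gst::errors::Result<()> {
    Gst::new();
    let convert = VideoConvert::new("example-convert")?;
    let appsink = AppSink::new("example-sink")?;
    // `link` adds both elements to a fresh Bin, links them, and ghosts the
    // upstream element's sink pad onto that Bin.
    let video_sink = convert.link(&appsink)?;
    // The Bin is itself a ChildOf<Element>, so it can later be handed to
    // Playbin3::with_video_sink as a single video sink.
    let _ = video_sink;
    Ok(())
}
```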
7
gst/src/errors.rs
Normal file
@@ -0,0 +1,7 @@
|
||||
pub use error_stack::{Report, ResultExt};
|
||||
|
||||
#[derive(Debug, thiserror::Error)]
|
||||
#[error("An error occurred")]
|
||||
pub struct Error;
|
||||
|
||||
pub type Result<T, E = error_stack::Report<Error>> = core::result::Result<T, E>;
|
||||
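Sketch of how the crate-wide `Error`/`Result` pair above is meant to be used: wrap the lower-level error with `change_context` and annotate it with `attach`, exactly as the wrapper modules do. The file path is illustrative.

```rust
use gst::errors::{Error, Result, ResultExt as _};

fn read_pipeline_description(path: &str) -> Result<String> {
    std::fs::read_to_string(path)
        .change_context(Error)
        .attach("Failed to read pipeline description")
}

fn main() {
    if let Err(report) = read_pipeline_description("/nonexistent/pipeline.txt") {
        eprintln!("{report:?}"); // the report carries the attached context chain
    }
}
```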
64
gst/src/lib.rs
Normal file
@@ -0,0 +1,64 @@
|
||||
pub mod bin;
|
||||
pub mod bus;
|
||||
pub mod caps;
|
||||
pub mod element;
|
||||
pub mod errors;
|
||||
pub mod pad;
|
||||
pub mod pipeline;
|
||||
pub mod plugins;
|
||||
#[macro_use]
|
||||
pub mod wrapper;
|
||||
pub mod sample;
|
||||
|
||||
pub use bin::*;
|
||||
pub use bus::*;
|
||||
pub use caps::*;
|
||||
pub use element::*;
|
||||
pub use gstreamer;
|
||||
#[doc(inline)]
|
||||
pub use gstreamer::{Message, MessageType, MessageView, State};
|
||||
pub use gstreamer_video::VideoFormat;
|
||||
pub use pad::*;
|
||||
pub use pipeline::*;
|
||||
pub use plugins::*;
|
||||
pub use sample::*;
|
||||
|
||||
pub(crate) mod priv_prelude {
|
||||
pub use crate::errors::*;
|
||||
pub use crate::wrapper::*;
|
||||
pub use crate::*;
|
||||
pub use gstreamer::prelude::ElementExt as _;
|
||||
pub use gstreamer::prelude::*;
|
||||
#[track_caller]
|
||||
pub fn duration_to_clocktime(
|
||||
timeout: impl Into<Option<core::time::Duration>>,
|
||||
) -> Result<Option<gstreamer::ClockTime>> {
|
||||
match timeout.into() {
|
||||
Some(dur) => {
|
||||
let clocktime = gstreamer::ClockTime::try_from(dur)
|
||||
.change_context(Error)
|
||||
.attach("Failed to convert duration to ClockTime")?;
|
||||
Ok(Some(clocktime))
|
||||
}
|
||||
None => Ok(gstreamer::ClockTime::NONE),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
use std::sync::Arc;
|
||||
static GST: std::sync::LazyLock<std::sync::Arc<Gst>> = std::sync::LazyLock::new(|| {
|
||||
gstreamer::init().expect("Failed to initialize GStreamer");
|
||||
std::sync::Arc::new(Gst {
|
||||
__private: core::marker::PhantomData,
|
||||
})
|
||||
});
|
||||
|
||||
pub struct Gst {
|
||||
__private: core::marker::PhantomData<()>,
|
||||
}
|
||||
|
||||
impl Gst {
|
||||
pub fn new() -> Arc<Self> {
|
||||
Arc::clone(&GST)
|
||||
}
|
||||
}
|
||||
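The `Gst` handle above guards a one-time `gstreamer::init()` behind a `LazyLock`, so `Gst::new()` can be called from every entry point (tests, the UI, helper binaries) without tracking who initialised GStreamer first. A minimal sketch:

```rust
use gst::Gst;

fn main() {
    let a = Gst::new(); // first call runs gstreamer::init()
    let b = Gst::new(); // later calls only clone the shared Arc
    assert!(std::sync::Arc::ptr_eq(&a, &b));
}
```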
45
gst/src/pad.rs
Normal file
@@ -0,0 +1,45 @@
|
||||
use crate::priv_prelude::*;
|
||||
|
||||
wrap_gst!(Pad, gstreamer::Pad);
|
||||
|
||||
impl Pad {
|
||||
#[track_caller]
|
||||
pub fn ghost(target: &Pad) -> Result<Pad> {
|
||||
let ghost_pad = gstreamer::GhostPad::with_target(&target.inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to create ghost pad")?;
|
||||
Ok(Pad {
|
||||
inner: ghost_pad.upcast(),
|
||||
})
|
||||
}
|
||||
|
||||
#[track_caller]
|
||||
pub fn link(&self, peer: &Pad) -> Result<()> {
|
||||
use gstreamer::prelude::*;
|
||||
self.inner
|
||||
.link(&peer.inner)
|
||||
.change_context(Error)
|
||||
.attach("Failed to link pads")?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
#[track_caller]
|
||||
pub fn current_caps(&self) -> Result<Caps> {
|
||||
let caps = self
|
||||
.inner
|
||||
.current_caps()
|
||||
.ok_or(Error)
|
||||
.attach("Failed to get pad caps")?;
|
||||
Ok(Caps { inner: caps })
|
||||
}
|
||||
|
||||
#[track_caller]
|
||||
pub fn activate(&self, activate: bool) -> Result<()> {
|
||||
use gstreamer::prelude::*;
|
||||
self.inner
|
||||
.set_active(activate)
|
||||
.change_context(Error)
|
||||
.attach("Failed to set_active pad")?;
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
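Sketch of the `Pad` helpers above, doing by hand what `Source::link` does internally: take an element's static sink pad, ghost it, and expose it on a `Bin`. The element and pad names follow GStreamer's usual `sink`/`src` convention.

```rust
use gst::plugins::videoconvertscale::VideoConvert;
use gst::{Bin, ElementExt as _, Gst, Pad};

fn main() -> gst::errors::Result<()> {
    Gst::new();
    let convert = VideoConvert::new("ghost-example-convert")?;
    let mut bin = Bin::new("ghost-example-bin");
    bin.add(&convert)?;

    // Expose the inner element's sink pad on the bin through a ghost pad.
    let target = convert.pad("sink").expect("videoconvert has a static sink pad");
    let ghost = Pad::ghost(&target)?;
    ghost.activate(true)?;
    bin.add_pad(&ghost)?;
    Ok(())
}
```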
211
gst/src/pipeline.rs
Normal file
@@ -0,0 +1,211 @@
|
||||
use crate::priv_prelude::*;
|
||||
|
||||
wrap_gst!(Pipeline);
|
||||
parent_child!(Element, Pipeline);
|
||||
parent_child!(Bin, Pipeline);
|
||||
|
||||
impl Drop for Pipeline {
|
||||
fn drop(&mut self) {
|
||||
let _ = self.inner.set_state(gstreamer::State::Null);
|
||||
}
|
||||
}
|
||||
|
||||
impl Pipeline {
|
||||
#[track_caller]
|
||||
pub fn bus(&self) -> Result<Bus> {
|
||||
let bus = self
|
||||
.inner
|
||||
.bus()
|
||||
.ok_or(Error)
|
||||
.attach("Failed to get bus from pipeline")?;
|
||||
Ok(Bus::from_gst(bus))
|
||||
}
|
||||
|
||||
/// Get the state
|
||||
pub fn state(
|
||||
&self,
|
||||
timeout: impl Into<Option<core::time::Duration>>,
|
||||
) -> Result<gstreamer::State> {
|
||||
let (result, current, _pending) = self.inner.state(duration_to_clocktime(timeout)?);
|
||||
result.change_context(Error).attach("Failed to get state")?;
|
||||
Ok(current)
|
||||
}
|
||||
|
||||
pub fn play(&self) -> Result<()> {
|
||||
self.inner
|
||||
.set_state(gstreamer::State::Playing)
|
||||
.change_context(Error)
|
||||
.attach("Failed to set pipeline to Playing state")?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub fn pause(&self) -> Result<()> {
|
||||
self.inner
|
||||
.set_state(gstreamer::State::Paused)
|
||||
.change_context(Error)
|
||||
.attach("Failed to set pipeline to Paused state")?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub fn ready(&self) -> Result<()> {
|
||||
self.inner
|
||||
.set_state(gstreamer::State::Ready)
|
||||
.change_context(Error)
|
||||
.attach("Failed to set pipeline to Ready state")?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub fn stop(&self) -> Result<()> {
|
||||
self.inner
|
||||
.set_state(gstreamer::State::Null)
|
||||
.change_context(Error)
|
||||
.attach("Failed to set pipeline to Null state")?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
pub fn set_state(&self, state: gstreamer::State) -> Result<gstreamer::StateChangeSuccess> {
|
||||
let result = self
|
||||
.inner
|
||||
.set_state(state)
|
||||
.change_context(Error)
|
||||
.attach("Failed to set pipeline state")?;
|
||||
Ok(result)
|
||||
}
|
||||
|
||||
pub async fn wait_for(&self, state: gstreamer::State) -> Result<()> {
|
||||
let current_state = self.state(core::time::Duration::ZERO)?;
|
||||
if current_state == state {
|
||||
Ok(())
|
||||
} else {
|
||||
// use futures::stream::StreamExt;
|
||||
use futures_lite::stream::StreamExt as _;
|
||||
self.bus()?
|
||||
.filtered_stream(&[MessageType::StateChanged])
|
||||
.find(|message: &gstreamer::Message| {
|
||||
let view = message.view();
|
||||
if let gstreamer::MessageView::StateChanged(changed) = view {
|
||||
changed.current() == state
|
||||
&& changed.src().is_some_and(|s| s == &self.inner)
|
||||
} else {
|
||||
false
|
||||
}
|
||||
})
|
||||
.await;
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn wait_for_states(&self, states: impl AsRef<[gstreamer::State]>) -> Result<()> {
|
||||
let current_state = self.state(core::time::Duration::ZERO)?;
|
||||
let states = states.as_ref();
|
||||
if states.contains(&current_state) {
|
||||
Ok(())
|
||||
} else {
|
||||
use futures_lite::stream::StreamExt as _;
|
||||
self.bus()?
|
||||
.filtered_stream(&[MessageType::StateChanged])
|
||||
.find(|message: &gstreamer::Message| {
|
||||
let view = message.view();
|
||||
if let gstreamer::MessageView::StateChanged(changed) = view {
|
||||
states.contains(&changed.current())
|
||||
&& changed.src().is_some_and(|s| s == &self.inner)
|
||||
} else {
|
||||
false
|
||||
}
|
||||
})
|
||||
.await;
|
||||
Ok(())
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn wait_for_message<'a, F2>(
|
||||
&self,
|
||||
filter: Option<&'a [gstreamer::MessageType]>,
|
||||
filter_fn: F2,
|
||||
) -> Result<gstreamer::Message>
|
||||
where
|
||||
F2: Fn(&gstreamer::Message) -> bool + Send + 'a,
|
||||
{
|
||||
use futures_lite::stream::StreamExt as _;
|
||||
match filter {
|
||||
Some(filter) => {
|
||||
let message = self.bus()?.filtered_stream(filter).find(filter_fn).await;
|
||||
match message {
|
||||
Some(msg) => Ok(msg),
|
||||
None => {
|
||||
Err(Error).attach("Failed to find message matching the provided filter")
|
||||
}
|
||||
}
|
||||
}
|
||||
None => {
|
||||
let message = self.bus()?.stream().find(filter_fn).await;
|
||||
match message {
|
||||
Some(msg) => Ok(msg),
|
||||
None => {
|
||||
Err(Error).attach("Failed to find message matching the provided filter")
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub trait PipelineExt: ChildOf<Pipeline> + Sync {
|
||||
// #[track_caller]
|
||||
// fn bus(&self) -> Result<Bus> {
|
||||
// self.upcast_ref().bus()
|
||||
// }
|
||||
#[track_caller]
|
||||
fn play(&self) -> Result<()> {
|
||||
self.upcast_ref().play()
|
||||
}
|
||||
#[track_caller]
|
||||
fn pause(&self) -> Result<()> {
|
||||
self.upcast_ref().pause()
|
||||
}
|
||||
#[track_caller]
|
||||
fn ready(&self) -> Result<()> {
|
||||
self.upcast_ref().ready()
|
||||
}
|
||||
|
||||
#[track_caller]
|
||||
fn stop(&self) -> Result<()> {
|
||||
self.upcast_ref().stop()
|
||||
}
|
||||
|
||||
#[track_caller]
|
||||
fn set_state(&self, state: gstreamer::State) -> Result<gstreamer::StateChangeSuccess> {
|
||||
self.upcast_ref().set_state(state)
|
||||
}
|
||||
#[track_caller]
|
||||
fn state(&self, timeout: impl Into<Option<core::time::Duration>>) -> Result<gstreamer::State> {
|
||||
self.upcast_ref().state(timeout)
|
||||
}
|
||||
|
||||
fn wait_for(
|
||||
&self,
|
||||
state: gstreamer::State,
|
||||
) -> impl std::future::Future<Output = Result<()>> + Send {
|
||||
self.upcast_ref().wait_for(state)
|
||||
}
|
||||
|
||||
fn wait_for_states(
|
||||
&self,
|
||||
states: impl AsRef<[gstreamer::State]> + Send,
|
||||
) -> impl std::future::Future<Output = Result<()>> + Send {
|
||||
self.upcast_ref().wait_for_states(states)
|
||||
}
|
||||
|
||||
fn wait_for_message<'a, F2>(
|
||||
&self,
|
||||
filter: Option<&'a [gstreamer::MessageType]>,
|
||||
filter_fn: F2,
|
||||
) -> impl std::future::Future<Output = Result<gstreamer::Message>> + Send
|
||||
where
|
||||
F2: Fn(&gstreamer::Message) -> bool + Send + 'a,
|
||||
{
|
||||
self.upcast_ref().wait_for_message(filter, filter_fn)
|
||||
}
|
||||
}
|
||||
|
||||
impl<T: ChildOf<Pipeline> + Sync> PipelineExt for T {}
|
||||
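Sketch of the async state helpers above, driven from a synchronous context with `smol` (already a dev-dependency). `wait_for` resolves once a `StateChanged` message from this pipeline reports the requested state; the URI is a placeholder for real media.

```rust
use gst::plugins::playback::Playbin3;
use gst::{Gst, PipelineExt as _, State};

fn main() -> gst::errors::Result<()> {
    Gst::new();
    let playbin = Playbin3::new("wait-example")?
        .with_uri("file:///tmp/example.mkv"); // placeholder URI

    playbin.pause()?;
    smol::block_on(async {
        // Completes when the pipeline itself reaches Paused (i.e. prerolled).
        playbin.wait_for(State::Paused).await
    })?;
    println!("pipeline state: {:?}", playbin.state(core::time::Duration::ZERO)?);
    playbin.stop()
}
```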
4
gst/src/plugins.rs
Normal file
@@ -0,0 +1,4 @@
|
||||
pub mod app;
|
||||
pub mod autodetect;
|
||||
pub mod playback;
|
||||
pub mod videoconvertscale;
|
||||
2
gst/src/plugins/app.rs
Normal file
@@ -0,0 +1,2 @@
|
||||
pub mod appsink;
|
||||
pub use appsink::*;
|
||||
278
gst/src/plugins/app/appsink.rs
Normal file
@@ -0,0 +1,278 @@
|
||||
use crate::priv_prelude::*;
|
||||
|
||||
#[doc(inline)]
|
||||
pub use gstreamer_app::AppSinkCallbacks;
|
||||
|
||||
wrap_gst!(AppSink, gstreamer::Element);
|
||||
parent_child!(Element, AppSink);
|
||||
|
||||
pub struct AppSinkBuilder {
|
||||
inner: AppSink,
|
||||
callbacks: Option<gstreamer_app::app_sink::AppSinkCallbacksBuilder>,
|
||||
}
|
||||
|
||||
impl AppSinkBuilder {
|
||||
pub fn on_new_sample<F>(mut self, mut f: F) -> Self
|
||||
where
|
||||
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
|
||||
{
|
||||
let mut callbacks_builder = self
|
||||
.callbacks
|
||||
.take()
|
||||
.unwrap_or_else(gstreamer_app::app_sink::AppSinkCallbacks::builder);
|
||||
callbacks_builder = callbacks_builder.new_sample(move |appsink| {
|
||||
use glib::object::Cast;
|
||||
let element = appsink.upcast_ref::<gstreamer::Element>();
|
||||
let appsink = AppSink::from_gst_ref(element);
|
||||
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
|
||||
.unwrap_or(Err(gstreamer::FlowError::Error))
|
||||
.map(|_| gstreamer::FlowSuccess::Ok)
|
||||
});
|
||||
self.callbacks = Some(callbacks_builder);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn on_new_preroll<F>(mut self, mut f: F) -> Self
|
||||
where
|
||||
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
|
||||
{
|
||||
let mut callbacks_builder = self
|
||||
.callbacks
|
||||
.take()
|
||||
.unwrap_or_else(gstreamer_app::app_sink::AppSinkCallbacks::builder);
|
||||
callbacks_builder = callbacks_builder.new_preroll(move |appsink| {
|
||||
use glib::object::Cast;
|
||||
let element = appsink.upcast_ref::<gstreamer::Element>();
|
||||
let appsink = AppSink::from_gst_ref(element);
|
||||
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
|
||||
.unwrap_or(Err(gstreamer::FlowError::Error))
|
||||
.map(|_| gstreamer::FlowSuccess::Ok)
|
||||
});
|
||||
self.callbacks = Some(callbacks_builder);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn build(self) -> AppSink {
|
||||
let AppSinkBuilder { inner, callbacks } = self;
|
||||
if let Some(callbacks) = callbacks {
|
||||
inner.appsink().set_callbacks(callbacks.build());
|
||||
}
|
||||
inner
|
||||
}
|
||||
}
|
||||
|
||||
impl Sink for AppSink {}
|
||||
|
||||
impl AppSink {
|
||||
pub fn builder(name: impl AsRef<str>) -> AppSinkBuilder {
|
||||
let inner = AppSink::new(name).expect("Failed to create AppSink");
|
||||
AppSinkBuilder {
|
||||
inner,
|
||||
callbacks: None,
|
||||
}
|
||||
}
|
||||
fn appsink(&self) -> &gstreamer_app::AppSink {
|
||||
self.inner
|
||||
.downcast_ref::<gstreamer_app::AppSink>()
|
||||
.expect("Failed to downcast to AppSink")
|
||||
}
|
||||
|
||||
pub fn new(name: impl AsRef<str>) -> Result<Self> {
|
||||
let inner = gstreamer::ElementFactory::make("appsink")
|
||||
.name(name.as_ref())
|
||||
.build()
|
||||
.change_context(Error)
|
||||
.attach("Failed to create appsink element")?;
|
||||
Ok(AppSink { inner })
|
||||
}
|
||||
|
||||
pub fn emit_signals(&mut self, emit: bool) -> &mut Self {
|
||||
self.inner.set_property("emit-signals", emit);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn async_(&mut self, async_: bool) -> &mut Self {
|
||||
self.inner.set_property("async", async_);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn sync(&mut self, sync: bool) -> &mut Self {
|
||||
self.inner.set_property("sync", sync);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn drop(&mut self, drop: bool) -> &mut Self {
|
||||
self.inner.set_property("drop", drop);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn caps(&mut self, caps: Caps) -> &mut Self {
|
||||
self.inner.set_property("caps", caps.inner);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn callbacks(&mut self, callbacks: gstreamer_app::AppSinkCallbacks) -> &mut Self {
|
||||
self.appsink().set_callbacks(callbacks);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn on_new_sample<F>(&mut self, mut f: F) -> &mut Self
|
||||
where
|
||||
F: FnMut(&AppSink) -> Result<(), gstreamer::FlowError> + Send + 'static,
|
||||
{
|
||||
self.emit_signals(true).callbacks(
|
||||
AppSinkCallbacks::builder()
|
||||
.new_sample(move |appsink| {
|
||||
use glib::object::Cast;
|
||||
let element = appsink.upcast_ref::<gstreamer::Element>();
|
||||
let appsink = AppSink::from_gst_ref(element);
|
||||
std::panic::catch_unwind(std::panic::AssertUnwindSafe(|| f(appsink)))
|
||||
.unwrap_or(Err(gstreamer::FlowError::Error))
|
||||
.map(|_| gstreamer::FlowSuccess::Ok)
|
||||
})
|
||||
.build(),
|
||||
)
|
||||
}
|
||||
|
||||
pub fn pull_sample(&self) -> Result<Sample> {
|
||||
self.appsink()
|
||||
.pull_sample()
|
||||
.change_context(Error)
|
||||
.attach("Failed to pull sample from AppSink")
|
||||
.map(Sample::from)
|
||||
}
|
||||
pub fn try_pull_sample(
|
||||
&self,
|
||||
timeout: impl Into<Option<core::time::Duration>>,
|
||||
) -> Result<Option<Sample>> {
|
||||
Ok(self
|
||||
.appsink()
|
||||
.try_pull_sample(duration_to_clocktime(timeout)?)
|
||||
.map(From::from))
|
||||
}
|
||||
|
||||
pub fn pull_preroll(&self) -> Result<Sample> {
|
||||
self.appsink()
|
||||
.pull_preroll()
|
||||
.change_context(Error)
|
||||
.attach("Failed to pull preroll sample from AppSink")
|
||||
.map(Sample::from)
|
||||
}
|
||||
|
||||
pub fn try_pull_preroll(
|
||||
&self,
|
||||
timeout: impl Into<Option<core::time::Duration>>,
|
||||
) -> Result<Option<Sample>> {
|
||||
Ok(self
|
||||
.appsink()
|
||||
.try_pull_preroll(duration_to_clocktime(timeout)?)
|
||||
.map(From::from))
|
||||
}
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_appsink() {
|
||||
use gstreamer::prelude::*;
|
||||
use tracing_subscriber::prelude::*;
|
||||
tracing_subscriber::registry()
|
||||
.with(
|
||||
tracing_subscriber::fmt::layer()
|
||||
.with_thread_ids(true)
|
||||
.with_file(true),
|
||||
)
|
||||
.init();
|
||||
tracing::info!("Linking videoconvert to appsink");
|
||||
Gst::new();
|
||||
let playbin3 = playback::Playbin3::new("pppppppppppppppppppppppppppppp").unwrap().with_uri("https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c");
|
||||
|
||||
let video_convert = plugins::videoconvertscale::VideoConvert::new("vcvcvcvcvcvcvcvcvcvcvcvcvc")
|
||||
.expect("Create videoconvert");
|
||||
let mut appsink = app::AppSink::new("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa").expect("Create appsink");
|
||||
appsink.caps(
|
||||
Caps::builder(CapsType::Video)
|
||||
.field("format", "RGB")
|
||||
.build(),
|
||||
);
|
||||
|
||||
let video_sink = video_convert
|
||||
.link(&appsink)
|
||||
.expect("Link videoconvert to appsink");
|
||||
|
||||
let playbin3 = playbin3.with_video_sink(&video_sink);
|
||||
playbin3.play().expect("Play video");
|
||||
let bus = playbin3.bus().unwrap();
|
||||
for msg in bus.iter_timed(None) {
|
||||
match msg.view() {
|
||||
gstreamer::MessageView::Eos(..) => {
|
||||
tracing::info!("End of stream reached");
|
||||
break;
|
||||
}
|
||||
gstreamer::MessageView::Error(err) => {
|
||||
tracing::error!(
|
||||
"Error from {:?}: {} ({:?})",
|
||||
err.src().map(|s| s.path_string()),
|
||||
err.error(),
|
||||
err.debug()
|
||||
);
|
||||
break;
|
||||
}
|
||||
gstreamer::MessageView::StateChanged(state) => {
|
||||
eprintln!(
|
||||
"State changed from {:?} to \x1b[33m{:?}\x1b[0m for {:?}",
|
||||
state.old(),
|
||||
state.current(),
|
||||
state.src().map(|s| s.path_string())
|
||||
);
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
// tracing::info!("{:#?}", &msg.view());
|
||||
}
|
||||
// std::thread::sleep(std::time::Duration::from_secs(5));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_appsink_metadata() {
|
||||
use tracing_subscriber::prelude::*;
|
||||
tracing_subscriber::registry()
|
||||
.with(
|
||||
tracing_subscriber::fmt::layer()
|
||||
.with_thread_ids(true)
|
||||
.with_file(true),
|
||||
)
|
||||
.init();
|
||||
|
||||
crate::Gst::new();
|
||||
|
||||
let url = "https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c";
|
||||
|
||||
let videoconvert = crate::plugins::videoconvertscale::VideoConvert::new("iced-video-convert")
|
||||
// .unwrap();
|
||||
// .with_output_format(gst::plugins::videoconvertscale::VideoFormat::Rgba)
|
||||
.unwrap();
|
||||
let appsink = crate::plugins::app::AppSink::new("iced-video-sink")
|
||||
.unwrap()
|
||||
.with_async(true)
|
||||
.with_sync(true);
|
||||
|
||||
let video_sink = videoconvert.link(&appsink).unwrap();
|
||||
let playbin = crate::plugins::playback::Playbin3::new("iced-video")
|
||||
.unwrap()
|
||||
.with_uri(url)
|
||||
.with_video_sink(&video_sink);
|
||||
|
||||
playbin.pause().unwrap();
|
||||
|
||||
smol::block_on(async {
|
||||
playbin.wait_for(gstreamer::State::Paused).await.unwrap();
|
||||
});
|
||||
// std::thread::sleep(core::time::Duration::from_secs(1));
|
||||
let pad = appsink.pad("sink").unwrap();
|
||||
let caps = pad.current_caps().unwrap();
|
||||
let format = caps.format();
|
||||
let height = caps.height();
|
||||
let width = caps.width();
|
||||
let framerate = caps.framerate();
|
||||
dbg!(&format, height, width, framerate);
|
||||
dbg!(&caps);
|
||||
}
|
||||
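Sketch of the `AppSink` builder above: register a new-sample callback, force raw RGBA caps, and wire the resulting bin into a `Playbin3`. The channel and URI are illustrative; a real consumer would hand the mapped frame to the renderer.

```rust
use gst::gstreamer::FlowError;
use gst::plugins::{app::AppSink, playback::Playbin3, videoconvertscale::VideoConvert};
use gst::{Caps, CapsType, Gst, PipelineExt as _, Source as _};

fn main() -> gst::errors::Result<()> {
    Gst::new();

    let (tx, rx) = std::sync::mpsc::channel();
    let mut appsink = AppSink::builder("frame-sink")
        .on_new_sample(move |sink| {
            // Pull the sample that triggered the callback and report its size.
            let sample = sink.pull_sample().map_err(|_| FlowError::Error)?;
            let bytes = sample.buffer().map(|b| b.size()).unwrap_or(0);
            let _ = tx.send(bytes);
            Ok(())
        })
        .build();
    appsink.caps(Caps::builder(CapsType::Video).field("format", "RGBA").build());

    let convert = VideoConvert::new("frame-convert")?;
    let video_sink = convert.link(&appsink)?;
    let playbin = Playbin3::new("frame-player")?
        .with_uri("file:///tmp/example.mkv") // placeholder URI
        .with_video_sink(&video_sink);
    playbin.play()?;

    if let Ok(bytes) = rx.recv_timeout(std::time::Duration::from_secs(10)) {
        println!("first decoded frame is {bytes} bytes");
    }
    playbin.stop()
}
```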
2
gst/src/plugins/autodetect.rs
Normal file
@@ -0,0 +1,2 @@
|
||||
pub mod autovideosink;
|
||||
pub use autovideosink::*;
|
||||
18
gst/src/plugins/autodetect/autovideosink.rs
Normal file
@@ -0,0 +1,18 @@
|
||||
use crate::priv_prelude::*;
|
||||
|
||||
wrap_gst!(AutoVideoSink, gstreamer::Element);
|
||||
parent_child!(Element, AutoVideoSink);
|
||||
parent_child!(Bin, AutoVideoSink, downcast);
|
||||
|
||||
impl Sink for AutoVideoSink {}
|
||||
|
||||
impl AutoVideoSink {
|
||||
pub fn new(name: impl AsRef<str>) -> Result<Self> {
|
||||
let element = gstreamer::ElementFactory::make("autovideosink")
|
||||
.name(name.as_ref())
|
||||
.build()
|
||||
.change_context(Error)
|
||||
.attach("Failed to create autovideosink element")?;
|
||||
Ok(AutoVideoSink { inner: element })
|
||||
}
|
||||
}
|
||||
71
gst/src/plugins/playback.rs
Normal file
@@ -0,0 +1,71 @@
|
||||
pub mod playbin3;
|
||||
pub use playbin3::*;
|
||||
pub mod playbin;
|
||||
pub use playbin::*;
|
||||
|
||||
bitflags::bitflags! {
|
||||
/// Extra flags to configure the behaviour of the sinks.
|
||||
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
|
||||
pub struct PlayFlags: u32 {
|
||||
/// Render the video stream
|
||||
const VIDEO = (1 << 0);
|
||||
/// Render the audio stream
|
||||
const AUDIO = (1 << 1);
|
||||
/// Render subtitles
|
||||
const TEXT = (1 << 2);
|
||||
/// Render visualisation when no video is present
|
||||
const VIS = (1 << 3);
|
||||
/// Use software volume
|
||||
const SOFT_VOLUME = (1 << 4);
|
||||
/// Only use native audio formats
|
||||
const NATIVE_AUDIO = (1 << 5);
|
||||
/// Only use native video formats
|
||||
const NATIVE_VIDEO = (1 << 6);
|
||||
/// Attempt progressive download buffering
|
||||
const DOWNLOAD = (1 << 7);
|
||||
/// Buffer demuxed/parsed data
|
||||
const BUFFERING = (1 << 8);
|
||||
/// Deinterlace video if necessary
|
||||
const DEINTERLACE = (1 << 9);
|
||||
/// Use software color balance
|
||||
const SOFT_COLORBALANCE = (1 << 10);
|
||||
/// Force audio/video filter(s) to be applied
|
||||
const FORCE_FILTERS = (1 << 11);
|
||||
/// Force only software-based decoders (no effect for playbin3)
|
||||
const FORCE_SW_DECODERS = (1 << 12);
|
||||
}
|
||||
}
|
||||
|
||||
const _: () = {
|
||||
use glib::types::StaticType;
|
||||
impl glib::types::StaticType for PlayFlags {
|
||||
#[inline]
|
||||
#[doc(alias = "gst_play_flags_get_type")]
|
||||
fn static_type() -> glib::Type {
|
||||
glib::Type::from_name("GstPlayFlags").expect("GstPlayFlags type not found")
|
||||
}
|
||||
}
|
||||
|
||||
impl glib::value::ToValue for PlayFlags {
|
||||
#[inline]
|
||||
fn to_value(&self) -> glib::Value {
|
||||
let value = self.bits().to_value();
|
||||
value
|
||||
.transform_with_type(Self::static_type())
|
||||
.expect("Failed to transform PlayFlags(u32) to GstPlayFlags")
|
||||
}
|
||||
|
||||
#[inline]
|
||||
fn value_type(&self) -> glib::Type {
|
||||
Self::static_type()
|
||||
}
|
||||
}
|
||||
|
||||
impl From<PlayFlags> for glib::Value {
|
||||
#[inline]
|
||||
fn from(v: PlayFlags) -> Self {
|
||||
// skip_assert_initialized!();
|
||||
glib::value::ToValue::to_value(&v)
|
||||
}
|
||||
}
|
||||
};
|
||||
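Sketch of composing `PlayFlags`: the constants mirror `GstPlayFlags`, so enabling progressive download on top of the defaults is a plain bitwise OR, and the `ToValue` impl above lets the combined value be set as the `flags` property.

```rust
use gst::plugins::playback::{PlayFlags, Playbin3};
use gst::Gst;

fn main() -> gst::errors::Result<()> {
    Gst::new();
    let flags = Playbin3::default_flags() | PlayFlags::DOWNLOAD;
    assert!(flags.contains(PlayFlags::VIDEO | PlayFlags::AUDIO));

    // Creating the element first ensures the GstPlayFlags GType is registered
    // before the flags value is converted with `to_value`.
    let playbin = Playbin3::new("flags-example")?.with_flags(flags);
    let _ = playbin; // would normally continue with with_uri(...) and play()
    Ok(())
}
```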
82
gst/src/plugins/playback/playbin.rs
Normal file
@@ -0,0 +1,82 @@
|
||||
use crate::priv_prelude::*;
|
||||
|
||||
wrap_gst!(Playbin, gstreamer::Element);
|
||||
parent_child!(Element, Playbin);
|
||||
parent_child!(Pipeline, Playbin, downcast);
|
||||
parent_child!(Bin, Playbin, downcast);
|
||||
|
||||
impl Drop for Playbin {
|
||||
fn drop(&mut self) {
|
||||
self.set_state(gstreamer::State::Null).ok();
|
||||
}
|
||||
}
|
||||
|
||||
impl Playbin {
|
||||
pub fn new(name: impl AsRef<str>) -> Result<Self> {
|
||||
gstreamer::ElementFactory::make("playbin3")
|
||||
.name(name.as_ref())
|
||||
.build()
|
||||
.map(|element| Playbin { inner: element })
|
||||
.change_context(Error)
|
||||
}
|
||||
|
||||
pub fn with_uri(self, uri: impl AsRef<str>) -> Self {
|
||||
self.inner.set_property("uri", uri.as_ref());
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_buffer_duration(self, duration: impl Into<Option<core::time::Duration>>) -> Self {
|
||||
let duration = match duration.into() {
|
||||
Some(dur) => dur.as_secs() as i64,
|
||||
None => -1,
|
||||
};
|
||||
self.inner.set_property("buffer-duration", duration);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_buffer_size(self, size: impl Into<Option<u32>>) -> Self {
|
||||
let size = match size.into() {
|
||||
Some(size) => size as i32,
|
||||
None => -1,
|
||||
};
|
||||
self.inner.set_property("buffer-size", size);
|
||||
self
|
||||
}
|
||||
|
||||
/// Sets the maximum size of the ring buffer in bytes.
|
||||
pub fn with_ring_buffer_max_size(self, size: u64) -> Self {
|
||||
self.inner.set_property("ring-buffer-max-size", size);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_video_sink(self, video_sink: &impl ChildOf<Element>) -> Self {
|
||||
self.inner
|
||||
.set_property("video-sink", &video_sink.upcast_ref().inner);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_text_sink(self, text_sink: &impl ChildOf<Element>) -> Self {
|
||||
self.inner
|
||||
.set_property("text-sink", &text_sink.upcast_ref().inner);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_audio_sink(self, audio_sink: &impl ChildOf<Element>) -> Self {
|
||||
self.inner
|
||||
.set_property("audio-sink", &audio_sink.upcast_ref().inner);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn set_volume(&self, volume: f64) {
|
||||
self.inner.set_property("volume", volume.clamp(1.0, 100.0))
|
||||
}
|
||||
|
||||
pub fn get_volume(&self) -> f64 {
|
||||
self.inner.property::<f64>("volume")
|
||||
}
|
||||
|
||||
pub fn with_flags(self, flags: playback::PlayFlags) -> Self {
|
||||
self.inner.set_property("flags", flags);
|
||||
self
|
||||
}
|
||||
}
|
||||
95
gst/src/plugins/playback/playbin3.rs
Normal file
@@ -0,0 +1,95 @@
|
||||
use crate::priv_prelude::*;
|
||||
use playback::PlayFlags;
|
||||
|
||||
wrap_gst!(Playbin3, gstreamer::Element);
|
||||
parent_child!(Element, Playbin3);
|
||||
parent_child!(Pipeline, Playbin3, downcast);
|
||||
parent_child!(Bin, Playbin3, downcast);
|
||||
|
||||
impl Drop for Playbin3 {
|
||||
fn drop(&mut self) {
|
||||
self.set_state(gstreamer::State::Null).ok();
|
||||
}
|
||||
}
|
||||
|
||||
impl Playbin3 {
|
||||
pub fn new(name: impl AsRef<str>) -> Result<Self> {
|
||||
gstreamer::ElementFactory::make("playbin3")
|
||||
.name(name.as_ref())
|
||||
.build()
|
||||
.map(|element| Playbin3 { inner: element })
|
||||
.change_context(Error)
|
||||
}
|
||||
|
||||
pub fn with_uri(self, uri: impl AsRef<str>) -> Self {
|
||||
self.inner.set_property("uri", uri.as_ref());
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_buffer_duration(self, duration: impl Into<Option<core::time::Duration>>) -> Self {
|
||||
let duration = match duration.into() {
|
||||
Some(dur) => dur.as_secs() as i64,
|
||||
None => -1,
|
||||
};
|
||||
self.inner.set_property("buffer-duration", duration);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_buffer_size(self, size: impl Into<Option<u32>>) -> Self {
|
||||
let size = match size.into() {
|
||||
Some(size) => size as i32,
|
||||
None => -1,
|
||||
};
|
||||
self.inner.set_property("buffer-size", size);
|
||||
self
|
||||
}
|
||||
|
||||
/// Sets the maximum size of the ring buffer in bytes.
|
||||
pub fn with_ring_buffer_max_size(self, size: u64) -> Self {
|
||||
self.inner.set_property("ring-buffer-max-size", size);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_video_sink(self, video_sink: &impl ChildOf<Element>) -> Self {
|
||||
self.inner
|
||||
.set_property("video-sink", &video_sink.upcast_ref().inner);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_text_sink(self, text_sink: &impl ChildOf<Element>) -> Self {
|
||||
self.inner
|
||||
.set_property("text-sink", &text_sink.upcast_ref().inner);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn with_audio_sink(self, audio_sink: &impl ChildOf<Element>) -> Self {
|
||||
self.inner
|
||||
.set_property("audio-sink", &audio_sink.upcast_ref().inner);
|
||||
self
|
||||
}
|
||||
|
||||
pub fn set_volume(&self, volume: f64) {
|
||||
self.inner.set_property("volume", volume.clamp(1.0, 100.0))
|
||||
}
|
||||
|
||||
pub fn get_volume(&self) -> f64 {
|
||||
self.inner.property::<f64>("volume")
|
||||
}
|
||||
|
||||
pub fn with_flags(self, flags: playback::PlayFlags) -> Self {
|
||||
self.inner.set_property("flags", flags);
|
||||
self
|
||||
}
|
||||
}
|
||||
|
||||
impl Playbin3 {
|
||||
pub fn default_flags() -> PlayFlags {
|
||||
PlayFlags::SOFT_COLORBALANCE
|
||||
| PlayFlags::DEINTERLACE
|
||||
| PlayFlags::BUFFERING
|
||||
| PlayFlags::SOFT_VOLUME
|
||||
| PlayFlags::TEXT
|
||||
| PlayFlags::AUDIO
|
||||
| PlayFlags::VIDEO
|
||||
}
|
||||
}
|
||||
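Sketch of the buffering knobs on the `Playbin3` wrapper above. Values are illustrative; passing `None` to the `Option`-taking setters maps to the element defaults (`-1`).

```rust
use gst::plugins::playback::Playbin3;
use gst::Gst;

fn main() -> gst::errors::Result<()> {
    Gst::new();
    let playbin = Playbin3::new("buffering-example")?
        .with_uri("https://example.invalid/stream.mkv") // placeholder URI
        .with_buffer_duration(core::time::Duration::from_secs(10))
        .with_buffer_size(4u32 * 1024 * 1024) // 4 MiB
        .with_ring_buffer_max_size(64 * 1024 * 1024); // bytes
    println!("current volume: {}", playbin.get_volume());
    Ok(())
}
```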
2
gst/src/plugins/videoconvertscale.rs
Normal file
@@ -0,0 +1,2 @@
|
||||
pub mod videoconvert;
|
||||
pub use videoconvert::*;
|
||||
36
gst/src/plugins/videoconvertscale/videoconvert.rs
Normal file
@@ -0,0 +1,36 @@
|
||||
use crate::priv_prelude::*;
|
||||
#[doc(inline)]
|
||||
pub use gstreamer_video::VideoFormat;
|
||||
|
||||
wrap_gst!(VideoConvert, gstreamer::Element);
|
||||
parent_child!(Element, VideoConvert);
|
||||
|
||||
impl Sink for VideoConvert {}
|
||||
impl Source for VideoConvert {}
|
||||
|
||||
impl VideoConvert {
|
||||
pub fn new(name: impl AsRef<str>) -> Result<Self> {
|
||||
let element = gstreamer::ElementFactory::make("videoconvert")
|
||||
.name(name.as_ref())
|
||||
.build()
|
||||
.change_context(Error)
|
||||
.attach("Failed to create videoconvert element")?;
|
||||
Ok(VideoConvert { inner: element })
|
||||
}
|
||||
|
||||
// pub fn with_caps(mut self, caps: &gstreamer::Caps) -> Self {
|
||||
// use gstreamer::prelude::*;
|
||||
// self.inner.set_property("caps", caps);
|
||||
// self
|
||||
// }
|
||||
pub fn with_output_format(self, format: VideoFormat) -> Result<Self> {
|
||||
use gstreamer::prelude::*;
|
||||
let caps = Caps::builder(CapsType::Video)
|
||||
.field("format", format.to_str())
|
||||
.build();
|
||||
self.inner.set_property("caps", &caps.inner);
|
||||
// .change_context(Error)
|
||||
// .attach("Failed to set output format on videoconvert")?;
|
||||
Ok(self)
|
||||
}
|
||||
}
|
||||
37
gst/src/sample.rs
Normal file
@@ -0,0 +1,37 @@
|
||||
impl From<gstreamer::Sample> for Sample {
|
||||
fn from(inner: gstreamer::Sample) -> Self {
|
||||
Sample { inner }
|
||||
}
|
||||
}
|
||||
|
||||
#[repr(transparent)]
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Sample {
|
||||
pub inner: gstreamer::Sample,
|
||||
}
|
||||
|
||||
use gstreamer::BufferRef;
|
||||
impl Sample {
|
||||
#[doc(alias = "empty")]
|
||||
pub fn new() -> Self {
|
||||
Self {
|
||||
inner: gstreamer::Sample::builder().build(),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn buffer(&self) -> Option<&BufferRef> {
|
||||
self.inner.buffer()
|
||||
}
|
||||
|
||||
pub fn caps(&self) -> Option<&gstreamer::CapsRef> {
|
||||
self.inner.caps()
|
||||
}
|
||||
|
||||
pub fn info(&self) -> Option<&gstreamer::StructureRef> {
|
||||
self.inner.info()
|
||||
}
|
||||
|
||||
// pub fn set_buffer(&mut self) {
|
||||
// self.inner.set_buffer(None);
|
||||
// }
|
||||
}
|
||||
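Sketch of reading raw bytes out of a `Sample`: `buffer()` hands back the underlying `gstreamer::BufferRef`, so the usual `map_readable` dance applies. The empty sample in `main` just demonstrates the error path.

```rust
use gst::errors::{Error, Result, ResultExt as _};
use gst::{Gst, Sample};

fn frame_len(sample: &Sample) -> Result<usize> {
    let buffer = sample.buffer().ok_or(Error).attach("Sample has no buffer")?;
    let map = buffer
        .map_readable()
        .change_context(Error)
        .attach("Failed to map buffer for reading")?;
    Ok(map.as_slice().len())
}

fn main() {
    Gst::new();
    let empty = Sample::new(); // builder-made sample with no buffer attached
    assert!(frame_len(&empty).is_err());
}
```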
2
gst/src/wgpu.rs
Normal file
@@ -0,0 +1,2 @@
|
||||
|
||||
|
||||
145
gst/src/wrapper.rs
Normal file
@@ -0,0 +1,145 @@
|
||||
pub trait GstWrapper {
|
||||
type GstType: glib::prelude::ObjectType;
|
||||
fn from_gst(gst: Self::GstType) -> Self;
|
||||
// fn into_gst(self) -> Self::GstType;
|
||||
fn as_gst_ref(&self) -> &Self::GstType;
|
||||
fn from_gst_ref(gst: &Self::GstType) -> &Self;
|
||||
}
|
||||
|
||||
#[macro_export]
|
||||
macro_rules! wrap_gst {
|
||||
($name:ident) => {
|
||||
$crate::wrap_gst!($name, gstreamer::$name);
|
||||
};
|
||||
($name:ident, $inner:ty) => {
|
||||
$crate::wrap_gst!(core $name, $inner);
|
||||
$crate::wrap_gst!($name, $inner, into_inner);
|
||||
};
|
||||
($name:ident, $inner:ty, skip_inner) => {
|
||||
$crate::wrap_gst!(core $name, $inner);
|
||||
};
|
||||
|
||||
(core $name:ident, $inner:ty) => {
|
||||
#[derive(Debug, Clone)]
|
||||
#[repr(transparent)]
|
||||
pub struct $name {
|
||||
pub(crate) inner: $inner,
|
||||
}
|
||||
|
||||
// impl From<$name> for $inner {
|
||||
// fn from(wrapper: $name) -> Self {
|
||||
// wrapper.into_inner()
|
||||
// }
|
||||
// }
|
||||
|
||||
impl $name {
|
||||
pub fn into_inner(self) -> $inner {
|
||||
self.inner.clone()
|
||||
}
|
||||
}
|
||||
|
||||
impl $crate::wrapper::GstWrapper for $name {
|
||||
type GstType = $inner;
|
||||
|
||||
fn from_gst(gst: Self::GstType) -> Self {
|
||||
Self { inner: gst }
|
||||
}
|
||||
|
||||
// fn into_gst(self) -> Self::GstType {
|
||||
// self.inner.clone()
|
||||
// }
|
||||
|
||||
fn as_gst_ref(&self) -> &Self::GstType {
|
||||
&self.inner
|
||||
}
|
||||
|
||||
fn from_gst_ref(gst: &Self::GstType) -> &Self {
|
||||
unsafe { &*(gst as *const Self::GstType as *const Self) }
|
||||
}
|
||||
}
|
||||
|
||||
impl ChildOf<$name> for $name {
|
||||
fn upcast_ref(&self) -> &$name {
|
||||
self
|
||||
}
|
||||
}
|
||||
};
|
||||
($name:ident, $inner:ty, into_inner) => {
|
||||
impl From<$inner> for $name {
|
||||
fn from(inner: $inner) -> Self {
|
||||
Self { inner }
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
/// A trait for types that can be upcasted to type T.
|
||||
pub trait ChildOf<T> {
|
||||
fn upcast_ref(&self) -> &T;
|
||||
}
|
||||
|
||||
#[macro_export]
|
||||
macro_rules! parent_child {
|
||||
($parent:ty, $child:ty) => {
|
||||
impl ChildOf<$parent> for $child
|
||||
where
|
||||
$child: GstWrapper,
|
||||
$parent: GstWrapper,
|
||||
{
|
||||
fn upcast_ref(&self) -> &$parent {
|
||||
let upcasted = self.inner.upcast_ref::<<$parent as GstWrapper>::GstType>();
|
||||
unsafe { &*(upcasted as *const <$parent as GstWrapper>::GstType as *const $parent) }
|
||||
}
|
||||
}
|
||||
};
|
||||
($parent:ty, $child:ty, downcast) => {
|
||||
impl ChildOf<$parent> for $child
|
||||
where
|
||||
$child: GstWrapper,
|
||||
$parent: GstWrapper,
|
||||
{
|
||||
fn upcast_ref(&self) -> &$parent {
|
||||
let downcasted = self
|
||||
.inner
|
||||
.downcast_ref::<<$parent as GstWrapper>::GstType>()
|
||||
.expect(
|
||||
format!(
|
||||
"BUG: Failed to downcast GStreamer type from child {} to parent {}",
|
||||
stringify!($child),
|
||||
stringify!($parent)
|
||||
)
|
||||
.as_str(),
|
||||
);
|
||||
unsafe {
|
||||
&*(downcasted as *const <$parent as GstWrapper>::GstType as *const $parent)
|
||||
}
|
||||
}
|
||||
}
|
||||
}; // ($parent:ty, $child:ty, deref) => {
|
||||
// $crate::parent_child!($parent, $child);
|
||||
// $crate::parent_child!($parent, $child, __deref);
|
||||
// };
|
||||
//
|
||||
// ($parent:ty, $child:ty, downcast, deref) => {
|
||||
// $crate::parent_child!($parent, $child, downcast);
|
||||
// $crate::parent_child!($parent, $child, __deref);
|
||||
// };
|
||||
// ($parent:ty, $child:ty, deref, downcast) => {
|
||||
// $crate::parent_child!($parent, $child, downcast);
|
||||
// $crate::parent_child!($parent, $child, __deref);
|
||||
// };
|
||||
//
|
||||
// ($parent:ty, $child:ty, __deref) => {
|
||||
// impl core::ops::Deref for $child
|
||||
// where
|
||||
// $child: GstWrapper,
|
||||
// $parent: GstWrapper,
|
||||
// {
|
||||
// type Target = $parent;
|
||||
//
|
||||
// fn deref(&self) -> &Self::Target {
|
||||
// self.upcast_ref()
|
||||
// }
|
||||
// }
|
||||
// };
|
||||
}
|
||||
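Sketch of what adding another element wrapper looks like with `wrap_gst!` and `parent_child!`. `VideoScale` (wrapping the stock `videoscale` element) is hypothetical and not part of this diff; it just follows the same pattern as the existing `VideoConvert` wrapper. Note that both macros expect `ChildOf`, `GstWrapper` and the glib `Cast` trait to be in scope at the call site.

```rust
use gst::errors::{Error, Result, ResultExt as _};
use gst::gstreamer::prelude::*;
use gst::wrapper::{ChildOf, GstWrapper};
use gst::{parent_child, wrap_gst, Element, Sink, Source};

// Hypothetical wrapper, written the same way videoconvert.rs does it.
wrap_gst!(VideoScale, gst::gstreamer::Element);
parent_child!(Element, VideoScale);

impl Sink for VideoScale {}
impl Source for VideoScale {}

impl VideoScale {
    pub fn new(name: impl AsRef<str>) -> Result<Self> {
        let element = gst::gstreamer::ElementFactory::make("videoscale")
            .name(name.as_ref())
            .build()
            .change_context(Error)
            .attach("Failed to create videoscale element")?;
        Ok(VideoScale { inner: element })
    }
}

fn main() -> Result<()> {
    gst::Gst::new();
    let scale = VideoScale::new("example-scale")?;
    println!("created {}", scale.as_gst_ref().name());
    Ok(())
}
```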
8
jello-types/Cargo.toml
Normal file
@@ -0,0 +1,8 @@
|
||||
[package]
|
||||
name = "jello-types"
|
||||
version = "0.1.0"
|
||||
edition = "2024"
|
||||
|
||||
[dependencies]
|
||||
serde = { version = "1.0.228", features = ["derive"] }
|
||||
uuid = { version = "1.18.1", features = ["serde"] }
|
||||
6
jello-types/src/lib.rs
Normal file
@@ -0,0 +1,6 @@
|
||||
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize)]
|
||||
pub struct User {
|
||||
id: uuid::Uuid,
|
||||
name: Option<String>,
|
||||
primary_image_tag: Option<String>,
|
||||
}
|
||||
12
justfile
@@ -1,6 +1,18 @@
|
||||
jello:
|
||||
cargo r -r -- -vv
|
||||
# iced-video:
|
||||
# cd crates/iced-video && cargo run --release --example minimal
|
||||
typegen:
|
||||
@echo "Generating jellyfin type definitions..."
|
||||
cd typegen && cargo run
|
||||
cp typegen/jellyfin.rs api/src/jellyfin.rs
|
||||
rm typegen/jellyfin.rs
|
||||
|
||||
hdrtest:
|
||||
GST_DEBUG=3 gst-launch-1.0 playbin3 uri=https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c video-sink="videoconvert ! video/x-raw,format=(string)RGB10A2_LE ! fakesink"
|
||||
|
||||
codec:
|
||||
GST_DEBUG=3 gst-discoverer-1.0 https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c
|
||||
ffprobe:
|
||||
ffprobe -v error -show_format -show_streams "https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c" | grep pix_fmt
|
||||
|
||||
|
||||
66
src/cli.rs
@@ -1,36 +1,38 @@
|
||||
#[derive(Debug, clap::Parser)]
|
||||
pub struct Cli {
|
||||
#[clap(subcommand)]
|
||||
pub cmd: SubCommand,
|
||||
// #[clap(subcommand)]
|
||||
// pub cmd: SubCommand,
|
||||
#[command(flatten)]
|
||||
pub verbosity: clap_verbosity_flag::Verbosity,
|
||||
}
|
||||
|
||||
#[derive(Debug, clap::Subcommand)]
|
||||
pub enum SubCommand {
|
||||
#[clap(name = "add")]
|
||||
Add(Add),
|
||||
#[clap(name = "list")]
|
||||
List(List),
|
||||
#[clap(name = "completions")]
|
||||
Completions { shell: clap_complete::Shell },
|
||||
}
|
||||
|
||||
#[derive(Debug, clap::Args)]
|
||||
pub struct Add {
|
||||
#[clap(short, long)]
|
||||
pub name: String,
|
||||
}
|
||||
|
||||
#[derive(Debug, clap::Args)]
|
||||
pub struct List {}
|
||||
|
||||
impl Cli {
|
||||
pub fn completions(shell: clap_complete::Shell) {
|
||||
let mut command = <Cli as clap::CommandFactory>::command();
|
||||
clap_complete::generate(
|
||||
shell,
|
||||
&mut command,
|
||||
env!("CARGO_BIN_NAME"),
|
||||
&mut std::io::stdout(),
|
||||
);
|
||||
}
|
||||
}
|
||||
// #[derive(Debug, clap::Subcommand)]
|
||||
// pub enum SubCommand {
|
||||
// #[clap(name = "add")]
|
||||
// Add(Add),
|
||||
// #[clap(name = "list")]
|
||||
// List(List),
|
||||
// #[clap(name = "completions")]
|
||||
// Completions { shell: clap_complete::Shell },
|
||||
// }
|
||||
//
|
||||
// #[derive(Debug, clap::Args)]
|
||||
// pub struct Add {
|
||||
// #[clap(short, long)]
|
||||
// pub name: String,
|
||||
// }
|
||||
//
|
||||
// #[derive(Debug, clap::Args)]
|
||||
// pub struct List {}
|
||||
//
|
||||
// impl Cli {
|
||||
// pub fn completions(shell: clap_complete::Shell) {
|
||||
// let mut command = <Cli as clap::CommandFactory>::command();
|
||||
// clap_complete::generate(
|
||||
// shell,
|
||||
// &mut command,
|
||||
// env!("CARGO_BIN_NAME"),
|
||||
// &mut std::io::stdout(),
|
||||
// );
|
||||
// }
|
||||
// }
|
||||
|
||||
54
src/main.rs
@@ -1,52 +1,16 @@
|
||||
mod cli;
|
||||
mod errors;
|
||||
use api::JellyfinConfig;
|
||||
use errors::*;
|
||||
|
||||
fn jellyfin_config_try() -> Result<JellyfinConfig> {
|
||||
let file = std::fs::read("config.toml").change_context(Error)?;
|
||||
let config: JellyfinConfig = toml::from_slice(&file)
|
||||
.change_context(Error)
|
||||
.attach("Failed to parse Jellyfin Config")?;
|
||||
Ok(config)
|
||||
}
|
||||
|
||||
fn jellyfin_config() -> JellyfinConfig {
|
||||
jellyfin_config_try().unwrap_or_else(|err| {
|
||||
eprintln!("Error loading Jellyfin configuration: {:?}", err);
|
||||
std::process::exit(1);
|
||||
})
|
||||
}
|
||||
|
||||
fn main() -> Result<()> {
|
||||
tracing_subscriber::fmt::init();
|
||||
ui_iced::ui(jellyfin_config).change_context(Error)?;
|
||||
color_backtrace::install();
|
||||
let args = <cli::Cli as clap::Parser>::parse();
|
||||
tracing_subscriber::fmt()
|
||||
.with_max_level(args.verbosity)
|
||||
.with_file(true)
|
||||
.with_line_number(true)
|
||||
.init();
|
||||
ui_iced::ui().change_context(Error)?;
|
||||
Ok(())
|
||||
}
|
||||
|
||||
// #[tokio::main]
|
||||
// pub async fn main() -> Result<()> {
|
||||
// dotenvy::dotenv()
|
||||
// .change_context(Error)
|
||||
// .inspect_err(|err| {
|
||||
// eprintln!("Failed to load .env file: {}", err);
|
||||
// })
|
||||
// .ok();
|
||||
// let config = JellyfinConfig::new(
|
||||
// std::env::var("JELLYFIN_USERNAME").change_context(Error)?,
|
||||
// std::env::var("JELLYFIN_PASSWORD").change_context(Error)?,
|
||||
// std::env::var("JELLYFIN_SERVER_URL").change_context(Error)?,
|
||||
// "jello".to_string(),
|
||||
// );
|
||||
// let mut jellyfin = api::JellyfinClient::new(config);
|
||||
// jellyfin
|
||||
// .authenticate_with_cached_token(".session")
|
||||
// .await
|
||||
// .change_context(Error)?;
|
||||
//
|
||||
// #[cfg(feature = "iced")]
|
||||
// ui_iced::ui(jellyfin);
|
||||
// #[cfg(feature = "gpui")]
|
||||
// ui_gpui::ui(jellyfin);
|
||||
//
|
||||
// Ok(())
|
||||
// }
|
||||
|
||||
12
store/Cargo.toml
Normal file
@@ -0,0 +1,12 @@
[package]
name = "store"
version = "0.1.0"
edition = "2024"

[dependencies]
futures = "0.3.31"
parking_lot = "0.12.5"
secrecy = "0.10.3"
serde = "1.0.228"
tokio = { version = "1.48.0", features = ["rt"] }
uuid = { version = "1.18.1", features = ["v4"] }
10
store/src/lib.rs
Normal file
@@ -0,0 +1,10 @@
use std::collections::BTreeMap;

use uuid::Uuid;

pub struct ApiKey {
    inner: secrecy::SecretBox<String>,
}
pub struct SecretStore {
    api_keys: BTreeMap<Uuid, ApiKey>,
}
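The new `store` crate so far only declares its data shapes. Below is a minimal sketch of how these types might grow accessors, assuming secrecy 0.10's `SecretBox`/`ExposeSecret` API; the `new`, `insert`, and `expose` helpers are hypothetical and not part of the diff:

```rust
use secrecy::{ExposeSecret, SecretBox};
use uuid::Uuid;

impl ApiKey {
    // Hypothetical constructor: box the plaintext so secrecy can zeroize it on drop.
    pub fn new(key: String) -> Self {
        Self {
            inner: SecretBox::new(Box::new(key)),
        }
    }
}

impl SecretStore {
    // Hypothetical helpers; the hunk above only defines the fields.
    pub fn insert(&mut self, id: Uuid, key: ApiKey) {
        self.api_keys.insert(id, key);
    }

    // Reading the plaintext stays an explicit opt-in via ExposeSecret.
    pub fn expose(&self, id: &Uuid) -> Option<&str> {
        self.api_keys.get(id).map(|k| k.inner.expose_secret().as_str())
    }
}
```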
1
store/src/sqlite.rs
Normal file
@@ -0,0 +1 @@

1
store/src/toml.rs
Normal file
@@ -0,0 +1 @@

@@ -1,262 +1,262 @@
|
||||
use ::tap::*;
|
||||
|
||||
use std::{collections::BTreeMap, sync::Arc};
|
||||
|
||||
use gpui::{
|
||||
App, Application, Bounds, ClickEvent, Context, ImageId, ImageSource, RenderImage, Resource,
|
||||
SharedString, Window, WindowBounds, WindowOptions, actions, div, prelude::*, px, rgb, size,
|
||||
};
|
||||
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct AppState {
|
||||
pub title: SharedString,
|
||||
pub items: BTreeMap<SharedString, Item>,
|
||||
pub item_ids: BTreeMap<usize, SharedString>,
|
||||
pub current_item: Option<SharedString>,
|
||||
pub errors: Vec<String>,
|
||||
pub jellyfin_client: api::JellyfinClient,
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct Item {
|
||||
pub id: SharedString,
|
||||
pub name: SharedString,
|
||||
pub item_type: SharedString,
|
||||
pub media_type: SharedString,
|
||||
}
|
||||
|
||||
impl Render for AppState {
|
||||
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
|
||||
div()
|
||||
.flex()
|
||||
.flex_col()
|
||||
.size_full()
|
||||
.justify_center()
|
||||
.text_color(rgb(0xffffff))
|
||||
.child(Self::header())
|
||||
.child(Self::body(self, window, cx))
|
||||
.child(Self::footer())
|
||||
}
|
||||
}
|
||||
|
||||
actions!(jello_actions, [OpenItem, OnLoadItem, MouseDownEvent]);
|
||||
|
||||
impl AppState {
|
||||
fn new(title: impl AsRef<str>, jellyfin_client: api::JellyfinClient) -> Self {
|
||||
AppState {
|
||||
title: SharedString::new(title.as_ref()),
|
||||
items: BTreeMap::new(),
|
||||
item_ids: BTreeMap::new(),
|
||||
current_item: None,
|
||||
errors: Vec::new(),
|
||||
jellyfin_client,
|
||||
}
|
||||
}
|
||||
|
||||
// fn on_mouse_down(
|
||||
// &mut self,
|
||||
// event: &MouseDownEvent,
|
||||
// window: &mut Window,
|
||||
// cx: &mut Context<Self>,
|
||||
// ) {
|
||||
// // Handle mouse down event
|
||||
// }
|
||||
|
||||
fn load_item(id: usize) -> impl Fn(&mut Self, &ClickEvent, &mut Window, &mut Context<Self>) {
|
||||
move |state: &mut Self, event: &ClickEvent, window: &mut Window, cx: &mut Context<Self>| {
|
||||
let item_id = id;
|
||||
cx.spawn(async move |entity, app| {
|
||||
tracing::info!("Loading item with id: {}", item_id);
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
fn hover_item(id: usize) -> impl Fn(&mut Self, &bool, &mut Window, &mut Context<Self>) {
|
||||
move |state: &mut Self, item: &bool, window: &mut Window, cx: &mut Context<Self>| {
|
||||
dbg!("Hovering over item: {:?}", id);
|
||||
}
|
||||
}
|
||||
|
||||
fn header() -> impl IntoElement {
|
||||
div()
|
||||
.flex()
|
||||
.flex_row()
|
||||
.w_full()
|
||||
.justify_end()
|
||||
.h_20()
|
||||
.border_10()
|
||||
.bg(rgb(0x333333))
|
||||
.child(Self::button("Refresh"))
|
||||
}
|
||||
|
||||
fn footer() -> impl IntoElement {
|
||||
div().flex().flex_row().w_full().h_20().bg(rgb(0x333333))
|
||||
}
|
||||
|
||||
fn body(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
|
||||
div()
|
||||
.flex()
|
||||
.flex_row()
|
||||
.size_full()
|
||||
.child(Self::content(self, window, cx))
|
||||
.child(Self::sidebar(self, window, cx))
|
||||
}
|
||||
|
||||
fn button(label: &str) -> impl IntoElement {
|
||||
div()
|
||||
.flex()
|
||||
.justify_center()
|
||||
.items_center()
|
||||
.bg(rgb(0xff00ff))
|
||||
.text_color(rgb(0xffffff))
|
||||
.border_5()
|
||||
.rounded_lg()
|
||||
.child(label.to_string())
|
||||
}
|
||||
|
||||
fn content(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
|
||||
div()
|
||||
.debug_below()
|
||||
.w_3_4()
|
||||
// .flex()
|
||||
// .flex_wrap()
|
||||
.bg(rgb(0x111111))
|
||||
.justify_start()
|
||||
.items_start()
|
||||
.overflow_hidden()
|
||||
.child(
|
||||
div()
|
||||
.size_full()
|
||||
.flex()
|
||||
.flex_wrap()
|
||||
.justify_start()
|
||||
.items_start()
|
||||
.content_start()
|
||||
.gap_y_10()
|
||||
.gap_x_10()
|
||||
.border_t_10()
|
||||
.p_5()
|
||||
.child(Self::card(cx, 1))
|
||||
.child(Self::card(cx, 2))
|
||||
.child(Self::card(cx, 3))
|
||||
.child(Self::card(cx, 4))
|
||||
.child(Self::card(cx, 5))
|
||||
.child(Self::card(cx, 6))
|
||||
.child(Self::card(cx, 7))
|
||||
.child(Self::card(cx, 8))
|
||||
.child(Self::card(cx, 9)),
|
||||
)
|
||||
}
|
||||
|
||||
fn sidebar(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
|
||||
div()
|
||||
.flex()
|
||||
.flex_col()
|
||||
.w_1_4()
|
||||
.min_w_1_6()
|
||||
.bg(rgb(0x222222))
|
||||
.child(div().size_full().bg(gpui::yellow()))
|
||||
}
|
||||
|
||||
fn card(cx: &mut Context<AppState>, number: usize) -> impl IntoElement {
|
||||
div()
|
||||
.id(number)
|
||||
.on_click(cx.listener(Self::load_item(number)))
|
||||
.on_hover(cx.listener(Self::hover_item(number)))
|
||||
.flex()
|
||||
.flex_col()
|
||||
.w_48()
|
||||
.h_64()
|
||||
.p_10()
|
||||
.bg(rgb(0xff00ff))
|
||||
.rounded_lg()
|
||||
}
|
||||
}
|
||||
|
||||
pub fn ui(jellyfin_client: api::JellyfinClient) {
|
||||
Application::new().run(|cx: &mut App| {
|
||||
let bounds = Bounds::centered(None, size(px(500.0), px(500.0)), cx);
|
||||
cx.open_window(
|
||||
WindowOptions {
|
||||
window_bounds: Some(WindowBounds::Windowed(bounds)),
|
||||
..Default::default()
|
||||
},
|
||||
|_, cx| cx.new(|_| AppState::new("Jello Media Browser", jellyfin_client)),
|
||||
)
|
||||
.expect("Failed to open window");
|
||||
})
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct Card {
|
||||
pub id: usize,
|
||||
pub title: SharedString,
|
||||
pub description: SharedString,
|
||||
pub image: SharedString,
|
||||
pub image_blurhash: BlurHash,
|
||||
pub media_type: SharedString,
|
||||
pub loading: bool,
|
||||
}
|
||||
|
||||
impl Render for Card {
|
||||
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
|
||||
div()
|
||||
.id(self.id)
|
||||
.flex()
|
||||
.flex_col()
|
||||
.w_48()
|
||||
.h_64()
|
||||
.p_10()
|
||||
.bg(rgb(0xff00ff))
|
||||
.rounded_lg()
|
||||
.pipe(|card| {
|
||||
if self.loading {
|
||||
card.child(self.image_blurhash.clone())
|
||||
} else {
|
||||
card.child(gpui::img(self.image.clone()))
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct BlurHash {
|
||||
pub id: ImageId,
|
||||
pub data: Arc<RenderImage>,
|
||||
}
|
||||
|
||||
impl BlurHash {
|
||||
pub fn new(
|
||||
data: impl AsRef<str>,
|
||||
width: u32,
|
||||
height: u32,
|
||||
punch: f32,
|
||||
) -> Result<Self, error_stack::Report<crate::Error>> {
|
||||
use error_stack::ResultExt;
|
||||
let decoded =
|
||||
blurhash::decode(data.as_ref(), width, height, punch).change_context(crate::Error)?;
|
||||
let buffer = image::RgbaImage::from_raw(width, height, decoded)
|
||||
.ok_or(crate::Error)
|
||||
.attach("Failed to convert")?;
|
||||
let frame = image::Frame::new(buffer);
|
||||
let render_image = RenderImage::new([frame]);
|
||||
Ok(Self {
|
||||
id: render_image.id,
|
||||
data: Arc::from(render_image),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
impl Render for BlurHash {
|
||||
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
|
||||
gpui::img(ImageSource::Render(self.data.clone()))
|
||||
}
|
||||
}
|
||||
|
||||
impl IntoElement for BlurHash {
|
||||
type Element = gpui::Img;
|
||||
|
||||
fn into_element(self) -> Self::Element {
|
||||
gpui::img(ImageSource::Render(self.data.clone()))
|
||||
}
|
||||
}
|
||||
// use ::tap::*;
|
||||
//
|
||||
// use std::{collections::BTreeMap, sync::Arc};
|
||||
//
|
||||
// use gpui::{
|
||||
// App, Application, Bounds, ClickEvent, Context, ImageId, ImageSource, RenderImage, Resource,
|
||||
// SharedString, Window, WindowBounds, WindowOptions, actions, div, prelude::*, px, rgb, size,
|
||||
// };
|
||||
//
|
||||
// #[derive(Clone, Debug)]
|
||||
// pub struct AppState {
|
||||
// pub title: SharedString,
|
||||
// pub items: BTreeMap<SharedString, Item>,
|
||||
// pub item_ids: BTreeMap<usize, SharedString>,
|
||||
// pub current_item: Option<SharedString>,
|
||||
// pub errors: Vec<String>,
|
||||
// pub jellyfin_client: api::JellyfinClient,
|
||||
// }
|
||||
//
|
||||
// #[derive(Clone, Debug)]
|
||||
// pub struct Item {
|
||||
// pub id: SharedString,
|
||||
// pub name: SharedString,
|
||||
// pub item_type: SharedString,
|
||||
// pub media_type: SharedString,
|
||||
// }
|
||||
//
|
||||
// impl Render for AppState {
|
||||
// fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
|
||||
// div()
|
||||
// .flex()
|
||||
// .flex_col()
|
||||
// .size_full()
|
||||
// .justify_center()
|
||||
// .text_color(rgb(0xffffff))
|
||||
// .child(Self::header())
|
||||
// .child(Self::body(self, window, cx))
|
||||
// .child(Self::footer())
|
||||
// }
|
||||
// }
|
||||
//
|
||||
// actions!(jello_actions, [OpenItem, OnLoadItem, MouseDownEvent]);
|
||||
//
|
||||
// impl AppState {
|
||||
// fn new(title: impl AsRef<str>, jellyfin_client: api::JellyfinClient) -> Self {
|
||||
// AppState {
|
||||
// title: SharedString::new(title.as_ref()),
|
||||
// items: BTreeMap::new(),
|
||||
// item_ids: BTreeMap::new(),
|
||||
// current_item: None,
|
||||
// errors: Vec::new(),
|
||||
// jellyfin_client,
|
||||
// }
|
||||
// }
|
||||
//
|
||||
// // fn on_mouse_down(
|
||||
// // &mut self,
|
||||
// // event: &MouseDownEvent,
|
||||
// // window: &mut Window,
|
||||
// // cx: &mut Context<Self>,
|
||||
// // ) {
|
||||
// // // Handle mouse down event
|
||||
// // }
|
||||
//
|
||||
// fn load_item(id: usize) -> impl Fn(&mut Self, &ClickEvent, &mut Window, &mut Context<Self>) {
|
||||
// move |state: &mut Self, event: &ClickEvent, window: &mut Window, cx: &mut Context<Self>| {
|
||||
// let item_id = id;
|
||||
// cx.spawn(async move |entity, app| {
|
||||
// tracing::info!("Loading item with id: {}", item_id);
|
||||
// });
|
||||
// }
|
||||
// }
|
||||
//
|
||||
// fn hover_item(id: usize) -> impl Fn(&mut Self, &bool, &mut Window, &mut Context<Self>) {
|
||||
// move |state: &mut Self, item: &bool, window: &mut Window, cx: &mut Context<Self>| {
|
||||
// dbg!("Hovering over item: {:?}", id);
|
||||
// }
|
||||
// }
|
||||
//
|
||||
// fn header() -> impl IntoElement {
|
||||
// div()
|
||||
// .flex()
|
||||
// .flex_row()
|
||||
// .w_full()
|
||||
// .justify_end()
|
||||
// .h_20()
|
||||
// .border_10()
|
||||
// .bg(rgb(0x333333))
|
||||
// .child(Self::button("Refresh"))
|
||||
// }
|
||||
//
|
||||
// fn footer() -> impl IntoElement {
|
||||
// div().flex().flex_row().w_full().h_20().bg(rgb(0x333333))
|
||||
// }
|
||||
//
|
||||
// fn body(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
|
||||
// div()
|
||||
// .flex()
|
||||
// .flex_row()
|
||||
// .size_full()
|
||||
// .child(Self::content(self, window, cx))
|
||||
// .child(Self::sidebar(self, window, cx))
|
||||
// }
|
||||
//
|
||||
// fn button(label: &str) -> impl IntoElement {
|
||||
// div()
|
||||
// .flex()
|
||||
// .justify_center()
|
||||
// .items_center()
|
||||
// .bg(rgb(0xff00ff))
|
||||
// .text_color(rgb(0xffffff))
|
||||
// .border_5()
|
||||
// .rounded_lg()
|
||||
// .child(label.to_string())
|
||||
// }
|
||||
//
|
||||
// fn content(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
|
||||
// div()
|
||||
// .debug_below()
|
||||
// .w_3_4()
|
||||
// // .flex()
|
||||
// // .flex_wrap()
|
||||
// .bg(rgb(0x111111))
|
||||
// .justify_start()
|
||||
// .items_start()
|
||||
// .overflow_hidden()
|
||||
// .child(
|
||||
// div()
|
||||
// .size_full()
|
||||
// .flex()
|
||||
// .flex_wrap()
|
||||
// .justify_start()
|
||||
// .items_start()
|
||||
// .content_start()
|
||||
// .gap_y_10()
|
||||
// .gap_x_10()
|
||||
// .border_t_10()
|
||||
// .p_5()
|
||||
// .child(Self::card(cx, 1))
|
||||
// .child(Self::card(cx, 2))
|
||||
// .child(Self::card(cx, 3))
|
||||
// .child(Self::card(cx, 4))
|
||||
// .child(Self::card(cx, 5))
|
||||
// .child(Self::card(cx, 6))
|
||||
// .child(Self::card(cx, 7))
|
||||
// .child(Self::card(cx, 8))
|
||||
// .child(Self::card(cx, 9)),
|
||||
// )
|
||||
// }
|
||||
//
|
||||
// fn sidebar(&mut self, window: &mut Window, cx: &mut Context<AppState>) -> impl IntoElement {
|
||||
// div()
|
||||
// .flex()
|
||||
// .flex_col()
|
||||
// .w_1_4()
|
||||
// .min_w_1_6()
|
||||
// .bg(rgb(0x222222))
|
||||
// .child(div().size_full().bg(gpui::yellow()))
|
||||
// }
|
||||
//
|
||||
// fn card(cx: &mut Context<AppState>, number: usize) -> impl IntoElement {
|
||||
// div()
|
||||
// .id(number)
|
||||
// .on_click(cx.listener(Self::load_item(number)))
|
||||
// .on_hover(cx.listener(Self::hover_item(number)))
|
||||
// .flex()
|
||||
// .flex_col()
|
||||
// .w_48()
|
||||
// .h_64()
|
||||
// .p_10()
|
||||
// .bg(rgb(0xff00ff))
|
||||
// .rounded_lg()
|
||||
// }
|
||||
// }
|
||||
//
|
||||
// pub fn ui(jellyfin_client: api::JellyfinClient) {
|
||||
// Application::new().run(|cx: &mut App| {
|
||||
// let bounds = Bounds::centered(None, size(px(500.0), px(500.0)), cx);
|
||||
// cx.open_window(
|
||||
// WindowOptions {
|
||||
// window_bounds: Some(WindowBounds::Windowed(bounds)),
|
||||
// ..Default::default()
|
||||
// },
|
||||
// |_, cx| cx.new(|_| AppState::new("Jello Media Browser", jellyfin_client)),
|
||||
// )
|
||||
// .expect("Failed to open window");
|
||||
// })
|
||||
// }
|
||||
//
|
||||
// #[derive(Clone, Debug)]
|
||||
// pub struct Card {
|
||||
// pub id: usize,
|
||||
// pub title: SharedString,
|
||||
// pub description: SharedString,
|
||||
// pub image: SharedString,
|
||||
// pub image_blurhash: BlurHash,
|
||||
// pub media_type: SharedString,
|
||||
// pub loading: bool,
|
||||
// }
|
||||
//
|
||||
// impl Render for Card {
|
||||
// fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
|
||||
// div()
|
||||
// .id(self.id)
|
||||
// .flex()
|
||||
// .flex_col()
|
||||
// .w_48()
|
||||
// .h_64()
|
||||
// .p_10()
|
||||
// .bg(rgb(0xff00ff))
|
||||
// .rounded_lg()
|
||||
// .pipe(|card| {
|
||||
// if self.loading {
|
||||
// card.child(self.image_blurhash.clone())
|
||||
// } else {
|
||||
// card.child(gpui::img(self.image.clone()))
|
||||
// }
|
||||
// })
|
||||
// }
|
||||
// }
|
||||
//
|
||||
// #[derive(Clone, Debug)]
|
||||
// pub struct BlurHash {
|
||||
// pub id: ImageId,
|
||||
// pub data: Arc<RenderImage>,
|
||||
// }
|
||||
//
|
||||
// impl BlurHash {
|
||||
// pub fn new(
|
||||
// data: impl AsRef<str>,
|
||||
// width: u32,
|
||||
// height: u32,
|
||||
// punch: f32,
|
||||
// ) -> Result<Self, error_stack::Report<crate::Error>> {
|
||||
// use error_stack::ResultExt;
|
||||
// let decoded =
|
||||
// blurhash::decode(data.as_ref(), width, height, punch).change_context(crate::Error)?;
|
||||
// let buffer = image::RgbaImage::from_raw(width, height, decoded)
|
||||
// .ok_or(crate::Error)
|
||||
// .attach("Failed to convert")?;
|
||||
// let frame = image::Frame::new(buffer);
|
||||
// let render_image = RenderImage::new([frame]);
|
||||
// Ok(Self {
|
||||
// id: render_image.id,
|
||||
// data: Arc::from(render_image),
|
||||
// })
|
||||
// }
|
||||
// }
|
||||
//
|
||||
// impl Render for BlurHash {
|
||||
// fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
|
||||
// gpui::img(ImageSource::Render(self.data.clone()))
|
||||
// }
|
||||
// }
|
||||
//
|
||||
// impl IntoElement for BlurHash {
|
||||
// type Element = gpui::Img;
|
||||
//
|
||||
// fn into_element(self) -> Self::Element {
|
||||
// gpui::img(ImageSource::Render(self.data.clone()))
|
||||
// }
|
||||
// }
|
||||
|
||||
@@ -2,16 +2,31 @@
|
||||
name = "ui-iced"
|
||||
version = "0.1.0"
|
||||
edition = "2024"
|
||||
license = "MIT"
|
||||
|
||||
[dependencies]
|
||||
api = { version = "0.1.0", path = "../api" }
|
||||
blurhash = "0.2.3"
|
||||
bytes = "1.11.0"
|
||||
gpui_util = "0.2.2"
|
||||
iced = { workspace = true }
|
||||
iced_video_player = { workspace = true }
|
||||
reqwest = "0.12.24"
|
||||
iced = { workspace = true, features = [
|
||||
"advanced",
|
||||
"canvas",
|
||||
"image",
|
||||
"sipper",
|
||||
"tokio",
|
||||
"debug",
|
||||
"hot",
|
||||
], default-features = true }
|
||||
|
||||
|
||||
iced-video = { workspace = true }
|
||||
iced_aw = "0.13.0"
|
||||
iced_wgpu = "0.14.0"
|
||||
iced_winit = "0.14.0"
|
||||
reqwest = "0.13"
|
||||
tap = "1.0.1"
|
||||
toml = "0.9.8"
|
||||
tracing = "0.1.41"
|
||||
url = "2.5.7"
|
||||
uuid = "1.18.1"
|
||||
|
||||
@@ -1,15 +1,20 @@
|
||||
mod settings;
|
||||
mod video;
|
||||
|
||||
mod shared_string;
|
||||
use iced_video_player::{Video, VideoPlayer};
|
||||
use iced_video::{Ready, Video, VideoHandle};
|
||||
use shared_string::SharedString;
|
||||
use tap::Pipe as _;
|
||||
|
||||
use std::sync::Arc;
|
||||
|
||||
mod blur_hash;
|
||||
use blur_hash::BlurHash;
|
||||
|
||||
mod preview;
|
||||
use preview::Preview;
|
||||
// use preview::Preview;
|
||||
|
||||
use iced::{Alignment, Element, Length, Shadow, Task, widget::*};
|
||||
use iced::{Alignment, Element, Length, Task, widget::*};
|
||||
use std::collections::{BTreeMap, BTreeSet};
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
@@ -21,6 +26,8 @@ pub struct ItemCache {
|
||||
pub tree: BTreeMap<Option<uuid::Uuid>, BTreeSet<uuid::Uuid>>,
|
||||
}
|
||||
|
||||
const BACKGROUND_COLOR: iced::Color = iced::Color::from_rgba8(30, 30, 30, 0.7);
|
||||
|
||||
impl ItemCache {
|
||||
pub fn insert(&mut self, parent: impl Into<Option<uuid::Uuid>>, item: Item) {
|
||||
let parent = parent.into();
|
||||
@@ -102,37 +109,55 @@ pub enum Screen {
|
||||
User,
|
||||
Video,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Config {
|
||||
pub server_url: Option<String>,
|
||||
pub device_id: Option<String>,
|
||||
pub device_name: Option<String>,
|
||||
pub client_name: Option<String>,
|
||||
pub version: Option<String>,
|
||||
}
|
||||
|
||||
impl Default for Config {
|
||||
fn default() -> Self {
|
||||
Config {
|
||||
server_url: Some("http://localhost:8096".to_string()),
|
||||
device_id: Some("jello-iced".to_string()),
|
||||
device_name: Some("Jello Iced".to_string()),
|
||||
client_name: Some("Jello".to_string()),
|
||||
version: Some("0.1.0".to_string()),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
struct State {
|
||||
loading: Option<Loading>,
|
||||
current: Option<uuid::Uuid>,
|
||||
cache: ItemCache,
|
||||
jellyfin_client: api::JellyfinClient,
|
||||
jellyfin_client: Option<api::JellyfinClient>,
|
||||
messages: Vec<String>,
|
||||
history: Vec<Option<uuid::Uuid>>,
|
||||
query: Option<String>,
|
||||
screen: Screen,
|
||||
// Login form state
|
||||
username_input: String,
|
||||
password_input: String,
|
||||
settings: settings::SettingsState,
|
||||
is_authenticated: bool,
|
||||
// Video
|
||||
video: Option<Arc<Video>>,
|
||||
video: Option<Arc<VideoHandle<Message, Ready>>>,
|
||||
}
|
||||
|
||||
impl State {
|
||||
pub fn new(jellyfin_client: api::JellyfinClient) -> Self {
|
||||
pub fn new() -> Self {
|
||||
State {
|
||||
loading: None,
|
||||
current: None,
|
||||
cache: ItemCache::default(),
|
||||
jellyfin_client,
|
||||
jellyfin_client: None,
|
||||
messages: Vec::new(),
|
||||
history: Vec::new(),
|
||||
query: None,
|
||||
screen: Screen::Home,
|
||||
username_input: String::new(),
|
||||
password_input: String::new(),
|
||||
settings: settings::SettingsState::default(),
|
||||
is_authenticated: false,
|
||||
video: None,
|
||||
}
|
||||
@@ -141,131 +166,48 @@ impl State {
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum Message {
|
||||
OpenSettings,
|
||||
CloseSettings,
|
||||
Settings(settings::SettingsMessage),
|
||||
Refresh,
|
||||
Search,
|
||||
SearchQueryChanged(String),
|
||||
OpenItem(Option<uuid::Uuid>),
|
||||
LoadedItem(Option<uuid::Uuid>, Vec<Item>),
|
||||
Error(String),
|
||||
SetToken(String),
|
||||
Back,
|
||||
Home,
|
||||
// Login-related messages
|
||||
UsernameChanged(String),
|
||||
PasswordChanged(String),
|
||||
Login,
|
||||
LoginSuccess(String),
|
||||
Logout,
|
||||
Video(VideoMessage),
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum VideoMessage {
|
||||
EndOfStream,
|
||||
Open(url::Url),
|
||||
Pause,
|
||||
Play,
|
||||
Seek(f64),
|
||||
Stop,
|
||||
Test,
|
||||
Video(video::VideoMessage),
|
||||
}
|
||||
|
||||
fn update(state: &mut State, message: Message) -> Task<Message> {
|
||||
match message {
|
||||
Message::OpenSettings => {
|
||||
state.screen = Screen::Settings;
|
||||
Task::none()
|
||||
}
|
||||
Message::CloseSettings => {
|
||||
state.screen = Screen::Home;
|
||||
Task::none()
|
||||
}
|
||||
Message::UsernameChanged(username) => {
|
||||
state.username_input = username;
|
||||
Task::none()
|
||||
}
|
||||
Message::PasswordChanged(password) => {
|
||||
state.password_input = password;
|
||||
Task::none()
|
||||
}
|
||||
Message::Login => {
|
||||
let username = state.username_input.clone();
|
||||
let password = state.password_input.clone();
|
||||
|
||||
// Update the client config with the new credentials
|
||||
let mut config = (*state.jellyfin_client.config).clone();
|
||||
config.username = username;
|
||||
config.password = password;
|
||||
|
||||
Task::perform(
|
||||
async move {
|
||||
let mut client = api::JellyfinClient::new(config);
|
||||
client.authenticate().await
|
||||
},
|
||||
|result| match result {
|
||||
Ok(auth_result) => {
|
||||
if let Some(token) = auth_result.access_token {
|
||||
Message::LoginSuccess(token)
|
||||
} else {
|
||||
Message::Error("Authentication failed: No token received".to_string())
|
||||
}
|
||||
}
|
||||
Err(e) => Message::Error(format!("Login failed: {}", e)),
|
||||
},
|
||||
)
|
||||
}
|
||||
Message::LoginSuccess(token) => {
|
||||
state.jellyfin_client.set_token(token.clone());
|
||||
state.is_authenticated = true;
|
||||
state.password_input.clear();
|
||||
state.messages.push("Login successful!".to_string());
|
||||
state.screen = Screen::Home;
|
||||
|
||||
// Save token and refresh items
|
||||
let client = state.jellyfin_client.clone();
|
||||
Task::perform(
|
||||
async move {
|
||||
let _ = client.save_token(".session").await;
|
||||
},
|
||||
|_| Message::Refresh,
|
||||
)
|
||||
}
|
||||
Message::Logout => {
|
||||
state.is_authenticated = false;
|
||||
state.jellyfin_client.set_token("");
|
||||
state.cache = ItemCache::default();
|
||||
state.current = None;
|
||||
state.username_input.clear();
|
||||
state.password_input.clear();
|
||||
state.messages.push("Logged out successfully".to_string());
|
||||
Task::none()
|
||||
}
|
||||
Message::Settings(msg) => settings::update(state, msg),
|
||||
Message::OpenItem(id) => {
|
||||
let client = state.jellyfin_client.clone();
|
||||
use api::jellyfin::BaseItemKind::*;
|
||||
if let Some(cached) = id.as_ref().and_then(|id| state.cache.get(id))
|
||||
&& matches!(cached._type, Video | Movie | Episode)
|
||||
{
|
||||
let url = client
|
||||
.stream_url(id.expect("ID exists"))
|
||||
.expect("Failed to get stream URL");
|
||||
Task::done(Message::Video(VideoMessage::Open(url)))
|
||||
if let Some(client) = state.jellyfin_client.clone() {
|
||||
use api::jellyfin::BaseItemKind::*;
|
||||
if let Some(cached) = id.as_ref().and_then(|id| state.cache.get(id))
|
||||
&& matches!(cached._type, Video | Movie | Episode)
|
||||
{
|
||||
let url = client
|
||||
.stream_url(id.expect("ID exists"))
|
||||
.expect("Failed to get stream URL");
|
||||
Task::done(Message::Video(video::VideoMessage::Open(url)))
|
||||
} else {
|
||||
Task::perform(
|
||||
async move {
|
||||
let items: Result<Vec<Item>, api::JellyfinApiError> = client
|
||||
.items(id)
|
||||
.await
|
||||
.map(|items| items.into_iter().map(Item::from).collect());
|
||||
(id, items)
|
||||
},
|
||||
|(msg, items)| match items {
|
||||
Err(e) => Message::Error(format!("Failed to load item: {}", e)),
|
||||
Ok(items) => Message::LoadedItem(msg, items),
|
||||
},
|
||||
)
|
||||
}
|
||||
} else {
|
||||
Task::perform(
|
||||
async move {
|
||||
let items: Result<Vec<Item>, api::JellyfinApiError> = client
|
||||
.items(id)
|
||||
.await
|
||||
.map(|items| items.into_iter().map(Item::from).collect());
|
||||
(id, items)
|
||||
},
|
||||
|(msg, items)| match items {
|
||||
Err(e) => Message::Error(format!("Failed to load item: {}", e)),
|
||||
Ok(items) => Message::LoadedItem(msg, items),
|
||||
},
|
||||
)
|
||||
Task::none()
|
||||
}
|
||||
}
|
||||
Message::LoadedItem(id, items) => {
|
||||
@@ -275,34 +217,30 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
|
||||
Task::none()
|
||||
}
|
||||
Message::Refresh => {
|
||||
// Handle refresh logic
|
||||
let client = state.jellyfin_client.clone();
|
||||
let current = state.current;
|
||||
Task::perform(
|
||||
async move {
|
||||
let items: Result<Vec<Item>, api::JellyfinApiError> = client
|
||||
.items(current)
|
||||
.await
|
||||
.map(|items| items.into_iter().map(Item::from).collect());
|
||||
(current, items)
|
||||
},
|
||||
|(msg, items)| match items {
|
||||
Err(e) => Message::Error(format!("Failed to refresh items: {}", e)),
|
||||
Ok(items) => Message::LoadedItem(msg, items),
|
||||
},
|
||||
)
|
||||
if let Some(client) = state.jellyfin_client.clone() {
|
||||
let current = state.current;
|
||||
Task::perform(
|
||||
async move {
|
||||
let items: Result<Vec<Item>, api::JellyfinApiError> = client
|
||||
.items(current)
|
||||
.await
|
||||
.map(|items| items.into_iter().map(Item::from).collect());
|
||||
(current, items)
|
||||
},
|
||||
|(msg, items)| match items {
|
||||
Err(e) => Message::Error(format!("Failed to refresh items: {}", e)),
|
||||
Ok(items) => Message::LoadedItem(msg, items),
|
||||
},
|
||||
)
|
||||
} else {
|
||||
Task::none()
|
||||
}
|
||||
}
|
||||
Message::Error(err) => {
|
||||
tracing::error!("Error: {}", err);
|
||||
state.messages.push(err);
|
||||
Task::none()
|
||||
}
|
||||
Message::SetToken(token) => {
|
||||
tracing::info!("Authenticated with token: {}", token);
|
||||
state.jellyfin_client.set_token(token);
|
||||
state.is_authenticated = true;
|
||||
Task::none()
|
||||
}
|
||||
Message::Back => {
|
||||
state.current = state.history.pop().unwrap_or(None);
|
||||
Task::none()
|
||||
@@ -313,79 +251,53 @@ fn update(state: &mut State, message: Message) -> Task<Message> {
|
||||
}
|
||||
Message::SearchQueryChanged(query) => {
|
||||
state.query = Some(query);
|
||||
// Handle search query change
|
||||
Task::none()
|
||||
}
|
||||
Message::Search => {
|
||||
// Handle search action
|
||||
let client = state.jellyfin_client.clone();
|
||||
let query = state.query.clone().unwrap_or_default();
|
||||
Task::perform(async move { client.search(query).await }, |r| match r {
|
||||
Err(e) => Message::Error(format!("Search failed: {}", e)),
|
||||
Ok(items) => {
|
||||
let items = items.into_iter().map(Item::from).collect();
|
||||
Message::LoadedItem(None, items)
|
||||
}
|
||||
})
|
||||
// let client = state.jellyfin_client.clone();
|
||||
if let Some(client) = state.jellyfin_client.clone() {
|
||||
let query = state.query.clone().unwrap_or_default();
|
||||
Task::perform(async move { client.search(query).await }, |r| match r {
|
||||
Err(e) => Message::Error(format!("Search failed: {}", e)),
|
||||
Ok(items) => {
|
||||
let items = items.into_iter().map(Item::from).collect();
|
||||
Message::LoadedItem(None, items)
|
||||
}
|
||||
})
|
||||
} else {
|
||||
Task::none()
|
||||
}
|
||||
}
|
||||
Message::Video(msg) => match msg {
|
||||
VideoMessage::EndOfStream => {
|
||||
state.video = None;
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Open(url) => {
|
||||
state.video = Video::new(&url)
|
||||
.inspect_err(|err| {
|
||||
tracing::error!("Failed to play video at {}: {:?}", url, err);
|
||||
})
|
||||
.ok()
|
||||
.map(Arc::new);
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Pause => {
|
||||
if let Some(video) = state.video.as_mut().and_then(Arc::get_mut) {
|
||||
video.set_paused(true);
|
||||
}
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Play => {
|
||||
if let Some(video) = state.video.as_mut().and_then(Arc::get_mut) {
|
||||
video.set_paused(false);
|
||||
}
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Seek(position) => {
|
||||
// if let Some(ref video) = state.video {
|
||||
// // video.seek(position, true);
|
||||
// }
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Stop => {
|
||||
state.video = None;
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Test => {
|
||||
let url = url::Url::parse(
|
||||
// "file:///home/servius/Projects/jello/crates/iced_video_player/.media/test.mp4",
|
||||
"https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm",
|
||||
)
|
||||
.unwrap();
|
||||
state.video = Video::new(&url)
|
||||
.inspect_err(|err| {
|
||||
dbg!(err);
|
||||
})
|
||||
.ok()
|
||||
.map(Arc::new);
|
||||
Task::none()
|
||||
}
|
||||
},
|
||||
Message::Video(msg) => video::update(state, msg),
|
||||
_ => todo!(),
|
||||
}
|
||||
}
|
||||
|
||||
fn view(state: &State) -> Element<'_, Message> {
|
||||
let content = home(state);
|
||||
match state.screen {
|
||||
Screen::Settings => settings(state),
|
||||
Screen::Home | _ => home(state),
|
||||
Screen::Settings => {
|
||||
let settings = settings::settings(state);
|
||||
let settings = container(settings)
|
||||
.width(Length::FillPortion(4))
|
||||
.height(Length::FillPortion(4))
|
||||
.style(container::rounded_box)
|
||||
.pipe(mouse_area)
|
||||
.on_press(Message::Refresh)
|
||||
.pipe(|c| iced::widget::column![space::vertical(), c, space::vertical()])
|
||||
.pipe(container)
|
||||
.width(Length::Fill)
|
||||
.width(Length::Fill)
|
||||
.align_y(Alignment::Center)
|
||||
.align_x(Alignment::Center)
|
||||
.style(|_| container::background(BACKGROUND_COLOR))
|
||||
.padding(50)
|
||||
.pipe(mouse_area)
|
||||
.on_press(Message::Settings(settings::SettingsMessage::Close));
|
||||
stack![content, settings].into()
|
||||
}
|
||||
Screen::Home | _ => content,
|
||||
}
|
||||
}
|
||||
|
||||
@@ -396,52 +308,38 @@ fn home(state: &State) -> Element<'_, Message> {
|
||||
.into()
|
||||
}
|
||||
|
||||
fn player(video: &Video) -> Element<'_, Message> {
|
||||
container(
|
||||
VideoPlayer::new(video)
|
||||
.width(Length::Fill)
|
||||
.height(Length::Fill)
|
||||
.content_fit(iced::ContentFit::Contain)
|
||||
.on_end_of_stream(Message::Video(VideoMessage::EndOfStream)),
|
||||
)
|
||||
.style(|_| container::background(iced::Color::BLACK))
|
||||
.width(Length::Fill)
|
||||
.height(Length::Fill)
|
||||
.align_x(Alignment::Center)
|
||||
.align_y(Alignment::Center)
|
||||
.into()
|
||||
}
|
||||
|
||||
fn body(state: &State) -> Element<'_, Message> {
|
||||
if let Some(ref video) = state.video {
|
||||
player(video)
|
||||
video::player(video)
|
||||
} else {
|
||||
scrollable(
|
||||
container(
|
||||
Grid::with_children(state.cache.items_of(state.current).into_iter().map(card))
|
||||
.fluid(400)
|
||||
.spacing(50),
|
||||
)
|
||||
Grid::with_children(state.cache.items_of(state.current).into_iter().map(card))
|
||||
.fluid(400)
|
||||
.spacing(50)
|
||||
.pipe(container)
|
||||
.padding(50)
|
||||
.align_x(Alignment::Center)
|
||||
// .align_y(Alignment::Center)
|
||||
.height(Length::Fill)
|
||||
.width(Length::Fill),
|
||||
)
|
||||
.height(Length::Fill)
|
||||
.into()
|
||||
.width(Length::Fill)
|
||||
.pipe(scrollable)
|
||||
.height(Length::Fill)
|
||||
.into()
|
||||
}
|
||||
}
|
||||
|
||||
fn header(state: &State) -> Element<'_, Message> {
|
||||
row([
|
||||
container(
|
||||
Button::new(
|
||||
Text::new(state.jellyfin_client.config.server_url.as_str())
|
||||
.align_x(Alignment::Start),
|
||||
)
|
||||
.on_press(Message::Home),
|
||||
text(
|
||||
state
|
||||
.jellyfin_client
|
||||
.as_ref()
|
||||
.map(|c| c.config.server_url.as_str())
|
||||
.unwrap_or("No Server"),
|
||||
)
|
||||
.align_x(Alignment::Start)
|
||||
.pipe(button)
|
||||
.on_press(Message::Home)
|
||||
.pipe(container)
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.height(Length::Fill)
|
||||
@@ -450,16 +348,17 @@ fn header(state: &State) -> Element<'_, Message> {
|
||||
.style(container::rounded_box)
|
||||
.into(),
|
||||
search(state),
|
||||
container(
|
||||
row([
|
||||
button("Refresh").on_press(Message::Refresh).into(),
|
||||
button("Settings").on_press(Message::OpenSettings).into(),
|
||||
button("TestVideo")
|
||||
.on_press(Message::Video(VideoMessage::Test))
|
||||
.into(),
|
||||
])
|
||||
.spacing(10),
|
||||
)
|
||||
row([
|
||||
button("Refresh").on_press(Message::Refresh).into(),
|
||||
button("Settings")
|
||||
.on_press(Message::Settings(settings::SettingsMessage::Open))
|
||||
.into(),
|
||||
button("TestVideo")
|
||||
.on_press(Message::Video(video::VideoMessage::Test))
|
||||
.into(),
|
||||
])
|
||||
.spacing(10)
|
||||
.pipe(container)
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.height(Length::Fill)
|
||||
@@ -475,19 +374,18 @@ fn header(state: &State) -> Element<'_, Message> {
|
||||
}
|
||||
|
||||
fn search(state: &State) -> Element<'_, Message> {
|
||||
container(
|
||||
TextInput::new("Search...", state.query.as_deref().unwrap_or_default())
|
||||
.padding(10)
|
||||
.size(16)
|
||||
.width(Length::Fill)
|
||||
.on_input(Message::SearchQueryChanged)
|
||||
.on_submit(Message::Search),
|
||||
)
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.height(Length::Shrink)
|
||||
.style(container::rounded_box)
|
||||
.into()
|
||||
TextInput::new("Search...", state.query.as_deref().unwrap_or_default())
|
||||
.padding(10)
|
||||
.size(16)
|
||||
.width(Length::Fill)
|
||||
.on_input(Message::SearchQueryChanged)
|
||||
.on_submit(Message::Search)
|
||||
.pipe(container)
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.height(Length::Shrink)
|
||||
.style(container::rounded_box)
|
||||
.into()
|
||||
}
|
||||
|
||||
fn footer(state: &State) -> Element<'_, Message> {
|
||||
@@ -508,123 +406,6 @@ fn footer(state: &State) -> Element<'_, Message> {
|
||||
.into()
|
||||
}
|
||||
|
||||
fn settings(state: &State) -> Element<'_, Message> {
|
||||
let content = if state.is_authenticated {
|
||||
// Authenticated view - show user info and logout
|
||||
column([
|
||||
Text::new("Settings").size(32).into(),
|
||||
container(
|
||||
column([
|
||||
Text::new("Account").size(24).into(),
|
||||
Text::new("Server URL").size(14).into(),
|
||||
Text::new(state.jellyfin_client.config.server_url.as_str())
|
||||
.size(12)
|
||||
.into(),
|
||||
container(Text::new("Status: Logged In").size(14))
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.into(),
|
||||
container(
|
||||
row([
|
||||
Button::new(Text::new("Logout"))
|
||||
.padding(10)
|
||||
.on_press(Message::Logout)
|
||||
.into(),
|
||||
Button::new(Text::new("Close"))
|
||||
.padding(10)
|
||||
.on_press(Message::CloseSettings)
|
||||
.into(),
|
||||
])
|
||||
.spacing(10),
|
||||
)
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.into(),
|
||||
])
|
||||
.spacing(10)
|
||||
.max_width(400)
|
||||
.align_x(Alignment::Center),
|
||||
)
|
||||
.padding(20)
|
||||
.width(Length::Fill)
|
||||
.align_x(Alignment::Center)
|
||||
.style(container::rounded_box)
|
||||
.into(),
|
||||
])
|
||||
.spacing(20)
|
||||
.padding(50)
|
||||
.align_x(Alignment::Center)
|
||||
} else {
|
||||
// Not authenticated view - show login form
|
||||
column([
|
||||
Text::new("Settings").size(32).into(),
|
||||
container(
|
||||
column([
|
||||
Text::new("Login to Jellyfin").size(24).into(),
|
||||
Text::new("Server URL").size(14).into(),
|
||||
Text::new(state.jellyfin_client.config.server_url.as_str())
|
||||
.size(12)
|
||||
.into(),
|
||||
container(
|
||||
TextInput::new("Username", &state.username_input)
|
||||
.padding(10)
|
||||
.size(16)
|
||||
.on_input(Message::UsernameChanged),
|
||||
)
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.into(),
|
||||
container(
|
||||
TextInput::new("Password", &state.password_input)
|
||||
.padding(10)
|
||||
.size(16)
|
||||
.secure(true)
|
||||
.on_input(Message::PasswordChanged)
|
||||
.on_submit(Message::Login),
|
||||
)
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.into(),
|
||||
container(
|
||||
row([
|
||||
Button::new(Text::new("Login"))
|
||||
.padding(10)
|
||||
.on_press(Message::Login)
|
||||
.into(),
|
||||
Button::new(Text::new("Cancel"))
|
||||
.padding(10)
|
||||
.on_press(Message::CloseSettings)
|
||||
.into(),
|
||||
])
|
||||
.spacing(10),
|
||||
)
|
||||
.padding(10)
|
||||
.width(Length::Fill)
|
||||
.into(),
|
||||
])
|
||||
.spacing(10)
|
||||
.max_width(400)
|
||||
.align_x(Alignment::Center),
|
||||
)
|
||||
.padding(20)
|
||||
.width(Length::Fill)
|
||||
.align_x(Alignment::Center)
|
||||
.style(container::rounded_box)
|
||||
.into(),
|
||||
])
|
||||
.spacing(20)
|
||||
.padding(50)
|
||||
.align_x(Alignment::Center)
|
||||
};
|
||||
|
||||
container(content)
|
||||
.width(Length::Fill)
|
||||
.height(Length::Fill)
|
||||
.align_x(Alignment::Center)
|
||||
.align_y(Alignment::Center)
|
||||
.into()
|
||||
}
|
||||
|
||||
fn card(item: &Item) -> Element<'_, Message> {
|
||||
let name = item
|
||||
.name
|
||||
@@ -660,25 +441,54 @@ fn card(item: &Item) -> Element<'_, Message> {
|
||||
.into()
|
||||
}
|
||||
|
||||
// fn video(url: &str
|
||||
fn init() -> (State, Task<Message>) {
|
||||
// Create a default config for initial state
|
||||
|
||||
fn init(config: impl Fn() -> api::JellyfinConfig + 'static) -> impl Fn() -> (State, Task<Message>) {
|
||||
move || {
|
||||
let mut jellyfin = api::JellyfinClient::new(config());
|
||||
(
|
||||
State::new(jellyfin.clone()),
|
||||
Task::perform(
|
||||
async move { jellyfin.authenticate_with_cached_token(".session").await },
|
||||
|token| match token {
|
||||
Ok(token) => Message::SetToken(token),
|
||||
Err(e) => Message::Error(format!("Authentication failed: {}", e)),
|
||||
},
|
||||
)
|
||||
.chain(Task::done(Message::Refresh)),
|
||||
// let default_config = api::JellyfinConfig {
|
||||
// server_url: "http://localhost:8096".parse().expect("Valid URL"),
|
||||
// device_id: "jello-iced".to_string(),
|
||||
// device_name: "Jello Iced".to_string(),
|
||||
// client_name: "Jello".to_string(),
|
||||
// version: "0.1.0".to_string(),
|
||||
// };
|
||||
// let default_client = api::JellyfinClient::new_with_config(default_config);
|
||||
|
||||
(
|
||||
State::new(),
|
||||
Task::perform(
|
||||
async move {
|
||||
let config_str = std::fs::read_to_string("config.toml")
|
||||
.map_err(|e| api::JellyfinApiError::IoError(e))?;
|
||||
let config: api::JellyfinConfig = toml::from_str(&config_str).map_err(|e| {
|
||||
api::JellyfinApiError::IoError(std::io::Error::new(
|
||||
std::io::ErrorKind::InvalidData,
|
||||
e,
|
||||
))
|
||||
})?;
|
||||
|
||||
// Try to load cached token and authenticate
|
||||
match std::fs::read_to_string(".session") {
|
||||
Ok(token) => {
|
||||
let client = api::JellyfinClient::pre_authenticated(token.trim(), config)?;
|
||||
Ok((client, true))
|
||||
}
|
||||
Err(_) => {
|
||||
// No cached token, create unauthenticated client
|
||||
let client = api::JellyfinClient::new_with_config(config);
|
||||
Ok((client, false))
|
||||
}
|
||||
}
|
||||
},
|
||||
|result: Result<_, api::JellyfinApiError>| match result {
|
||||
// Ok((client, is_authenticated)) => Message::LoadedClient(client, is_authenticated),
|
||||
Err(e) => Message::Error(format!("Initialization failed: {}", e)),
|
||||
_ => Message::Error("Login Unimplemented".to_string()),
|
||||
},
|
||||
)
|
||||
}
|
||||
.chain(Task::done(Message::Refresh)),
|
||||
)
|
||||
}
|
||||
|
||||
pub fn ui(config: impl Fn() -> api::JellyfinConfig + 'static) -> iced::Result {
|
||||
iced::application(init(config), update, view).run()
|
||||
pub fn ui() -> iced::Result {
|
||||
iced::application(init, update, view).run()
|
||||
}
|
||||
|
||||
296
ui-iced/src/settings.rs
Normal file
@@ -0,0 +1,296 @@
|
||||
use crate::*;
|
||||
use iced::Element;
|
||||
|
||||
pub fn settings(state: &State) -> Element<'_, Message> {
|
||||
screens::settings(state)
|
||||
}
|
||||
|
||||
pub fn update(state: &mut State, message: SettingsMessage) -> Task<Message> {
|
||||
match message {
|
||||
SettingsMessage::Open => {
|
||||
tracing::trace!("Opening settings");
|
||||
state.screen = Screen::Settings;
|
||||
}
|
||||
SettingsMessage::Close => {
|
||||
tracing::trace!("Closing settings");
|
||||
state.screen = Screen::Home;
|
||||
}
|
||||
SettingsMessage::Select(screen) => {
|
||||
tracing::trace!("Switching settings screen to {:?}", screen);
|
||||
state.settings.screen = screen;
|
||||
}
|
||||
SettingsMessage::User(user) => state.settings.login_form.update(user),
|
||||
|
||||
SettingsMessage::Server(server) => state.settings.server_form.update(server),
|
||||
}
|
||||
Task::none()
|
||||
}
|
||||
|
||||
pub fn empty() -> Element<'static, Message> {
|
||||
column([]).into()
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Default)]
|
||||
pub struct SettingsState {
|
||||
login_form: LoginForm,
|
||||
server_form: ServerForm,
|
||||
screen: SettingsScreen,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum SettingsMessage {
|
||||
Open,
|
||||
Close,
|
||||
Select(SettingsScreen),
|
||||
User(UserMessage),
|
||||
Server(ServerMessage),
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum UserMessage {
|
||||
Add,
|
||||
UsernameChanged(String),
|
||||
PasswordChanged(String),
|
||||
// Edit(uuid::Uuid),
|
||||
// Delete(uuid::Uuid),
|
||||
Clear,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum ServerMessage {
|
||||
Add,
|
||||
NameChanged(String),
|
||||
UrlChanged(String),
|
||||
// Edit(uuid::Uuid),
|
||||
// Delete(uuid::Uuid),
|
||||
Clear,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Default, PartialEq, Eq)]
|
||||
pub enum SettingsScreen {
|
||||
#[default]
|
||||
Main,
|
||||
Users,
|
||||
Servers,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct ServerItem {
|
||||
pub id: uuid::Uuid,
|
||||
pub name: SharedString,
|
||||
pub url: SharedString,
|
||||
pub users: Vec<uuid::Uuid>,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct UserItem {
|
||||
pub id: uuid::Uuid,
|
||||
pub name: SharedString,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Default)]
|
||||
pub struct LoginForm {
|
||||
username: String,
|
||||
password: String,
|
||||
}
|
||||
|
||||
impl LoginForm {
|
||||
pub fn update(&mut self, message: UserMessage) {
|
||||
match message {
|
||||
UserMessage::UsernameChanged(data) => {
|
||||
self.username = data;
|
||||
}
|
||||
UserMessage::PasswordChanged(data) => {
|
||||
self.password = data;
|
||||
}
|
||||
UserMessage::Add => {
|
||||
// Handle adding user
|
||||
}
|
||||
UserMessage::Clear => {
|
||||
self.username.clear();
|
||||
self.password.clear();
|
||||
}
|
||||
}
|
||||
}
|
||||
pub fn view(&self) -> Element<'_, Message> {
|
||||
iced::widget::column![
|
||||
text("Login Form"),
|
||||
text_input("Enter Username", &self.username).on_input(|data| {
|
||||
Message::Settings(SettingsMessage::User(UserMessage::UsernameChanged(data)))
|
||||
}),
|
||||
text_input("Enter Password", &self.password)
|
||||
.secure(true)
|
||||
.on_input(|data| {
|
||||
Message::Settings(SettingsMessage::User(UserMessage::PasswordChanged(data)))
|
||||
}),
|
||||
row![
|
||||
button(text("Add User")).on_press_maybe(self.validate()),
|
||||
button(text("Cancel"))
|
||||
.on_press(Message::Settings(SettingsMessage::User(UserMessage::Clear))),
|
||||
]
|
||||
.spacing(10),
|
||||
]
|
||||
.spacing(10)
|
||||
.padding([10, 0])
|
||||
.into()
|
||||
}
|
||||
|
||||
pub fn validate(&self) -> Option<Message> {
|
||||
(!self.username.is_empty() && !self.password.is_empty())
|
||||
.then(|| Message::Settings(SettingsMessage::User(UserMessage::Add)))
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Default)]
|
||||
pub struct ServerForm {
|
||||
name: String,
|
||||
url: String,
|
||||
}
|
||||
|
||||
impl ServerForm {
|
||||
pub fn update(&mut self, message: ServerMessage) {
|
||||
match message {
|
||||
ServerMessage::NameChanged(data) => {
|
||||
self.name = data;
|
||||
}
|
||||
ServerMessage::UrlChanged(data) => {
|
||||
self.url = data;
|
||||
}
|
||||
ServerMessage::Add => {
|
||||
// Handle adding server
|
||||
}
|
||||
ServerMessage::Clear => {
|
||||
self.name.clear();
|
||||
self.url.clear();
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
pub fn view(&self) -> Element<'_, Message> {
|
||||
iced::widget::column![
|
||||
text("Add New Server"),
|
||||
text_input("Enter server name", &self.name).on_input(|data| {
|
||||
Message::Settings(SettingsMessage::Server(ServerMessage::NameChanged(data)))
|
||||
}),
|
||||
text_input("Enter server URL", &self.url).on_input(|data| {
|
||||
Message::Settings(SettingsMessage::Server(ServerMessage::UrlChanged(data)))
|
||||
}),
|
||||
row![
|
||||
button(text("Add Server")).on_press_maybe(self.validate()),
|
||||
button(text("Cancel")).on_press(Message::Settings(SettingsMessage::Server(
|
||||
ServerMessage::Clear
|
||||
))),
|
||||
]
|
||||
.spacing(10),
|
||||
]
|
||||
.spacing(10)
|
||||
.padding([10, 0])
|
||||
.into()
|
||||
}
|
||||
|
||||
pub fn validate(&self) -> Option<Message> {
|
||||
(!self.name.is_empty() && !self.url.is_empty())
|
||||
.then(|| Message::Settings(SettingsMessage::Server(ServerMessage::Add)))
|
||||
}
|
||||
}
|
||||
|
||||
mod screens {
|
||||
use iced_aw::Tabs;
|
||||
|
||||
use super::*;
|
||||
pub fn settings(state: &State) -> Element<'_, Message> {
|
||||
Tabs::new(|f| Message::Settings(SettingsMessage::Select(f)))
|
||||
.push(
|
||||
SettingsScreen::Main,
|
||||
iced_aw::TabLabel::Text("General".into()),
|
||||
main(state),
|
||||
)
|
||||
.push(
|
||||
SettingsScreen::Servers,
|
||||
iced_aw::TabLabel::Text("Servers".into()),
|
||||
server(state),
|
||||
)
|
||||
.push(
|
||||
SettingsScreen::Users,
|
||||
iced_aw::TabLabel::Text("Users".into()),
|
||||
user(state),
|
||||
)
|
||||
.set_active_tab(&state.settings.screen)
|
||||
.into()
|
||||
}
|
||||
|
||||
pub fn settings_screen(state: &State) -> Element<'_, Message> {
|
||||
container(match state.settings.screen {
|
||||
SettingsScreen::Main => main(state),
|
||||
SettingsScreen::Servers => server(state),
|
||||
SettingsScreen::Users => user(state),
|
||||
})
|
||||
.width(Length::FillPortion(10))
|
||||
.height(Length::Fill)
|
||||
.style(|theme| container::background(theme.extended_palette().background.base.color))
|
||||
.pipe(container)
|
||||
.padding(10)
|
||||
.style(|theme| container::background(theme.extended_palette().secondary.base.color))
|
||||
.width(Length::FillPortion(10))
|
||||
.into()
|
||||
}
|
||||
|
||||
pub fn settings_list(state: &State) -> Element<'_, Message> {
|
||||
column(
|
||||
[
|
||||
button(center_text("General")).on_press(Message::Settings(
|
||||
SettingsMessage::Select(SettingsScreen::Main),
|
||||
)),
|
||||
button(center_text("Servers")).on_press(Message::Settings(
|
||||
SettingsMessage::Select(SettingsScreen::Servers),
|
||||
)),
|
||||
button(center_text("Users")).on_press(Message::Settings(SettingsMessage::Select(
|
||||
SettingsScreen::Users,
|
||||
))),
|
||||
]
|
||||
.map(|p| p.clip(true).width(Length::Fill).into()),
|
||||
)
|
||||
.width(Length::FillPortion(2))
|
||||
.spacing(10)
|
||||
.padding(10)
|
||||
.pipe(scrollable)
|
||||
.into()
|
||||
}
|
||||
|
||||
pub fn main(state: &State) -> Element<'_, Message> {
|
||||
Column::new()
|
||||
.push(text("Main Settings"))
|
||||
.push(toggler(true).label("HDR"))
|
||||
.spacing(20)
|
||||
.padding(20)
|
||||
.pipe(container)
|
||||
.into()
|
||||
}
|
||||
|
||||
pub fn server(state: &State) -> Element<'_, Message> {
|
||||
Column::new()
|
||||
.push(text("Server Settings"))
|
||||
.push(state.settings.server_form.view())
|
||||
.spacing(20)
|
||||
.padding(20)
|
||||
.pipe(container)
|
||||
.into()
|
||||
}
|
||||
|
||||
pub fn user(state: &State) -> Element<'_, Message> {
|
||||
Column::new()
|
||||
.push(text("User Settings"))
|
||||
.push(state.settings.login_form.view())
|
||||
.spacing(20)
|
||||
.padding(20)
|
||||
.pipe(container)
|
||||
.into()
|
||||
}
|
||||
}
|
||||
|
||||
pub fn center_text(content: &str) -> Element<'_, Message> {
|
||||
text(content)
|
||||
.align_x(Alignment::Center)
|
||||
.width(Length::Fill)
|
||||
.into()
|
||||
}
|
||||
@@ -49,6 +49,21 @@ impl std::ops::Deref for SharedString {
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, PartialEq, Eq, Hash)]
|
||||
pub struct SecretSharedString(ArcCow<'static, str>);
|
||||
|
||||
impl core::fmt::Debug for SecretSharedString {
|
||||
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
|
||||
f.write_str("(..secret..)")
|
||||
}
|
||||
}
|
||||
|
||||
impl From<String> for SecretSharedString {
|
||||
fn from(s: String) -> Self {
|
||||
Self(ArcCow::Owned(Arc::from(s)))
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, PartialEq, Eq, Hash)]
|
||||
pub enum ArcCow<'a, T: ?Sized> {
|
||||
Borrowed(&'a T),
|
||||
@@ -66,3 +81,9 @@ where
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, T> From<&'a T> for ArcCow<'a, T> {
|
||||
fn from(value: &'a T) -> Self {
|
||||
ArcCow::Borrowed(value)
|
||||
}
|
||||
}
|
||||
|
||||
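A quick illustration of the masked `Debug` impl added in this hunk (it uses only the `From<String>` conversion shown above):

```rust
#[test]
fn secret_shared_string_debug_is_masked() {
    let token = SecretSharedString::from("super-secret-token".to_string());
    // The Debug impl writes a fixed placeholder instead of the wrapped value.
    assert_eq!(format!("{:?}", token), "(..secret..)");
}
```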
78
ui-iced/src/video.rs
Normal file
@@ -0,0 +1,78 @@
|
||||
use super::*;
|
||||
#[derive(Debug, Clone)]
|
||||
pub enum VideoMessage {
|
||||
EndOfStream,
|
||||
Open(url::Url),
|
||||
Loaded(VideoHandle<Message, Ready>),
|
||||
Pause,
|
||||
Play,
|
||||
Seek(f64),
|
||||
Stop,
|
||||
Test,
|
||||
}
|
||||
|
||||
pub fn update(state: &mut State, message: VideoMessage) -> Task<Message> {
|
||||
match message {
|
||||
VideoMessage::EndOfStream => {
|
||||
state.video = None;
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Open(url) => {
|
||||
Task::perform(VideoHandle::load(url.clone()), move |result| match result {
|
||||
Ok(video) => Message::Video(VideoMessage::Loaded(video)),
|
||||
Err(err) => Message::Error(format!("Error opening video at {}: {:?}", url, err)),
|
||||
})
|
||||
}
|
||||
VideoMessage::Loaded(video) => {
|
||||
state.video = Some(Arc::new(
|
||||
video.on_end_of_stream(Message::Video(VideoMessage::EndOfStream)),
|
||||
));
|
||||
Task::done(VideoMessage::Play).map(Message::Video)
|
||||
}
|
||||
VideoMessage::Pause => {
|
||||
if let Some(ref video) = state.video {
|
||||
video.pause();
|
||||
}
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Play => {
|
||||
if let Some(ref video) = state.video {
|
||||
video.play();
|
||||
}
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Seek(position) => {
|
||||
// if let Some(ref video) = state.video {
|
||||
// // video.seek(position, true);
|
||||
// }
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Stop => {
|
||||
state.video.as_ref().map(|video| {
|
||||
video.stop();
|
||||
});
|
||||
state.video = None;
|
||||
Task::none()
|
||||
}
|
||||
VideoMessage::Test => {
|
||||
let url = url::Url::parse("https://jellyfin.tsuba.darksailor.dev/Items/6010382cf25273e624d305907010d773/Download?api_key=036c140222464878862231ef66a2bc9c")
|
||||
.expect("Impossible: Failed to parse hardcoded URL");
|
||||
Task::done(VideoMessage::Open(url)).map(Message::Video)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn player(video: &VideoHandle<Message, Ready>) -> Element<'_, Message> {
|
||||
container(
|
||||
Video::new(video)
|
||||
.width(Length::Fill)
|
||||
.height(Length::Fill)
|
||||
.content_fit(iced::ContentFit::Contain),
|
||||
)
|
||||
.style(|_| container::background(iced::Color::BLACK))
|
||||
.width(Length::Fill)
|
||||
.height(Length::Fill)
|
||||
.align_x(Alignment::Center)
|
||||
.align_y(Alignment::Center)
|
||||
.into()
|
||||
}
|
||||
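For context, a minimal sketch of how the rest of the UI can drive this flow, mirroring the `Message::Video(video::VideoMessage::Open(..))` call in `lib.rs` above; the URL is a placeholder, not a real endpoint:

```rust
// Open resolves the stream asynchronously; Loaded stores the handle and
// immediately issues Play (see update() above).
let url = url::Url::parse("https://example.invalid/stream.webm")
    .expect("valid placeholder URL");
let _task: Task<Message> = Task::done(Message::Video(VideoMessage::Open(url)));
```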