pub struct Settings {
pub default_download_path: Option<String>,
pub default_context_size: Option<u64>,
pub proxy_port: Option<u16>,
pub llama_base_port: Option<u16>,
pub max_download_queue_size: Option<u32>,
pub show_memory_fit_indicators: Option<bool>,
pub max_tool_iterations: Option<u32>,
pub max_stagnation_steps: Option<u32>,
pub default_model_id: Option<i64>,
pub inference_defaults: Option<InferenceConfig>,
}
Application settings structure.
All fields are optional to support partial updates and graceful defaults.
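A minimal construction sketch: start from the built-in defaults and override a few fields. The port and context-size values here are illustrative only, not the crate's actual defaults.

// Override selected fields; everything else keeps the values from with_defaults().
let settings = Settings {
    proxy_port: Some(8080),
    default_context_size: Some(8192),
    ..Settings::with_defaults()
};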
Fields
default_download_path: Option<String>
Default directory for downloading models.

default_context_size: Option<u64>
Default context size for models (e.g., 4096, 8192).

proxy_port: Option<u16>
Port for the OpenAI-compatible proxy server.

llama_base_port: Option<u16>
Base port for llama-server instance allocation (the first port in the range).
Note: the OpenAI-compatible proxy listens on proxy_port.

max_download_queue_size: Option<u32>
Maximum number of downloads that can be queued (1-50).

show_memory_fit_indicators: Option<bool>
Whether to show memory fit indicators in the HuggingFace browser.

max_tool_iterations: Option<u32>
Maximum number of iterations for the tool-calling agentic loop.

max_stagnation_steps: Option<u32>
Maximum number of stagnation steps before the agent loop stops.

default_model_id: Option<i64>
Default model ID for commands that support a default model.

inference_defaults: Option<InferenceConfig>
Global inference parameter defaults. Applied when neither the request nor per-model defaults specify a value; if unset, hardcoded defaults are used as the final fallback (see the sketch after this list).
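A sketch of that fallback chain. It assumes request-level and per-model Option<InferenceConfig> values are available at the call site and that InferenceConfig implements Clone; hardcoded_defaults is a hypothetical constructor standing in for the crate's final fallback, not an API documented on this page.

fn resolve_inference_config(
    request: Option<InferenceConfig>,
    model_default: Option<InferenceConfig>,
    settings: &Settings,
) -> InferenceConfig {
    // Request-level settings win, then per-model defaults, then the
    // global inference_defaults, then hardcoded defaults.
    request
        .or(model_default)
        .or_else(|| settings.inference_defaults.clone())
        .unwrap_or_else(InferenceConfig::hardcoded_defaults)
}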
Implementations
impl Settings
pub const fn with_defaults() -> Self
Create settings with sensible defaults.
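Because this constructor is const, the defaults can be evaluated at compile time. A small sketch (the constant name is arbitrary):

// with_defaults() is const, so it can initialize a const item.
const DEFAULTS: Settings = Settings::with_defaults();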
pub const fn effective_proxy_port(&self) -> u16
Get the effective proxy port (with default fallback).
pub const fn effective_llama_base_port(&self) -> u16
Get the effective llama-server base port (with default fallback).
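A sketch of the fallback behavior shared by both accessors. The built-in default values are not listed on this page, so the example only asserts the explicitly configured case.

let mut settings = Settings::with_defaults();
settings.proxy_port = Some(9999);
assert_eq!(settings.effective_proxy_port(), 9999); // configured value is used
settings.proxy_port = None;
let _port = settings.effective_proxy_port(); // falls back to the built-in default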
pub fn merge(&mut self, other: &SettingsUpdate)
Merge a SettingsUpdate into these settings, updating only the fields that are Some.
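A partial-update sketch, assuming SettingsUpdate mirrors the optional fields of Settings and derives Default (neither is confirmed by this page):

let mut settings = Settings::with_defaults();
let update = SettingsUpdate {
    proxy_port: Some(9090),
    ..Default::default()
};
settings.merge(&update);
// Only proxy_port changes; fields that are None in the update are left untouched.
assert_eq!(settings.effective_proxy_port(), 9090);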