pub struct NewModel {
pub name: String,
pub file_path: PathBuf,
pub param_count_b: f64,
pub architecture: Option<String>,
pub quantization: Option<String>,
pub context_length: Option<u64>,
pub metadata: HashMap<String, String>,
pub added_at: DateTime<Utc>,
pub hf_repo_id: Option<String>,
pub hf_commit_sha: Option<String>,
pub hf_filename: Option<String>,
pub download_date: Option<DateTime<Utc>>,
pub last_update_check: Option<DateTime<Utc>>,
pub tags: Vec<String>,
pub file_paths: Option<Vec<PathBuf>>,
pub capabilities: ModelCapabilities,
pub inference_defaults: Option<InferenceConfig>,
}
A model to be inserted into the system (no ID yet).
This represents a model that hasn’t been persisted to the database.
After insertion, the repository returns a Model with the assigned ID.
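As an illustration, here is a minimal sketch of constructing a NewModel for a locally scanned single-file GGUF model. To keep the sketch self-contained, ModelCapabilities and InferenceConfig are declared as unit-struct stand-ins and the chrono DateTime&lt;Utc&gt; timestamps are replaced by plain strings; the real crate's types differ.

```rust
use std::collections::HashMap;
use std::path::PathBuf;

// Stand-ins so this example compiles on its own; the real crate provides
// ModelCapabilities, InferenceConfig, and chrono::DateTime<Utc> timestamps.
#[derive(Debug, Default)]
struct ModelCapabilities;
#[derive(Debug)]
struct InferenceConfig;

#[derive(Debug)]
struct NewModel {
    name: String,
    file_path: PathBuf,
    param_count_b: f64,
    architecture: Option<String>,
    quantization: Option<String>,
    context_length: Option<u64>,
    metadata: HashMap<String, String>,
    added_at: String, // DateTime<Utc> in the real struct
    hf_repo_id: Option<String>,
    hf_commit_sha: Option<String>,
    hf_filename: Option<String>,
    download_date: Option<String>,
    last_update_check: Option<String>,
    tags: Vec<String>,
    file_paths: Option<Vec<PathBuf>>,
    capabilities: ModelCapabilities,
    inference_defaults: Option<InferenceConfig>,
}

// A locally scanned single-file model: all HuggingFace fields stay None.
fn example_model() -> NewModel {
    NewModel {
        name: "Llama 2 7B Q4_0".to_string(),
        file_path: PathBuf::from("/models/llama-2-7b.Q4_0.gguf"),
        param_count_b: 7.0,
        architecture: Some("llama".to_string()),
        quantization: Some("Q4_0".to_string()),
        context_length: Some(4096),
        metadata: HashMap::new(),
        added_at: "2024-01-01T00:00:00Z".to_string(),
        hf_repo_id: None,
        hf_commit_sha: None,
        hf_filename: None,
        download_date: None,
        last_update_check: None,
        tags: vec!["local".to_string()],
        file_paths: None, // single-file model, no shards
        capabilities: ModelCapabilities::default(),
        inference_defaults: None,
    }
}

fn main() {
    let model = example_model();
    println!("{:?}", model);
}
```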
Fields
name: String
Human-readable name for the model.

file_path: PathBuf
Absolute path to the GGUF file on the filesystem.

param_count_b: f64
Number of parameters in the model, in billions.

architecture: Option<String>
Model architecture (e.g., "llama", "mistral", "falcon").

quantization: Option<String>
Quantization type (e.g., "Q4_0", "Q8_0", "F16", "F32").

context_length: Option<u64>
Maximum context length the model supports.

metadata: HashMap<String, String>
Additional metadata key-value pairs from the GGUF file.

added_at: DateTime<Utc>
UTC timestamp of when the model was added to the database.

hf_repo_id: Option<String>
HuggingFace repository ID (e.g., "TheBloke/Llama-2-7B-GGUF").

hf_commit_sha: Option<String>
Git commit SHA from HuggingFace Hub.

hf_filename: Option<String>
Original filename on HuggingFace Hub.

download_date: Option<DateTime<Utc>>
Timestamp of when this model was downloaded from HuggingFace.

last_update_check: Option<DateTime<Utc>>
Last time updates for this model were checked on HuggingFace.

tags: Vec<String>
User-defined tags for organizing models.

file_paths: Option<Vec<PathBuf>>
Ordered list of all file paths for sharded models (None for single-file models).

capabilities: ModelCapabilities
Model capabilities inferred from chat template analysis.

inference_defaults: Option<InferenceConfig>
Per-model inference parameter defaults. These are preferred over global settings when making inference requests; if not set, inference falls back to global settings or hardcoded defaults.
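The fallback order for inference_defaults can be sketched as follows. The resolve_config helper and the InferenceConfig fields shown here are hypothetical, chosen only to illustrate the per-model / global / hardcoded precedence described above; they are not part of the crate's documented API.

```rust
// Hypothetical minimal config type; the real InferenceConfig has its own fields.
#[derive(Clone, Debug, PartialEq)]
struct InferenceConfig {
    temperature: f64,
    top_p: f64,
}

// Per-model defaults win; then global settings; then hardcoded defaults.
fn resolve_config(
    per_model: Option<&InferenceConfig>,
    global: Option<&InferenceConfig>,
) -> InferenceConfig {
    per_model
        .or(global)
        .cloned()
        .unwrap_or(InferenceConfig { temperature: 0.8, top_p: 0.95 })
}

fn main() {
    let global = InferenceConfig { temperature: 0.7, top_p: 0.9 };
    let per_model = InferenceConfig { temperature: 0.2, top_p: 0.9 };

    // Per-model config takes precedence when present.
    assert_eq!(resolve_config(Some(&per_model), Some(&global)).temperature, 0.2);
    // Without a per-model config, the global settings apply.
    assert_eq!(resolve_config(None, Some(&global)).temperature, 0.7);
    // With neither, the hardcoded defaults apply.
    assert_eq!(resolve_config(None, None).temperature, 0.8);
}
```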