Trait LlmCompletionPort
pub trait LlmCompletionPort: Send + Sync {
    // Required method
    fn chat_stream<'life0, 'life1, 'life2, 'async_trait>(
        &'life0 self,
        messages: &'life1 [AgentMessage],
        tools: &'life2 [ToolDefinition],
    ) -> Pin<Box<dyn Future<Output = Result<Pin<Box<dyn Stream<Item = Result<LlmStreamEvent>> + Send>>>> + Send + 'async_trait>>
       where Self: 'async_trait,
             'life0: 'async_trait,
             'life1: 'async_trait,
             'life2: 'async_trait;
}

Port that the agent loop uses to drive a streaming LLM.

Implementations translate domain messages and tool definitions into vendor-specific HTTP requests and stream back LlmStreamEvent values.

§Contract

  • The returned stream must end with exactly one LlmStreamEvent::Done item, even when the finish reason is abnormal (e.g. "length").
  • Text and tool-call delta events may interleave freely before Done.
  • An Err item in the stream signals an unrecoverable infrastructure error; the agent loop will surface it as super::agent::AgentError::Internal.

Required Methods§


fn chat_stream<'life0, 'life1, 'life2, 'async_trait>(
    &'life0 self,
    messages: &'life1 [AgentMessage],
    tools: &'life2 [ToolDefinition],
) -> Pin<Box<dyn Future<Output = Result<Pin<Box<dyn Stream<Item = Result<LlmStreamEvent>> + Send>>>> + Send + 'async_trait>>
   where Self: 'async_trait,
         'life0: 'async_trait,
         'life1: 'async_trait,
         'life2: 'async_trait,

Begin a chat-completion request and return a live event stream.

§Parameters
  • messages — conversation history in domain form.
  • tools — tool schemas to advertise to the model.
§Returns

A pinned, heap-allocated, Send-able stream of LlmStreamEvent. The caller drives the stream by polling it; each item is either a successfully parsed event or an infrastructure error.

Implementors§