Provider

Enum representing supported LLM providers.

RenderableModel

Protocol for session data models generated by codegen. The render() method transforms typed data into a flat dict[str, str] for variable substitution.
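A minimal sketch of what such a protocol looks like in plain Python. The `UserContext` class and its field names are illustrative, not part of the Moxn SDK; only the `render() -> dict[str, str]` shape comes from the description above.

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class RenderableModel(Protocol):
    """Any model that can flatten itself into template variables."""

    def render(self) -> dict[str, str]: ...


# Illustrative implementation: a typed model whose fields become
# string variables for prompt substitution.
class UserContext:
    def __init__(self, name: str, plan: str) -> None:
        self.name = name
        self.plan = plan

    def render(self) -> dict[str, str]:
        return {"user_name": self.name, "plan": self.plan}


ctx = UserContext(name="Ada", plan="pro")
assert isinstance(ctx, RenderableModel)  # structural check via the protocol
print(ctx.render())
```

Because the protocol is structural, codegen-produced models satisfy it simply by defining `render()`; no inheritance is required.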
PromptTemplate

A prompt template fetched from the Moxn API.

| Property | Type | Description |
|---|---|---|
| id | UUID | Stable anchor ID that never changes |
| name | str | Human-readable prompt name |
| messages | list[Message] | The message sequence |
| input_schema | Schema \| None | Auto-generated from variables |
| completion_config | CompletionConfig \| None | Model and parameter settings |
| function_tools | list[SdkTool] | Tools configured for function calling |
| structured_output_schema | SdkTool \| None | Schema for structured output |
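To make the table concrete, here is a hedged sketch of variable substitution against a template's messages. The `substitute` method and the `{{var}}` placeholder syntax are assumptions for illustration; the real `PromptTemplate` and `Message` classes carry more fields than shown.

```python
import re
from dataclasses import dataclass, field
from uuid import UUID, uuid4


# Hypothetical stand-ins for the SDK types; field names follow the
# table above, but the real classes are richer than this sketch.
@dataclass
class Message:
    role: str
    content: str


@dataclass
class PromptTemplate:
    id: UUID
    name: str
    messages: list[Message] = field(default_factory=list)

    def substitute(self, variables: dict[str, str]) -> list[Message]:
        # Replace {{var}} placeholders in each message body.
        def fill(text: str) -> str:
            return re.sub(r"\{\{(\w+)\}\}", lambda m: variables[m.group(1)], text)

        return [Message(m.role, fill(m.content)) for m in self.messages]


tmpl = PromptTemplate(
    id=uuid4(),
    name="greeting",
    messages=[Message("user", "Hello {{user_name}}, you are on {{plan}}.")],
)
rendered = tmpl.substitute({"user_name": "Ada", "plan": "pro"})
print(rendered[0].content)  # Hello Ada, you are on pro.
```

In practice the flat `dict[str, str]` produced by a RenderableModel's `render()` is what would be passed as `variables` here.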
Task

A task containing prompts and schemas.

Message

A message within a prompt.

ParsedResponse

Normalized LLM response from any provider.

ParsedResponseCandidate

A single response candidate.

Content Block Types
StopReason

TokenUsage

LLMEvent
Event logged to telemetry for LLM interactions.

ResponseType

Classification of the response for UI rendering.

Span Types

Span

An active span for tracing.

SpanContext

Context that can be passed to child spans.

MoxnTraceCarrier

Carrier for propagating trace context across services.
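Cross-service propagation generally means serializing the trace context at one service boundary and reconstructing it at the next. The class below is a hypothetical carrier, not the real `MoxnTraceCarrier` (whose fields may differ); it only illustrates the round-trip pattern.

```python
import json
from dataclasses import asdict, dataclass


# Hypothetical carrier: field names are illustrative assumptions.
@dataclass
class TraceCarrier:
    trace_id: str
    span_id: str

    def to_header(self) -> str:
        # Serialize for transport, e.g. as an HTTP header value.
        return json.dumps(asdict(self))

    @classmethod
    def from_header(cls, value: str) -> "TraceCarrier":
        # Reconstruct the carrier on the receiving service.
        return cls(**json.loads(value))


carrier = TraceCarrier(trace_id="abc123", span_id="span-7")
wire = carrier.to_header()
assert TraceCarrier.from_header(wire) == carrier  # lossless round trip
```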
VersionRef

Reference to a specific version (branch or commit). Either branch_name or commit_id must be provided.
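The "either field must be provided" invariant can be sketched with a plain dataclass. The real VersionRef is presumably a Pydantic model with its own validator, so treat this as an illustration of the constraint, not the SDK's implementation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class VersionRef:
    branch_name: Optional[str] = None
    commit_id: Optional[str] = None

    def __post_init__(self) -> None:
        # At least one of the two identifiers must be set.
        if self.branch_name is None and self.commit_id is None:
            raise ValueError("branch_name or commit_id must be provided")


VersionRef(branch_name="main")      # ok: resolves to the branch head
VersionRef(commit_id="4f9c2e1")     # ok: pins an exact snapshot (id is made up)
try:
    VersionRef()
except ValueError as e:
    print(e)  # branch_name or commit_id must be provided
```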
BranchHeadResponse

Response from get_branch_head().

Branch

A branch reference.

Commit

A commit snapshot.

Schema Types
MoxnSchemaMetadata

Metadata embedded in JSON schemas. Appears in the x-moxn-metadata field of exported JSON schemas and in generated Pydantic models.
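The sketch below shows where such a vendor-extension key would sit inside an exported JSON schema. The keys inside `x-moxn-metadata` here are invented for illustration; only the `x-moxn-metadata` field name itself comes from the description above.

```python
# Hypothetical exported schema: the keys inside x-moxn-metadata are
# illustrative assumptions, not the SDK's actual metadata fields.
schema = {
    "title": "GreetingInput",
    "type": "object",
    "properties": {"user_name": {"type": "string"}},
    "x-moxn-metadata": {
        "prompt_id": "00000000-0000-0000-0000-000000000000",
        "version": "commit",
    },
}

# Consumers recover the Moxn metadata from the vendor-extension key;
# standard JSON Schema validators ignore unknown "x-" style keywords.
meta = schema["x-moxn-metadata"]
print(sorted(meta))
```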