Platform extensions enable agents to access external services and UI components through dependency injection. They provide automatic configuration, keep agent code portable, let users configure services and authentication, and are entirely optional.

Extension Types

Service Extensions

Service extensions provide configured access to external services:
  • LLM Service: Language model access (OpenAI, Ollama, IBM Granite, and more)
  • Embedding Service: Text embedding generation for RAG and semantic search
  • Platform API: File storage, vector databases, and other platform services
  • MCP: Model Context Protocol integration

UI Extensions

UI extensions add rich interactive components to the user interface:
  • Trajectory: Visualize agent reasoning with step-by-step execution traces
  • Citation: Display source references with clickable inline links
  • Form: Collect structured user input through interactive forms
  • Settings: Let users configure agent behavior and preferences

Authentication Extensions

Authentication extensions handle secure access to protected resources:
  • OAuth: Standardized OAuth-based authentication and token management
  • Secrets: Secure storage and retrieval of API keys and credentials

Using Extensions with Dependency Injection

Extensions are injected into agent functions using Annotated type hints:
from typing import Annotated
from agentstack_sdk.a2a.extensions import LLMServiceExtensionServer, LLMServiceExtensionSpec

# Message and RunContext imports are omitted for brevity
async def my_agent(
    input: Message,
    context: RunContext,
    llm: Annotated[LLMServiceExtensionServer, LLMServiceExtensionSpec.single_demand()]
):
    # The platform provides configured LLM access via the injected extension
    pass

Checking Extension Availability

Extensions are optional by design. Always verify an extension is provided before using it:
if llm:
    # Use LLM functionality
    pass
else:
    # Provide fallback or inform user
    yield "LLM not configured"

Suggesting Preferred Models

Service extensions accept model suggestions in priority order. The platform selects the first available model:
LLMServiceExtensionSpec.single_demand(
    suggested=("openai/gpt-4o", "ollama/llama3.1", "ibm/granite-3-8b")
)
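
In practice, this spec goes directly into the Annotated declaration shown earlier:
llm: Annotated[
    LLMServiceExtensionServer,
    LLMServiceExtensionSpec.single_demand(
        suggested=("openai/gpt-4o", "ollama/llama3.1", "ibm/granite-3-8b")
    ),
]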

Extension Reference

OAuth Extension

Enables OAuth-based authentication for secure access to protected resources on behalf of users. Particularly useful for MCP clients that need user identity for third-party tools.

Secrets Extension

Provides secure storage and retrieval of sensitive data like API keys and tokens. Secrets can be requested before conversation start or dynamically during runtime.
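
A minimal sketch of declaring and reading a secret, assuming the extension follows the same Server/Spec and single_demand pattern as the LLM service above; the class names, parameters, and accessors here are assumptions, not confirmed API:
# Hypothetical names modeled on the LLM extension pair above
secrets: Annotated[SecretsExtensionServer, SecretsExtensionSpec.single_demand(key="github_token")]

# ... inside the agent body:
if secrets:
    # Assumed accessor for the fulfilled secret; real field names may differ
    token = secrets.data.secret_fulfillments["github_token"].secret
else:
    yield "GitHub credentials are not configured"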

Citation Extension

Renders inline citation icons with source information, optionally marking text ranges. Makes agent responses more transparent and verifiable.
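
As an illustration only (class and helper names are extrapolated from the dependency-injection pattern above, not confirmed API), an agent might attach a source reference to a range of the text it yields:
# Hypothetical names modeled on the LLM extension pair above
citation: Annotated[CitationExtensionServer, CitationExtensionSpec()]

# ... inside the agent body:
answer = "Extensions are optional by design."
yield answer
if citation:
    # Assumed helper that marks the yielded text range as backed by a source
    yield citation.citation_metadata(
        url="https://example.com/docs/extensions",
        title="Extension documentation",
        start_index=0,
        end_index=len(answer),
    )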

Trajectory Extension

Tracks and visualizes agent reasoning steps and tool execution. Provides transparency into the agent’s decision-making process.
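
A sketch of emitting a reasoning step, again assuming a Server/Spec pair and a metadata helper by analogy with the LLM example; these names are not confirmed API:
# Hypothetical names modeled on the LLM extension pair above
trajectory: Annotated[TrajectoryExtensionServer, TrajectoryExtensionSpec()]

# ... inside the agent body:
if trajectory:
    # Assumed helper that emits one step for the UI's execution trace
    yield trajectory.trajectory_metadata(
        title="Search",
        content="Querying the knowledge base for 'platform extensions'",
    )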

Form Extension

Collects structured data through interactive forms with fields like text inputs, dropdowns, date pickers, file uploads, and checkboxes. Forms can be presented initially or requested dynamically during conversations.
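
A sketch of requesting input mid-conversation, assuming a Server/Spec pair and simple field descriptors; the request helper and field schema shown here are assumptions, not confirmed API:
# Hypothetical names modeled on the LLM extension pair above
form: Annotated[FormExtensionServer, FormExtensionSpec()]

# ... inside the agent body:
if form:
    # Assumed request helper and field descriptors; real field types may differ
    submitted = await form.request_form(
        fields=[
            {"type": "text", "id": "topic", "label": "Topic to research"},
            {"type": "checkbox", "id": "include_sources", "label": "Include sources"},
        ]
    )
    topic = submitted["topic"]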

Settings Extension

Gives users control over agent behavior through configurable parameters like thinking mode, response style, and other agent-specific options.

Key Benefits

Extensions make agents adaptable across deployment environments while providing rich, interactive user experiences. The dependency injection pattern ensures agents remain portable and testable, while optional extension checking enables graceful degradation when services aren’t available.