Most agents require access to language models and other external services to function. Agent Stack makes it easy to configure these connections once and share them across all your agents.

Quickstart: Interactive Setup

The fastest way to get started is with the interactive setup wizard:
agentstack model setup
The wizard guides you through:
  • API Key entry (with validation)
  • Model selection (with recommendations)
  • Connection testing (to verify everything works)
  • Provider-specific options (like context window for Ollama)

Supported LLM Providers

Agent Stack supports a wide range of language model providers:

Cloud Providers

  • Anthropic Claude
  • Cerebras - has a free tier
  • Chutes - has a free tier
  • Cohere - has a free tier
  • DeepSeek
  • Google Gemini - has a free tier
  • GitHub Models - has a free tier
  • Groq - has a free tier
  • IBM watsonx
  • Mistral - has a free tier
  • Moonshot AI
  • NVIDIA NIM
  • OpenAI
  • OpenRouter - has some free models
  • Perplexity
  • together.ai - has a free tier

Local Providers

  • Ollama
  • Jan

Custom Providers via LLM Gateway

Agent Stack includes a built-in LLM gateway that exposes a unified OpenAI-compatible API endpoint. If you have a custom OpenAI-compatible endpoint of your own, you can configure it during the interactive setup (agentstack model setup) by selecting “Other (RITS, vLLM, …)” and providing your API URL.
The gateway is useful when you want to:
  • Point existing agents to Agent Stack instead of directly to LLM providers
  • Centrally manage API keys and provider configurations
  • Switch providers without reconfiguring individual agents
After configuring Agent Stack with a provider, the gateway is available at: http://localhost:8333/api/v1/openai/chat/completions
This is a POST-only API endpoint for programmatic use. Use curl or OpenAI-compatible clients to interact with it.
The gateway automatically handles:
  • Authentication with your configured provider
  • Provider-specific request/response formatting
  • Both streaming and non-streaming responses
  • Request validation and error responses
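Since the gateway speaks the standard OpenAI chat-completions format, a plain curl request is enough to exercise it. A minimal sketch follows; the model name "gpt-4o" is an assumption (substitute whichever model you configured during setup), and it requires a running Agent Stack instance on localhost:

```shell
# Build an OpenAI-style chat-completions request body.
# "gpt-4o" is a placeholder -- use the model you configured.
BODY='{
  "model": "gpt-4o",
  "messages": [{"role": "user", "content": "Say hello in one sentence."}]
}'

# POST it to the local gateway endpoint (requires Agent Stack running).
curl -s -X POST http://localhost:8333/api/v1/openai/chat/completions \
  -H "Content-Type: application/json" \
  -d "$BODY"
```

Because the endpoint is OpenAI-compatible, streaming should follow the usual convention: add `"stream": true` to the request body to receive a server-sent-events stream instead of a single JSON response.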

Agent-Specific Variables

Some agents require additional API keys for external services. Example variables that agents might declare:
# Search APIs
agentstack env add TAVILY_API_KEY=your-tavily-key
agentstack env add SERP_API_KEY=your-serpapi-key

# Third-party services
agentstack env add GITHUB_TOKEN=your-github-token
agentstack env add DATABASE_URL=postgresql://user:pass@host:5432/db

Manual Environment Management

Add Variables

Set individual variables:
agentstack env add LLM_MODEL=gpt-4o
agentstack env add LLM_API_KEY=sk-...
agentstack env add LLM_API_BASE=https://api.openai.com/v1

View Current Configuration

List all configured environment variables:
agentstack env list

Remove Variables

Remove specific variables:
agentstack env remove LLM_API_KEY
agentstack env remove LLM_MODEL LLM_API_BASE