# MCP Tools
Chat UI supports tool calling via the Model Context Protocol (MCP). MCP servers expose tools that models can invoke during conversations.
## Server Types
Chat UI supports two types of MCP servers:
### Base Servers (Admin-configured)
Base servers are configured by the administrator via environment variables. They appear for all users and can be enabled/disabled per-user but not removed.
```
MCP_SERVERS=[
  {"name": "Web Search (Exa)", "url": "https://mcp.exa.ai/mcp"},
  {"name": "Hugging Face", "url": "https://hf.co/mcp"}
]
```

Each server entry requires:

- `name` - Display name shown in the UI
- `url` - MCP server endpoint URL
- `headers` (optional) - Custom headers for authentication
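A base server that needs authentication can pass custom headers. In the sketch below, the server name, URL, and token are placeholders, and `headers` is assumed to be a JSON object mapping header names to values:

```
MCP_SERVERS=[
  {
    "name": "Internal Tools",
    "url": "https://mcp.example.com/mcp",
    "headers": {"Authorization": "Bearer <your-token>"}
  }
]
```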
### User Servers (Added from UI)
Users can add their own MCP servers directly from the UI:
1. Open the chat input and click the + button (or go to Settings)
2. Select MCP Servers
3. Click Add Server
4. Enter the server name and URL
5. Run Health Check to verify connectivity
User-added servers are stored in the browser and can be removed at any time. They work alongside base servers.
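At its core, a health check confirms that the endpoint speaks MCP. A minimal sketch of the first message such a check could send, assuming the standard MCP JSON-RPC `initialize` handshake (the client name, version, and protocol date below are illustrative, not what Chat UI actually sends):

```python
import json

def build_initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 'initialize' message that opens an MCP session."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",   # illustrative protocol revision
            "capabilities": {},
            "clientInfo": {"name": "health-check", "version": "0.0.1"},
        },
    })
```

A server that answers this request with a valid `initialize` result is reachable and MCP-capable.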
## User Token Forwarding
When users are logged in via Hugging Face, you can forward their access token to MCP servers:
```
MCP_FORWARD_HF_USER_TOKEN=true
```

This allows MCP servers to access user-specific resources on their behalf.
## Using Tools
1. Enable the servers you want to use from the MCP Servers panel
2. Start chatting - models will automatically use tools when appropriate
## Model Requirements
Not all models support tool calling. To enable tools for a specific model, add it to your MODELS override:
```
MODELS=`[
  {
    "id": "meta-llama/Llama-3.3-70B-Instruct",
    "supportsTools": true
  }
]`
```

## Tool Execution Flow
When a model decides to use a tool:
1. The model generates a tool call with parameters
2. Chat UI executes the call against the MCP server
3. Results are displayed in the chat as a collapsible "tool" block
4. Results are fed back to the model for follow-up responses
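The loop above can be sketched in a few lines of Python. Here `call_model` and `call_mcp_tool` are hypothetical stand-ins for the model backend and the MCP client, not Chat UI's actual API:

```python
def run_turn(messages: list, call_model, call_mcp_tool) -> str:
    """Drive one assistant turn, executing tool calls until a final answer.

    call_model(messages) -> {"content": str} or {"tool_call": {"name", "arguments"}}
    call_mcp_tool(name, arguments) -> str   (both are hypothetical interfaces)
    """
    while True:
        reply = call_model(messages)
        tool_call = reply.get("tool_call")
        if tool_call is None:
            return reply["content"]  # no tool needed: final answer
        # Execute the call against the MCP server (stubbed here) ...
        result = call_mcp_tool(tool_call["name"], tool_call["arguments"])
        # ... and feed the result back so the model can use it.
        messages.append({"role": "tool",
                         "name": tool_call["name"],
                         "content": result})
```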
## Integration with LLM Router
When using the LLM Router, you can configure automatic routing to a tools-capable model:
```
LLM_ROUTER_ENABLE_TOOLS=true
LLM_ROUTER_TOOLS_MODEL=meta-llama/Llama-3.3-70B-Instruct
```

When a user has MCP servers enabled and selects the Omni model, the router will automatically use the specified tools model.