Built-in Extensions
Overview of the built-in extensions that provide core functionality without additional dependencies
To minimize bloat and ensure a lean functional core, llms.py:
- Maintains a single-file main.py core which enables:
  - Its CLI features
  - Support for OpenAI-compatible providers:
    - Moonshot AI, Z.ai, Alibaba/Qwen, Groq, xAI, Mistral, Codestral, NVIDIA, GitHub, DeepSeek, Chutes, HuggingFace, OpenRouter, Fireworks AI, Ollama, LM Studio
- Requires only a single aiohttp dependency
- Maintains an up-to-date model provider configuration (from models.dev)
- Supports loading extensions for any additional functionality, integrations, and UI features
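All of the OpenAI-compatible providers listed above accept the same request shape on the standard /v1/chat/completions endpoint, which is why one code path (and a single aiohttp dependency) can serve them all. A minimal sketch of that shared payload, with a placeholder model name (not a default shipped with llms.py):

```python
import json

def chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a POST to /v1/chat/completions,
    the endpoint shared by all OpenAI-compatible providers."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Placeholder model name; substitute any model your provider serves.
payload = chat_request("llama3.3:70b", "Hello!")
print(json.dumps(payload, indent=2))
```

The same body can then be POSTed (with the provider's base URL and API key) using aiohttp, without any provider-specific branching.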
Built-in Extensions
All built-in extensions are located in the llms/extensions folder. They're reserved for generally useful functionality that doesn't require any additional dependencies.
app extension
Provides core application logic, data persistence (SQLite), and migration from client-side storage
- Database: `~/.llms/user/default/app/app.sqlite`
- request: Logs all requests for auditing and analytics
- thread: Persists chat threads and messages
- analytics: Provides a UI for analytics using `request` table data in `app.sqlite`
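Because the database is a plain SQLite file, you can inspect it with Python's standard sqlite3 module. A hedged sketch (the path comes from the docs above; only `request` and `thread` are documented, so this lists whatever tables actually exist rather than assuming a schema):

```python
import sqlite3
from pathlib import Path

def list_tables(db_file) -> list:
    """Return the table names in a SQLite database file."""
    conn = sqlite3.connect(db_file)
    try:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
    finally:
        conn.close()
    return [name for (name,) in rows]

# Path of the app extension's database, as documented above.
db_path = Path.home() / ".llms" / "user" / "default" / "app" / "app.sqlite"
if db_path.exists():
    print(list_tables(db_path))  # expect 'request' and 'thread' among them
```

The same approach works for any other extension database, such as the gallery extension's `gallery.sqlite` below.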
gallery extension
A UI and backend for intercepting, storing, and viewing generated assets and uploaded files
- Database: `~/.llms/user/default/gallery/gallery.sqlite`
- media: Stores metadata for generated assets and uploaded files
providers extension
Provides support for popular LLM providers requiring custom handling beyond OpenAI-compatible requests:
- anthropic: Integration with Anthropic's Claude models
- cerebras: Integration with Cerebras AI models
- chutes: Integration with Chutes.ai for image generation
- google: Integration with Google's Gemini models
- nvidia: Integration with NVIDIA's GenAI APIs
- openai: Integration with OpenAI's chat and image generation models
- openrouter: Integration with OpenRouter for image generation
- z.ai: Integration with Z.ai for image generation
Disable Extensions
Only built-in extensions or extensions in your local ~/.llms/extensions folder are loaded by default.
To disable built-in extensions or temporarily disable local extensions, add their folder names to your ~/.llms/llms.json:
```json
{
  "disable_extensions": [
    "xmas",
    "duckduckgo"
  ]
}
```

Alternatively, you can set the LLMS_DISABLE environment variable to a comma-separated list of extension names to disable:

```sh
export LLMS_DISABLE="xmas,duckduckgo"
```

Other Built-in Extensions
- core_tools - Essential tools for file operations, memory persistence, math calculation & safe code execution
- katex - Enables rendering of LaTeX math expressions in chat responses using KaTeX
- system_prompts - Configures a library of 200+ curated system prompts accessible from the UI
- tools - Provides the Tools UI and exposes registered tool definitions via an API endpoint
Extensions Overview
Flexible extensions system for adding features, custom pages, toolbar icons, provider implementations, and UI customizations
UI Extensions
This guide provides a walkthrough of the LLM UI Extensions API which allows you to customize the UI, add new pages, modify the layout, and intercept chat functionality.