# Models & Providers
Configure AI models in your preferred tool to use Aexol.ai's built-in Spectral model or other supported providers.
## Aexol.ai Built-in Model

The `built-in/spectral` model is Aexol's native AI, optimized for understanding and generating Aexol specifications.

- Endpoint: `https://api.aexol.ai/v1`
- Model name: `built-in/spectral`
- Auth: Team API key (`sk-aexol-team-...`)

Get your team API key from Aexol Studio under Team settings.
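You can sanity-check the endpoint and key before wiring up an editor. A minimal sketch using only the Python standard library, assuming the endpoint follows the standard OpenAI chat completions shape (which the OpenAI-compatible client settings below suggest); the request is built but not sent, and the prompt text is purely illustrative:

```python
import json
import urllib.request

BASE_URL = "https://api.aexol.ai/v1"
API_KEY = "sk-aexol-team-..."  # replace with your real team key

# Standard OpenAI-style chat completions payload
payload = {
    "model": "built-in/spectral",
    "messages": [{"role": "user", "content": "Hello, Spectral"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the request; an
# OpenAI-compatible server responds with choices[0].message.content.
```

Any HTTP client works the same way: POST JSON to `{baseURL}/chat/completions` with a `Bearer` team key.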
## OpenCode

Add a provider entry in your `opencode.json` or `opencode.jsonc`:
```json
{
  "provider": {
    "aexol.ai": {
      "name": "Aexol AI",
      "options": {
        "baseURL": "https://api.aexol.ai/v1",
        "apiKey": "sk-aexol-team-..."
      },
      "models": {
        "built-in/spectral": {
          "name": "built-in/spectral"
        }
      }
    }
  }
}
```
Config locations:

- Global: `~/.config/opencode/opencode.json`
- Project-level: `opencode.json` in your project root
OpenCode resolves the model as `{baseURL}/models/{modelName}/v1/chat/completions`, so `built-in/spectral` maps to `https://api.aexol.ai/v1/models/built-in/spectral/v1/chat/completions`.
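That resolution rule is a simple string template. A sketch of it (the function name is ours, not OpenCode's; the inputs mirror the config values above):

```python
def resolve_chat_endpoint(base_url: str, model: str) -> str:
    """Mirror the model-to-endpoint resolution described above."""
    return f"{base_url}/models/{model}/v1/chat/completions"

url = resolve_chat_endpoint("https://api.aexol.ai/v1", "built-in/spectral")
# → "https://api.aexol.ai/v1/models/built-in/spectral/v1/chat/completions"
```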
### Also connect MCP (optional)

To use Aexol tools (validate, generate, etc.) alongside the model, also add the MCP config:
```json
{
  "mcp": {
    "aexol": {
      "type": "remote",
      "url": "https://api.aexol.ai/mcp",
      "headers": {
        "Authorization": "Bearer sk-aexol-team-..."
      },
      "oauth": false
    }
  }
}
```
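Because `oauth` is false, the MCP endpoint is authenticated with the same static bearer token as the model provider. As a sketch of what a client sends, remote MCP servers generally speak JSON-RPC over HTTP and open a session with an `initialize` call; the exact params below follow the generic MCP handshake and are illustrative, not Aexol-specific (request built, not sent):

```python
import json
import urllib.request

MCP_URL = "https://api.aexol.ai/mcp"
API_KEY = "sk-aexol-team-..."  # same team key as the model provider

# Generic MCP initialize request (JSON-RPC 2.0); params are illustrative.
body = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

req = urllib.request.Request(
    MCP_URL,
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would perform the handshake.
```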
## Claude Code

```bash
# Add Aexol as an OpenAI-compatible provider
claude config set --global model aexol/built-in/spectral
```

Or configure directly in Claude's config with the Aexol.ai base URL and your team API key.
See Working with AI & IDEs for Claude Code MCP setup.
## Cursor IDE

In Cursor settings, add a custom model provider:

- Provider type: OpenAI-compatible
- Base URL: `https://api.aexol.ai/v1`
- API Key: `sk-aexol-team-...`
- Model: `built-in/spectral`
## Windsurf / Codeium

In Windsurf settings, configure a custom AI provider:

- Base URL: `https://api.aexol.ai/v1`
- API Key: `sk-aexol-team-...`
- Model: `built-in/spectral`
## Continue (VS Code Extension)

In `~/.continue/config.json`:
```json
{
  "models": [
    {
      "title": "Aexol Spectral",
      "provider": "openai",
      "model": "built-in/spectral",
      "apiBase": "https://api.aexol.ai/v1",
      "apiKey": "sk-aexol-team-..."
    }
  ]
}
```
## Other OpenAI-compatible clients
Any client that supports custom OpenAI-compatible endpoints works:
| Setting | Value |
|---|---|
| Base URL | `https://api.aexol.ai/v1` |
| API Key | `sk-aexol-team-...` |
| Model | `built-in/spectral` |
## Getting a Team API Key

- Open Aexol Studio
- Go to Team settings → API Keys
- Click Create new key
- Copy the key (it starts with `sk-aexol-team-`)
## Next Steps
- Set up MCP tools → Working with AI & IDEs
- Configure remote MCP endpoint → Remote MCP
- Run inference → CLI Commands
