📚 Aexol CLI Commands Guide
Complete reference for all Aexol CLI commands with interactive features, AI model selection, and code generation.
Table of Contents
- Installation & Updates
- Interactive Commands
- Core Commands
- Model Selection
- Tool Integration
- Command Reference
Installation & Updates
Install Aexol
# Install via curl (Aexol team)
export GITLAB_TOKEN=<your_private_token>
curl --fail --show-error --location \
--header "Private-Token: $GITLAB_TOKEN" \
https://gitlab.aexol.com/api/v4/projects/725/packages/generic/aexol/latest/install.sh | bash
Update Aexol
# Update to latest version
aexol update
# Install specific version
aexol update --version 0.4.4
Interactive Commands
Chat Command
AI-powered conversational interface for designing and refining Aexol specifications.
Basic Usage
# Start interactive chat with model selection
aexol chat
# On first run, you'll see:
🤖 Select AI Model
📦 Codex CLI
  ▶ Codex CLI - GPT-5 (recommended)
    Codex CLI - GPT-5-Codex
    Codex CLI - GPT-4o
🤖 GitHub Copilot CLI
  ▶ GitHub Copilot CLI - Claude Sonnet 4
Anthropic Claude Models
  ▶ claude-sonnet-4-5-20250929 - Latest Sonnet
    claude-haiku-4-5-20251001 - Fast and efficient
OpenAI GPT-5 Series (Latest)
  ▶ gpt-5 - Best for coding & agentic tasks
    gpt-5-mini - Faster, cost-efficient
Chat Features
🔧 15 Available Tools (for API models only - Claude, GPT):
File Operations (8 tools):
- list_files(path?) - List files and directories
- read_file(file_path, start_line?, end_line?) - Read file contents
- write_file(file_path, content) - Create or overwrite files
- delete_file(file_path) - Delete files
- search_files(pattern, file_pattern?) - Search text in files
- replace_in_file(file_path, old_text, new_text) - Replace text
- create_directory(path) - Create directories
- get_file_info(path) - Get file/directory info
Aexol Commands (7 tools):
- aexol_parse(file_path, json?, validate?) - Parse Aexol files
- aexol_validate(file_path, json?, verbose?) - Validate files
- aexol_docs(file_path, output?, format?) - Generate docs
- aexol_analyze(file_path, verbose?) - Analyze specifications
- aexol_version() - Get CLI version
- aexol_help(command?) - Get command help
Chat Commands
During a chat session, use these commands:
- /exit or :q - End the session
- /model - Change AI model (saved to .aexol/config.json)
- /tools - Show available AI tools
- /help - Show help message
Chat Command Options
# Use specific model
aexol chat --model claude-sonnet-4-5-20250929
# Use Codex CLI for interactive terminal
aexol chat --model codex-cli
# Use GitHub Copilot CLI
aexol chat --model copilot-cli
# Start with a prompt/goal
aexol chat --prompt "Help me design an authentication workflow"
Example Conversation
$ aexol chat
aexol> Can you help me design a todo application workflow?
AI: I can help you design a comprehensive todo workflow. Let me first check
if there are any existing examples...
aexol> Create a workflow file for me
AI: I'll create a workflow specification for your todo application with the
necessary states and transitions...
[File created: workflows/todo.aexol]
aexol> /exit
Session ended.
Implement Command
Generate project structure and implement methods with AI agent guidance.
Basic Usage
# Interactive agent selection
aexol implement app.aexol
# On first run, you'll see:
🤖 Select AI Agent for Implementation
🤖 AI Agents
  ▶ GitHub Copilot CLI - Interactive coding with GitHub Copilot
    Codex CLI - Coding agent from Aexol
    Claude Code CLI - Anthropic Claude coding assistant
Implement Options
# Use specific agent
aexol implement app.aexol --agent copilot
# Specify target language
aexol implement api.aexol --language python
# Provide implementation guidance
aexol implement todo.aexol --guidance "Use async/await, add validation"
# Dry run (generate structure only)
aexol implement app.aexol --dry-run
# Specify output directory
aexol implement app.aexol -o ./src
How It Works
- Generates skeleton code with TODOs from your Aexol specification
- Implements each method interactively using your chosen AI agent
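To make the first step concrete, a generated skeleton might resemble the sketch below. This is purely illustrative: the actual file names, types, and TODO markers depend on your specification and chosen language, and the method bodies shown here are examples of what the agent would fill in.

```typescript
// Hypothetical skeleton for a Todo spec (illustrative only; bodies shown
// as an agent might complete them from the TODO markers).
export interface TodoItem {
  id: string;
  title: string;
  done: boolean;
}

export class TodoList {
  private items: TodoItem[] = [];

  // TODO(implement): add a todo item
  addItem(title: string): TodoItem {
    const item: TodoItem = { id: String(this.items.length + 1), title, done: false };
    this.items.push(item);
    return item;
  }

  // TODO(implement): mark an item as done
  complete(id: string): boolean {
    const item = this.items.find((i) => i.id === id);
    if (!item) return false;
    item.done = true;
    return true;
  }

  count(): number {
    return this.items.length;
  }
}
```

The interactive phase then walks through each TODO with the selected agent.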
Supported Agents
- GitHub Copilot CLI: npm install -g @githubnext/github-copilot-cli
- Codex CLI: npm install -g codex-cli
- Claude CLI: Note - Anthropic does not provide a CLI tool. Use API models instead.
Example
$ aexol implement todo.aexol --agent copilot --language typescript
✓ Parsed todo.aexol
✓ Generated project structure
📁 Generated files:
  ✓ src/TodoList.ts (3 TODOs)
  ✓ src/TodoItem.ts (2 TODOs)
  ✓ src/api/routes.ts (4 TODOs)
🤖 Launching GitHub Copilot CLI for implementation...
[Interactive session begins...]
Inference Command
Generate artifacts (GraphQL, Prisma, Routes, Webhooks, Cron, E2E Tests) from Aexol specifications using AI inference.
The inference command analyzes your Aexol specification and generates production-ready artifacts for various parts of your stack.
Available Subcommands
| Subcommand | Description | Default Output |
|---|---|---|
| graphql | Generate GraphQL SDL schema | schema.graphql |
| prisma | Generate Prisma database schema | schema.prisma |
| routes | Generate route tree configuration | routes.json |
| webhooks | Generate webhook specifications | webhooks.json |
| cron | Generate cron job specifications | cron.json |
| e2e | Generate E2E test specifications | ./tests/ |
GraphQL Schema Generation
Generate a complete GraphQL schema from your types and visitor capabilities.
# Basic usage
aexol inference graphql app.aexol
# Specify output file
aexol inference graphql app.aexol -o schema.graphql
# With Apollo Federation directives
aexol inference graphql app.aexol --federated
# Use specific AI agent
aexol inference graphql app.aexol -a claude-sonnet-4-5-20250929
What it generates:
- GraphQL types from Aexol types
- Query operations from "can view/browse/list/search" capabilities
- Mutation operations from "can create/update/delete" capabilities
- Subscription operations from "can subscribe/watch" capabilities
- Input types for mutations (CreateInput, UpdateInput)
- Descriptions from your specification
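The verb-to-operation mapping listed above can be sketched as a simple heuristic. This is an illustration of the stated rules, not the CLI's actual inference code:

```typescript
// Map a capability phrase to a GraphQL operation kind, following the
// verb lists above (illustrative heuristic only).
type OpKind = "Query" | "Mutation" | "Subscription";

function operationKind(capability: string): OpKind | null {
  const verb = capability.replace(/^can\s+/, "").trim().split(/\s+/)[0];
  if (["view", "browse", "list", "search"].includes(verb)) return "Query";
  if (["create", "update", "delete"].includes(verb)) return "Mutation";
  if (["subscribe", "watch"].includes(verb)) return "Subscription";
  return null; // capability does not map to a GraphQL operation
}
```

For example, "can view orders" becomes a Query while "can create order" becomes a Mutation.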
Prisma Schema Generation
Generate a Prisma schema with auto-detected relations.
# Basic usage
aexol inference prisma app.aexol
# Specify output and database provider
aexol inference prisma app.aexol -o prisma/schema.prisma --provider postgresql
# Available providers: postgresql, mysql, sqlite, mongodb
aexol inference prisma app.aexol --provider mysql
What it generates:
- Prisma models from Aexol types
- Auto-detected relations from field naming patterns (e.g., customerId → relation to Customer)
- Appropriate attributes (@id, @unique, @default, @relation)
- Indexes for frequently queried fields
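The naming-pattern detection can be approximated as follows. This is a hypothetical sketch of the heuristic described above, not the CLI's actual implementation:

```typescript
// Infer a relation target from a foreign-key-style field name:
// a field like `customerId` suggests a relation to `Customer`.
// (Illustrative heuristic only.)
function inferRelationTarget(fieldName: string): string | null {
  const match = /^([a-z][A-Za-z0-9]*)Id$/.exec(fieldName);
  if (!match) return null;
  const base = match[1];
  return base.charAt(0).toUpperCase() + base.slice(1);
}
```

So inferRelationTarget("customerId") yields "Customer", while a plain field like "title" yields no relation.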
Route Tree Generation
Generate route configurations for various frontend frameworks.
# Generate JSON route tree (framework-agnostic)
aexol inference routes app.aexol
# React Router v6 format
aexol inference routes app.aexol --framework react-router -o src/routes.tsx
# Next.js App Router structure
aexol inference routes app.aexol --framework nextjs
# Vue Router v4 format
aexol inference routes app.aexol --framework vue-router -o src/router/routes.ts
Route inference from visitors:
visitor Dashboard {
  "view overview"       # → /dashboard (index)
  "manage orders" {     # → /dashboard/orders
    "view order"        # → /dashboard/orders/:id
    "edit order"        # → /dashboard/orders/:id/edit
    "create order"      # → /dashboard/orders/new
  }
}
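The nested capabilities above translate into a nested route tree; flattening that tree into full paths can be sketched like this (a hypothetical helper mirroring the comments above, not part of the CLI):

```typescript
// Flatten a nested route tree into full paths (illustrative).
type RouteNode = { path: string; children?: RouteNode[] };

function flattenPaths(node: RouteNode, prefix = ""): string[] {
  // Join prefix and segment, collapsing duplicate slashes.
  const full = `${prefix}/${node.path}`.replace(/\/+/g, "/");
  const children = (node.children ?? []).flatMap((c) => flattenPaths(c, full));
  return [full, ...children];
}

const dashboard: RouteNode = {
  path: "dashboard",
  children: [
    { path: "orders", children: [{ path: ":id" }, { path: ":id/edit" }, { path: "new" }] },
  ],
};
// flattenPaths(dashboard) yields "/dashboard", "/dashboard/orders",
// "/dashboard/orders/:id", "/dashboard/orders/:id/edit", "/dashboard/orders/new"
```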
Webhook Specifications
Generate webhook endpoint specifications from agent definitions.
# Basic usage
aexol inference webhooks app.aexol
# Specify framework context
aexol inference webhooks app.aexol --framework express
aexol inference webhooks app.aexol --framework nextjs
aexol inference webhooks app.aexol --framework hono
Output format (JSON):
{
"webhooks": [
{
"path": "/webhooks/stripe/payment-succeeded",
"method": "POST",
"source": "stripe",
"event": "payment_intent.succeeded",
"description": "Handles successful payment events",
"handler": "PaymentProcessor agent"
}
]
}
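One way to consume the generated spec is a small lookup keyed on method and path. This is hypothetical glue code for your own server; the CLI emits only the JSON:

```typescript
// Minimal dispatcher over a generated webhook spec (illustrative).
type WebhookSpec = {
  path: string;
  method: string;
  source: string;
  event: string;
};

// In practice you would load this from the generated webhooks.json.
const specs: WebhookSpec[] = [
  {
    path: "/webhooks/stripe/payment-succeeded",
    method: "POST",
    source: "stripe",
    event: "payment_intent.succeeded",
  },
];

function findWebhook(method: string, path: string): WebhookSpec | undefined {
  return specs.find((s) => s.method === method && s.path === path);
}
```

An incoming POST to /webhooks/stripe/payment-succeeded would then resolve to the Stripe payment event entry.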
Cron Job Specifications
Generate scheduled job definitions from agent schedules.
# node-cron format (default)
aexol inference cron app.aexol
# GitHub Actions workflow
aexol inference cron app.aexol --format github-actions -o .github/workflows/scheduled.yml
# Kubernetes CronJob manifests
aexol inference cron app.aexol --format kubernetes -o k8s/cronjobs.yaml
E2E Test Specifications
Generate comprehensive test specifications from visitor journeys.
# Markdown format (default)
aexol inference e2e app.aexol -o tests/
# Gherkin/Cucumber format
aexol inference e2e app.aexol --format gherkin -o tests/
# Simple checklist format
aexol inference e2e app.aexol --format checklist
What it generates:
- Test flows for each visitor
- Happy path and error scenarios
- Preconditions based on state requirements
- Step-by-step test instructions
Inference Command Options
All subcommands support these common options:
-o, --output <path> # Output file or directory
-a, --agent <agent> # AI agent to use (interactive if not specified)
-h, --help # Show help for the subcommand
Example: Full Stack Generation
# Generate all artifacts for an e-commerce app
aexol inference graphql ecommerce.aexol -o generated/schema.graphql
aexol inference prisma ecommerce.aexol -o generated/prisma/schema.prisma
aexol inference routes ecommerce.aexol -o generated/routes.tsx --framework react-router
aexol inference webhooks ecommerce.aexol -o generated/webhooks.json
aexol inference e2e ecommerce.aexol -o generated/tests/
Core Commands
Parse Command
Parse an Aexol file and display its Abstract Syntax Tree (AST).
Basic Usage
# Parse and show AST
aexol parse app.aexol
# Parse directory of specs
aexol parse ./specs
# Verbose output
aexol parse app.aexol --verbose
Example Output
$ aexol parse examples/todo.aexol
✓ Parsed successfully
📊 Summary:
- 3 visitors
- 2 roles
- 1 workflow
- 4 types
- 0 agents
AST Structure:
├─ Visitor: TodoUser
├─ Role: Admin
├─ Workflow: TodoLifecycle
└─ Type: Todo
Validate Command
Validate an Aexol file for syntax and semantic errors.
Basic Usage
# Validate specification
aexol validate app.aexol
# Verbose validation
aexol validate app.aexol --verbose
Example Output
$ aexol validate app.aexol
✓ Validation passed
✓ 0 errors
⚠ 2 warnings:
- Workflow 'TodoLifecycle' missing initial state (line 45)
- Agent 'TaskManager' role 'Undefined' not found (line 67)
Analyze Command
Analyze an Aexol specification and generate a detailed complexity report.
Basic Usage
# Analyze specification
aexol analyze app.aexol
# Analyze with verbose output
aexol analyze app.aexol --verbose
# Analyze directory
aexol analyze ./specs
Example Output
$ aexol analyze ecommerce.aexol
📊 Aexol Analysis Report
Complexity Metrics:
- Total Definitions: 42
- Visitors: 5 (depth: 3-5 levels)
- Workflows: 3 (avg 8 states)
- Agents: 7 (avg 4 capabilities)
- Types: 12
- Roles: 4
Recommendations:
✓ Well-structured specification
⚠ Consider splitting large workflows
  → ProductCheckout workflow has 15 states
Docs Command
Generate comprehensive Markdown documentation from Aexol specifications.
Basic Usage
# Generate documentation
aexol docs app.aexol
# Save to file
aexol docs app.aexol -o API.md
# Generate for directory
aexol docs ./specs -o documentation.md
Example Output
$ aexol docs ecommerce.aexol -o API.md
✓ Parsed ecommerce.aexol
✓ Generating documentation...
📄 Generated documentation:
  ✓ API.md (1,234 lines)
Includes:
- Overview
- Type Definitions (12 types)
- Workflows (3 workflows, 24 states)
- Agents (7 agents)
- Visitors (5 visitors)
- Roles & Permissions
Model Selection
Available Models
CLI Tools (Interactive Terminal Sessions)
- codex-cli:gpt-5 - Codex CLI with GPT-5 (recommended)
- codex-cli:gpt-5-codex - Optimized for agentic coding
- codex-cli:gpt-4o - Fast & intelligent
- copilot-cli - GitHub Copilot CLI (Claude Sonnet 4)
Anthropic Claude (API)
- claude-sonnet-4-5-20250929 - Latest Sonnet (recommended)
- claude-haiku-4-5-20251001 - Fast and efficient
OpenAI GPT-5 Series (API - Latest)
- gpt-5 - Best for coding & agentic tasks
- gpt-5-mini - Faster, cost-efficient
- gpt-5-nano - Fastest, most cost-efficient
- gpt-5-pro - Smarter, more precise
- gpt-5-codex - Optimized for Codex
OpenAI GPT-4.1 Series (API)
- gpt-4.1 - Smartest non-reasoning
- gpt-4.1-mini - Smaller, faster
- gpt-4.1-nano - Fastest GPT-4.1
OpenAI o-series (Reasoning Models)
- o3 - Latest reasoning model
- o3-mini - Small reasoning model
- o3-pro - More compute, better responses
- o4-mini - Fast reasoning model
Model Configuration
Models are saved to .aexol/config.json and persist across sessions.
# Models are selected interactively
aexol chat
# → Select model from menu
# → Saved to .aexol/config.json
# Or specify directly
aexol chat --model claude-sonnet-4-5-20250929
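The persisted selection might look like this (a hypothetical shape; the exact keys in .aexol/config.json are an implementation detail of the CLI):

```json
{
  "model": "claude-sonnet-4-5-20250929"
}
```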
Environment Variables
# Anthropic API Key (for Claude models)
export ANTHROPIC_API_KEY=your_key
# OpenAI API Key (for GPT models)
export OPENAI_API_KEY=your_key
# Disable colors in output
export NO_COLOR=1
Tool Integration
Remote MCP Tools
For remote/team operations (cloud documents, remote tasks, refinements), use the backend MCP endpoint at https://api.aexol.ai/mcp with a team API key (sk-aexol-team-...).
Use JSON-RPC methods initialize, ping, tools/list, and tools/call.
AI Tools (API Models Only)
Tools are available for API models (Claude, GPT) but not for CLI models (Codex CLI, Copilot CLI).
File Operations (8 tools)
list_files(path?)
read_file(file_path, start_line?, end_line?)
write_file(file_path, content)
delete_file(file_path)
search_files(pattern, file_pattern?)
replace_in_file(file_path, old_text, new_text)
create_directory(path)
get_file_info(path)
Aexol Commands (7 tools)
aexol_parse(file_path, json?, validate?)
aexol_validate(file_path, json?, verbose?)
aexol_docs(file_path, output?, format?)
aexol_analyze(file_path, verbose?)
aexol_version()
aexol_help(command?)
Using Tools in Chat
The AI invokes tools automatically when you describe the action you need:
You: Can you read the package.json file?
AI: Let me read_file(package.json)...
🔧 Executing tools...
✓ read_file(package.json)
✓ File contents loaded
Here's what I found in package.json...
Command Reference
Quick Reference
# Core commands
aexol parse <file> # Parse Aexol file
aexol validate <file> # Validate specification
aexol analyze <file> # Analyze complexity
aexol docs <file> # Generate documentation
# Interactive commands
aexol chat # Interactive AI chat
aexol implement <file> # AI-assisted implementation
# Inference commands (generate artifacts)
aexol inference graphql <file> # Generate GraphQL schema
aexol inference prisma <file> # Generate Prisma schema
aexol inference routes <file> # Generate route tree
aexol inference webhooks <file> # Generate webhook specs
aexol inference cron <file> # Generate cron job specs
aexol inference e2e <file> # Generate E2E test specs
# Utility commands
aexol auth # Login/logout/whoami/register
aexol from <doc> # Generate spec from .md/.txt document
aexol lsp --info # Show LSP capabilities/info
aexol lsp --stdio # Start LSP server over stdio
aexol update # Update Aexol CLI
aexol help # Show help
Common Workflows
1. Design a Specification
# Start with interactive chat
aexol chat
# Design your specification interactively
aexol> I need a user authentication workflow
# AI helps you design the spec
# Save it to a .aexol file
2. Validate & Analyze
# Validate the specification
aexol validate auth.aexol
# Analyze complexity
aexol analyze auth.aexol
3. Generate Artifacts with Inference
# Generate GraphQL schema
aexol inference graphql auth.aexol -o schema.graphql
# Generate Prisma schema
aexol inference prisma auth.aexol -o prisma/schema.prisma
# Generate routes
aexol inference routes auth.aexol --framework react-router
4. Implement with AI
# Implement with AI agent
aexol implement auth.aexol --agent copilot --language typescript
5. Generate Documentation
# Create API documentation
aexol docs auth.aexol -o API.md
Examples
Complete Project Setup
# 1. Design specification
aexol chat --prompt "Design a todo application"
# 2. Validate design
aexol validate todo.aexol
# 3. Generate artifacts
aexol inference graphql todo.aexol -o schema.graphql
aexol inference prisma todo.aexol -o prisma/schema.prisma
aexol inference routes todo.aexol --framework react-router -o src/routes.tsx
# 4. Implement with AI
aexol implement todo.aexol --agent copilot
# 5. Generate documentation
aexol docs todo.aexol -o TODO_API.md
# 6. Generate E2E tests
aexol inference e2e todo.aexol -o tests/
Getting Help
# Show general help
aexol help
# Show command-specific help
aexol parse --help
aexol chat --help
aexol implement --help
aexol inference --help
# Show inference subcommand help
aexol inference graphql --help
aexol inference prisma --help
For more information, visit: https://github.com/aexol/aexol
