# Choose Your AI Provider - OpenAI, Anthropic, or Local Models

The `dal ai` commands support multiple AI providers. Choose what works best for you:
## OpenAI

Best for: Production use, highest quality responses

**1. Get an API key** (keys start with `sk-proj-` or `sk-`)

**2. Set the environment variable:**

```bash
export OPENAI_API_KEY="sk-proj-..."

# Add to your shell config for persistence
echo 'export OPENAI_API_KEY="sk-proj-..."' >> ~/.zshrc
source ~/.zshrc
```

**3. Use DAL:**

```bash
dal ai code "Create a DeFi lending protocol"
dal ai explain mycontract.dal
dal ai audit token.dal
```

**Choose a model** (default: `gpt-4`):

```bash
export OPENAI_MODEL="gpt-4"           # Most capable
export OPENAI_MODEL="gpt-3.5-turbo"   # Faster, cheaper
```

## Anthropic

Best for: Long context, detailed analysis, code review
**1. Get an API key** (keys start with `sk-ant-`)

**2. Set the environment variable:**

```bash
export ANTHROPIC_API_KEY="sk-ant-..."

# Add to your shell config for persistence
echo 'export ANTHROPIC_API_KEY="sk-ant-..."' >> ~/.zshrc
source ~/.zshrc
```

**3. Use DAL:**

```bash
dal ai code "Create a token contract"
dal ai review complex_system.dal
dal ai audit defi_protocol.dal
```

**Choose a model** (default: `claude-3-5-sonnet-20241022`):

```bash
export ANTHROPIC_MODEL="claude-3-5-sonnet-20241022"   # Most capable
export ANTHROPIC_MODEL="claude-3-opus-20240229"       # Highest intelligence
export ANTHROPIC_MODEL="claude-3-haiku-20240307"      # Fastest, cheapest
```

## Local Models

Best for: Privacy, offline use, free unlimited usage
**1. Install Ollama:**

```bash
# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh

# Windows
# Download from https://ollama.com/download
```

**2. Start the Ollama server:**

```bash
ollama serve
```

**3. Download a model:**
```bash
# Code generation models
ollama pull codellama        # 7B, good for code
ollama pull deepseek-coder   # Excellent for coding
ollama pull phind-codellama  # Optimized for code

# General purpose models
ollama pull llama2           # Good all-around
ollama pull mistral          # Fast and capable
ollama pull llama3           # Latest, most capable

# Small/fast models
ollama pull tinyllama        # Very fast
```

**4. Configure DAL:**
```bash
export DAL_AI_ENDPOINT="http://localhost:11434/api/generate"
export DAL_AI_MODEL="codellama"   # or deepseek-coder, llama2, etc.

# Add to shell config
echo 'export DAL_AI_ENDPOINT="http://localhost:11434/api/generate"' >> ~/.zshrc
echo 'export DAL_AI_MODEL="codellama"' >> ~/.zshrc
source ~/.zshrc
```

**5. Use DAL (offline!):**
```bash
dal ai code "Create a REST API"
dal ai explain myfile.dal
dal ai test contract.dal
```

| Model | Size | Speed | Quality | Best For |
|---|---|---|---|---|
| codellama | 7B | Fast | Good | Code generation |
| deepseek-coder | 6.7B | Fast | Excellent | Code, best quality |
| llama3 | 8B | Medium | Excellent | General purpose |
| mistral | 7B | Fast | Good | General purpose |
| phind-codellama | 34B | Slow | Excellent | Complex code (needs GPU) |
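Once `DAL_AI_ENDPOINT` is configured, you can sanity-check the local server independently of DAL. This sketch derives the server's base URL from the endpoint value shown above (assuming the standard `http://host:port/api/generate` shape); the commented-out probe uses Ollama's `/api/tags` endpoint, which lists the models you have pulled.

```bash
# Derive the server base URL from the configured endpoint (sketch;
# assumes the endpoint shape shown above)
DAL_AI_ENDPOINT="http://localhost:11434/api/generate"
base="${DAL_AI_ENDPOINT%/api/generate}"
echo "$base"   # -> http://localhost:11434

# Liveness probe (uncomment once `ollama serve` is running;
# /api/tags lists the models you have pulled):
# curl -sf "$base/api/tags" > /dev/null && echo "Ollama is up"
```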
## Using Multiple Providers

You can configure all three, and DAL will choose automatically:
```bash
# Set all three
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export DAL_AI_ENDPOINT="http://localhost:11434/api/generate"

# Priority order: OpenAI > Anthropic > Local > Fallback
dal ai code "test"   # Uses OpenAI (first priority)
```

Override the priority for a single command by blanking the higher-priority variables:

```bash
# Use Anthropic instead of OpenAI
OPENAI_API_KEY="" dal ai code "test"

# Use local model
OPENAI_API_KEY="" ANTHROPIC_API_KEY="" dal ai code "test"

# Force fallback mode (no AI)
OPENAI_API_KEY="" ANTHROPIC_API_KEY="" DAL_AI_ENDPOINT="" dal ai code "test"
```
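The priority order described above can be sketched as a small shell helper. This is illustrative only - it mirrors the documented order (OpenAI > Anthropic > Local > Fallback), not DAL's actual implementation.

```bash
# Pick a provider in the documented order (sketch, not DAL's real logic)
pick_provider() {
  if [ -n "$OPENAI_API_KEY" ]; then
    echo "openai"
  elif [ -n "$ANTHROPIC_API_KEY" ]; then
    echo "anthropic"
  elif [ -n "$DAL_AI_ENDPOINT" ]; then
    echo "local"
  else
    echo "fallback"
  fi
}

OPENAI_API_KEY=""
ANTHROPIC_API_KEY="sk-ant-example"
DAL_AI_ENDPOINT=""
pick_provider   # -> anthropic
```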
You can also keep keys in a `.env` file:

```bash
# .env file
OPENAI_API_KEY=sk-proj-...
ANTHROPIC_API_KEY=sk-ant-...
DAL_AI_ENDPOINT=http://localhost:11434/api/generate
DAL_AI_MODEL=codellama
```
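A common way to load a `.env` file into the current shell is `set -a` plus `source` - a standard shell pattern, sketched below with placeholder values; DAL may or may not read `.env` files itself, so check your setup.

```bash
# Write an example .env (values are placeholders, not real keys)
envfile=$(mktemp)
cat > "$envfile" <<'EOF'
DAL_AI_ENDPOINT=http://localhost:11434/api/generate
DAL_AI_MODEL=codellama
EOF

set -a          # export every variable assigned while sourcing
. "$envfile"
set +a
rm -f "$envfile"

echo "$DAL_AI_MODEL"   # -> codellama
```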
Mix providers per task:

```bash
# Use OpenAI for code generation (fast)
export OPENAI_API_KEY="sk-..."

# But use local for security audits (private)
dal ai code "Create API"                      # Uses OpenAI
OPENAI_API_KEY="" dal ai audit contract.dal   # Uses local (private)
```

## Quick Test

OpenAI:

```bash
export OPENAI_API_KEY="your-key"
dal ai code "print hello world"
```

Anthropic:

```bash
export ANTHROPIC_API_KEY="your-key"
dal ai code "print hello world"
```

Local (Ollama):

```bash
ollama serve
export DAL_AI_ENDPOINT="http://localhost:11434/api/generate"
export DAL_AI_MODEL="codellama"
dal ai code "print hello world"
```

## Troubleshooting

- **Key not recognized?** Check that it is set with `echo $OPENAI_API_KEY`; if empty, `export` the variable. OpenAI keys start with `sk-proj-` or `sk-`.
- **Local model not responding?** Make sure the server is running with `ollama serve`, then verify the endpoint with `curl http://localhost:11434/api/generate`.

## Cost Estimates

| Provider | Model | Monthly Cost |
|---|---|---|
| OpenAI | GPT-3.5 | ~$5 |
| OpenAI | GPT-4 | ~$50 |
| Anthropic | Claude Haiku | ~$2 |
| Anthropic | Claude Sonnet | ~$15 |
| Anthropic | Claude Opus | ~$75 |
| Local | Any | $0 |
| Provider | Model | Monthly Cost |
|---|---|---|
| OpenAI | GPT-3.5 | ~$50 |
| OpenAI | GPT-4 | ~$500 |
| Anthropic | Claude Haiku | ~$20 |
| Anthropic | Claude Sonnet | ~$150 |
| Local | Any | $0 |
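The estimates above depend heavily on volume. A back-of-envelope calculation looks like the sketch below - every number here is hypothetical; check your provider's current pricing before budgeting.

```bash
# Rough monthly cost estimate (all numbers hypothetical)
requests_per_day=100
tokens_per_request=2000
price_per_million_tokens_cents=150   # hypothetical: $1.50 per 1M tokens

monthly_tokens=$((requests_per_day * tokens_per_request * 30))
monthly_cost_cents=$((monthly_tokens * price_per_million_tokens_cents / 1000000))
echo "~\$$((monthly_cost_cents / 100)) per month"   # -> ~$9 per month
```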
## Configuration Summary

```bash
# OpenAI
export OPENAI_API_KEY="sk-proj-..."
export OPENAI_MODEL="gpt-4"                           # optional

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
export ANTHROPIC_MODEL="claude-3-5-sonnet-20241022"   # optional

# Local (Ollama)
export DAL_AI_ENDPOINT="http://localhost:11434/api/generate"
export DAL_AI_MODEL="codellama"                       # optional

# Test
dal ai code "hello world"
```

## Security Best Practices

**Never commit API keys to git:**
```bash
echo ".env" >> .gitignore
echo "*.key" >> .gitignore
```

**Use environment variables, not hard-coded keys:**

```bash
# Good
export OPENAI_API_KEY="sk-..."

# Bad - NEVER do this
# let api_key = "sk-..."
```

**Rotate keys regularly.**

**Use separate keys for dev/prod:**

```bash
# Development
export OPENAI_API_KEY="sk-dev-..."

# Production
export OPENAI_API_KEY="sk-prod-..."
```

**Monitor usage.**
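One lightweight way to catch hard-coded keys before they land in git is a grep over your tree. The function and regex below are illustrative (the pattern matches the `sk-proj-` / `sk-ant-` prefixes mentioned above); tune it to your providers' actual key formats.

```bash
# Scan a directory for strings that look like API keys
# (illustrative regex; tune to your providers' key formats)
scan_for_keys() {
  grep -rnE 'sk-(proj-|ant-)?[A-Za-z0-9]{8,}' "$1" 2>/dev/null
}

dir=$(mktemp -d)
printf 'let api_key = "sk-proj-abcdefgh12345678"\n' > "$dir/leaky.rs"
if scan_for_keys "$dir" > /dev/null; then
  echo "possible key found - do not commit"
fi
rm -rf "$dir"
```

Wiring this into a pre-commit hook makes the check automatic.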
## Need Help?

- docs/development/AI_API_INTEGRATION.md
- docs/CLI_QUICK_REFERENCE.md
- docs/development/CLI_PHASE3_COMPLETE.md
- Can't get API keys?
- Want to contribute?