Configure Doclo providers to work with your preferred AI services.
Provider Configuration
VLM Provider (Vision + Language)
Use createVLMProvider for models that can process images directly:
```typescript
import { createVLMProvider } from '@doclo/providers-llm';

const provider = createVLMProvider({
  provider: 'google',                       // Provider name
  model: 'google/gemini-2.5-flash',         // Model ID
  apiKey: process.env.OPENROUTER_API_KEY!,
  via: 'openrouter'                         // Optional: use OpenRouter gateway
});
```
Configuration Options
| Option | Type | Description |
|---|---|---|
| provider | string | Provider name: 'openai', 'anthropic', 'google', 'xai' |
| model | string | Model ID (OpenRouter or native format) |
| apiKey | string | API key for authentication |
| via | 'openrouter' | Optional: route through OpenRouter |
OpenRouter vs Native
Via OpenRouter
Native Access
Use a single API key for all providers:

```typescript
const provider = createVLMProvider({
  provider: 'google',
  model: 'google/gemini-2.5-flash',         // OpenRouter model ID
  apiKey: process.env.OPENROUTER_API_KEY!,
  via: 'openrouter'
});
```

Benefits:
- Single API key for all providers
- Automatic fallback support
- Usage tracking across providers
Use provider-specific API keys:

```typescript
// OpenAI
const openai = createVLMProvider({
  provider: 'openai',
  model: 'gpt-4o',
  apiKey: process.env.OPENAI_API_KEY!
});

// Anthropic
const anthropic = createVLMProvider({
  provider: 'anthropic',
  model: 'claude-sonnet-4-20250514',
  apiKey: process.env.ANTHROPIC_API_KEY!
});

// Google
const google = createVLMProvider({
  provider: 'google',
  model: 'gemini-2.0-flash',
  apiKey: process.env.GOOGLE_API_KEY!
});
```

Benefits:
- Lower latency
- Direct pricing (no markup)
Provider with Fallback
Use buildLLMProvider for production workloads with automatic fallback and retry:
```typescript
import { buildLLMProvider } from '@doclo/providers-llm';

const provider = buildLLMProvider({
  providers: [
    {
      provider: 'openai',
      model: 'openai/gpt-4.1',
      apiKey: process.env.OPENROUTER_API_KEY!,
      via: 'openrouter'
    },
    {
      provider: 'anthropic',
      model: 'anthropic/claude-haiku-4.5',
      apiKey: process.env.OPENROUTER_API_KEY!,
      via: 'openrouter'
    }
  ],
  maxRetries: 3,
  retryDelay: 1000,
  useExponentialBackoff: true
});
```
Fallback Options
| Option | Type | Default | Description |
|---|---|---|---|
| providers | array | Required | List of providers in priority order |
| maxRetries | number | 3 | Max retries per provider |
| retryDelay | number | 1000 | Base delay between retries (ms) |
| useExponentialBackoff | boolean | true | Increase delay exponentially (see the sketch below) |
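The exact retry schedule is internal to buildLLMProvider, but with exponential backoff enabled the wait before each retry grows from the base retryDelay. The sketch below assumes a common doubling schedule (delay = retryDelay × 2^attempt); treat it as an illustration of the option, not the library's guaranteed behavior.

```typescript
// Illustrative only: one common exponential backoff schedule.
// Assumes the delay doubles on each attempt, starting from retryDelay.
function backoffDelayMs(attempt: number, retryDelay = 1000): number {
  return retryDelay * 2 ** attempt;
}

// attempt 0 -> 1000 ms, attempt 1 -> 2000 ms, attempt 2 -> 4000 ms
console.log([0, 1, 2].map((attempt) => backoffDelayMs(attempt)));
```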
OCR Provider
Configure OCR providers for document parsing:
```typescript
import { suryaProvider, markerProvider } from '@doclo/providers-datalab';

// Surya OCR
const surya = suryaProvider({
  endpoint: 'https://www.datalab.to/api/v1/ocr',
  apiKey: process.env.DATALAB_API_KEY!
});

// Marker OCR
const marker = markerProvider({
  endpoint: 'https://www.datalab.to/api/v1/marker',
  apiKey: process.env.DATALAB_API_KEY!
});
```
API Key Best Practices
Security
Never commit API keys to version control. Always use environment variables.
```typescript
// Good: use environment variables
const apiKey = process.env.OPENROUTER_API_KEY!;

// Bad: hardcoded keys
const apiKey = 'sk-or-v1-abc123...';
```
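If you prefer to fail fast when a key is missing rather than rely on the non-null assertion, a small helper like the one below (plain Node.js, not part of Doclo) can validate the environment at startup:

```typescript
// Hypothetical helper: throws early if a required key is not set.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

const apiKey = requireEnv('OPENROUTER_API_KEY');
```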
Key Rotation
Store keys in environment variables for easy rotation:
```bash
# .env.local
OPENROUTER_API_KEY=sk-or-v1-current-key
OPENROUTER_API_KEY_BACKUP=sk-or-v1-backup-key
```
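During a rotation you can fall back to the backup key without a code change. The snippet below is one way to do it and assumes the variable names from the example above:

```typescript
// Prefer the current key; fall back to the backup key during rotation.
const apiKey =
  process.env.OPENROUTER_API_KEY ?? process.env.OPENROUTER_API_KEY_BACKUP;

if (!apiKey) {
  throw new Error('No OpenRouter API key configured');
}
```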
Multiple Environments
Use different keys for development and production:
```bash
# .env.development
OPENROUTER_API_KEY=sk-or-v1-dev-key

# .env.production
OPENROUTER_API_KEY=sk-or-v1-prod-key
```
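How these files get loaded depends on your runtime; many frameworks pick the right file automatically. If you manage env files yourself, one option (an assumption, not a Doclo requirement) is to load them with the dotenv package based on NODE_ENV:

```typescript
import dotenv from 'dotenv';

// Load .env.development or .env.production depending on NODE_ENV.
// Only needed if your framework does not load env files for you.
dotenv.config({ path: `.env.${process.env.NODE_ENV ?? 'development'}` });
```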
Model Selection
By Speed
| Model | Provider | Speed | Cost |
|---|---|---|---|
| Gemini Flash 2.5 | Google | Fast | Low |
| Claude Haiku 4.5 | Anthropic | Fast | Low |
| Grok 4 Fast | xAI | Fast | Medium |
By Accuracy
| Model | Provider | Accuracy | Cost |
|---|---|---|---|
| GPT-4o | OpenAI | High | High |
| Claude Sonnet 4 | Anthropic | High | High |
| Gemini Pro | Google | High | Medium |
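If you want to switch between a fast and a more accurate configuration from one place, a small helper like the sketch below keeps the choice centralized. It reuses the OpenRouter model IDs shown earlier in this guide as examples; substitute whichever models fit your workload.

```typescript
import { createVLMProvider } from '@doclo/providers-llm';

// Illustrative helper: pick a provider configuration by priority.
// Model IDs are examples taken from this guide, not recommendations.
function makeProvider(priority: 'fast' | 'accurate') {
  const fast = priority === 'fast';
  return createVLMProvider({
    provider: fast ? 'google' : 'openai',
    model: fast ? 'google/gemini-2.5-flash' : 'openai/gpt-4.1',
    apiKey: process.env.OPENROUTER_API_KEY!,
    via: 'openrouter'
  });
}
```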
Recommended Starting Point
```typescript
// Good balance of speed, accuracy, and cost
const provider = createVLMProvider({
  provider: 'google',
  model: 'google/gemini-2.5-flash',
  apiKey: process.env.OPENROUTER_API_KEY!,
  via: 'openrouter'
});
```