Algolia powers search on 18,000+ documentation sites: React, Vue, Tailwind, Stripe, Twilio. Free for open source.
For years, the question has been: "When will Algolia add AI search?"
Their answer with DocSearch v4: "Bring your own LLM."
This is smart. It's also a pattern that could transform how we think about AI features on the web.
## The Documentation AI Problem
AI-powered documentation sounds great:
- "Explain how to set up authentication"
- "Show me an example with TypeScript"
- "What's the difference between these two methods?"
Natural language queries, contextual answers, examples pulled from docs.
The problem: who pays?
Algolia DocSearch serves millions of queries daily across 18,000 sites. If each query triggered an LLM call:
- Cost per query: $0.01-0.05
- Queries per day: ~10 million (estimate)
- Daily cost: $100,000-500,000
- Annual cost: roughly $36.5-182.5 million
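The same back-of-envelope math as a quick script, using the rough estimates above (not Algolia's published numbers):

```js
// Back-of-envelope LLM cost estimate; inputs are rough estimates,
// not Algolia's actual figures.
const costPerQuery = { low: 0.01, high: 0.05 }; // USD per LLM call
const queriesPerDay = 10_000_000;               // ~10M/day estimate

const dailyLow = costPerQuery.low * queriesPerDay;   // 100,000
const dailyHigh = costPerQuery.high * queriesPerDay; // 500,000

console.log(`Daily: $${dailyLow.toLocaleString()}-$${dailyHigh.toLocaleString()}`);
console.log(`Annual: $${(365 * dailyLow).toLocaleString()}-$${(365 * dailyHigh).toLocaleString()}`);
// Daily: $100,000-$500,000
// Annual: $36,500,000-$182,500,000
```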
That's not viable for a free service.
## Algolia's BYOLLM Approach
DocSearch v4's solution: the site owner brings their LLM.
```js
// DocSearch v4 configuration
docsearchV4.init({
  // Traditional search config
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_SEARCH_KEY',
  indexName: 'docs',

  // NEW: AI-powered features
  ai: {
    provider: 'openai',
    apiKey: process.env.OPENAI_KEY,
    model: 'gpt-4-turbo'
  }
});
```
What this means:
- Algolia provides the search infrastructure (free for open source)
- Site owner provides the LLM (they pay their own API costs)
- Users get AI-powered search
- Algolia's costs stay manageable
## Why BYOLLM Makes Sense for Docs

### Open Source Projects
React docs, Vue docs, Tailwind docs—they're maintained by communities with limited budgets.
- Traditional AI: the project pays for API calls → not affordable
- BYOLLM: the project brings its own key → the project controls costs
Projects can choose (a config tweak is sketched after this list):
- Cheap models for high-volume traffic
- Better models for critical paths
- No AI at all when the budget doesn't allow it
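For example, a high-volume community project might point the same assumed `ai` block at a cheaper model (the model name here is just an illustration):

```js
// Hypothetical: the DocSearch v4 config from above with a cheaper model,
// for a high-volume, budget-constrained open source project.
docsearchV4.init({
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_SEARCH_KEY',
  indexName: 'docs',
  ai: {
    provider: 'openai',
    apiKey: process.env.OPENAI_KEY,
    model: 'gpt-4o-mini' // far cheaper per token than gpt-4-turbo
  }
});
```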
### Corporate Docs
Stripe, Twilio, Vercel—they have budgets but also security requirements.
- Traditional AI: data flows through a third party → compliance issues
- BYOLLM: their API key, their data relationship → clean compliance
### Cost Distribution

Traditional model:

```text
┌──────────────────────────────────────┐
│ Algolia pays for ALL AI queries      │
│ 18,000 sites × millions of queries   │
│ = Unsustainable                      │
└──────────────────────────────────────┘
```

BYOLLM model:

```text
┌──────────────────────────────────────┐
│ Each site pays for THEIR AI queries  │
│ Costs distributed to those with need │
│ = Sustainable                        │
└──────────────────────────────────────┘
```
## The Pattern: From Site-Pays to User-Pays
Algolia's BYOLLM shifts costs from Algolia to site owners. But there's a further shift possible: to users.
Current:
Algolia (free) → Site Owner (pays LLM) → User (free)
Possible:
Algolia (free) → Site Owner (no LLM cost) → User (brings their own)
If users bring their own AI:
- Site owner pays nothing for AI
- Users use AI they already pay for
- Works across ALL doc sites
- No per-site API key management
## How User-Powered Doc AI Would Work
```js
// Future DocSearch with browser AI support
docsearchV4.init({
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_SEARCH_KEY',
  indexName: 'docs',

  ai: {
    // Try the user's browser AI first
    browserFirst: true,

    // Fall back to the site's LLM if the user doesn't have AI
    fallback: {
      provider: 'openai',
      apiKey: process.env.OPENAI_KEY
    }
  }
});
```
User experience:
- User searches the React docs
- DocSearch checks for `navigator.llm`
- If available: the user's AI processes the query
- If not: falls back to the site's API (or traditional search)
Cost distribution (the routing is sketched after this list):
- Users with ChatGPT Plus: Use their subscription ($0 to site)
- Users with Ollama: Use local AI ($0 to site)
- Users without AI: Site's fallback (site pays)
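Here is what that three-tier routing might look like; a minimal sketch, assuming the extension-provided `navigator.llm` plus hypothetical `fallbackLLM` and `traditionalSearch` helpers on the site side:

```js
// Three-tier routing: the user's browser AI, then the site's fallback
// LLM, then plain keyword search. fallbackLLM and traditionalSearch
// are hypothetical site-side helpers.
async function answerQuery(query, context, fallbackLLM) {
  if ('llm' in navigator) {
    // Tier 1: user's own AI (ChatGPT Plus, Ollama, ...) - $0 to the site
    return navigator.llm.prompt(`${query}\n\nContext: ${context}`);
  }
  if (fallbackLLM) {
    // Tier 2: site's own LLM key - the site pays per query
    return fallbackLLM.complete(`${query}\n\nContext: ${context}`);
  }
  // Tier 3: no AI anywhere - traditional search
  return traditionalSearch(query);
}
```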
Most power users (who ask complex questions) already have AI subscriptions. They'd use their own.
## Implementation: Today vs. Tomorrow

### Today: Extension-Based

The WebLLM extension provides `navigator.llm`. Doc sites can detect it:
```js
// Use the user's browser AI when available; otherwise fall back
// to traditional search. siteName is passed in for the prompt.
async function searchWithAI(query, context, siteName) {
  if ('llm' in navigator) {
    // User has AI - use it
    return navigator.llm.prompt(
      `Answer this question about the ${siteName} docs: ${query}\n\nContext: ${context}`
    );
  }
  // Fallback to traditional search
  return traditionalSearch(query);
}
```
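Wiring that into a search box could look like this; the element ID, `getPageContext`, and `renderAnswer` are hypothetical site-specific pieces:

```js
// Hypothetical wiring: run AI-assisted search when a query is submitted.
const input = document.querySelector('#docsearch-input');
input.addEventListener('change', async (event) => {
  const answer = await searchWithAI(event.target.value, getPageContext(), 'React');
  renderAnswer(answer); // site-specific rendering
});
```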
### Tomorrow: Native Browser Support
When browsers ship native AI APIs:
- More users have AI available
- No extension required
- Seamless experience
Doc sites that build for `navigator.llm` today will work better tomorrow.
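A forward-compatible detection helper is easy to sketch today; the native branch below is a placeholder, since no browser AI API has been standardized yet:

```js
// Prefer whatever AI the user already has. Today that means the
// extension-provided navigator.llm; the native branch is a placeholder
// for whatever API browsers eventually standardize.
function getBrowserAI() {
  if ('llm' in navigator) {
    return (text) => navigator.llm.prompt(text); // extension path, today
  }
  // Tomorrow: detect a native browser AI API here.
  return null; // caller should fall back to traditional search
}
```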
## Cost Comparison
For a popular documentation site (1M monthly visitors):
| Model | Site-Paid AI Queries/Month | Cost to Site/Month |
|---|---|---|
| Traditional (site pays all) | 100K | $2,000-10,000 |
| BYOLLM (site provides key) | 100K | $2,000-10,000 |
| User-powered (50% have AI) | 50K | $1,000-5,000 |
| User-powered (90% have AI) | 10K | $200-1,000 |
As AI subscription rates increase, site costs decrease.
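The table boils down to a one-line cost model (the figures above imply a per-query price of roughly $0.02-0.10):

```js
// Cost model behind the table: the site pays only for queries from
// users without their own AI. Inputs are the rough estimates above.
function monthlySiteCost(totalAIQueries, shareWithOwnAI, costPerQuery) {
  return totalAIQueries * (1 - shareWithOwnAI) * costPerQuery;
}

monthlySiteCost(100_000, 0.0, 0.02); // $2,000 - site pays all, low end
monthlySiteCost(100_000, 0.5, 0.02); // $1,000 - 50% bring their own AI
monthlySiteCost(100_000, 0.9, 0.10); // $1,000 - 90% bring AI, high end
```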
## What Doc Platforms Should Build

### Algolia

Extend BYOLLM to support browser AI:
- Detect `navigator.llm`
- Route to the user's AI when available
- Fall back to the site's API
- Maintain a consistent UX
### ReadTheDocs

Add AI features with a user-first approach:
- Check for browser AI
- Let users configure their own provider
- Offer a basic fallback for users without AI
### GitBook
Same pattern:
- Browser AI detection
- Graceful fallback
- Cost efficiency for publishers
### Docusaurus, VitePress, etc.

A plugin ecosystem for user-powered AI (sketched below):
- `docusaurus-plugin-browser-ai`
- Standard patterns for AI integration
- Works across all Docusaurus sites
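Usage could be as simple as this sketch (the plugin name comes from the list above; it and its options are illustrative, not a published package):

```js
// docusaurus.config.js - hypothetical usage of the proposed plugin
module.exports = {
  plugins: [
    ['docusaurus-plugin-browser-ai', {
      browserFirst: true, // try navigator.llm before anything else
      fallback: null,     // no site-paid LLM; degrade to plain search
    }],
  ],
};
```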
## The Broader Lesson
Algolia's BYOLLM is a step in the right direction. But it still requires site owners to manage API keys and costs.
User-powered AI removes that burden:
- Users bring AI they already pay for
- Sites get AI features for free
- Costs scale with capability, not traffic
This pattern applies beyond documentation:
- Recipe sites
- Educational platforms
- Content publishers
- E-commerce
Anyone who wants AI features but can't afford per-query costs.
## Conclusion
Algolia's BYOLLM is smart infrastructure design. It distributes costs to those who can afford them.
The next step: distributing costs to users who already pay for AI.
With browser AI standards, this becomes possible:
- Users configure their AI once
- Every site benefits
- Nobody manages API keys per-site
- Publisher costs trend toward zero
Documentation is a perfect test case. The sites that adopt browser AI today will establish the patterns that work for the broader web tomorrow.