# Quick Start

Make your first AI request with WebLLM in 5 minutes.
## Your First AI Request
This guide will walk you through making your first AI request with WebLLM.
## Prerequisites
- Node.js 18+ installed
- WebLLM extension installed (or daemon running)
- At least one provider configured (API key or local model)
## Step 1: Install WebLLM
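Assuming the package is published under the name `webllm` (verify the exact name on the npm registry):

```shell
npm install webllm
```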
## Step 2: Import and Use
Create a new file called `app.js`:
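A minimal sketch of what `app.js` might look like, assuming the package is imported as `webllm` and exposes the `generateText` function this guide refers to — the import path, option names, and response fields are assumptions to check against the API reference:

```javascript
// app.js — a minimal first request.
// Assumption: the package exposes generateText(); verify the exact
// import path and options against the installed version.
import { generateText } from 'webllm';

const response = await generateText({
  prompt: 'Explain what WebLLM does in one sentence.',
});

console.log(response.text);
```

Top-level `await` requires ES modules, so set `"type": "module"` in your `package.json`.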
Run it:
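With Node.js 18+ this is just:

```shell
node app.js
```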
That's it! You've made your first AI request with WebLLM.
## Understanding the Response
The `generateText` function returns a response object rather than a bare string, so alongside the generated text you can inspect metadata about the request.
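The exact fields vary by version; a response might look roughly like this, where every property name is an assumption — consult the API reference for the real shape:

```javascript
// Hypothetical response shape — field names are assumptions, not confirmed.
const response = {
  text: 'WebLLM routes your request to the best available model.', // the generated text
  model: 'api/gpt-4o-mini',                                        // which model served it
  usage: { promptTokens: 12, completionTokens: 11 },               // token accounting
};

console.log(response.text);
```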
## Step 3: Task-Based Routing
WebLLM can automatically select the best model for your task:
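Conceptually, the router maps a task hint to whichever configured model suits it best. A toy sketch of that idea — the task labels and model identifiers are invented, and this is not WebLLM's actual implementation:

```javascript
// Toy sketch of task-based routing — not WebLLM internals.
// Task labels and model identifiers below are invented for illustration.
const routes = new Map([
  ['code', 'local/codellama'],
  ['summarize', 'api/gpt-4o-mini'],
  ['chat', 'api/claude-haiku'],
]);

function selectModel(task) {
  // Unknown or missing task hints fall back to a general chat model.
  return routes.get(task) ?? routes.get('chat');
}

console.log(selectModel('summarize')); // prints the summarization route
```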
## Step 4: Streaming Responses
For real-time text generation:
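The usual consumption pattern is an async iterable of text chunks. In the sketch below, a stub generator stands in for WebLLM's streaming call (the name `streamText` and the chunk-based shape are assumptions) so the loop is runnable on its own:

```javascript
// Stub standing in for WebLLM's assumed streaming API, so the
// consumption pattern below can run without the real package.
async function* streamText({ prompt }) {
  for (const chunk of ['Streaming ', 'feels ', 'instant.']) {
    yield chunk; // the real API would yield tokens as the model produces them
  }
}

(async () => {
  for await (const chunk of streamText({ prompt: 'Write a short poem.' })) {
    process.stdout.write(chunk); // render each chunk as it arrives
  }
  process.stdout.write('\n');
})();
```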
## Next Steps
Now that you've made your first request, explore more features:
- Browser Usage - Learn about the `navigator.llm` API
- Provider Setup - Configure multiple providers
- Intelligent Routing - Understand how WebLLM selects models
- Examples - See more code examples