Browser Prototype for On-Device AI
Browser.AI brings powerful AI models directly to your browser, keeping your data private and enabling faster, more responsive AI interactions without cloud dependency.
Browser.AI is an open-source project that highlights the power of on-device AI in the browser. The prototype provides a simple API on the window object that lets developers interact with AI models directly, with no cloud dependency. This project exists to demonstrate the potential of on-device AI, which I outlined in a proposal to the W3C.
The API is exposed through the window.ai object.
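For example, a page can feature-detect the prototype before calling it. The fallback handling below is an illustrative sketch, not part of the API:

// Minimal sketch: check that the prototype API is present before using it.
if ('ai' in window) {
  const models = await window.ai.permissions.models();
  console.log(models);
} else {
  console.warn('Browser.AI is not available in this browser.');
}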
The permissions object provides methods to interact with the permissions system.
The models method returns a list of available models.
await window.ai.permissions.models();
[
{
"model": "llama3.2",
"available": true
},
{
"model": "llama3.1",
"available": true
},
{
"model": "gemma2",
"available": true
}
]
The request method requests access to a specific model.
await window.ai.permissions.request({ model: 'llama3.1' });
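A typical flow, sketched below, first checks availability with models and only then requests access. The lookup and the guard are illustrative, not part of the API:

// Sketch: request access only if the model is reported as available.
const models = await window.ai.permissions.models();
const target = models.find((m) => m.model === 'llama3.1' && m.available);
if (target) {
  await window.ai.permissions.request({ model: target.model });
}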
The model object provides methods to interact with a specific model.
The connect method connects to a specific model and returns a session.
const session = await window.ai.model.connect({ model: 'llama3.1' });
The session object provides methods to interact with the model session.
The chat method sends a list of messages to the model and returns its response.
await session.chat({ messages: [ { role: 'user', content: 'hello!' } ]});
{
"id": "fn6cq1zciy",
"choices": [
{
"message": {
"role": "assistant",
"content": "Hello! How are you today? Is there something I can help you with or would you like to chat?"
},
"finish_reason": "stop"
}
],
"created": "2024-10-11T01:14:37.204701Z",
"model": "llama3.1",
"usage": {
"total_duration": 6854656244,
"load_duration": 41462875,
"prompt_eval_count": 12,
"prompt_eval_duration": 955155000,
"eval_count": 23,
"eval_duration": 5856371000
}
}
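As the response above shows, the reply text sits under choices[0].message.content, so reading it back might look like this (error handling omitted):

// Sketch: pull the assistant's reply out of the chat response.
const response = await session.chat({ messages: [{ role: 'user', content: 'hello!' }] });
console.log(response.choices[0].message.content);
// "Hello! How are you today? ..."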
The embed method returns embeddings for the given input.
await session.embed({ input: "Hello world" });
{
"model": "llama3.1",
"embeddings": [
[
-0.0070958203,
-0.019203912,
...
]
]
}
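Because embed returns an array of embedding vectors, one common use is comparing two inputs. The cosine-similarity math below is a sketch layered on top of the API, not part of it:

// Sketch: compare two inputs by the cosine similarity of their embeddings.
const a = (await session.embed({ input: 'Hello world' })).embeddings[0];
const b = (await session.embed({ input: 'Goodbye world' })).embeddings[0];
const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
const norm = (v) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
console.log(dot / (norm(a) * norm(b)));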
The info method provides details about the model.
const info = await window.ai.model.info({ model: 'llama3.1' });
{
"model": "llama3.1",
"license": "[Model License]",
"details": {
"parent_model": "",
"format": "gguf",
"family": "llama",
"families": [
"llama"
],
"parameter_size": "8.0B",
"quantization_level": "Q4_0"
}
}
Check out the project on GitHub: Browser.AI GitHub Repository
Learn more about the proposal we submitted to the W3C: Browser.AI W3C Proposal