
---

## ❓ FAQ

### What is Browser Operator?

Browser Operator is an open-source, privacy-focused AI browser that transforms how you work on the web. It is a multi-agent platform for research, analysis, and automation that runs locally on your machine.

### How is Browser Operator different from other AI browsers?

**Privacy-first**: Browser Operator runs locally and supports privacy-sensitive workflows. For fully local processing with all data kept on your device, use LiteLLM with Ollama in offline mode; when a cloud provider (OpenRouter, OpenAI, Groq) is configured, requests are sent to that provider and handled under its policies.


**Multi-Agent Automation**: Specialized AI agents work together to handle complex web tasks autonomously, unlike single-agent tools.

**Extensible**: Compatible with 100+ AI models (Claude, GPT, Gemini, Llama, and more) through OpenRouter, OpenAI, and LiteLLM. No platform lock-in.

### How do I get started?

1. Download from [Releases](https://github.com/BrowserOperator/browser-operator-core/releases) for macOS or Windows
2. System Requirements: macOS 10.15+ or Windows 10 (64-bit) or later, 8GB RAM (16GB recommended), 2GB free disk space
3. Configure AI provider in Settings → Select provider → Enter credentials → Choose model → Save

### What AI providers are supported?

| Provider | Best For | Setup |
|----------|----------|-------|
| **OpenRouter** | Beginners, 400+ models (Claude, GPT, Gemini, Llama) | Sign in through browser |
| **OpenAI** | GPT models | API key required |
| **Groq** | Ultra-fast inference | API key required |
| **LiteLLM** | Local models, privacy, advanced users | Proxy + Ollama setup |
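
As an illustration of the LiteLLM path in the table above (a sketch, not official setup docs): a running LiteLLM proxy exposes an OpenAI-compatible HTTP API, by default on port 4000, so any OpenAI-style client can query it. The model name `ollama/llama3` is assumed here for illustration.

```shell
# Assumes a LiteLLM proxy is already running locally (default port 4000)
# and fronting an Ollama model; the model name below is illustrative.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "ollama/llama3",
    "messages": [{"role": "user", "content": "Summarize this page in one line."}]
  }'
```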

### How does Multi-Agent Automation work?

Browser Operator uses specialized AI agents that collaborate to handle complex web tasks. Each agent focuses on specific capabilities (research, analysis, automation), working together through an orchestrated workflow.

### What can I use Browser Operator for?

**Research & Analysis**: Literature reviews, data collection, competitive intelligence, market research

**Shopping & Price Tracking**: Product comparisons, review analysis, price monitoring

**Business Automation**: Talent sourcing, lead generation, compliance audits

### Can I use Browser Operator offline?

Yes. Configure LiteLLM with Ollama to run completely offline with local models. In this setup, all data stays on your machine, making it well suited for privacy-sensitive work.
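
The setup described above can be sketched as shell commands (illustrative only, assuming Ollama and LiteLLM are already installed; `llama3` is just an example model):

```shell
# 1. Pull a local model with Ollama (example model name).
ollama pull llama3

# 2. Start a LiteLLM proxy that fronts the Ollama model.
#    By default the proxy listens on http://localhost:4000.
litellm --model ollama/llama3

# 3. In Browser Operator, select LiteLLM as the provider and point it
#    at the proxy endpoint (http://localhost:4000).
```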

### How do I contribute?

- Submit pull requests, report bugs, improve docs, or share ideas
- See [Contributing Guide](CONTRIBUTING.md)
- Check [build instructions](https://docs.browseroperator.io) in documentation

### Where can I get help?

- 📖 [Documentation](https://docs.browseroperator.io)
- 💬 [Discord](https://discord.gg/fp7ryHYBSY)
- 🐛 [GitHub Issues](https://github.com/BrowserOperator/browser-operator-core/issues)
- 🌐 [Official Website](https://browseroperator.io)

---

## 👥 Community & Contributing

**Get Help**
Browser Operator is released under the [BSD-3-Clause License](LICENSE).

[browseroperator.io](https://browseroperator.io)

</div>