docs: Add FAQ section for common questions about Browser Operator #98

meichuanyi wants to merge 1 commit into
Conversation
Codex usage limits have been reached for code reviews. Please check with the admins of this repo to increase the limits by adding credits.
📝 Walkthrough

README.md is enhanced with a comprehensive FAQ section that addresses common questions about Browser Operator, including its purpose, differentiation from similar tools, getting started guidance, supported AI providers, multi-agent workflows, use cases, offline capabilities, and contribution information. A closing HTML div tag is repositioned in the centered layout block.

Changes: Documentation

Estimated Code Review Effort: 🎯 1 (Trivial) | ⏱️ ~3 minutes
🚥 Pre-merge checks: ✅ 5 checks passed
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
Inline comments:
In `@README.md`:
- Around line 68-69: The README currently makes an absolute privacy claim
("Privacy-first: All processing happens locally... 100% data stays on your
device") which conflicts with documented support for cloud providers; update the
wording around the "Privacy-first" headline and the sentence about local models
to add clear qualifiers (e.g., "By default" or "When using local models such as
Ollama") and explicitly note that when cloud providers (OpenAI/OpenRouter/Groq)
are configured, data may be processed off-device and subject to those providers'
policies; mirror this change for the other instance referenced (lines ~103-104)
and consider adding a short pointer to the FAQ for full details.
> **Privacy-first**: All processing happens locally on your machine. Use local models with Ollama for complete offline operation — 100% data stays on your device.
Avoid absolute privacy claims when cloud providers are supported.
These lines state that all processing is local and 100% of data stays on-device, but the same FAQ also documents cloud providers (OpenAI/OpenRouter/Groq). This can mislead users about data flow and privacy guarantees.
Suggested wording adjustment:

-**Privacy-first**: All processing happens locally on your machine. Use local models with Ollama for complete offline operation — 100% data stays on your device.
+**Privacy-first**: Browser Operator runs locally and supports privacy-sensitive workflows. For fully local processing and on-device data handling, use LiteLLM + Ollama in offline mode.

-Yes. Configure LiteLLM with Ollama to run completely offline with local models. All data remains on your machine — perfect for privacy-sensitive work.
+Yes. Configure LiteLLM with Ollama to run completely offline with local models. In this setup, data stays on your machine.

Also applies to: 103-104
This PR adds a comprehensive FAQ section to the README, covering:
What is Browser Operator: Overview of privacy-focused AI browser with multi-agent capabilities
Platform Comparison: Privacy-first approach, multi-agent automation, extensibility with 100+ AI models
Getting Started: Download instructions, system requirements, AI provider configuration
AI Providers: OpenRouter, OpenAI, Groq, LiteLLM (local models) setup options
Multi-Agent Automation: How specialized agents collaborate on complex web tasks
Use Cases: Research & analysis, shopping & price tracking, business automation
Offline Operation: Complete offline mode with LiteLLM + Ollama
Contributing: Links to contributing guide and build instructions
Help Resources: Documentation, Discord, GitHub Issues, official website
This FAQ helps new users quickly understand the platform's capabilities and privacy features.
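The offline mode described above (LiteLLM + Ollama) can be sketched as a minimal LiteLLM proxy config. This is an illustrative assumption, not a file from this PR: the model name, alias, and endpoint are hypothetical, and the Ollama port shown is Ollama's default.

```yaml
# Hypothetical LiteLLM proxy config routing all requests to a local Ollama model.
# "local-llama", the model choice, and api_base are assumptions — adjust to your setup.
model_list:
  - model_name: local-llama            # alias the browser would be pointed at
    litellm_params:
      model: ollama/llama3             # model served by Ollama, fully on-device
      api_base: http://localhost:11434 # default local Ollama endpoint
```

With a config like this, the browser talks only to the local LiteLLM proxy, which in turn talks only to Ollama on the same machine — matching the "data stays on your machine" qualifier suggested in the review above.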