
Ollama integration #41

Merged
wubin1989 merged 7 commits into main from upstream-open-pr-346
Apr 28, 2026

Conversation

@wubin1989
Owner

Migrated open PR from upstream for local review.

Upstream: opencode-ai#346
Upstream head: alexthotse:ollama-integration @ 61d098b

This commit introduces a dedicated Ollama provider to the application. Previously, local models were handled by a generic `local` provider. This change adds a new `ollama` provider, making the integration more explicit and easier to maintain.

The following changes are included:

- A new `ollama.go` file in `internal/llm/provider` to define the `OllamaClient`.
- A new `ollama.go` file in `internal/llm/models` to define Ollama models.
- Updates to `provider.go` and `models.go` to register the new provider and its models.
- A new test file `internal/llm/provider/ollama_test.go` to verify the integration.

A follow-up commit adds several features and improvements:

- Reordered the provider priority to prioritize Ollama, OpenRouter, and Gemini.
- Added a new `config` command to the CLI that allows you to set your API keys and other settings.
- Added support for Hugging Face, Replicate, and Cohere as new LLM providers.
- Added a testing framework for the providers, including a mock provider.
- Improved the error handling in the application by creating a new `errors.go` file with custom error types.
@wubin1989 wubin1989 merged commit a4b69ed into main Apr 28, 2026
