Eclipse extension that provides inline completion backed by LLMs
OpenAI-compatible proxy that makes local LLM tool calling actually work — schema validation, retry with feedback, model escalation, and context condensing
Docker service stack for a comprehensive local AI experience, running on typical home hardware with a single GPU. DevOps dashboards included. WIP.
💻 Enhance your coding experience with Mistral Vibe, an open-source CLI assistant that lets you interact with your projects using natural language.
OllamaLLM Desktop App. Golang API. Utilizes 10 LLMs with a ChatGPT-like GUI. In its current form it operates as a desktop app / localhost API, but it has the foundation to be scaled up into a complete web app. See the documentation for more.
The open-source agentic coding CLI. Focused on Ollama Cloud Models.
EU-sovereign multi-model LLM serving pipeline — Apertus 70B + Devstral 24B + EuroLLM 22B with EU AI Act Art. 52/53 transparency
Autonomous dev agent that runs on your hardware. No cloud. No API keys. Ships code while you sleep.