Expanding Ollama Buddy: Mistral Codestral Integration
Ollama Buddy now supports Codestral, Mistral AI's specialized code-generation model, which integrates seamlessly into the ollama-buddy ecosystem.
https://github.com/captainflasmr/ollama-buddy
https://melpa.org/#/ollama-buddy
So now we have:
- Local Ollama models - full control, complete privacy
- OpenAI - extensive model options and API maturity
- Claude - reasoning and complex analysis
- Gemini - multimodal capabilities
- Grok - advanced reasoning models
- Codestral - specialized code generation (new)
To get up and running…
First, sign up at Mistral AI and generate an API key from your dashboard.
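If you keep credentials in auth-source, the key can live in your ~/.authinfo.gpg. The host and user values below match the auth-source-pick-first-password call in the configuration that follows; adjust them if you store the key under different names:

```
machine ollama-buddy-codestral login apikey password YOUR-API-KEY-HERE
```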
Add this to your Emacs configuration:
(use-package ollama-buddy
  :bind (("C-c o" . ollama-buddy-menu)
         ("C-c O" . ollama-buddy-transient-menu-wrapper))
  :custom
  (ollama-buddy-codestral-api-key
   (auth-source-pick-first-password :host "ollama-buddy-codestral" :user "apikey"))
  :config
  (require 'ollama-buddy-codestral nil t))
Once configured, Codestral models will appear in your model list with an s: prefix (e.g., s:codestral-latest). You can:
- Select it from the model menu (C-c m)
- Use it with any command that supports model selection
- Switch between local and cloud models on the fly
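If you want Codestral as your default, something like the following should work; the ollama-buddy-default-model variable name is an assumption here, so verify it against your installed version (e.g. via M-x customize-group RET ollama-buddy RET):

```elisp
;; Make Codestral the startup model (variable name assumed;
;; check the ollama-buddy customize group for the exact option).
(setq ollama-buddy-default-model "s:codestral-latest")
```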