A Zoomed-in vc-dir for the Current Directory in dired

May 5, 2026 07:50

I almost always reach for project-vc-dir when I want a VC status overview, and most of the time it is exactly what I want: the whole project laid out in one buffer, every modified, added, and unregistered file in the repo sitting right there, ready to be diffed or committed. But every so often, particularly when I am deep inside a big repository and only really care about a single subdirectory's worth of changes, that project-wide view is, frankly, a bit too much. Too many rows, too much scrolling, too much noise.
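
The core of the idea can be sketched in a few lines, since `vc-dir` already accepts a directory argument; here is a minimal version, where `my/dired-vc-dir` is a hypothetical command name of my own (the `V` binding is also just an illustration, and `keymap-set` assumes Emacs 29+):

```elisp
;; Sketch: open a vc-dir buffer scoped to the directory shown in dired,
;; rather than the whole project root.
(defun my/dired-vc-dir ()
  "Run `vc-dir' on the directory of the current dired buffer."
  (interactive)
  (vc-dir (dired-current-directory)))

(with-eval-after-load 'dired
  (keymap-set dired-mode-map "V" #'my/dired-vc-dir))
```

With that bound, pressing `V` in any dired buffer gives a status view limited to the files under point's directory instead of the whole repository.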

A Tiny Header-Line Tweak: Image Dimensions in image-mode

April 30, 2026 11:00

I have been doing a lot of fiddling with images lately, mostly through dired and image-dired, and one little thing has been bugging me for a while. When I open an image in Emacs, image-mode happily shows me the picture, but it never tells me the one thing I actually want to know: how big is it? Width, height, file size, that sort of thing. You can of course bounce out to a shell and run identify or file, but that feels silly when Emacs already has the image loaded.
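
Since the image is already in the buffer, the built-in `image-size` can report its pixel dimensions directly. A rough sketch of the header-line idea, where `my/image-header` is a hypothetical helper of mine rather than anything built in:

```elisp
;; Sketch: show pixel dimensions and file size in image-mode's header line.
(defun my/image-header ()
  "Return a header-line string with image dimensions and file size."
  (let* ((size (image-size (image-get-display-property) t)) ; (WIDTH . HEIGHT) in pixels
         (bytes (and buffer-file-name
                     (file-attribute-size (file-attributes buffer-file-name)))))
    (format " %dx%d%s"
            (car size) (cdr size)
            (if bytes (concat "  " (file-size-human-readable bytes)) ""))))

(add-hook 'image-mode-hook
          (lambda ()
            (setq header-line-format '((:eval (my/image-header))))))
```

The `:eval` form means the header line stays correct even if the displayed image changes, at the cost of recomputing on each redisplay.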

Highlighting git changes in a buffer with diff-hl

April 21, 2026 08:00

Lately I’ve found myself wanting a better, more fine-grained view of what’s going on in a file under git. For some reason, my default workflow has been to keep jumping in and out of project-vc-dir to check changes. It gets the job done, but honestly it’s a bit of a hassle.
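
For anyone curious, a typical diff-hl setup is only a few lines; this assumes the package is already installed (it is third-party, not built in), and the magit hook line only applies if you use magit:

```elisp
;; Sketch: fringe indicators for uncommitted changes in every buffer.
(global-diff-hl-mode 1)                            ; highlight changed lines
(diff-hl-flydiff-mode 1)                           ; update even before saving
(add-hook 'dired-mode-hook #'diff-hl-dired-mode)   ; change marks in dired too
(add-hook 'magit-post-refresh-hook #'diff-hl-magit-post-refresh)
```

With this in place the fringe shows at a glance which lines are modified, added, or deleted relative to HEAD, without leaving the file.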

Emacs-DIYer: A Built-in dired-collapse Replacement

April 15, 2026 19:20

I have been slowly chipping away at my Emacs-DIYer project, which is basically my ongoing experiment in rebuilding popular Emacs packages using only what ships with Emacs itself: no external dependencies, no MELPA, just the built-in pieces bolted together in a literate README.org that tangles to init.el. The latest addition is a DIY version of dired-collapse from the dired-hacks family, one of those packages I did not realise I leaned on until I started browsing a deeply-nested Java project and felt its absence immediately.

Wiring Flymake Diagnostics into a Follow Mode

April 9, 2026 06:13

Flymake has been quietly sitting in my config for years doing exactly what it says on the tin, squiggly lines under things that are wrong, and I mostly left it alone. But recently I noticed I was doing the same little dance over and over: spot a warning, squint at the modeline counter, run `M-x flymake-show-buffer-diagnostics`, scroll through the list to find the thing I was actually looking at, then flip back. Two windows, zero connection between them.
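
The "follow" idea can be sketched with nothing beyond built-in Flymake: whenever point moves, look up the diagnostic at point and echo it. `my/flymake-follow-mode` and `my/flymake-echo-at-point` are hypothetical names of my own, and this is a minimal sketch rather than the full feature:

```elisp
;; Sketch: echo the Flymake diagnostic under point after every command.
(defun my/flymake-echo-at-point ()
  "Show the text of the Flymake diagnostic at point, if any."
  (when-let* ((diags (flymake-diagnostics (point)))
              (text (flymake-diagnostic-text (car diags))))
    (message "%s" text)))

(define-minor-mode my/flymake-follow-mode
  "Echo the Flymake diagnostic at point as you move around."
  :lighter " FlyFollow"
  (if my/flymake-follow-mode
      (add-hook 'post-command-hook #'my/flymake-echo-at-point nil t)
    (remove-hook 'post-command-hook #'my/flymake-echo-at-point t)))
```

The buffer-local `post-command-hook` keeps the cost confined to buffers where the mode is actually enabled.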

Ollama Buddy - Seven Lines to Any LLM Provider

March 19, 2026 14:50

Ever found yourself wanting to add a new AI provider to ollama-buddy (probably not, I would guess 🙂), only to realise you'd need to write an entire Elisp module? Or perhaps you're running a local inference server that speaks the OpenAI API, but can't be bothered with the ceremony of creating a dedicated provider file?

Ollama Buddy - In-Buffer LLM Streaming

March 12, 2026 11:09

There is now an in-buffer replace feature in ollama-buddy, so an Ollama response can work directly on your text, streaming the replacement in real time and giving you a simple accept/reject choice. I have also added an optional inline smerge diff to show the differences and let you accept or reject each change.

Ollama Buddy - Web Search Integration

March 4, 2026 09:34

One of the fundamental limitations of local LLMs is their knowledge cutoff: they don't know about anything that happened after their training data ended. The new web search integration in ollama-buddy addresses this by fetching current information from the web and injecting it into your conversation context. Ollama exposes a dedicated web search API, and ollama-buddy now hooks into it.

Ollama Buddy v2.5 - RAG (Retrieval-Augmented Generation) Support

February 24, 2026 12:10

One of the things that has always slightly bothered me about chatting with a local LLM is that it only knows what it was trained on (although I suppose most LLMs are like that). Ask it about your own codebase, your org notes, your project docs, and it's just guessing. Well, not anymore! Ollama Buddy now ships with proper Retrieval-Augmented Generation support built in.