I’ve recently been writing fiction and using an AI as a critic/editor to help me tighten things up (I’m not a particularly skilled prose writer myself). So far I’ve tried two approaches: writing in a basic text editor and then either uploading the files to a hosted LLM or copy-pasting into a local one, or using PyCharm with its AI integration plugins.
Neither is particularly satisfactory, and I’m wondering if anyone knows of a good setup for this (preferably open source, though that’s not necessary). Integration with at least one of Ollama or OpenRouter would be needed.
Edit: Thanks for the recommendations everyone, lots of things for me to check out when I get the time!
Mikupad is incredible:
https://github.com/lmg-anon/mikupad
I think my favorite feature is the ‘logprobs’ mouseover, i.e. showing the probability of each token that’s generated. It works like a built-in thesaurus, it’s a great way to dial in sampling, and you can regenerate from any point.
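To make the logprobs feature concrete, here’s a minimal sketch of how a front end like mikupad can get that data from a llama.cpp server. The `/completion` endpoint and its `n_probs` field are real, but the exact response keys have changed between server versions, so the parsing side here is an illustration against one sample response shape, not a guaranteed schema:

```python
def build_completion_request(prompt, n_predict=32, n_probs=5):
    """Build the JSON body for llama.cpp's /completion endpoint.

    n_probs asks the server to return the top-N candidate tokens and
    their probabilities for every generated token, which is what the
    mouseover displays.
    """
    return {
        "prompt": prompt,
        "n_predict": n_predict,
        "n_probs": n_probs,
    }

def top_alternatives(completion_probabilities):
    """Flatten per-token probability data into (token, prob) lists.

    Assumes one llama.cpp response shape:
    [{"content": "...", "probs": [{"tok_str": "...", "prob": 0.9}, ...]}, ...]
    Field names may differ in other server versions.
    """
    return [
        [(p["tok_str"], p["prob"]) for p in step["probs"]]
        for step in completion_probabilities
    ]
```

POSTing the built request body to `http://localhost:8080/completion` (with any HTTP client) and feeding `completion_probabilities` from the response into `top_alternatives` gives you the same token/probability pairs the UI shows on hover.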
Once you learn how instruct formatting works (and how it auto-inserts tags), it’s easy to maintain some basic formatting yourself and to question the model about the story.
It’s also fast. It can handle 128K context without being too laggy.
I’d recommend the llama.cpp server or TabbyAPI as backends (depending on the model and your setup), though you can use whatever you wish.
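For example, a llama.cpp backend can be started with something like the following (flags are from llama.cpp’s `llama-server`; the model path is a placeholder, and `-c` should match whatever context size your hardware actually handles):

```shell
# Serve a local GGUF model over HTTP on port 8080.
# model.gguf is a placeholder path; -c sets the context window in tokens.
llama-server -m model.gguf -c 16384 --port 8080
```

Mikupad can then be pointed at that local endpoint as its completion backend.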
I’d recommend exui as well, but seeing how exllamav2 is being deprecated, it’s probably not the best choice anymore… Another strong recommendation is kobold.cpp (which can use external APIs if you want).