Hi Hackers,<p>Excited to share a macOS app I've been working on: <a href="https://recurse.chat/" rel="nofollow">https://recurse.chat/</a> for chatting with local AI. While it's amazing that you can run AI models locally quite easily these days (through llama.cpp / llamafile / ollama / the llm CLI, etc.), I missed a feature-complete chat interface. Tools like LMStudio are super powerful, but they come with a learning curve. I'd like to hit a middle ground of simplicity and customizability for advanced users.<p>Here's what sets RecurseChat apart from similar apps:<p>- UX designed for using local AI as a daily driver: zero-config setup, multi-modal chat, chatting with multiple models in the same session, and linking your own gguf file.<p>- Import ChatGPT history. This is probably my favorite feature. Import hundreds of messages, search them, and even continue previous chats using local AI offline.<p>- Full-text search. Search across hundreds of messages and see results instantly.<p>- Private and capable of working completely offline.<p>Thanks to the amazing work of @ggerganov on llama.cpp, which made this possible. If there is anything you wish existed in an ideal local AI app, I'd love to hear about it.