Kimi 2.5 and Self-Hosting Open WebUI
Been poking around with the Kimi 2.5 LLM and also started self-hosting Open WebUI (a ChatGPT-style web frontend for LLM APIs) on my server.
Kimi probably isn't the best model on the market, but Kimi 2.5 is the first truly open source model I've used that feels roughly in the same performance class as ChatGPT and the like, and I don't really feel much of a penalty using it instead.
Of course, running it directly is way beyond what any device I have can do reasonably well.
But there are already API providers around offering it with very favorable privacy and data retention policies, so I'm probably going to switch to using it over ChatGPT.
I wouldn't recommend using the chat/API offered by the model's creator—I don't really trust that company.
If I self-host the front end, all of the genuinely sensitive data, like chat logs, stays on my server.
Open WebUI is pretty cool. It works almost as well as ChatGPT does. I've run into occasional issues with responses freezing mid-generation, but I've seen that sort of thing with other LLM providers too.
It has a web search integration the model can use, and it's pretty customizable overall.
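For anyone curious about the setup side: Open WebUI ships as a container, so getting the frontend running is basically one command. A minimal sketch (image name, port, and volume follow the project's documented defaults; adjust for your server):

```shell
# Run Open WebUI in Docker, persisting chats and settings in a named volume.
# The UI then talks to whatever OpenAI-compatible API endpoint you configure
# in its admin settings (e.g. a hosted Kimi 2.5 provider).
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

After that, the web UI is on port 3000 and everything it stores lives in the `open-webui` volume on the host.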
I quickly put together a custom tool the model can call that queries the OpenAlex API for open access academic articles. The code for that can be found here: https://git.selfhosted.onl/theo/openwebui-tools-skills/src/branch/main
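The core of a tool like that is small. Here's a rough sketch of the OpenAlex lookup, not the actual code from the repo above: it assumes OpenAlex's public `/works` endpoint with its `search` parameter and `is_oa:true` filter, and omits the Open WebUI-specific wrapper class around the function.

```python
# Sketch: search OpenAlex for open-access works and return short citation
# lines the model can read. Stdlib only; function names are my own.
import json
import urllib.parse
import urllib.request

OPENALEX_WORKS_URL = "https://api.openalex.org/works"


def build_params(query: str, per_page: int = 5) -> dict:
    """Query parameters for an open-access-only OpenAlex works search."""
    return {"search": query, "filter": "is_oa:true", "per-page": str(per_page)}


def format_results(payload: dict) -> list[str]:
    """Turn an OpenAlex /works response into one citation line per work."""
    lines = []
    for work in payload.get("results", []):
        title = work.get("display_name") or "(untitled)"
        year = work.get("publication_year") or "n.d."
        # Prefer the direct open-access URL; fall back to the OpenAlex ID URL.
        oa_url = (work.get("open_access") or {}).get("oa_url") or work.get("id", "")
        lines.append(f"{title} ({year}): {oa_url}")
    return lines


def search_open_access(query: str, per_page: int = 5) -> list[str]:
    """Fetch open-access works matching `query` from OpenAlex."""
    url = OPENALEX_WORKS_URL + "?" + urllib.parse.urlencode(build_params(query, per_page))
    with urllib.request.urlopen(url, timeout=10) as resp:
        return format_results(json.load(resp))
```

In Open WebUI you'd expose `search_open_access` as a typed method with a docstring so the model knows when to call it; the frontend handles the function-calling plumbing.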