Tonight, I installed Open WebUI to see what sort of performance I could get out of it.
My entire homelab is a single N100 mini PC, so it was a bit of a squeeze to fit even Gemma3n:e2b onto it.
It did something. Free ChatGPT gives better performance, as long as I remember to use placeholder variables, at least for my use case: vibe coding compose.yamls and acting as a rubber duck/level 0 tech support for troubleshooting. But it did something. I'll probably re-test when I upgrade to 32 GB of RAM, then nuke the LXC and wait until I have a beefier host.
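For anyone curious what a setup like this looks like, here's a minimal compose.yaml sketch for Open WebUI with an Ollama backend serving the model (the post doesn't say which backend was used; Ollama is an assumption, since Gemma3n:e2b is an Ollama model tag). Image names, ports, and the `OLLAMA_BASE_URL` variable are the upstream defaults; the memory limit is a made-up figure for a RAM-constrained N100 host:

```yaml
# Hypothetical compose.yaml: Open WebUI talking to a local Ollama instance.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama   # persist downloaded models across restarts
    mem_limit: 8g              # assumption: cap memory on a small N100 host
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # web UI at http://<host>:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama:
```

After `docker compose up -d`, the model can be pulled with something like `docker compose exec ollama ollama pull gemma3n:e2b` and then selected from the Open WebUI model dropdown.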
case in point: you jacked off all night over your local model and still got a disappointing result
Not really. The fact that an N100 can produce anything at all is awesome.
statements dreamed up by the utterly deranged