I'm using Ollama on my server with the WebUI. It has no GPU, so it's not quick to reply, but not too slow either.
I'm thinking about removing the VM since I just don't use it. Are there any good uses or integrations into other apps that might convince me to keep it?
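One integration angle worth knowing about: Ollama exposes a plain REST API on port 11434, so any script or app you already run can call your local model. Here's a minimal sketch in Python using only the standard library; the endpoint path is Ollama's documented `/api/generate`, but the model name `llama3` is an assumption about what you have pulled.

```python
# Minimal sketch of calling a local Ollama server from another app.
# Assumes Ollama's default address http://localhost:11434 and that a
# model named "llama3" has been pulled -- adjust both for your setup.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # stream=False asks Ollama for one JSON object instead of a
    # line-delimited stream of partial tokens.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running Ollama instance on localhost.
    print(generate("llama3", "Give me one reason to keep a local LLM."))
```

If that kind of glue code isn't something you'd actually use, that's probably your answer about keeping the VM.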
(I type very poorly on my phone.) With that much VRAM you can run something like a 70B model. Definitely ask around in the KoboldAI community; that shit's crazy.