- cross-posted to:
- aicompanions@lemmy.world
But in all fairness, it’s really llama.cpp that supports AMD.
Now looking forward to Vulkan support!
I’ve been using it with a 6800 for a few months now; all it needs is a few env vars.
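The comment doesn’t say which variables, so here’s a hedged sketch of the ones commonly set for RDNA2 cards like the RX 6800 when running llama.cpp with ROCm; the exact values depend on your ROCm version and setup:

```shell
# Sketch, not a definitive config: these are the env vars commonly
# reported for RDNA2 cards with ROCm.
# HSA_OVERRIDE_GFX_VERSION tells ROCm to treat the GPU as the
# supported gfx1030 target (the RX 6800's architecture family).
export HSA_OVERRIDE_GFX_VERSION=10.3.0
# Pick the first GPU if more than one is present.
export ROCR_VISIBLE_DEVICES=0
# Then run a llama.cpp binary built with HIP/ROCm support, e.g.:
# ./llama-cli -m model.gguf -ngl 99   # offload all layers to the GPU
```

The model path and `-ngl` value above are illustrative; adjust them for your own setup.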