- cross-posted to:
- aicompanions@lemmy.world
But in all fairness, it’s really llama.cpp that supports AMD.
Now looking forward to Vulkan support!
It’s improving very fast. Give it a little time.