turkishdelight@lemmy.ml to LocalLLaMA@sh.itjust.works · English · 1 year ago
Ollama now supports AMD graphics cards (ollama.com)
cross-posted to: amd@lemmy.world, aicompanions@lemmy.world, hackernews@lemmy.smeargle.fans
But in all fairness, it’s really llama.cpp that supports AMD. Now looking forward to the Vulkan support!
sardaukar@lemmy.world · English · 1 year ago
I’ve been using it with a 6800 for a few months now; all it needs is a few env vars.
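The commenter doesn’t list the exact variables, but a common setup for RDNA 2 cards like the RX 6800 under ROCm looks like this. `HSA_OVERRIDE_GFX_VERSION` is a real ROCm variable that makes the runtime treat the GPU as the named gfx target; the specific values here are an assumption based on typical RX 6800 (gfx1030) setups, not something stated in the thread:

```shell
# Tell ROCm to treat the GPU as gfx1030 (the RX 6800's architecture).
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Optionally restrict which GPU devices ROCm sees (0 = first device).
export ROCR_VISIBLE_DEVICES=0

# Then start the server as usual:
# ollama serve
```

With officially supported cards the override may not be needed at all; it matters mostly for GPUs that ROCm doesn’t detect out of the box.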