just_another_person@lemmy.world to Linux@lemmy.world · English · 2 months ago
AMD Announces "Instella" Fully Open-Source 3B Language Models (www.phoronix.com)
cross-posted to: linux@lemmy.ml
brokenlcd@feddit.it · English · 2 months ago
The problem is: how do we run it if ROCm is still a mess for most of their GPUs? CPU time?
swelter_spark@reddthat.com · English · 12 days ago
There are ROCm versions of llama.cpp, ollama, and kobold.cpp that work well, although they'll have to add support for this model before they can run it.
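For anyone who wants to try the llama.cpp route, a rough sketch of building it with ROCm/HIP support and running a local GGUF model might look like this. The build flag names and the model path are assumptions based on recent llama.cpp versions; check the repo's ROCm build docs for your GPU architecture (you may also need to set `AMDGPU_TARGETS`):

```shell
# Sketch only: flag names vary between llama.cpp versions, and the
# model file below is a placeholder, not an actual Instella release.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_HIP=ON          # enable the ROCm/HIP backend
cmake --build build --config Release -j
# -ngl offloads layers to the GPU; 99 means "as many as fit".
./build/bin/llama-cli -m ./models/model.gguf -ngl 99 -p "Hello"
```

As the comment notes, none of this helps until the model architecture itself is supported by the runtime, so a converted GGUF of Instella would have to exist first.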