Stamets@lemmy.world to People Twitter@sh.itjust.works · 2 years ago — "The dream"
aubertlone@lemmy.world · 2 years ago
Hey, me too.

And I do have a couple of different LLMs installed on my rig. But running that resource locally is years and years away from being remotely performant.

On the bright side, there are many open-source LLMs, and it seems like there are more every day.