https://www.deepseek.com/ is the “intended” experience and has the highest quality, but it isn’t local. If you want to run it locally, install ollama (https://ollama.com/) and run
ollama pull deepseek-r1:<size>
I doubt you will be able to run the full 671b model, but 14b performs brilliantly.

What would I gain from it being local?
Not having your data collected, and unlimited tokens.
What are tokens? Sorry, I don't know much about this stuff beyond the surface level.
When you message an LLM, that input is parsed into tokens, and the response it generates is built out of tokens. Online models give you a certain number of tokens per day.
Ok, so in short, the number of questions you can ask is limited?
Tokens are the units the model breaks input and output into, and how it maps which words should come next in any given sentence. When you use one of these models online, they limit the number of tokens you can input/output in a given time, because the models are costly to run and people would abuse them otherwise. If you're running the model on your own machine, you don't have to worry about that; you can use your machine to the max if you want.
Tokens are words, more or less.
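The "more or less" can be sketched with a toy tokenizer. This is only an illustration with a made-up vocabulary, not DeepSeek's actual tokenizer; real models learn subword vocabularies (e.g. via BPE) with tens of thousands of entries. The idea is the same, though: common words map to a single token, rare words get split into several pieces.

```python
# Toy greedy longest-match tokenizer. The vocabulary here is invented
# for illustration; real tokenizers learn theirs from training data.
VOCAB = {"what", "are", "token", "tokens", "iz", "ation", " "}

def tokenize(text):
    tokens = []
    i = 0
    while i < len(text):
        # Take the longest vocabulary entry that matches at position i.
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in VOCAB:
                tokens.append(piece)
                i += length
                break
        else:
            raise ValueError(f"no vocabulary entry matches at position {i}")
    return tokens

print(tokenize("what are tokens"))  # ['what', ' ', 'are', ' ', 'tokens']
print(tokenize("tokenization"))     # ['token', 'iz', 'ation']
```

A common word like "tokens" costs one token, while "tokenization" costs three — which is why token limits aren't exactly word limits.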