

No worries! You’re probably right that it’s better not to assume, and it’s good of you to provide some different options.
If by more learning you mean learning
ollama run deepseek-r1:7b
Then yeah, it’s a pretty steep curve!
If you’re a developer then you can also search “$MyFavDevEnv use local ai ollama” to find setup guides. I’m using the Continue extension for VS Codium (or Code), but there are easy-to-use modules for Vim and Emacs, and probably everything else as well.
The main problem is leveling your expectations. The full DeepSeek is a 671b model (that’s billions of parameters), and the model weights (the thing you download when you pull an AI) are 404 GB in size. You’d need hundreds of gigabytes of RAM to run one of those.
They make distilled models though, which are much smaller but still useful. The 14b is 9 GB and runs fine with only 16 GB of RAM. They obviously aren’t as impressive as the big cloud-hosted versions though.
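If you want a back-of-the-envelope feel for those download sizes, the quantized models Ollama serves by default work out to roughly 0.6 bytes per parameter (about 4.8 bits); that 0.6 figure is just an estimate inferred from the sizes above, not an official number:

```python
# Rough model-weight size: parameter count times bytes per parameter.
# 0.6 bytes/param is an assumed average for Ollama's default 4-bit-ish
# quantized downloads, back-solved from the sizes mentioned above.
def weights_gb(params_billion, bytes_per_param=0.6):
    return params_billion * bytes_per_param  # 1e9 params * bytes / 1e9 = GB

print(weights_gb(671))  # full DeepSeek-R1: ~403 GB, close to the 404 GB download
print(weights_gb(14))   # 14b distill: ~8.4 GB, close to the 9 GB download
```

Which is also why the distills are the only realistic option on a normal desktop: the weights have to fit in RAM (or VRAM) with room to spare.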
Yeah, I’ve been really impressed with Gloomwood but I am already tired of the fishery (level 1) and wish the EA would come with a jump-ahead feature. It’s looking to be a very good game on release, I’m quite excited for it.