Even the 20B model seems difficult with 8GB.😭
> gpt-oss-20b
> The smaller model
> Best with ≥16GB VRAM or unified memory
> Perfect for higher-end consumer GPUs or Apple Silicon Macs
https://cookbook.openai.com/articles/gpt-oss/run-locally-ollama
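For anyone who wants to poke at it anyway: the linked cookbook runs the model through Ollama, so here's a minimal sketch using the `ollama` Python package. Assumptions on my part: `ollama serve` is running, `ollama pull gpt-oss:20b` has finished, and the package is installed with `pip install ollama`.

```python
# Minimal sketch: chat with gpt-oss-20b through a local Ollama server.
# Assumes `ollama serve` is running and `ollama pull gpt-oss:20b` has completed.
import ollama

response = ollama.chat(
    model="gpt-oss:20b",
    messages=[{"role": "user", "content": "Explain swap space in one sentence."}],
)
# Dict-style access works across older and newer versions of the client.
print(response["message"]["content"])
```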
Best doesn't mean impossible, arrr me boy, we be setting up more SWAP on my partitions.
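Before carving out the extra swap, a rough sanity check is whether RAM plus swap at least covers the ~16 GB the cookbook suggests. A sketch with `psutil` (pip install psutil); the 16 GB figure is just the cookbook's guidance, not a hard floor.

```python
# Rough check: does RAM + swap cover the ~16 GB the cookbook recommends
# for gpt-oss-20b? The threshold is the article's guidance, not a hard limit.
import psutil

NEEDED_GIB = 16
ram_gib = psutil.virtual_memory().total / 2**30
swap_gib = psutil.swap_memory().total / 2**30

print(f"RAM: {ram_gib:.1f} GiB, swap: {swap_gib:.1f} GiB, "
      f"total: {ram_gib + swap_gib:.1f} GiB")
if ram_gib + swap_gib < NEEDED_GIB:
    print("Probably worth adding more swap (or using a smaller quant) first.")
```

Fair warning: once it spills into swap, token generation will crawl, but it should at least load.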