is there a 2-bit quantization yet? my software SUCKS and i only have 8 gigabytes of ram lol


Discussion

hardware* i am an idiot

Even the 20B model seems difficult with 8GB.😭

> gpt-oss-20b
> The smaller model
> Best with ≥16GB VRAM or unified memory
> Perfect for higher-end consumer GPUs or Apple Silicon Macs

https://cookbook.openai.com/articles/gpt-oss/run-locally-ollama
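
For a rough sanity check before giving up: here's the back-of-the-envelope math for the weights alone of a ~20B-parameter model at different bit widths (the parameter count is approximate, and this ignores KV cache, activations, and runtime overhead, which add several more GB):

```python
# Back-of-the-envelope weight sizes for a ~20B-parameter model.
# Ignores KV cache, activations, and runtime overhead.
PARAMS = 20e9  # approximate parameter count of gpt-oss-20b

for bits in (16, 8, 4, 2):
    gigabytes = PARAMS * bits / 8 / 1e9  # bits -> bytes -> GB
    print(f"{bits:>2}-bit weights: ~{gigabytes:.0f} GB")
```

That comes out to ~40 GB at 16-bit, ~20 GB at 8-bit, ~10 GB at 4-bit, and ~5 GB at 2-bit. So a 2-bit quant would at least fit the weights in 8 GB, though with very little room left over.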

Best doesn't mean impossible, arrr me boy, we be setting up more swap on my partitions
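
Before repartitioning, it's worth checking your current headroom. A minimal sketch (assuming psutil is installed, and reusing the ~5 GB figure from the estimate above):

```python
# Sketch: check whether available RAM + free swap could plausibly
# hold ~5 GB of 2-bit weights. Requires: pip install psutil
import psutil

NEEDED_GB = 5.0  # ~2-bit weights for a ~20B model; overhead excluded

ram_gb = psutil.virtual_memory().available / 1e9
swap_gb = psutil.swap_memory().free / 1e9
print(f"available RAM: {ram_gb:.1f} GB, free swap: {swap_gb:.1f} GB")

if ram_gb + swap_gb < NEEDED_GB:
    print("not enough headroom -- time for more swap")
```

Fair warning: inference touches every weight on every token, so anything paged out to swap will make generation painfully slow.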