According to this: https://apxml.com/posts/gpu-system-requirements-kimi-llm

You need 32 H100 80GB GPUs to run Kimi K2

These cost $30-45K each according to a quick search. 32 of them works out to $960K-$1.44M, so call it roughly $1 million.
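For what it's worth, the multiplication checks out (a quick sanity check, treating the $30-45K-per-card street price as an assumption from that search):

```python
# Back-of-envelope cost for 32 H100s at the quoted $30-45K each
num_gpus = 32
low, high = 30_000, 45_000  # USD per card, per a quick search

print(f"${num_gpus * low:,} - ${num_gpus * high:,}")
# $960,000 - $1,440,000 -> "about $1 million" is the low end of the range
```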


Discussion

What is Kimi K2?

A new open-source LLM from Moonshot AI. It requires about 1TB of VRAM.
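The ~1TB figure roughly matches a trillion-parameter model stored at one byte per weight (a back-of-envelope sketch; the exact parameter count and weight precision here are assumptions, and KV cache/activations would add on top):

```python
# Rough VRAM estimate: parameter count x bytes per parameter
params = 1_000_000_000_000   # ~1T total parameters (assumed)
bytes_per_param = 1          # FP8/INT8-style storage, 1 byte per weight (assumed)

gb = params * bytes_per_param / 1024**3
print(f"~{gb:,.0f} GB")      # ~931 GB, i.e. about 1TB of weights alone
```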

is it really open source at that point 😅

Well it’s open weights but totally agree. Still better than “OpenAI”.

How’s Diablo 2 look on’em?

What are you poor?

Is there an ollama file yet? Waiting to see CPU perf on 1TB RAM

Grabbed that day-of, but there wasn't an ollama model file for it yet

There are some necessary code changes that are in-flight: https://github.com/ollama/ollama/issues/11382

Unsloth has GGUFs and a llama.cpp fork that can run it on smaller GPUs:

https://huggingface.co/unsloth/Kimi-K2-Instruct-GGUF

https://github.com/unslothai/llama.cpp

Or you could run it on a bunch of 128GB Framework desktops at ~$3K a pop, for a total price of ~$50K. It'd be much slower, but still pretty usable.
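Quick check on that, taking the $3K price and 128GB capacity as given: you'd need at least 8 boxes just to hold ~1TB of weights (~$24K), so the ~$50K figure presumably budgets roughly double the minimum for KV cache, activations, and general headroom:

```python
import math

model_gb = 1024      # ~1TB of weights, per the thread
box_gb = 128         # RAM per Framework desktop
box_price = 3_000    # USD per box, quoted above

min_boxes = math.ceil(model_gb / box_gb)
print(min_boxes, f"${min_boxes * box_price:,}")
# 8 boxes -> $24,000 minimum just for the weights;
# a ~$50K budget implies ~16 boxes, leaving room for cache and overhead
```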