Just tried to generate text using llama2 on my 2018 Intel-based MacBook Pro and almost burned nostr:npub1theparkprcs70dcs437ke9zzwsr6u60f8flu7rg28m30438aep9sd94dha down.

AMA

Discussion

When I tried to run an LLM on my MacBook, it was spitting out one letter every ten seconds 😂. I just gave up.

you need an M2 Ultra with 192 GB of memory

Oh, is that all?

I was using a normal M1. It really wasn't good.
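
For what it's worth, the bottleneck is usually the unquantized weights: a 4-bit GGUF quant of llama2-7B is only a few GB and will run (slowly) even on older Macs via llama.cpp. A minimal sketch using llama-cpp-python, assuming the bindings are installed and a quantized model file is already downloaded (the model path below is hypothetical):

```python
# Sketch: run a 4-bit quantized llama2-7B locally with llama-cpp-python.
# The model path is hypothetical; any GGUF quant of llama2 works the same way.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-2-7b-chat.Q4_K_M.gguf",  # ~4 GB on disk at 4-bit
    n_ctx=2048,    # context window
    n_threads=8,   # CPU threads; on Apple Silicon, n_gpu_layers enables Metal offload
)

out = llm("Q: Why is my 2018 MacBook Pro so hot? A:", max_tokens=64, stop=["\n"])
print(out["choices"][0]["text"])
```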

The M2 Ultra is an extremely cost-effective product for large models, because its unified memory can be treated as VRAM. Two A100s only give you 160 GB of VRAM, and for that money you could buy six 192 GB Studios.
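
Rough back-of-envelope numbers behind that comparison, as a sketch (nominal parameter counts only; real usage adds KV cache and runtime overhead on top of the weights):

```python
# Sketch: approximate memory needed just to hold the model weights.
def weight_gb(params_billion: float, bytes_per_param: float) -> float:
    # billions of parameters * bytes per parameter ~= gigabytes
    return params_billion * bytes_per_param

print(weight_gb(70, 2.0))  # llama2-70B in fp16  -> ~140 GB
print(weight_gb(70, 0.5))  # llama2-70B at 4-bit -> ~35 GB
print(weight_gb(7, 2.0))   # llama2-7B in fp16   -> ~14 GB
# Two 80 GB A100s give 160 GB of VRAM in total; a single 192 GB Mac Studio
# already holds fp16 llama2-70B in its unified memory.
```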