Can I ask what hardware you're running it on?
E.g. would a Mac studio do the job?
New to self hosting but considering giving it a go
I don't run any models. I use GLM through z.ai's coding plan, which is incredibly cheap.
I need to look at doing the same, but my mental model is that we're still in the early-2000s PC era of AI hardware: I think we'll see massive improvements that will make today's products look outdated. For now I'm putting my money toward SOTA models, with the hope that in a few years, once things stabilize and new hardware brings prices down, a year or two of model costs will cover self-hosted hardware of my own.
Cool. I think I misread your prior note.
I'm pleased to hear you rate GLM 4.7 well. The hardware I have coming should run it.