My main workstation has an RTX 3090 and 64 GB of RAM.

Just loading a few models into RAM and running inference already maxes out my hardware and makes it lag
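A rough sketch of why a few models max out 64 GB: weight memory alone is roughly parameter count × bytes per parameter. The sizes and dtypes below are just illustrative, and the estimate ignores KV cache, activations, and framework overhead, which all add more on top.

```python
# Back-of-envelope memory footprint for model weights only.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_gb(params_billions: float, dtype: str = "fp16") -> float:
    """Approximate weight memory in GB for a given parameter count."""
    return params_billions * 1e9 * BYTES_PER_PARAM[dtype] / 1024**3

# A 13B model in fp16 is ~24 GB of weights alone, so holding a few
# mid-size models in memory side by side eats 64 GB quickly.
for size in (7, 13, 34):
    print(f"{size}B fp16: {weight_gb(size):.1f} GB")
```

Quantizing to int8 or int4 roughly halves or quarters that footprint, which is why quantized builds are popular on single-GPU boxes.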

Siigghhh AI is expensive


Discussion

Damn.

Yep 😩

Which language are you using?

Mix of Python and Rust

I guess I’m asking which LLM are you using? Llama etc.

(I’m new to this and learning).

How easy is it to create a non-cloud based AI for personal or company use?

nostr:npub1cmmswlckn82se7f2jeftl6ll4szlc6zzh8hrjyyfm9vm3t2afr7svqlr6f

Are you training from scratch?

Nope, this is just inference, but I will be doing training too

Last year I tried training Karpathy's minGPT on a textbook with an old desktop 1070. It broke the temperature sensor (I think), and afterwards anything GPU-intensive just dropped to a black screen with the fans blaring at 1000% 😭

Needless to say, when I go back into that realm I'll make sure to do it on competent hardware
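One way to avoid cooking a card like that is a simple temperature watchdog. A minimal sketch, assuming an NVIDIA GPU with `nvidia-smi` on the PATH; the 85 °C threshold is an arbitrary example, not a spec.

```python
# Sketch of a GPU-temperature watchdog using nvidia-smi's CSV query mode.
import subprocess
import time

def parse_temp(output: str) -> int:
    """Parse the first temperature line from nvidia-smi CSV output."""
    return int(output.strip().splitlines()[0])

def read_gpu_temp() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temp(out)

if __name__ == "__main__":
    while True:
        temp = read_gpu_temp()
        if temp >= 85:  # hypothetical threshold; tune for your card
            print(f"GPU at {temp} C - pause the training run!")
        time.sleep(10)
```

On a rig with a flaky sensor this would at least log the climb before the black screen hits.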

Which model?

Playing with a bunch; right now the WizardCoder one

3090 runs hot too 🥵🔥 got to account for room cooling as well.

I took mine apart and put thermal paste and heatsinks all over it lol

That’s crazy! I wonder if Mojo by Modular will be more efficient when it’s available

Bro if you needed a Jupiter rack you should have just asked

Can I have one

No lol, you have to be an L7 and above to put a rack in your personal vehicle, and you have to be an L7 and above to lose a finger in the process too 😆