My main workstation has an RTX 3090 and 64 GB of RAM.
Just loading a few models into RAM and doing inference already maxes out my hardware and makes it lag
Siigghhh AI is expensive
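For a rough sense of why a few models max out 64 GB, here's a back-of-the-envelope sketch of the memory the weights alone take at different precisions. The specific model sizes are illustrative assumptions; real usage adds activations, KV cache, and framework overhead on top.

```python
# Rough memory-footprint estimate for loading model weights.
# Illustrative only: activations, KV cache, and runtime overhead
# come on top of these numbers.

def weight_memory_gb(n_params_billion: float, bytes_per_param: float) -> float:
    """Memory needed just for the weights, in GiB."""
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

for name, n_b in [("7B", 7.0), ("13B", 13.0), ("30B", 30.0)]:
    fp16 = weight_memory_gb(n_b, 2.0)   # 2 bytes/param at fp16
    q4 = weight_memory_gb(n_b, 0.5)     # ~0.5 bytes/param at 4-bit quantization
    print(f"{name}: ~{fp16:.1f} GiB fp16, ~{q4:.1f} GiB 4-bit")
```

Loading two or three mid-size models at fp16 side by side chews through 64 GB quickly, which matches the lag described above.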
Damn.
Which language(?) are you using?
Mix of Python and Rust
I guess I'm asking which LLM you are using? Llama etc.
(I'm new to this and learning).
How easy is it to create a non-cloud based AI for personal or company use?
nostr:npub1cmmswlckn82se7f2jeftl6ll4szlc6zzh8hrjyyfm9vm3t2afr7svqlr6f
Are you training from scratch?
Nope this is just inference but I will be doing training too
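A rough rule of thumb for why training needs so much more memory than inference: with plain Adam in fp32 you hold the weights, the gradients, and two optimizer moments per parameter. This sketch assumes vanilla Adam with no mixed precision, and it ignores activation memory entirely.

```python
# Rough training-memory rule of thumb (assumption: plain fp32 Adam,
# no mixed precision, activation memory not counted).

def training_memory_gb(n_params_billion: float) -> float:
    """fp32 weights (4 B) + grads (4 B) + two Adam moments (8 B) per param."""
    bytes_per_param = 4 + 4 + 8
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

# Even a 1B-parameter model needs ~15 GiB before activations,
# which is why an 8 GB card like the 1070 struggles with training.
print(f"1B params: ~{training_memory_gb(1.0):.1f} GiB of weights+optimizer state")
```

By the same rule, inference on the same model at fp16 needs only ~2 GiB for weights, roughly an eighth of the training footprint.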
Last year I tried training Karpathy's minGPT on a textbook with an old desktop 1070 - broke the temperature sensor (I think), and anything GPU-intensive just dropped to a black screen with the fans blaring at 1000%
Needless to say, when I go back into that realm I'll make sure to do it on competent hardware
3090 runs hot too 🥵🔥 got to account for room cooling as well.
I took mine apart and put thermal paste and heatsinks all over it lol
That's crazy! I wonder if Mojo by Modular will be more efficient when it's available