I do understand that ((I admit with limitations, as I'm not a computer expert))... I have seen the specs for running a GPT-4 equivalent, and it's roughly $5,000 USD of hardware ((if you disagree, please send me a correction of what hardware you think is needed; I'd be happy to ground my expectations))
That does not mean I'm not concerned... Let me try to rephrase what I mean.
I thought it was going to be impossible to run one of these from a self-hosted server, both due to proprietary software restrictions and hardware limitations. However, I'm already running Llama on a shitty Linux computer I have at home... In a couple of years, and with a bit of investment, I'll have a GPT-4 equivalent running from my home. This means I could potentially homeschool my children with the support of self-hosted software. When all of this madness started, I never imagined I could be part of the self-hosted large language model endeavour in so little time.