Replying to levlion

I do understand that (I admit with limitations, as I'm not a computer expert)... I have seen the specs for running a GPT-4 equivalent and it's roughly $5,000 USD of hardware (if you disagree, please send me a correction of what hardware you think is needed, I'd be happy to ground my expectations).

That does not mean I'm not concerned... Let me try to rephrase what I mean.

I thought it was going to be impossible to run one of these from a self-hosted server, both due to proprietary software restrictions and hardware limitations. However, I'm already running Llama on a shitty Linux computer I have at home... In a couple of years, and with a bit of investment, I'll have a GPT-4 equivalent running from my home. This means I could potentially homeschool my children with the support of self-hosted software. When all of this madness started, I never imagined I could be in the self-hosted large language model endeavour in so little time.
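
In case anyone wants to picture what that looks like in practice, here is a rough sketch of the kind of thing I mean, using the llama-cpp-python bindings to run a quantized Llama model on CPU. The model filename and settings below are just placeholders, not my actual setup; you point it at whatever quantized model file you have downloaded.

    # Rough sketch: run a local Llama model on CPU with llama-cpp-python.
    # Install first: pip install llama-cpp-python
    from llama_cpp import Llama

    # Placeholder path -- use whatever quantized model file you downloaded.
    llm = Llama(
        model_path="./models/llama-7b-q4.gguf",
        n_ctx=2048,     # context window; keep it modest on weak hardware
        n_threads=4,    # roughly match your CPU core count
    )

    # Plain text completion, same idea as the hosted chatbots.
    result = llm(
        "Explain photosynthesis to a ten-year-old:",
        max_tokens=256,
        temperature=0.7,
    )
    print(result["choices"][0]["text"])

The whole trick is quantization: shrinking the weights so that a model which nominally wants a datacenter GPU fits in ordinary RAM and runs, slowly but usably, on a cheap CPU.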

coinbox 2y ago

Wow! I have only just learned about Llama. I stand corrected! That's very impressive. Is it actually comparable to GPT-3 on your Linux machine? Do you have any links on how to do this? I'd like to set this up myself. Thanks for the info.

