
Discussion

They had to create a new chip just to do this well.

I'm no one to question the overlords; I'm just protecting myself from their wrath and letting anyone else who's interested know. I don't mean to hurt the feelings of their followers.

Which is why it's encouraging that open source is winning. The newer the model, the less capable it is at some things, because they are being neutered by the thought police.

Like how the early Stable Diffusion stuff is easier to work with than this month's models.

Fine-tuned versions of models like Llama 70B are going to outperform corporate models in the end.

The good news is we're probably two years away from these being self-hostable on local machines. So I think this issue will become a non-issue, and the harder OpenAI and Google try, the worse they make it.

Which is also why Google's models are so bad, even though they invented the whole thing.

I love self-hosted models, but I'm not sure this becomes a non-issue. It's not just about the models themselves being open, but about all the plumbing in between as well.