With a 3090 graphics card (or two) you can run a local model that comes close to GPT-4. There are smaller CPU-only models that run well too (GPT4All), models that run on a phone (Doctor Dignity), and even smaller ones in the works (TinyLlama). The pace is so fast that the hardest part is building a product that isn't obsolete in a month.