Cool idea to write yourself for AI processing. There are larger models, around 45 GB, which are smarter. I mean that a small model will still answer, but it will never be as precise.

Small models are better for natural language processing.


Discussion

Yes, I've experimented with up to 20GB models so far.

My largest machine has 32 GB of RAM.
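As a rough sanity check of why a 20 GB model is about the ceiling on a 32 GB machine, here is a back-of-the-envelope sketch (an assumed rule of thumb, not from the thread: weights take about one byte per parameter at 8-bit quantization, plus some runtime overhead; the function name and overhead figure are illustrative):

```python
def approx_model_ram_gb(params_billion: float, bits_per_weight: int,
                        overhead_gb: float = 2.0) -> float:
    """Estimate RAM needed to run a quantized model, in GB.

    weights: params * bits / 8 (1B params at 8-bit is ~1 GB),
    plus a flat allowance for KV cache and the runtime itself.
    """
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# A 20B-parameter model at 8-bit quantization:
print(round(approx_model_ram_gb(20, 8), 1))  # 22.0 — tight, but fits in 32 GB
```

Dropping to 4-bit quantization roughly halves the weight footprint, which is why heavily quantized builds of larger models can still squeeze into the same machine.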

I'm trying not to buy the Nvidia DGX Spark because it's a first generation machine and it's locked into CUDA 😂
