Are these things that need a high end rig to run or could they run on your average consumer desktop?

Discussion

It would be good to have some serious compute, but these can be run on a typical “nice” consumer machine without being too much of a hog. You’ll want a C++ port of whatever is being run, though, because pure Python is just a monster of computational inefficiency.
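
For a sense of why the C++ port matters: the bulk of inference time in these models is spent in matrix-vector products like the one sketched below. This is a hypothetical, simplified example (not code from any particular tool); compiled, it becomes a tight native loop, while the same loop written in pure Python pays interpreter overhead on every multiply-add, which is where the order-of-magnitude gap comes from. Real runtimes also layer SIMD, quantization, and threading on top.

    // Simplified sketch of the kernel that dominates model inference: y = W * x.
    // W is stored row-major as rows x cols floats. Illustrative only.
    #include <cstdio>
    #include <cstddef>
    #include <vector>

    void matvec(const std::vector<float>& W,
                const std::vector<float>& x,
                std::vector<float>& y,
                std::size_t rows, std::size_t cols) {
        for (std::size_t r = 0; r < rows; ++r) {
            float acc = 0.0f;
            const float* row = &W[r * cols];      // pointer to row r of W
            for (std::size_t c = 0; c < cols; ++c) {
                acc += row[c] * x[c];             // one multiply-add per weight
            }
            y[r] = acc;
        }
    }

    int main() {
        const std::size_t rows = 4, cols = 8;
        std::vector<float> W(rows * cols, 0.5f), x(cols, 1.0f), y(rows);
        matvec(W, x, y, rows, cols);
        std::printf("y[0] = %f\n", y[0]);         // 8 * 0.5 * 1.0 = 4.0
        return 0;
    }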

With that said, I’m hoping to see a working release of Mojo (a new programming language that bridges Python and the underlying hardware), which at a glance looks like it will utterly change the game in this regard. Preliminary numbers look to be in the ~100x performance-improvement range without having to change the program/model at all.

So, in a nutshell:

• Right now, yes-ish: if you know what you are looking for and are OK with some minor trade-offs.

• In the near future, more yes. I expect more of the open-source tools to be ported to lower-level languages like C++.

• In 2 years, absolutely, without question, 100%. We will be able to run several of these tools in real time on a huge variety of hardware, while the models get both better and smaller at the same time. Use of current hardware will become vastly more efficient, AND all new processors will be optimized for AI (literally every chip manufacturer right now is optimizing their next generation for AI, all of them).

I think there will be a Moore’s Law on top of Moore’s Law when it comes to these AI tools: their growing capacity, their improving accuracy, and the falling cost of running them.