When you're running AI tools on your computer...


Discussion

186 threads. Impressive.

That's my editing software; Python was gobbling up the 36 GB of RAM, but it was "idling" during this screenshot. It hits about 40 GB when generating. I read that AI workloads are generally super multithreaded, but it hardly ever uses more than 10-15 threads, which I don't quite understand. I don't know enough about when and why more threads are beneficial to know what's going on exactly.
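One plausible explanation for the 10-15 busy threads: the heavy math usually runs in a worker pool sized to the CPU core count, not one thread per task, so most of a process's hundreds of threads sit idle. A minimal sketch (the environment variable names are real OpenMP/Accelerate knobs, but whether a given build of PyTorch or NumPy honors them depends on how it was compiled, and they must be set before the library is imported):

```python
import os

# Logical cores the OS reports; compute pools are typically capped near this.
print("cpu_count:", os.cpu_count())

# BLAS/OpenMP-backed libraries read these at import time to size their
# thread pools, which is why you rarely see more busy threads than cores:
os.environ["OMP_NUM_THREADS"] = "8"          # OpenMP worker pool
os.environ["VECLIB_MAXIMUM_THREADS"] = "8"   # Apple Accelerate framework
```

Setting these lower can also reduce contention when several models run at once.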

Python’s memory usage is insane. In the words of Guido van Rossum, “It’s pointers all the way down.”
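The overhead is easy to measure: every Python int is a full heap object (refcount, type pointer, value), and a list only stores pointers to those objects. A rough stdlib-only comparison against a packed C-style array:

```python
import sys
from array import array

n = 100_000

# A list of n ints: an 8-byte pointer per slot, PLUS a boxed int object
# (~28 bytes each on 64-bit CPython) for every element.
boxed = list(range(n))
list_total = sys.getsizeof(boxed) + sum(sys.getsizeof(x) for x in boxed)

# array('q') stores the raw signed 64-bit values contiguously, no boxing.
packed = array('q', range(n))
packed_total = sys.getsizeof(packed)

print(f"boxed list:   ~{list_total / 1e6:.1f} MB")
print(f"packed array: ~{packed_total / 1e6:.1f} MB")
```

On 64-bit CPython the boxed list comes out several times larger, which is exactly why NumPy and PyTorch keep tensors in packed C buffers and only wrap them in Python objects at the edges.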

Something about the inefficiency messes with me. I’ve moved most of my code to Lua or Rust.

Btw, what model are you running?

In this picture I'm only running Stable Diffusion with the automatic1111 interface.

If I remember correctly, Torch was originally written in Lua, but people seem to prefer coding in Python instead ☹️

You are remembering correctly. A lot of AI work used to be in Lua, including a lot of DeepMind's.

You a Mac guy or is this a hackintosh?

Mac M1 Max.

However, I'm migrating all of the AI stuff over to a custom Linux machine whenever I can order the CPU, but I still gotta get my damn money back from Amazon... ⏳

Ah okay.

Thanks for starting the AI Podcast btw.

Started playing around with GPT4All after listening to it. ☺️