Replying to Stuart Bowman

That would be so cool: an AI inside your laptop that you could grant keyboard/mouse control to, and that you could communicate with verbally, and it would understand what you were asking it to do in the context of what was currently on the screen. Basically, like you're always screen sharing with your AI helper. Since it would have access to everything, this would probably only be workable if you were running it locally or self-hosting. In any case, I wouldn't be surprised if UX converges on something like this in the near(ish) future.

Matt Lorentz 1y ago

We have to be just a few months away from this, right? LLMs are already training on screencasts and the like on YouTube, so they can probably understand what they're looking at on a modern OS. It just needs "mouse and keyboard" to be another language the model can output? I'm not sure what the technical term is.
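The "mouse and keyboard as another output language" idea is roughly what agent frameworks call structured tool or action output. A minimal sketch of how that could look, assuming the model emits one JSON action per step (the schema, field names, and handler functions below are all hypothetical, and OS input calls are replaced by stubs so the loop stays inspectable):

```python
import json

# Hypothetical action schema: the model emits one JSON object per step,
# e.g. {"action": "click", "x": 120, "y": 340}
# or   {"action": "type", "text": "hello"}.
# A real agent would forward these to an OS input library; here each
# action is routed to a stub handler that just records what it would do.

def handle_click(event, log):
    log.append(f"click at ({event['x']}, {event['y']})")

def handle_type(event, log):
    log.append(f"type {event['text']!r}")

HANDLERS = {"click": handle_click, "type": handle_type}

def dispatch(model_output: str, log: list) -> None:
    """Parse one line of model output and route it to an input handler."""
    event = json.loads(model_output)
    handler = HANDLERS.get(event["action"])
    if handler is None:
        raise ValueError(f"unknown action: {event['action']}")
    handler(event, log)

log = []
dispatch('{"action": "click", "x": 120, "y": 340}', log)
dispatch('{"action": "type", "text": "hello"}', log)
print(log)  # → ["click at (120, 340)", "type 'hello'"]
```

The point of the stub layer is that the model's "keyboard and mouse" vocabulary stays a small, validated schema rather than free-form text, which is what makes granting control tractable.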
