When OSes come with built-in local LLMs and expose them via internal APIs, things are going to get interesting.

Devs won’t need to ship entire AI models or rely on cloud APIs. Apps just need to be “AI-aware”: invoking an LLM could be just another system call.
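To make that concrete, here’s a minimal sketch in TypeScript. Everything in it is hypothetical: the `SystemLLM` interface and its `complete()` method are made-up names for an API no OS currently ships, and the mock implementation just makes the sketch runnable.

```typescript
// Hypothetical interface for an OS-provided local LLM. None of these
// names exist in any real OS today; they're stand-ins for the idea.
interface SystemLLM {
  complete(prompt: string, opts?: { maxTokens?: number }): Promise<string>;
}

// Mock implementation so the sketch runs; a real OS would route this
// call to its built-in local model instead.
const systemLLM: SystemLLM = {
  async complete(prompt: string): Promise<string> {
    return `[local model reply to: ${prompt.slice(0, 40)}...]`;
  },
};

// An "AI-aware" app: no bundled weights, no cloud endpoint, just a call
// against a capability the OS hands out like file or clipboard access.
async function npcLine(scene: string, playerStyle: string): Promise<string> {
  return systemLLM.complete(
    `You are an NPC in this scene: ${scene}\n` +
      `Reply in one line, matching the player's style: ${playerStyle}`,
    { maxTokens: 60 },
  );
}

npcLine("abandoned lighthouse at dusk", "sarcastic").then(console.log);
```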

Imagine a game where NPC dialogue is generated on the fly, tailored to the story and your personal quirks.

Or the help menu in your app is a chatbot that walks you through whatever problem you’re having with the app.

wdyt?