I think that within the next couple of years, OS-shipped #LocalAI will replace much of today's heavy cloud-based #AI. Microsoft, Google, and soon Apple will be shipping devices with local LLMs, and it'll be cheaper for applications to target those on-device APIs than to pay OpenAI and the like. It also means we'll end up in a sort of "browser wars" era, with model functionality gated by hardware vendors.
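A rough sketch of what that shift could look like for app code: prefer a free on-device model when the OS exposes one, and only fall back to a metered cloud endpoint otherwise. The `LocalLLM` interface, `cloudComplete` helper, and the endpoint URL below are hypothetical, invented for illustration; no vendor ships this exact API.

```typescript
// Hypothetical shape of an OS-provided on-device LLM API (illustration only).
interface LocalLLM {
  isAvailable(): Promise<boolean>;
  complete(prompt: string): Promise<string>;
}

// Placeholder for a paid, OpenAI-style cloud completion call (endpoint is made up).
async function cloudComplete(prompt: string): Promise<string> {
  const res = await fetch("https://api.example.com/v1/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  const data = await res.json();
  return data.text;
}

// Prefer the local model when present; fall back to the cloud API otherwise.
async function complete(prompt: string, local?: LocalLLM): Promise<string> {
  if (local && (await local.isAvailable())) {
    return local.complete(prompt);
  }
  return cloudComplete(prompt);
}
```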
