Gemini Nano running locally in your browser

https://glaforge.dev/posts/2024/08/07/gemini-nano-running-locally-in-your-browser/?linkId=10623010

Generative AI use cases usually involve running large language models somewhere in the cloud. However, with the advent of smaller and open models, you can now run them locally on your own machine, using projects like llama.cpp or Ollama.

originally posted at https://stacker.news/items/646550
