If you wanna try llama2 70B:

https://labs.perplexity.ai/

Discussion

How much fucking VRAM do you need to run this model?

Dunno, but here's a pure C Llama 2 implementation that runs crazy fast on CPU:

https://github.com/karpathy/llama2.c
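As for the VRAM question: a rough back-of-envelope is just parameter count times bytes per parameter. These are assumptions for the weights alone, ignoring KV cache and activations, not official requirements:

```python
# Rough VRAM estimate for Llama 2 70B weights only.
# KV cache + activations add more on top; figures here are
# back-of-envelope assumptions, not official requirements.
PARAMS = 70e9  # 70 billion parameters

for precision, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gib = PARAMS * bytes_per_param / 1024**3
    print(f"{precision}: ~{gib:.0f} GiB just for the weights")
```

So at fp16 you're looking at roughly 130 GiB for the weights, which is why people quantize down to int8/int4 or run on CPU with llama2.c-style setups.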

I just got mightily played by that llama!

Got all stoked that it could generate custom crossword puzzles, and then it came up with this:

https://void.cat/d/9CTzB2K962pYouTagTBCwd.webp