I still have to research more, but there's a package for running models optimized for CPU, the llama.cpp library (and then go from there....). I'm gonna throw that on my Mac in this case. Just a personal-needs kinda thing, to see what the heck is up with all this.

https://github.com/ggerganov/llama.cpp
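For anyone following along, getting started is basically clone, build, run. Treat this as a sketch rather than gospel: the binary names and flags have shifted between llama.cpp versions, and the model path below is a placeholder (you need to download a GGUF model separately).

```shell
# clone and build llama.cpp (the CPU build is the default)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# run a prompt against a local GGUF model
# (model path is a placeholder; -n caps the number of tokens generated)
./main -m ./models/your-model.gguf -p "Hello, world" -n 64
```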


Discussion

Nice. I’ve seen this before. I just didn’t realize a GPT-4-level model was available.

I prefer to think and not let AI do everything, but I'm curious, and, you know, it's a digital world 😅

I’m more interested in using things like LangChain to build context-specific LLM bots for Q&A. At work I’ve got a large corpus of cybersecurity data I’m trying to contextualize around an LLM. It would be dope for bitcoin content too.
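That kind of context-specific Q&A usually comes down to retrieval: index the corpus, pull the chunks most relevant to a question, and stuff them into the LLM prompt. A rough sketch of just the retrieval step in plain Python, with a toy bag-of-words similarity standing in for the real embeddings a LangChain retriever would use (the documents below are hypothetical examples):

```python
import math
from collections import Counter

def tokenize(text):
    # crude tokenizer: lowercase and strip basic punctuation
    return [w.lower().strip(".,!?") for w in text.split()]

def cosine(a, b):
    # cosine similarity between two bag-of-words Counters
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

# hypothetical stand-ins for a cybersecurity corpus
corpus = [
    "Phishing emails trick users into revealing credentials.",
    "Ransomware encrypts files and demands payment.",
    "Firewalls filter inbound and outbound network traffic.",
]

def retrieve(question, docs, k=1):
    # rank documents by similarity to the question, return the top k
    q = Counter(tokenize(question))
    scored = sorted(docs,
                    key=lambda d: cosine(q, Counter(tokenize(d))),
                    reverse=True)
    return scored[:k]

# the retrieved chunk would then be pasted into the LLM prompt as context
context = retrieve("How does ransomware work?", corpus)[0]
prompt = f"Answer using this context:\n{context}\n\nQ: How does ransomware work?"
print(context)
```

In a real setup the Counter-based similarity gets replaced by embeddings and a vector store, but the shape of the pipeline (retrieve, then prompt) is the same.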

Interesting. That is one frontier I still need to explore..... LLMs and AI in general. It would be cool if you document anything you run into 🤙