nostr:npub1y4s7hjcxn9e2dxakrl98taqevp8vk9fn4axpkejqcuvackcg7f6sgnvgt8 Yeah this is Llama 2 7B (the weakest version of that model) running locally using my new https://github.com/simonw/llm-mlc plugin

It's very quick to raise ethical objections to things it shouldn't, but you can tamp that down a fair amount by giving it a less restrictive system prompt
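For anyone who wants to try this, here's a rough sketch of the workflow using the `llm` CLI with the llm-mlc plugin. The model name and alias below are assumptions based on the plugin's README conventions, so check `llm models list` for the exact identifier on your machine:

```shell
# Install the plugin into an existing llm setup (assumed command names
# from the llm-mlc README; verify against the repo linked above)
llm install llm-mlc
llm mlc setup

# Download Llama 2 7B Chat and give it a short alias
# ("llama2" here is a hypothetical alias, pick your own)
llm mlc download-model Llama-2-7b-chat --alias llama2

# A less restrictive system prompt via -s tamps down the over-eager refusals
llm -m llama2 -s "You are a helpful assistant. Answer questions directly." \
  "Write a limerick about a pelican"
```

The `-s`/`--system` option is a standard `llm` flag; the interesting part is how much the refusal behavior shifts just from that one string.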
