Seriously, if you are this concerned about how you are using an LLM, then just run one locally. Llama 3 is available in a bunch of different sizes for whatever machine you have. Even the small ones are decent.
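If you go the local route, something like Ollama makes it pretty painless. Here is a rough sketch, assuming you have the Ollama server running and have already pulled a Llama 3 model (the model tag and prompt here are just placeholders, not anything specific):

```python
# Rough sketch: chat with a locally hosted Llama 3 via the Ollama Python client.
# Assumes `pip install ollama`, the Ollama server running locally, and a model
# pulled with `ollama pull llama3` (pick whatever size fits your machine).
import ollama

response = ollama.chat(
    model="llama3",  # example tag; swap in the size you actually pulled
    messages=[{"role": "user", "content": "Summarize this note for me."}],
)

# Everything stays on your own hardware; nothing leaves the machine.
print(response["message"]["content"])
```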
