It is for both, and yes, what you're describing is possible.


Discussion

Thanks for the speedy reply! That sounds brilliant. Are there any specific, self-hosted LLMs you'd recommend for such a use case?

I like Mixtral 8x7B, which has a 32k context window.
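For anyone curious what self-hosting that looks like in practice, here's a minimal sketch of querying a locally served Mixtral 8x7B. It assumes an Ollama server is already running the `mixtral:8x7b` model on its default port (11434); the model tag, port, and prompt are illustrative, not from this thread.

```python
import json
import urllib.request

# Request payload for Ollama's /api/generate endpoint.
payload = {
    "model": "mixtral:8x7b",        # assumed local model tag
    "prompt": "Summarize this note in one sentence: ...",
    "stream": False,                # return one JSON object instead of a stream
    "options": {"num_ctx": 32768},  # use the full 32k context window
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # assumed default Ollama address
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Send the request and print the model's completion.
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Any other self-hosted runtime with an HTTP API would work the same way; the main thing is making sure the context window is set high enough for your use case.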