prompt injection is still a big deal sadly… there’s actually dedicated models for summarization out there.

you could fine tune existing summarization models to take additional context input like reply chains via tokens

Reply to this note

Please Login to reply.

Discussion

Curator llms for doomer and whitepill feeds