The next big feature in https://contex.st will be vector storage for all content, making semantic search and LLM memory context natural.
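The retrieval idea behind vector storage can be sketched in a few lines. This is a toy illustration only (bag-of-words vectors and cosine similarity, standing in for a learned embedding model); it is not Contex.st's actual implementation:

```python
import math
from collections import Counter

def embed(text):
    # Toy "embedding": word counts. A real system would use a learned model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(store, query, k=2):
    # Rank stored notes by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(store, key=lambda doc: cosine(q, embed(doc)), reverse=True)
    return ranked[:k]

notes = [
    "vector storage makes semantic search natural",
    "grocery list: eggs, milk, bread",
    "LLM memory context from stored notes",
]
print(search(notes, "semantic search with vectors", k=1))
```

The same ranked results could then be injected into an LLM prompt as memory context, which is the pattern the feature enables.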

Alongside this, a prompt feature will let you specify which data should be included in an LLM chat session.

Together, these will let you ask agents to query your Contex.st to generate reports, compile statistics, and write research documents.

We need creative researchers and memelords to try it out and send us suggestions and feedback.
