Not sure. Another issue is streaming text via events; it's not great right now.


Discussion

That said, I'm not sure communication with LLMs over Nostr is fast enough, and I'm not sure we have anything to deal with the latency.

How does the AI manage session context? Does it reset with every message as a new session, or does it remember the recent past?

It depends on how much history you hand over. In principle, most models can take the conversation history as context (up to the length they can deal with). You can either forward the history or make one request at a time; it depends on what you want.
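A minimal sketch of the two approaches, with a stub `ask_llm` standing in for a real model call (the function name and the role/content message shape are assumptions, loosely following common chat APIs):

```python
def ask_llm(messages):
    # Stub standing in for a real model call; it just echoes how many user
    # turns it saw, so the difference between the two approaches is visible.
    return f"reply #{sum(1 for m in messages if m['role'] == 'user')}"

history = []

def chat_with_history(user_text):
    """Forward the whole conversation so the model sees the short past."""
    history.append({"role": "user", "content": user_text})
    reply = ask_llm(history)  # model receives all prior turns as context
    history.append({"role": "assistant", "content": reply})
    return reply

def chat_stateless(user_text):
    """One request at a time: every message starts a fresh session."""
    return ask_llm([{"role": "user", "content": user_text}])
```

With the stateful version, each call sees one more user turn; the stateless version always behaves like the first message of a conversation.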

Well, as a user of AI, I constantly need to start fresh. So, there must be some button somewhere to reset the context.

I often tell the conversation to give me a prompt that will allow me to start fresh based on "where we are and leave behind the conversation portions that have been abandoned." Sometimes this works. 🤞

I use simple commands to my bot like

/new (fresh session)

/session (list sessions)

Etc.
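Commands like these can be handled with a tiny dispatcher; the in-memory session store and the function name below are illustrative, not from any real bot:

```python
# Minimal sketch of slash-command handling for a chat bot. Anything that is
# not a recognized command falls through to the model.
sessions = []

def handle_command(text):
    if text.startswith("/new"):
        sessions.append([])  # fresh, empty conversation context
        return "started fresh session"
    if text.startswith("/session"):
        return f"{len(sessions)} session(s)"
    return None  # not a command: pass the text on to the LLM
```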

I think we need some kind of meta text that clients could render as a menu bar for talking to bots.
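Purely as an illustration of what such meta text might look like (nothing like this is specified in any NIP; every field name here is invented):

```python
# Hypothetical menu metadata a bot could publish and a client could render
# as a menu bar. The "kind" label and field names are made up for this sketch.
bot_menu = {
    "kind": "bot-menu",
    "commands": [
        {"cmd": "/new", "label": "Fresh session"},
        {"cmd": "/session", "label": "List sessions"},
    ],
}
```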

The one-second resolution of `created_at` is a real problem imo. It should have been nanoseconds, or at least microseconds.

Clients display conversations out of order when messages arrive within sub-second intervals, which happens often with bots that respond fast.

With higher time resolution we could also have a short-lived kind for streaming text, and ephemeral status messages for things like "Alice is typing".
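The ordering problem can be shown in a few lines. Two events created within the same second tie on `created_at`, so sorting by timestamp alone leaves them in arbitrary order; a client-side stopgap is a deterministic secondary key (the event id here is just an arbitrary tiebreaker, not a real ordering):

```python
# Two events from the same sub-second exchange: identical created_at values.
events = [
    {"id": "b2", "created_at": 1700000000, "content": "bot reply"},
    {"id": "a1", "created_at": 1700000000, "content": "user question"},
]

# Sorting by created_at alone cannot break the tie; adding a secondary key
# at least makes the rendered order stable across clients.
stable = sorted(events, key=lambda e: (e["created_at"], e["id"]))
```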