Global Feed Post
Replying to Machu Pikacchu

I started working on one but didn’t get far because… life.

The idea: you run a service that connects to a set of relays and streams all the notes from your web into Kafka. A consumer group then handles processing (e.g. sending each note to an LLM for classification, labeling, etc.). The processed notes get published to another topic with two consumer groups listening: one that streams to WebSocket connections and another that handles storage.
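
A rough sketch of what the processing stage could look like, assuming segmentio/kafka-go, a local broker, and made-up topic names (`notes-raw`, `notes-labeled`):

```go
package main

import (
	"context"
	"log"

	"github.com/segmentio/kafka-go"
)

// classifyNote stands in for the LLM call; it would attach labels to the note payload.
func classifyNote(ctx context.Context, note []byte) ([]byte, error) {
	return note, nil
}

func main() {
	ctx := context.Background()

	// Consumer group reading raw notes ingested from the relays.
	reader := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"}, // assumption: local broker
		Topic:   "notes-raw",                // assumption: topic name
		GroupID: "note-classifier",
	})
	defer reader.Close()

	// Writer publishing labeled notes for the websocket and storage consumer groups.
	writer := &kafka.Writer{
		Addr:  kafka.TCP("localhost:9092"),
		Topic: "notes-labeled", // assumption: topic name
	}
	defer writer.Close()

	for {
		msg, err := reader.ReadMessage(ctx)
		if err != nil {
			log.Fatal(err)
		}
		labeled, err := classifyNote(ctx, msg.Value)
		if err != nil {
			log.Printf("classification failed: %v", err)
			continue
		}
		if err := writer.WriteMessages(ctx, kafka.Message{Key: msg.Key, Value: labeled}); err != nil {
			log.Printf("publish failed: %v", err)
		}
	}
}
```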

I used Kafka so that eventually commercial relays could do this for many users. It was going to use langchain-go and Ollama by default, but you could obviously swap in ChatGPT or Claude if needed.
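
For the classification step itself, something like this is what I had in mind, assuming tmc/langchaingo's Ollama client (the model name and prompt are just examples):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/ollama"
)

func main() {
	ctx := context.Background()

	// Local Ollama model by default; using OpenAI or Anthropic instead
	// would only change this constructor.
	llm, err := ollama.New(ollama.WithModel("llama3")) // assumption: model name
	if err != nil {
		log.Fatal(err)
	}

	note := "gm nostr! check out my new lightning tutorial"
	prompt := fmt.Sprintf(
		"Classify this note into one topic label (tech, art, news, other):\n\n%s", note)

	label, err := llms.GenerateFromSinglePrompt(ctx, llm, prompt)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("label:", label)
}
```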

Another reason for using Kafka is that you can run many concurrent processors. For example, I planned to generate embeddings for each note and store them in a vector database so users could do search too.
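
The embedding processor would be another consumer group on the same topic. A minimal sketch, assuming langchaingo's CreateEmbedding helper on the Ollama client; the store call is just a stand-in for whatever vector database you pick:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms/ollama"
)

// upsertVector stands in for the vector-database client (Qdrant, pgvector, etc.).
func upsertVector(noteID string, vec []float32) error {
	fmt.Printf("stored note %s with %d-dim embedding\n", noteID, len(vec))
	return nil
}

func main() {
	ctx := context.Background()

	llm, err := ollama.New(ollama.WithModel("nomic-embed-text")) // assumption: embedding model
	if err != nil {
		log.Fatal(err)
	}

	notes := map[string]string{
		"note1": "gm nostr! new lightning tutorial is up",
		"note2": "sunset photo from the coast today",
	}

	for id, text := range notes {
		// CreateEmbedding returns one vector per input string.
		vecs, err := llm.CreateEmbedding(ctx, []string{text})
		if err != nil {
			log.Fatal(err)
		}
		if err := upsertVector(id, vecs[0]); err != nil {
			log.Fatal(err)
		}
	}
}
```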

MichaelJ 1y ago

Embeddings and vector search are the way to go, I think.
