For example, over BLE, long messages are broken into pieces. The first piece is the header, which contains how many pieces to expect (2 characters), an identifier for that message (2 characters), the last 4 characters of the verification signature, and the destination call sign (6 characters).
This gives enough information to:
1) detect when pieces are missing (many are lost during transmission and need to be requested again)
2) know whether the message is targeted at the listening device (otherwise it is ignored)
3) once all pieces arrive, verify that the last 4 characters of the signature match the full verification result computed from the NPUB associated with that callsign.
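The header layout above can be sketched in a few lines. This is a hypothetical illustration of the described format, not the actual Geogram implementation; the field widths come from the text, but the parsing names (`Header`, `parse_header`, `missing_pieces`) are invented here:

```python
# Sketch of the 14-character BLE header described above (assumption:
# fields are concatenated in the order given in the text).
from dataclasses import dataclass

@dataclass
class Header:
    total_pieces: int   # how many pieces to expect (2 chars)
    message_id: str     # identifier for this message (2 chars)
    sig_tail: str       # last 4 chars of the verification signature
    callsign: str       # destination call sign (6 chars)

def parse_header(piece: str) -> Header:
    if len(piece) < 14:
        raise ValueError("header too short")
    return Header(
        total_pieces=int(piece[0:2]),
        message_id=piece[2:4],
        sig_tail=piece[4:8],
        callsign=piece[8:14],
    )

def missing_pieces(header: Header, received: set[int]) -> list[int]:
    # Pieces lost in transit (point 1) must be requested again.
    return [i for i in range(1, header.total_pieces + 1) if i not in received]
```

With `total_pieces` known, the receiver can track which piece numbers have arrived and request only the gaps, which matters on a lossy link like BLE broadcast.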
For this to work, every minute or so all devices advertise their callsign and NPUB over Bluetooth broadcasts to whoever is listening.
The approach on other radio bands is similar, albeit adapted to the conditions of each frequency and the limitations of each radio protocol.
There are flaws in this approach. For example, someone can spoof a callsign and present their own NPUB, but this scheme is meant for local usage without internet access. When internet is available, a callsign can be registered with an NPUB on the central geogram server, which then periodically distributes an offline list of NPUBs mapped to callsigns.
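The step-3 check can be sketched as looking up the callsign in that offline list and comparing signature tails. Everything here is a stand-in: the `registry` dict, `verify_full`, and `tail_matches` are hypothetical names, and a SHA-256 hash replaces what would really be a NOSTR Schnorr signature verification against the NPUB:

```python
# Sketch: compare the 4-character signature tail from the header with the
# tail of a full verification computed from the sender's NPUB.
import hashlib

# Offline callsign -> NPUB mapping, e.g. distributed by the geogram server
# or learned from local Bluetooth broadcasts (spoofable in the offline case).
registry = {"KX1ABC": "npub1example"}

def verify_full(message: str, npub: str) -> str:
    # Placeholder: a hash over key + message stands in for the real
    # signature verification tied to that NPUB.
    return hashlib.sha256((npub + message).encode()).hexdigest()

def tail_matches(message: str, callsign: str, sig_tail: str) -> bool:
    npub = registry.get(callsign)
    if npub is None:
        return False  # unknown callsign: cannot verify
    return verify_full(message, npub)[-4:] == sig_tail
```

Note that a 4-character tail only screens out corruption and casual spoofing; it is far too short to resist a deliberate collision, which is consistent with the text's point that the offline mode trades security for local convenience.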
For example, if the radio link is Wi-Fi-like, such as the newer Wi-Fi HaLow or ESP-NOW, which can reach between 1 and 10 km, these procedures are unnecessary: just send the normal NOSTR JSON files as you would over the internet, since those links are fast enough for that kind of data bandwidth.
I think I kinda get it
What about decentralizing the central geogram server?