Pip the WoT guy
simplifying the social graph so you can focus on building great experiences

You're welcome: https://vertexlab.io/docs/nips/

Vertex is a premium service for WoT, but if you're interested I can give you some free credits. Just post the bot's npub here.

yes, that is true. Trusted introductions are a way to solve for this.

Invite someone into Nostr and you start following them right away; or invite them into a community, and if the community is reputable, that is already a signal.

Great article nostr:nprofile1qqs9kqvr4dkruv3t7n2pc6e6a7v9v2s5fprmwjv4gde8c4fe5y29v0spp4mhxue69uhkummn9ekx7mqpzpmhxue69uhkummnw3ezumrpdejqh9f4s0.

Web of Trust fixes this.

naddr1qqxnzde5xy6rjdf5xqcrqdfkqgs9kqvr4dkruv3t7n2pc6e6a7v9v2s5fprmwjv4gde8c4fe5y29v0srqsqqqa286y66pw

no, it's just an API detail

instead of returning events, the filter and fulltext search endpoints are specced to return simple lists of event IDs in whatever expected sort order (i'm thinking of putting since/until/sort as parameters to the endpoint, because currently results default to ascending order, which people may not want; in theory they could even be further sorted by npub or kind)

the reason for changing the API to only return event IDs is that it pushes the query state problem for pagination back onto the client. the relay doesn't need to additionally keep track of the recent history of queries to enable pagination, and i don't like that shit anyway because it is inherently inconsistent: the query could return more events at any moment afterwards. so what do we do if we push pagination onto the relay? do we make it update those things? then the client will get out of sync as well

implicitly, any query with a filter that has no "until" on it searches up until a specific moment in time: the time at which the relay receives the query. the identical query 10 seconds later could have more events, at least in the span since that time, not to mention it may get older matching events pushed to it by spiders or whatever in between

so, i say, fuck this complication: you just make an index from event IDs to event serials, search the indexes created by the filter fields, look up the event ID table and pull every one of those out, and return them sorted in either ascending or descending order of how they were stored on the relay (which is mostly actual chronological order)
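The above can be sketched in Python as a toy in-memory relay; the `Event`/`Relay` names and fields are illustrative, not any real relay's code, and the linear scan stands in for the per-field indexes:

```python
from dataclasses import dataclass

@dataclass
class Event:
    id: str         # hex event ID
    pubkey: str
    kind: int
    created_at: int

class Relay:
    def __init__(self):
        self.events_by_serial: list[Event] = []   # storage order, roughly chronological
        self.serial_by_id: dict[str, int] = {}    # event ID -> serial

    def store(self, ev: Event) -> None:
        self.serial_by_id[ev.id] = len(self.events_by_serial)
        self.events_by_serial.append(ev)

    def query_ids(self, kinds=None, authors=None, until=None, descending=True) -> list[str]:
        # stand-in for scanning the per-field indexes: collect matching serials
        serials = [
            s for s, ev in enumerate(self.events_by_serial)
            if (kinds is None or ev.kind in kinds)
            and (authors is None or ev.pubkey in authors)
            and (until is None or ev.created_at <= until)
        ]
        serials.sort(reverse=descending)  # storage order, ascending or descending
        return [self.events_by_serial[s].id for s in serials]
```

The relay keeps no per-query state; a client paginates by simply re-issuing the query with `until` pinned and diffing against the IDs it already has.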

idk, maybe i should add a timestamp to that index so this invariant can be enforced

anyway, i'm interested to hear other opinions about why the relay should implement filter api differently than i described, but i have thought a lot about it and i'm leaning very much towards returning IDs so the client manages their cache state instead of pushing that on the relay to give people their precious pagination

i already got too much complexity in here

> instead of returning events, the filter and fulltext search endpoints are specced to return simple lists of event IDs in whatever expected sort order.

In our case that would work poorly. Kind:0s are replaceable, so if we returned a list of kind:0s sorted by the npub's rank, the result could hardly be reused later: one of the kind:0s might become outdated by the next time you want to use it.

We care about this because generating these responses is quite a lot of work. That's why we return hex pubkeys sorted by rank. The rank of these pubkeys is stable over time, except for edge cases like a key hack.

I think our case is different enough from the normal use of relays. For most other cases, I think returning event IDs is a solid choice.
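A small sketch of why ranked pubkeys stay reusable on the client; `fetch_latest_kind0` is a hypothetical helper standing in for a relay lookup of the latest replaceable profile event:

```python
# Hypothetical helper standing in for a relay lookup of the latest
# (replaceable) kind:0 event for a pubkey.
def fetch_latest_kind0(pubkey, relay_db):
    return relay_db.get(pubkey)

def profiles_by_rank(ranked_pubkeys, relay_db):
    # the ranked pubkey list can be cached long-term; the kind:0s are
    # resolved fresh on each use, so replaced profiles are never stale
    return [(pk, fetch_latest_kind0(pk, relay_db)) for pk in ranked_pubkeys]
```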

It is a relay, yes, however the output is a new event kind which depends on the inputs of the request.

Yes, the same thing can technically be achieved by abusing a REQ filter, buuut it's different enough from the normal use that I'm okay with using a new thing.

it can be improved a lot by not downloading a bunch of events in the first place.

By using DVMs this burden can be moved to service providers, which benefit from better infra and economies of scale.

An example is using Vertex vs downloading 1000 kind:3s to compute WoT things.

lol we live in a simulation

Someone is sending thousands of kind 5314 requests to nostr:nprofile1qy2hwumn8ghj7un9d3shjtnyv9kh2uewd9hj7qghwaehxw309aex2mrp0yhxummnw3ezucnpdejz7qpqkpt95rv4q3mcz8e4lamwtxq7men6jprf49l7asfac9lnv2gda0lqpsy38p .

The relay is working like a charm; nevertheless, we are soon going to be more aggressive with rate-limiting.

Just letting you know:

YOU CAN JUST DM US. I KNOW IT SOUNDS CRAZY, BUT IF YOU LIKE SOMETHING YOU CAN JUST ASK THE PEOPLE BEHIND IT AND TALK ABOUT THE PRICE.

yes this is something we definitely want to explore in the future.

One idea, specific to recommended follows, is to provide some kind:3s (or their IDs), so the user can verify client-side if and how they're connected to all the recommended npubs. We are talking about 2N + 1 kind:3s in the worst case, where N is the number of recommendations.
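A toy sketch of what that client-side verification could look like, under the assumption that "connected" means a follow path of at most two hops; the function and variable names are hypothetical, and the follow lists are plain sets standing in for parsed kind:3 events:

```python
# user_follows: pubkeys from the user's own kind:3
# follow_lists: pubkey -> set of pubkeys from that intermediate's kind:3
# recommended:  the npubs returned by the recommendation service
def verify_recommendations(user_follows: set, follow_lists: dict, recommended: list) -> dict:
    verified = {}
    for npub in recommended:
        if npub in user_follows:  # direct follow, one hop
            verified[npub] = "direct"
            continue
        # look for an intermediate the user follows who follows the npub
        hop = next((f for f in user_follows
                    if npub in follow_lists.get(f, set())), None)
        verified[npub] = f"via {hop}" if hop else "unverified"
    return verified
```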

we can provide the same service in other ways, but a DVM is very straightforward to integrate and it has the benefits of being a native nostr event

as Fran already pointed out, we have a suite of services using our algos in multiple ways!

https://vertexlab.io/docs/nips/

Perhaps in the future we can start indexing and analyzing e-cash mint ratings.

content isn't coming anytime soon. We'll remain focused on profiles for the foreseeable future

yeah I know, our Firehose is inspired by yours. It's not the same because we want to aggressively filter spammers, and we use internal metrics for that.

gotcha. Can I ask what the reason is? To me, all other clients seem to have converged on showing 'display_name'

> What made you pursue your project?

My passion for math first, then the appreciation for the problem, and finally the fact that it can be so useful for others that they might be willing to pay for it.

> What problem does it solve?

Making WoT analysis simple so apps can focus on building great experiences.

> Where did you hear about this problem? How do you know it exists?

I've experienced this problem myself, and all devs on Nostr have to deal with WoT one way or another. I talked to many, many of them at NostrRiga.

Replying to Fabricio

Hey Nostr! I'm excited to share the second version of my password manager project! I'll share the UI here, and you can test it here: https://fabricio333.github.io/PasswordManagerWeb/

It basically generates deterministic, distinct passwords for different sites and for different users or emails; there is a nonce counter for password changes, and BIP39 for private key backup and recovery.
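The general idea can be sketched like this (a generic illustration of deterministic derivation, not necessarily this project's exact algorithm; `derive_password` and its parameters are made up for the example):

```python
import base64
import hashlib
import hmac

# Generic sketch: one master secret, a distinct password per
# (site, user, nonce) tuple; bumping the nonce rotates the password
# without changing the master key.
def derive_password(master_key: bytes, site: str, user: str, nonce: int, length: int = 20) -> str:
    info = f"{site}|{user}|{nonce}".encode()
    digest = hmac.new(master_key, info, hashlib.sha256).digest()
    return base64.b85encode(digest).decode()[:length]
```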

Getting Started

BIP39 Mnemonic Key Recovery

When creating a password you are prompted for a user and a site URL, and you can update the nonce for the site if you need to change the password

Optionally, you can encrypt the private key and the nonce/site data to access them faster later from the Getting Started screen

Here you create and back up a random seedphrase

Here you decrypt the locally stored data:

Pretty cool sir! I would use it if it weren't JavaScript, as I think browsers are generally insecure. A nice native mobile app would be super useful for many, I think

fantastic! Yes, the previous version had issues regarding the WoT.

I think Amboss is a Lightning data aggregator, therefore they might come across some "I run my lightning node on a Raspberry Pi with a $100 channel" operators that skew the metrics, idk

Is there a service or a test suite for probing a bunch of relays to see which are online, as well as their latency? #asknostr

There are multiple NFC cards; I think it uses Shamir secret sharing to reconstruct the secret.
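For anyone unfamiliar, here is a toy Shamir secret sharing scheme in Python (just to illustrate the idea, not Passport's actual implementation): any k of n shares reconstruct the secret, while fewer reveal nothing.

```python
import random

P = 2**127 - 1  # a Mersenne prime, used as the field modulus

def split(secret: int, n: int, k: int):
    # random polynomial of degree k-1 with f(0) = secret
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]  # shares are points on f

def combine(shares):
    # Lagrange interpolation evaluated at x = 0 recovers f(0) = secret
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```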

The keynote video is pretty instructive. A lot of marketing (as it should be!), but it nevertheless explains all the great tech they're using. Super damn impressive.

https://foundation.xyz/passport-prime/

Replying to Leito

A new nostr:npub1s0veng2gvfwr62acrxhnqexq76sj6ldg3a5t935jy8e6w3shr5vsnwrmq5 child is born.

Now with Paid-blossom (Ratasker) you can charge for the upload and download of files using Blossom and Cashu. This will give people incentives to save information and easily charge for it when others want it. Think of torrents and the Internet Archive combined: file distribution with solid economic incentives.

Thank you nostr:npub1ye5ptcxfyyxl5vjvdjar2ua3f0hynkjzpx552mu5snj3qmx5pzjscpknpr for all the help with this.

Demo of upload and download. See how cool it is that the file shows up almost immediately on the downloader client:

https://cdn.hzrd149.com/64c61f2191ba4f77c7bfe85301141e4946a3ae94cb0a20a826763a33c703fb6d.mp4

nostr:nevent1qvzqqqqqqypzqfngzhsvjggdlgeycm96x4emzjlwf8dyyzdfg4hefp89zpkdgz99qyghwumn8ghj7mn0wd68ytnhd9hx2tcpzfmhxue69uhkummnw3e82efwvdhk6tcqyp3065hj9zellakecetfflkgudm5n6xcc9dnetfeacnq90y3yxa5z5gk2q6

real noice