You’re absolutely right—using a social network for filtering content is a simple and effective solution. You follow people because you like their posts, and in a way, that’s a form of indirect trust. Plus, the functionality’s already there in most cases, and I know clients like iris.to used to do it. It works for now, no doubt.
But here’s the thing: while it’s great for quickly filtering out bots and randoms, it’s not perfect. The follow system is more about content preference than actual trust. Just because you follow someone for their spicy memes doesn’t mean you’d trust them with, say, medical advice or fact-checking. And that’s where it falls short—it doesn’t allow for a reputation system or any sort of "fact-checking" style rating on posts. It’s like giving every post the same level of credibility just because you follow the person, even though that credibility clearly varies by topic.
In a Web of Trust, new users (and bots) with zero reputation aren’t automatically filtered out—they're visible at first. But here’s the catch: bots will quickly be marked as untrustworthy by just a few people, and then they’ll be filtered for everyone else in the network. It’s like crowdsourced spam control—once a bot is flagged, it’s as good as invisible to the rest of us.
To add another layer of protection, relay servers could require proof of work before accepting posts from new accounts, enforce rules like per-IP rate limits, and so on, making it harder for them to spam endlessly.
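As a rough illustration, here is a minimal sketch of what such a relay gate could look like. The class and parameter names are made up for this example, not an actual relay API; the proof-of-work check counts leading zero bits of the event id, in the spirit of Nostr's NIP-13 difficulty measure.

```python
import time
from collections import defaultdict, deque

def leading_zero_bits(hex_id: str) -> int:
    """Count leading zero bits of a hex-encoded event id (NIP-13-style difficulty)."""
    bits = 0
    for ch in hex_id:
        v = int(ch, 16)
        if v == 0:
            bits += 4
        else:
            bits += 4 - v.bit_length()
            break
    return bits

class RelayGate:
    """Illustrative gate: per-IP sliding-window rate limit plus a
    minimum proof-of-work requirement for posts from new accounts."""

    def __init__(self, max_per_minute=10, min_pow_bits=16):
        self.max_per_minute = max_per_minute
        self.min_pow_bits = min_pow_bits
        self.recent = defaultdict(deque)  # ip -> timestamps of accepted posts

    def accept(self, ip: str, event_id_hex: str, is_new_account: bool, now=None) -> bool:
        now = time.monotonic() if now is None else now
        window = self.recent[ip]
        # Drop timestamps older than the 60-second window.
        while window and now - window[0] > 60:
            window.popleft()
        if len(window) >= self.max_per_minute:
            return False  # rate limit exceeded
        if is_new_account and leading_zero_bits(event_id_hex) < self.min_pow_bits:
            return False  # insufficient proof of work for a new account
        window.append(now)
        return True
```

In practice a relay would tune the window, the rate, and the difficulty for its own load; the point is only that both checks are cheap for the relay and expensive for a spammer.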
As for bots that trust each other—well, that’s not really a problem. In a Web of Trust, it’s not about the size of the network; it’s about who you trust. So, if bots are busy trusting each other, it doesn’t affect your network unless someone in your network starts trusting them. And since no one in your network is likely to trust a bot, those fake trust loops don’t impact you at all.
nostr:note1lpfnmn7wrlnrc6av3cusltw8fn74j809xu54uspef8skpg4z9ptquealft
Using the social graph as a Web of Trust isn’t ideal because it mixes two different things. The social graph is about who you follow and who follows you—connections, not trust. Just because you follow someone doesn’t mean you trust them with important info, and vice versa.
In my opinion, keeping trust separate from the social graph makes more sense. It lets you build a trust network based on actual trustworthiness, not just social ties. This way, new users aren’t automatically flagged as spammers but can earn trust through their actions, not just who they’re connected to.
Separating these systems ensures spam control while giving newcomers a fair shot to prove themselves.
nostr:note1fazuks709z7wg20u23l98jz2zr7wljsmnpd85fejyv4tq8ldaw2s0vyk7y
Not at all! It’s not about being "hidden forever" just because a few people don’t like your posts. In a Web of Trust, trust is subjective. So, if a few people distrust you, it only affects their network and the people who trust them. Your posts would still be visible to others who trust you or those who haven’t built trust with the people who flagged you. It's a decentralized system, so no single opinion or small group can wipe you out across the entire network—your visibility depends on the trust relationships you build.
In a network of trust with, say, 100,000 people, it only takes one or a few people to spot the bot and decide to distrust it. Once that happens, the rest of the network gets the signal that the account isn’t trustworthy, and most people won’t even see the spam. Instead of everyone having to individually check if the content is valid, the Web of Trust allows for the community to quickly filter out the bad actors. It’s like crowdsourced spam protection—one or two people deal with it, and the rest benefit from it.
#wot #weboftrust #reputation #trust
Here's the thing: in a Web of Trust, your trust choices don’t just affect you—they ripple out to the people who trust you too. So, if you just go with the popular "trusted users," you might not be doing yourself, or your network, any favors. It’s kind of like everyone following the same food critic. Sure, they might know the trendy spots, but that doesn’t mean you’ll like the same thing. Plus, the people who trust your judgment might not appreciate those mainstream picks either. It’s better to trust based on your own experiences. Build a network that reflects what really matters to you—it’ll be more genuine, and your peers will trust you more for it!
In the decentralized space, trust and reputation are the new currency for services, time, and attention.
#wot #weboftrust #reputation
How to score trust and reputation in a Web of Trust system
Trust and reputation play a big role in both the real world and online. Trust is something that grows over time through personal experiences. It can be good or bad, depending on how someone has acted in the past. If someone consistently shows they can be reliable and honest, we naturally start trusting them more. That trust becomes a kind of safety net, allowing us to engage with others without constantly worrying about getting hurt or deceived.
But trust doesn’t just appear—it’s built slowly, through repeated interactions and consistent behavior. And, of course, trust is completely subjective. It’s shaped by how we each see and experience the world, which means one person’s trust in someone might not match another’s.
Reputation, on the other hand, is more about how others collectively see you. It’s the sum of all those individual trust levels, but still seen through everyone’s personal perspectives. In a way, reputation feels like a community reflection of trust. Even though it might look like an objective score, it’s really built on a whole range of subjective experiences. That’s what makes the dynamic between trust and reputation so interesting—it's constantly evolving, just like our interactions with people every day.
Figuring out how to score someone with trust in a digital system is like trying to rate your friend’s cooking. Do you go with 5 stars, 10 stars, or maybe something super detailed like 0-100%? The reality is, trust is a tricky thing to measure. We all have our own way of deciding who we trust—some of it’s personal, some of it’s cultural, and none of it fits neatly into a simple rating system. What makes sense to one person might be completely different for someone else.
So, instead of trying to create a complicated scale, the best solution is to simplify: trust, neutral, or distrust. It’s like a thumbs-up or thumbs-down—you either trust someone or you don’t. No need to overthink it. This binary approach is easy to implement digitally, and it keeps things straightforward. Plus, it’s easier for the algorithms to handle. No one wants an algorithm having an existential crisis over whether 4 stars means "pretty good" or "just okay."
Once you’ve got this simple trust/distrust system in place, you can start adding more details if needed. For example, you can confirm certain facts or give 5-star ratings for products, if that’s relevant. But at its heart, the binary trust system keeps things easy to understand and manage, for both people and computers.
Reputation gets interesting because it’s all about perspective. In a decentralized system, there’s no universal score—it’s all subjective, calculated by each observer based on their view. The Web of Trust comes in by aggregating these individual perspectives, creating a broader sense of someone’s reputation.
The process is pretty simple: every time someone trusts the person in question, they get a +1. If they’re distrusted, it’s a -1. It’s really just a running tally of trust vs. distrust across the network.
When the system calculates reputation, it only considers opinions from the nearest degree of connections at which any opinion—trust or distrust—about the subject appears. The idea is that opinions from closer peers matter more, while those further out are less relevant and don’t get considered in the calculation.
This keeps the reputation system streamlined and ensures it reflects trust from those who are most relevant to you. After that, it’s up to you to decide how to interpret the score. For example, if the subject has 5 trust points and 2 distrust points, you might view them as generally trustworthy but still be cautious because some people in the network have expressed doubts. The system gives you the information, but how you weigh those points and act on them is entirely your call.
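One possible reading of this calculation, sketched in Python. The function name and data shapes are my own illustration, not an actual client API; the graph walks out degree by degree from the observer and stops at the first degree where any opinion about the subject is found.

```python
TRUST, NEUTRAL, DISTRUST = 1, 0, -1

def reputation(observer, subject, follows, opinions, max_degree=3):
    """
    Score `subject` as seen from `observer`.

    follows:  dict mapping a user to the set of users they trust (graph edges).
    opinions: dict mapping (rater, subject) -> TRUST or DISTRUST.

    Opinions are taken only from the nearest degree at which any opinion
    about the subject appears; more distant opinions are ignored.
    """
    frontier = {observer}   # degree 0 is the observer's own opinion
    seen = {observer}
    for _ in range(max_degree):
        score, found = 0, False
        for rater in frontier:
            op = opinions.get((rater, subject))
            if op is not None:
                score += op
                found = True
        if found:
            return score  # running tally of +1 trust / -1 distrust at this degree
        # Expand one degree outward through the trust edges.
        frontier = {v for u in frontier for v in follows.get(u, set()) if v not in seen}
        seen |= frontier
        if not frontier:
            break
    return NEUTRAL  # no opinion within range
```

With 5 raters choosing TRUST and 2 choosing DISTRUST at the deciding degree, this returns the 5 − 2 = 3 from the example above, and interpreting that number stays up to you.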
This simple scoring approach enables algorithms that automated systems can follow, leveraging human trust to make decisions. By doing so, systems can guard against misuse and information spamming while still respecting individual preferences. The prospects for the Web of Trust are immense—an untapped industry where information filtering no longer depends on centralized platforms, but instead focuses on earning the trust of individuals. In this decentralized world, trust becomes the key currency, shaping how we engage and filter the overwhelming flow of information around us.
#weboftrust #wot #reputation #dwotr
I have extended the iris.to Nostr client into a clone of the site with a reputation system on it (proof of concept). It works, but it is still fairly simple: https://dpeep.com.
You can trust people and posts, and the trust network extends 3 degrees deep. You need to create multiple accounts that trust each other to see the effects of the web of trust networks.
I'll write some posts about how the general system works later.
Provide useful content and resources to the system. It will take time and that's the point.
nostr:note1fazuks709z7wg20u23l98jz2zr7wljsmnpd85fejyv4tq8ldaw2s0vyk7y
I think that the governments might start cracking down on free speech by holding you personally accountable for any content found on your phone, computer, or server, regardless of who put it there. In this scenario, only licensed providers, such as major social media companies, would be exempt from this responsibility, as long as they comply with the regulations.
In the context of the Web of Trust for Nostr, I believe it's essential to first understand what trust really means.
In human relationships, I see trust as the belief or expectation that someone will act in a certain way, whether that action is positive or negative. When trust is positive, it means you rely on someone to act with integrity, honesty, and in your best interest. But trust can also be negative, where you expect someone to behave in a way that’s harmful, deceitful, or not in your favor, based on what you’ve seen from them before. In both cases, trust is about predicting behavior.
For me, trust—whether it’s positive or negative—is a crucial tool for security. It helps you anticipate how others will act, saving you time and effort. By trusting someone, you can focus on what’s important without needing to double-check every detail, using your understanding of that person to protect yourself from harm or to secure positive outcomes.
https://primal.net/e/note1fazuks709z7wg20u23l98jz2zr7wljsmnpd85fejyv4tq8ldaw2s0vyk7y
#WoT #DWoTR #WebOfTrust
First Principles and Tragedy of the Commons in context of Nostr
First principles thinking is a problem-solving approach that involves breaking down complex issues into their most fundamental elements and reasoning from these basic truths to build up a solution. Instead of relying on analogies or assumptions, first principles encourage a clear and foundational understanding of the problem.
The "Tragedy of the Commons" is a concept introduced by ecologist Garrett Hardin in 1968. It describes a situation in which individuals, acting independently according to their self-interest, overuse and deplete a shared resource, even though this behavior is ultimately detrimental to the entire group, including themselves.
When analyzing the Tragedy of the Commons from a first principles perspective, several fundamental truths emerge:
* The shared resource is finite and can be exhausted if overused.
* Individuals have an incentive to maximize their personal benefit, even if it harms the collective good.
* The commons are accessible to everyone, and it is difficult or impossible to exclude individuals from using them.
* Without regulation or an agreed-upon system of governance, there is no mechanism to prevent individuals from acting in their own self-interest to the detriment of the whole group.
Addressing the Tragedy Through First Principles
To address the Tragedy of the Commons using first principles thinking, we need to focus on the following fundamental elements:
* Create systems where individual incentives are aligned with the collective good.
* Implement ways to control access to the resource.
* Establish governance structures that enforce rules, manage the resource, and ensure compliance.
* Foster cooperation among users to ensure the long-term sustainability of the resource.
A Decentralized Web of Trust Reputation (DWoTR) System can address the Tragedy of the Commons by:
* Encouraging sustainable behavior through reputation scores, which are subjective and based on individual trust. People gain better reputations, and thus more opportunities, by acting responsibly according to the trust placed in them by others.
* Controlling access to shared resources through subjective reputation. If someone overexploits the resource, their reputation diminishes in the eyes of others, restricting their future access based on individual assessments.
* Decentralizing governance, with decision-making power distributed according to subjective reputations. This creates a transparent, community-driven management system where each person’s influence is shaped by how others perceive their trustworthiness.
* Fostering cooperation, as subjective reputation motivates individuals to act in ways that earn trust from others. This mutual trust reinforces responsible behavior and collective resource management.
In conclusion, a DWoTR system offers the governance needed for decentralized systems to function without falling victim to the Tragedy of the Commons. By using subjective trust and reputation, DWoTR aligns individual actions with the collective good, ensuring sustainable resource management and effective, community-driven governance.
https://primal.net/e/note1fazuks709z7wg20u23l98jz2zr7wljsmnpd85fejyv4tq8ldaw2s0vyk7y
#WoT #WebOfTrust #DWoTR #TragedyoftheCommons #FirstPrinciples
Yes, it was a simple but sufficient system. Once the digital content was multiplied enough times (copied), the network could handle any load. The initial sharing of just-downloaded parts also helped conserve scarce bandwidth resources.
BitTorrent is mostly a pull system, meaning you can't push data onto someone else's computer, unlike Nostr relays. I think this fact limited the need for a reputation system.
I'm actually exploring using GunDB by Mark Nadal for a p2p network, particularly for local Web of Trust calculations. It already comes with a built-in database system and p2p connection logic, so with a few tweaks—especially considering that events are immutable—it could work really well for what you’re describing. The aim isn’t to replace the relay system but to enable clients to share events directly with their peers or sync across multiple devices. I also think this approach could relieve some of the pressure on relay servers and help Nostr scale to a larger audience.
Can Damus and Nostur act as private storage for the user’s notes, replies and reboosts?
Nostur shows 270 days back and Damus scrolled to 1y (no more specific details).
Are notes deleted in the local storage or is it reasonable to expect that the accumulation of notes over time represents the full archive?
nostr:npub1xtscya34g58tk0z605fvr788k263gsu6cy9x0mhnm87echrgufzsevkk5s nostr:npub1n0sturny6w9zn2wwexju3m6asu7zh7jnv2jt2kx6tlmfhs7thq0qnflahe
Relays on Nostr do a great job of keeping your data accessible, but it’s important not to count on them to store everything forever. In a decentralized system like this, it's really up to you to take care of your own data. While your notes, replies, and reboosts might be available for a while, I’d recommend backing up anything important to make sure you don’t lose it down the road.
Reputation
Solving the Tragedy of the Commons problem in a decentralized system is challenging, especially when there’s no central authority to enforce rules, restrict usage, or punish individuals who misuse resources. In traditional environments, people usually need to provide something of value—whether it's money, time, or effort—in exchange for access to shared resources. This creates a natural check on overuse. However, in decentralized systems like Nostr, where there’s a desire to keep things open and accessible to all, implementing a payment-based solution can be problematic. Such solutions are often unpopular because they exclude those without the means to pay, which runs counter to the ethos of inclusivity.
But there's one resource everyone possesses that could be used as a stake: reputation. The idea is that access to resources can be tied to the reputation of the user. A good reputation allows greater access to shared resources, while a low reputation might limit one's ability to access these resources freely. Reputation acts as a form of currency within the system, but with a key difference—it’s earned through consistent, positive behavior over time, and can be lost quickly if one abuses the system. This asymmetry—where building reputation takes time and effort, but losing it can happen swiftly—creates a strong incentive for users to act responsibly. By linking resource access to reputation, the system can encourage sustainable usage without needing a central authority to enforce rules.
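A toy sketch of that asymmetry: reputation builds slowly through good actions, drops sharply on abuse, and gates access to a shared resource. The class, the numbers, and the threshold are all arbitrary assumptions chosen to illustrate the mechanism, not a proposed implementation.

```python
class ReputationStake:
    """Illustrative reputation-as-stake: slow to earn, quick to lose,
    and required above a threshold before shared resources open up."""

    def __init__(self, gain=1, penalty=10, access_threshold=5):
        self.gain = gain                        # small reward per good action
        self.penalty = penalty                  # large cost per abusive action
        self.access_threshold = access_threshold
        self.score = 0

    def good_action(self):
        self.score += self.gain

    def abuse(self):
        self.score -= self.penalty

    def can_access(self):
        """Access to the shared resource is earned, not granted by default."""
        return self.score >= self.access_threshold
```

Because the penalty outweighs the gain by an order of magnitude here, a single abusive act undoes many good ones, which is exactly the incentive structure described above.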
In essence, this approach leverages the natural human desire to maintain a good standing within a community. When users know that their actions directly impact their reputation, and by extension, their ability to access resources, they’re more likely to avoid behavior that would harm the network as a whole. This could be a powerful way to address the Tragedy of the Commons in decentralized systems, ensuring that the community self-regulates and preserves the integrity of shared resources.
nostr:note138a97epurfkt7kau3sffr3xu4fatt2529pu7pwjhs69hzjjud3yscdfu85
#WoT #WebOfTrust
Tragedy of the Commons in Nostr
Nostr—open, free, and full of potential—could be a game-changer. But with that freedom comes a classic challenge as old as shared resources themselves: the Tragedy of the Commons.
So, what is this tragedy all about? Imagine a communal pasture where everyone is free to graze their cattle. It’s great at first—plenty of grass for everyone. But if everyone keeps adding more cattle, thinking they’re just maximizing their benefit from the free resource, the pasture gets overgrazed. Eventually, it becomes barren, and nobody’s cattle have anything left to eat. This is the Tragedy of the Commons: when individual actions, driven by self-interest, deplete a shared resource to the detriment of everyone.
Now, let’s bring this back to Nostr. Nostr operates on a network of relay servers—small, decentralized servers that keep the network running. These servers are like that shared pasture. They’re open for anyone to use, which is fantastic, but they have limited resources: bandwidth, storage, and processing power. If everyone starts using Nostr without restraint—posting endlessly, spamming content, or overloading the servers—we risk overwhelming these relays. The result? Slower performance, a degraded user experience, and potentially, servers going offline. Just like the overgrazed pasture, the whole network suffers.
The free nature of Nostr means there’s little to stop someone from using as much of the network’s resources as they want. There’s no central authority regulating how much you can post or how many connections you can make. While this openness is part of Nostr’s appeal, it also makes the network vulnerable to abuse. If too many people take advantage without considering the impact on the whole, the system could become bogged down, making it less effective for everyone.
To prevent Nostr from falling into this trap, we need to think about how to manage these shared resources wisely. Some solutions include using classical anti-spam methods on the relay servers to limit the load, while others involve pay-for-services models. However, neither addresses the problem at its root. A promising solution is the Web of Trust concept, which, if implemented correctly, could help both the relays and the users.
To wrap it up, Nostr’s decentralized nature makes it vulnerable to the Tragedy of the Commons problem. But the Web of Trust concept could be the key to solving a lot of these issues. It has the potential to help keep the network running smoothly by encouraging more responsible use.
In this ongoing series of posts, I’ll share my thoughts on what the Web of Trust is and how it can tackle the problems that come with the Tragedy of the Commons. Stay tuned for more!
#wot #weboftrust #tragedyofthecommons
nostr:note168w0wjcukej0r9p6tsvtea0fz0tvw9cxpe0atvfmxvseqaa95skqgcw0sv