ElectronicMonkey
45c41f21e1cf715fa6d9ca20b8e002a574db7bb49e96ee89834c66dac5446b7a
Author of nostr blogging client: https://flycat.club/

I don't know about that. There has just been a recent trend: the settings of many crime and suspense novels, films, and TV shows have been gradually shifting from Chongqing to Northeast China.

Northeast literature has long worked with the motifs of the layoff wave and the state factory, but after watching The Long Season, the factory as a strange kind of institution still strikes me as utterly absurd.

In the eighties young people wrote poetry and played rock; in the nineties they surfed the web and wrote code. From then until now, what is the cool thing young people are doing?

During my time away I was caring for family in the hospital and busy with the Flycat redesign. I'll gradually be coming back to nostr 🤝

https://github.com/nostr-protocol/nips/pull/576 I want a repost feature for long-form articles (NIP-23 events), just like reposting short notes. If you care about this too, please review or join the discussion on this PR.
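As a rough illustration of what such a repost could look like, here is a minimal sketch assuming the NIP-18-style "generic repost" approach (kind 16) applied to a NIP-23 long-form article (kind 30023). The function name, ids, pubkey, and relay URL below are placeholders for illustration, not real events or a finalized spec.

```python
import json

def build_article_repost(article_id: str, author_pubkey: str, relay_url: str) -> dict:
    """Build an unsigned generic-repost event pointing at a long-form article."""
    return {
        "kind": 16,  # generic repost (NIP-18), rather than kind 6 used for kind-1 notes
        "tags": [
            ["e", article_id, relay_url],  # event id of the reposted article
            ["p", author_pubkey],          # original author's pubkey
            ["k", "30023"],                # kind of the reposted event (NIP-23 article)
        ],
        "content": "",  # a generic repost can leave content empty
    }

# Placeholder values, for illustration only:
repost = build_article_repost("a1b2c3", "deadbeef", "wss://relay.example.com")
print(json.dumps(repost))
```

The `k` tag is what lets clients distinguish an article repost from an ordinary note repost without fetching the target event first.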

Replying to nos

At [Nos.social](http://Nos.social) we believe that both freedom of speech and freedom of listening are required to build healthy social spaces. Moderation is a set of features that allow users to choose what they consume in social spaces.

Many users are put off by the term “moderation” because of the way it has been implemented on big social platforms. Typically it is performed by the corporation that owns the network, in conjunction with the government of the jurisdiction where that company operates. These two parties hold all the power over who gets to speak and who doesn’t. The rules are often unclear and unequally enforced, and there is little or no transparency or recourse for injustice.

On the other hand, places with little or no moderation tend to become very unhealthy, or comfortable only for a very specific group of people. Creating one space or one set of rules where everyone can feel comfortable is not possible, which is why our vision is one where thousands of communities, governed by and for the people, are nested in a larger social media commons.

Moderation in a decentralized network therefore must adhere to the following principles:

1. Users should be in ultimate control of the content they see.

2. Moderation activities should be transparent.

3. Relay owners must have the tools to comply with their consciences and local laws, while respecting the first two principles.

Our vision is not one moderation strategy to rule them all, but a collection of models that each user and community can apply as they see fit.

# Implementation

Moderation on Nostr will likely evolve in many directions, from complex webs of trust to more traditional designated moderators policing the health of their own communities. Our short-term plan at Nos is to build the simplest tools we can that allow moderation to happen according to the principles listed above. The core user stories for this effort are:

1. As a user I want to be able to report content that I find objectionable.

2. As a user I want to designate trusted moderators whose judgement my client respects.

3. As a user I want to be able to label my own notes with content warnings at the time of publishing.

4. As a moderator or relay operator, I want to be able to view reports and respond to them by approving or dismissing them.

5. As a user I want to see which notes moderators are censoring.
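To make user stories 1 and 3 concrete, here is a minimal sketch of how they might map onto existing NIPs: NIP-56 defines kind-1984 report events, and NIP-36 defines a "content-warning" tag that authors attach at publish time. The ids, pubkeys, and helper names below are hypothetical placeholders; the report-type value comes from the NIP-56 vocabulary.

```python
import time

def build_report(reported_pubkey: str, reported_event_id: str, report_type: str) -> dict:
    """Unsigned NIP-56 report event: flags an event and its author with a reason."""
    return {
        "kind": 1984,  # reporting event kind defined by NIP-56
        "created_at": int(time.time()),
        "tags": [
            ["p", reported_pubkey, report_type],   # who is being reported
            ["e", reported_event_id, report_type],  # which event is being reported
        ],
        "content": "free-text details for the moderator",
    }

def add_content_warning(note: dict, reason: str) -> dict:
    """Attach a NIP-36 content-warning tag so clients hide the note behind a prompt."""
    note.setdefault("tags", []).append(["content-warning", reason])
    return note

# Placeholder ids and pubkeys, for illustration only:
report = build_report("deadbeef", "a1b2c3", "spam")
warned = add_content_warning({"kind": 1, "content": "graphic scene", "tags": []}, "violence")
```

Because both reports and content warnings are ordinary events and tags, moderators, relay operators, and clients can all act on them without any central authority, which is what the principles above require.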

# Pitfalls

The two biggest risks to realizing our vision of decentralized content moderation are over-fragmentation of vocabulary and legal liability. Some networks (like Mastodon) have allowed freeform content warnings and content reports, which has put a massive burden on moderators. A classification system informed by real-world experience is necessary to make moderation possible at scale and to enable features like opting in or out of certain types of content.

In addition, a functioning moderation system is required for Nostr apps to abide by the Apple App Store and Google Play Store rules, and for relay owners to comply with laws in most countries around the world. If we want Nostr to work for a large part of humanity, it is paramount that we enable people and businesses to comply with these laws and guidelines where they are so inclined.

# Conclusion

We’re focused on adding the necessary features for basic decentralized moderation to our app Nos, as well as building a micro-app to allow any Nostr user to engage with this moderation system. We’re working with the community to standardize on a reporting format and vocabulary (see [NIP-68/69](https://github.com/nostr-protocol/nips/pull/457)). We’ve been in touch with experts from Trust and Safety teams at Twitter and Facebook to confirm that our system is feasible. We’d love to hear your feedback on our efforts as we continue our journey of building a global social media commons powered by Nostr.

this is a must-read

Actually they make a good point here. Nostr is not a first mover in social networking, so it has no such advantage; I agree. But the real takeaway is that social networking is not the killer app on Nostr. I believe we have more interesting things to build instead.

Spent a month in the hospital while working, so it's been a long time since I last checked nostr. How is everyone doing? Is there any news from the last couple of weeks that I missed?

I like the angle from which your article inspects nostr. Looking at nostr through a dev-stack lens is an interesting perspective. To me, nostr is really just a new way of doing things for all the unhappy developers around the world.

I really like this piece. I also read 《青年变革者》 last year, and I find myself involuntarily drawn back to this period of history again and again. For what the so-called intellectuals of that era in China were thinking about, see Zhang Hao's 《危机中的中国知识分子》 (Chinese Intellectuals in Crisis):

https://book.douban.com/subject/26797535/

You can make changes directly in the article; articles can be edited an unlimited number of times.