One of the most important criteria for evaluating a publishing system, whatever its type, is how it implements the basic operations of Create, Modify, and Delete. At the operating system level this is a standard we no longer even question. Yet for anything we put on the public network, these fundamentals deserve even closer scrutiny. Creating content is easy. Editing it, or deleting it completely (without a copy lingering somewhere), seems more like a luxury.

With centralized systems, one never knows whether a copy still survives somewhere. With decentralized systems, true deletion seems outright impossible, at least for now. That does not make this age-old principle obsolete, though.

The next big "system" would be a decentralized, uncensorable protocol that cleanly implements C-M-D (Create, Modify, Delete).
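To make the idea concrete, here is a minimal sketch of what a C-M-D protocol surface could look like. This is purely illustrative TypeScript; the interface and type names are my own, not taken from Nostr or any existing specification, and the Delete leg can only ever be a signed request that compliant peers choose to honor.

```typescript
// Hypothetical C-M-D protocol surface (illustrative names only).

interface Note {
  id: string;        // identifier derived from the signed content
  author: string;    // author's public key
  content: string;
  createdAt: number; // unix timestamp
}

interface PublishingProtocol {
  // Create: sign a new note and broadcast it to peers.
  create(content: string): Promise<Note>;

  // Modify: publish a revision that supersedes the old id;
  // well-behaved peers serve only the latest revision.
  modify(id: string, content: string): Promise<Note>;

  // Delete: broadcast a signed tombstone for the id; compliant peers
  // drop the content, but nothing forces every copy to disappear,
  // which is exactly the open problem described above.
  delete(id: string): Promise<void>;
}
```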


Discussion

First off, a disclaimer, so you Nostr fanboys don't immediately freak out: Nostr is definitely better than centralized legacy algo-media. That much is certain. But its only really cool innovation is zapping, not the decentralization. The internet is already decentralized on its own.

But beyond what I wrote in https://mslmdvlpmnt.com/nostrs-myth-of-decentralization-rediscovering-true-freedom-on-the-web/, one has to make significant concessions. ⬇️

nostr:note1v204yuxnh5822gpwpz2z69e8z89m4l799czjc6nu4u365nlc5zjqpwv2zu

Fun fact... every CMS, budgeting, and aggregator system I've built that operates at scale offers CRUD, but the delete is always a soft delete and the data is retained indefinitely for archival. I assume other enterprise systems over the past few decades share this trait.
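For readers who haven't run into the pattern: a soft delete just flags the row and hides it from normal reads, while the data itself stays around. A minimal sketch of the idea (illustrative TypeScript, not code from any of the systems mentioned above):

```typescript
// Soft delete in a nutshell: "delete" only stamps deletedAt,
// normal reads filter the flagged rows out, and an archival
// query can still return everything.

interface StoredRecord {
  id: string;
  payload: string;
  deletedAt: number | null; // null = live, timestamp = soft-deleted
}

class SoftDeleteStore {
  private rows = new Map<string, StoredRecord>();

  create(id: string, payload: string): void {
    this.rows.set(id, { id, payload, deletedAt: null });
  }

  // What the user experiences as "delete".
  softDelete(id: string): void {
    const row = this.rows.get(id);
    if (row) row.deletedAt = Date.now();
  }

  // Normal reads hide soft-deleted rows.
  get(id: string): StoredRecord | undefined {
    const row = this.rows.get(id);
    return row && row.deletedAt === null ? row : undefined;
  }

  // The archival view: nothing is ever really gone.
  listAll(): StoredRecord[] {
    return Array.from(this.rows.values());
  }
}
```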

Just one filename: robots.txt