stake is not a security mechanism; it's a concurrency mutual-exclusion lock that lets 20-100 nodes take turns in the role of authoritative replica

bitcoin solves this problem, but it can't do better than 1 MB every 10 minutes, and the simple fact is that this means only one blockchain can be secure with more than 100 nodes in the consensus

only bitcoin is decentralised, and it can only be a notary for one small monetary ledger for the whole internet


Discussion

A blockchain has to solve immutability and double spending; it is just a synchronized history of transactions. It just needs rules written in code; there is no hard dependency on something called proof of work, which is just one idea. Many nodes can sync and accept the longest chain without proof of work. Proof of work is just proof of money, not proof that someone has the brains or intelligence to protect the chain's security.

proof of work and the nakamoto heaviest-chain-wins consensus enable a fully decentralized distributed database

that is, the number of candidates to be "leader" of the network is basically unlimited, and up to 50% of the network participants can be MALICIOUS

compare that with the Practical Byzantine Fault Tolerance (PBFT) 2/3 consensus mechanism: it can only tolerate 1/3 of the nodes being MALICIOUS, and it cannot scale beyond about 100 nodes before the cost of communication, which grows quadratically with the node count, shoots upward like a classic hockey stick
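The scaling claim above can be made concrete with a small sketch. PBFT tolerates f Byzantine replicas out of n = 3f + 1, and its prepare and commit phases involve all-to-all messaging, so per-round message counts grow roughly quadratically. The helper names and the simplified two-phase message count below are illustrative assumptions, not the exact protocol accounting:

```python
# Sketch: why PBFT-style consensus stops scaling around ~100 nodes.
# PBFT tolerates f faulty replicas out of n = 3f + 1, and each round
# involves all-to-all messaging, so message counts grow ~quadratically.

def pbft_tolerated_faults(n: int) -> int:
    """Maximum Byzantine replicas a PBFT cluster of n nodes tolerates."""
    return (n - 1) // 3

def pbft_round_messages(n: int) -> int:
    """Rough message count for one prepare+commit round (two
    simplified all-to-all phases among n replicas)."""
    return 2 * n * (n - 1)

for n in (4, 20, 100, 1000):
    print(n, pbft_tolerated_faults(n), pbft_round_messages(n))
```

Going from 100 to 1000 nodes multiplies the per-round traffic by roughly 100x, which is the "hockey stick" in practice.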

it is highly specious to call this mechanism decentralized compared to nakamoto consensus, which is based on proof of work with hash difficulty adjustment

nakamoto consensus is orders of magnitude more decentralized

it's like comparing a small town of 3000 people with tokyo and saying "they are both cities"

there is in fact no practical limit to the number of participants in the bitcoin consensus; we are up to about 1/4 of the hash bits being zeros, and each additional zero bit doubles the work required

bitcoin consensus could include the entire world population and still have room for bazillions more
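The doubling claim above follows directly from the math of uniform hashes: a hash with n leading zero bits occurs with probability 2^-n, so each extra zero bit halves the fraction of qualifying hashes and doubles the expected work. A minimal sketch:

```python
# Sketch: each additional leading zero bit in the proof-of-work target
# halves the fraction of hashes that qualify, i.e. doubles the expected
# work (assuming uniformly distributed 256-bit hashes).

def expected_hashes(zero_bits: int) -> int:
    """Expected number of hash attempts to find a hash with the given
    number of leading zero bits."""
    return 2 ** zero_bits

# 64 leading zero bits is ~1/4 of 256; one more bit doubles the work.
print(expected_hashes(65) // expected_hashes(64))  # -> 2
```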

Sometimes I think about consensus parameters, and it seems one can change them, but not by much.

The 10 min block time could have been 5 min, or 20 min, but not much different from that.

The 1 MB block size limit is now effectively up to 4 MB (SegWit's block weight limit).

The thing that cannot be changed is the difficulty-adjusted Proof of Work. To me that's like a new primitive, a new concept, like the circle. One cannot improve on the circle; you can only discover it.

the difficulty adjustment is a statistical interpolation mechanism, and there are better ways to do it than the simple ~2-week, 2016-block cycle, such as a PID controller mechanism, but those are easy to get wrong: the Verge blockchain, for example, used a continuous adjustment scheme tied to a time consensus, and attackers eventually gamed the clock with spoofed timestamps

the one that satoshi chose, a simple proportional adjustment, works; it just undershoots and overshoots, and thus it needs a long cycle time to narrow its error rate
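That proportional rule can be sketched in a few lines. This is a simplification, assuming a floating-point difficulty; the real consensus code operates on the compact 256-bit target, but it uses the same ratio and the same 4x clamp over 2016-block windows:

```python
# Sketch of Bitcoin's proportional retarget rule (simplified:
# floating-point difficulty instead of the compact 256-bit target).

TARGET_SPACING = 600     # seconds per block (10 minutes)
RETARGET_BLOCKS = 2016   # blocks per adjustment window (~2 weeks)

def retarget(old_difficulty: float, actual_window_seconds: float) -> float:
    expected = TARGET_SPACING * RETARGET_BLOCKS   # 1,209,600 s
    ratio = expected / actual_window_seconds
    ratio = max(0.25, min(4.0, ratio))            # clamp to [1/4, 4]
    return old_difficulty * ratio

# Blocks arrived twice as fast as intended -> difficulty doubles.
print(retarget(100.0, TARGET_SPACING * RETARGET_BLOCKS / 2))  # -> 200.0
```

The clamp is what makes the controller purely proportional and slow: a sudden 10x hashrate change takes several windows to track.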

i built a simulator for these things back in 2021 and experimented with parameters to find optimal numbers for a PI-based scheme that either moved smoothly or stayed stable against the actual hashpower on the network, but i didn't figure out how to make the derivative parameter work

i was doing some more reading on PID control and learned that the derivative term needs a filter: the derivative already acts as a high-pass on the error signal, so pairing it with a filter keeps the usable band and suppresses measurement noise, and that probably would have narrowed the adjustment to within about 10% of an accurate measurement of the real hash power at any given moment
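A minimal sketch of that idea, in the spirit of the experiment described above. The gains and the smoothing factor are illustrative placeholders, not tuned values; the error would be something like observed minus target block time, and the output a nudge to difficulty:

```python
# Hedged sketch: a PID controller with a first-order filter on the
# derivative term. Gains (kp, ki, kd) and alpha are placeholders.

class PIDController:
    def __init__(self, kp: float, ki: float, kd: float, alpha: float = 0.2):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.alpha = alpha        # smoothing factor for the derivative
        self.integral = 0.0
        self.prev_error = 0.0
        self.d_filtered = 0.0

    def update(self, error: float, dt: float) -> float:
        self.integral += error * dt
        raw_d = (error - self.prev_error) / dt
        # Filtered derivative: the D term is a high-pass on the error,
        # so the filter band-limits it and suppresses noise spikes.
        self.d_filtered += self.alpha * (raw_d - self.d_filtered)
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * self.d_filtered)
```

With kd = 0 this reduces to the PI scheme from the simulator; the filter is only needed once the derivative term is switched on.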

Interesting!

Yeah, thermostats can be very stupid and still work, and so does Bitcoin.

overshooting and undershooting a bit is not such a big deal.

What's important is to have a mechanism that eventually will reach the target.

yeah, a slow thermostat works well enough

adding the integral and derivative factors to the computation matters more with a high dynamic range; this is the thing about bitcoin's mining hashrate: the bigger it gets, the slower it changes, so the adjustment error % gets smaller over time