entropy is the hard-to-predict information that emerges from an exchange

sometimes it really can be called "negentropy" because very occasionally shuffling a system causes order to increase
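
a toy sketch of that claim, assuming a crude stand-in for "order" (count of adjacent matching symbols — not a real entropy measure): start from a partly ordered string and shuffle it many times; most shuffles destroy order, but a small fraction increase it

```python
import random

def order_score(seq):
    # crude "order" proxy: how many adjacent symbols match
    return sum(a == b for a, b in zip(seq, seq[1:]))

random.seed(1)
start = list("AABBCCAABBCC")      # partly ordered starting point (score 6)
baseline = order_score(start)

ups = downs = same = 0
for _ in range(100_000):
    shuffled = start[:]
    random.shuffle(shuffled)
    s = order_score(shuffled)
    if s > baseline:
        ups += 1                  # the rare shuffles where order increased
    elif s < baseline:
        downs += 1
    else:
        same += 1

print(f"order increased: {ups}, decreased: {downs}, unchanged: {same}")
```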

this pattern is what taleb calls "antifragile"

only materialists who disregard the presence of information and its influence on physics don't get this

it's central to understanding what the very notion of "economic growth" actually means, not the stupid NGU of fiat but the actual reduction of inefficiencies

negentropy is the general category of all things that have value and what mckenna was calling "novelty"

Discussion

Pseudoscience IMHO, but then again, there is no point in arguing when neither of us is open to change 🐶🐾🫡🫂

if you understood how AI proximity hash recognition systems worked you would get what i'm talking about
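
(the note doesn't name a specific system, but one concrete example of a "proximity hash" is random-hyperplane locality-sensitive hashing, SimHash-style: nearby inputs land on nearby hash codes, so relational structure in the data survives the compression — a minimal sketch, with all names and parameters chosen purely for illustration)

```python
import numpy as np

rng = np.random.default_rng(42)
DIM, BITS = 64, 16
planes = rng.normal(size=(BITS, DIM))   # each row is a random hyperplane

def lsh_code(vec):
    # one bit per hyperplane: which side of the plane the vector lies on
    return tuple((planes @ vec) > 0)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

base = rng.normal(size=DIM)
near = base + 0.05 * rng.normal(size=DIM)   # slightly perturbed copy of base
far = rng.normal(size=DIM)                  # unrelated vector

print("near vs base:", hamming(lsh_code(near), lsh_code(base)), "bits differ")
print("far  vs base:", hamming(lsh_code(far),  lsh_code(base)), "bits differ")
```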

shuffling a system does not always lower its structure, or there would be no order even at the nanoscale

structure that builds engines for gathering more structure tends to gradually accumulate systems that conserve structure and eventually find ways to increase it

this in no way invalidates the point that energy is lost in this process