Entropy as a Measure of System Complexity

https://www.youtube.com/watch?v=DxL2HoqLbyA

I was checking out this nostr app and saw the complexity chat and it reminded me of one of the points from this Veritasium video (linked).

Entropy is related to three fields of study:

1. Physics

2. Information Theory

3. Statistics

Entropy can be described as a metric of surprise in a dataset, or as the tendency of a state space to move toward disorder, randomness, or uncertainty. I typically use Shannon entropy to find the most surprising features of a dataset, but I had also always assumed that a higher-entropy state would be associated with higher complexity.
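As a quick sketch of the "surprise" framing, here is a minimal Shannon entropy calculation over the empirical distribution of a dataset (a hypothetical helper, not from the video): a constant dataset carries no surprise, while a uniform one maximizes it.

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy (in bits) of the empirical distribution of `data`."""
    counts = Counter(data)
    n = len(data)
    # Sum of -p * log2(p) over each observed symbol's relative frequency.
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 bits: no surprise at all
print(shannon_entropy("abab"))  # 1.0 bit: a fair coin's worth
print(shannon_entropy("abcd"))  # 2.0 bits: uniform over four symbols
```

The more evenly the probability mass is spread, the higher the entropy, which is why uniform randomness is the "most surprising" distribution.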

Towards the end of the linked video it is demonstrated how physical systems evolve from low entropy toward high entropy. The interesting point is that both extremes, the fully ordered low-entropy state and the fully mixed high-entropy state, are the points of least complexity, while the intermediate states in between are where complexity peaks :) .
