Tag: entropy

A Definition of Entropy Using LEGO Bricks

The best definition of entropy comes from Claude Shannon's theory of information: ‘the number of equi-probable, equivalent states of a system’.
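Strictly speaking, Shannon's measure is the logarithm of that count: a system with N equally likely states has entropy log2(N) bits. A minimal sketch in Python (the helper name and the coin/die examples are illustrative, not from the post):

```python
import math

def entropy_equiprobable(n_states: int) -> float:
    """Shannon entropy, in bits, of a system with n_states equally likely states."""
    return math.log2(n_states)

print(entropy_equiprobable(2))  # a fair coin: 1.0 bit
print(entropy_equiprobable(6))  # a fair die: ~2.585 bits
```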

A bag full of LEGO bricks has a lot of entropy because there are many equivalent states or configurations (disorder).

A house built with those very same bricks has low entropy because there are (relatively) few ways to arrange the bricks into that particular house (order).

When building the little house, the entropy of the bricks has decreased. But the total entropy of the Universe has actually increased, because the kid who built the house dissipated some heat into the air.…
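To put rough numbers on the bricks, here is a small Python sketch; the two state counts are made-up, purely illustrative figures, chosen only so that the bag vastly outnumbers the house in configurations:

```python
import math

# Made-up, purely illustrative state counts.
W_BAG = 10**30   # hypothetical number of configurations of the loose bricks
W_HOUSE = 10**3  # hypothetical number of arrangements that still count as "the house"

# Entropy as the log of the number of equiprobable states, in bits.
print(math.log2(W_BAG))    # ~99.66 bits: many states, high entropy
print(math.log2(W_HOUSE))  # ~9.97 bits: few states, low entropy
```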

Information, Entropy and Life

Shannon discovered that information can be used as a measure of entropy and probability. This is a physical, quantitative definition: it deals with how much information a system can carry, not with the meaning of that information.

The more information a system carries, the less entropy it contains; this also happens to be the least probable state of the system. Likewise, the most probable state of a system is noise, which carries little information.
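As a concrete sketch of that claim, the snippet below compares the Shannon entropy H = -Σ p·log2(p) of a noise-like uniform distribution with that of a highly ordered, peaked one (the two distributions are illustrative assumptions, not data from the post):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

noise = [1 / 8] * 8            # uniform: the most probable, noise-like state
ordered = [0.93] + [0.01] * 7  # peaked: an ordered, improbable state

print(shannon_entropy(noise))    # 3.0 bits (the maximum for 8 symbols)
print(shannon_entropy(ordered))  # ~0.56 bits
```

The flat, noise-like distribution hits the maximum entropy of 3 bits, while the peaked, ordered one falls far below it; in this sense, the ordered, information-rich states are the improbable ones.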

Life has a lot of information in it, and this makes life a very unlikely event (its probability tends to zero). The Universe, on the other hand, has tons of time to ‘play’ (tends to infinity).…