Entropy is the number of ways you can arrange the littlest parts of a system and still get the same big system (1). Put more technically, entropy is a measure of how many different ways you can rearrange the atoms of an object and still have it look pretty much the same.

For example, a bag full of #Lego bricks has a lot of entropy because there are many equivalent states or configurations (disorder). The greater the #randomness in a system, the greater the entropy. But a house built with those very same bricks has low #entropy because there are (relatively) few arrangements of the bricks that look like that house (order).
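Here is a toy sketch of that counting in Python. The numbers are made up for illustration (10 distinguishable bricks, 20 possible slots, and an assumed 4 placements that count as "the house"); it is not a real physical model, just a way to see how many more microstates the "bag" macrostate has than the "house" one.

```python
import math

# Toy model (illustrative assumptions, not real thermodynamics):
# 10 distinguishable bricks can each occupy one of 20 slots.
bricks = 10
slots = 20

# "Bag" macrostate: any placement of the bricks still looks like a bag,
# so every ordered placement counts as an equivalent microstate.
bag_microstates = math.perm(slots, bricks)

# "House" macrostate: only the few placements matching the blueprint count.
# We simply assume there are 4 such placements (e.g. small symmetries).
house_microstates = 4

# Boltzmann-style entropy: proportional to the log of the microstate count.
bag_entropy = math.log(bag_microstates)
house_entropy = math.log(house_microstates)

print(f"bag:   {bag_microstates:,} microstates, entropy ~ {bag_entropy:.1f}")
print(f"house: {house_microstates:,} microstates, entropy ~ {house_entropy:.1f}")
```

The exact numbers don't matter; what matters is the huge gap between the two counts, which is exactly the gap between "disorder" and "order" above.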

In building the little house, the entropy of the bricks has decreased. But the total entropy of the Universe has actually increased, because the kid who built the house dissipated some heat into the air.
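A rough back-of-the-envelope sketch of why the books still balance, using the classical relation that heat Q dumped into an environment at temperature T raises its entropy by Q/T. The amount of heat here is an assumption picked only for illustration:

```python
# Rough sketch with assumed numbers (not measured from any real kid):
Q = 1.0             # joules of heat dissipated while building (assumption)
T = 300.0           # room temperature in kelvin (assumption)
k_B = 1.380649e-23  # Boltzmann constant, J/K

delta_S_env = Q / T                 # entropy gained by the air, in J/K
in_units_of_kB = delta_S_env / k_B  # the same gain counted in units of k_B

print(f"Environment gains {delta_S_env:.4f} J/K "
      f"(~{in_units_of_kB:.2e} k_B) -- vastly more than the bricks lost.")
```

Even a single joule of body heat buys the air on the order of 10^20 units of entropy, dwarfing the tiny drop the bricks underwent by being ordered.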

The same happens with water molecules when they become the starry geometry of ice. Geometry is certainty: the ordered crystal has less entropy than the liquid, because as the molecules settle into order they give up energy to the environment.

There are only two ways to fight back against entropy: prevent the dispersion of energy, or inject additional energy to undo the dispersion. E.g. you could always keep your desk in order by isolating it and never using it, or you could be messy and tidy it up at the end of every day (add energy).

(1) The best definition of entropy is given by Claude Shannon in his theory of #Information: ‘the number of equi-probable, equivalent states of a system’.
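To make footnote (1) concrete, here is a minimal sketch of Shannon's entropy formula, H = -Σ p·log2(p). When all W states are equi-probable it reduces to log2(W), so counting equivalent states and measuring information uncertainty are the same idea. The example probabilities are assumptions chosen for illustration:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# W equally likely ("equi-probable, equivalent") states: entropy is log2(W).
W = 8
uniform = [1 / W] * W
print(shannon_entropy(uniform))           # 3.0 bits, i.e. log2(8)

# A more "ordered" system, where one state is almost certain, has less entropy.
skewed = [0.93] + [0.01] * 7
print(round(shannon_entropy(skewed), 2))  # ~0.56 bits, much less than 3
```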