The best definition of entropy comes from Shannon's information theory (in 'A Mathematical Theory of Communication'): roughly, entropy measures the number of equiprobable, equivalent states a system can be in. For N equally likely states, the entropy is H = log₂ N bits, so more possible configurations means more entropy.
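As a minimal sketch of that definition (plain Python, no external libraries), here is Shannon's general formula H = -Σ p log₂ p; for a uniform distribution over N states it reduces to log₂ N:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For N equiprobable states, H reduces to log2(N):
n = 8
uniform = [1 / n] * n
print(shannon_entropy(uniform))   # 3.0 bits == log2(8)

# A skewed distribution over the same 8 states has lower entropy:
skewed = [0.93] + [0.01] * 7
print(shannon_entropy(skewed))    # ~0.56 bits
```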

A bag full of Lego bricks has high entropy because there are many equivalent states, or configurations, the bricks can be in (disorder).

A house built with those very same bricks has low entropy because there are (relatively) far fewer arrangements of the bricks that count as that house (order).
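To make the comparison concrete, here is a toy calculation; the state counts are invented for illustration, not measured. Entropy in bits is log₂ of the number of equivalent configurations, so the bag dwarfs the house:

```python
import math

# Illustrative, made-up state counts: suppose 20 distinct bricks can be
# arranged in any order in the bag (20! orderings), while only a handful
# of arrangements count as "the house".
w_bag = math.factorial(20)   # ~2.4e18 equivalent "bag" states
w_house = 4                  # hypothetical: 4 arrangements look like the house

print(math.log2(w_bag))      # ~61.1 bits of entropy for the bag
print(math.log2(w_house))    #  2.0 bits for the house
```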

When building the little house, the entropy of the bricks decreased. But the total entropy of the Universe actually increased, because the kid who built the house dissipated heat into the surrounding air, and that heat raised the entropy of the environment by far more than the bricks lost.
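A back-of-the-envelope check of that claim (all numbers below are assumptions for illustration): the environment gains entropy ΔS = Q/T from the dissipated heat, while the bricks lose only k_B ln(W_bag/W_house), which is negligible by comparison:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical numbers: the kid dissipates ~50 kJ of body heat into
# room-temperature air while building.
q_heat = 50_000.0   # joules (assumed)
t_air = 300.0       # kelvin (assumed)
ds_environment = q_heat / t_air   # ΔS = Q/T ≈ +167 J/K

# Entropy lost by the bricks, converting the same toy state counts as
# above to thermodynamic units via S = k_B * ln(W):
ds_bricks = -K_B * math.log(math.factorial(20) / 4)  # ≈ -5.7e-22 J/K

print(ds_environment + ds_bricks > 0)  # True: total entropy went up
```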