Shannon discovered that information can be quantified in terms of probability, with entropy as its measure. This is a physical, quantitative definition: it concerns how much information a system can carry, not what that information means.
The less probable a state of a system is, the more information observing it conveys; such improbable, ordered states are also the ones with low entropy. Conversely, the most probable state of a system carries little information.
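This inverse relationship between probability and information can be made concrete with Shannon's formulas: the self-information (surprisal) of an event with probability p is −log₂ p, and entropy is the expected self-information over a distribution. A minimal sketch in Python (function names are my own):

```python
import math

def surprisal(p):
    """Self-information, in bits, of an event with probability p."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A near-certain event carries almost no information...
print(surprisal(0.999))   # ~0.0014 bits
# ...while a very unlikely one carries a great deal.
print(surprisal(1e-6))    # ~19.9 bits

# A fair coin maximizes entropy; a heavily biased one tells us little.
print(entropy([0.5, 0.5]))      # 1.0 bit
print(entropy([0.999, 0.001]))  # ~0.011 bits
```

The logarithm is what makes improbability compound: an outcome built from many independent unlikely events has a probability that shrinks multiplicatively, so its information content grows additively.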
Life embodies an enormous amount of information, which makes it an exponentially improbable event. The Universe, on the other hand, has a vast number of dice and plenty of time to roll them. The expected number of life forms is therefore an indeterminate form of the type zero times infinity.