> It’s easy to wax poetic about entropy, but what is it? I claim it’s the amount of information we don’t know about a situation, which in principle we could learn.
The entropy of a random integer being 1 makes intuitive sense to me, even though I didn't spend years in theoretical math classes.
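
For what it's worth, here's a minimal sketch of that intuition in Python (the helper name `shannon_entropy` is just illustrative, not from the article): a uniformly random bit works out to exactly 1 bit of entropy, and a uniformly random integer over 2^n values works out to n bits.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One uniformly random bit (a fair coin flip) carries exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A uniformly random integer from 0..7 carries log2(8) = 3 bits.
print(shannon_entropy([1/8] * 8))    # 3.0
```

That matches the quoted definition: entropy is how much information you'd still need to learn to pin down the actual value.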