Abstract: The entropy rate of a stationary sequence of random symbols was introduced by Shannon in his foundational work on information theory in 1948. In the late 1950s, Kolmogorov and Sinai realized that they could turn this quantity into an isomorphism invariant for measure-preserving transformations on a probability space. Almost immediately, they used it to distinguish many examples called "Bernoulli shifts" up to isomorphism. This resolved a famous open question of the time, and ushered in a new era for ergodic theory.
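As a brief illustration of how the invariant separates Bernoulli shifts (not part of the talk itself): for a Bernoulli shift built from an i.i.d. sequence with base distribution (p_1, ..., p_n), the Kolmogorov--Sinai entropy equals the Shannon entropy of the base distribution:

```latex
% Kolmogorov--Sinai entropy of the Bernoulli shift B(p_1, ..., p_n)
h\bigl(B(p_1,\dots,p_n)\bigr) \;=\; -\sum_{i=1}^{n} p_i \log p_i .
% For example, h(B(1/2,1/2)) = log 2 while h(B(1/3,1/3,1/3)) = log 3,
% so these two Bernoulli shifts cannot be isomorphic.
```

Since isomorphic transformations must have equal entropy, any two Bernoulli shifts with different base entropies are distinguished at once.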
In the decades since, entropy has become one of the central concerns of ergodic theory, with widespread consequences both for the abstract structure of measure-preserving transformations and for their behaviour in applications. In this talk, I will review some of the highlights of the structural story, and then discuss Bowen's more recent notion of "sofic entropy". This generalizes Kolmogorov--Sinai entropy to measure-preserving actions of many "large" non-amenable groups, including free groups. I will end with a recent result illustrating how the theory of sofic entropy has some striking differences from its older counterpart.
This talk will be aimed at a general mathematical audience. Most of it should be accessible given a basic knowledge of measure theory, probability, and a little abstract algebra.