Abstract:
Given a transition matrix (i.e., a row-stochastic matrix), one can define either a discrete-time Markov chain (whose multi-step transition matrix is the corresponding matrix power) or a continuous-time Markov process (roughly, updates occur at the arrival times of a Poisson point process). A common way to analyze the speed of convergence of a Markov chain is to study the contraction of the relative entropy (i.e., the Kullback-Leibler divergence). Such entropy contractions are characterized by the strong data processing inequality for discrete-time Markov chains and by the modified log-Sobolev inequality for continuous-time Markov processes. In several previous works these two notions of entropy contraction were claimed to be equivalent, in the sense that their contraction rates differ by at most universal constant factors. We disprove this and related conjectures, and we summarize the known comparisons among different notions of entropy contraction. In particular, we show that: (a) entropy contraction of a continuous-time Markov process can be arbitrarily faster than that of its discrete-time counterpart; (b) entropy contraction of an (m+1)-step transition matrix can be arbitrarily faster than that of the m-step version. Joint work with Pietro Caputo, Yuzhou Gu, and Yury Polyanskiy.
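
For readers unfamiliar with the two notions, here is a minimal sketch in standard formulations; the notation ($P$, $\pi$, $\mu$, $\alpha$, $\rho$, $H_t$) is chosen here for illustration and is not taken from the talk. Let $P$ be a transition matrix with stationary distribution $\pi$, and let $D(\mu \,\|\, \pi)$ denote relative entropy. The discrete-time entropy contraction (strong data processing inequality) constant is the largest $\alpha$ such that
\[
D(\mu P \,\|\, \pi) \le (1-\alpha)\, D(\mu \,\|\, \pi) \quad \text{for all probability distributions } \mu,
\]
using that $\pi P = \pi$. For the continuous-time process with semigroup $H_t = e^{t(P-I)}$, the modified log-Sobolev inequality with constant $\rho$ is equivalent to exponential entropy decay,
\[
D(\mu H_t \,\|\, \pi) \le e^{-\rho t}\, D(\mu \,\|\, \pi) \quad \text{for all } \mu \text{ and } t \ge 0.
\]
The disproved conjecture, in this notation, asserted that the best constants $\alpha$ and $\rho$ always agree up to universal constant factors.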