In a recent preprint, Pawlowski et al. propose a natural physical principle called information causality, which both classical and quantum mechanics obey but which is violated by any theory exhibiting CHSH correlations beyond Tsirelson's bound of $2\sqrt{2}$. Roughly speaking, information causality is the principle that if you send $m$ classical bits about a random variable $X$ to someone, their uncertainty about $X$ should not decrease by more than $m$ bits. For more on the background, see this excellent post by his whole-iness, the Quantum Pontiff.
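As a quick numerical aside (my own illustration, not from the preprint), Tsirelson's bound appears as the largest eigenvalue of the CHSH operator when Alice and Bob use the standard optimal measurement settings:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)

# Standard optimal CHSH settings: Alice measures Z or X,
# Bob measures (Z + X)/sqrt(2) or (Z - X)/sqrt(2).
A0, A1 = Z, X
B0 = (Z + X) / np.sqrt(2)
B1 = (Z - X) / np.sqrt(2)

# CHSH operator: A0⊗B0 + A0⊗B1 + A1⊗B0 - A1⊗B1
chsh = (np.kron(A0, B0) + np.kron(A0, B1)
        + np.kron(A1, B0) - np.kron(A1, B1))

# Its largest eigenvalue is the best quantum value, 2*sqrt(2) ≈ 2.828,
# attained on the maximally entangled state.
max_val = np.max(np.linalg.eigvalsh(chsh))
print(max_val)
```

The classical (local hidden variable) bound is 2, so any eigenvalue above 2 already witnesses nonlocality; the point of the preprint is explaining why quantum mechanics stops at $2\sqrt{2}$ rather than the algebraic maximum of 4.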
More precisely, it is defined in the following context: separated parties Alice and Bob play a game in which Alice receives a random bit string $a = (a_1, \ldots, a_N)$ while Bob receives an index $b \in \{1, \ldots, N\}$; Bob's task is to guess the value of $a_b$. Additionally, Alice and Bob share a joint system $AB$ (distributed before the game begins), described by whatever theory you want, and can use this however they like (measure it in some way, etc.). If Alice sends Bob an $m$-bit string $x$ (presumably depending on $a$ and some measurement on her half of the joint system), then information causality is the statement that no matter how Bob generates his guess from $x$ and a measurement on his part of system $AB$, $\sum_{k=1}^N I(a_k : \beta_k) \le m$, where $I(a_k : \beta_k)$ is the mutual information between Bob's guess $\beta_k$ (his output when $b = k$) and the actual bit $a_k$.
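To make the quantity $\sum_k I(a_k : \beta_k)$ concrete, here is a toy instance of the game (my own illustration, not a protocol from the paper): $N = 2$, $m = 1$, no shared system, and Alice simply sends $x = a_1$, which Bob outputs as his guess regardless of $b$. The sum of mutual informations then saturates the bound at exactly $m = 1$:

```python
import itertools
import math

N, m = 2, 1  # two input bits, one message bit

def mutual_information(joint):
    """I(U:V) in bits, from a dict {(u, v): probability}."""
    pu, pv = {}, {}
    for (u, v), p in joint.items():
        pu[u] = pu.get(u, 0.0) + p
        pv[v] = pv.get(v, 0.0) + p
    return sum(p * math.log2(p / (pu[u] * pv[v]))
               for (u, v), p in joint.items() if p > 0)

# Build the joint distribution of (a_k, beta_k) for each k, with a
# uniform over {0,1}^N.  Bob's guess is the message bit x = a_1.
total = 0.0
for k in range(N):
    joint = {}
    for a in itertools.product([0, 1], repeat=N):
        x = a[0]       # Alice's one-bit message
        beta = x       # Bob's guess, independent of his index b = k
        key = (a[k], beta)
        joint[key] = joint.get(key, 0.0) + 1 / 2 ** N
    total += mutual_information(joint)

print(total)  # I(a_1:beta_1) + I(a_2:beta_2) = 1 + 0 = 1 <= m
```

Here $I(a_1 : \beta_1) = 1$ and $I(a_2 : \beta_2) = 0$: Bob learns the first bit perfectly and nothing about the second. Information causality says no physical resource can push the total above $m$, however the one bit of communication is spent.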
What does this mean, in more direct terms? Well, a bound on $\sum_k I(a_k : \beta_k)$ limits the probability of Bob correctly guessing Alice's bit, a fact which I imagine the authors had in mind based on the discussion surrounding equation A5 in the paper. Let me elaborate on the connection here. By using the Fano inequality, we can link the mutual information of two random variables $X$ and $Y$ to the probability of error $P_e$ in guessing $X$ given $Y$: $H(X|Y) \le h(P_e) + P_e \log(d - 1)$, where $d$ is the number of values that $X$ can take and $h$ is the binary entropy.
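A quick sanity check of Fano's inequality (my own, not from the paper): on a binary symmetric channel, where $X$ is a uniform bit, $Y$ is $X$ flipped with probability $p$, and the best guess for $X$ is $Y$ itself (so $P_e = p$), the inequality in fact holds with equality:

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Fano: H(X|Y) <= h(Pe) + Pe*log2(d-1).  For the binary symmetric
# channel with uniform input, H(X|Y) = h(p) and Pe = p.
d = 2  # X is binary, so the log2(d-1) term vanishes
for p in [0.05, 0.1, 0.25, 0.4]:
    H_X_given_Y = h(p)                      # conditional entropy for the BSC
    fano_rhs = h(p) + p * math.log2(d - 1)  # Fano's upper bound
    assert H_X_given_Y <= fano_rhs + 1e-12

print("Fano's bound holds (with equality) for the BSC")
```

For a binary $X$ the $\log(d-1)$ term drops out, which is exactly the form used below: $I(X : Y) = H(X) - H(X|Y) \ge 1 - h(P_e)$ for a uniform bit.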
Applied to the probability of error $P_k$ when Bob guesses Alice's $k$-th bit, averaged over the bits $k$, we end up with $\sum_{k=1}^N \left[1 - h(P_k)\right] \le \sum_{k=1}^N I(a_k : \beta_k)$, where $a_k$ is the random variable corresponding to Alice's $k$-th bit (so $H(a_k) = 1$), and $\beta_k$ is Bob's guess. The average probability of error is then $\bar{P} = \frac{1}{N}\sum_k P_k$. Since $\beta_k$ is generated from $x$ (the message) and $B$ (the extra system that Bob has, be it for the moment classical or quantum), $I(a_k : \beta_k) \le I(a_k : x, B)$ (data processing inequality/Holevo bound). In the course of showing information causality holds for classical and quantum mechanics, they prove that $\sum_k I(a_k : x, B) \le m$, so by the concavity of the binary entropy, $h(\bar{P}) \ge \frac{1}{N}\sum_k h(P_k) \ge 1 - m/N$. Thus, information causality says what we would have guessed intuitively: in order for Bob to be able to predict any of Alice's bits with high probability, she'll have to send him $m \approx N$ bits.
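The bound $h(\bar{P}) \ge 1 - m/N$ can be inverted numerically to see how large the average error must be. A minimal sketch (mine; the bisection and the parameter values are illustrative, since $h$ is increasing on $[0, 1/2]$):

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def min_avg_error(m, N):
    """Smallest average error Pbar in [0, 1/2] consistent with
    h(Pbar) >= 1 - m/N, found by bisection."""
    target = 1 - m / N
    if target <= 0:
        return 0.0  # m >= N: no entropic obstruction to guessing perfectly
    lo, hi = 0.0, 0.5
    for _ in range(100):
        mid = (lo + hi) / 2
        if h(mid) < target:
            lo = mid
        else:
            hi = mid
    return hi

# One message bit about N = 100 input bits: the average error is
# pinned near 1/2, i.e. barely better than random guessing...
p1 = min_avg_error(1, 100)
# ...whereas m = N leaves room for error-free guessing.
p2 = min_avg_error(100, 100)
print(p1, p2)
```

With $m = 1$ and $N = 100$ the bound forces $\bar{P} \gtrsim 0.44$, so a single bit of communication leaves Bob essentially guessing at random, exactly the intuition above.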
I would argue this formulation is more operational than a claim about the quantity $\sum_k I(a_k : \beta_k)$ itself, but of course the two are very closely related. Phrasing things in terms of the number of bits required to complete the task is operational in the sense usually used in (quantum) information theory, e.g. the number of extractable random bits, the number of distillable EPR pairs, the smallest code that allows for reliable message recovery, etc. It also opens up the possibility of demonstrating that a particular theory obeys/disobeys information causality without making entropic arguments. On the other hand, it's not operational in the sense that Alice and Bob can simply make a specified measurement on the joint system and compare with a bound like Bell/CHSH or Tsirelson's: they first have to determine which protocol to use, and this may be difficult.