From a July 18, 2018 story by the Centre for Quantum Technologies at the National University of Singapore:
Watch a movie backwards and you'll likely get confused - but a quantum computer wouldn't.
In research published 18 July in Physical Review X, [an] international team show that a quantum computer is less in thrall to the arrow of time than a classical computer. In some cases, it's as if the quantum computer doesn't need to distinguish between cause and effect at all.
The new work is inspired by an influential discovery made almost ten years ago by complexity scientists James Crutchfield and John Mahoney at the University of California, Davis. They showed that many statistical data sequences will have a built-in arrow of time. An observer who sees the data played from beginning to end, like the frames of a movie, can model what comes next using only a modest amount of memory about what occurred before. An observer who tries to model the system in reverse has a much harder task - potentially needing to track orders of magnitude more information.
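To make the memory comparison concrete, here is a minimal sketch (in Python; it is not taken from the press release or the paper) of one way to estimate how many distinct predictive states a model needs when the same data is read forward versus in reverse. The three-symbol hidden-Markov toy process and the crude state-counting heuristic are illustrative assumptions; for this particular toy, the reversed reading groups histories into more distinct predictive distributions (three) than the forward reading (two), in the spirit of the asymmetry described above.

```python
import random
from collections import defaultdict, Counter

def sample_toy_process(n, seed=1):
    """Sample a hypothetical three-symbol hidden-Markov process (illustrative only).

    State A emits 0 (stay in A) or 2 (start a run, go to B); state B emits 2
    (continue the run) or 1 (end the run, return to A). Runs look like 2...2 1.
    """
    rng = random.Random(seed)
    state, out = "A", []
    for _ in range(n):
        if state == "A":
            if rng.random() < 0.5:
                out.append(0)          # stay in A
            else:
                out.append(2)          # start a run of 2s
                state = "B"
        else:
            if rng.random() < 0.6:
                out.append(2)          # continue the run
            else:
                out.append(1)          # close the run with a 1
                state = "A"
    return out

def count_predictive_states(seq, k=3, decimals=1):
    """Group length-k histories by their estimated next-symbol distribution.

    The number of distinct (rounded) distributions is a crude proxy for how
    many internal states a model of the sequence must distinguish.
    """
    counts = defaultdict(Counter)
    for i in range(len(seq) - k):
        history = tuple(seq[i:i + k])
        counts[history][seq[i + k]] += 1
    distributions = set()
    for c in counts.values():
        total = sum(c.values())
        dist = tuple(sorted((sym, round(m / total, decimals)) for sym, m in c.items()))
        distributions.add(dist)
    return len(distributions)

if __name__ == "__main__":
    seq = sample_toy_process(200_000)
    print("forward predictive states:", count_predictive_states(seq))
    print("reverse predictive states:", count_predictive_states(seq[::-1]))
```

Counting distinct conditional next-symbol distributions is only a rough stand-in for the statistical complexity used in the actual analysis, but it captures the operative idea: the same data can be cheap to model in one temporal direction and more expensive in the other.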
Read the full story by the Centre for Quantum Technologies at the National University of Singapore here: https://www.eurekalert.org/pub_releases/2018-07/cf...
And read the full article in Physical Review X, "Causal Asymmetry in a Quantum World", here:
https://journals.aps.org/prx/abstract/10.1103/Phys...
"Causal Asymmetry in a Quantum World"
Abstract: Causal asymmetry is one of the great surprises in predictive modeling: The memory required to predict the future differs from the memory required to retrodict the past. There is a privileged temporal direction for modeling a stochastic process where memory costs are minimal. Models operating in the other direction incur an unavoidable memory overhead. Here, we show that this overhead can vanish when quantum models are allowed. Quantum models forced to run in the less-natural temporal direction not only surpass their optimal classical counterparts but also any classical model running in the more natural temporal direction. This holds even when the memory overhead is unbounded, resulting in quantum models with unbounded memory advantage.
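For readers who want the quantities behind "memory cost", the following definitions are the standard ones from the computational-mechanics literature this work builds on; they are background, not quoted from the paper itself.

```latex
% S^+ / S^- denote the forward / reverse causal states of a stationary process
% (pasts grouped by their conditional distribution over futures, and vice versa).
% The classical memory cost in each direction is the Shannon entropy of those states:
C_\mu^{+} = H[\mathcal{S}^{+}], \qquad C_\mu^{-} = H[\mathcal{S}^{-}]
% Causal asymmetry means that generically C_\mu^{+} \neq C_\mu^{-}, so modeling in
% the disfavored direction carries an overhead of |C_\mu^{+} - C_\mu^{-}|; the paper
% shows that the analogous overhead for quantum models can vanish.
```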
Published: July 23, 2018, 2:27 pm