Scheme: University Research Fellowship
Organisation: University College London
Dates: Jan 2013-Dec 2017
Summary: All our information processing technology rests on an underlying mathematical theory -- information theory -- developed by Shannon in the 1940s. He based the theory on "classical" physics: physics pre-dating quantum mechanics. This is often a good approximation, and its success is evident. However, as Shannon himself was aware, a complete theory of information must take quantum effects into account. Quantum information theory includes the classical theory as a special case, but also predicts fundamentally new ways of processing information: quantum computers, quantum cryptography, and even entirely new types of resource such as entanglement. Landauer famously said: "information is physical", and quantum information epitomises this.
Quantum mechanics allows information to be processed in ways not possible classically, but there is a flip side to this: it is extremely difficult to simulate quantum systems on a (classical) computer, an obstacle at the heart of many of the challenges in modern theoretical physics. Materials are made up of atoms obeying quantum laws, and we still do not fully understand properties for which quantum effects are significant, such as superconductivity.
Turning Landauer's maxim on its head, we can use information theory to study physics. Much of theoretical physics concerns the behaviour of complex quantum many-body systems. Quantum information theory studies how to deliberately engineer complex many-body quantum systems to produce useful behaviour (such as quantum computing). In my research, I use quantum information theory to tackle some of the most challenging fundamental open questions in quantum many-body and condensed matter physics.
This remarkable marriage of physics, mathematics and computer science is, for me, one of the most fascinating ideas in current science. Beyond the potential technological applications, it probes the deepest aspects of both quantum physics and information.