Many aspects of the world of our everyday experience, which we call the classical world, are so familiar that the way they work seems self-evident. It seems almost impossible to conceive of how things could function any differently. This is a trap that the pioneers of computer science fell into. Modern computer science is founded on bits - entities which take on the values 0 or 1. You can look at them, make copies of them, and act based on their values in order to develop interesting algorithms.
However, the apparently self-evident basis of computer science predicts that certain operations are impossible. For instance, consider the NOT gate: it reads the value of an incoming bit and outputs the opposite value. One can ask "is there such a thing as the square root of NOT?", by which we mean a gate which takes an input and produces an output which, when fed into a second identical gate, yields the same output as a single NOT would have given. Such a gate does not exist, and cannot exist, in a modern computer. However, by exploring extremes of the physical world, where the classical laws of physics must be modified to explain the results of experiments, one can find a device that implements the square root of NOT operation. This extreme is the regime of Quantum Mechanics.
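To make this concrete, here is a minimal numerical sketch using the standard matrix representation of single-qubit gates (the particular matrix shown is one of several possible square roots of NOT):

```python
import numpy as np

# The NOT gate as a matrix acting on the basis states |0> and |1>.
NOT = np.array([[0, 1],
                [1, 0]])

# One possible square root of NOT: a genuinely quantum gate with
# complex entries, which has no classical counterpart.
SQRT_NOT = 0.5 * np.array([[1 + 1j, 1 - 1j],
                           [1 - 1j, 1 + 1j]])

# Applying it twice reproduces NOT exactly.
print(np.allclose(SQRT_NOT @ SQRT_NOT, NOT))  # True
```

Applied once to the state |0>, this gate produces an equal superposition of |0> and |1>, which is why no device built from classical bits can implement it.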
Suddenly, with a new computational gate in hand, one must rewrite the theory of computation, and this gives us Quantum Computers. Rather than bits, these operate on qubits, which take on not only the values 0 and 1 but also superpositions of the two. Quantum Mechanics is very strange, and behaves very differently from the world of our intuition. For instance, it is impossible to perfectly copy qubits. The problem is that quantum mechanical effects are very fragile: it is extremely easy to get things just a little bit wrong and return to the classical regime. This is a major challenge for experimentalists who are trying to build a quantum computer.
There is now a well-established theory of sufficient requirements for implementing a quantum computation. In particular, if one can operate on single quantum bits (qubits), and on arbitrary pairs of qubits, then arbitrary quantum computations can be realised. Unfortunately, most experimental designs do not allow direct interaction between arbitrary pairs of qubits, only between those that are close to each other.
From a computer science perspective, there is no problem with replacing one long-range interaction with a series of local SWAP operations that move the qubits together, interact them, and move them back to their original positions ("Nearest Neighbour"). This only adds an overhead to any experiment, and does not fundamentally change the efficiency of a computation. Nevertheless, it represents a significant problem for the experimentalist trying to build such a device - the more operations that are required, the more they have to interact with the system, and the more likely it is that an error will be introduced. In the quantum world, errors are far more problematic than in the classical world - one cannot measure a quantum system for fear of destroying its state, and one cannot copy the data. We are thus highly motivated to reduce the influx of errors, drawing inspiration from the existing experimental capabilities.
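As an illustration of this decomposition (a sketch using explicit matrices; the choice of CNOT and the qubit labels are just for the example), a two-qubit gate between the distant ends of a three-qubit chain can be built from nearest-neighbour operations alone: swap qubits 0 and 1 together, apply the gate to the now-adjacent pair, then swap back.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
P0 = np.diag([1.0, 0.0])  # projector onto |0>
P1 = np.diag([0.0, 1.0])  # projector onto |1>
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

def kron(*ops):
    """Tensor product of several operators, left to right."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Nearest-neighbour SWAP between qubits 0 and 1 of the 3-qubit chain.
SWAP01 = kron(SWAP, I2)

# CNOT on the adjacent pair (control qubit 1, target qubit 2).
CNOT12 = kron(I2, P0, I2) + kron(I2, P1, X)

# The desired long-range gate: CNOT with control 0 and target 2.
CNOT02 = kron(P0, I2, I2) + kron(P1, I2, X)

# Conjugating the local gate by the local SWAP realises the long-range gate.
print(np.allclose(SWAP01 @ CNOT12 @ SWAP01, CNOT02))  # True
```

The cost is visible in the count: one long-range gate became three physical operations, and on a chain of length N the overhead grows with the distance between the qubits.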
Many of these different ideas can be represented by the task of quantum state transfer. The idea is to start with a network of spins (we will depict a 1D chain for simplicity). At one particular site (the end of the chain), there is a qubit prepared in an unknown quantum state, and we need to get it to the other end of the chain. There are a variety of strategies that one might try:
- (No Restrictions) Interfacing the computer with a different architecture which is good at long-range transport. For instance, if one can move a quantum state from an atom onto a photon, direct the photon towards the target site, and then move the state from the photon onto a qubit, this would suffice. However, this requires interfacing two very different technologies, and is a significant experimental challenge.
- (Nearest Neighbour) This is a sufficient method for a theorist, who doesn't have to worry that every single SWAP operation between a pair of qubits risks introducing more error.
- (Global Control) In some experiments, the nearest neighbour strategy isn't even possible (or is technologically extremely demanding). For instance, in optical lattices, where atoms are trapped by laser light, each atom is separated by about half the wavelength of the light, and it is impossible to focus another laser sufficiently tightly to manipulate individual interactions. However, it is possible to use lasers to address, for instance, every second site of a lattice. It turns out that this level of control is sufficient to allow quantum computation. State transfer, as our simple example, is realised by alternating two different globally applied pair-wise SWAPs.
- (Designer Hamiltonian) When one fabricates a system of qubits, it is possible to build in some control over how they interact. However, they also have an intrinsic (time-invariant) interaction known as the Hamiltonian. Can one design the system such that its Hamiltonian implements specific tasks? This would have the massive potential advantage that we can put all of our experimental effort into refining (and pre-testing) one design which we do not have to subsequently interact with, thereby reducing errors. Perfect quantum state transfer along a chain is theoretically rather simple, but also acts as the foundation for a variety of other studies, such as the design of Hamiltonians to implement arbitrary quantum computations. One of the major challenges facing this field of study is how to generalise the results from a chain to an arbitrary network configuration.
- (Single Spin) A slight relaxation of the designer Hamiltonian situation is to allow control over just a single site, or a single coupling, and observe the resulting trade-offs between the range of applicable systems (which is massively enhanced), the degree of tuning required (such systems are much more tolerant of manufacturing imperfections), and the level of error reintroduced by the control [State transfer in networks, Quantum computation].
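The designer-Hamiltonian strategy can be sketched numerically. A well-known engineered chain, with coupling strength proportional to sqrt(n(N-n)) between sites n and n+1, transfers a single excitation from one end to the other perfectly. Restricting to the single-excitation subspace, the Hamiltonian becomes an N x N tridiagonal matrix, and we can check the transfer fidelity directly (the chain length and normalisation below are arbitrary choices for the example):

```python
import numpy as np
from scipy.linalg import expm

N = 7  # chain length, chosen arbitrarily for the example

# Single-excitation Hamiltonian of the engineered chain:
# the coupling between sites n and n+1 is sqrt(n*(N-n))/2.
J = np.array([np.sqrt(n * (N - n)) / 2 for n in range(1, N)])
H = np.diag(J, 1) + np.diag(J, -1)

# Evolve for time t = pi under the fixed Hamiltonian; an excitation
# placed on site 1 arrives on site N with unit probability
# (up to a known phase).
U = expm(-1j * np.pi * H)
fidelity = abs(U[-1, 0]) ** 2
print(round(fidelity, 6))  # 1.0
```

Note that no control is applied during the evolution itself: all the experimental effort goes into fabricating the coupling pattern in advance, which is exactly the error-reduction advantage described above.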