Our work is in quantum information, specifically in models of quantum computation and quantum fault-tolerance.
Models of quantum computation: We study the one-way quantum computer, a scheme of quantum computation consisting of local measurements on an entangled universal resource state [Phys. Rev. Lett. 86, 5188 (2001)]. Unitary evolution and measurement are the two fundamental ways of evolving a quantum system in time, and they are very different: unitary evolution is deterministic and reversible, whereas measurement is probabilistic and irreversible. Universal quantum computation can be built on either. In the former case, one is led to the (standard) circuit model, in the latter to measurement-based quantum computation. The one-way quantum computer is one such measurement-based scheme. By shifting the focus from unitary evolution to measurement, it encourages us to reconsider what the fundamentals of quantum computation are. We thus ask: "What are the elementary building blocks of the one-way quantum computer? What is their composition principle?" The final goal of this line of research is to obtain clues for how to construct novel quantum algorithms.
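The elementary step of such a measurement-based scheme fits in a few lines of linear algebra. The sketch below is our own illustration with conventions chosen for readability, not a construction taken from the cited papers: an input qubit is entangled with a fresh |+> qubit by a controlled-Z gate and then measured in a basis rotated by an angle t; up to a Pauli correction determined by the random outcome, the surviving qubit carries a rotated image of the input.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # Pauli X

def Rz(t):
    """z-rotation, in the convention Rz(t) = diag(1, e^{it})."""
    return np.diag([1.0, np.exp(1j * t)])

def measure_step(psi, t, outcome):
    """One wire segment of the one-way quantum computer:
    entangle the input qubit |psi> with a fresh |+> qubit via
    controlled-Z, then measure the input qubit in the basis
    {(|0> + (-1)^s e^{-it} |1>)/sqrt(2)} with outcome s.
    Returns the normalized post-measurement state of qubit 2,
    which equals X^s H Rz(t) |psi> up to a global phase."""
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    state = np.diag([1, 1, 1, -1]) @ np.kron(psi, plus)   # CZ on the pair
    # Bra of the measured basis vector, applied to qubit 1
    bra = np.array([1.0, (-1) ** outcome * np.exp(1j * t)]) / np.sqrt(2)
    out = bra[0] * state[0:2] + bra[1] * state[2:4]
    return out / np.linalg.norm(out)
```

Chaining such steps at several angles, with the outcome-dependent Pauli corrections accounted for, generates arbitrary one-qubit rotations; two-dimensional cluster states supply the entangling gates needed for universality.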
Another computational model we study is the quantum cellular automaton (QCA). These devices are constrained by translation invariance in space (and time) and usually have only short-range interactions. Despite these stringent constraints, computationally universal QCA can be constructed. By implementing quantum cellular automata, quantum systems which have the advantage of long decoherence times but the drawback of limited local control (i.e., a small number of experimental knobs to turn) can be used for quantum information processing.
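As a toy illustration of these constraints (our own sketch, not a construction from the QCA literature), a one-dimensional partitioned QCA on a ring of qubits can be written as two alternating layers of one and the same two-qubit gate. Translation invariance then shows up concretely: the global update commutes with a shift by one partition cell, i.e. by two sites.

```python
import numpy as np

def apply2(state, U, i, j, n):
    """Apply the two-qubit gate U to qubits i, j of an n-qubit state vector."""
    psi = np.moveaxis(state.reshape([2] * n), (i, j), (0, 1))
    shp = psi.shape
    psi = (U @ psi.reshape(4, -1)).reshape(shp)
    return np.moveaxis(psi, (0, 1), (i, j)).ravel()

def qca_step(state, U, n):
    """One partitioned-QCA step on a ring of n qubits (n even): the same
    gate U on pairs (0,1),(2,3),..., then on pairs (1,2),(3,4),...,(n-1,0)."""
    for i in range(0, n, 2):
        state = apply2(state, U, i, i + 1, n)
    for i in range(1, n, 2):
        state = apply2(state, U, i, (i + 1) % n, n)
    return state

def shift(state, n, k=1):
    """Translate the ring by k sites: qubit m -> qubit m+k (mod n)."""
    return state.reshape([2] * n).transpose(np.roll(np.arange(n), k)).ravel()
```

The update rule is specified by a single two-qubit unitary, mirroring the small number of "knobs" available: one global step touches every site in the same way, with no site-resolved control required.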
Coding theory and fault-tolerant quantum computation: Error correction is what a large-scale quantum computer, once built, will spend most of its computation time on. It is therefore important to devise error-correction methods which allow for a high error threshold at moderate operational overhead. One focus of our work on fault-tolerance is on systems with geometrical constraints, e.g. low-dimensional lattice systems, and on topological methods.
We have presented a fault-tolerant one-way quantum computer [arXiv:quant-ph/0510135], and have described a method for fault-tolerant quantum computation in a two-dimensional lattice of qubits requiring only local, translation-invariant nearest-neighbor interactions [arXiv:quant-ph/0610082], [arXiv:quant-ph/0703143] (joint work with Jim Harrington (formerly Los Alamos National Laboratory, now HRL Laboratories) and Kovid Goyal (Caltech)). The obtained error threshold is 0.75 percent per quantum gate, the highest known threshold for a two-dimensional architecture with nearest-neighbor interaction. A large error threshold is important for the realization of fault-tolerant quantum computation because it relaxes the accuracy requirements on the experiment. The constraint of nearest-neighbor interaction in a two-dimensional qubit array is suggested by experimental reality: many physical systems envisioned for the realization of a quantum computer, for example optical lattices, arrays of superconducting qubits, and quantum dots, are confined to two dimensions and favor short-range interactions.
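The role of an error threshold can be illustrated with a classical toy model, far simpler than the topological schemes above and included here only for intuition: for a d-bit repetition code under independent bit flips with probability p, majority voting fails when more than half the bits flip. Below threshold (p < 1/2), increasing d suppresses the logical error rate; above threshold, adding redundancy makes matters worse.

```python
from math import comb

def logical_error_rate(d, p):
    """Probability that majority voting fails for a d-bit repetition code
    (d odd), each bit flipping independently with probability p: the
    chance that more than half of the d bits flip."""
    return sum(comb(d, k) * p ** k * (1 - p) ** (d - k)
               for k in range((d + 1) // 2, d + 1))
```

For example, at p = 0.1 the logical error rate falls from 2.8 percent (d = 3) to below 1 percent (d = 5), while at p = 0.6 it grows with d. Fault-tolerant quantum schemes exhibit the same qualitative behavior, which is why a large threshold, such as the 0.75 percent quoted above, matters: it widens the regime in which adding qubits actually helps.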
Quantum computation and foundations of quantum mechanics: David Deutsch, one of the founders of the field of quantum information, writes "[It is not] obvious a priori that any of the familiar recursive functions is in physical reality computable. The reason why we find it possible to construct, say, electronic calculators, and indeed why we can perform mental arithmetic, cannot be found in mathematics or logic. The reason is that the laws of physics 'happen to' permit the existence of physical models for arithmetic such as addition, subtraction and multiplication. If they did not, these familiar operations would be non-computable functions. We might still know of them and invoke them in mathematical proofs (which would presumably be called 'nonconstructive') but we could not perform them." [D. Deutsch, Proc. Roy. Soc. A 400, 97 (1985).].
Since quantum computation is based on the laws of quantum mechanics, we ask: "What is the key feature of quantum mechanics that causes the quantum speed-up?" There is no shortage of candidates, for example: superposition and interference, entanglement, the largeness of Hilbert space, and the contextuality of quantum mechanics. They are, presumably, all part of the picture. However, a precise connection remains to be drawn.