Lawrence Ip > Research Summary
Some of the things I've worked on, organized by area.
My work in quantum computation has focused on the search for new quantum algorithms. Given the enormous effort required to implement a quantum computer and the likelihood that the achievable 'operations' per second will be much less than that of a classical computer, the only computational problems for which a quantum computer will be of practical benefit are those where it has a huge advantage over any classical computer.
Despite recent attempts to create a new paradigm for designing quantum algorithms using quantum random walks and the adiabatic algorithm, almost all quantum algorithms that give an exponential speedup over the best known classical algorithms have used the Fourier transform to search for symmetries.
To devise new quantum algorithms, I have taken two approaches:
Quantum Algorithms for some Hidden Shift Problems, ACM-SIAM Symposium on Discrete Algorithms (SODA) 2003.
Abstract, ps (537kB), ps.gz (256kB) or pdf (199kB). Also quant-ph/0211140.
with Wim van Dam and Sean Hallgren.
Quantum algorithm for the shifted Legendre symbol problem and its generalizations.
Solving Shift Problems and the Hidden Coset Problem Using the Fourier Transform, 2002.
Abstract, ps (151kB), ps.gz (57kB), pdf (170kB) or djvu (81kB). Also quant-ph/0205034.
Earlier version that treats the problem from a matrix factorization viewpoint (as opposed to the deconvolution viewpoint).
The greatest success in quantum algorithms to date has been Shor's algorithm for the order finding problem (and thus for factoring). The natural generalization of the order finding problem is the hidden subgroup problem. The "standard" algorithm uses the Fourier transform to efficiently solve the hidden subgroup problem for abelian groups but fails when the group is nonabelian.
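To illustrate how Fourier sampling handles the abelian case, here is a classical simulation of the standard algorithm for a hidden subgroup of Z_N (a toy sketch with made-up parameters, not a quantum implementation; the group, subgroup, and sample counts are chosen purely for illustration):

```python
# Classical simulation of the "standard" Fourier-sampling algorithm for the
# abelian hidden subgroup problem over Z_N. The hidden subgroup is H = dZ_N,
# i.e. the oracle function is d-periodic.
import cmath
import random

def fourier_sample(N, d):
    """Simulate one run of the standard algorithm.

    Measuring f collapses the state to a uniform superposition over a
    coset s + H; applying the QFT over Z_N and measuring yields an
    outcome j that is always a multiple of N/d, revealing the period."""
    assert N % d == 0
    s = random.randrange(d)                    # random coset representative
    coset = [s + k * d for k in range(N // d)]
    amp = 1 / (N // d) ** 0.5                  # uniform amplitude over the coset
    # QFT: amplitude at frequency j is (1/sqrt(N)) * sum_x amp * e^{2 pi i j x / N}
    probs = []
    for j in range(N):
        a = sum(amp * cmath.exp(2j * cmath.pi * j * x / N) for x in coset)
        probs.append(abs(a / N ** 0.5) ** 2)
    # sample a measurement outcome j according to probs
    r, acc = random.random(), 0.0
    for j, p in enumerate(probs):
        acc += p
        if r < acc:
            return j
    return N - 1

random.seed(0)
outcomes = {fourier_sample(12, 3) for _ in range(50)}
# every outcome is a multiple of N/d = 4; from a few samples one recovers d
print(sorted(outcomes))
```

Each run lands on a multiple of N/d with probability 1/d each, so a handful of samples determines the hidden subgroup; this is exactly the step that breaks down for nonabelian groups.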
Finding an efficient algorithm for the nonabelian hidden subgroup problem is the outstanding open problem in quantum algorithms. The motivation comes from the fact that two important problems, graph isomorphism and the unique shortest lattice vector problem, can be reduced to instances of the nonabelian hidden subgroup problem. The security of lattice cryptography relies on the difficulty of the shortest lattice vector problem, so an efficient quantum algorithm for the nonabelian hidden subgroup problem would imply the ability to break those cryptographic schemes.
One of the reasons for the lack of progress in finding quantum algorithms for the nonabelian hidden subgroup problem is that we don't have many tools aside from the Fourier transform. In some sense, measuring in the Fourier basis is all we know. In recent work I considered the question of how to come up with candidate measurement bases. One way is to look for an "optimal" basis. If the measure of optimality is maximizing the probability of correctly identifying the hidden subgroup, the optimal measurement basis is the solution to a semidefinite program.
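For a prior ensemble in which hidden subgroup H_i occurs with probability p_i and the algorithm's state preparation yields the state rho_i, the optimization referred to above can be written as the standard state-discrimination semidefinite program (the notation here is generic, not taken from the paper):

```latex
\max_{\{M_i\}} \;\sum_i p_i \,\mathrm{Tr}(M_i \rho_i)
\quad \text{subject to} \quad
M_i \succeq 0 \;\;\forall i,
\qquad
\sum_i M_i = I ,
```

where the positive semidefinite operators M_i form a POVM and the objective is the probability of correctly identifying the hidden subgroup.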
Shor's Algorithm is Optimal, 2003. (In preparation)
Abstract, ps (350kB), ps.gz (161kB) or pdf (196kB).
We show that the standard algorithm for the abelian hidden subgroup problem implements the optimal measurements. For some nonabelian groups we find the optimal measurements by solving a semidefinite program and show that the standard algorithm is no longer optimal.
Recently coding theory has been applied to the transmission of large files over the Internet to avoid having to send acknowledgements back to the sender. TCP relies on acknowledgements and subsequent retransmission to guarantee reliability. The need for feedback is a bottleneck when the latency is high, or in a multicast scenario when all of the leaves of the multicast tree are sending feedback to the root.
Erasure codes or forward error correction eliminate the need for feedback by allowing reconstruction of the original file as long as enough packets have been received. LT codes were the first class of rateless erasure codes to be discovered, the rateless property allowing the codes to adapt to the loss rate of the channel. In work with Chris Harrelson and Wei Wang, I showed analytically that LT codes work even when we use much less randomness than in the original construction.
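The encoding and peeling decoding behind LT codes can be sketched as follows (a minimal toy over small integer blocks, using the ideal soliton degree distribution and full randomness; the block values and symbol counts are made up for illustration):

```python
# Toy LT (fountain) code: encode k blocks into rateless XOR symbols,
# then recover the blocks with a peeling decoder.
import random

def ideal_soliton(k):
    """Ideal soliton distribution: P(1) = 1/k, P(d) = 1/(d(d-1)) for 2 <= d <= k."""
    return [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

def sample_degree(probs):
    r, acc = random.random(), 0.0
    for d, p in enumerate(probs):
        acc += p
        if r < acc:
            return d
    return len(probs) - 1

def lt_encode(blocks, n_symbols):
    """Each encoded symbol is the XOR of a random degree-d subset of blocks."""
    k = len(blocks)
    probs = ideal_soliton(k)
    symbols = []
    for _ in range(n_symbols):
        d = sample_degree(probs)
        idxs = random.sample(range(k), d)
        val = 0
        for i in idxs:
            val ^= blocks[i]
        symbols.append((idxs, val))
    return symbols

def lt_decode(symbols, k):
    """Peeling decoder: subtract known blocks, then resolve degree-1 symbols."""
    syms = [[set(idxs), val] for idxs, val in symbols]
    known = {}
    progress = True
    while progress and len(known) < k:
        progress = False
        for sym in syms:
            idxs = sym[0]
            for i in [i for i in idxs if i in known]:
                idxs.discard(i)
                sym[1] ^= known[i]        # subtract an already-recovered block
            if len(idxs) == 1:
                i = idxs.pop()
                if i not in known:
                    known[i] = sym[1]
                    progress = True
    return [known.get(i) for i in range(k)]

random.seed(1)
data = [7, 42, 13, 99, 5, 21, 64, 8]      # k = 8 toy input blocks
encoded = lt_encode(data, 60)             # generous overhead for this tiny k
decoded = lt_decode(encoded, len(data))
print(sum(v is not None for v in decoded), "of", len(data), "blocks recovered")
```

With enough encoded symbols the peeling process typically recovers every block; the rateless property corresponds to being able to keep generating fresh symbols until the decoder succeeds, with no feedback to the sender beyond a final acknowledgement.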
Limited Randomness LT Codes, 41st Annual Allerton Conference on Communication, Control, and Computing, 2003.
Abstract, ps (336kB), ps.gz (156kB) or pdf (224kB).
with Chris Harrelson and Wei Wang.
The original LT codes construction assumes complete randomness. Implementations use much less randomness. We show analytically that even with this reduced randomness, LT codes are still asymptotically optimal.
The capacity region of the interference channel has been an outstanding open problem in information theory for over 30 years. I obtained some asymptotic results about the maximum sum-rate and the extreme points of the Han-Kobayashi achievable region (the best known achievable region) in the high SNR regime. I found some evidence suggesting that the achievable region has a simpler structure for the Z-channel, in which one of the receivers is not subject to interference from the other sender.
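For reference, the high SNR regime mentioned above is the limit in which the single-user Gaussian capacity is dominated by its logarithmic term (this is the standard point-to-point benchmark, not a formula from the unpublished note):

```latex
C = \tfrac{1}{2}\log_2\!\left(1+\mathrm{SNR}\right)
  = \tfrac{1}{2}\log_2 \mathrm{SNR} + o(1)
\quad \text{as } \mathrm{SNR}\to\infty .
```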
The Gaussian Interference Channel, 2000. (Unpublished)
Asymptotics (high SNR) of the Han-Kobayashi achievable region for the Gaussian interference channel.
How do we assess the quality of software for producing sequence alignments? One way is to have a test set of "gold standard" alignments that have been produced by hand. In work with Steve Chien, I examined the well-known BAliBASE standard alignments and showed that their correctness is questionable (versions 1 and 2 give different alignments for the same set of sequences), and that scoring schemes used to determine the optimality of alignments are flawed (sum-of-pairs and minimum entropy often gave better scores to the computer-generated alignments than to the BAliBASE reference alignments).
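The sum-of-pairs score mentioned above can be sketched as follows (a toy scoring scheme with made-up match/mismatch/gap values; real tools score residue pairs with substitution matrices such as BLOSUM rather than flat constants):

```python
# Sum-of-pairs score of a multiple alignment: score every column by
# summing a pairwise score over all pairs of rows in that column.
from itertools import combinations

def sum_of_pairs(alignment, match=1, mismatch=-1, gap=-2):
    total = 0
    for col in zip(*alignment):               # iterate over alignment columns
        for a, b in combinations(col, 2):
            if a == '-' and b == '-':
                continue                      # gap-gap pairs score 0
            elif a == '-' or b == '-':
                total += gap
            elif a == b:
                total += match
            else:
                total += mismatch
    return total

aln = ["ACG-T",
       "AC-GT",
       "ACGGT"]
print(sum_of_pairs(aln))  # 3
```

A higher score for a computer-generated alignment than for the hand-curated reference, under the reference's own scoring scheme, is exactly the kind of inconsistency discussed above.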
Gold Standard Alignments - 24 Carat? 2002.
Slides in pdf (128kB).
with Steve Chien.
Project report from computational biology class. Looks at the quality of the BAliBASE standard alignments.
Catalan numbers arise naturally in random matrix theory when calculating the moments of the distribution of eigenvalues of random matrices, the best-known example being the original proof of Wigner's semicircle law. I give a bijective proof of a combinatorial identity that appears in a similar calculation of moments.
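As a small numerical illustration of the objects involved (not the bijective proof itself): the closed form C_n = binom(2n, n)/(n + 1) agrees with a direct count of Dyck paths, the kind of paths counted in such bijections.

```python
# Catalan numbers two ways: the closed form, and a dynamic-programming
# count of Dyck paths (2n steps of +1/-1 that stay >= 0 and end at 0).
# These are also the even moments of the semicircle law of radius 2.
from math import comb

def catalan(n):
    return comb(2 * n, n) // (n + 1)

def count_dyck(n):
    dp = [0] * (n + 1)        # dp[h] = number of partial paths at height h
    dp[0] = 1
    for _ in range(2 * n):
        new = [0] * (n + 1)
        for h in range(n + 1):
            if dp[h]:
                if h + 1 <= n:
                    new[h + 1] += dp[h]   # up step
                if h - 1 >= 0:
                    new[h - 1] += dp[h]   # down step
        dp = new
    return dp[0]

print([catalan(n) for n in range(6)])                  # [1, 1, 2, 5, 14, 42]
print(all(catalan(n) == count_dyck(n) for n in range(8)))  # True
```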
Catalan Numbers and Random Matrices, 1999. (Unpublished)
ps (129kB) or ps.gz (33kB).
Elegant bijective proof of a combinatorial identity by counting certain Dyck paths in two different ways.
Last modified March 2, 2004