ISIT 2010

Can Structure Beat Shannon? — The Secrets of Lattice-codes

Ram Zamir, Tel Aviv University

Friday, June 18, 08:30 - 09:30


Communication systems use structured (linear, lattice, trellis) codes for their low complexity. Luckily, such codes also have the potential to approach the limits promised by information theory, i.e., structure "comes for free". Can structured codes exceed the information-theoretic limits? For memoryless point-to-point systems this is clearly impossible: Shannon's classic coding theorems provide simple "single-letter" expressions - in terms of entropy or mutual information - and corresponding random code constructions that attain the best source and channel coding performance. In multi-terminal settings, however, random codes are not always optimal. Körner and Marton showed in 1979 that linear coding beats the best known random coding scheme for the binary two-help-one source coding problem.
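To see why linearity helps in distributed settings, consider the closely related Körner–Marton problem of reproducing the modulo-two sum z = x XOR y of two correlated binary sources at a decoder. The key trick is that syndromes of a linear code commute with XOR: if both encoders send Hx and Hy, the decoder obtains H(x XOR y) and can syndrome-decode z. The sketch below is an illustrative toy (not from the talk), using the (7,4) Hamming code and assuming z has Hamming weight at most one:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column i is the 3-bit
# binary representation of i+1, so the syndrome of a weight-1 vector
# directly identifies the position of its single 1.
H = np.array([[(j >> k) & 1 for j in range(1, 8)] for k in range(2, -1, -1)])

def syndrome(v):
    return H.dot(v) % 2

def decode_z(s):
    # Syndrome decoding under the assumption weight(z) <= 1:
    # zero syndrome -> z = 0; otherwise the syndrome, read as a
    # binary number, is the (1-based) position of the single 1.
    z = np.zeros(7, dtype=int)
    idx = int("".join(map(str, s)), 2)
    if idx:
        z[idx - 1] = 1
    return z

# Two correlated sources: y differs from x in at most one position.
x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy()
y[4] ^= 1                      # so z = x XOR y has weight 1

# Each encoder sends only its 3-bit syndrome instead of all 7 bits.
sx, sy = syndrome(x), syndrome(y)

# Linearity: sx XOR sy = H(x XOR y), so the decoder recovers z
# without ever learning x or y individually.
z_hat = decode_z((sx + sy) % 2)
assert (z_hat == (x ^ y)).all()
```

The same total rate with independent random binning at each encoder is strictly worse here, which is exactly the gap the talk's title alludes to.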

In this talk we consider the benefits of structure from the viewpoint of lattice codes. We show the simplicity of lattice-based solutions in Gaussian side-information settings, as well as in joint source-channel coding. More interestingly, we demonstrate their performance gains over random codes in more involved multi-terminal settings.
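A basic ingredient of the lattice-based schemes mentioned above is the modulo-lattice operation, which folds a signal into the fundamental cell of a coarse lattice; side-information schemes exploit the fact that mod-Λ "distributes" over addition. A minimal one-dimensional sketch (an illustrative choice, not the talk's construction), with Λ taken as the scaled integer lattice qZ:

```python
import numpy as np

q = 4.0  # coarse lattice Lambda = q*Z (a hypothetical 1-D example)

def mod_lattice(x, q=q):
    # Fold x into the fundamental cell [-q/2, q/2) of the lattice q*Z
    # by subtracting the nearest lattice point.
    return x - q * np.round(x / q)

# The identity that mod-Lambda schemes rely on:
#   [x + y] mod Lambda == [ [x mod Lambda] + y ] mod Lambda,
# i.e. reducing early costs nothing, so an encoder can pre-subtract
# known interference (or side information) inside the modulo.
x, y = 7.3, -2.1
lhs = mod_lattice(x + y)
rhs = mod_lattice(mod_lattice(x) + y)
assert np.isclose(lhs, rhs)
```

In higher dimensions the same identity holds for any lattice, with rounding replaced by a nearest-lattice-point quantizer; good high-dimensional lattices make the folded noise behave almost like Gaussian noise, which is what lets these schemes approach capacity.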

The talk is based on past and present work with Meir Feder, Gregory Poltyrev, Shlomo Shamai, Uri Erez, Simon Litsyn, Dave Forney, Yuval Kochman and Tal Philosof.
