Distortion-delay tradeoff for a Gaussian source transmitted over a fading channel
Conference Paper
abstract
We study the end-to-end distortion-delay tradeoff for a Gaussian source transmitted over a fading channel. The analog source is quantized and stored in a buffer until it is transmitted. There are two extreme cases as far as buffer delay is concerned: no delay and infinite delay. We observe that introducing a buffer delay yields a significant power gain. Our goal is to investigate the situation between these two extremes. Using the recently proposed effective capacity concept, we derive a closed-form expression for this tradeoff. In order to characterize the convergence behavior, we derive an asymptotically tight upper bound for our tradeoff curve, which approaches the infinite-delay lower bound polynomially. Numerical results demonstrate that introducing a small amount of delay can save significant transmission power. © 2006 IEEE.
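As a rough illustration of the effective capacity concept the abstract invokes (due to Wu and Negi), the sketch below estimates the effective capacity of a Rayleigh block-fading channel by Monte Carlo. This is not the paper's derivation; the SNR, QoS exponent values, and unit-mean Rayleigh power-gain model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def effective_capacity(theta, snr_db, n_blocks=200_000):
    """Monte Carlo estimate of effective capacity, in bits per block:
        E_C(theta) = -(1/theta) * ln E[ exp(-theta * R) ],
    where R = log2(1 + SNR * g) is the instantaneous rate and g is the
    unit-mean exponential power gain of a Rayleigh fading block.
    theta is the QoS (delay) exponent: larger theta = stricter delay."""
    snr = 10.0 ** (snr_db / 10.0)
    gain = rng.exponential(1.0, n_blocks)   # Rayleigh power gain
    rate = np.log2(1.0 + snr * gain)        # per-block rate
    return -np.log(np.mean(np.exp(-theta * rate))) / theta

# Loose vs. tight delay constraints at an assumed 10 dB SNR:
loose = effective_capacity(theta=1e-3, snr_db=10.0)
tight = effective_capacity(theta=5.0, snr_db=10.0)
# As theta grows (delay constraint tightens), the supportable rate shrinks,
# which is the mechanism behind the distortion-delay tradeoff studied here.
```

As theta → 0 the estimate approaches the ergodic capacity (infinite-delay regime), and as theta → ∞ it falls toward the zero-outage rate, matching the two extremes the abstract contrasts.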
name of conference
2006 IEEE International Symposium on Information Theory