Information-theoretic lower bounds for compressive sensing with generative models: the goal of standard compressive sensing is to estimate an unknown vector from noisy linear measurements. Consider a population consisting of n individuals, each of whom has one of d possible types. In this paper we derive information theoretic performance bounds on the sensing and reconstruction of sparse phenomena from noisy projections. Compressed sensing is an emerging field based on the revelation that a small set of linear projections of a sparse signal contains enough information for reconstruction; the techniques involved fall into one of several broad categories. In this section, we put our work in the context of existing work on Poisson compressed sensing with theoretical performance bounds. In the remaining part of this chapter we derive a few information theoretic bounds pertaining to the problem at hand. Bounds for Optimal Compressed Sensing Matrices and Practical Reconstruction Schemes (Shriram Sarvotham): compressed sensing (CS) is an emerging signal acquisition framework.
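For reference, the standard noisy compressed sensing setup underlying the bounds collected here can be summarized as follows; this is a schematic restatement of standard facts, not a quotation from any of the works above.

```latex
% Standard noisy compressed sensing model (schematic)
\begin{align}
  y &= A x + w, \qquad A \in \mathbb{R}^{m \times n},\; m \ll n, \\
  \|x\|_0 &\le k \quad \text{($x$ is $k$-sparse, or well approximated by $k$ terms, in some basis)}, \\
  m &= O\!\left(k \log \tfrac{n}{k}\right) \quad \text{measurements suffice for stable recovery.}
\end{align}
```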
We also propose and prove several interesting statistical properties of the square root of the Jensen-Shannon divergence, a well-known information-theoretic metric, and exploit other known ones. Furthermore, x can be reconstructed using linear programming. CS is considered a new signal acquisition paradigm with which sampling can be faster than conventional Nyquist-rate acquisition. We emphasize that although the derivation assumes the measurement matrix to be Gaussian, it can be extended to any sub-Gaussian case at the cost of a small constant factor. The obtained bounds establish the relation between the complexity of the autoregressive process and the attainable estimation accuracy through the use of a novel measure of complexity. Information theoretic lower bounds for compressive sensing with generative models. Here the authors propose a quantity, named sensing capacity, to incorporate the effects of distortion. Information theoretic bounds for compressed sensing in SAR imaging. Here I_m denotes an identity matrix of size m.
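To make the linear-programming route concrete, here is a minimal basis pursuit sketch. It is my own illustration rather than code from any of the papers above; it assumes noiseless measurements and a Gaussian measurement matrix, and solves min ||x||_1 subject to Ax = y by splitting x into nonnegative parts and calling an off-the-shelf LP solver.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 s.t. Ax = y as an LP, writing x = u - v with u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)              # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([A, -A])       # equality constraint A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:n] - res.x[n:]

# Tiny demo: a k-sparse signal recovered from m = O(k log(n/k)) Gaussian projections
rng = np.random.default_rng(0)
n, k = 128, 5
m = int(4 * k * np.log(n / k))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true
x_hat = basis_pursuit(A, y)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The u - v splitting is the textbook reduction of an l1 objective to a linear program; noisy measurements would instead call for basis pursuit denoising or the lasso.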
We consider two types of distortion for reconstruction. An interesting question arises in this context. Since arguments for establishing information theoretic lower bounds are not algorithm specific, they hold for any recovery procedure. For example, reference 4 studied the minimum number of noisy measurements required to recover a sparse signal by using Shannon information theory bounds. Compressive sensing provides a new approach to data acquisition and storage. Indeed, the recovery can be posed as an information-theoretically constrained quadratic program. Index terms: basis pursuit, compressed sensing, compressive sampling, information theoretic bounds, lasso, orthogonal matching pursuit, prior information, sparsity pattern recovery. Tight measurement bounds for exact recovery of structured sparse signals. Information-theoretic bounds of resampling forensics.
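Because such lower-bound arguments are algorithm independent, they typically reduce support recovery to a multiple hypothesis test over the candidate supports and apply Fano's inequality. The following is a schematic version of that standard argument, paraphrased rather than quoted from any particular paper.

```latex
% Fano-style necessary condition for exact support recovery (schematic)
\begin{align}
  P_{\mathrm{err}} \;\ge\; 1 - \frac{I(X;Y) + \log 2}{\log \binom{n}{k}},
  \qquad\text{so}\qquad
  P_{\mathrm{err}} \to 0 \;\Longrightarrow\; I(X;Y) \;\gtrsim\; \log \binom{n}{k} \;\approx\; k \log\frac{n}{k}.
\end{align}
```

If each of the m measurements can carry at most C bits of information about the support, then I(X;Y) <= mC and the condition becomes m >~ k log(n/k) / C, which is the shape of the scaling laws discussed throughout this section.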
Information theoretic bounds for compressed sensing (abstract). The course aimed at introducing the topic of compressed sensing (CS). Towards an algorithmic theory of compressed sensing, Rutgers University. Abstract: compressed sensing (CS) deals with the reconstruction of sparse signals from a small number of linear measurements. Index terms: compressed sensing, relaxation, Fano's method, high-dimensional statistical inference, information theoretic bounds, lasso, model selection, signal denoising, sparsity pattern, sparsity recovery, subset selection, support recovery. Written by leading experts in a clear, tutorial style, and using consistent notation and definitions throughout, it shows how information theoretic methods are being used in data acquisition and data analysis. Information-theoretic limits on sparse signal recovery. It has recently been shown that for compressive sensing, significantly fewer measurements may be required if the sparsity assumption is replaced by the assumption that the unknown vector lies near the range of a suitably chosen generative model. Learn about the state of the art at the interface between information theory and data science with this first unified treatment of the subject. These problems concern continuous natural phenomena. The goal of compressed sensing is to learn a structured signal x from a limited number of noisy linear measurements y. We develop information theoretic performance bounds on target recognition based on statistical models for sensors and data, and examine conditions under which these bounds are tight.
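For the generative-model setting mentioned above, where the unknown vector is assumed to lie near the range of a generator G, a common recovery heuristic is to search the latent space for a code z minimizing the measurement misfit. The sketch below is my own illustration in the spirit of that approach, not the specific algorithm analyzed in the lower-bound papers; the random two-layer network standing in for a trained generator, the step size, and the finite-difference gradient are all assumptions made to keep the example self-contained.

```python
import numpy as np

def recover_with_generator(A, y, G, latent_dim, steps=400, lr=0.02, seed=0):
    """Minimize ||A G(z) - y||^2 over the latent code z with (numerical) gradient descent.

    Instead of assuming sparsity, the estimate is constrained to the range of G.
    The objective is nonconvex, so in practice several random restarts are used.
    """
    rng = np.random.default_rng(seed)
    z = 0.1 * rng.standard_normal(latent_dim)
    eps = 1e-4

    def loss(z):
        r = A @ G(z) - y
        return float(r @ r)

    for _ in range(steps):
        # forward-difference gradient; cheap because the latent dimension is small
        base = loss(z)
        g = np.array([(loss(z + eps * e) - base) / eps for e in np.eye(latent_dim)])
        z = z - lr * g
    return G(z), z

# Toy "generator": a fixed random two-layer network (a stand-in for a trained model)
rng = np.random.default_rng(1)
n, latent_dim, m = 64, 4, 20
W1 = rng.standard_normal((32, latent_dim)) / np.sqrt(latent_dim)
W2 = rng.standard_normal((n, 32)) / np.sqrt(32)
G = lambda z: W2 @ np.tanh(W1 @ z)

x_true = G(rng.standard_normal(latent_dim))
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true
x_hat, _ = recover_with_generator(A, y, G, latent_dim)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```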
Compressed sensing (CS) is a new framework for integrated sensing and compression. Furthermore, we show an information theoretic lower bound for tomography of rank-r states using adaptive sequences of single-copy Pauli measurements. Information-theoretic bounds on target recognition performance. Zhang Jingxiong, Yang Ke, Guo Jianzhong, School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China.
The focus of our technique is on replacing the generalized Kullback-Leibler divergence with an information theoretic metric, namely the square root of the Jensen-Shannon divergence, which is related to an approximate, symmetrized version of the Poisson log-likelihood function. Algorithms and bounds for sensing capacity and compressed sensing with applications to learning graphical models, S. Aeron, M. Zhao, V. Saligrama, 2008 Information Theory and Applications Workshop, 303-309, 2008. Introduction: sparse vectors are widely used tools in signal processing. Numerical experiments are performed showing the practical use of the technique in signal and image reconstruction from compressed measurements under Poisson noise. On the other hand, fundamental information theoretic bounds that are algorithm independent have been presented in [2], [1]. This is based on the principle that, through optimization, the sparsity of a signal can be exploited to recover it from far fewer samples than required by the Nyquist-Shannon sampling theorem. Sparse signal recovery with multiple prior information. Compressed sensing (also known as compressive sensing, compressive sampling, or sparse sampling) is a signal processing technique for efficiently acquiring and reconstructing a signal by finding solutions to underdetermined linear systems.
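To make the metric concrete, here is a small sketch of the Jensen-Shannon divergence between two nonnegative vectors and its square root. It is my own simplified illustration of the quantity described above (the generalized, unnormalized form of the KL divergence is an assumption on my part); the square root of the JS divergence between probability distributions is the quantity known to be a true metric.

```python
import numpy as np

def generalized_kl(p, q):
    """Generalized KL divergence sum(p*log(p/q) - p + q) for nonnegative vectors."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) - p.sum() + q.sum())

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetrized KL to the midpoint vector."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * generalized_kl(p, m) + 0.5 * generalized_kl(q, m)

def sqrt_js(p, q):
    """Square root of the JS divergence, used as a distance in the recovery objective."""
    return np.sqrt(max(js_divergence(p, q), 0.0))

# Example: distance between observed Poisson counts y and model intensities A @ x
rng = np.random.default_rng(0)
lam = rng.uniform(1, 10, size=50)       # stand-in for the intensities A @ x
y = rng.poisson(lam).astype(float)      # Poisson-corrupted measurements
print("sqrt-JS distance:", sqrt_js(y, lam))
```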
The problem of sparse estimation via linear measurements, commonly referred to as compressive sensing, is particularly well understood, with theoretical developments including sharp performance bounds for both practical algorithms [4], [7], [8], [6] and potentially intractable, information theoretically optimal algorithms [9], [10], [11], [12]. In this paper we derive information theoretic performance bounds on the sensing and reconstruction of sparse phenomena from noisy random projections of data. Information-theoretic methods in data science (edited volume). One of the main challenges in CS is to find the support of a sparse signal from a set of noisy observations. Information theoretic lower bounds for compressive sensing with generative models (abstract). A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation. Detection and recognition problems are modeled as composite hypothesis testing problems involving nuisance parameters. Spectral compressed sensing via structured matrix completion covers 1-D line spectral estimation as a special case and indicates how to address multidimensional models. The standard approach to taking pictures is to first take a high-resolution picture in the standard basis. On the one hand are rigorous bounds based on information theoretic arguments or the analysis of specific algorithms.
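Support recovery is often probed empirically with greedy methods such as orthogonal matching pursuit, whose success or failure on random instances can then be compared against the information theoretic scaling laws. The sketch below is a generic OMP implementation of my own, not code from any of the works above.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k columns of A to explain y."""
    m, n = A.shape
    residual, support = y.copy(), []
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit least squares on the selected columns and update the residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat, sorted(support)

# Quick check of exact support recovery on a random instance
rng = np.random.default_rng(2)
n, k, m = 256, 8, 80
idx = rng.choice(n, k, replace=False)
x = np.zeros(n)
x[idx] = np.sign(rng.standard_normal(k)) * rng.uniform(1, 2, size=k)  # entries bounded away from 0
A = rng.standard_normal((m, n)) / np.sqrt(m)
_, supp = omp(A, A @ x, k)
print("support recovered:", supp == sorted(idx.tolist()))
```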
Reference 5 investigated the information contained in noisy measurements by viewing the measurement system as an information theoretic channel. Finally, we characterize the privacy properties of the compression procedure in information-theoretic terms, establishing upper bounds on the rate of information communicated between the compressed and original data. One buzzword you can look up and read more about is the "single-pixel camera". Information-theoretic lower bounds for compressive sensing. Information theoretic bounds to performance of compressed sensing. An optimal scaling of the number of observations required for exact support recovery is established. Index terms: relaxation, compressed sensing, Fano's method, high-dimensional statistical inference, information theoretic bounds, sparse approximation, sparse random matrices. Sparsity Pattern Recovery in Compressed Sensing, Galen Reeves, Ph.D. dissertation, Electrical Engineering and Computer Sciences, University of California, Berkeley. Detection and information theoretic measures for quantifying the distinguishability between multimedia operator chains. An information-theoretic approach to distributed compressed sensing. A novel technique using polar codes, 2010 IEEE 18th Signal Processing and Communications Applications Conference (SIU), 2010; keywords: compressed sensing, coding and information theory, polar codes, signal processing.
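The channel view leads to a useful back-of-envelope necessary condition: each noisy measurement conveys at most about (1/2)*log2(1 + SNR) bits, while distinguishing all n-choose-k candidate supports requires about log2(C(n, k)) bits. The calculator below is my own illustration of this counting argument, not a result quoted from the cited reference.

```python
import math

def min_measurements_counting_bound(n, k, snr):
    """Rough necessary number of measurements from a capacity/counting argument:
    m * 0.5*log2(1 + SNR) >= log2(C(n, k))."""
    bits_needed = math.log2(math.comb(n, k))         # bits needed to index the support
    bits_per_measurement = 0.5 * math.log2(1 + snr)  # AWGN capacity per channel use
    return math.ceil(bits_needed / bits_per_measurement)

for snr in (1, 10, 100):
    print(f"n=1000, k=10, SNR={snr}: m >= {min_measurements_counting_bound(1000, 10, snr)}")
```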
Spectral compressed sensing via structured matrix completion. The problem has received significant interest in the compressed sensing and sensor network (SNET) literature. On the other hand are exact but heuristic predictions made using the replica method from statistical physics. Dror Baron, Information theoretic results in compressed sensing. IEEE Transactions on Information Forensics and Security, 11(4), 2016, 774-788. Using an information theoretic metric for compressive recovery under Poisson noise, Sukanya Patil, Karthik S. Gurumoorthy, Ajit Rajwade. Another goal of this paper is to develop information theoretic bounds for this emerging field.
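For the Poisson setting considered in that work, the measurement model replaces additive Gaussian noise with count data. The snippet below is my own minimal simulation of such measurements; the uniform nonnegative sensing matrix is a stand-in for the physically realizable matrices that work assumes, and the negative Poisson log-likelihood shown is the plain data-fit term that the square-root JS metric above approximates in symmetrized form.

```python
import numpy as np

# Poisson compressive measurements: y_i ~ Poisson((A x)_i), with A and x nonnegative
rng = np.random.default_rng(3)
n, m, k = 100, 40, 5
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.uniform(5, 20, size=k)  # nonnegative sparse signal
A = rng.uniform(0, 1, size=(m, n)) / n                           # nonnegative sensing matrix
y = rng.poisson(A @ x)                                           # observed photon counts

def neg_poisson_loglik(x_est, A, y):
    """Negative Poisson log-likelihood (up to constants depending only on y)."""
    lam = A @ x_est
    return float(np.sum(lam - y * np.log(lam + 1e-12)))

print("loss at truth vs. at zero:",
      neg_poisson_loglik(x, A, y), neg_poisson_loglik(np.zeros(n), A, y))
```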
The theory of compressed sensing, where one is interested in recovering a high-dimensional signal from a small number of measurements, has grown into a rich field of investigation and found many applications [24]. Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting. Similarly, in [10], the authors consider the matrix completion problem and again use information theoretic techniques to obtain bounds. Compressed Regression, Neural Information Processing Systems. In the CS literature, several information theoretic bounds on the scaling law of the required number of measurements for exact support recovery have been derived.
The improved performance of these methods over their standard counterparts is demonstrated using simulations. We propose a reconstruction algorithm with multiple side information signals. Introduction to compressed sensing with a coding theoretic perspective: this book is a course note developed for a graduate level course in spring 2011 at GIST, Korea. For comparison, we will use the results by Hegde and others [2] in a linear regression setup. On the other hand, an information theoretic analysis can reveal where there currently exists a gap between the performance of computationally tractable methods and the fundamental limits. Information theoretic performance bounds for noisy projections. Affiliations: Department of Electrical Engineering, IIT Bombay; International Center for Theoretical Sciences (ICTS-TIFR), Bangalore; Department of Computer Science and Engineering, IIT Bombay. The principal observation here is that most natural phenomena of interest are compressible, i.e., representable with far fewer degrees of freedom than the ambient dimension.
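One common way of folding prior or side information into recovery is to down-weight the l1 penalty on coordinates the side information marks as likely active. The sketch below is a generic prior-information-weighted basis pursuit of my own, reusing the LP machinery from the earlier sketch; it is not the specific multiple-side-information algorithm referenced above, and the weight value 0.1 is an arbitrary choice for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def weighted_basis_pursuit(A, y, weights):
    """min sum_i w_i |x_i|  s.t.  Ax = y, via the same u - v splitting as plain basis pursuit."""
    m, n = A.shape
    c = np.concatenate([weights, weights])   # per-coordinate weights applied to u and v
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:n] - res.x[n:]

# Side information: a partly correct guess of the support gets a smaller weight,
# so those coordinates are "cheaper" for the solver to use.
rng = np.random.default_rng(4)
n, k, m = 200, 10, 80
true_supp = rng.choice(n, k, replace=False)
x = np.zeros(n)
x[true_supp] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x
guess = list(true_supp[:7]) + list(rng.choice(n, 3, replace=False))  # imperfect prior support
w = np.ones(n)
w[guess] = 0.1
print("weighted-l1 recovery error:", np.linalg.norm(weighted_basis_pursuit(A, y, w) - x))
```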
Information theoretic limits for linear prediction with graph. The fundamental revelation is that, if an n-sample signal x is sparse and has a good k-term approximation in some basis, then it can be reconstructed using m = O(k log(n/k)) linear projections of x onto another basis. Nowadays, after only six years, an abundance of theoretical aspects of compressed sensing has been explored in a large number of articles. The smaller matrix SA ∈ R^(m×d) is a compressed version of the original data A ∈ R^(n×d); we start with an overview of different constructions of sketching matrices. In this paper, we derive some information theoretic bounds on the performance of noisy compressive sensing. Information theoretic bounds for compressed sensing, IEEE Transactions on Information Theory, 56(10). Lower bounds for compressed sensing with generative models. Compressed sensing (CS) is a new framework for sampling and reconstructing sparse signals. Saligrama, Information theoretic bounds to sensing capacity of sensor networks under fixed SNR, presented at the Information Theory Workshop, Sep.
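To illustrate the sketching-matrix constructions alluded to above, here is a small sketch of my own showing two standard options, a dense Gaussian sketch and a CountSketch-style sparse sketch, applied to a tall data matrix; the sizes and the Gram-matrix comparison are arbitrary choices for the demo.

```python
import numpy as np

def gaussian_sketch(A, m, seed=0):
    """Dense Gaussian sketch: S has i.i.d. N(0, 1/m) entries, so SA is m x d."""
    rng = np.random.default_rng(seed)
    n, _ = A.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    return S @ A

def count_sketch(A, m, seed=0):
    """CountSketch-style sparse sketch: each row of A is hashed into one of m buckets with a random sign."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    rows = rng.integers(0, m, size=n)           # hash bucket for each input row
    signs = rng.choice([-1.0, 1.0], size=n)     # random sign flips
    SA = np.zeros((m, d))
    np.add.at(SA, rows, signs[:, None] * A)     # scatter-add the signed rows into buckets
    return SA

# Sketch a tall data matrix and compare how well the Gram matrix A^T A is preserved
rng = np.random.default_rng(5)
A = rng.standard_normal((5000, 20))
for name, SA in [("gaussian", gaussian_sketch(A, 200)), ("countsketch", count_sketch(A, 200))]:
    rel = np.linalg.norm(SA.T @ SA - A.T @ A) / np.linalg.norm(A.T @ A)
    print(name, "relative Gram-matrix error:", round(float(rel), 3))
```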