Large information plus noise random matrix models and consistent subspace estimation in large sensor networks (Q2884853)
From MaRDI portal
scientific article; zbMATH DE number 6036650
Statements
18 May 2012
Keywords: information plus noise model; localization of the eigenvalues; subspace estimation; MUSIC; random matrix models; large sensor networks; quadratic form; covariance matrix; consistency; singular values; Gaussian information plus noise matrix; regularization; Poincaré inequality
In array processing, a common problem is to estimate the angles of arrival of \(K\) deterministic sources impinging on an array of \(M\) antennas from \(N\) observations of the source signal corrupted by Gaussian noise. In the so-called subspace methods, the problem reduces to estimating a quadratic form (called the ``localization function'') of a certain projection matrix related to the empirical covariance matrix of the source signal. The estimates of the angles of arrival are then obtained by taking the \(K\) deepest local minima of the estimated localization function.

Recently, a new subspace estimation method was proposed for the context where the number of available samples \(N\) is of the same order of magnitude as the number of sensors \(M\). In this regime, the traditional subspace methods tend to fail because they rely on the empirical covariance matrix of the observations, which is a poor estimate of the source signal covariance matrix.

The new subspace method is based on an estimator of the localization function that is consistent in the regime where \(M\) and \(N\) tend to \(+\infty\) at the same rate. However, the consistency of the resulting angle estimator had not been addressed, and the purpose of this paper is to prove this consistency in that asymptotic regime. To this end, it is proved that the event that the singular values of an \(M \times N\) Gaussian information plus noise matrix escape from certain intervals has probability decreasing at rate \(\mathcal{O}(N^{-p})\) for all \(p\). A regularization trick is also introduced, which confines these singular values to certain intervals and makes it possible to use standard tools, such as the Poincaré inequality, to control all moments of the estimator. These results are believed to be of independent interest.
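The classical subspace approach described in the first paragraph can be sketched as follows. This is an illustrative implementation of the standard MUSIC localization function for a half-wavelength uniform linear array, not the improved consistent estimator studied in the paper; the array model, angle grid, and noise level are assumptions made for the example.

```python
import numpy as np

def music_localization(Y, K, grid):
    """Classical MUSIC localization function eta(theta) = a(theta)^H Pi a(theta),
    where Pi projects onto the noise subspace of the empirical covariance matrix.
    Angle estimates are the K deepest local minima of eta over the grid."""
    M, N = Y.shape
    R = Y @ Y.conj().T / N                       # empirical covariance matrix
    w, V = np.linalg.eigh(R)                     # eigenvalues in ascending order
    noise_vecs = V[:, : M - K]                   # noise subspace: M-K smallest
    Pi = noise_vecs @ noise_vecs.conj().T        # projector onto noise subspace
    m = np.arange(M)
    eta = np.empty(len(grid))
    for i, th in enumerate(grid):
        a = np.exp(1j * np.pi * m * np.sin(th)) / np.sqrt(M)  # ULA steering vector
        eta[i] = np.real(a.conj() @ Pi @ a)
    return eta

# Simulate K = 2 sources on an M-antenna array, observed over N snapshots.
rng = np.random.default_rng(0)
M, N, K = 20, 200, 2
th_true = np.array([-0.3, 0.4])                  # angles of arrival (radians)
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(th_true))) / np.sqrt(M)
S = rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))
W = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
Y = A @ S + W                                    # information plus noise model

grid = np.linspace(-np.pi / 2, np.pi / 2, 721)
eta = music_localization(Y, K, grid)
# Take the K deepest local minima of the estimated localization function.
mins = [i for i in range(1, len(grid) - 1) if eta[i] < eta[i - 1] and eta[i] < eta[i + 1]]
est = np.sort(grid[sorted(mins, key=lambda i: eta[i])[:K]])
print(np.round(est, 3))                          # estimates near th_true
```

With \(N\) comfortably larger than \(M\), as here, the empirical covariance matrix is a good estimate and classical MUSIC resolves the angles; the paper's regime of interest is precisely the one where \(M\) and \(N\) are comparable and this plug-in estimator degrades.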