Wednesday, March 25, 2015

Improving M-SBL for Joint Sparse Recovery using a Subspace Penalty

This is very interesting. Among sparsity-seeking solvers, M-SBL is, along with AMP, one of the most interesting to follow, in part because of the good results Zhilin Zhang obtained in the past with block-sparse and non-sparse signals. The authors of the following paper change the regularization term of that algorithm and seem to obtain even better phase transitions for sparse signals. Without further ado:

Improving M-SBL for Joint Sparse Recovery using a Subspace Penalty by Jong Chul Ye, Jong Min Kim, Yoram Bresler
The multiple measurement vector problem (MMV) is a generalization of the compressed sensing problem that addresses the recovery of a set of jointly sparse signal vectors. One of the important contributions of this paper is to reveal that the seemingly least related state-of-the-art MMV joint sparse recovery algorithms - M-SBL (multiple sparse Bayesian learning) and subspace-based hybrid greedy algorithms - have a very important link. More specifically, we show that replacing the $\log\det(\cdot)$ term in M-SBL by a rank proxy that exploits the spark reduction property discovered in subspace-based joint sparse recovery algorithms, provides significant improvements. In particular, if we use the Schatten-$p$ quasi-norm as the corresponding rank proxy, the global minimiser of the proposed algorithm becomes identical to the true solution as $p \rightarrow 0$. Furthermore, under the same regularity conditions, we show that the convergence to a local minimiser is guaranteed using an alternating minimization algorithm that has closed form expressions for each of the minimization steps, which are convex. Numerical simulations under a variety of scenarios in terms of SNR, and condition number of the signal amplitude matrix demonstrate that the proposed algorithm consistently outperforms M-SBL and other state-of-the-art algorithms.
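For readers wondering why the $p \rightarrow 0$ limit buys exactness, recall the standard rank-proxy identity (a general fact, not specific to this paper): for a positive semidefinite matrix $M$ with eigenvalues $\lambda_i \ge 0$,

$$\lim_{p \to 0^+} \sum_i \lambda_i^p = \#\{i : \lambda_i > 0\} = \mathrm{rank}(M),$$

so the Schatten-$p$ penalty interpolates between the nuclear norm ($p = 1$) and the rank itself. By contrast, the $\log\det(\cdot)$ term of M-SBL corresponds to the related limit $\sum_i (\lambda_i^p - 1)/p \rightarrow \sum_i \log \lambda_i$, which is why the two penalties are close cousins yet behave differently near rank deficiency.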

I wonder how that change of regularization proxy would affect Zhilin's codes.
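To get a rough sense of what such a swap involves, here is a minimal sketch. This is not the paper's algorithm: the function names, the toy dimensions, and the use of a plain eigenvalue-based Schatten-$p$ sum as the rank proxy are all my assumptions. It merely compares the standard M-SBL $\log\det$ term with a Schatten-$p$ surrogate on the same covariance model $\Sigma_y = \lambda I + A \,\mathrm{diag}(\gamma)\, A^T$:

```python
import numpy as np

def logdet_penalty(Sigma, eps=1e-12):
    """log det(Sigma) via eigenvalues -- the term used in the standard M-SBL cost."""
    eigvals = np.linalg.eigvalsh(Sigma)
    return float(np.sum(np.log(np.maximum(eigvals, eps))))

def schatten_p_penalty(Sigma, p, eps=1e-12):
    """Sum of eigenvalues**p for PSD Sigma; tends to rank(Sigma) as p -> 0+."""
    eigvals = np.linalg.eigvalsh(Sigma)
    return float(np.sum(np.maximum(eigvals, eps) ** p))

# Toy covariance Sigma_y = lambda*I + A diag(gamma) A^T, as in type-II ML / M-SBL.
rng = np.random.default_rng(0)
m, n, k = 20, 50, 5                       # measurements, dictionary size, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
gamma = np.zeros(n)
gamma[rng.choice(n, size=k, replace=False)] = 1.0
Sigma_y = 1e-2 * np.eye(m) + A @ np.diag(gamma) @ A.T

print("log det    :", logdet_penalty(Sigma_y))
for p in (1.0, 0.5, 0.1):
    print(f"Schatten-{p}:", schatten_p_penalty(Sigma_y, p))
```

In an M-SBL-style solver the $\gamma$ update would then minimize the data-fit term plus the chosen penalty; the point of the sketch is only that the penalty is a drop-in replacement at the cost-function level, while the closed-form minimization steps claimed in the abstract are specific to the paper's derivation.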



