Thursday, November 29, 2012

Fast Marginalized Block SBL Algorithm - implementation -

Zhilin just mentioned it on his blog: Fast Marginalized Block SBL Algorithm by Benyuan Liu, Zhilin Zhang, Hongqi Fan, Zaiqi Lu, and Qiang Fu. The abstract reads:
The performance of sparse signal recovery can be improved if both sparsity and correlation structure of signals can be exploited. One typical correlation structure is intra-block correlation in block sparse signals. To exploit this structure, a framework, called block sparse Bayesian learning (BSBL) framework, has been proposed recently. Algorithms derived from this framework showed promising performance but their speed is not very fast, which limits their applications. This work derives an efficient algorithm from this framework, using a marginalized likelihood maximization method. Thus it can exploit block sparsity and intra-block correlation of signals. Compared to existing BSBL algorithms, it has close recovery performance to them, but has much faster speed. Therefore, it is more suitable for recovering large scale datasets.

The attendant code to recreate the examples featured in this paper is here.
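To give a concrete feel for the setting these algorithms address, here is a minimal Python sketch of a block-sparse recovery setup in the spirit of the paper's experiments. The signal sizes, block length, AR(1) correlation coefficient, and noise level below are illustrative assumptions, not values taken from the paper, and the closing comment only indicates where a BSBL solver would be invoked.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper)
N, M = 512, 128          # signal length, number of measurements
blk_len = 16             # block size
n_blocks = N // blk_len  # number of blocks
k_active = 4             # number of nonzero blocks

# AR(1) covariance modelling intra-block correlation (r = 0.9 is an assumption)
r = 0.9
idx = np.arange(blk_len)
B = r ** np.abs(idx[:, None] - idx[None, :])
L = np.linalg.cholesky(B)

# Block-sparse ground truth: a few randomly chosen blocks are active, and the
# samples inside each active block are correlated according to B
x = np.zeros(N)
for b in rng.choice(n_blocks, size=k_active, replace=False):
    x[b * blk_len:(b + 1) * blk_len] = L @ rng.standard_normal(blk_len)

# Gaussian sensing matrix and noisy measurements y = Phi x + noise
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x + 0.01 * rng.standard_normal(M)

# A BSBL solver (e.g. BSBL-FM or BSBL-BO from the authors' code) would now be
# given (Phi, y) together with the block partition below, exploiting both the
# block sparsity and the intra-block correlation to recover x.
blk_start = np.arange(0, N, blk_len)

The block partition (here, blk_start) is the only structural side information such solvers need; within the BSBL framework the intra-block correlation itself is learned from the data rather than assumed known.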



2 comments:

liubenyuan said...

The X-Label should be 'N'. (-.-!)

BTW: Python implementations of the BSBL code family, BSBL_BO and BSBL_FM, are now available at
https://bitbucket.org/liubenyuan/pybsbl

Best wishes!

Igor said...

Thank you. Zhilin is aware of that typo. Thanks for the Python code!

Cheers

Igor
