Thursday, February 12, 2009

CS: Sparse Approximation and Atomic Decomposition, SPARS'09 program, Counter Braids, trending in the stock market, L_0 reconstruction

Bob Sturm, a reader of this blog, just released his recently defended Ph.D. thesis entitled Sparse Approximation and Atomic Decomposition: Considering Atom Interactions in Evaluating and Building Signal Representations. The attendant slides of that defense can be found here. The abstract of the thesis reads:
This dissertation makes contributions to the sparse approximation and efficient representation of complex signals, e.g., acoustic signals, using greedy iterative descent pursuits and overcomplete dictionaries. As others have noted before, peculiar problems arise when a signal model is mismatched to the signal content, and a pursuit makes bad selections from the dictionary. These result in models that contain several atoms having no physical significance to the signal, and instead exist to correct the representation through destructive interference. These "spurious" terms greatly diminish the efficiency of the generated signal model, and hinder the useful application of sparse approximation to signal analysis (e.g., source identification), visualization (e.g., source selection), and modification (e.g., source extraction). While past works have addressed these problems by reformulating a pursuit to avoid them, such as adding restrictions to the content of a dictionary, in this dissertation we use these corrective terms to learn about the signal, the pursuit algorithm, the dictionary, and the created model. Our thesis is essentially that a better signal model results when a pursuit builds it considering the interaction between the atoms. We formally study these effects and propose novel measures of them to quantify the interaction between atoms in a model, and to illuminate the role of each atom in representing a signal. We then propose and study three different ways of incorporating these new measures into the atom selection criteria of greedy iterative descent pursuits, and show analytically and empirically that these interference-adaptive pursuits can produce models with increased efficiency and meaningfulness, e.g., the direct correspondence between an atom and specific signal content. Finally, we propose creating a higher-level model of the decomposed signal by agglomerating the atoms of a representation into molecules based on a set of similarity criteria, and compare this method with a previous pursuit that builds molecules simultaneously with the decomposition process. In both cases, we find that the resulting molecules have a more clear relationship with the signal content.
One can note that he is going to have to continue his postdoc in Paris. Life ain't fair :-)
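For readers unfamiliar with the terminology, the pursuits discussed in the thesis are greedy schemes like matching pursuit, which repeatedly pick the dictionary atom most correlated with the current residual; a bad pick then has to be corrected later by atoms that interfere destructively. Here is a minimal Python sketch of plain matching pursuit, purely for illustration: the dictionary is random, and none of Sturm's interference-adaptive selection criteria are implemented.

```python
import numpy as np

def matching_pursuit(x, D, n_atoms):
    """Plain matching pursuit: greedily pick the atom most correlated
    with the current residual. D has unit-norm columns (atoms).
    Sturm's interference-adaptive selection criteria are NOT used here."""
    residual = x.copy()
    model = np.zeros_like(x)
    atoms = []                      # (atom index, coefficient) pairs
    for _ in range(n_atoms):
        correlations = D.T @ residual
        k = np.argmax(np.abs(correlations))
        c = correlations[k]
        model += c * D[:, k]        # add the selected atom to the model
        residual -= c * D[:, k]     # descend on the residual energy
        atoms.append((k, c))
    return model, atoms, residual

# toy example with random unit-norm atoms standing in for a real dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)
x = 2.0 * D[:, 10] + 0.5 * D[:, 200]
model, atoms, res = matching_pursuit(x, D, 5)
print(atoms[:2], np.linalg.norm(res))
```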

The SPARS'09 conference has just released the list of talks that will be featured there; it is here. It looks like a very interesting and worthwhile program.

Laurent Gosse, a trader it seems, writes about trends in the CAC-40, the French stock index, using what he calls Compressed Sensing. I am not quite sure, but it looks like detrending via l1 regularization.
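If I had to guess, the nearest standard tool is l1 trend filtering (Kim, Koh, Boyd and Gorinevsky), which fits a piecewise-linear trend by penalizing the l1 norm of the second differences of the trend. Here is a small sketch of that idea with cvxpy on a synthetic series standing in for the CAC-40; it is my guess at the flavor of the computation, not Gosse's actual method.

```python
import numpy as np
import cvxpy as cp

# synthetic stand-in for a CAC-40 closing-price series
rng = np.random.default_rng(1)
n = 300
prices = np.cumsum(rng.standard_normal(n)) + 3000.0

# second-difference operator: rows of (1, -2, 1)
D = np.zeros((n - 2, n))
for i in range(n - 2):
    D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0

# l1 trend filtering: least-squares fit plus an l1 penalty on second
# differences, so the estimated trend is piecewise linear with few kinks.
lam = 50.0   # made-up regularization weight
trend = cp.Variable(n)
objective = cp.Minimize(0.5 * cp.sum_squares(prices - trend)
                        + lam * cp.norm1(D @ trend))
cp.Problem(objective).solve()

detrended = prices - trend.value
print("residual std:", detrended.std())
```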

There is also a new version of the presentation entitled Toward 0-norm Reconstruction, and a Nullspace Technique for Compressive Sampling by Christine Law and Gary Glover.

Finally, Yi Lu and Balaji Prabhakar continue their investigation of Counter Braids with Robust Counting via Counter Braids: An Error-Resilient Network Measurement Architecture. The abstract reads:
A novel counter architecture, called Counter Braids, has recently been proposed for accurate per-flow measurement on high-speed links. Inspired by sparse random graph codes, Counter Braids solves two central problems of per-flow measurement: one-to-one flow-to-counter association and large amount of unused counter space. It eliminates the one-to-one association by randomly hashing a flow label to multiple counters and minimizes counter space by incrementally compressing counts as they accumulate. The random hash values are reproduced offline from a list of flow labels, with which flow sizes are decoded using a fast message passing algorithm. The decoding of Counter Braids introduces the problem of collecting flow labels active in a measurement epoch. An exact solution to this problem is expensive. This paper complements the previous proposal with an approximate flow label collection scheme and a novel error-resilient decoder that decodes despite missing flow labels. The approximate flow label collection detects new flows with variable-length signature counting Bloom filters in SRAM, and stores flow labels in high-density DRAM. It provides a good trade-off between space and accuracy: more than 99 percent of the flows are captured with very little SRAM space. The decoding challenge posed by missing flow labels calls for a new algorithm as the original message passing decoder becomes error-prone. In terms of sparse random graph codes, the problem is equivalent to decoding with graph deficiency, a scenario beyond coding theory. The error-resilient decoder employs a new message passing algorithm that recovers most flow sizes exactly despite graph deficiency. Together, our solution achieves a 10-fold reduction in SRAM space compared to hash-table based implementations, as demonstrated with Internet trace evaluations.
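For those who want the flavor of the first stage, each flow label is hashed to several small counters that are all incremented per packet, and flow sizes are recovered offline by message passing over the resulting sparse graph. Below is a toy Python sketch of the update path only; the braided compression layers, the signature counting Bloom filters and the error-resilient decoder described in the paper are not sketched, and all parameters are made up.

```python
import hashlib

class ToyCounterArray:
    """Toy single-layer counter array in the spirit of Counter Braids:
    each flow label maps to k counters via hashing, and every packet of
    that flow increments all k of them. The braided/compressed layers
    and the message-passing decoder are not implemented here."""

    def __init__(self, num_counters=1024, k=3):
        self.counters = [0] * num_counters
        self.num_counters = num_counters
        self.k = k

    def _indices(self, flow_label):
        # derive k counter indices from the flow label (assumption:
        # salted SHA-1 stands in for the hardware hash functions)
        idx = []
        for salt in range(self.k):
            h = hashlib.sha1(f"{salt}:{flow_label}".encode()).digest()
            idx.append(int.from_bytes(h[:4], "big") % self.num_counters)
        return idx

    def increment(self, flow_label, count=1):
        for i in self._indices(flow_label):
            self.counters[i] += count

arr = ToyCounterArray()
for pkt_flow in ["10.0.0.1->10.0.0.2", "10.0.0.1->10.0.0.2", "10.0.0.3->10.0.0.4"]:
    arr.increment(pkt_flow)
print(sum(arr.counters))   # 3 packets x 3 counters each = 9
```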


Credit & Copyright: Eric J. Zbinden, Space Station in the Moon (via the Astronomy Picture of the Day)
