Ugrads invited too.
---------- Forwarded message ----------
From: Jeff Bilmes <bilmes@ee.washington.edu>
Date: Mon, Aug 22, 2011 at 4:10 PM
Subject: [cs-ugrads] [Speech-seminar] Research Seminar: Suvrit Sra, Max Planck Institute, Tuebingen, 8/25 11am-12noon, PAC AE108
To: speech-seminar@crow.ee.washington.edu, uw-ml@cs.washington.edu
Title: Positive definite matrices and the SB-Divergence
Speaker: Suvrit Sra
Max Planck Institute for Intelligent Systems,
Tuebingen, Germany
Thursday, August 25th, 11:00am-12:00noon
PAC (Paul Allen Center) AE108
We encounter kernels, Laplacians, covariances, and other positive
definite (PD) matrices in a dazzling variety of applications. This
ubiquity of PD matrices can be attributed in part to their rich
geometric structure: they form a differentiable Riemannian
manifold. But exploiting this manifold structure is nontrivial, as
even basic tasks such as intermatrix distance computation are
complicated. To partially address these concerns, we propose a new
distance-like measure: the Symmetric-Burg (SB)-Divergence; this
measure not only mirrors several key properties of the Riemannian
distance but also simplifies computation. I substantiate these
claims with theoretical and practical results drawn from my
recent (and ongoing) work.
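The abstract does not write out the divergence or the Riemannian distance. As a rough sketch only, assuming the symmetrized Burg (log-det) form S(X, Y) = logdet((X+Y)/2) - (1/2) logdet(X) - (1/2) logdet(Y) from Sra's related work, and the standard affine-invariant Riemannian distance d(X, Y) = ||log(X^{-1/2} Y X^{-1/2})||_F, the computational contrast might look like this:

```python
import numpy as np
from scipy.linalg import cholesky, eigvalsh


def s_divergence(X, Y):
    """Symmetrized Burg (log-det) divergence between PD matrices.

    Assumed form: logdet((X+Y)/2) - 0.5*logdet(X) - 0.5*logdet(Y).
    Needs only Cholesky factorizations -- no eigendecomposition.
    """
    def logdet(A):
        # logdet(A) = 2 * sum(log(diag(L))) for A = L L^T
        L = cholesky(A, lower=True)
        return 2.0 * np.sum(np.log(np.diag(L)))

    return logdet((X + Y) / 2.0) - 0.5 * logdet(X) - 0.5 * logdet(Y)


def riemannian_distance(X, Y):
    """Affine-invariant Riemannian distance on the PD manifold.

    ||log(X^{-1/2} Y X^{-1/2})||_F, computed via the generalized
    symmetric eigenproblem Y v = w X v (same eigenvalues).
    """
    w = eigvalsh(Y, X)  # generalized eigenvalues, all positive for PD X, Y
    return np.sqrt(np.sum(np.log(w) ** 2))
```

Both quantities are zero exactly when X = Y and are symmetric in their arguments, but the divergence avoids the (generalized) eigendecomposition the Riemannian distance requires, which is the kind of computational simplification the abstract alludes to.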
To offer motivation beyond the intrinsic geometric beauty of PD
matrices, I highlight an application where the SB-Divergence allows us
to develop an efficient method for covariance-based image retrieval.
A key step of our algorithm requires solving certain nonlinear matrix
equations; for these I present a solution method that may be of
independent interest.