**Seminar Abstracts (Spring 2009)**

Laurent Younes

*Smoothing in the dual space: applications to vector fields and frames*

**Abstract:** We discuss a new variational paradigm for measuring the smoothness of data such as fields of unit vectors or fields of rotation matrices over spatial domains,
and a related denoising method for such datasets. Our point of view is to represent such fields as linear forms acting on suitable reproducing
kernel Hilbert spaces and to work in the dual. Experimental results are based on synthetic data and diffusion tensor–magnetic resonance imaging datasets.

Melvin Leok

*Lie group and homogeneous
variational integrators and their applications to geometric optimal control
theory*

**Abstract:**
The geometric approach to mechanics
serves as the theoretical underpinning of innovative control methodologies in
geometric control theory. These techniques allow the attitude of a satellite to
be controlled using changes in its shape, as opposed to chemical propulsion,
and are the basis for understanding the ability of a falling cat to always land
on its feet, even when released in an inverted orientation.

We will discuss the application of
geometric structure-preserving numerical schemes to the optimal control of
mechanical systems. In particular, we consider Lie group variational
integrators, which are based on a discretization of Hamilton's principle that
preserves the Lie group structure of the configuration space. In contrast to
traditional Lie group integrators, issues of equivariance and order-of-accuracy
are independent of the choice of retraction in the variational formulation. The
importance of simultaneously preserving the symplectic and Lie group properties
is also demonstrated.

Recent extensions to homogeneous
spaces yield intrinsic methods for Hamiltonian flows on the sphere, and have
potential applications to the simulation of geometrically exact rods,
structures and mechanisms. Extensions to Hamiltonian PDEs and uncertainty propagation
on Lie groups using noncommutative harmonic analysis techniques will also be
discussed.

We will place recent work in the
context of progress towards a coherent theory of computational geometric
mechanics and computational geometric control theory, which is concerned with
developing a self-consistent discrete theory of differential geometry,
mechanics, and control.

This research is partially supported
by NSF grants DMS-0714223, DMS-0726263, and DMS-0747659.

Dan Naiman

*Multivariate Records*

**Abstract:** Given a vector-valued time series, a multivariate record is said to
occur at time t if no previous observation dominates it in every
coordinate. This notion of a record generalizes the usual notion in one
dimension, and gives rise to some interesting phenomena, some of which
will be presented. This talk will describe an efficient algorithm for
sampling the multivariate records process, as well as a method, based on
the theory of abstract tubes, for calculating the probability of a new
multivariate record at time t, conditional on the past up to time t.
(Joint work with Fred Torcaso.)
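For intuition only (this is not the speaker's efficient sampling algorithm), the defining condition can be checked directly; whether domination is strict or weak is a convention, taken as strict in this sketch:

```python
import numpy as np

def is_record(history, x):
    # x is a record if no earlier observation dominates it in
    # every coordinate (strict domination assumed here).
    return not any(np.all(y > x) for y in history)

def record_times(series):
    # Indices t at which a multivariate record occurs; the first
    # observation is always a record.
    return [t for t, x in enumerate(series) if is_record(series[:t], x)]
```

On the series (2,2), (1,1) only time 0 is a record, since (2,2) dominates (1,1); the efficient sampling scheme and the abstract-tube probability computations of the talk are beyond this sketch.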

Anant Godbole

*Omnibus Sequences*

**Abstract:** Consider locating words of length $k$ over an
alphabet of size $a$ as subsequences (but not necessarily
substrings) of a random length-$n$ string. An $n$-string with
the property that each of the $a^k$ words is present as a
subsequence is called an *omnibus sequence*. We derive
necessary and sufficient conditions for a string to be omnibus
and give conditions under which a random string is almost
always or almost never omnibus. An analysis of the number of
missing words provides another way of looking at this problem.
Several applications are presented. For example, we
demonstrate how Tolstoy's "War and Peace" contains this
abstract, or any other "abstract" of this length, as a
subsequence. Parallels are drawn with the fundamental results
of de Bruijn, and of Chung, Diaconis, and Graham, on *Universal
Cycles*. This is joint work with Sunil Abraham (Oxford), Greg
Brockman (Harvard) and Stephanie Sapp (JHU-AMS).
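As a brute-force illustration (exponential in $k$, and not the talk's characterization), a string can be tested for the omnibus property by greedily matching each of the $a^k$ words as a subsequence:

```python
from itertools import product

def contains_subsequence(s, w):
    # Greedy left-to-right scan: True iff w occurs in s as a
    # (not necessarily contiguous) subsequence.
    it = iter(s)
    return all(c in it for c in w)

def is_omnibus(s, alphabet, k):
    # True iff every length-k word over the alphabet appears in s
    # as a subsequence.
    return all(contains_subsequence(s, w)
               for w in product(alphabet, repeat=k))
```

For instance, "abab" is omnibus for the alphabet {a, b} with $k=2$, while "aab" is not (it misses "bb").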

Jay Rosen

*The $L^2$ modulus of continuity of local times of
Brownian motion*

**Abstract:** Let $\{L^x_t; (x,t)\in\mathbb R^1\times \mathbb R^1_+\}$ denote the local time of Brownian
motion. We refer to $\int(L_t^{x+h}-L_t^x)^2dx$ as the $L^2$ modulus of continuity.
We explain the basic ideas behind the law of large numbers
$$
\lim_{h\downarrow 0}\int \frac{(L_t^{x+h}-L_t^x)^2}{h}\, dx = 4t \quad \mathrm{a.s.}
$$
and central limit theorem
$$
\lim_{h\downarrow 0}\int \frac{(L_t^{x+h}-L_t^x)^2-4ht}{h^{3/2}} dx \stackrel{\mathcal L}{\Longrightarrow} c\sqrt{\alpha_t}W_1
$$
for this modulus of continuity. Here
$$
\alpha_t = \int (L_t^x)^2 dx
$$
and $\{W_t; t\geq 0\}$ is a Brownian motion independent of
$\alpha_t$.

We also discuss analogous results for the local times of stable and
other Lévy processes.

Based on joint work with Xia Chen, Wenbo Li, and Michael B.
Marcus.

Lisa Fleischer

*Submodular Approximation: Sampling-based Algorithms and
Lower Bounds*

**Abstract:**
We introduce several natural optimization problems involving
submodular functions and give asymptotically tight or nearly tight upper
and lower bounds on the approximation guarantees achievable using a
polynomial number of queries to a function-value oracle. The
optimization problems we consider are submodular load balancing,
submodular sparsest cut, submodular balanced cut, and submodular
knapsack.

We also give a new lower bound for approximately learning
a monotone submodular function, and show that much tighter
lower bounds will require a different approach.
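For readers unfamiliar with submodularity, it is the diminishing-returns property $f(A\cup\{x\})-f(A) \ge f(B\cup\{x\})-f(B)$ for $A\subseteq B$. A minimal sketch using a coverage function (a standard monotone submodular example, not one of the talk's problems):

```python
def coverage(sets):
    # f(S) = size of the union of the chosen sets: a standard
    # monotone submodular set function.
    def f(indices):
        covered = set()
        for i in indices:
            covered |= sets[i]
        return len(covered)
    return f

def marginal(f, S, x):
    # Marginal gain of adding element x to the current set S.
    return f(S | {x}) - f(S)
```

With the sets {1,2}, {2,3}, {3,4}, adding set 1 to the empty collection gains 2 elements but only 1 once set 0 is already present, illustrating diminishing returns.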

Trac Tran

*Novel Compressed Sensing Applications in Distributed Video Sensing and Error-Resilient
Video Transmission*

**Abstract:** This talk presents several practical frameworks and algorithms for
video sensing and communication from the viewpoint of the recently
emerged compressed sensing (CS) theory.

We first present a novel framework called Distributed Compressed
Video Sensing (DisCoS), in which video sequences are sampled
intra-frame and reconstructed inter-frame by exploiting
temporal correlation. The key observation here is that a pixel block
in the current frame can be sparsely represented by a linear
combination of a few neighboring blocks in previously reconstructed
frame(s), enabling it to be predicted from its CS measurements by
solving an L1-minimization problem. Simulation results show that
DisCoS significantly outperforms the intra-frame-sensing and
intra-frame-sparse-recovery scheme by 8-10 dB. Unlike previous
distributed video coding schemes, DisCoS can perform its encoding
operations in the analog/optical domain with very low complexity, making
it a promising candidate for applications where the sampling process
is very expensive, e.g., Terahertz imaging.
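As generic background (not the DisCoS recovery step itself, which solves an L1 problem with an inter-frame dictionary), sparse recovery from CS measurements can be sketched with orthogonal matching pursuit:

```python
import numpy as np

def omp(A, y, k):
    # Orthogonal matching pursuit: greedily recover a k-sparse x
    # with y = A x by repeatedly picking the column most correlated
    # with the residual, then re-fitting by least squares.
    n = A.shape[1]
    residual = y.astype(float)
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x
```

With far fewer measurements than unknowns, a random Gaussian sensing matrix typically allows exact recovery of a sufficiently sparse vector.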

The second part of the talk discusses a framework named
layered compressed sensing (LaCoS) for robust video transmission over
lossy packet-switched channels. In the proposed transmission system,
the encoder produces a base layer and an enhancement layer. The
base layer is a conventionally encoded bitstream transmitted
without any error protection. The additional enhancement layer is a
stream of compressed-sensing measurements taken across slices of the
video signal for error resilience. By exploiting the side information
(the base layer) at the decoder, a sparse recovery algorithm can
recover the lost packets while the enhancement layer need transmit
only a minimal number of compressed measurements (proportional
to the packet-loss percentage). Simulation results show that both the
compression efficiency and the error-resilience capacity of the proposed
LaCoS framework are competitive with those of other state-of-the-art
robust transmission methods, in which Wyner-Ziv coders are often used
to generate the enhancement layer.

Jeffrey Leek

*A general framework for multiple testing dependence*

**Abstract:** I will present a general framework for performing large-scale significance
testing in the presence of arbitrarily strong dependence.
We have derived a low-dimensional set of random vectors, called a dependence kernel,
that fully captures the dependence structure in an observed high-dimensional dataset.
This result is a surprising reversal of the "curse of dimensionality" in the
high-dimensional hypothesis testing setting. We have shown theoretically that conditioning
on a dependence kernel is sufficient to render statistical tests independent regardless
of the level of dependence in the observed data. This framework for multiple testing dependence
has implications in a variety of common multiple testing problems,
such as in gene expression studies, brain imaging, and spatial epidemiology.
I will illustrate our approach on a large-scale clinical genomics study of trauma.

Matt Feiszli

*Weil-Petersson metric on plane curves*

**Abstract:** The Weil-Petersson Riemannian metric arises in Teichmüller theory,
where it measures deformations of conformal structures on Riemann
surfaces. We will discuss its use as a metric on shapes, where it
measures changes in the conformal structure of the interior and
exterior of a simple closed plane curve. We will examine the
geometric properties that the WP metric is measuring, and present some
estimates relating WP path lengths to the geometry of the region being
deformed. There is a close connection between the WP metric and
deformations of the medial axes of both the interior and the exterior
of the curve.

Youngmi Hur

*Design of nonseparable $n$-D biorthogonal wavelet filter banks*

**Abstract:**
We consider the design of nonseparable $n$-D biorthogonal wavelet filter banks
for signals with different correlations along different directions. Other
properties (such as fast algorithms) of the new wavelet filters, as well as the
details of the design, will be discussed. We illustrate the use of our wavelet
filter banks with a $2$-D example. If time permits, we will discuss the
properties (such as smoothness) of the corresponding biorthogonal wavelet
functions.

Jonathan Taylor

*TBA*

**Abstract:**
This talk describes a "prototypical" application of random field theory (RFT) to neuroimaging data.
The data we are interested in represent anatomical differences in the brain between
controls and patients who have suffered non-missile trauma. To model the
difference between patients and controls, we use a
multivariate linear model at each location in space (by varying the spatial location we arrive at
a random field model). To compare patients and controls at each point, we use
Hotelling's $T^2$ statistic. RFT is used in the final stage:
detecting regions of activation in the Hotelling's $T^2$ data, which we do by thresholding
the image of $T^2$ statistics.
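For intuition (not part of the talk's pipeline), the Euler characteristic of a thresholded 2-D statistic image, the quantity whose expectation RFT computes, can be obtained from the binary excursion mask by counting vertices, edges, and faces; treating pixels as vertices of a cubical complex is an assumption of this sketch:

```python
import numpy as np

def euler_characteristic(mask):
    # EC = V - E + F for the complex whose vertices are the True
    # pixels, edges join 4-neighbours, and faces are full 2x2 blocks.
    m = np.asarray(mask, dtype=bool)
    V = int(m.sum())
    E = int((m[1:, :] & m[:-1, :]).sum() + (m[:, 1:] & m[:, :-1]).sum())
    F = int((m[1:, 1:] & m[1:, :-1] & m[:-1, 1:] & m[:-1, :-1]).sum())
    return V - E + F
```

A solid block has EC 1, a ring has EC 0 (one component minus one hole), and two separate blobs have EC 2.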

Jonathan Taylor

*TBA*

**Abstract:**
In various scientific fields from astrophysics to neuroimaging,
researchers observe entire images or functions rather than
single observations. In my first talk, I describe a particular
application in which these random functions, or fields, are
used to detect differences between populations in a
neuroimaging study.
The Euler characteristic and other integral-geometric properties of the
level/excursion sets of these random functions, typically modelled as
Gaussian random fields, figure in a crucial way in these applications.
In this talk, I will describe
some of the integral-geometric properties of these random sets,
particularly their average Lipschitz-Killing curvature
measures. I will focus on describing the results for a class
of non-Gaussian random fields (built up from Gaussians) which
highlights the relation between their Lipschitz-Killing
curvature measures and the
classical Kinematic Fundamental Formulae of integral geometry.

Robert McCann

*Extremal Doubly Stochastic Measures and Optimal Transportation*

**Abstract:**
Imagine some commodity being produced at various locations and consumed
at others. Given the cost per unit mass transported, the optimal
transportation problem is to pair consumers with producers so as to
minimize total transportation costs. Despite much study, surprisingly
little is understood about this problem when the producers and consumers
are continuously distributed over smooth manifolds, and optimality is
measured against a cost function encoding some geometry of the product
space.
This talk will be an introduction to optimal transportation, its
relation to Birkhoff's problem of characterizing
extremality among doubly stochastic measures, and recent progress linking
the two. It culminates in the presentation of
a criterion for uniqueness of solutions which subsumes all previous
criteria, yet is among the very first to apply
to smooth costs on compact manifolds, and even then only when the
topology is simple.

Robert McCann

*Continuity, curvature, and the general covariance of optimal
transportation*

**Abstract:**
In this talk, I describe my surprising discovery with Young-Heon
Kim (University of Toronto) that the regularity theory of Ma, Trudinger,
Wang and Loeper for optimal maps is based on a
hidden pseudo-Riemannian structure induced by the cost function on the
product space. Non-negativity of certain sectional
curvatures of this metric gives
This gives a simple direct proof of a key result in the theory,
leads to new examples, and opens several new research directions,
including links to maximal Lagrangian submanifolds of the product space
and para-Kähler geometry.
