Applied Mathematics Colloquium
Tuesday, March 10, 2020
2:45 PM - 3:45 PM
Speaker:
Barak Sober
Phillip Griffiths Assistant Research Professor, Math Department, Duke University
Title:
The Manifold Moving Least-Squares (Manifold-MLS) Framework: estimating manifolds and reconstructing their atlas from discrete data sets
Abstract:
Differentiable manifolds are an indispensable ‘language’ in modern physics and mathematics, and there is a plethora of analytic tools designed to investigate manifold-based models (e.g., connections, differential forms, curvature tensors, parallel transport, bundles). However, applying these tools normally assumes access to the manifold’s atlas of charts (i.e., local parametrizations). In recent years, manifold-based modeling has permeated data analysis as well, usually as a way to avoid working in high dimensions. In these data-driven models, however, charts are not accessible, and the only information at hand is the samples themselves. As a result, a common practice in manifold learning is to project the data into a lower-dimensional Euclidean domain while maintaining some notion of distance (e.g., geodesic or diffusion).
In this talk we introduce an alternative approach, the Manifold Moving Least-Squares (Manifold-MLS), that, given a finite set of samples, reconstructs an atlas of charts and provides an approximation of the manifold itself. Under certain non-restrictive sampling assumptions, we prove that the Manifold-MLS produces a smooth Riemannian manifold approximating the sampled one, even when the samples are noisy. We show that the approximation converges to the sampled manifold as the number of samples tends to infinity, and we derive the exact convergence rates.
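The classical moving least-squares idea that the framework builds on can be illustrated with a minimal, generic sketch (this is not the speaker's implementation; the function `mls_project`, the Gaussian weight, the degree-2 basis, and the bandwidth `h` are all illustrative choices for the simplest case of a curve in the plane):

```python
import numpy as np

def mls_project(p, samples, h):
    """Project a point p onto a moving least-squares fit of noisy
    curve samples in the plane (generic two-step MLS sketch)."""
    # Gaussian weights centered at p; h controls the neighborhood size
    w = np.exp(-np.sum((samples - p) ** 2, axis=1) / h ** 2)

    # Step 1: estimate a local reference line via weighted PCA
    mu = (w[:, None] * samples).sum(axis=0) / w.sum()
    X = samples - mu
    C = (w[:, None] * X).T @ X
    _, eigvecs = np.linalg.eigh(C)          # eigenvalues in ascending order
    tangent, normal = eigvecs[:, 1], eigvecs[:, 0]

    # Step 2: weighted degree-2 polynomial fit of heights over that line
    t = X @ tangent                          # local coordinates
    f = X @ normal                           # heights above the line
    A = np.column_stack([np.ones_like(t), t, t ** 2])
    AW = A * w[:, None]
    coef = np.linalg.solve(A.T @ AW, AW.T @ f)   # normal equations

    # Evaluate the local polynomial at p's tangent coordinate
    tp = (p - mu) @ tangent
    height = coef[0] + coef[1] * tp + coef[2] * tp ** 2
    return mu + tp * tangent + height * normal
```

The Manifold-MLS framework generalizes this two-step scheme (a weighted local coordinate system, then a weighted local polynomial fit) to d-dimensional manifolds embedded in higher-dimensional space; the sketch above only shows the mechanics in the simplest setting.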
LOCATION:
Morningside
Guests With Disabilities
Columbia University makes every effort to accommodate individuals with disabilities. Please notify us if you need any assistance by contacting the event’s point person. Alternatively, the Office of Disability Services can be reached at 212.854.2388 and [email protected]. Thank you.