Wednesday, April 24, 2024 11am to 12:15pm
About this Event
Engineering 2, 1156 High Street, Santa Cruz, California 95064
#CSEcolloquium
Presenter: Tamara Kolda
Abstract:
In many real-world situations, we have data from discrete measurements on a set of continuous processes. We focus on the case where the data can be structured as a multi-way array or *tensor*. For instance, we might consider a tensor T(i,j,k) that comprises features (indexed by i) from a set of subjects (indexed by j) measured repeatedly at different points in time (indexed by k). Even if we collect discrete and regularly spaced measurements, some modes of a tensor (such as the time mode) may be best described using continuous functions. We refer to such a tensor as a **quasitensor**. In the example above, we might view this instead as a quasitensor T(i,j,x) where i and j are discrete and x is a continuous variable (time).
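As a concrete illustration of the difference between a tensor and a quasitensor, here is a minimal Python sketch; all sizes, names, and data below are hypothetical and chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Regular tensor T(i, j, k): 5 features x 8 subjects x 12 shared time points.
T = rng.standard_normal((5, 8, 12))

# Quasitensor T(i, j, x): the time mode is continuous, so each subject may be
# observed at its own irregular, unaligned times. One simple representation
# stores, per subject, the sample times and the features measured there.
times = [np.sort(rng.uniform(0.0, 1.0, size=rng.integers(6, 15)))
         for _ in range(8)]
obs = [rng.standard_normal((5, len(t))) for t in times]  # obs[j][i, :] at times[j]
```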
The CP tensor decomposition is a fundamental tool in data analysis that approximates a data tensor by a sum of outer products of vectors. In this work, we develop a decomposition for a quasitensor as a sum of outer products of vectors (finite dimensional) and smooth *functions* (infinite dimensional), which we refer to as the **hybrid infinite and finite dimensional (HIFI)** quasitensor decomposition. This is similar in spirit to functional principal component analysis (FPCA) and Gaussian process factor analysis (GPFA) for two-way data. Major advantages of the HIFI quasitensor decomposition are that it naturally incorporates irregular and unaligned measurements, models the infinite-dimensional modes explicitly with smooth functions, and admits a natural definition of interpolation between sparse observations.
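To make the contrast concrete, here is a hedged sketch of a rank-R CP model built from outer products, followed by a HIFI-style variant in which the discrete time factors are replaced by smooth functions. The stand-in function `f` is illustrative only; the actual method fits functions from an RKHS:

```python
import numpy as np

I, J, K, R = 5, 8, 12, 3  # illustrative sizes and rank
rng = np.random.default_rng(1)
A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))

# Standard CP: T(i,j,k) ~ sum_r A[i,r] * B[j,r] * C[k,r],
# i.e., a sum of R outer products of vectors.
T_cp = np.einsum('ir,jr,kr->ijk', A, B, C)

# HIFI-style idea (sketched): swap the discrete time factors C[:, r]
# for smooth functions f_r(x), so the model is defined at *any* time x.
def f(x, r):
    # stand-in smooth function; the actual method uses RKHS functions
    return np.sin((r + 1) * np.pi * x)

def T_hifi(i, j, x):
    return sum(A[i, r] * B[j, r] * f(x, r) for r in range(R))

value = T_hifi(0, 0, 0.37)  # evaluate between any observed time points
```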
This talk is aimed at anyone interested in data analysis who has a basic knowledge of linear algebra. The talk will provide the background on tensors, the CP tensor decomposition, and reproducing kernel Hilbert spaces (RKHS) needed to describe the methodology of the HIFI quasitensor decomposition. We will relate these ideas to those in matrix (two-way data) analysis, as well as to other notions of smoothness and related approaches to missing data for the standard CP decomposition.
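As a small taste of the RKHS background, here is a toy kernel ridge regression sketch showing how an RBF kernel smoothly interpolates sparse, irregular observations; this is purely illustrative and not the talk's algorithm:

```python
import numpy as np

# RBF (Gaussian) kernel between two sets of 1-D points.
def rbf(x, y, ell=0.2):
    return np.exp(-(x[:, None] - y[None, :])**2 / (2 * ell**2))

rng = np.random.default_rng(2)
x_obs = np.sort(rng.uniform(0, 1, 10))          # irregular sample times
y_obs = np.sin(2 * np.pi * x_obs) + 0.05 * rng.standard_normal(10)

lam = 1e-3                                       # ridge regularization
alpha = np.linalg.solve(rbf(x_obs, x_obs) + lam * np.eye(10), y_obs)

x_new = np.linspace(0, 1, 200)                   # evaluate anywhere in [0, 1]
y_new = rbf(x_new, x_obs) @ alpha                # smooth interpolant
```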
This talk is based on collaborations with Brett Larsen (Mosaic), Runshi Tang (Duke University), Alex Williams (Flatiron Institute/Columbia University), and Anru Zhang (Duke University).
Bio:
Tamara Kolda is an independent mathematical consultant under the auspices of her company MathSci.ai, based in California. From 1999 to 2021, she was a researcher at Sandia National Laboratories in Livermore, California. She specializes in mathematical algorithms and computational methods for tensor decompositions, tensor eigenvalues, graph algorithms, randomized algorithms, machine learning, network science, numerical optimization, and distributed and parallel computing.
She is a member of the National Academy of Engineering (NAE), Fellow of the Society for Industrial and Applied Mathematics (SIAM), and Fellow of the Association for Computing Machinery (ACM). She currently serves as the founding Chair of the SIAM Activity Group on Equity, Diversity, and Inclusion (SIAG-EDI) and as a member of the National Academies’ Board on Mathematical Sciences and Analytics (BMSA). She was the founding editor-in-chief for the SIAM Journal on Mathematics of Data Science (SIMODS).
She is the author of two forthcoming books: *Tensor Decompositions for Data Science* (Cambridge University Press, with coauthor Grey Ballard) and *Unlocking LaTeX Graphics: A Concise Guide to TikZ/PGF and PGFPLOTS*.
Hosted by: Professor Sesh Commandur
Zoom: https://ucsc.zoom.us/j/94784153196?pwd=cmZFeTJGZHNPaE9XV3NxRHRhOTR1dz09