Browsing by Author "Berry, Tyrus"
Now showing 1 - 3 of 3
Item
Higher Order Kalman Filtering for Nonlinear Systems
(2022) Easley, Deanna Colonna; Berry, Tyrus
We seek to improve upon and generalize the Ensemble Kalman Filter (EnKF) by defining a Higher Order Kalman Filter. The Kalman filter consists of two steps: forecast and assimilation. In this thesis we develop the forecast step of our desired Higher Order Kalman Filter with the higher order unscented transform (HOUT). The HOUT is a quadrature rule that estimates the first four moments of a distribution, i.e., the mean, covariance, skewness, and kurtosis. We then discuss how to generalize the assimilation step. The original Kalman Filter can be derived in three ways: the Bayesian approach, the Minimum Mean-Square Estimate (MMSE) approach, and the Closure approach. Each derivation provides a different avenue for deriving the Higher Order Kalman Filter. To generalize the Bayesian approach to the first four moments, instead of using a Gaussian likelihood and prior, we use exponentials with a quartic polynomial as the exponent. To generalize the MMSE approach, we consider deriving optimal quadratic filters. Finally, we may generalize the Closure approach by deriving the ordinary differential equations for the skewness and kurtosis; instead of assuming that the skewness is zero, we seek new closures for the first four moments rather than just the first two.

Item
Model Free Techniques for Reduction of High-Dimensional Dynamics
(2013) Berry, Tyrus; Sauer, Timothy
There is a growing need in science and engineering to extract information about complex phenomena from large data sets. A rapidly developing approach to building a model from data is manifold learning, and analysis of such a model may allow isolation of the desired features of the data. By introducing an additional geometric structure, the techniques of differential geometry become available for analyzing the model.
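As a concrete illustration of the manifold-learning setting described above, the following sketch builds a diffusion-maps-style embedding from a Gaussian kernel. The dataset, bandwidth, and normalization here are illustrative assumptions standing in for the dissertation's actual constructions, not its code:

```python
import numpy as np

def diffusion_map(X, epsilon=0.5, n_components=2):
    """Toy diffusion-maps embedding (illustrative sketch only).

    X : (n, d) array of samples assumed to lie near a low-dimensional manifold.
    epsilon : kernel bandwidth (a hand-picked value for this example).
    """
    # Pairwise squared Euclidean distances.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / epsilon)                    # Gaussian (local) kernel
    # Row-normalize to obtain a Markov transition matrix.
    P = K / K.sum(axis=1, keepdims=True)
    # Eigendecompose; the top nontrivial eigenvectors give coordinates.
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial constant eigenvector (eigenvalue 1).
    return vecs[:, 1:n_components + 1] * vals[1:n_components + 1]

# Example: a noisy closed curve in 3-D embedded into 2-D.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta), 0.1 * np.sin(3 * theta)]
Y = diffusion_map(X)
print(Y.shape)  # (200, 2)
```

The kernel here acts only on nearby points (it decays with distance), which is what makes the construction local; the theory of local kernels generalizes exactly this kind of object.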
In this dissertation we extend previous methods of analyzing the geometry of data. Our key contribution is the theory of local kernels, which generalizes previous nonparametric techniques such as Laplacian eigenmaps and diffusion maps. We show that every geometry can be represented by a local kernel in the limit of large data. Moreover, using the discrete exterior calculus (DEC), we show that a local kernel can be used to introduce a discrete Hodge star operator on a data set. This shows that local kernels introduce a discrete geometry on a data set without the need for an explicit simplicial complex.

Item
Towards a Common Dimensionality Reduction Approach: Unifying PCA, tSNE, and UMAP through a Cohesive Framework
Draganov, Andrew; Berry, Tyrus
Dimensionality reduction is a widely studied field that is used to visualize data, cluster samples, and extract insights from high-dimensional distributions. Classical approaches such as PCA, Isomap, and Laplacian eigenmaps rely on clear optimization strategies, while more modern approaches such as tSNE and UMAP define gradient descent search spaces through disparities between the high- and low-dimensional datasets. In this work, we observe that all of these approaches can be interpreted as minimizing the difference between two kernel functions: one for the high-dimensional space and one for the low-dimensional space. In particular, once we abstract the kernel functions, we can develop a common framework for any dimensionality reduction problem. Namely, one needs to identify the high-dimensional distance kernel, the low-dimensional distance kernel, and the method used for minimization. With this in mind, we identify the relevant general framework and then proceed to discuss the ways in which PCA, tSNE, and UMAP all fit into it. For each, we discuss insights that were obtained during the process. We lastly highlight next steps and directions for future work.
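The kernel-matching template described in the last abstract can be sketched as follows. Gaussian kernels on both spaces and plain gradient descent are illustrative assumptions here, standing in for the method-specific kernels and optimizers of PCA, tSNE, and UMAP:

```python
import numpy as np

def sq_dists(Z):
    """Matrix of pairwise squared Euclidean distances."""
    return np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)

def gaussian_kernel(D2, sigma=1.0):
    """Gaussian affinity; an assumed stand-in for the method-specific
    kernels of PCA, tSNE, or UMAP."""
    return np.exp(-D2 / (2.0 * sigma ** 2))

def kernel_matching_embedding(X, dim=2, steps=300, lr=0.01, seed=0):
    """Minimize sum_ij (K_high_ij - K_low_ij)^2 over low-dimensional
    positions Y by gradient descent -- the shared template: pick a
    high-dimensional kernel, a low-dimensional kernel, and a minimizer."""
    rng = np.random.default_rng(seed)
    K_high = gaussian_kernel(sq_dists(X))            # fixed target kernel
    Y = 0.01 * rng.standard_normal((X.shape[0], dim))
    for _ in range(steps):
        K_low = gaussian_kernel(sq_dists(Y))
        W = (K_low - K_high) * K_low                 # residual times dK/dD2 factor
        # Gradient of the squared-difference loss w.r.t. Y (chain rule, sigma = 1).
        grad = -4.0 * (W.sum(axis=1, keepdims=True) * Y - W @ Y)
        Y -= lr * grad
    final_loss = np.sum((gaussian_kernel(sq_dists(Y)) - K_high) ** 2)
    return Y, final_loss

X = np.random.default_rng(1).standard_normal((60, 10))
Y, loss = kernel_matching_embedding(X)
print(Y.shape)  # (60, 2)
```

Swapping either kernel or the minimizer recovers different methods: a linear map with a dot-product kernel points toward PCA, while heavy-tailed low-dimensional kernels point toward tSNE and UMAP.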