Topics: maximum likelihood estimation: sample mean and PCA; Stein's phenomenon and shrinkage in high-dimensional statistics; random projection and compressed sensing; extended PCAs: robust PCA, sparse PCA, and MDS with uncertainty; geometry and low-rank structures: manifold learning.
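As a concrete anchor for the first two topics, here is a minimal sketch (not from the source) of the sample mean and PCA computed via the SVD of centered data; the data, sample size, and number of components are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative sketch: sample mean (the Gaussian MLE of location) and PCA.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))           # 200 samples in 5 dimensions

mean = X.mean(axis=0)                   # sample mean
Xc = X - mean                           # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:2]                     # top-2 principal directions
Z = Xc @ components.T                   # projection onto the principal subspace
print(Z.shape)                          # (200, 2)
```

The principal directions are the right singular vectors of the centered data matrix, ordered by singular value, so truncating `Vt` gives the leading principal subspace directly.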
Modern data sets often ''live'' in very high dimensional spaces and carry heavy noise and a lot of uncertainty. To deal with such problems, a manifold assumption is an attractive starting point. Under the manifold assumption, we are interested in understanding the geometric and topological properties of the given data set. We will discuss the graph connection Laplacian, a recently proposed generalization of the traditional graph Laplacian framework, and how it relates to the principal bundle structure encoded in the manifold. We will also discuss its synchronization property and, since noise is inevitable, the robustness of the proposed algorithm to noise. If time permits, applications to the phase retrieval problem and the puzzle solving problem will be discussed.
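To make the traditional framework being generalized concrete, the following sketch builds an ordinary (unnormalized) graph Laplacian from points sampled near a circle, a simple manifold in the plane. This is a simplified stand-in for the graph connection Laplacian discussed above, not that construction itself; the kernel bandwidth `eps` and the sample size are hypothetical choices.

```python
import numpy as np

# Sketch: unnormalized graph Laplacian L = D - W from a Gaussian kernel,
# on noisy samples from a circle (a 1-D manifold embedded in the plane).
rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, size=50)
X = np.column_stack([np.cos(t), np.sin(t)]) + 0.01 * rng.normal(size=(50, 2))

D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared pairwise distances
eps = 0.5                                             # hypothetical bandwidth
W = np.exp(-D2 / eps)                                 # kernel affinity matrix
np.fill_diagonal(W, 0.0)                              # no self-loops
L = np.diag(W.sum(axis=1)) - W                        # graph Laplacian
evals = np.linalg.eigvalsh(L)                         # ascending eigenvalues
```

The smallest eigenvalue is zero (the constant vector lies in the kernel of L), and the low-lying eigenvectors encode the coarse geometry of the point cloud; the graph connection Laplacian extends this by attaching a rotation or unitary matrix to each edge, which is what ties it to the principal bundle structure of the manifold.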