Tanmay Kumar Sinha – Geometric Methods

November 2022

Tanmay Kumar Sinha received his Master of Science – Dual Degree in Computer Science and Engineering (CSE). His research work was supervised by Dr. Pawan Kumar. Here’s a summary of his research work on Geometric Methods for Tensor Completion:

Optimization is the mathematical problem of making the best use of the resources available to us: maximizing our reward or, equivalently, minimizing our cost. It is a vast field with important applications, and its importance has only increased with the rise of machine learning and artificial intelligence, since most machine learning problems involve optimization in some way. Optimization problems come in various forms. We consider constrained, continuous optimization problems with differentiable cost functions and a specific geometric structure on the constraint set: that of a Riemannian manifold, a smooth space that locally resembles Euclidean space.

The paradigm of geometric (or Riemannian) optimization tackles constrained optimization problems on Euclidean spaces by translating them into unconstrained optimization problems over Riemannian manifolds, which may be non-Euclidean spaces. Using concepts from Riemannian geometry, we can develop optimization algorithms on these manifolds. Since the manifolds have a smaller dimension than the original Euclidean space, these Riemannian optimization algorithms tend to be scalable and fast, and we can prove guarantees on their convergence.

This thesis is concerned with the application of geometric optimization techniques to machine learning problems. Specifically, we consider the task of low-rank tensor completion. This problem extends the low-rank matrix completion problem, an important problem with applications in recommendation systems (like the famous Netflix Prize problem). In recent years, there has been increased interest in multi-dimensional data, represented as tensors: higher-dimensional analogues of matrices. Flattening such data into a matrix fundamentally alters its structure and causes loss of information. This motivates us to study tensor problems directly.
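As a minimal illustration of the Riemannian optimization paradigm (a standard textbook example, not a method from the thesis), consider minimizing the Rayleigh quotient over the unit sphere: a constrained Euclidean problem that becomes an unconstrained problem on the manifold. The function name and step size below are illustrative choices:

```python
import numpy as np

def riemannian_gd_sphere(A, x0, step=0.1, iters=500):
    """Riemannian gradient descent on the unit sphere, minimizing
    f(x) = x^T A x. The minimizer is an eigenvector of A for the
    smallest eigenvalue."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2 * A @ x                  # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x    # project onto tangent space at x
        x = x - step * rgrad               # step in the tangent space
        x = x / np.linalg.norm(x)          # retraction: map back onto the sphere
    return x

A = np.diag([3.0, 2.0, 1.0])
x = riemannian_gd_sphere(A, np.array([1.0, 1.0, 1.0]))
# x is (approximately) the eigenvector for the smallest eigenvalue of A
```

The two manifold-specific ingredients, the tangent-space projection and the retraction, are exactly what changes when the sphere is replaced by the manifolds used for tensor completion.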
Tensor completion is one such problem: given sparse observations, we attempt to recover the original tensor under the assumption that it has small rank. It has applications in visual data inpainting and recommender systems, among others. As an extension, we study the general structured tensor completion problem, where, in addition to the low-rank structure, the tensor must also satisfy structural constraints such as nonnegativity. The dual of the structured tensor completion problem has a constraint set that is a Riemannian manifold. We develop optimization algorithms for this dual problem, provide experimental results showing that the method is competitive, and provide theoretical guarantees for the recovered tensor.