Robust Matrix Factorization using the Density Power Divergence and its Applications
Date
2025-06
Publisher
Indian Statistical Institute, Kolkata
Abstract
In the modern era of big data, high-dimensional datasets are becoming increasingly common across a range of disciplines, including machine learning, natural language processing, finance, and genomics. Extracting meaningful information from these datasets often requires uncovering low-dimensional structures hidden within the data. Singular Value Decomposition (SVD) and Principal Component Analysis (PCA) are widely used matrix factorization techniques for this purpose. However, the traditional methods of computing them are extremely sensitive to outliers: even a single aberrant observation can lead to highly imprecise results. This issue is exacerbated in high-dimensional datasets, where outliers are difficult to detect. Classical robust inference techniques, such as M-estimators, struggle because their breakdown points diminish as the data dimension becomes extremely large.
This thesis addresses these challenges by proposing a novel class of robust matrix factorization techniques based on the minimum density power divergence estimator (MDPDE). The MDPDE, a member of the broader class of minimum divergence estimators, is well known for its robustness and efficiency across diverse applications. Crucially, it offers a dimension-free asymptotic breakdown point, making it particularly well suited for high-dimensional settings. In this work, we leverage this estimator to develop robust versions of SVD and PCA, referred to as rSVDdpd and rPCAdpd, respectively.
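For reference, the abstract does not display the divergence itself; the standard form of the density power divergence between a true density g and a model density f, with tuning parameter α > 0, is

```latex
d_\alpha(g, f) \;=\; \int \left\{ f^{1+\alpha}(x)
  \;-\; \left(1 + \tfrac{1}{\alpha}\right) g(x)\, f^{\alpha}(x)
  \;+\; \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \right\} dx,
\qquad \alpha > 0.
```

As α → 0+, d_α tends to the Kullback–Leibler divergence, so the MDPDE recovers maximum likelihood in the limit; larger α trades some efficiency for greater robustness.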
The thesis is structured as follows. Chapter 1 provides the necessary background on classical matrix factorization techniques, introduces key concepts related to minimum divergence estimators, particularly the MDPDE, and fixes the notation used throughout the thesis. Chapter 2 presents the novel rSVDdpd algorithm, detailing its theoretical properties, including several equivariance properties, algorithmic convergence and consistency. Through simulation studies, we demonstrate the algorithm's superior robustness compared to existing methods, particularly in high-dimensional settings. We also apply the rSVDdpd algorithm to the problem of video surveillance background modelling, showcasing its real-world applicability.
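The abstract does not spell out the algorithm, but the flavour of a DPD-based robust factorization can be conveyed with a toy sketch: fit a rank-1 model X ≈ uvᵀ by iteratively reweighted least squares, where each entry receives the weight exp(−α r²/(2σ²)) suggested by a normal-model DPD objective. This is a simplified illustration under our own assumptions (the name rank1_dpd, the fixed scale sigma, and the update rules are ours), not the rSVDdpd algorithm of Chapter 2.

```python
import numpy as np

def rank1_dpd(X, alpha=0.5, sigma=1.0, iters=200):
    """Toy robust rank-1 fit X ~ u v^T (an illustration, NOT rSVDdpd).

    Iteratively reweighted least squares under a normal error model with
    fixed scale sigma: an entry with residual r gets weight
    exp(-alpha * r^2 / (2 sigma^2)), so gross outliers are effectively
    ignored in the updates.
    """
    # Warm start from the classical (non-robust) rank-1 SVD.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    u = U[:, 0] * np.sqrt(s[0])
    v = Vt[0] * np.sqrt(s[0])
    for _ in range(iters):
        r = X - np.outer(u, v)                      # entrywise residuals
        w = np.exp(-alpha * r**2 / (2 * sigma**2))  # DPD-style downweighting
        u = (w * X) @ v / (w @ v**2)                # weighted LS update of u
        r = X - np.outer(u, v)
        w = np.exp(-alpha * r**2 / (2 * sigma**2))
        v = (w * X).T @ u / (w.T @ u**2)            # weighted LS update of v
    return u, v
```

Because each weight decays exponentially in the squared residual, a single gross outlier contributes essentially nothing to the updates, whereas the classical SVD lets it tilt the leading singular vectors.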
Chapter 3 extends this methodology to robust PCA, resulting in the rPCAdpd algorithm. We establish its theoretical properties, such as orthogonal equivariance, consistency and asymptotic normality. We also demonstrate that its influence function remains bounded, ensuring its robustness to outliers. Comparative studies on benchmark datasets reveal that rPCAdpd outperforms existing robust PCA algorithms, particularly for high-dimensional data with a low signal-to-noise ratio.

The robust SVD and PCA algorithms introduced in Chapters 2 and 3 require a
robust estimate of the rank of the low-dimensional component of the data matrix. To this
end, we propose a new penalized criterion, DICMR, in Chapter 4. Theoretical results on
selection consistency and B-robustness are established, and extensive simulation studies
show that DICMR is the best-performing penalized method and offers performance competitive with cross-validation methods while remaining computationally efficient.
A key contribution of this thesis, explored in Chapter 5, is the demonstration that the MDPDE admits a dimension-free lower bound on its asymptotic breakdown point. This property makes it uniquely robust in high-dimensional settings, a significant improvement over classical M-estimators. We further generalize this result in Chapter 6, showing that the dimension-free breakdown point holds for a broader class of estimators known as minimum generalized Alpha-Beta divergence estimators. We derive the necessary and sufficient conditions under which the corresponding divergence measures are well defined and nonnegative, contributing to the theoretical understanding of how to generate novel statistical divergence measures that may lead to robust estimation in high-dimensional data.
Chapter 7 concludes the thesis, summarizing the key findings and outlining directions
for future research. This includes potential extensions of the proposed algorithms to other
matrix factorization problems and the exploration of more practical applications beyond
those demonstrated in the thesis.
Overall, this thesis aims to contribute to the field of robust statistics by developing
scalable, robust matrix factorization techniques with strong theoretical guarantees and
practical relevance in high-dimensional data analysis.
Description
This thesis is under the supervision of Prof. Ayanendranath Basu and Prof. Abhik Ghosh.
Keywords
Density Power Divergence, Robust Estimation, Singular Value Decomposition, Principal Component Analysis
Citation
231p.
