000 05879nam a22003017a 4500
001 th646
003 ISI Library, Kolkata
005 20250911173317.0
008 250911b |||||||| |||| 00| 0 eng d
040 _aISI Library
_bEnglish
082 0 4 _223rd
_aSA.13
_bR888
100 1 _aRoy, Subhrajyoty
_eauthor
245 1 0 _aRobust Matrix Factorization using the Density Power Divergence and its Applications /
_cSubhrajyoty Roy
260 _aKolkata :
_bIndian Statistical Institute,
_c2025.
300 _axvi, 214 pages
502 _aThesis (Ph.D.)--Indian Statistical Institute, 2025
504 _aIncludes bibliographical references
505 0 _aBackground -- Robust Singular Value Decomposition -- Robust Principal Component Analysis -- Rank Estimation -- Breakdown Analysis of Minimum Super Divergence Estimator -- Breakdown Analysis of Minimum Generalized Alpha-Beta Divergence Estimator -- Conclusion and Future Scopes
508 _aGuided by Prof. Ayanendranath Basu & Prof. Abhik Ghosh
520 _aIn the modern era of big data, high-dimensional datasets are becoming increasingly common across a range of disciplines, including machine learning, natural language processing, finance, and genomics. Extracting meaningful information from these datasets often requires uncovering low-dimensional structures hidden within the data. Singular Value Decomposition (SVD) and Principal Component Analysis (PCA) are widely used matrix factorization techniques for this purpose. However, the traditional methods for computing these are extremely sensitive to outliers, with even a single aberrant observation potentially leading to highly imprecise results. This issue is exacerbated in high-dimensional datasets, where outliers are difficult to detect. Classical robust inference techniques, such as M-estimators, struggle due to their diminishing breakdown points as the data dimension becomes extremely large.
This thesis addresses these challenges by proposing a novel class of robust matrix factorization techniques based on the minimum density power divergence estimator (MDPDE). The MDPDE, a member of the broader class of minimum divergence estimators, is well known for its robustness and efficiency across diverse applications. Crucially, it offers a dimension-free asymptotic breakdown point, making it particularly well suited for high-dimensional settings. In this work, we leverage this estimator to develop robust versions of SVD and PCA, referred to as rSVDdpd and rPCAdpd, respectively.
The thesis is structured as follows. In Chapter 1, we provide the necessary background on classical matrix factorization techniques, introduce key concepts related to minimum divergence estimators, particularly the MDPDE, and set out the notation used throughout the thesis. Chapter 2 presents the novel rSVDdpd algorithm, detailing its theoretical properties, including various equivariance properties, algorithmic convergence and consistency. Through simulation studies, we demonstrate the algorithm's superior robustness compared to existing methods, particularly in high-dimensional settings. We also apply the rSVDdpd algorithm to the problem of video surveillance background modelling, showcasing its real-world applicability.
Chapter 3 extends this methodology to robust PCA, resulting in the rPCAdpd algorithm. We establish its theoretical properties such as orthogonal equivariance, consistency and asymptotic normality. We also demonstrate that its influence function remains bounded, ensuring its robustness to outliers. Comparative studies with benchmark datasets reveal that rPCAdpd outperforms existing robust PCA algorithms, particularly in scenarios with high-dimensional data and a low signal-to-noise ratio.
The robust SVD and PCA algorithms introduced in Chapters 2 and 3 require a robust estimate of the rank of the low-dimensional component of the data matrix. To this end, we propose a new penalized criterion, DICMR, in Chapter 4. Theoretical results on selection consistency and B-robustness are established, and extensive simulation studies show that DICMR is the best-performing among penalized methods, while also providing competitive performance relative to cross-validation methods at lower computational cost.
A key contribution of this thesis, explored in Chapter 5, is the demonstration that the MDPDE has a dimension-free lower bound on its asymptotic breakdown point. This property makes it uniquely robust in high-dimensional settings, a significant improvement over classical M-estimators. We further generalize this result in Chapter 6, showing that the dimension-free breakdown point holds for a broader class of estimators known as minimum generalized Alpha-Beta divergence estimators. We derive the necessary and sufficient conditions under which the corresponding divergence measures are well-defined and nonnegative, contributing to the theoretical understanding of how novel statistical divergence measures that may lead to robust estimation in high-dimensional data can be generated.
Chapter 7 concludes the thesis, summarizing the key findings and outlining directions for future research. These include potential extensions of the proposed algorithms to other matrix factorization problems and the exploration of further practical applications beyond those demonstrated in the thesis. Overall, this thesis aims to contribute to the field of robust statistics by developing scalable, robust matrix factorization techniques with strong theoretical guarantees and practical relevance in high-dimensional data analysis.
650 4 _aStatistics
650 4 _aRobust Estimation
650 4 _aDensity Power Divergence
650 4 _aSingular Value Decomposition
650 4 _aPrincipal Component Analysis
856 _uhttps://dspace.isical.ac.in/jspui/handle/10263/7574
_yFull text
942 _2ddc
_cTH
999 _c437307
_d437307