Special Guest Presentation: Reduction Mappings for Guided Neural Collapse
Many high-dimensional optimization problems exhibit rich geometric structure in their set of minimizers, which often forms a smooth manifold due to over-parameterization or symmetries. When this structure is known, it can be exploited through reduction mappings that reparametrize part of the parameter space to lie on the solution manifold. We show that well-designed reduction mappings improve the curvature properties of the objective, yielding better-conditioned problems and faster convergence rates for gradient-based methods. We demonstrate this effect by guiding a neural network towards neural collapse, a known optimal configuration for over-parameterized classifier models. The material in this talk is joint work with Evan Markou and Thalaiyasingam Ajanthan, and appears in publications at NeurIPS 2024 and 2025.
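One concrete instance of such known solution structure is the simplex equiangular tight frame (ETF) that characterizes neural collapse: K unit-norm class vectors whose pairwise inner products all equal -1/(K-1). The sketch below constructs this target geometry in NumPy; it is a minimal illustration under the assumption that the feature dimension d is at least K, and the helper name `simplex_etf` is illustrative rather than taken from the referenced papers.

```python
import numpy as np

def simplex_etf(d: int, K: int, seed: int = 0) -> np.ndarray:
    """Return a d x K matrix whose columns form a simplex ETF:
    unit-norm vectors with pairwise inner products -1/(K-1)."""
    assert d >= K, "need feature dimension >= number of classes"
    rng = np.random.default_rng(seed)
    # Random orthonormal basis U (d x K) via reduced QR decomposition.
    U, _ = np.linalg.qr(rng.standard_normal((d, K)))
    # Centre the identity and rescale to unit norm: the ETF geometry.
    M = np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)
    return U @ M

W = simplex_etf(d=64, K=10)
G = W.T @ W  # Gram matrix: 1 on the diagonal, -1/(K - 1) off it.
```

In the spirit of a reduction mapping, one could fix (or repeatedly project) the classifier weights to such an ETF and optimize only the remaining network parameters, so that part of the parameter space is constrained to the known solution manifold; this is a sketch of the general idea, not the specific construction from the papers.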