Topologically Densified Distributions

Roland Kwitt, Christoph Hofer*, Florian Graf, Marc Niethammer

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review


We study regularization in the context of small sample-size learning with over-parametrized neural networks. Specifically, we shift the focus from architectural properties, such as norms on the network weights, to properties of the internal representations before a linear classifier, imposing a topological constraint on samples drawn from the probability measure induced in that space. This provably leads to mass concentration effects around the representations of training instances, a property beneficial for generalization. By leveraging previous work on imposing topological constraints in a neural network setting, we provide empirical evidence (across various vision benchmarks) to support our claim of better generalization.
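The topological constraint described in the abstract builds on the fact that the 0-dimensional persistent homology of a finite point cloud is determined by the edge lengths of its Euclidean minimum spanning tree: each "death time" is one MST edge length. A densification-style penalty can then push these death times toward a common connectivity scale, concentrating same-class representations. The sketch below is illustrative only and is not the authors' implementation; the function names and the target scale `beta` are assumptions for the example.

```python
import numpy as np

def mst_edge_lengths(points):
    """0-dim persistence death times of a point cloud equal the edge
    lengths of its Euclidean minimum spanning tree (Prim's algorithm)."""
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best = dist[0].copy()          # cheapest known edge from tree to each vertex
    edges = []
    for _ in range(n - 1):
        best[in_tree] = np.inf     # never re-select vertices already in the tree
        j = int(np.argmin(best))   # nearest vertex outside the tree
        edges.append(best[j])
        in_tree[j] = True
        best = np.minimum(best, dist[j])
    return np.array(edges)

def densification_loss(points, beta):
    """Penalize deviation of each 0-dim death time from a target scale beta,
    encouraging the batch to form a single beta-connected cluster."""
    deaths = mst_edge_lengths(points)
    return float(np.abs(deaths - beta).mean())
```

In training, such a penalty would be evaluated per class on the mini-batch representations before the linear classifier and added to the task loss; a differentiable version (e.g. via an autodiff framework) is needed for gradient-based optimization.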
Original language: English
Title of host publication: Proceedings of Machine Learning Research
Subtitle of host publication: Proceedings of the 37th International Conference on Machine Learning
Number of pages: 10
Publication status: Published - 12 Jul 2020


Keywords

  • Machine Learning
  • Persistent homology

Fields of Science and Technology Classification 2012

  • 102 Computer Sciences
