

Talks and poster presentations (with proceedings entry):

S. Bulusu, M. Favoni, A. Ipp, D. Mueller, D. Schuh:
"Equivariance and generalization in neural networks";
Talk: A Virtual Tribute to Quark Confinement and the Hadron Spectrum 2021, Stavanger (Norway); 02.08.2021 - 06.08.2021; in: "A Virtual Tribute to Quark Confinement and the Hadron Spectrum (vConf21)", EPJ Web of Conferences, 258 (2022), Paper no. 09001, 8 pages.



Abstract (English):
The crucial role played by the underlying symmetries of high energy physics and lattice field theories calls for the implementation of such symmetries in the neural network architectures that are applied to the physical system under consideration. In this talk we focus on the consequences of incorporating translational equivariance among the network properties, particularly in terms of performance and generalization [1]. The benefits of equivariant networks are exemplified by studying a complex scalar field theory, on which various regression and classification tasks are examined. For a meaningful comparison, promising equivariant and non-equivariant architectures are identified by means of a systematic search. The results indicate that in most of the tasks our best equivariant architectures can perform and generalize significantly better than their non-equivariant counterparts, which applies not only to physical parameters beyond those represented in the training set, but also to different lattice sizes.
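The translational equivariance discussed in the abstract can be illustrated with a minimal sketch (NumPy; not the authors' actual architecture): a convolution with periodic boundary conditions, matching the toroidal topology of the lattice, commutes with lattice shifts, so shifting a field configuration before the layer gives the same result as shifting its output afterwards. All function and variable names below are illustrative.

```python
import numpy as np

def conv2d_periodic(field, kernel):
    """2D cross-correlation with periodic (toroidal) boundary conditions,
    as appropriate for a field configuration on a periodic lattice."""
    H, W = field.shape
    kh, kw = kernel.shape
    out = np.zeros_like(field)
    for i in range(H):
        for j in range(W):
            for a in range(kh):
                for b in range(kw):
                    out[i, j] += kernel[a, b] * field[(i + a) % H, (j + b) % W]
    return out

# Equivariance check: shifting the input lattice and then applying the
# layer equals applying the layer and then shifting the output.
rng = np.random.default_rng(0)
phi = rng.normal(size=(8, 8))   # toy scalar field configuration
k = rng.normal(size=(3, 3))     # filter weights (random stand-in)

shift = (2, 5)
shift_then_conv = conv2d_periodic(np.roll(phi, shift, axis=(0, 1)), k)
conv_then_shift = np.roll(conv2d_periodic(phi, k), shift, axis=(0, 1))
assert np.allclose(shift_then_conv, conv_then_shift)
```

Because the layer contains no lattice-size-dependent parameters, the same kernel can be applied to configurations of any lattice size, which is one ingredient behind the generalization to different lattice sizes mentioned above.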

[1] "Generalization capabilities of translationally equivariant neural networks", S. Bulusu, M. Favoni, A. Ipp, D. I. Müller, D. Schuh, https://arxiv.org/abs/2103.14686


"Offizielle" elektronische Version der Publikation (entsprechend ihrem Digital Object Identifier - DOI)
http://dx.doi.org/10.1051/epjconf/202225809001


Generated from the publication database of the Technische Universität Wien.