What you get for free with Euclidean Neural Networks

  • Online Seminar
  • Date: Sep 23, 2020
  • Time: 16:30
  • Speaker: Dr. Tess Smidt
  • Luis W. Alvarez Postdoctoral Fellow in Computing Sciences, University of California, Berkeley
  • Location: https://us02web.zoom.us/j/87456247140?pwd=b3lVS0NaWmhMQXRJZWtSM3d2MTJ5Zz09
  • Room: Webinar ID: 874 5624 7140 | Password: NOMAD
  • Host: Christian Carbogno
Equivariance to Euclidean symmetry is a simple assumption with many consequences. In this talk, we show that Euclidean-symmetry-equivariant neural networks naturally inherit these consequences.

The geometry and properties of physical systems in 3D transform predictably (equivariantly) under coordinate transformations via elements of Euclidean symmetry -- 3D rotations, translations, and inversion. One of the motivations for incorporating symmetry into machine learning models on 3D data is to eliminate the need for data augmentation -- the 500-fold increase in brute-force training necessary for a model to learn 3D patterns in arbitrary orientations.
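A minimal sketch (not from the talk) of the point above: rotating a structure changes its raw coordinates, so a model without built-in symmetry must see many rotated copies, while a rotation-invariant quantity such as the set of interatomic distances is unchanged. The 3-atom geometry below is hypothetical.

```python
import numpy as np

def rotation_matrix_z(theta):
    """Rotation about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def pairwise_distances(x):
    """All pairwise Euclidean distances between rows of x."""
    diff = x[:, None, :] - x[None, :, :]
    return np.linalg.norm(diff, axis=-1)

# Hypothetical 3-atom geometry (rows are xyz positions).
positions = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])

R = rotation_matrix_z(np.pi / 3)
rotated = positions @ R.T

# Raw coordinates change under rotation ...
assert not np.allclose(positions, rotated)
# ... but the invariant descriptor (pairwise distances) is preserved.
assert np.allclose(pairwise_distances(positions), pairwise_distances(rotated))
```

Augmentation trains a model to learn this invariance from rotated copies of the data; a symmetry-aware model has it by construction.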

Most symmetry-aware machine learning models in the physical sciences avoid augmentation through invariance, throwing away coordinate systems altogether. But this invariance comes at a price: many of the rich consequences of Euclidean symmetry -- geometric tensors, point groups, space groups, atomic orbitals, real and reciprocal space, and second-order phase transitions -- are lost.

In this talk, we discuss how each of these equivariant concepts manifests naturally in Euclidean Neural Networks and give concrete examples of how these properties can be harnessed for applications in the materials and chemical sciences. We also demonstrate how we implement these networks using e3nn, an open-source PyTorch library for creating Euclidean Neural Networks.
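The defining property that equivariant networks such as those built with e3nn guarantee by construction is f(Rx) = Rf(x). The plain-NumPy sketch below (deliberately not e3nn's actual API) illustrates this with the simplest equivariant layer on 3D-vector features: channel mixing with scalar weights, which never mixes the x/y/z components.

```python
import numpy as np

def equivariant_linear(vectors, weights):
    """Mix vector channels with scalar weights.

    vectors: (channels_in, 3) array of 3D vector features
    weights: (channels_out, channels_in) scalar mixing matrix
    Acts on the channel axis only, leaving the xyz axis intact,
    so it commutes with any rotation applied to the vectors.
    """
    return weights @ vectors

def random_rotation(rng):
    """Random 3D rotation via QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q = q * np.sign(np.diag(r))   # fix a sign convention
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1             # ensure det = +1 (a proper rotation)
    return q

rng = np.random.default_rng(0)
v = rng.normal(size=(4, 3))       # 4 vector-valued feature channels
W = rng.normal(size=(2, 4))       # mix 4 channels down to 2
R = random_rotation(rng)

# Rotating the input then applying the layer ...
out_of_rotated = equivariant_linear(v @ R.T, W)
# ... equals applying the layer then rotating the output: f(Rx) = R f(x).
rotated_out = equivariant_linear(v, W) @ R.T
assert np.allclose(out_of_rotated, rotated_out)
```

In full equivariant networks this idea generalizes: features carry irreducible representations of the rotation group, and layers are constrained so that the same commutation property holds for every feature type, not just vectors.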
