Point Edge Transformer

  • NOMAD Laboratory
  • Date: Jun 5, 2025
  • Time: 11:00 AM (Local Time Germany)
  • Speaker: Dr. Sergey Pozdnyakov
  • EPFL, LIAC Group, Lausanne, Switzerland
  • Location: Building T
  • Room: 0.18/0.19
  • Host: NOMAD Laboratory
Abstract:

Over the last decade, machine-learning interatomic potentials have become vital tools for simulating molecules and materials, unlocking time scales and system sizes that were once out of reach. Many current state-of-the-art models rely on graph neural networks (GNNs). In this talk, I will focus on the Point Edge Transformer, which features a few unique design choices that set it apart from most other GNN-based potentials. Most importantly, it builds its hidden representations on every edge between atoms within a cutoff, whereas most GNNs encode information at the atomic level. This edge-centric design allows us to define arbitrarily deep models without the undesirable growth of the receptive field that added depth usually brings. Our potential achieves state-of-the-art performance on several benchmark datasets of molecules and solids, such as the COLL and High-Entropy-Alloys (HEA) datasets. See more details at https://arxiv.org/abs/2305.19302.
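To make the edge-centric idea concrete, here is a minimal illustrative sketch (not the actual PET implementation; function names and the averaging step are hypothetical stand-ins for learned attention): hidden features live on directed edges within a cutoff, and one "layer" only mixes edges that share the same central atom, so stacking layers does not expand the receptive field beyond the cutoff neighborhood.

```python
import numpy as np

def neighbor_edges(positions, cutoff):
    """Directed edges (i, j) with |r_i - r_j| < cutoff; in an edge-centric
    model, hidden features are attached to these edges, not to atoms."""
    n = len(positions)
    return [(i, j) for i in range(n) for j in range(n)
            if i != j and np.linalg.norm(positions[i] - positions[j]) < cutoff]

def mix_within_atom(edges, feats):
    """Toy stand-in for one layer: edges sharing the same central atom i
    exchange information (here: a plain average instead of attention).
    Stacking this any number of times never reaches atoms outside i's
    cutoff neighborhood, so depth does not inflate the receptive field."""
    groups = {}
    for e, (i, _) in enumerate(edges):
        groups.setdefault(i, []).append(e)
    out = feats.copy()
    for members in groups.values():
        out[members] = feats[members].mean(axis=0)
    return out

# Three atoms in a line; atoms 0-1 and 1-2 are within the cutoff.
positions = np.array([[0.0, 0, 0], [1.5, 0, 0], [3.0, 0, 0]])
edges = neighbor_edges(positions, cutoff=2.0)   # [(0,1), (1,0), (1,2), (2,1)]
feats = np.eye(len(edges))                      # one toy feature vector per edge
feats = mix_within_atom(edges, feats)           # only atom 1's two edges mix
```

In the real model, the averaging above is replaced by a transformer attending over the edge tokens of each atom, which is what lets PET be made arbitrarily deep at fixed receptive field.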

Bio:

I obtained my B.Sc. from the Moscow Institute of Physics and Technology (MIPT), my M.Sc. from the Skolkovo Institute of Science and Technology (Skoltech) in Prof. Artem R. Oganov’s group, and my Ph.D. from the Swiss Federal Institute of Technology Lausanne (EPFL) in Prof. Michele Ceriotti’s lab. I am currently a postdoctoral researcher in EPFL’s LIAC group under Prof. Philippe Schwaller. My scientific interests focus primarily on Machine Learning Interatomic Potentials (MLIPs); see my Google Scholar profile for more details: https://scholar.google.com/citations?user=1-uZ3uYAAAAJ&hl=en.
