Host: NOMAD Laboratory

Thermodynamic properties by on-the-fly machine-learned interatomic potentials: thermal transport and phase transitions

Towards ex-machina computations of transport and transformations in complex materials

Advancing fundamental science with Machine Learning at DeepMind

Deep learning has had a transformative impact on computer science and is now being applied to the natural sciences. In this talk, I will give an overview of recently published work applying machine-learning techniques to fundamental science problems at DeepMind. I will cover super-human quantum-dot tuning, advances in quantum Monte Carlo with neural-network ansätze, transfer learning for predicting experimental material properties, and, finally, recent advances in protein structure prediction. These case studies exemplify the three kinds of impact we can expect in the coming years: automating the experimental research pipeline, exploiting the representational power of neural networks as functional forms, and extracting knowledge from data.

Reaching for the stars with density functional theory

Accurately modeling warm dense matter deep inside astrophysical objects is a grand challenge. The associated thermodynamic states are characterized by solid-state densities, temperatures of thousands of Kelvin, and GPa pressures. The extremity of the conditions can vary greatly depending on the mass, radius, and composition of the studied object, ranging from several GPa in planetary mantles to millions of GPa at the center of stellar interiors. A method that has proven highly successful in describing this peculiar state of matter is density functional theory molecular dynamics (DFT-MD).

Automatic topography of multidimensional probability densities

A Seminar of the NOMAD Laboratory
Unsupervised methods in data analysis aim at obtaining a synthetic description of high-dimensional data landscapes, revealing their structure and their salient features. We will describe an approach for charting complex and heterogeneous data spaces, providing a topography of the high-dimensional probability density from which the data are harvested.

Introduction to Approximate Bayesian Computation

The goal of statistical inference is to draw conclusions about properties of a population from a finite observed sample. This typically proceeds by first specifying a parametric statistical model (which defines a likelihood function) for the data-generating process, indexed by parameters that need to be calibrated (estimated). There is always a trade-off between model simplicity, inferential effort, and predictive power.
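Approximate Bayesian Computation sidesteps the likelihood function entirely: one draws parameters from the prior, simulates data from the model, and keeps only the draws whose simulated data resemble the observations. The following is a minimal rejection-ABC sketch (not taken from the talk); the Gaussian model, uniform prior, tolerance `eps`, and sample-mean summary statistic are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed sample; pretend the generative model Normal(mu, 1) has unknown mu.
observed = rng.normal(2.0, 1.0, size=100)

def summary(x):
    """Summary statistic used to compare simulated and observed data."""
    return x.mean()

def abc_rejection(observed, n_draws=20000, eps=0.1):
    """Rejection ABC: keep prior draws whose simulated summary lands within eps."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        mu = rng.uniform(-5.0, 5.0)                    # draw from the prior
        sim = rng.normal(mu, 1.0, size=len(observed))  # simulate a dataset
        if abs(summary(sim) - s_obs) < eps:            # likelihood-free accept/reject
            accepted.append(mu)
    return np.asarray(accepted)

posterior = abc_rejection(observed)  # approximate posterior sample for mu
```

Shrinking `eps` makes the accepted draws a better approximation of the true posterior, at the cost of a lower acceptance rate; this is exactly the simplicity/effort trade-off noted above.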

Smart Sampling for Chemical Property Landscapes with BOSS

A Seminar of the NOMAD Laboratory
Atomistic structure search for organic/inorganic heterostructures is made complex by the many degrees of freedom and the need for accurate but costly density-functional theory (DFT) simulations. To accelerate and simplify structure determination in such heterogeneous functional materials, we developed the Bayesian Optimization Structure Search (BOSS) approach [1]. BOSS builds N-dimensional surrogate models for the energy or property landscapes to infer global optima. The models are iteratively refined by sequentially sampling DFT data points with high information content. The uncertainty-led exploration/exploitation sampling strategy delivers global minima with modest sampling, but also ensures visits to less favorable regions of phase space to gather information on rare events and energy barriers.
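The surrogate-model loop described above can be illustrated with a generic Bayesian-optimization sketch on a toy 1-D landscape; this is not BOSS itself, and the kernel length scale, acquisition rule (lower confidence bound), and toy objective standing in for a DFT evaluation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    """Toy 1-D 'energy landscape' standing in for a costly DFT evaluation."""
    return np.sin(3.0 * x) + 0.5 * x**2

def rbf_kernel(a, b, length=0.5, var=1.0):
    """Squared-exponential covariance between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    """Posterior mean and standard deviation of a Gaussian-process surrogate."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    var = rbf_kernel(x_query, x_query).diagonal() - np.sum(
        Ks * np.linalg.solve(K, Ks), axis=0
    )
    return mean, np.sqrt(np.clip(var, 0.0, None))

# Start from two random evaluations, then acquire points sequentially with a
# lower-confidence-bound rule: favor low predicted energy OR high uncertainty.
grid = np.linspace(-2.0, 2.0, 400)
x_train = rng.uniform(-2.0, 2.0, size=2)
y_train = objective(x_train)
for _ in range(15):
    mean, std = gp_posterior(x_train, y_train, grid)
    x_next = grid[np.argmin(mean - 2.0 * std)]  # exploration/exploitation trade-off
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, objective(x_next))

x_best = x_train[np.argmin(y_train)]  # best candidate for the global minimum
```

The uncertainty term in the acquisition rule is what drives the visits to less favorable regions mentioned in the abstract: points far from existing samples retain high predicted standard deviation and are therefore occasionally selected even when their predicted energy is poor.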
Heat and charge transport play a key role in materials science and hence in many technological applications central to establishing a sustainable energy economy and ecology. Examples include improving the fuel efficiency of aeronautic turbines [1], developing efficient thermoelectric devices able to recover useful voltage from otherwise wasted heat [2], and designing novel battery materials to advance e-mobility [3].