# Speakers

The speakers are principally **early-career researchers** who are experts in the **practice** as well as in the theory of building ML-IPs, and are especially well equipped to share their experience with those interested in getting started in the field.

The schedule is structured around **three to four sessions per day**, distributed over the 5 days of the workshop. Each session includes an introduction to the topic of the speaker's expertise, leading into a hands-on tutorial and extended interactive question time. The interactive aspect of the lessons will be heavily emphasized: speakers will be asked to provide an **interactive tutorial notebook** that participants can run themselves, and will be encouraged to dedicate a large part of the lesson to working directly with participants (answering questions, asking about their research problems). This format is intended to help participants progress from first contact with the research topic to the application of these concepts in their own research.

**Nongnuch Artrith**

(Utrecht University, the Netherlands)

*Developing Artificial Neural Network Potentials for Materials*

Nong Artrith is a Tenure-Track Assistant Professor in the Materials Chemistry and Catalysis Group at the Debye Institute for Nanomaterials Science. Prior to joining Utrecht University, Nong was a Research Scientist at Columbia University, USA, and a PI in the Columbia Center for Computational Electrochemistry. Nong obtained her PhD in Theoretical Chemistry from Ruhr University Bochum, Germany, for the development of machine-learning (ML) models for materials chemistry. She was awarded a Schlumberger Foundation fellowship for postdoctoral research at MIT and subsequently joined UC Berkeley as an associate specialist. In 2019, Nong was named a Scialog Fellow for Advanced Energy Storage. She is the main developer of the open-source ML package ænet (http://ann.atomistic.net) for atomistic simulations. Her research interests focus on the development and application of first principles and ML methods for the computational discovery of energy materials and for the interpretation of experimental observations.

**Rose K. Cersonsky**

(EPFL, CH)

*Dimensionality Reduction for Machine Learning Representations*

Rose K. Cersonsky received her Bachelor of Science degree in Materials Science and Engineering from the University of Connecticut in 2014 and her PhD in Macromolecular Science and Engineering from the University of Michigan in 2019 under Professor Sharon C. Glotzer. She is currently working as a postdoctoral researcher in the Laboratory of Computational Science and Modeling (COSMO) at École Polytechnique Fédérale de Lausanne (EPFL). There, Rose has focused on methods development for dimensionality reduction in chemical studies and on providing new algorithms and tools for understanding the correlations between crystal structure and materials properties.

**Bingqing Cheng**

(University of Cambridge, UK)

*Predicting materials properties: from thermodynamics to transport*

Bingqing Cheng is a Departmental Early Career Fellow at the Computer Laboratory, University of Cambridge, and a Junior Research Fellow at Trinity College. She received her Ph.D. from the École polytechnique fédérale de Lausanne (EPFL) in 2019. Her work focuses on theoretical predictions of material properties.

**Geneviève Dusson**

(Laboratoire de Mathématiques de Besançon, FR)

*Active learning approaches for machine learning*

Geneviève Dusson is currently working at the University of Bourgogne Franche-Comté in the Mathematics department. Her research focuses on mathematical and numerical problems arising in the simulation of molecules and material systems. She is particularly interested in error certification aspects, relying notably on numerical analysis and a posteriori error estimates, e.g. for quantum chemistry problems. She is also working on developing accurate and fast interatomic potentials with the help of machine-learning techniques.

**Edgar Engel**

(University of Cambridge, UK)

*Simulations for bio-molecular systems made easy*

Edgar Engel is a junior research fellow at Trinity College, Cambridge, working in the Theory of Condensed Matter Group, University of Cambridge. His research focusses on leveraging machine-learning techniques to enable and accelerate the prediction and design of complex materials and their properties from first-principles.

**Mariia Karabin**

(Oak Ridge National Laboratory, USA)

*An entropy-maximization approach to automated training set generation for interatomic potentials*

Mariia Karabin is a postdoctoral research associate at the National Center for Computational Sciences, Oak Ridge National Laboratory, with a background in global optimization, sampling methods, and first-principles calculations. She is currently working on adding new capabilities to the locally self-consistent multiple scattering method.

**Emine Küçükbenli**

(Harvard University, USA)

*Atomistic modelling with Neural Network Potentials with PANNA*

Emine Küçükbenli is a Clinical Assistant Professor at Boston University Questrom School of Business and an associate at the Harvard School of Engineering and Applied Sciences. They are recognized for their research in computational electronic structure, in particular algorithmic improvements to calculations of magnetic response properties of materials, machine learning for crystal structure prediction, and neural networks for atomistic simulations. Dr Küçükbenli is the lead developer of PANNA, a machine-learning package for materials science, and a long-term contributor to the Quantum ESPRESSO suite of codes for materials modeling.

**Nataliya Lopanitsyna**

(EPFL, CH)

*Machine learning potential recipes on the example of metal alloys*

Nataliya Lopanitsyna is a doctoral researcher in materials science in Michele Ceriotti’s group at EPFL. She works on atomic-scale modelling of metal alloys, combining statistical mechanics methods and machine learning to predict structure-property relationships. She recently presented a machine-learning model that captures electronic excitation effects, making it possible to compute properties at high temperature accurately. She is now focussing on the modelling of multicomponent alloys and on problems associated with the chemical complexity of materials.

**Félix Musil**

(Freie Universität Berlin, DE)

*Building machine learned force fields with kernel methods: a hands-on tutorial*

Félix Musil studied physics at the EPFL and received his Ph.D. from the EPFL in 2021. During his graduate work, he developed and applied methods to investigate structure-property relationships in materials using atomistic modeling and machine learning techniques.

**Jigyasa Nigam**

(EPFL, CH)

*Incorporating physical constraints and symmetry in atomic-scale machine learning*

Jigyasa Nigam is a PhD student in the COSMO lab at EPFL. She began her ventures in atomistic modeling as an NCCR MARVEL Inspire Potentials Fellow in 2019, prior to which she also gained scientific experience and exposure at Caltech, the NASA Jet Propulsion Lab, and the Australian National University. Her current research focuses on understanding and developing improvable physics-integrated structural representations for atomistic machine learning within a mathematically sound framework that captures the symmetries of learning targets.

**Ivan Novikov**

(Skoltech, RUS)

*The MLIP package: Moment tensor potentials with active learning*

Ivan Novikov received his PhD in applied mathematics, in the area of multiscale molecular modeling, in March 2016. Since May 2016, Ivan has worked in Alexander Shapeev's group at Skoltech on developing machine-learning interatomic potentials (namely, Moment Tensor Potentials, MTPs) and algorithms for their training. These are implemented in the MLIP-2 package, whose number of registered users has grown to 250 in less than a year. Ivan has published several papers on the application of machine-learning potentials to studying chemical reactions in the gas phase and is currently working on extending the machine-learning formalism to magnetism.

**Berk Onat**

(University of Warwick, UK)

*Dimensionality of Atomic Environment Representations and Implanted Neural Networks for Machine Learning Interatomic Potentials.*

Berk Onat works in the field of materials modelling, and his research spans ab initio calculations, molecular dynamics simulations, and machine-learning approaches. During his research fellowship at Harvard University, he was involved in the development of Implanted Neural Networks for machine-learning interatomic potentials. As a research associate at the University of Warwick, he has also analysed the dimensionality and sensitivity of atomic environment descriptors.

**Christoph Schran**

(University College London, UK)

*Machine learning potentials for complex aqueous systems made simple*

Christoph Schran received his PhD in chemistry from the Ruhr-Universität Bochum, Germany in 2019. During this time he worked at the École normale supérieure, Paris and was a visiting graduate student in the Markland group at Stanford University. For his postdoctoral work, he moved to the University of Cambridge working with Angelos Michaelides as a fellow of the ’Alexander von Humboldt’ foundation. His research interests include the understanding of hydrogen bonded systems and their modelling by machine learning techniques.

**Ganesh Sivaraman**

(Argonne National Laboratory, USA)

*From Atomistic to Coarse-Grained: Active Learning Strategies for Gaussian Approximation Potential and Deep Kernel Learning*

Ganesh Sivaraman is an Assistant Scientist in the Data Science and Learning division at Argonne National Laboratory (ANL), USA. He was formerly a postdoctoral appointee at the Argonne Leadership Computing Facility (2017-2020), where he worked with leadership-class supercomputers to deliver scientific solutions for leading experimentalists, namely Prof. Jacqueline Cole (University of Cambridge) and Dr. Chris Benmore (ANL). He was awarded a PhD in engineering physics from the Institute for Computational Physics, University of Stuttgart, Germany, in 2017. His research lies at the interface of materials physics and machine learning.

(SISSA, IT)

*Green-Kubo simulations of transport properties: from ab initio to neural networks*

Davide received his bachelor's and master's degrees in physics from the University of Modena and Reggio Emilia, and is now a PhD student in Theory and Numerical Simulation of Condensed Matter at SISSA (Trieste) in the group of Prof. Stefano Baroni. He is passionate about data-driven approaches to material properties and currently investigates the transport properties of liquids via both *ab initio* simulations and neural network potentials.

**Anh Tran**

(Sandia National Laboratory, USA)

*Multi-fidelity and parallel machine-learning approaches for combining first-principles calculations and classical molecular dynamics predictions*

Anh Tran is currently an LTE Senior Member of Technical Staff in Optimization and Uncertainty Quantification at Sandia National Laboratories, Albuquerque, NM. He obtained his B.S., M.S., and Ph.D. in mechanical engineering from the Georgia Institute of Technology in 2011 and 2018, and an M.S. in applied mathematics from Georgia Southern University in 2014. His current research interests are optimization, uncertainty quantification, and machine-learning methodology and applications for multiscale computational materials science.

**Julien Tranchida**

(Sandia National Laboratory, USA)

*Multi-fidelity and parallel machine-learning approaches for combining first-principles calculations and classical molecular dynamics predictions*

Julien Tranchida is a research scientist in the Computational Multiscale Department at Sandia National Laboratories. After completing his doctoral work at the French atomic energy commission (CEA/DAM), he joined Sandia as a CEA - NNSA postdoctoral fellow in April 2017, before transitioning to a staff position in October 2019. Julien implemented and maintains the SPIN package of LAMMPS, and works on the development of interatomic potentials following machine-learning approaches.

**Yifan Li**

(Princeton University, USA)

*From Deep Potential to DeePMD-kit, to DeepModeling*