Mathieu Luisier
Bio: Since 2022, Mathieu Luisier has been Full Professor of Computational Nanoelectronics at ETH Zurich, Switzerland. He graduated in electrical engineering in 2003 and received his PhD in 2007, both from ETH Zurich. During that time, he started the development of a state-of-the-art quantum transport simulator called OMEN. After a one-year post-doc at ETH, he joined the Network for Computational Nanotechnology at Purdue University, USA, in 2008 as a research assistant professor. In 2011 he returned to ETH Zurich, first as Assistant and then as Associate Professor. His current research interests focus on the modeling of nanoscale devices, such as advanced transistors based on classical semiconductors and 2-D materials, photo-detectors/emitters, and non-volatile resistive memory cells.
Title: Acceleration of atomistic NEGF: algorithms, parallelization, and machine learning
Abstract: The Non-equilibrium Green’s function (NEGF) formalism is a particularly powerful method to simulate the quantum transport properties of nanoscale devices, e.g., transistors, photo-diodes, or memory cells, either in the ballistic limit of transport or in the presence of various scattering sources such as electron-phonon, electron-photon, or even electron-electron interactions. The inclusion of all these mechanisms was first demonstrated in small systems composed of a few atoms before being scaled up to larger structures made of thousands of atoms. The accuracy of the models has also kept improving, from empirical to fully ab initio ones. NEGF is nowadays widely used in combination with density functional theory (DFT) to investigate the electronic, thermal, or optical characteristics of different types of nano-devices. This progress has been enabled by the development of dedicated numerical algorithms and by the parallelization of the workload, first on CPUs and now on GPUs.
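As a minimal sketch of what such a solver computes (the notation below is generic, not specific to OMEN or to the talk), NEGF revolves around the retarded Green's function of the device at energy E,

\[ G^{R}(E) = \left[ (E + i\eta)\,S - H - \Sigma^{R}_{B}(E) - \Sigma^{R}_{S}(E) \right]^{-1}, \]

where H and S are the Hamiltonian and overlap matrices (empirical or DFT-derived), \(\Sigma^{R}_{B}\) are the open-boundary self-energies of the contacts, and \(\Sigma^{R}_{S}\) collects the scattering self-energies (electron-phonon, electron-photon, electron-electron). Carrier densities and currents then follow from the lesser Green's function, \( G^{<}(E) = G^{R}(E)\,\Sigma^{<}(E)\,G^{A}(E) \), which must be evaluated self-consistently with the self-energies whenever scattering is included. Handling these large sparse matrices at many energy points is the computational bottleneck that the algorithms and parallelization strategies mentioned above target.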
In this presentation, I will review key achievements that have allowed the limits of DFT+NEGF solvers to be pushed beyond toy examples, and I will illustrate them with recent applications. I will also show how graph neural networks and machine learning can be leveraged to speed up ab initio device simulations.