Zernike seminar: Vincent M. le Corre "Machine learning and device modeling as an automated diagnostic tool for high-throughput research"
When: Wednesday 19-04-2023, 10:00 - 11:00
Device modeling is extensively used in solar cell research, ranging from simple models such as the
Shockley diode equation to more complex models such as Monte Carlo or drift-diffusion (DD) simulations.
These models are mostly used in three ways: first, as a predictive tool to assess a technology's ultimate
efficiency; second, as an investigation tool to understand the limitations of a given experimental
technique for measuring a certain physical quantity (mobility, defect density, ...) and to define the conditions
under which the technique is accurate; and finally, as a diagnostic tool to understand and quantify the main
losses in a given device by reproducing experimental results.
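As a minimal illustration of the simplest model mentioned above, the Shockley diode equation for an illuminated solar cell can be sketched as follows; all parameter values (saturation current J0, ideality factor n, photocurrent Jph) are hypothetical examples, not values from the talk.

```python
import math

# Physical constants
q = 1.602e-19   # elementary charge [C]
k = 1.381e-23   # Boltzmann constant [J/K]
T = 300.0       # temperature [K]

def shockley_current(V, J0=1e-9, n=1.5, Jph=200.0):
    """Current density J(V) [A/m^2] of an illuminated diode:
    photocurrent minus the exponential dark current."""
    return Jph - J0 * (math.exp(q * V / (n * k * T)) - 1.0)

# At short circuit (V = 0) the diode term vanishes, so J equals Jph.
print(shockley_current(0.0))  # 200.0
```

Even this one-equation model already has three fitting parameters; the DD models discussed next have an order of magnitude more.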
One of the main criticisms of using DD modeling as a means to quantify material properties is the
large number of fitting parameters (10 to 40) that need to be estimated. Many would invoke the famous von Neumann
argument, "With four parameters I can fit an elephant, and with five I can make him wiggle his trunk,"
and argue that with so many fitting parameters one could fit almost any model to the experimental data.
In this presentation, we will show that using DD modeling combined with machine learning (ML)
tools to fit light-intensity-dependent current-voltage characteristics indeed leads to a unique solution.
However, the uniqueness of this solution is not defined by a single set of parameters but by figures of
merit (FOMs) that are combinations (products or ratios) of the material properties. This fact, reported here for
the first time, has been overlooked in the literature; it means that previously reported values are not
necessarily accurate, since in most cases only the FOMs, and not the individual parameters, can be estimated
accurately. Therefore, we will also define the conditions under which one can accurately quantify the material
parameters (mobility, defect density, density of states, ...).
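The FOM degeneracy can be sketched with a toy example: if an observable depends only on a product of two material parameters, then different individual values sharing the same product are indistinguishable to any fit. Here the observable is a diffusion length built from a hypothetical mobility-lifetime product (not a quantity taken from the talk).

```python
def diffusion_length(mobility, lifetime, kT_over_q=0.0259):
    """L = sqrt(D * tau) with D = mu * kT/q (Einstein relation) [SI units].
    Note that L depends on mobility and lifetime only via their product."""
    return (mobility * lifetime * kT_over_q) ** 0.5

# Two different (mobility, lifetime) pairs with the same mu*tau product...
L_a = diffusion_length(mobility=1e-4, lifetime=1e-6)
L_b = diffusion_length(mobility=1e-5, lifetime=1e-5)

# ...predict the same observable, so a fit can only pin down the FOM
# (the product), not the individual parameters.
print(abs(L_a - L_b) < 1e-12)  # True
```

Breaking such a degeneracy requires an additional measurement that depends on the parameters through a different combination, which is the motivation for defining the conditions under which individual parameters become identifiable.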
Finally, we will show how DD modeling and ML can be combined to speed up the analysis of large
datasets coming from degradation measurements or high-throughput experimentation.