MOSFET modeling as a digital twin and the use of machine learning models to find leakage current
This work demonstrates the industrial potential of digital twins and shows how they can be simulated and trained with machine learning methods. Digital twin modeling does not strictly require data science or machine learning tools, but it is prudent to apply such techniques wherever they are suitable. In our case, a digital twin of a MOSFET is developed in Python, and machine learning models are applied to it to investigate the MOSFET's sub-threshold leakage current, a phenomenon present in every real, physical MOSFET. The goal of this work is to show the practical benefits of digital twins, as well as the possibility of swapping an analytical model for a machine learning model and vice versa.
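The idea of pairing an analytical device model with a machine learning surrogate can be sketched as follows. This is a minimal illustration, not the paper's implementation: the square-law on-current, the exponential sub-threshold leakage term, all parameter values, and the choice of a random-forest regressor are assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Illustrative analytical MOSFET parameters (hypothetical values,
# not taken from this work).
VTH = 0.7            # threshold voltage (V)
K = 2e-4             # transconductance parameter (A/V^2)
I0 = 1e-9            # sub-threshold leakage scale current (A)
N_VT = 1.5 * 0.026   # slope factor times thermal voltage (V)

def drain_current(vgs):
    """Analytical 'digital twin': square-law current above threshold
    plus an exponential sub-threshold leakage current below it."""
    on = np.where(vgs > VTH, K * (vgs - VTH) ** 2, 0.0)
    leak = I0 * np.exp(np.minimum(vgs - VTH, 0.0) / N_VT)
    return on + leak

# Generate synthetic (V_GS, I_D) samples from the analytical twin.
rng = np.random.default_rng(0)
vgs = rng.uniform(0.0, 1.5, size=2000).reshape(-1, 1)
i_d = drain_current(vgs.ravel())

# Train an ML surrogate that could replace the analytical model.
# Fitting on a log scale keeps the tiny leakage currents visible.
surrogate = RandomForestRegressor(n_estimators=50, random_state=0)
surrogate.fit(vgs, np.log10(i_d))

# In the sub-threshold region the surrogate reproduces the leakage
# behaviour of the analytical model.
pred = 10 ** surrogate.predict([[0.3]])[0]
```

Because both the analytical function and the trained surrogate map the same inputs to the same outputs, either one can stand in for the other inside the twin, which is the interchangeability the abstract refers to.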