A Predictive Approach to Reduce Intrinsic Gate Delay in Junctionless Double Gate Strained Transistor using DoE-Based Genetic Algorithm
DOI:
https://doi.org/10.37934/araset.59.1.3851

Keywords:
Genetic algorithm, Intrinsic gate delay, JLDGST, Multiple regression analysis

Abstract
The aim of reducing the size of a transistor is not only to pack more transistors into a denser area and increase switching speed, but also to reduce the intrinsic gate delay. This paper introduces a predictive approach to reduce intrinsic gate delay in the Junctionless Double Gate Strained Transistor (JLDGST). The study involves 2D simulation and a hybrid Multiple Regression Analysis – Genetic Algorithm (MRA-GA) model for device simulation and optimization, respectively. Initially, 18 sets of experiments are employed to obtain multiple magnitudes of intrinsic gate delay. Based on the retrieved results, an objective function relating multiple input parameters (Ge mole fraction, high-k material thickness, source/drain doping concentration and metal work function) to the output response (intrinsic gate delay) is derived using MRA. The derived objective function is then supplied to the GA, which searches for the minimum of the fitness function. The final result shows that the proposed hybrid MRA-GA model reduces the intrinsic gate delay of the device by approximately 70%. The optimum magnitudes of Ge mole fraction, Thigh-k, Nsd and WF for the lowest possible intrinsic gate delay of the JLDGST are predicted to be 0.3 (30%), 3 nm, 2.96×10^13 cm^-3 and 4.6 eV, respectively.
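The MRA-GA pipeline described above can be sketched in a few lines: a second-order polynomial (the kind of surface MRA fits) serves as the objective, and a simple genetic algorithm searches its bounded input space for the minimum. The coefficients, bounds, and GA hyperparameters below are illustrative placeholders, not the paper's fitted model.

```python
import random

# Hypothetical MRA-derived objective: intrinsic gate delay as a function of
# x = [Ge mole fraction, T_high-k (nm), log10(N_sd), work function (eV)].
# Coefficients are invented for illustration; the real model comes from
# regression over the 18 simulation experiments.
def gate_delay(x):
    ge, t_hk, log_nsd, wf = x
    return (0.5
            + 2.0 * (ge - 0.3) ** 2
            + 0.1 * (t_hk - 3.0) ** 2
            + 0.05 * (log_nsd - 13.47) ** 2
            + 0.8 * (wf - 4.6) ** 2)

# Assumed search bounds for each input parameter.
BOUNDS = [(0.0, 0.5), (1.0, 5.0), (12.0, 15.0), (4.2, 5.0)]

def ga_minimize(f, bounds, pop_size=50, gens=100, mut_rate=0.1, seed=1):
    """Minimal real-coded GA: truncation selection, arithmetic crossover,
    bounded Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)                   # rank by fitness (lower delay is better)
        survivors = pop[: pop_size // 2]  # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]  # arithmetic crossover
            for i, (lo, hi) in enumerate(bounds):            # bounded mutation
                if rng.random() < mut_rate:
                    child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.05 * (hi - lo))))
            children.append(child)
        pop = survivors + children
    return min(pop, key=f)

best = ga_minimize(gate_delay, BOUNDS)
```

On this toy objective the GA converges near the bowl's minimum, mirroring how the paper's GA locates the parameter combination giving the lowest predicted gate delay.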