Researchers at the US National Institute of Standards and Technology (Nist) have warned that a flaw exists in transistor noise models which could fundamentally affect the efficiency of future chips.
The elastic tunnelling model predicts that as transistors get smaller, electronic noise within the transistors, which can cause erratic on-off states, should increase. However, a team of Nist scientists, which has been exploring nano-scale transistor behaviour, has found that transistor noise does not increase as transistors are scaled down.
"This implies that the theory explaining the effect must be wrong," said Jason Campbell, who led the research, in a statement. "The model was a good working theory when transistors were large, but our observations clearly indicate that it's incorrect at the smaller nanoscale regimes where industry is headed."
The team also discovered that as less energy is pushed through nano-scale transistors, transistor noise increases. This could spell trouble for low-power chips, which are being explored for use in devices including laptops and phones.
"This is a real bottleneck in our development of transistors for low-power applications," said Campbell. "We have to understand the problem before we can fix it — and troublingly, we don't know what's actually happening."
Campbell credits fellow Nist researcher KP Cheung with first identifying a possible problem with the elastic tunnelling model. Researchers from the University of Maryland, College Park and Rutgers University also contributed to the team's work.
The researchers presented some of their findings at an IEEE event last week. The team's initial results were published in February in a paper entitled "The Origin of Random Telegraph Noise in Highly Scaled nMOSFETs".
This article was originally posted on ZDNet UK.