In a potentially transformative development for artificial intelligence hardware, researchers at the National University of Singapore (NUS) have demonstrated that a standard silicon transistor can be made to behave like both a neuron and a synapse – the fundamental computing and memory elements of the human brain. This breakthrough, achieved with existing, widely available semiconductor technology, opens a promising new avenue for building highly efficient and scalable neuromorphic computing systems, potentially overcoming major hurdles in AI development.
The quest for neuromorphic computing – building computer systems that mimic the structure and function of the biological brain – has been ongoing for decades. The brain’s remarkable efficiency stems from its parallel processing architecture, where billions of neurons communicate via trillions of synapses, processing information and learning with minimal energy consumption. Traditional computer architectures, based on the von Neumann model, separate processing and memory units, leading to bottlenecks and high energy use, particularly for complex AI tasks. Neuromorphic approaches aim to integrate processing and memory more closely, mirroring the brain’s design for greater efficiency.
The NUS team, led by Associate Professor Mario Lanza from the Department of Materials Science and Engineering, achieved the breakthrough by exploiting a known physical phenomenon in silicon transistors that is usually treated as a reliability issue or failure mechanism. By carefully controlling electrical stress, they could induce specific changes in the transistor's insulating layer (typically silicon dioxide). This controlled degradation allows a single transistor to exhibit hysteresis – a memory effect in which the output depends on the device's past state – so that it functions like a memristor, a component whose resistance changes with the history of current passed through it. This memristive behaviour lets the transistor mimic the adaptive resistance changes seen in biological synapses, which strengthen or weaken with neural activity and form the basis of learning and memory.
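To make the idea of a history-dependent, memristor-like synapse concrete, the short Python sketch below models a conductance that is nudged upwards by positive voltage pulses and downwards by negative ones. It is a generic phenomenological illustration of hysteresis-based synaptic weighting, not the NUS team's device physics; all constants (threshold, conductance limits, update rate) are invented for illustration.

```python
# Minimal sketch (not the NUS device model): a phenomenological memristive
# synapse whose conductance depends on the history of applied voltage pulses,
# illustrating the hysteresis / synaptic-weight idea described above.

def update_conductance(g, v, g_min=1e-6, g_max=1e-3, rate=0.1, v_th=0.5):
    """Nudge conductance g (siemens) up or down when |v| exceeds a threshold.

    Pulses above +v_th potentiate (strengthen) the synapse; pulses below
    -v_th depress (weaken) it. Sub-threshold voltages leave the state
    untouched, so g encodes the pulse history. All values are illustrative.
    """
    if v > v_th:
        g += rate * (g_max - g)      # drift towards the high-conductance state
    elif v < -v_th:
        g -= rate * (g - g_min)      # drift towards the low-conductance state
    return min(max(g, g_min), g_max)

# Apply a train of potentiating then depressing pulses and watch the
# conductance trace out a history-dependent (hysteretic) trajectory.
g = 1e-4
for v in [1.0] * 5 + [-1.0] * 5:
    g = update_conductance(g, v)
    print(f"pulse {v:+.1f} V -> conductance {g:.3e} S")
```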
Furthermore, the researchers demonstrated that the same transistor, under different operating conditions, could exhibit the threshold-based firing behaviour characteristic of a biological neuron. Neurons transmit a signal only once incoming stimuli reach a certain activation threshold. By combining the transistor's inherent switching properties with the induced memory effects, the NUS device can replicate this essential neuronal function. The ability to embody both synaptic (memory and learning) and neuronal (processing and firing) functions within a single, standard silicon transistor is the key innovation; previous neuromorphic efforts often required specialized materials or complex multi-component circuits to achieve similar results.
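The threshold-firing side of this behaviour can likewise be illustrated with a textbook leaky integrate-and-fire model: inputs accumulate in a stored potential, and the "neuron" emits a spike only when that potential crosses a threshold, then resets. Again, this is a minimal conceptual sketch rather than the transistor circuit reported by the NUS team; the threshold, leak, and input values are placeholders.

```python
# Minimal sketch of threshold-based firing using a leaky integrate-and-fire
# neuron model (an illustration of the behaviour described in the article,
# not the NUS transistor circuit itself).

def simulate_lif(inputs, threshold=1.0, leak=0.9, reset=0.0):
    """Integrate a sequence of input currents; emit a spike (1) whenever the
    accumulated membrane potential crosses `threshold`, then reset it.
    A `leak` factor below 1 makes the stored potential decay between inputs."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = leak * potential + current
        if potential >= threshold:
            spikes.append(1)
            potential = reset   # fire and reset, like a biological neuron
        else:
            spikes.append(0)
    return spikes

# Weak inputs alone never fire; a burst of stronger inputs pushes the
# neuron over its activation threshold.
print(simulate_lif([0.2, 0.2, 0.2, 0.6, 0.6, 0.1]))  # -> [0, 0, 0, 1, 0, 0]
```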
The implications of this research are profound. Because the devices are made with standard silicon manufacturing processes (CMOS technology), these neuromorphic building blocks could be integrated into existing chip fabrication workflows with relative ease. This compatibility overcomes a major barrier faced by designs that rely on exotic materials or entirely new manufacturing techniques. It paves the way for dense, low-power AI chips that could significantly accelerate machine learning tasks directly on devices (edge AI), reducing reliance on cloud computing and enabling more sophisticated AI applications in areas such as robotics, autonomous systems, and personalized medicine.
While further research and development are needed to scale this concept into complex artificial neural networks and to demonstrate its performance on real-world AI tasks, the NUS discovery represents a significant step forward. By transforming a ubiquitous, well-understood component into a dual-function neuromorphic element, it offers a pragmatic and scalable path towards building AI hardware that is fundamentally more efficient and brain-like in its operation, potentially revolutionizing the field of artificial intelligence.
Source: SciTechDaily