It is nearly impossible to escape discussions on generative AI and the large language models (LLMs) that support it.
Behind these LLMs is a massive amount of computing power that resides in cavernous data centers that are popping up all over the world.
The computers in these AI data centers are powered by GPUs (graphics processing units), chips originally built for high-end gaming that later found new life training the complex models behind AI.
These chips are silicon-based, as are nearly all hardware semiconductors used in computers today.
However, there is a limit to how far silicon will take computing.
As we collectively move from generative AI toward a true general AI that can think and act independently, the underlying hardware that supports these systems will need a fundamental upgrade.
Erica Carlson from Purdue University explains, “The brain-inspired codes of the AI revolution are largely being run on conventional silicon computer architectures, which were not designed for it.”
Carlson, with a team from Purdue, the University of California San Diego, and ESPCI Paris, recently published a report that paves the way for a computer designed to process information like a human brain.
The human brain is made up of an estimated 100 billion neurons, each capable of storing bits of memory much as silicon chips do.
However, it is the transfer of signals from neuron to neuron across synapses, the process that allows humans to think, that common silicon cannot handle.
The team discovered a material that can handle both neuron and synapse behaviors and is capable of creating what they call “neuromorphic hardware.”
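The dual role described above, a cell that both stores state like a neuron and passes weighted signals like a synapse, is commonly illustrated with the leaky integrate-and-fire model. The sketch below is a generic textbook illustration of that idea, not the Purdue team's method; the parameter values (`weight`, `leak`, `threshold`) are arbitrary assumptions chosen for the demo.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch. The neuron
# integrates weighted synaptic input into a membrane "voltage" (its
# stored state) and fires a spike once a threshold is crossed.

def simulate_lif(inputs, weight=0.5, leak=0.9, threshold=1.0):
    """Return the spike train produced by a single LIF neuron.

    inputs:    presynaptic spikes (0 or 1) arriving each time step.
    weight:    synaptic strength scaling each incoming spike.
    leak:      fraction of membrane voltage retained each step.
    threshold: voltage at which the neuron fires and resets.
    """
    voltage = 0.0
    spikes = []
    for spike_in in inputs:
        voltage = leak * voltage + weight * spike_in  # integrate with leak
        if voltage >= threshold:   # threshold crossed: fire and reset
            spikes.append(1)
            voltage = 0.0
        else:
            spikes.append(0)
    return spikes

# Steady input drives the neuron over threshold every third step.
print(simulate_lif([1, 1, 1, 1, 1, 1]))  # → [0, 0, 1, 0, 0, 1]
```

Conventional silicon runs such a loop as software on an architecture that separates memory and processing; neuromorphic hardware aims to perform the integrate-and-fire behavior directly in the material itself.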
This material, vanadium dioxide, could form the basis of a supercomputer that thinks like a human in the not-too-distant future.