Computers with processors that mimic the human brain's abilities of cognition, perception, and action are closer than ever: IBM on Wednesday unveiled the first generation of chips designed to power them.
The announcement comes nearly three years after IBM and several university partners were awarded a grant by the Defense Advanced Research Projects Agency (DARPA) to re-create the brain's perception, cognition, sensation, interaction, and action abilities, while also simulating its efficient size and low power consumption.
The grant was part of Phase 2 of DARPA's Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project, the goal of which, IBM said, is "to create a system that not only analyzes complex information from multiple sensory modalities at once but also dynamically rewires itself as it interacts with its environment--all while rivaling the brain's compact size and low-power usage."
According to IBM Research project leader Dharmendra Modha, the first tangible results of the grant, and of a great deal of work by researchers at six IBM labs and five universities, are finally ready to be shown to the world.
"What I hold in my hand as I speak," Modha told CNET by phone Wednesday, "is our first cognitive computing core that combines computing in the form of neurons, memory in the form of synapses, and communications in the form of axons...[and] in working silicon, and not PowerPoint."
The development of the new chips comes two years after Modha's team finished work on an algorithm called BlueMatter that spelled out the connections between all the human brain's cortical and sub-cortical locations. That mapping is a critical step, Modha has said, for a true understanding of how the brain communicates and processes information.
While it's too early to say exactly what kinds of applications the new chips will power, Modha suggested that they will likely tackle some of the thorniest problems in computing. Among those he foresees: programs that could analyze financial markets with extreme precision and attentiveness; programs that could monitor global water supplies, tracking and reporting data such as wave height, ocean tides, and water temperature, and even issuing tsunami warnings; and programs that could allow a supermarket worker to instantly sense when produce has gone bad.
Others see even more potential applications. The chips' "infinite patience" could make them ideal for earthquake detection, acting as our eyes and ears on a seismic fault line in a way that would bore human observers, said analyst Rick Doherty, co-founder and director of the Envisioneering Group. Similarly, Doherty suggested, the chips could be used for a wide range of military, public health, or public safety purposes.
All of this is possible, Modha said, because the chips can enable biological "senses" such as sight, sound, smell, and touch, and "drive multiple motor modes while consuming less than 20 watts [of power] and occupying less volume than a 2-liter bottle of soda, and weighing less than three pounds."
While Modha wouldn't say when the first commercially available applications based on the new chips will arrive, Doherty said he thinks the first such programs could be ready by 2015 or 2016. And Doherty, who was briefed on the cognitive computing news, called IBM's latest work "exciting" and significant because the chips are designed to learn and even give feedback on what they've learned.
Beyond von Neumann
To Modha, the new chips are nothing short of an entirely new computing paradigm, perhaps the first in decades, one that could far surpass the von Neumann architecture on which today's computers are based.
IBM's new chips contain no biological elements, but they do have digital silicon circuits "inspired by neurobiology to make up what is referred to as a 'neurosynaptic core' with integrated memory (replicated synapses), computation (replicated neurons), and communication (replicated axons)," the company said in a release.
At the moment, Modha's team has come up with two prototype chip designs. Both contain 256 neurons; one has 262,144 programmable synapses, while the other has 65,536 learning synapses, IBM said. For now, the company has used the chips to demonstrate basic applications such as pattern recognition, machine vision, classification, navigation, and associative memory.
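To give a flavor of what "associative memory" means in this context, the sketch below shows the classic Hopfield-style version of the idea in plain Python: patterns are imprinted on a matrix of synaptic weights, and a corrupted cue is pulled back toward the nearest stored pattern. This is a textbook analogue chosen for illustration, not IBM's implementation; the patterns, weights, and update rule here are all assumptions of the sketch.

```python
import numpy as np

# Two stored binary patterns (+1/-1), e.g. tiny 8-"pixel" images.
# (Illustrative data only -- not anything from IBM's demos.)
patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]])

# Hebbian learning: each pattern is imprinted on the synaptic weights.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)  # no self-connections

# Build a corrupted cue by flipping two "pixels" of the first pattern.
cue = patterns[0].copy()
cue[[1, 5]] *= -1

# Recall: repeatedly let every neuron vote using its synaptic inputs.
state = cue.copy()
for _ in range(5):
    state = np.sign(W @ state).astype(int)

print("recovered first pattern:", np.array_equal(state, patterns[0]))
```

The point of the demo is that the "memory" is not stored at an address; it is distributed across the synaptic weights and recovered by the network's own dynamics.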
And the company expects that the chips will move away from von Neumann computing to "a potentially more power-efficient architecture that has no set programming, integrates memory with processor, and mimics the brain's event-driven, distributed and parallel processing."
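That description suggests what such a core might look like in software. The following is a minimal sketch, assuming a simple integrate-and-fire neuron model and a synapse crossbar; none of these names or parameters come from IBM, and the real chips implement this in silicon rather than code.

```python
import numpy as np

class NeurosynapticCore:
    """Toy event-driven core: integrate-and-fire neurons fed by a
    synapse crossbar. An illustrative sketch, not IBM's design."""

    def __init__(self, n_axons=256, n_neurons=256,
                 threshold=1.0, leak=0.02, seed=0):
        rng = np.random.default_rng(seed)
        # Memory (synapses) lives beside computation (neurons):
        # weights[i, j] is the synapse from input axon i to neuron j.
        self.weights = rng.uniform(-0.1, 0.2, size=(n_axons, n_neurons))
        self.potential = np.zeros(n_neurons)  # membrane potentials
        self.threshold = threshold
        self.leak = leak

    def step(self, spikes_in):
        """Advance one tick. spikes_in is a boolean vector over the axons;
        synaptic weights are read only for axons that actually fired."""
        active = np.flatnonzero(spikes_in)
        if active.size:  # no input spikes -> no synapse reads
            self.potential += self.weights[active].sum(axis=0)
        self.potential = np.maximum(self.potential - self.leak, 0.0)
        fired = self.potential >= self.threshold
        self.potential[fired] = 0.0  # neurons that spiked reset
        return fired  # output spikes, routed onward as "axon" events

core = NeurosynapticCore()
spikes = np.zeros(256, dtype=bool)
spikes[::20] = True  # a sparse burst on 13 of the 256 axons
for tick in range(3):
    print(f"tick {tick}: {int(core.step(spikes).sum())} neurons fired")
```

Two properties of the sketch mirror the argument above: the synaptic "memory" sits right next to the neurons that use it, and the expensive step, reading out synaptic weights, happens only for axons that actually fired rather than on every tick of a global clock.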
Eventually, Modha said, IBM hopes to "weave the [building] blocks together into a scalable network and progressively scale it to a mammalian scale system with 10 billion neurons, 100 trillion synapses, [all] while consuming 1 kilowatt of power [and] fitting in a shoebox. Think of a cognitive supercomputer in your pocket."
Not replacing today's computers
Although the chips may some day create a new computing paradigm, Modha said he doesn't see them as replacing existing computers. Rather, he said, it's about "extending and complementing today's computer stack."
Perhaps more to the point, Modha's team has set out to come up with a new computing architecture that isn't weighed down by the imminent end of Moore's Law.
Today's computers, he argued, are hobbled by three fundamental constraints: first, they demand progressively increasing clock rates; second, those clock rates require smaller and smaller devices that are quickly approaching hard physical limits; and third, today's computers are essentially programmed systems executing linear sets of instructions occasionally interspersed with if/then/else statements. By contrast, IBM's cognitive computing chips could in theory put off those physical limits for some additional time, enabling a wide range of previously impossible applications, Modha said.
"Everyone else is playing within the [Moore's Law] system," he argued. "We're changing the game."
(Daniel Terdiman is a staff writer at CNET News covering games, Net culture, and everything in between.)