December 5, 2011

Researchers at MIT Create Brain on a Chip


Scientists at MIT and the University of Texas have created a computer chip that mimics how a biological brain learns. Whereas brain activity unfolds in parallel, typical computer operations take place in series, one thing after another, and that mismatch presents a major obstacle for researchers trying to replicate brain-like activity.

Computers have been called "thinking machines," but their internal operations have very little to do with how the original thinking machine -- the human brain -- actually works. That's changing, however, as researchers at MIT and the University of Texas Medical School have demonstrated with a new computer chip that mimics how the brain learns as it receives new information.

The chip can simulate the activity that takes place in the brain's synapses -- the connections between neurons, the cells where the brain stores information.

One of the obstacles in trying to simulate brain activity on silicon is scale. That's because brain activity takes place in parallel -- many things happening simultaneously. Computer activity takes place in series -- one thing after another.

"That means, when you have to go up to scale to the sizes of our brains, or even the brains of very simple animals, it becomes nearly impossible," one of the researchers, University of Texas Medical School Associate Professor Harel, explained.
Other members of the research team were Chi-Sang Poon, a principal research scientist in the Harvard-MIT Division of Health Sciences and Technology; Guy Rachmuth, a former postdoc in Poon's lab; and Mark Bear, the Picower Professor of Neuroscience at MIT.



Miracle of Parallel Processing 

What kind of scale are researchers confronted with? It's estimated that there are 100 billion neurons in the brain talking to each other through some 100 trillion synapses.

"The number of connections between neurons in our brain grows approximately by the square of the number of neurons in the brain," explained. "So the problem gets very quickly out of hand."

"If all those synapses need to be simulated in series, there is no way we can do anything in a finite time," he said.

"Each synapse is incredibly slow compared to anything digital, but how the brain does what it does is by having an immense number of these 'machines' working in parallel," he added.

The new chip, however, can hold millions of simulated synapses. The researchers are able to mimic a single synapse with 400 transistors, and using very large scale integration (VLSI), billions of transistors can be placed on a single chip.
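As a rough sketch of how that transistor budget translates into synapses: the 400-transistor figure is from the researchers, while the one-billion-transistor chip budget below is an assumed round number used only for illustration.

```python
# Back-of-the-envelope: how many 400-transistor synapse circuits fit on a chip.
# The 400-transistor figure is from the article; the total transistor budget
# is an assumed round number for a modern VLSI chip.

TRANSISTORS_PER_SYNAPSE = 400
CHIP_TRANSISTOR_BUDGET = 1_000_000_000   # assumed: ~1 billion transistors

synapses_per_chip = CHIP_TRANSISTOR_BUDGET // TRANSISTORS_PER_SYNAPSE
print(f"Roughly {synapses_per_chip:,} simulated synapses per chip")  # 2,500,000
```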


Prosthetic Retinas 

Another problem tackled by the new chip is simulating how neurons communicate with each other through the synapses. How something is learned is determined by changes in the strength of those connections, and those changes depend on structures called "ion channels," which govern the flow of ions between neurons.

To simulate on a chip what happens in those channels, the researchers had to make the current flow on the silicon mimic the analog behavior in the channels. Rather than spiking the current -- turning it off and on -- a gradient approach is used which emulates how ions flow in the channels between neurons.

"If you really want to mimic brain function realistically, you have to do more than just spiking," Poon told MIT tech writer Anne Traffton. "You have to capture the intracellular processes that are ion channel-based."

The researchers see a number of applications for the new chip. For example, it could be used in neural prosthetic devices, such as artificial retinas, or to model neural functions, which can now take hours or even days to simulate.

From Thinking to Learning Machines

Earlier this year, IBM researchers announced they had made a breakthrough in their efforts to make chips that work like the human brain. The chips, which are still in the test phase, are expected to be the backbone of a new breed of thinking machine called a "cognitive computer."

Cognitive computers will add to their knowledge base as humans do -- through experience, by finding correlations, creating hypotheses, remembering and emulating the brain's structural and synaptic plasticity.

The brain's structure is shaped by learning: establishing pathways, reinforcing them, and eliminating those that aren't being used, explained Roger L. Kay, president of Endpoint Technologies Associates.

"Computers have not generally been learning structures, even though artificial intelligence has spent some time trying to make that happen," he told

Artificial intelligence researchers pioneered the idea of a computer architecture wherein nodes shared information with nearby nodes, just as neurons share information with neighboring neurons through synaptic connections, he explained.
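That idea can be sketched minimally: each node computes its output from the weighted activity of its neighbors, loosely the way a neuron is driven by its synaptic inputs. All of the names and numbers below are illustrative, not taken from any specific system.

```python
import math

# Minimal sketch of "nodes sharing information with nearby nodes": each node's
# output is a weighted sum of its neighbors' activity passed through a
# squashing function, loosely analogous to synaptic input driving a neuron.

def node_update(neighbor_values, weights):
    """Weighted sum of neighbor activity, squashed into the range (0, 1)."""
    total = sum(v * w for v, w in zip(neighbor_values, weights))
    return 1.0 / (1.0 + math.exp(-total))   # sigmoid activation

neighbors = [0.2, 0.9, 0.4]      # activity of neighboring nodes (illustrative)
weights   = [0.5, 1.2, -0.7]     # connection strengths ("synaptic weights")
print(f"node output: {node_update(neighbors, weights):.3f}")
```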

"But it really didn't work until recently," he said. "So the idea that you could create a learning strucuture in a machine is something that's just beginning."
