'Artificial synapse' points the way toward portable AI devices
Tech titans like Intel and IBM have already begun developing AI chips that mimic the way the human brain works -- it is, after all, the most powerful computer there is. The field of "neuromorphic computing" is still in its very early stages, though, and one of its pioneers' greatest challenges is replicating neural synapses: the small junctions across which information passes as it moves from one neuron to the next. That's why a team of MIT engineers has set out to develop an artificial synapse that works like the real deal, and it has successfully come up with a design that can "precisely control the strength of an electric current flowing across it, similar to the way ions flow between neurons."
Existing designs typically use amorphous (non-crystalline) materials and have difficulty controlling that flow of ions, which means they're not very reliable at passing signals from one artificial neuron to the next. To solve that issue, the team built its synapses from a silicon wafer and silicon germanium, a material commonly used in transistors. Team leader Jeehwan Kim said that, together, the "two perfectly mismatched materials can form a funnel-like dislocation, creating a single path through which ions can flow."
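To see why that precise control matters, here is a minimal sketch (illustrative only, not the MIT team's actual model) of how a neuromorphic crossbar uses synaptic conductances as weights: each synapse stores a conductance G, input voltages V are applied, and the output currents follow I = G·V by Ohm's and Kirchhoff's laws. The noise levels below are made-up assumptions standing in for variable ion flow versus a single, well-defined ion path.

```python
import numpy as np

# Each artificial synapse stores a weight as a conductance G; applying
# input voltages V yields output currents I = G @ V. Precision in
# programming G determines how faithfully the array computes.
rng = np.random.default_rng(0)

G_ideal = rng.uniform(0.0, 1.0, size=(4, 8))   # target synaptic conductances
V = rng.uniform(0.0, 0.5, size=8)              # input voltages, one per input neuron

# Amorphous-material synapses: ion flow varies device to device, so the
# programmed conductance lands far from the target (assumed noise level).
I_amorphous = (G_ideal + rng.normal(0.0, 0.25, size=G_ideal.shape)) @ V

# Single-dislocation synapses: one funnel-like ion path, so programming
# is far more uniform (assumed noise level).
I_sige = (G_ideal + rng.normal(0.0, 0.02, size=G_ideal.shape)) @ V

I_exact = G_ideal @ V
print("amorphous error:", np.max(np.abs(I_amorphous - I_exact)))
print("SiGe error:     ", np.max(np.abs(I_sige - I_exact)))
```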
They tested their creation by using it on a chip for their neural network, which they fed tens of thousands of handwriting samples. The result? It recognized handwriting 95 percent of the time -- not bad, considering that current algorithms running on typical hardware reach about 97 percent accuracy.
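For context, the kind of software baseline being compared against looks something like the sketch below. The article doesn't specify the team's network or dataset (tens of thousands of handwriting samples suggests an MNIST-style benchmark), so this uses scikit-learn's bundled digits dataset and a small multilayer perceptron purely as an assumed stand-in.

```python
# Minimal handwriting-recognition baseline on conventional hardware
# (illustrative; not the MIT team's setup).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)  # 8x8 handwritten digit images
X_train, X_test, y_train, y_test = train_test_split(
    X / 16.0, y, test_size=0.25, random_state=0)

# A small multilayer perceptron stands in for the "typical hardware"
# algorithms the article cites at ~97 percent accuracy.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2%}")
```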
Neuromorphic computing will be able to do a lot more than recognize handwritten samples, though. In the future, it could lead to portable AI devices far more powerful than your already impressive modern smartphone. "Ultimately we want a chip as big as a fingernail to replace one big supercomputer," Kim said. "This opens a stepping stone to produce real artificial hardware."
[From left: MIT researchers Scott H. Tan, Jeehwan Kim, and Shinhyun Choi. Image credit: Kuan Qiao]
Source: MIT
Via: Engadget