One hardware company investigating the possibilities of neural networking is Micron. The company has just released a prototype of a DDR memory module with a built-in processor, called Automata.
While not a replacement for standard CPUs, a set of Automata modules could watch a live stream of incoming data for anomalies or patterns of interest. In addition to spotting such spatial patterns, the modules can also watch for changes over time, said Paul Dlugosch, director of Automata processor development in the architecture development group of Micron's DRAM division.
"We were in some ways biologically inspired, but we made no attempt to achieve a high fidelity model of a neuron. We were focused on a practical implementation in a semiconductor device, and that dictated many of our design decisions," Dlugosch said.
Nonetheless, because they can be run in parallel, multiple Automata modules, each serving as a node, could be run together in a cluster for doing neural network-like computations. The output of one module can be piped into another module, providing the multiple layers of nodes needed for neural networking. The Automata is programmed through a compiler Micron developed, which accepts either an extension of the regular expression language or Micron's own Automata Network Markup Language (ANML).
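Micron has not published a simple public API for Automata, but the regular-expression route can be pictured with ordinary software. The sketch below is plain Python and purely illustrative (the pattern and the event stream are invented): it shows the kind of pattern-per-stream matching that an Automata module would evaluate in hardware, where many such patterns run against the same stream in parallel.

```python
import re

# Hypothetical anomaly signature: three or more repeated failures
# followed by a success, anywhere in an event stream.
ANOMALY = re.compile(r"(?:FAIL){3,}OK")

def scan_stream(events):
    """Return True if the concatenated event stream matches the signature.

    In software the regex engine walks the stream one state at a time;
    an Automata module would evaluate many such patterns simultaneously
    against the same incoming data.
    """
    return ANOMALY.search("".join(events)) is not None

print(scan_stream(["FAIL", "FAIL", "FAIL", "OK"]))  # three failures, then success: matches
print(scan_stream(["FAIL", "OK"]))                  # too few failures: no match
```

The point of the hardware version is that adding more patterns does not slow the scan down, which is what makes the approach attractive for watching live data feeds.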
Another company investigating this area is IBM. In 2013, IBM announced it had developed a programming model for some cognitive processors it built as part of the U.S. Defense Advanced Research Projects Agency (DARPA) SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) program.
IBM's programming model for these processors is based on reusable and stackable building blocks, called corelets. Each corelet is in fact a tiny neural network itself and can be combined with other corelets to build functionality. "One can compose complex algorithms and applications by combining boxes hierarchically," said Dharmendra Modha, principal investigator of the SyNAPSE project at IBM Research.
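IBM's actual corelet language is proprietary, but the composition idea can be sketched in plain Python (all class and function names below are invented for illustration): each block hides its internal wiring and exposes only its input and output ports, and blocks stack into larger blocks that present the same interface.

```python
class Corelet:
    """Toy stand-in for a corelet: a black box exposing only its ports."""
    def __init__(self, name, fn):
        self.name = name
        self._fn = fn  # internal wiring, hidden from the outside

    def __call__(self, signal):
        return self._fn(signal)

def compose(name, *parts):
    """Stack corelets so the output wires of one feed the inputs of the next."""
    def chained(signal):
        for part in parts:
            signal = part(signal)
        return signal
    return Corelet(name, chained)

# Two primitive blocks combined hierarchically into a larger one,
# which is itself a Corelet and can be composed further.
threshold = Corelet("threshold", lambda xs: [1 if x > 0.5 else 0 for x in xs])
counter = Corelet("counter", lambda xs: sum(xs))
detector = compose("spike-detector", threshold, counter)

print(detector([0.9, 0.2, 0.7]))  # -> 2
```

Because the composite is the same type as its parts, composition can repeat at every level, which is the "hierarchical" property Modha describes.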
"A corelet equals a core. You expose the 256 wires emerging out of the neurons, and expose 256 axioms going into the core but inside of the code is not exposed. From the outside perspective, you only see these wires," Modha said.
In early tests, IBM taught one chip how to play the primitive computer game Pong, to recognize digits, to do some olfactory processing, and to navigate a robot through a simple environment.
While it is doubtful that neural networks will ever replace standard CPUs, they may well end up tackling certain types of jobs that are difficult for CPUs alone to handle.
"Instead of bringing sensory data to computation, we are bringing computation to sensors," Modha said. "This is not trying to replace computers, but it is a complementary paradigm to further enhance civilization's capability for automation."