

Qualcomm's brain chip could turn your phone into a robot butler

Mark Hachman | Oct. 14, 2013
If a cell phone can essentially see, hear, and detect movement like a person, shouldn't it start to think like a person, too? That's the basis of Qualcomm's Zeroth processor, designed to emulate millions of the billions of neurons within the human brain.

A version of the Zeroth has already been built into a robotic platform that learns by being encouraged—quite literally, "good robot"—rather than being traditionally programmed, Qualcomm executives said.
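Qualcomm hasn't published how its robotic platform learns from praise, but the general idea resembles simple reinforcement learning: rewarded actions get stronger. A minimal, hypothetical sketch (the class and action names are illustrative, not Qualcomm's) might look like this:

```python
# Hypothetical sketch of reward-driven learning ("good robot"),
# a generic reinforcement-learning idea, not Qualcomm's actual algorithm.

class RewardLearner:
    def __init__(self, actions):
        # Start with no preference among the possible actions.
        self.values = {a: 0.0 for a in actions}

    def choose(self):
        # Pick the action with the highest learned value.
        return max(self.values, key=self.values.get)

    def praise(self, action, reward=1.0, rate=0.5):
        # "Good robot": nudge the chosen action's value toward the reward.
        self.values[action] += rate * (reward - self.values[action])

robot = RewardLearner(["turn_left", "turn_right", "go_forward"])
for _ in range(5):
    robot.praise("go_forward")   # the trainer rewards moving forward
print(robot.choose())            # prints "go_forward"
```

The contrast with traditional programming is that no one ever writes a rule saying "go forward"; the preference emerges from repeated encouragement.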

For years, technologists have talked about personal assistants, pieces of code that pull in data and try to coalesce them into information that's relevant and useful. Qualcomm's Zeroth could form the hardware foundation upon which future personal assistants are built.

"Wouldn't it be swell to have a device that you could train?" said M. Anthony Lewis, the senior director and project engineer responsible for the Zeroth, in an interview. "It leads to the possibility of a customized user experience for each individual cellphone user, to be more like the phone that they want rather than the phone that they get."

In a few years, Qualcomm envisions the Zeroth sitting alongside a future Qualcomm Snapdragon, Lewis said.

Snapdragon chips power a number of high-end smartphones and tablets, including the Samsung Galaxy S4, the Galaxy Note 3, the Google/Asus Nexus 7, and the HTC One mini, among others.

Conventional microprocessors were originally designed to work serially: execute one instruction, then the next, then the next. That design drove ever-increasing clock speeds, to execute those instructions as fast as possible.

Then came other improvements: wider buses, allowing the processor to chew on more data at a time, and finally parallelism, which gave rise to the multicore chips common today. The latter allows a microprocessor to execute an instruction on one core while another core handles a separate task simultaneously.
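The shift from serial to parallel execution can be sketched in a few lines. This example (the workload is invented for illustration) farms the same CPU-bound task out across cores with Python's standard process pool, instead of running the tasks one after another:

```python
from concurrent.futures import ProcessPoolExecutor

# Illustrative sketch: independent CPU-bound tasks dispatched across
# multiple cores at once, the way a multicore chip runs separate tasks
# simultaneously rather than serially.

def count_multiples(limit, divisor):
    # A CPU-bound task: count the numbers below `limit` that divide evenly.
    return sum(1 for n in range(limit) if n % divisor == 0)

if __name__ == "__main__":
    tasks = [(1_000_000, d) for d in (3, 7, 11, 13)]
    with ProcessPoolExecutor() as pool:
        # Each task can land on its own core instead of waiting its turn.
        results = list(pool.map(count_multiples, *zip(*tasks)))
    print(results)
```

On a single-core serial design the four counts would run back to back; on a multicore chip they can overlap, which is the whole point of the parallelism the article describes.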

Cognitive computing

Massively parallel processors are seen as the future, if only because they can work on a multitude of tasks at once. That's how the human brain operates: processing the vast amount of data our eyes, ears, skin, nose, and mouth produce, building the sensory experience of a morning brunch on the patio of a mountain cabin, for example.

Instead of transistors and circuits, however, the brain passes information through networks of neurons. So-called cognitive computing is being pursued by IBM and Google, as well as by national initiatives within the United States and, separately, within the European Union. The power of a neural network usually depends on its parameters, or the connections forged between its individual components; Lewis said that Zeroth is scalable to 10 million neurons and beyond, still a tiny fraction of the tens of billions of neurons within the brain itself.
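Neuromorphic chips like Zeroth emulate units that behave roughly like biological neurons: charge accumulates from incoming signals, leaks away over time, and triggers a spike once it crosses a threshold. Here is a minimal sketch of that classic "leaky integrate-and-fire" model; it illustrates the general principle only, not Qualcomm's actual circuit design:

```python
# Illustrative leaky integrate-and-fire neuron, the kind of unit
# neuromorphic hardware emulates; not Qualcomm's actual design.

def simulate_neuron(inputs, threshold=1.0, leak=0.9):
    """Accumulate input each step, leak charge, fire a spike on threshold."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate with leak
        if potential >= threshold:
            spikes.append(1)    # the neuron fires a spike...
            potential = 0.0     # ...and resets its membrane potential
        else:
            spikes.append(0)
    return spikes

# Steady weak input accumulates until the neuron fires, then it resets.
print(simulate_neuron([0.4] * 6))   # prints [0, 0, 1, 0, 0, 1]
```

Wiring millions of such units together, with the connection strengths between them doing the "learning," is what distinguishes this style of computing from conventional instruction-by-instruction processing.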
