Artificial intelligence needs your data, all of it

Mike Elgan | Feb. 23, 2016
Today's concerns about giving up privacy will seem quaint in the coming years. A.I. will need everything, and we'll happily give it.

I won't go into the details, in part because I don't understand them. (When Google engineer Anjuli Kannan explained how SmartReply works to a crowd of professionals at the recent Virtual Assistant Summit in San Francisco, I could tell they didn't understand it, either.) But the technology behind SmartReply is monstrously advanced and powerful, even though its output tends to be stuff like "got it, thanks!" and its purpose is to save you two seconds.

That SmartReply works at all depends on Google's harvesting of terabytes of email messages and replies, which the company promises no human ever reads.

Why we'll all offer up our data to A.I.

Andy Rubin's dashcam, Microsoft's SwiftKey and Google's SmartReply are all cases in which large numbers of people willingly allow their data to be harvested to feed the A.I. systems that need it. In exchange, people get useful tools for free.

But there's an even better reason to feed the A.I. beast -- saving and improving human lives.

Air pollution is estimated to kill some 5.5 million people a year. AirTick, a new app that emerged this month from Nanyang Technological University in Singapore, uses smartphone pictures to track air pollution.

Smartphone photos can be tagged with the time and location at which they were taken. By harvesting thousands of photos a day from major cities, AirTick can train A.I. software to estimate the amount of smog in each photo. Over time, those estimates, combined with the photos' time and location data, should enable the system to maintain real-time, neighborhood-by-neighborhood estimates of air quality. That could allow timely alerts telling people to go inside when the air gets really bad, and it could give citizens evidence to demand cleaner air in, say, factory towns where the air may be especially unhealthful.
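To see how such a pipeline might hang together, here is a minimal sketch in Python. Everything in it, from the names to the grid size to the stubbed-out vision model, is my own illustration under stated assumptions, not AirTick's actual code, which hasn't been published:

```python
from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class Photo:
    timestamp: float   # Unix time from the photo's EXIF data
    lat: float         # GPS latitude from EXIF
    lon: float         # GPS longitude from EXIF
    pixels: bytes      # raw image data

def estimate_haze(photo: Photo) -> float:
    """Placeholder for a trained vision model that maps an outdoor
    photo to a 0-1 smog score. In a real system this would be a
    model trained on photos labeled with readings from official
    air-quality monitors."""
    return 0.5  # stub value for illustration

def grid_cell(lat: float, lon: float, size_deg: float = 0.01) -> tuple:
    """Bucket coordinates into roughly neighborhood-sized cells
    (0.01 degrees is about 1 km at the equator)."""
    return (round(lat / size_deg), round(lon / size_deg))

def air_quality_map(photos: list[Photo]) -> dict:
    """Aggregate per-photo haze estimates into a per-neighborhood
    average -- the basis for real-time air-quality alerts."""
    scores = defaultdict(list)
    for p in photos:
        scores[grid_cell(p.lat, p.lon)].append(estimate_haze(p))
    return {cell: mean(vals) for cell, vals in scores.items()}
```

The interesting design point is that no single photo needs to be accurate; averaging thousands of noisy estimates per neighborhood per day is what makes the map trustworthy.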

Another research project, out of the University of California at Berkeley, last week published a free app called MyShake that can detect earthquakes. It uses a smartphone's motion sensors to constantly monitor the phone's every movement, and it can distinguish shaking caused by an earthquake from ordinary, everyday motion.

It's like having millions of seismographs all over the place rather than dozens or hundreds. Eventually, the system should be able to detect earthquakes and issue warnings faster than current networks can.
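Here is a toy sketch of the two-stage idea: each phone flags suspicious shaking locally, and the network only declares an earthquake when many phones in one area trigger at once. The thresholds and function names are my assumptions for illustration, not MyShake's real classifier, which is a trained model running on richer features:

```python
def looks_like_earthquake(window: list[float],
                          amp_threshold: float = 0.3,
                          duration_threshold: int = 100) -> bool:
    """Toy stand-in for the on-phone classifier. Takes a window of
    accelerometer magnitudes (in g, sampled at roughly 50 Hz) and
    flags it as quake-like when shaking is both strong and sustained,
    unlike the brief spikes of walking or pocket jostling."""
    baseline = sum(window) / len(window)
    # Count samples that deviate noticeably from the resting baseline.
    active = sum(1 for a in window if abs(a - baseline) > amp_threshold)
    return active >= duration_threshold

def confirmed_event(triggers_nearby: int, min_phones: int = 60) -> bool:
    """A single phone is noisy. The network treats a detection as a
    real quake only when many phones in one area trigger together."""
    return triggers_nearby >= min_phones
```

The second function is the crucial one: a lone false positive costs nothing, because no alert goes out until the crowd agrees.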

And yet another new app, Aipoly, came out recently for iOS; it helps visually impaired people identify everyday objects. To use it, you simply snap a picture. Artificial intelligence in the cloud analyzes the photo, figures out what the object is, then sends the answer back.

For example, say a blind user is shopping for a birthday present at Toys 'R' Us. The user points the camera at a box, and Aipoly says it's a Star Wars Lego set. Or, while shopping for fruit, the app can tell the difference between a lemon and a lime.
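Under the hood, an app like this is mostly a round trip to a recognition service. The sketch below shows the general shape of that round trip in Python; the endpoint URL and response format are hypothetical, since Aipoly's actual API isn't public:

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical endpoint and response shape, assumed for illustration.
RECOGNIZE_URL = "https://api.example.com/v1/recognize"

def identify_object(image_path: str) -> str:
    """Upload a snapshot and return the cloud model's best label,
    which the app would then read aloud to the user."""
    with open(image_path, "rb") as f:
        resp = requests.post(RECOGNIZE_URL, files={"image": f}, timeout=10)
    resp.raise_for_status()
    return resp.json()["label"]  # e.g. "lemon" or "Star Wars Lego set"

# Usage: print(identify_object("shelf_photo.jpg"))
```

Note that every such request is also a donation: each labeled photo a user submits becomes more training data for the model.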

