There's a lot of sci-fi-level buzz lately about smart machines and software bots that will use big data and the Internet of things to become autonomous actors: scheduling your personal tasks, driving your car or a delivery truck, managing your finances, ensuring you comply with your medical regimen and adjusting it as needed, building and perhaps even designing cars and smartphones, and of course connecting you to the products and services they decide you should use.
That's Silicon Valley's path for artificial intelligence/machine learning, predictive analytics, big data, and the Internet of things. But there's another path that gets much less attention: the real world. It too uses AI, analytics, big data, and the Internet of things (aka the industrial Internet in this context), though not in the same manner. Whether you're looking to choose a next-frontier career path or simply understand what's going on in technology, it's important to note the differences.
A recent conversation with Colin Parris, the chief scientist at manufacturing giant General Electric, crystallized in my mind the different paths that the combination of machine learning, big data, and IoT is on. It's a difference worth understanding.
The real-world path
In the real world -- that is, the world of physical objects -- computational advances are focused on perfecting models of those objects and the environments in which they operate. Engineers and scientists are trying to build simulacra so that they can model, test, and predict from those virtual versions what will happen in the real world.
As Parris explained, the goal of these simulacra is to predict when (and what) maintenance is needed, so airplanes, turbines, and so forth aren't taken offline for regular inspections and maintenance checks. Another goal is to predict failure before it happens, so airplanes don't lose their engines or catch fire in midflight, turbines don't overheat and collapse, and so forth.
Those are long-held goals of engineering simulation; modern computing technology has made those simulacra accurate enough to serve increasingly as virtual twins of the real thing. Greater computing power, big data storage and processing, and the connectivity of devices via sensors, local processors, and networks (the industrial Internet) have made those virtual twins more and more practical. That means less guesswork ("extrapolation," in engineering parlance) and more certainty, which means fewer high-cost failures and fewer costly planned service outages for checks.
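The core idea of condition-based maintenance via a virtual twin can be sketched in a few lines: the twin predicts what a healthy machine should read under current operating conditions, and a sustained gap between prediction and reality triggers maintenance instead of a fixed inspection schedule. The model, names, and thresholds below are purely illustrative, not GE's actual implementation.

```python
# Illustrative sketch of virtual-twin predictive maintenance.
# The twin predicts a sensor value from operating conditions; a gap
# between the real reading and the prediction beyond some tolerance
# flags the machine for service. All parameters are hypothetical.

class VirtualTwin:
    def __init__(self, heat_coeff: float, ambient_c: float, tolerance_c: float):
        self.heat_coeff = heat_coeff    # modeled degrees C of heating per unit load
        self.ambient_c = ambient_c      # modeled ambient temperature
        self.tolerance_c = tolerance_c  # allowed model-vs-reality gap

    def predicted_temp(self, load: float) -> float:
        """Temperature the model expects at a given load."""
        return self.ambient_c + self.heat_coeff * load

    def needs_maintenance(self, load: float, measured_temp: float) -> bool:
        """Flag when the real machine runs hotter than its twin predicts."""
        return measured_temp - self.predicted_temp(load) > self.tolerance_c


twin = VirtualTwin(heat_coeff=0.5, ambient_c=20.0, tolerance_c=10.0)
print(twin.needs_maintenance(load=100.0, measured_temp=72.0))  # False: within tolerance
print(twin.needs_maintenance(load=100.0, measured_temp=85.0))  # True: flag for service
```

A real twin replaces the one-line physics here with detailed engineering models, but the decision logic (compare reality to the model, act on the residual) is the same.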
There's another goal, made possible only recently by those industrial Internet technology advances: machine-to-machine learning. Parris' example was a wind farm. Old turbines could share their experience and status with new ones, so the new ones could adjust their models based on local experience, as well as validate their local responses against other turbines' experiences before making adjustments or signaling an alarm.
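The peer-validation step in that example can be sketched simply: before raising an alarm, a turbine checks whether nearby units saw similar readings. If most peers report the same spike, it is likely a site-wide condition (say, a gust) rather than a local fault. This is a hypothetical illustration of the idea, not GE's protocol; all names and thresholds are invented.

```python
# Hypothetical sketch of machine-to-machine alarm validation:
# a turbine suppresses its alarm if its reading matches the fleet
# consensus, and lets it stand only when this unit alone deviates.

from statistics import median

def validate_alarm(own_reading: float,
                   peer_readings: list[float],
                   deviation_limit: float) -> bool:
    """Return True only if this unit deviates from the fleet consensus."""
    if not peer_readings:
        return True  # no peers to consult; trust the local model
    fleet_typical = median(peer_readings)
    return abs(own_reading - fleet_typical) > deviation_limit

# Site-wide gust: every turbine spiked, so no alarm for this unit.
print(validate_alarm(9.0, [8.8, 9.2, 9.1], deviation_limit=1.5))  # False
# Only this turbine is off: the alarm stands.
print(validate_alarm(9.0, [4.9, 5.1, 5.0], deviation_limit=1.5))  # True
```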