Smart machines require ethical programming: Gartner

Zafirah Salim | March 19, 2015
Gartner has identified five levels of programming and system development, based on their ethical impact.

Responsibility is shared among the user, the service provider and the designer. Users are responsible for the content of the interactions they start (such as a search or a request), but not for the answer they receive.

The designer is responsible for considering, within reason, the unintended consequences of the technology's use, while the service provider has a clear responsibility to interact with users in an ethical way, to teach the technology correctly, and to monitor its use and behaviour.

For instance, in the past, a smartphone-based virtual personal assistant (VPA) would guide you to the nearest bridge if you told it you'd like to jump off one. Now, it is programmed to pick up on such signals and refer you to a help line. This change underlines Gartner's recommendation to monitor technology for the unintended consequences of public use, and to respond accordingly.
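To make the idea concrete, the following minimal sketch (purely illustrative, not Gartner's or any vendor's actual implementation) shows the kind of safety intercept described above: a hypothetical assistant screens each request against crisis-related phrases before its normal intent handling runs.

    # Hypothetical illustration only: a simple pre-processing filter that screens
    # requests for crisis-related language before normal intent handling.
    CRISIS_PHRASES = ("jump off a bridge", "want to end my life", "kill myself")
    HELPLINE_MESSAGE = "You are not alone. Please contact your local crisis help line."

    def route_to_intent_handler(text: str) -> str:
        # Placeholder for the assistant's usual intent resolution.
        return "Handling request: " + text

    def handle_request(utterance: str) -> str:
        """Return a safety referral for crisis signals; otherwise route normally."""
        text = utterance.lower()
        if any(phrase in text for phrase in CRISIS_PHRASES):
            # Do not answer literally (e.g. with directions to a bridge);
            # refer the user to a help line instead.
            return HELPLINE_MESSAGE
        return route_to_intent_handler(text)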

Level 3: Evolutionary Ethical Programming

This level introduces ethical programming as part of a machine that learns and evolves. The more a smart machine learns, the further it departs from its original design, and the more individual its behaviour becomes.

At this level, the concept of the user changes again. In Level 2, the user is still in control; in Level 3, many tasks are outsourced to and performed autonomously by smart machines. The less responsibility the user retains, the more vital trust becomes to the adoption of smart machines.

For example, if you don't trust your VPA with expense reporting, you would not authorise it to act on your behalf. If you don't trust your autonomous vehicle to drive through a mountain pass, you will take back control in that situation. Given the effect of smart technologies on society, Level 3 will require considerable new regulation and legislation.

As part of long-term planning, Gartner recommends that CIOs consider how their organisations will handle autonomous technologies acting on the owner's behalf. For example, should a smart machine be able to access an employee's personal or corporate credit card, or even have its own credit card?
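As a purely illustrative sketch of the kind of safeguard such planning might produce (the policy structure, task names and limits below are assumptions, not anything Gartner prescribes), an organisation could require an explicit delegation policy to be checked before a smart machine spends on an owner's behalf:

    # Hypothetical illustration only: an explicit delegation policy enforced
    # before a smart machine acts on an owner's behalf.
    from dataclasses import dataclass

    @dataclass
    class DelegationPolicy:
        allowed_tasks: set          # tasks the owner has authorised, e.g. {"expense_reporting"}
        spending_limit: float       # maximum charge per transaction
        payment_instrument: str     # "corporate_card", "personal_card" or "machine_card"

    def authorise_action(policy: DelegationPolicy, task: str, amount: float) -> bool:
        """Allow an autonomous action only within the limits the owner delegated."""
        if task not in policy.allowed_tasks:
            return False            # the owner has not granted this task
        if amount > policy.spending_limit:
            return False            # escalate to a human instead of acting
        return True

    # Example: an assistant trusted with expense reporting up to $500 on the corporate card.
    policy = DelegationPolicy({"expense_reporting"}, 500.0, "corporate_card")
    print(authorise_action(policy, "expense_reporting", 120.0))   # True
    print(authorise_action(policy, "travel_booking", 120.0))      # False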

Level 4: Machine-Developed Ethics

Gartner does not expect smart machines to become self-aware within a time frame that requires planning today. Level 4, however, anticipates that smart machines will eventually become self-aware; they will need to be raised and taught, and their ethics will need to evolve.

In Level 4, the concept of "users" disappears, replaced by the concept of "actors" who initiate interactions, respond and adapt.

Ultimately, a smart machine, being self-aware and able to reflect, is responsible for its own behaviour.

 
