
12 ethical dilemmas gnawing at developers today

Peter Wayner | April 22, 2014
The tech world has always been long on power and short on thinking about the ramifications of this power. If it can be built, there will always be someone who will build it without contemplating a safer, saner way of doing so, let alone whether the technology should even be built in the first place. The software gets written. Who cares where and how it's used? That's a task for somebody in some corner office.

There are no simple ways to measure the ethical obligations, and many programmers have wasted many keystrokes arguing about what they mean. Still, the entire endeavor will grind to a halt if people stop giving. The good news is that it's often in everyone's best interest to contribute because everyone wants the software to remain compatible with their use of it.

Ethical dilemma No. 10: How much monitoring is really warranted?

Maybe your boss wants to make sure the customers aren't ripping off the company. Maybe you want to make sure you get paid for your work. Maybe some spooky guy from the government says you must install a backdoor to catch bad guys. In every case, the argument is filled with assurances that the backdoor will only be used, like Superman's powers, to support truth and justice. It won't be used against political enemies or the less fortunate. It won't be sold to despotic regimes.

But what if the bad guys discover the hidden door and figure out how to use it themselves? What if your backdoor is used to support untruths and injustices? Your code can't make ethical decisions on its own. That's your job.
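
To make the risk concrete, here is a minimal sketch, in Python, of the kind of hidden "master key" the paragraph warns about. Everything in it (the names, the constant, the scenario) is invented for illustration:

    import hmac

    # Hypothetical sketch of a backdoored login check. The plaintext
    # comparison is a simplification; no real system or API is implied.

    MASTER_KEY = "s3cr3t-support-override"  # the hidden door

    def authenticate(supplied: str, stored: str) -> bool:
        # The intended path: constant-time comparison against the user's credential.
        if hmac.compare_digest(supplied, stored):
            return True
        # The backdoor: one constant that opens every account. Whoever finds it
        # in the binary, the repo history, or a leaked dump inherits its powers.
        return hmac.compare_digest(supplied, MASTER_KEY)

The code has no way to tell a support engineer from an attacker; once the constant leaks, every deployment is wide open, which is exactly the scenario described above.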

Ethical dilemma No. 11: How bulletproof should code really be?

Sure, the minimal calculation, simple data structure, and brute-force approach works well in demos when the problems are small. The users try out the code and say, "Gosh, this works quickly." Several months later, when enough data has been loaded into the system, the cheap algorithm's weaknesses appear and the code slows to a crawl.

Developers must often decide exactly how hard to work on the final product. Do you whip off a quick and cheap solution, or spend weeks adding bulletproof code that handles the extreme cases? True, clients and users should assume some of the responsibility during the requirements and sign-off phases, but developers are often better equipped to anticipate the hiccups the code will encounter once it's running in the field.
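
To make the trade-off concrete, here is a minimal Python sketch (hypothetical functions and numbers) of the classic version of this trap: a brute-force duplicate check that feels instant in the demo, next to the version that costs a little more thought up front:

    # Hypothetical sketch: two ways to detect duplicate records by ID.

    def has_duplicates_naive(ids):
        # O(n^2): compare every pair. Instant for a 100-row demo; by a
        # million rows it is hundreds of billions of comparisons and a crawl.
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                if ids[i] == ids[j]:
                    return True
        return False

    def has_duplicates_robust(ids):
        # O(n): one pass with a set. The extra effort is invisible in the
        # demo and is the only reason the feature still works months later.
        seen = set()
        for record_id in ids:
            if record_id in seen:
                return True
            seen.add(record_id)
        return False

Even the robust version leaves extreme cases unhandled (unhashable IDs, data too large for memory), which is where those extra weeks of bulletproofing actually go.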

Ethical dilemma No. 12: How much should future consequences influence present decisions?

Many projects don't make waves. The information goes in and never escapes. Some, however, take on a life of their own, escaping into the wild where they may do untold harm. Security, penetration testing, espionage — these are obvious candidates for considering the collateral damage of your code.

Take Stuxnet, a worm widely believed to have been built to attack the centrifuges Iran uses to enrich uranium. Perhaps it succeeded, but it now lives on, floating around in Windows systems throughout the world.
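
One concrete way to limit that collateral damage is to make a tool refuse to act outside its intended scope. Here is a minimal sketch, in Python, assuming a hypothetical penetration-testing script; the allowlist and the engagement end date are invented:

    import socket
    import sys
    from datetime import date

    # Hypothetical guards for a pen-testing tool. Both the authorized
    # targets and the expiry date are invented for illustration.
    AUTHORIZED_TARGETS = {"203.0.113.10", "203.0.113.11"}  # documentation-range IPs
    ENGAGEMENT_ENDS = date(2014, 6, 30)

    def require_in_scope(target: str) -> None:
        # Refuse to run after the engagement is over...
        if date.today() > ENGAGEMENT_ENDS:
            sys.exit("Engagement over; this tool has expired.")
        # ...or against anything that was not explicitly authorized.
        try:
            address = socket.gethostbyname(target)
        except socket.gaierror:
            sys.exit(f"Cannot resolve {target}; aborting.")
        if address not in AUTHORIZED_TARGETS:
            sys.exit(f"{target} ({address}) is not an authorized target; aborting.")

    require_in_scope("203.0.113.10")  # anything out of scope aborts here
    # ... the actual probing would run only past this point ...

A copy that escapes into the wild then does nothing, because it can never satisfy the guard.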

For most developers, the collateral damage is less obvious. We code for today — a hard enough proposition — but we should also consider the future.
