Facebook emotional manipulation test turns users into 'lab rats'

Sharon Gaudin | July 1, 2014
Anger grows even as Facebook researcher posts apology for causing users anxiety.

Hadley Reynolds, an analyst with NextEra Research, said while he finds Facebook's move "reprehensible," he also doesn't find it surprising.

"Web-based businesses have been 'experimenting' on their users since the beginning of the commercial Internet, so it's hardly surprising that Facebook would be testing the effect of various content algorithms on groups of their members," Reynolds said. "Facebook is simply gauging its ability to manipulate its customers' emotions for its own private gain. This incident won't break Facebook's franchise, but it will add another straw to the growing pile that eventually will erode user's perception of value enough to put the business in long-term decline."

Jeff Kagan, an independent analyst, called the experiment an abuse of users' trust.

"This creeped me out when I heard about what Facebook is doing to their users, their customers," Kagan said. "Facebook just sees this as an interesting experiment, but this is another case where they cross way over the line. Companies often battle to protect the privacy of their users from other companies. However, Facebook seems to be abusing their own customers themselves. This is inexcusable."

This is far from Facebook's first questionable practice regarding privacy, yet users have not left the social networking site. That means Facebook has little incentive to stop making these kinds of moves.

"In the old days, customers would simply leave and punish Facebook," Kagan said. "However today, customers don't leave so Facebook continues going down this same path with no self control. How can a company that is supposed to value its customers, abuse them so badly?"

The study contends that users consent to Facebook's manipulation of their News Feeds when they accept the site's terms and conditions of use.

Rob Enderle, an analyst with the Enderle Group, said that explanation doesn't fly for him. He said he is troubled by the company's decision to purposefully make people sad to further its experiment.

"That was the goal of the study and that sadness does represent harm," Enderle said. "I'd anticipate one or more class-action lawsuits."

Kagan noted that Facebook's social experiment could easily invite investigations by governments around the world.

"I don't think Facebook will show self control until they are forced to one way or another," he said. "I can see how governments in every country where Facebook operates may step in. When they do, it will be an expensive lesson for the company."

 
