
Facebook emotional manipulation test turns users into 'lab rats'

Sharon Gaudin | July 1, 2014
Anger grows even as Facebook researcher posts apology for causing users anxiety.

Users and analysts were in an uproar over news that Facebook manipulated users' News Feeds to conduct a week-long psychological study that affected about 700,000 people.

News reports said that Facebook allowed researchers to manipulate the amount of positive and negative content users saw on the social network in order to test how it affected their emotions. The study, which was conducted Jan. 11 to Jan. 18, 2012, was published in the Proceedings of the National Academy of Sciences.

For the past several days, media outlets, bloggers, social commentators and industry analysts have been venting their anger over Facebook's emotional manipulation.

"I think this violates the trust of Facebook users who rely on their protection," said Patrick Moorhead, an analyst with Moor Insights & Strategy. "There were two lines that were crossed that violated trust. First, Facebook's News Feed was manipulated for the sake of the experiment. Secondly, it involved a third party who published results externally. Facebook users are more than public lab rats."

In the experiment, Facebook temporarily influenced what kinds of posts and photos about 700,000 of its English-speaking users would see in their News Feeds. The social network enabled researchers to show users either more positive or more negative comments and posts in order to see whether the skew influenced users' emotions.

As a consequence, users were not shown a regular cross-section of their friends' posts, but instead were given a manipulated feed.
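The published paper describes the mechanism only at a high level. One plausible way to implement the kind of skew the article describes is to classify each post by emotional valence and probabilistically drop posts of one valence before the feed is shown. The sketch below is a rough illustration of that idea, not Facebook's actual code; the sentiment labels, the filter_feed helper and the omission probability are all illustrative assumptions.

```python
import random

# Toy sketch (not Facebook's actual code) of valence-based feed skewing:
# posts of one emotional valence are probabilistically dropped so the
# remaining feed reads more positive or more negative overall.

def filter_feed(posts, suppress, omit_prob=0.5, rng=random):
    """Drop each post of the suppressed valence with probability omit_prob."""
    return [
        post for post in posts
        if post["sentiment"] != suppress or rng.random() >= omit_prob
    ]

feed = [
    {"text": "Great day at the beach!", "sentiment": "positive"},
    {"text": "Stuck in traffic again.", "sentiment": "negative"},
    {"text": "Posted a new photo album.", "sentiment": "neutral"},
]

# A user in a "reduced positivity" condition sees fewer positive posts.
print(filter_feed(feed, suppress="positive"))
```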

The study found that users who were shown more positive comments made more positive comments themselves, while users shown more negative comments became more negative.

Facebook did not respond to a request for comment, though Adam Kramer, a data scientist at Facebook who participated in the study, apologized for upsetting users in a post on his Facebook page.

"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," Kramer wrote. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

He pointed out that the research only affected about 0.04% of users, or 1 in 2,500. Facebook today has more than 1 billion users.

While some users commenting on his post said they appreciated that he addressed the issue, not all were so understanding.

"I appreciate the statement, but emotional manipulation is still emotional manipulation, no matter how small of a sample it affected," wrote Kate LeFranc in a comment.

Andrew Baron wrote, "This is the nail in the coffin for my concern that Facebook is the kind of company that Google talks about when they say don't be evil... There is no turning back from this."

 
