
Facebook's icky psychology experiment is actually business as usual

Fredric Paul | July 8, 2014
Unless you've been living under a rock for the last couple of weeks, you've no doubt heard about Facebook's creepy, secret psychological experiment designed to see if negative newsfeed posts inspire more negativity -- and vice versa. I don't want to excuse Facebook's behavior, which has prompted a (sort-of) apology from Facebook COO Sheryl Sandberg, as well as an ongoing stream of condemnation and outrage from legitimate psychologists and Internet commentators. I too was weirded out by the revelations, feeling manipulated, as if my privacy had somehow been invaded without my permission.

But the more I think about this kerfuffle, the more I realize that while Facebook's actions may have been unseemly and morally wrong, they were hardly unusual. In fact, far more egregious personal violations of this kind take place all the time, generating little or no comment. These kinds of "experiments" in modifying our emotions are not just common; they are the everyday stock in trade of a wide variety of businesses and media outlets.

Everybody manipulates emotions

The vast majority of the advertising business is built on manipulating emotions, from tear-jerking AT&T commercials to personal grooming ads built on the idea of making you embarrassed about some bodily function you wouldn't otherwise give a second thought. Apart from a few price-oriented ads for staples or feature-oriented ads for business products, almost all advertising is about affecting your moods and feelings to spur a purchase or other behavior.

When was the last time you saw a car ad, for example, that focused solely on getting you from here to there instead of making you feel like a winner for sitting in the XYZ Model Dreadnought 3000?

It's not just ads, of course. Stores spend millions on everything from decor to music to influence the way you feel while shopping, hoping that you'll spend more if they get it right. Writers and moviemakers devote their careers to eliciting emotional reactions, and when they succeed we celebrate them for it.

So what's different with Facebook?

Sure, the Facebook experiment could have negative impacts on users, but so too do all those other examples of emotional manipulation. Based on public comments, several things seem to be bugging people, but they don't really add up to a significant difference from what's already going on.

Is secrecy the problem?

First, the Facebook experiment was secret. The folks at Facebook Labs who conducted the test didn't see the need to tell anyone what they were doing (and no one else at the company stepped in to suggest that maybe that wasn't such a good idea). The difference, I guess, is that it's OK to manipulate your emotions in situations where that's expected, but not to do it unannounced in what people assumed was a neutral environment.
