
In search of a social site that doesn't lie

Mike Elgan | Aug. 4, 2014
Mike Elgan would like to find a social network that doesn't lie to users, doesn't experiment on them without their clear knowledge, and delivers by default all the posts of the people they follow.

And don't get me started on OKCupid's algorithms and how they could affect the outcome of people's lives.

Not only do both companies experiment all the time, but their experiments also make huge changes to users' relationships.

The real danger with these experiments

You might think the real problem is that social networks that lie to people, manipulate their relationships, and regularly perform experiments on their users are succeeding. For example, when Facebook issued its financial report last month, it said revenue rose 61% to $2.91 billion, up from $1.81 billion in the same quarter a year ago. The company's stock soared after the report came out.

Twitter, which is currently a straightforward, honest, nonmanipulative social network, has apparently seen the error of its ways and is seriously considering the Facebook path to financial success. Twitter CEO Dick Costolo said in an interview this week that he "wouldn't rule out any kind of experiment we might be running there around algorithmically curated experiences or otherwise."

No, the real problem is that OKCupid and Facebook may take action based on the results of their research. In both cases, the companies say they're experimenting in order to improve their service.

In the case of OKCupid, the company found that matching people it had judged incompatible worked out better than it expected. So, based on that result, it may in the future match up more people it has identified as incompatible.

In the case of Facebook, it did find that mood is contagious. So maybe it will "improve" Facebook in the future to build in a bias for positive, happy posts in order to make users happier with Facebook than they are with networks that don't filter based on positivity.

What's the solution?

While Twitter may follow Facebook down the rabbit hole of user manipulation, there is a category of "social network" where what you see is what you get — namely, messaging apps.

When you send a message via, say, WhatsApp or Snapchat or any of the dozens of new apps that have emerged recently, the other person gets it. WhatsApp and Snapchat don't have algorithms that choose to not deliver most of your messages. They don't try to make you happy or sad or connect you with incompatible people to see what happens. They just deliver your communication.

I suspect that's one of the reasons younger users are increasingly embracing these alternatives to the big social networks. They're straightforward and honest and do what they appear to do, rather than manipulating everything behind the scenes.

Still, I'd love to see at least one major social site embrace honesty and respect for users as a core principle. That would mean no lying to users, no doing experiments on them without their clear knowledge, and delivering by default all of the posts of the people they follow.

In other words, I'd love to see the founders of social sites write blog posts that brag: "We DON'T experiment on human beings."

Wouldn't that be nice?
