
Algorithms and experiments make strange bedfellows at SXSW

Lamont Wood | March 15, 2016
Data has to be a two-way street, these panelists agreed.

Search and social media algorithms can produce strange results for their users -- but fiddling with them can produce strange results from their users. In both cases the results may be too important to leave purely to machines.

That was the message from two conference sessions this weekend at South by Southwest (SXSW) Interactive in Austin, as the largely youth-oriented technology conference moved into its second day under clearing skies.

"How to raise your IQ by eating gifted children," suggests the auto-complete feature of a search engine. Another tagged a specific ethnic group of humans as gorillas. In yet another case, women were not shown high-end job openings. And, Facebook will say that you are getting a particular advertisement "because you are similar to our customers."

Christian Sandvig, a professor at the University of Michigan, presented these examples as part of a session titled "Algorithmic Lunacy and What to Do About It."

"We have a crisis of confidence in algorithms," he said, calling for "seamfull" (as opposed to seamless) design that makes the algorithms visible rather than invisible. "But that is premised on having non-evil designers who are trying to help (rather than delude) you," he added.

Fellow panelist Dawn Nafus, a research scientist at Intel, told of her fitness monitor's algorithm chiding her as unhealthy for not taking enough steps each day while she was confined to a wheelchair after an accident. As a result, she would not have qualified for a workplace health insurance discount, she complained.

She called on the industry to make data downloads the norm so that users can go "off script" and "domesticate" their data through personal analysis.

She recalled a friend who analyzed the data from his sleep monitor and found he was waking up every morning at exactly 3 a.m. He discovered that a computer in the room was backing up at that time and waking him. Another friend found that her periodic exercise routine was triggering her autoimmune disease and so was actually hurting her health.

"Make data downloading the norm," she pleaded. "It's super-cheap to build in a data download button."

But when Facebook experimented with the algorithm that selects the items in individual users' news feeds, many of those users were deeply troubled, panelists noted in a session titled "Massive Online Experiments: Practical Advice." The so-called emotional contagion experiment of 2012 (published in 2014) altered the news feeds of about 700,000 users to see whether their emotions changed, as evidenced by the language of their postings, in response to changes in the emotional content of those feeds.

There could be a number of reasons for the backlash, said Duncan Watts, principal researcher at Microsoft Research. The word "experiment" has negative associations with rats and petri dishes, he said, and the need to resort to an experiment implies disappointing ignorance on the part of those running things. Meanwhile, people are uncomfortable with machine-based randomization, even though it is integral to many experiments; they would rather believe that humans are making the choices, he added.

 
