Rather than transparency, Sandvig's preferred solution focuses on auditing: systematic tests of an algorithm's outputs, rather than of the algorithm itself, to assess the nature of its consequences.
"In some areas, we're not going to be able to figure out the processes or the intent, but we can see the consequences," he said.
In the area of housing, for example, it may be difficult to fully understand the algorithms behind loan decisions; far easier, and far more diagnostic, is an examination of the results: are people of all races getting mortgages in all neighborhoods?
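The kind of outcome audit described here can be sketched in a few lines: compare approval rates across groups without ever inspecting the lending algorithm itself. The data below is made up for illustration, and the "four-fifths" screening threshold is one common disparate-impact heuristic, not something Sandvig specifies.

```python
# Sketch of an outcome-based audit: measure mortgage approval rates
# per group, then flag large disparities -- no access to the lender's
# algorithm is needed, only its decisions.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs -> rate per group."""
    totals = defaultdict(int)
    approved = defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        if ok:
            approved[group] += 1
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(rates):
    """Ratio of lowest to highest approval rate; values below 0.8
    fail the common 'four-fifths' screening rule (an assumption here,
    borrowed from employment-discrimination practice)."""
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi

# Hypothetical audit data: 100 applicants in each of two groups.
decisions = ([("A", True)] * 80 + [("A", False)] * 20
             + [("B", True)] * 50 + [("B", False)] * 50)
rates = approval_rates(decisions)
print(rates)                    # {'A': 0.8, 'B': 0.5}
print(disparate_impact(rates))  # 0.625 -> below 0.8, flags a disparity
```

A real audit would of course control for legitimate underwriting factors before drawing conclusions; the point is that the test operates entirely on consequences, as Sandvig argues.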
It is clearly early days in figuring out the best approaches to these problems. Whichever strategies ultimately get adopted, though, the important thing now is to be mindful of the social consequences, the FTC's Soltani said.
"A lot of times the tendency is to let software do its thing," he said. "But to the degree that software reinforces biases and discrimination, there are still normative values at stake."