Curation means bias
On the night of the US presidential elections, the reaction to the realization that pundits had gotten reality so wrong was to deflect blame outward. Talk of filter bubbles reemerged, and of fake news, and of algorithms that curate and rank. Those pesky algorithms, let loose from the lab to wreak havoc on the faculties of well-meaning men and women. I wanted to think for myself, but Facebook wouldn’t let me.
To recognize that artifacts reflect the biases of those behind them is important. Equally important is realizing that that particular clock started ticking much earlier.
The insidious manifestations of bias in analog curation can be destructive. In many an autocracy, the state’s tourist board would be more inclined to point you to attractions that legitimize the state than to those that predate it. The morning paper’s headline is more likely to be about something utterly petty than about a genuinely consequential event. An unsubstantiated story could be designed to traduce a family, or lead to someone’s imprisonment or end. The term yellow journalism came into use over a century ago, and the general problem it names can be traced throughout history.
From that vantage, the limitations of software-aided approaches to curation are at worst on par with those of traditional approaches and at best fixable. As such, the reaction one might have observed over the past few months seems akin to throwing the baby out with the bathwater. When a search engine’s first two results attract 50 percent of clicks, that places a great deal of responsibility on the engineers and designers behind the search engine. But responsibility doesn’t imply malice, unless of course the evidence suggests it.
Is it a bad thing that Quora or Amazon or Goodreads or Audible recommends things to you? No. Is it a bad thing that YouTube thinks you might enjoy a slew of videos about dad jokes because you watched a clip about that “Martin Loofah King” exfoliator the night before? Up to a point, no (hehe, Loofah King). Might that ultimately preclude you from experiencing the full current of human life, if you’re only ever reading and watching things you like? Quite possibly, but then again, there are ways to recognize and mitigate that limitation.
And there are limitations. One of the chapters in Bad Choices covers the most fundamental way that these recommendation engines work—an approach that relies on looking at how things are linked and giving more weight to those things that have a greater number of links. The more connected you are, the more visible you become.
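That link-counting idea can be sketched in a few lines. The items and links below are invented for illustration, and real engines layer far more on top, but the core weighting the chapter describes amounts to this:

```python
from collections import Counter

# Toy catalog: each pair (source, target) means one item links to another.
# These item names are hypothetical, purely for illustration.
links = [
    ("article-1", "article-3"),
    ("article-2", "article-3"),
    ("article-4", "article-3"),
    ("article-1", "article-2"),
    ("article-3", "article-2"),
]

# Weight each item by how many other items link to it:
# the more connected you are, the more visible you become.
inbound = Counter(target for _source, target in links)

# Rank items from most-linked to least-linked.
ranked = [item for item, _count in inbound.most_common()]
print(ranked)  # ['article-3', 'article-2']
```

Here "article-3" wins simply because three things point at it, which is exactly the limitation the next anecdote illustrates: anything poorly connected, however worthwhile, stays invisible.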
I was at an event once where the speaker asked us all to stand up, look for others in the audience who were from the same city as us, and then hold their hands until the exercise was over. The point being that even in a room full of strangers, one could find ways to connect with others. Needless to say, the few people at the back of the room who were from remote parts of the world scratched their heads as they wondered how that exercise was helping them feel more included.
Even so, to the optimist, to the spectator, these machine-assisted ways we’re developing to better aid discovery and personalization and classification are remarkable. Anything we consume, especially if it’s curated, will have some element of bias in it. The antidote to bias is critical thinking, not passing the buck.