25 May 2011

We should embrace Filter Bubbles

Eli Pariser's TED talk "Beware of online 'filter bubbles'" and his book The Filter Bubble document the rise of personalized algorithmic filtering on the web, namely the use of our past online behavior to determine the content shown to us in the future. The filter bubble is a metaphor for what happens when our past determines the content we see next.

Pariser explores the societal implications of filtering: the amplification of bias, echo chambers, the loss of serendipity, and so on.  Although he is raising consciousness on an important issue of the day, his discussion feeds the alarmists.  Predictably, popular and social media have done what they do best: amplify fear.  While academic discussions and the New York Times have offered reasonably measured coverage, if we are to believe some posts, we could be facing the death of democracy as we know it.

Without significant filtering, the modern internet would not function.  The original Google search algorithm, PageRank, is itself a filter driven by the link-formation behavior of millions of people and subject to constant editorial adjustment to remove noise.  Spam filters are now an essential feature of any e-mail system, and prediction engines (also filters) are routinely used to improve serendipity.  Personalization technologies are nothing new; there has never been a neutral way to look at the net.

As with many problems in the modern age, we'll go a long way simply by being transparent about the goals our filtering technologies are trying to satisfy, and by making them more adaptive to the in-the-moment context of our actions.  For example, in Ian Eslick's work, a recommender system accepts explicit statements of goals, along with implicit background information about you, to prioritize, but not remove, your choices.  Many popular recommendation engines explain the reasons for their recommendations.  This is a non-intrusive and highly transparent form of filtering.
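To make the idea concrete, here is a minimal sketch of a "prioritize, don't remove" re-ranker that records why each item moved.  The names (rank_with_reasons, Item) and the scoring weights are hypothetical illustrations of the pattern, not the actual system described above.

    from dataclasses import dataclass, field

    @dataclass
    class Item:
        title: str
        tags: set                                     # topical tags on the item
        reasons: list = field(default_factory=list)   # explanations shown to the user

    def rank_with_reasons(items, explicit_goals, implicit_profile):
        """Re-rank items without removing any, recording why each was promoted.

        explicit_goals: tags the user has stated they care about right now.
        implicit_profile: tag -> weight learned from past behavior.
        """
        def score(item):
            s = 0.0
            for tag in item.tags:
                if tag in explicit_goals:
                    s += 2.0                          # stated goals dominate
                    item.reasons.append("matches your stated goal: " + tag)
                weight = implicit_profile.get(tag, 0.0)
                if weight:
                    s += weight                       # past behavior nudges, never censors
                    item.reasons.append("you often read about " + tag)
            return s

        # Every item stays in the result; only the order changes.
        return sorted(items, key=score, reverse=True)

Because nothing is filtered out and every promotion carries an explanation, the user can see, and override, what the system thinks it knows about them.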

To truly serve our intent, and not mindlessly follow our behaviors, filters need to become more aware of the goals we bring to an online interaction.  When I use web search as a delegate for my memory ("Do I take 1 or 2 pills a day?"), I absolutely want past behavior taken into account.  When I search the web for opinions on a medical subject, depending on my goal I may want to see only institutionally approved content from WebMD and the Mayo Clinic, or only firsthand accounts of what success people have had with lifestyle changes.
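A small sketch of what such goal-awareness might look like for the medical-search example: the same result list is reordered, not pruned, depending on the goal the searcher declares.  The goal labels and the domain list are hypothetical, chosen only to illustrate the behavior.

    INSTITUTIONAL_DOMAINS = {"webmd.com", "mayoclinic.org", "nih.gov"}

    def reorder_by_goal(results, goal):
        """results: list of (title, domain) pairs; goal: 'institutional', 'peer-experience', or None."""
        institutional = [r for r in results if r[1] in INSTITUTIONAL_DOMAINS]
        community = [r for r in results if r[1] not in INSTITUTIONAL_DOMAINS]
        if goal == "institutional":
            return institutional + community      # approved sources first
        if goal == "peer-experience":
            return community + institutional      # firsthand accounts first
        return results                            # no declared goal: leave the ranking alone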

In an era of increasing information overload, the filter is a necessary and valuable tool, and we're only at the beginning of the technology curve.  In the context of health, filters are critical to improving the effectiveness of the rising class of e-patients.  Our collective energies should be directed toward calling for more transparency, more control, and better techniques for learning and detecting our goals.