At a time when technology should be broadening our information and knowledge, it may in fact be narrowing our minds. Attention management is becoming an instinctive self-preservation behavior. We don’t have the time and energy (nor the interest) to read and evaluate everything presented about every issue. So we filter. Whether we admit it or not, we tend to filter out what does not reinforce our worldview. This dynamic is nothing new, but in the past this selective-openness was largely self-inflicted. Now we are beginning to bake bias into our technology.
This dynamic played out recently when PolitiFact.com announced their annual “Lie of the Year” selection. The fact-checking website chose the Democratic Congressional Campaign Committee’s claims about Medicare as the most egregious distortion of 2011. This news isn’t particularly interesting in and of itself. As in war, the first victim of an election year is always the truth. Politicians lie. Always have. Probably always will. Thanks to services like PolitiFact.com and FactCheck.org, it’s now a bit easier to call them on it. We just don’t like it when a foul is called on our team.
The major fact-checking sites are equal-opportunity whistle-blowers. Republicans held the Lie of the Year title for the previous two years, and during that time conservative pundits vilified the site while progressives praised it. Now that the Democrats wear the “pants-on-fire” distinction, PolitiFact’s former champions are on the attack. Gawker’s Jim Newman went so far as to call it “dangerous” and the New York Times proclaimed it “dead”. A few days later, PolitiFact editor Bill Adair responded to the uproar. In a brief article, he laments the fact that most discourse now takes place in an echo chamber.
At a Republican campaign rally a few years ago, I asked one of the attendees how he got his news. “I listen to Rush and read NewsMax,” he said. “And to make sure I’m getting a balanced view, I watch Fox.”
My liberal friends get their information from distinctly different sources – Huffington Post, Daily Kos and Rachel Maddow. To make sure they get a balanced view, they click Facebook links – from like-minded liberal friends.
This is life in our echo chamber nation. We protect ourselves from opinions we don’t like and seek reinforcement from like-minded allies.
We tend to reinforce our views and values by surrounding ourselves with people who see things our way. As Bill Bishop notes in his book The Big Sort, “we have segregated ourselves into enclaves of people who look like us, talk like us and act like us.” I would add that too often this translates to people who think like us and believe like us. This isn’t just reflected in our neighborhoods and social clubs, but increasingly in our online activities.
Personalization has been a feature of Google for years. If the search engine knows who you are (and it usually does), it is going to tailor the SERP (Search Engine Results Page) to your history, behaviors and preferences. Facebook amplifies the filtering effect of your self-selected social network by presenting only “important” updates and allowing you to “hide all stories by…”. This, of course, lets you publish your views without having to see any response or rebuttal. Eli Pariser has written an excellent introduction to the subject in his book The Filter Bubble: What the Internet Is Hiding from You.
Personalization is now moving to the next level in the form of social search. Search engines consider many factors when matching search terms to content and ranking them according to relevance. Social search adds a new ingredient to this secret sauce. In addition to more or less objective relevance criteria (yes, I’m ignoring for now how sponsored searches cook the books), social search takes your social graph into account. If your Facebook BFF likes a story, it’s likely that you will like it too, and so it gets boosted in the SERP. When searching for music, restaurants and movies, this can be useful. When searching for news or competitive intelligence, it can be myopic. If our friends and colleagues look, act and think like us, our social search engine results are unlikely to be “fair and balanced.”
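The ranking tweak described above can be sketched in a few lines of code. This is a toy model, not any search engine’s actual algorithm: the scores, the boost weight, and the like counts are all invented for illustration. It simply shows how folding social-graph signals into an otherwise objective relevance score can push a well-liked result past a more relevant one.

```python
# Toy model of social search ranking: each "like" from the searcher's
# social graph adds a fixed boost to a result's base relevance score.
# All data and weights below are hypothetical.

def social_rank(results, friend_likes, boost=0.5):
    """Re-rank (url, relevance) pairs, boosting results friends liked."""
    def score(item):
        url, relevance = item
        return relevance + boost * friend_likes.get(url, 0)
    return sorted(results, key=score, reverse=True)

# Base relevance, as a purely "objective" engine might rank the results.
results = [
    ("neutral-analysis.example", 0.90),
    ("partisan-blog.example", 0.70),
]

# Likes from the searcher's (like-minded) social graph.
likes = {"partisan-blog.example": 3}

ranked = social_rank(results, likes)
# The partisan post now scores 0.70 + 3 * 0.5 = 2.20, outranking the
# more relevant neutral analysis at 0.90 -- the filter bubble in action.
print([url for url, _ in ranked])
```

The design choice that matters is the `boost` weight: set it to zero and you get plain relevance ranking; turn it up and your friends’ tastes increasingly decide what you see.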
Microsoft’s FUSE Labs is developing what is potentially the most ambitious manifestation of social search to date. The new site, called So.cl, has noble aspirations. In a Technology Review interview, Lili Cheng, the Microsoft researcher who led development of So.cl, says "So.cl is really an experimental research project focused on how social networking and search can be used for the purpose of learning." So.cl made a brief public debut last July but access was quickly limited to the University of Washington, Syracuse University, and New York University. This was likely due to Microsoft’s insistence that So.cl is about learning how people learn. Cheng went on to say "The project isn’t specifically for formal learning, but learning as a general activity on any topic." So keeping it on campus might make sense. Then again, isn’t that how Facebook got started?
Like any technology, search personalization and social search can be used for good or evil. It can be a bug or a feature. The trick is to be aware of what your search engine of choice is doing behind the curtain and to compensate for bias as necessary. I may be a bit hypersensitive to this issue as we head into an election year, but we would do well to heed the advice of John Stuart Mill at all times.
"It is hardly possible to overrate the value…of placing human beings in contact with persons dissimilar to themselves, and with modes of thought and action unlike those with which they are familiar…Such communication has always been, and is peculiarly in the present age, one of the primary sources of progress."