The Trouble with the Online Echo Chamber (The New York Times)
By NATASHA SINGER
Published: May 28, 2011
“ON the Web, we often see what we like, and like what we see. Whether we know it or not, the Internet creates personalized e-comfort zones for each one of us.
Give a thumbs up to a movie on Netflix or a thumbs down to a song on Pandora, de-friend a bore on Facebook or search for just about anything on Google: all of these actions feed into algorithms that then try to predict what we want or don’t want online.
And what’s wrong with that?
Plenty, according to Eli Pariser, the author of “The Filter Bubble: What the Internet Is Hiding From You.” Personalization on the Web, he says, is becoming so pervasive that we may not even know what we’re missing: the views and voices that challenge our own thinking.
“People love the idea of having their feelings affirmed,” Mr. Pariser told me earlier this month. “If you can provide that warm, comfortable sense without tipping your hand that your algorithm is pandering to people, then all the better.”
Mr. Pariser, the board president of the progressive advocacy group MoveOn.org, recounted a recent experience he had on Facebook. He went out of his way to “friend” people with conservative politics. When he didn’t click on their updates as often as those of his like-minded contacts, he says, the system dropped the outliers from his news feed.
Personalization, he argues, channels people into feedback loops, or “filter bubbles,” of their own predilections.
Facebook did not respond to e-mails seeking comment….”
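The feedback loop Mr. Pariser describes can be sketched in a few lines of code. This is purely an illustrative toy model, not any platform's actual algorithm: it assumes a per-source engagement score that decays over time, gets boosted by clicks, and drops a source from the feed once its score falls below a cutoff. All function names, parameters, and thresholds here are invented for the sketch.

```python
# Toy model of an engagement-driven "filter bubble": sources a user
# rarely clicks are scored down and eventually vanish from the feed.
# decay, boost, and cutoff are arbitrary illustrative values.

def update_feed(scores, clicks, shown, decay=0.9, boost=0.5, cutoff=0.2):
    """Return updated engagement scores and the sources still shown."""
    new_scores = {}
    for source, score in scores.items():
        score *= decay                 # old interest fades each cycle
        if source in clicks:
            score += boost             # clicking reinforces a source
        new_scores[source] = score
    # only sources above the cutoff survive in the feed
    feed = [s for s in shown if new_scores[s] >= cutoff]
    return new_scores, feed

# A user who only ever clicks the like-minded source:
scores = {"like_minded": 1.0, "outlier": 1.0}
feed = ["like_minded", "outlier"]
for _ in range(20):
    scores, feed = update_feed(scores, clicks={"like_minded"}, shown=feed)

print(feed)  # the never-clicked outlier has fallen below the cutoff
```

Run for enough cycles, the outlier's score decays past the threshold and it disappears from the feed entirely, while the clicked source settles at a stable high score: the "filter bubble" emerges from nothing more than decay plus click reinforcement.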