filter bubbles | TED

Dang good talk on the idea of filtering communities by someone I haven't yet heard of: Eli Pariser.  He identifies one of the key issues, which is whether "personalized" visibility of information can simply reinforce your own tribe's prejudices, interests, and paradigms.  Pariser perfectly illustrates how Facebook has been slowly increasing this "feature."  You might have 800 friends, but you're only seeing regular entries from a small portion of them in your news feed, based on Facebook tracking who and what you show interest in through your clicks.

A few things Pariser doesn't get quite right.  He describes the "early internet" as not really having gatekeepers, whereas now it does.  That's not true.  There's no such thing as "no gatekeeper" (read: no editor) in an overwhelming flood of information.  The early gatekeepers were simply the rudimentary keyword-matching algorithms and the like.  Anything below result #50 was still invisible, whether or not it was relevant.  And Google has never indexed more than a fraction of the web.  The difference is that the discussion of gatekeepers is only now starting to appear as we think about this; the gatekeeping itself has always been there.

So now the gatekeeper function is more personalized, which more closely models how we actually work.  We all tend to notice, read, and pay attention to things we already know about or agree with.  Further, generally speaking, we pay more attention to information from friends and community than from strangers.  Our lens is self-directed and relationally-directed.  So this "new" function is doing exactly what most new digital innovations have been doing over the last few years:  modeling behavior humans already had to begin with.

I share Eli Pariser's concerns on this exactly.  I just don't think the problem is new because of digital space; I think we're simply noticing it there.