Eli Pariser: Beware online “filter bubbles”

Speaker: Eli Pariser: Beware online “filter bubbles”

Length: 9:05

Summary

There is an invisible shift in how information flows online, and Eli Pariser wants us to be aware of it. The web now adapts to the specific user. Eli first noticed this automatic filtering in his own Facebook news feed. He is politically progressive, and he noticed that he was seeing fewer and fewer of the conservative links posted by his Facebook friends. Facebook had worked out that Eli clicked liberal links more often than conservative ones, and so it hid the conservative links. This invisible, algorithmic editing is used by nearly all major sources of news and information. Google now uses 57 different signals, ranging from your geographic location to your age and ethnicity, to determine your search results. Yahoo News and the Huffington Post have also begun to personalize their information. The information I get is no longer the information you get.
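The kind of filtering Pariser describes can be illustrated with a toy model. This is only a sketch of the general idea, not Facebook's or Google's actual algorithm; the source names and the simple click-count scoring are illustrative assumptions:

```python
from collections import Counter

def rank_links(links, click_history):
    """Toy personalization: score each link by how often the user
    has clicked links from the same source in the past."""
    clicks = Counter(click_history)  # source -> past click count
    # Links from frequently clicked sources rise to the top;
    # rarely clicked sources sink out of sight.
    return sorted(links, key=lambda link: clicks[link["source"]], reverse=True)

# A user who has mostly clicked liberal sources...
history = ["liberal_blog", "liberal_blog", "liberal_blog", "conservative_blog"]
feed = [
    {"title": "Tax cuts explained", "source": "conservative_blog"},
    {"title": "Climate policy update", "source": "liberal_blog"},
]
# ...sees liberal content ranked first, and the conservative link sinks.
print([item["source"] for item in rank_links(feed, history)])
# → ['liberal_blog', 'conservative_blog']
```

Even this crude feedback loop shows the bubble forming: every click makes the favoured source more visible, which invites more clicks on it.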

The problem, Eli says, is that while the Internet shows us what we want to see, that is not necessarily what we need to see. He calls this a filter bubble: a bubble of your own unique information, in which you can’t see what doesn’t get in. When the Internet was created, it was seen as a release from the human gatekeepers who controlled and edited the information you saw. In reality, those human gatekeepers have simply been replaced with algorithmic ones.

These algorithms have been feeding us a steady diet of relevant information. But what we need is a balanced diet that also includes information that is uncomfortable, challenging, and important. Eli wants this to change. He wants algorithms with a sense of public life and civic responsibility encoded into them, algorithms that let us see what doesn’t get through. This is the key to unlocking the full potential of the Internet: it should introduce us to new ideas, new people, and different perspectives.


Online Filter Bubbles

Speaker: Eli Pariser

Length: 9:05

Rating: 3 / 5

Summary

Internet algorithms (Facebook, Yahoo, Google) are filtering what we see based on what we have tended to click in the past. As a result, we rarely hear different points of view; the filters just reinforce what we already want to hear. Ultimately this is bad for humanity and for democracy, which relies on exposure to a range of arguments.

His suggestion was for the internet giants to be transparent about how they filter information, and to give users more control over what is filtered and prioritised for them.

Critique

A worrying look at our lives being tailored too much: we’ll always be at the mercy of what we want to hear, without ever having our views challenged. I think we are at risk of this regardless of algorithmic bubbles, simply because of the sorts of communities we attach ourselves to. E.g. a conspiracy theorist will join a variety of forums happy to reinforce each other’s views, convincing themselves that all evidence to the contrary is a lie. In the end, the conspiracy theorist and the sceptic are each so convinced of their own position that they can no longer even debate one another. The same sort of dichotomy exists in political circles, with people surrounding themselves with others who hold similar views, then saying, “This politician is useless. I don’t know anyone who voted for him; he must have cheated.”

Regardless, I support the idea of internet titans releasing details about how they filter. Is ranking based entirely on the popularity of things you’ve clicked on, or is there more tuning going on?

We should also occasionally be shown sites that challenge our views, so we don’t end up with many small, unconnected internets.