Speaker: Eli Pariser
Rating 3 / 5
Internet algorithms (Facebook, Yahoo, Google) filter what we see based on what we have tended to click in the past. As a result we don't tend to hear different points of view; instead, what we already want to hear is simply reinforced. Ultimately this is bad for humanity and for democracy, which relies on people being exposed to a range of arguments.
His suggestion was for internet giants to be transparent about how they filter information, and to give users more control over what is filtered and prioritised for them.
A worrying look at our lives being tailored too much. We'll always be at the mercy of what we want to hear, without ever having our views challenged. I think we are at risk of this regardless of algorithmic bubbles, simply because of the sorts of communities we attach ourselves to. E.g. a conspiracy theorist would be a member of a variety of forums happy to reinforce each other's views, convincing themselves that all evidence disagreeing with them is a lie. In the end the conspiracy theorist and the sceptic would each be so convinced of their own position that they wouldn't even be capable of arguing their point. The same sort of dichotomy would exist in political circles, with people surrounding themselves with others holding similar views, then saying "This politician is useless – I don't know anyone who voted for him, he must have cheated".
Regardless, I support the idea of internet titans releasing details about how they filter. Is it based entirely on the popularity of things you've clicked on, or is there more tuning going on?
We should also occasionally be shown sites that challenge our views, so we don't end up with many unconnected internets.