Beware online “filter bubbles” | Eli Pariser

Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, “Why is this so important?” And Zuckerberg said, “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” And I want to talk about what a Web based on that idea of relevance might look like.

So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there’s this shift in how information is flowing online, and it’s invisible. And if we don’t pay attention to it, it could be a real problem.

So I first noticed this in a place I spend a lot of time — my Facebook page. I’m progressive, politically — big surprise — but I’ve always gone out of my way to meet conservatives. I like hearing what they’re thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.

So Facebook isn’t the only place that’s doing this kind of invisible, algorithmic editing of the Web. Google’s doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you’re logged out, one engineer told me, there are 57 signals that Google looks at — everything from what kind of computer you’re on to what kind of browser you’re using to where you’re located — that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.

And you know, the funny thing about this is that it’s hard to see. You can’t see how different your search results are from anyone else’s. But a couple of weeks ago, I asked a bunch of friends to Google “Egypt” and to send me screen shots of what they got. So here’s my friend Scott’s screen shot. And here’s my friend Daniel’s screen shot. When you put them side-by-side, you don’t even have to read the links to see how different these two pages are. But when you do read the links, it’s really quite remarkable. Daniel didn’t get anything about the protests in Egypt at all in his first page of Google results. Scott’s results were full of them. And this was the big story of the day at that time. That’s how different these results are becoming.

So it’s not just Google and Facebook either. This is something that’s sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized — different people get different things. Huffington Post, the Washington Post, the New York Times — all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, “It will be very hard for people to watch or consume something that has not in some sense been tailored for them.” So I do think this is a problem.
And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out.

So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So “Iron Man” zips right out, and “Waiting for Superman” can wait for a really long time. What they discovered was that in our Netflix queues there’s this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched “Rashomon,” but right now we want to watch “Ace Ventura” for the fourth time. (Laughter) So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they’re mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.

What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society — this is how the founding mythology goes — in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that’s not actually what’s happening right now. What we’re seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important — this is what TED does — other points of view.

And the thing is, we’ve actually been here before as a society. In 1915, it’s not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn’t have a functioning democracy if citizens didn’t get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn’t perfect, but it got us through the last century. And so now, we’re kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they’re writing. I know that there are a lot of people here from Facebook and from Google — Larry and Sergey — people who have helped build the Web as it is, and I’m grateful for that.
But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they’re transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn’t. Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it’s not going to do that if it leaves us all isolated in a Web of one. Thank you. (Applause)
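
The feed filtering Pariser describes is, at its core, engagement-weighted ranking: the system tallies what you click and quietly drops the sources that score low. The toy sketch below is one way such a filter could work; it is an illustration, not Facebook's actual algorithm, and every function and field name in it is hypothetical.

```python
# Toy illustration of click-weighted feed filtering (not any real platform's code):
# stories from sources you rarely click simply fall below the cut, unannounced.

from collections import Counter

def rank_feed(stories, click_history, keep_top=10):
    """Rank stories by how often the user has clicked each source before.

    stories       -- list of dicts like {"source": "alice", "title": "..."}
    click_history -- list of source names the user clicked in the past
    keep_top      -- how many stories survive the cut
    """
    clicks_per_source = Counter(click_history)
    scored = sorted(
        stories,
        key=lambda s: clicks_per_source[s["source"]],
        reverse=True,
    )
    # Everything below the cut-off never appears -- the user is never told
    # what was edited out.
    return scored[:keep_top]

if __name__ == "__main__":
    history = ["alice"] * 9 + ["bob"]  # you click Alice's links far more often
    feed = [
        {"source": "alice", "title": "Story A"},
        {"source": "bob", "title": "Story B"},
        {"source": "carol", "title": "Story C"},  # never clicked: likely filtered out
    ]
    for story in rank_feed(feed, history, keep_top=2):
        print(story["title"])  # prints Story A, then Story B; Story C is gone
```

Nothing in the sketch is malicious; it simply optimizes for clicks, and the narrowing Pariser warns about falls out of that objective on its own.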


100 thoughts on “Beware online “filter bubbles” | Eli Pariser”

  1. I agree entirely with everything he said in this presentation. To my knowledge, however, I have yet to see any improvement on the problems he pointed out. 

  2. Reminds me of the Big Brother-ly advertising scenario in the film "Minority Report," where the environment scanned you and targeted you with specific products they knew you wanted–and the government scanned people's retinas on public transportation in order to watch you. But I'm not paranoid.

  3. I do think these filters are "innocent", in that they optimise the likelihood of people clicking a link by showing them what they want, which maximises ad revenue. But he's right: it's also important that we know what those filters are and have the ability to disable them should we choose to. This video is quite old now, though; has this improved? 

  4. So did Larry respond already?
    This video also got a mention here: https://www.joomag.com/magazine/tvp-magazine-issue-no-19/0982738001429395718

  5. It's very interesting to see the implications of the filter bubble, when at times the conversation has been about the filter saving us from 'information overload'.

  6. Personally, I believe that if a user trains personalised Google search the right way, the results will be more accurate and relevant to their personal preferences. On the other hand, Google is not that intelligent, and we cannot be sure we always give it the right information. This can result in some ambiguous and misleading search results.

  7. It's scary how activism feeds itself with more of the articles that person supposedly should see. It blocks out alternative views, making some people extremists.

  8. Indeed, very progressive politically, when you start your presentation with another stereotype: people dying in Africa.

  9. How do I get some of these presentations to my elected representatives: Senators, Representatives, State Senators, State Representatives? Also, how do I get it to the FTC and FCC?

  10. I'll be honest with you, normally I make fun of TED talks. Most of them are goofy mental masturbation. But this guy was and is 100% correct. It's 2015; look at the recommended videos on your YouTube page… it's all the same stuff, it's all about the same stuff, sometimes it's even from the same people. Just whatever you've been clicking lately, that's what's in there. So you click the stuff that you click, and the stuff that you click and stuff like the stuff you click is your entire digital window to the world. As someone who was here for the earliest days of YouTube, it's shocking to see how far this is going, because that's not at all how it used to be. I remember this talk in 2011 and it wasn't as bad. This guy pretty much predicted exactly what would happen, and it's all happened in record time. I remember clicking through all sorts of diverse links online, getting agreeable and disagreeable stuff, and choosing what I liked. I found stuff that challenged me, stuff that I hated, stuff that I loved, all sorts of stuff. And it all shaped me from childhood (Internet pre-YouTube), through my teen years, and now into adulthood. I last watched this vid 4 years ago. In 4 years I'll be 30. I wonder what the Internet will look like then? Because this trend does not look like it's going anywhere any time soon. It's eerie listening to him talk to the people from Google who were in that crowd. I guess it was completely lost on them. That's really unfortunate, because I miss that 90s optimism about the Internet and its potential.

  11. Why would curated filtering even be a thing in the first place? The internet isn't primarily for entertainment. Or at least it shouldn't be… I could see something like this being necessary for YouTube (maybe), but not Google's search engine. Google says "don't be evil"… yet this is one of the most evil things you could do.

  12. I think this is why there's an increasing divide in our politics. Everyone is spending more time online, and they're spending more time in their bubbles of news that cater to their opinions. People become more opinionated and emotional when they're met with someone who doesn't agree with them on a more mainstream platform (like YouTube), and people viciously fight, then retreat back into their news bubbles, and the cycle repeats itself. Google is doing this knowingly, but I can't put my finger on why. This is what happens when you have customers (in this case, advertisers) who are fighting for the spot on the front page of a Google search, I guess. In the case of Facebook, I guess that explains why they were valued so highly during their IPO. Advertisers and sellers of products want Facebook to find their markets for them. They want to sell things to their customers – customers who are people who might not even be aware themselves that they are a customer of so-and-so company. Same thing for YouTube.

    All of this is a result of greed for user information, with advertisers buying and Google/Facebook selling that ad space.

  13. This is a pretty rosy view all in all, as if actual censorship and manipulation didn't come into this as well.
    Still, I appreciate it as a beginner's piece for drawing attention to this.

  14. This is a disturbing trend in the "legacy" media's quest to try to maintain ad revenues.

  15. The artificial intelligence responsible for Facebook's News Feed is now on Facebook 😀
    https://www.facebook.com/AlgoTheAlgorithm/

  16. This is why Trump is likely to be the next president of the US.
    This is why radicalisation all over the world has increased enormously in the last ten years: in Europe, North America, the Middle East and even in Asia.

  17. Trump did deserve his presidency. America would have been worse under progressive leadership. It's high time to address social problems.

  18. The filter bubbles are absolutely annoying. They have prevented me from seeing new ideas and other points of view. I think every tech company that offers this kind of service should give users the right to decide whether to use it or not. I desperately want to explore the 'real world' around me.

  19. Why don't they just add an option to view whatever was excluded from your feed by the algorithm… so every now and then you could take a look at it and be happy…?

  20. I used to learn so much from spending time on the internet. Now it seems all I'm doing is just wasting my time.

  21. 7:00 minutes in: "This is what TED does, right? Other points of view."
    I'm sorry to say, but TED is one of the biggest circlejerks on the internet; every single talk is basically the same.

  22. https://www.quora.com/If-modern-American-far-right-Republicans-are-so-wrong-why-do-they-have-so-much-power/answer/Frank-Rizzo-70

  23. What's really fascinating is that none of the people I share it with has the same top comments underneath the video.

  24. Media is infotainment. US and UK news TV 'competitors' all sing from the same hymn book, every day. Coincidence? Framing?

  25. Social psych evidence shows that exposure to out-groups can make us more sympathetic or it can further polarize us, depending upon the quality of the interaction. So it's still a crapshoot, and all you need is one troll to evoke reactance and sabotage the enterprise. Another idea is to modify something like the Facebook News Feed to offer expandable access to a few "Counterpoint" links alongside the main links. For example, a Breitbart or HuffPo article has a little symbol to the right (e.g. ">>") that you can click on, and it expands to show 2-3 articles/resources that counter the main article (if they exist). The main problem with that is creating 'equal time' for potentially crank ideas alongside established ones, but this might be mitigated by making the algorithm sort by volume of supporting research. And at least the second way avoids trolls. (A rough sketch of this idea follows below.)
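
The following is a rough sketch of that "Counterpoint" idea, assuming a hypothetical local index of articles and using a crude citation count as the "volume of supporting research"; none of it reflects any real Facebook feature.

```python
# Rough sketch of the "Counterpoint" expander described in the comment above.
# The Article type, the index, and citation_count are hypothetical stand-ins.

from dataclasses import dataclass

@dataclass
class Article:
    title: str
    stance: str          # e.g. "pro" or "con" on some topic
    citation_count: int  # crude proxy for "volume of supporting research"

def counterpoints(article, index, limit=3):
    """Return up to `limit` articles with the opposite stance,
    strongest-sourced first, to show behind a '>>' expander."""
    opposing = [a for a in index if a.stance != article.stance]
    opposing.sort(key=lambda a: a.citation_count, reverse=True)
    return opposing[:limit]

if __name__ == "__main__":
    index = [
        Article("Opposing view, well sourced", "con", 42),
        Article("Opposing view, thinly sourced", "con", 2),
        Article("Agreeing view", "pro", 17),
    ]
    main = Article("Main story", "pro", 10)
    for a in counterpoints(main, index):
        print(a.title)  # best-sourced opposing pieces first
```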

  26. TED shows other points of view? Ask Sarah Silverman, and a few others they basically tried to write off for what they talked about.

  27. You are all surely right: connecting to global information is a problem, because of the filter bubble.

  28. Maybe YouTube could show an indicator if it applies any filter to the searches we make, whether in YouTube, Google, or anywhere else in the Google universe. There is probably no perfect algorithm that works for everyone all the time. I might want to have a filter bubble when I am looking for, e.g., movies: I don't want to know about movies outside my favorite genres. However, I may want to know about everything when I look for news in my region, so I can be aware of things and have the Internet expand my horizon rather than narrow it.

    I should be able to control, and be aware of, the information I receive, so I can make a judgment about its balance, rather than have built-in code that standardizes one particular conception of balance.

    It would be good if Google could say, "Google applied x, y, z filters to present you with this search result." Then I could see: hmm, it says it filtered my results based on my searches from the past week, so if I turn that off, would I get something else interesting? Something like a light bulb could appear at the corner of the search results, and then the list of applied filters could be turned on or off as desired.

    Google collecting data is not the problem (in this matter); which set of filters gets applied, whether I am aware of them, and whether I can control each individual filter are the things that can change the search experience. I am pretty sure Google search could offer this functionality. (A rough sketch of the idea follows this comment.)
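
Here is a rough sketch of the filter-transparency idea in the comment above: every personalization filter is a named, individually switchable step, and each result reports exactly which filters ran. The filter names and the pipeline are hypothetical illustrations, not how Google actually works.

```python
# Sketch of a transparent, user-controllable personalization pipeline.
# The filters below ("recent_history", "location") are made-up examples.

def recent_history_filter(results, profile):
    """Boost results matching topics the user searched for this past week."""
    recent = set(profile.get("recent_topics", []))
    return sorted(results, key=lambda r: r["topic"] in recent, reverse=True)

def location_filter(results, profile):
    """Prefer results from the user's own region."""
    region = profile.get("region")
    return sorted(results, key=lambda r: r.get("region") == region, reverse=True)

FILTERS = {
    "recent_history": recent_history_filter,
    "location": location_filter,
}

def search(results, profile, disabled=()):
    """Apply every enabled filter and report which ones were applied."""
    applied = []
    for name, apply_filter in FILTERS.items():
        if name in disabled:
            continue  # the user switched this filter off
        results = apply_filter(results, profile)
        applied.append(name)
    return {"results": results, "filters_applied": applied}

if __name__ == "__main__":
    raw = [
        {"title": "Local story", "topic": "politics", "region": "EU"},
        {"title": "Global story", "topic": "science", "region": "US"},
    ]
    profile = {"recent_topics": ["science"], "region": "EU"}
    out = search(raw, profile, disabled=("location",))  # user turned one filter off
    print(out["filters_applied"])                       # ['recent_history']
    print([r["title"] for r in out["results"]])
```

The "light bulb" UI the commenter imagines would simply surface `filters_applied` next to the results and let the user move names in and out of `disabled`.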

  29. That is 100% correct. This talk is already 6 years old, and things have actually only got worse. It is much harder now to find information that doesn't come from mainstream media. And the fight over fake news suggests that it will now be very, very hard to find "alternative" sources of information!

  30. You wonder why the leftist liberal democrats are steadily losing their minds, going bonkers, getting violent, snapping and killing people. Think about all of the constant hatred and calls for violence, etc., that come up in their news feeds, videos, and websites that are specifically targeted to the liberal left. They are literally being brainwashed, influenced and encouraged to violence to such a degree by a corrupt one-sided agenda that is being spoon-fed to them online.
    Our thoughts are not our own!!! We are all being subconsciously controlled and manipulated, and it is not being done by some random chance of the algorithms; it is being done with a specific agenda behind the information that you are being fed.
    (As a perfect example, just look at how extremely limited, one-sided or controlled your Facebook news feed is now compared to what it used to be when you first joined Facebook.)
    When you stop and think about it, you can almost see that these people are quite literally being stirred to madness by the global elite, who can control or steer these ever-narrowing algorithms or thought bubbles.

  31. Typical Utopian "progressive" perspective. The goal of the (DARPA-developed) "internet" wasn't to "bring the world together," kid. It was to take "brainwashing" and "centralized control" to a higher level, into ever fewer hands, which is precisely what loyal servants like public figurehead CEOs have done almost worldwide, for their masters. (And why controlled puppets like O' bummer "signed over" control of the internet to his controllers at the UN, on his way out…)

  32. I tried so hard and got so far, but in the end it doesn't even matter. I had to fall to lose it all, but in the end it doesn't even matter

  33. This is why, after watching this video, YouTube suggests some TED videos in the sidebar, in between soccer goals, recipes and guitar videos…

  34. So someone needs to create an app which produces daily (or weekly, whatever) search lists designed to manipulate the robot gatekeepers… you don't actually have to look at what is being searched, but in theory it should change your "bubble" (in a good way).

  35. The most important thing is: WHO decides what is important and what is not? Zuckerberg? A bureaucrat in the UN or the EU, or some other politician? Then we're back to gatekeepers…

  36. #FutureNet is the answer because it is the next #Facebook that is UNCENSORED and pays the people in #Bitcoin! Join us for free at https://lightleader.futurenet.club

  37. It is so true. But what can computer engineers do? They depend entirely on big companies that survive just by selling information to private buyers. For this whole vicious cycle to collapse, we would have to revolutionize the economic system; it's the only way.

  38. How about giving the user an off switch to remove algorithms and filters, so we can be delivered an unassuming, unbiased data experience, like the old days? It should be easy to do; please don't make us pay for it.

  39. YouTube does this too, even with the comments. The YT comments look entirely different depending on who is logged on.

  40. I solved this problem by switching to the DuckDuckGo search engine (which presents every user with the same search results, and does not track individuals—by design), and soon I will switch to Signal for my non-iMessage social network chatting and away from Facebook Messenger. I have never used WhatsApp and don't use Instagram. I stopped using Facebook, the website and app, long ago: it was a happiness drain. Mark Zuckerberg is too calculating an individual to run a service I use, tbh. He doesn't care about people on a personal level. That much is clear when you observe how he runs his business. Judge them by what they do, not what they say. He's been apologising for invading people's privacy for a decade. He either plays dumb or says sorry and then the following year does something worse.

  41. Although I agree with some of these sentiments, isn't it inherently undemocratic and condescending to assume that giving people what they want is bad because people don't know how to regulate themselves? I mean, that's true. But concluding that Google should be in charge of showing us what we "ought" to be viewing sounds a little totalitarian.

  42. This is how reality has always worked. One person has one reality another person has another reality. "Your reality does not create your point of view; your point of view creates your reality." So one person who always asks, "wow how does it get even better than this?" even when things may appear to be going wrong for another person, has their life in all areas keep getting better and better, but someone who is saying, "Damn this is terrible," for some very insignificant thing that any of us could step over and not notice, will start having more and more rugs pulled out from under him/her. Welcome to the matrix; time to start creating the reality you desire and require and stop buying all the lies of other people's realities. You don't have to be causally incarcerated into other people's causal realities. How does it get even better than that?

  43. I think there seem to be few or no comments against what he said because Google's algorithm works as a filter on YouTube too. This may make people think his opinion is definitely right.

  44. If you search for foreign affairs or history, do so in the original language.

    Compared with your own results, the bias will be obvious.

  45. Recently I have been noticing that even the TED social media page on Facebook posts talks related to topics I have typed in WhatsApp private chats, on Twitter, etc. Is it coincidence or the algorithm? It's scary. 😐

  46. Algorithms are a counterfeit for the Law of Attraction… All that is has a man-made counterfeit version… all the controversy that stems from the new intelligence is a normal response. It's how the kinks are being worked out… we are the generations that are part of the experience of the shift… Hold on tight as we take flight.

  47. How is this video worth a standing ovation lol. This entire video can be summarized in 3 or 4 sentences. TED videos are just progressive propaganda. We now know that Google and friends cover up right-wing results on search engines very frequently. Just pick a dozen political topics, and the top results are always left-leaning sources such as the Washington Post or CNN.

  48. This idea that companies such as Google and Yahoo News will change their algorithms to reflect ethics is ridiculously naive and unrealistic, because as long as corporations are driven by profit, these "ethical algorithms" will interfere with that profit and thus will never be implemented. Things will not change until there is a total upheaval of the Western value system.

  49. Extraterrestrial Composition 👽 🎶
    https://chrome.google.com/webstore/detail/threelly-ai-for-youtube/dfohlnjmjiipcppekkbhbabjbnikkibo

  50. There's a startup working on addressing these issues for the news space. They expose a diversity of opinion from across the globe and, most importantly, show the same opinions to everyone. The idea being that you see the facts and opinions, then make up your own mind (not have the algorithm pick for you!!)

    They're called Scope
    https://scopenews.co.uk/get

    I find them pretty good, might be of interest to some of you 🙂

  51. Well, you could turn off personalization in a search engine like https://www.ecosia.org. That would avoid the filter bubble situation.

  52. Also Instagram. I hate that I can't find stuff that I'm looking for, instead getting more and more recommendations based on what I already have.
