Many Americans get their news through social media. But the stories that surface in our Facebook feeds are heavily curated by an unseen force: an algorithm that purports to select the most relevant stories and postings for each user. In contrast, Twitter users access an unfiltered stream of postings by people in their network. But recent comments by company executives hinted that could soon change. Tech Tuesday examines how algorithmic filtering works, and why some people worry it is negatively impacting social media users' access to information.
MR. KOJO NNAMDI: From WAMU 88.5 at American University in Washington, welcome to "The Kojo Nnamdi Show," connecting your neighborhood with the world. Later in the broadcast, a high-profile hack of celebrities' racy images has many people wondering just how secure cloud storage services are. We'll get some tips for protecting your content online. It's "Tech Tuesday." First, unseen forces influencing what you see on social media. Millions of Americans get their news from Facebook.
MR. KOJO NNAMDI: They use the site's front page to figure out what's going on in the world and what's happening to people in their social network. But Facebook's news feed may not be serving up as much news as you think it does. The social network uses a computer program to comb through all the postings from your friends and select the stories it thinks are most relevant to you. It's called algorithmic filtering. And some people think it's having a negative impact on social media users by narrowing the news and views we see.
MR. KOJO NNAMDI: So they were especially alarmed when an executive recently seemed to indicate that Twitter could introduce filtering. Joining us to talk about this is Zeynep Tufekci. She's a professor in the School of Information at the University of North Carolina and a faculty associate at the Harvard Berkman Center for Internet and Society. Zeynep Tufekci joins us from studios at UNC Chapel Hill. Thank you for joining us.
MS. ZEYNEP TUFEKCI: Thank you for inviting me.
NNAMDI: You, too, can join the conversation. Give us a call at 800-433-8850. How would you assess the user experience on sites like Facebook? Do you think the stories on your news page are giving you an accurate view of the news of the day? 800-433-8850. You can send email to kojo@wamu.org. Shoot us a tweet @kojoshow using the hashtag #TechTuesday. Or go to our website, kojoshow.org, and join the conversation there.
NNAMDI: Zeynep Tufekci, the average Facebook user has about 200 friends, according to the Pew Research Center, but that does not mean that we actually see postings from 200 people on our news feeds. In fact, Facebook employs a computer program to comb through all that user data and bring a set of stories to our front page. What do we know about how this algorithm works?
TUFEKCI: Right. So, the algorithm is not released. So what I'm going to tell you is what we think we know by trying to reverse engineer it. The things we've been able to tell are that it allows things that are commented upon or liked to rise to the surface. It favors things that contain pictures or links. And also, very interestingly, things that have the word congratulations in the comments, right? So, if somebody says, here, I'm getting engaged or I'm having a baby.
TUFEKCI: And people start congratulating them. The word congratulations seems to trigger the algorithm to put that on top of everybody's feeds. So if you're wondering why your Facebook is full of baby news and engagements and marriages, that's partly why. And recently, my friends who've learned of this have started hacking this. One of them was trying to sell her camera equipment and she just went and said, can you all congratulate me so that, you know, my camera equipment sale is visible.
TUFEKCI: And we tested this, and sure enough, it was on top of a lot of people's news feeds. So, it's really -- it's opaque. It's gameable. And again, remember, this is an ad-based platform, so it's also kind of tailored to make things easier to sell ads and also to show you content that Facebook thinks is going to keep you coming back and clicking on like.
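To make the mechanics Tufekci describes concrete, here is a minimal sketch of engagement-based feed ranking in Python. The signals mirror what she reports from reverse engineering -- likes, comments, pictures, links, and the word congratulations -- but every weight and field name is invented for illustration; Facebook's actual algorithm is not public.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    likes: int = 0
    comments: list = field(default_factory=list)
    has_photo: bool = False
    has_link: bool = False

def engagement_score(post: Post) -> float:
    """Score a post from engagement signals. All weights are hypothetical."""
    score = 1.0 * post.likes + 2.0 * len(post.comments)
    if post.has_photo:
        score += 5.0          # pictures seem to be favored
    if post.has_link:
        score += 3.0          # so do links
    # The "congratulations" trigger Tufekci describes: congratulatory
    # comments push a post to the top of everybody's feeds.
    if any("congratulations" in c.lower() for c in post.comments):
        score += 20.0
    return score

def rank_feed(posts: list) -> list:
    """Order the feed by score rather than chronology."""
    return sorted(posts, key=engagement_score, reverse=True)
```

This is also why the camera-sale trick works: a few friends commenting "congratulations" flips the keyword bonus, and the post outranks everything sitting lower in plain chronology.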
NNAMDI: Last week, news stories indicated that Twitter was considering its own foray into filtering, which set off a chorus of outrage among many Twitter users. Since its inception, Twitter has been unfiltered, meaning it reads as a sort of chronological, objective document of news as it breaks. Would filtering change the nature of Twitter?
TUFEKCI: Sure. It would. Right now, it's not true that there is no filtering. There is filtering in the sense that the people I choose to follow on Twitter filter for me, right, by retweeting or by mentioning other things. So it's kind of like this human network that I've chosen to follow that shows me, here, this is what I think is important. And I get to choose whether I like what these people are choosing or not. Now, if a computer program started doing that, I really don't know whether it would pick the kinds of things that my friends and the people I follow would pick.
TUFEKCI: And I also don't know if it would just pick a few very popular things. You know, my Twitter feed right now has very little, you know, fluff, or Kardashians, or other things. Maybe the algorithm might think, hey look, all these other people are clicking on that, and start showing me those things, which is why I kind of worry. There's a way in which there's a human intelligence at work on Twitter, which makes it kind of cumbersome. That's why they're trying to change the algorithm, I think.
TUFEKCI: Because for a new user, it's a little confusing. But it's very rewarding after you've put in the labor to create a list of people whose judgment you trust. But of course, being a company, they're thinking, how do we make this quicker and faster and get a lot more engagement? And what Facebook does is say, we know better what you want. Here you go. And that kind of works in some way, in that you get stuff that other people liked and commented on, or that has pictures or congratulations. But who knows what you're missing? Maybe you're missing...
NNAMDI: Has there been any survey conducted, at all, about how people respond to what we might be missing? How people respond to the knowledge that they're not necessarily seeing everything that their friends are posting, which, of course, would be a whole lot of information.
TUFEKCI: Yes. Yes.
NNAMDI: But, yes. There have?
TUFEKCI: People don't even know. No. Two friends of mine, Christian Sandvig and Karrie Karahalios, did a study. They were trying to test just this. And what they found was that 62 percent of the people didn't even know, didn't even realize, that Facebook was algorithmically curating their feed. For most people, they log on and, here it is. You know, this is what my friends said. And they're not really giving it a lot of thought. Algorithms are really interesting because they're invisible, right?
TUFEKCI: So Facebook doesn't advertise the fact that it's curating. It doesn't advertise the fact that it's hiding things from you. You have this thing that kind of appears, almost like magic. Here you go. And I think what has happened is, as people learn, then they might start looking and thinking, well, what is it that I'm missing? So I sometimes do it. I sometimes go and check through my Facebook feed as much as I can, you know, in chronological order. But even I found that it doesn't show me everything.
TUFEKCI: And on Facebook, one of the downsides, I think, of algorithmic curation is that the only signal you can send is to like something. And we saw this, I think, in the Ferguson news. How do you like a news story like that?
NNAMDI: I was about to get to that. The Ferguson...
TUFEKCI: Yeah, I dislike it. You know? So I couldn't really say -- it's hard for us to tell Facebook, hey, I don't like this, but it's important. Right? So Facebook, I think, because it only allows like as a signal, seems to be surfacing things that are happy -- I'm guessing, because once again, they're not showing us the data. They're not allowing us control, which I would really like. And my Facebook feed has a large number of people, partly because of my age, of course, who are announcing babies.
TUFEKCI: Because everybody's congratulating them, and then Facebook is showing it to more people. And the more people that see it, it becomes this, you know, feedback cycle. The more people see it, the more people comment. And the more Facebook decides, oh, I'm gonna show it to even more people. Whereas maybe, if there's some more sad news that people aren't necessarily clicking like on, I don't even know -- you know, I don't know what I'm not seeing.
NNAMDI: In case you're just joining us, we're talking with Zeynep Tufekci. She's a professor in the School of Information at the University of North Carolina and a faculty associate at the Harvard Berkman Center for Internet and Society. We're taking your comments and questions at 800-433-8850. Do you trust the algorithms used by Facebook and Google to accurately guess what you want when you log in? 800-433-8850. You can send us email to kojo@wamu.org. Zeynep, I'm so glad you mentioned the issue of congratulations and how it tends to attract attention.
NNAMDI: This summer, there were two huge stories that caught fire, first on social media, then ended up becoming huge stories across traditional media. You mentioned Ferguson, Missouri, the shooting of Michael Brown, an unarmed black teenager, which sparked outrage and then days of unrest. Also, the ALS ice bucket challenge encouraged people to dump ice water on themselves to raise money to fight a horrible disease. And some people began to notice a curious pattern on their social media accounts.
NNAMDI: The ice bucket challenge was blowing up on Facebook while everyone on Twitter was talking about Ferguson. Was this algorithmic filtering at work?
TUFEKCI: I suspect very much so. Again, since Facebook doesn't tell us, we really don't know, but I really suspect so. Because the ALS ice bucket challenge is practically made for Facebook's algorithm. It's got a video, which, as far as we can tell, the Facebook algorithm seems to prioritize. You tag other people, you know, you say, here, I pass this challenge on to you. And, you know, since people are trying to raise money to fight a horrible disease, everybody says, oh, congratulations. Great, good for you. So it draws a lot of comments.
TUFEKCI: So it's just the kind of thing that is designed, almost, to be prioritized by Facebook's algorithm. And I had the same experience. You know, I've had post after post after post on the ALS ice bucket challenge. And also, Facebook keeps showing the same posts to me, again and again. You know, more people would see it and more people would comment. So I've still got, you know, ALS ice bucket challenge posts from a week ago showing up on my feed. And it just perpetuates this.
TUFEKCI: Whereas in the early days of the demonstrations, my own Twitter feed -- the people I follow, which has a lot of overlap with my Facebook friends, although, of course, the groups differ in numbers -- was very heavily geared towards Ferguson. And Ferguson didn't surface on my Facebook page until the next day or two, and then I started seeing more posts. And still, they were greatly outnumbered by the ALS ice bucket challenge.
TUFEKCI: So it's probably partially a function of who posts what where. But also, since you don't really know who's gonna see what you post to Facebook, one might be more reluctant. The algorithm changes our behavior, too. It doesn't just filter what we see. It kind of encourages and discourages us, you know, to take the time to post here or there, because you don't know if your friends are gonna see it. But I do want to say, if you go on Facebook and post something and none of your friends comment, and you're thinking, why aren't they saying something?
TUFEKCI: Maybe they're not seeing it at all. It's totally possible. So this is this great new divergence. Instead of an editor, like in traditional newspapers, or the people you choose to follow -- human intelligence -- on Twitter, on Facebook it's an opaque algorithm whose functioning is a secret that's deciding what's important and what's not. In some ways, it makes life easier, because you don't have to make decisions. In other ways, you have a lot less control and you don't really know what's going on.
NNAMDI: We got a tweet from Howard who says, I've got an idea. Why can't Twitter let us pick our algorithms? A filter might be nice if we could enable it as we choose. Which brings me to the issue of transparency. Facebook caught a lot of heat earlier this year when it was revealed that it had intentionally manipulated the news feeds of some users as part of a research project. That raised some thorny questions about consent and research ethics. But it also highlighted how little we know about how the algorithms work. So when Howard says, let us pick our own algorithms, what would or what should transparency look like here?
TUFEKCI: I absolutely agree. I mean, algorithms aren't some evil demons, right? They're computer programs. If you have an ABS brake system in your car, you've got an algorithm figuring out how best to brake. So they can be quite useful. And on Facebook, obviously, if you have 200 friends and they're all posting three things a day, that's, you know, 600 right there. And a lot of people post more, so which ones is Facebook gonna show you? So there is a question of how do you prioritize. The thing, as you highlight, with the backlash to the study, is that, especially on Facebook, this is you and your friends and your family.
TUFEKCI: It's a couple hundred people that you feel close enough to mutually friend. So people do want more control and more transparency, as far as I can tell. And I think that's what the backlash to the experiment was about. There are a lot of ways there could be algorithmic choices that were transparent and that you could pick. And you could say, here, you know, this is how I want it to work, and you could go back and tweak it if it didn't work for you. And there would be a lot more transparent ways for you to go, you know, find what you were missing.
TUFEKCI: There are so many ways to do it, but I see no move from Facebook, which is kind of worrisome in and of itself. And again, remember, the platform lives by delivering ads to you. So you're not its actual customer. The people who are buying the ads are its actual customers. So I feel like, you know, how do we know the algorithm is tweaked to our benefit, and not to the delivery of ads? And now Twitter has, you know, potentially moved in this direction -- it's unclear -- and I think that's what caused the backlash, because, more and more, our online spaces are crucial to our civic spaces.
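Howard's suggestion, and Tufekci's call for user control, can be sketched concretely: a ranker whose weights are visible and adjustable, with plain chronology as an opt-out. Everything here -- the signal names, the default values, the interface -- is hypothetical; no major platform offered such controls at the time.

```python
from operator import itemgetter

# Weights the platform might ship by default. The point is that the user
# can inspect them, change them, or turn ranking off entirely. All names
# and values are invented for illustration.
DEFAULT_WEIGHTS = {"likes": 1.0, "comments": 2.0, "photo_bonus": 5.0}

def rank_feed(posts, weights=None):
    """Rank posts by user-chosen weights; weights=None means pure
    chronology, i.e. the classic unfiltered Twitter-style timeline."""
    if weights is None:
        return sorted(posts, key=itemgetter("timestamp"), reverse=True)
    def score(p):
        return (weights["likes"] * p["likes"]
                + weights["comments"] * p["comments"]
                + (weights["photo_bonus"] if p["has_photo"] else 0.0))
    return sorted(posts, key=score, reverse=True)

# A user tired of engagement bait could zero out the photo bonus...
my_weights = dict(DEFAULT_WEIGHTS, photo_bonus=0.0)
# ...or opt out of curation altogether:
# feed = rank_feed(posts, weights=None)
```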
TUFEKCI: You know, for Ferguson, it's civic. For the ALS ice bucket challenge, it's important. For, you know, our friends, our families -- and if you're an immigrant, your family is probably communicating with you on Facebook and other platforms. These are really integral, really important, both public and private spaces of the 21st century. And we've got this invisible layer of algorithms that are opaque, not transparent. We have no idea how they work. And these stand between us and our news.
TUFEKCI: They stand between us and our friends and family. And I think, given how much money these companies have made recently, how big they've grown, because they're so important to us, I think they have a moral obligation to say, wait, this algorithm isn't just for us to decide. This is something that's determining what people see of their friends. And I fear the day that, you know, somebody's gonna post a cryptic suicide note to Facebook and nobody's gonna think it's important. The algorithm's not gonna think it's important enough, because...
NNAMDI: 'Cause nobody's gonna say congratulations.
TUFEKCI: Nobody's gonna say congratulations, and maybe it's not gonna have the keywords, you know?
NNAMDI: Exactly.
TUFEKCI: You know, if you say, I'm going to, you know, commit suicide, I'm sure Facebook has keywords for that. I'm not saying they're irresponsible. But what if it's an opaque, roundabout thing only the friends would understand, but it's never shown? It's gonna happen.
NNAMDI: Here is Bob in Chillum, Maryland. Bob, you're on the air. Go ahead, please.
BOB: Hey. Good afternoon, folks. I want to say, kind of in a nutshell, that this is really a highly depressing conversation. And that what we're already seeing in our society is increasing polarization because of algorithms. People are only getting news that's tailored to agree with what they already believe. And it's heightening, exponentially, this degree of polarization in this country, to the degree that we're not able to do anything. And yet, more and more, we rely on this and these technologies. And at the very least, we need to stop referring to this as a platform for free speech, because it's obviously not. Anyway...
NNAMDI: You raise a fascinating point, Bob. Because, Zeynep Tufekci, for a number of years now, scholars have observed a tendency among us internet users to aggregate with people who see the world in ways similar to ours. Some have called this the filter bubble. Some have called it homophily. Do you think sites like Facebook, with their curated stories, make it worse, which is what Bob seems to be suggesting?
TUFEKCI: Well, we had recent research from my friend at Rutgers, Keith Hampton, whose previous research had found that social media increases political participation, but it seems to be dampening deliberation exactly for this reason: because of polarization, we are sort of afraid of the arguments. Now, this comes from a real human tendency, right? Wanting to go find people who think like you is not the fault of technology. This is a human trait.
TUFEKCI: But what happens online is this already existing human tendency to seek the comfort of agreement is made worse by the fact that the algorithm's gonna feed that sugar to you. It's going to show you things you clicked like on before, because they don't want you to get upset and storm off the platform. They want you to say, oh, here are things that I agree with. Now, an interesting twist to this would be, since we, as humans, have this tendency, if we had more control over our algorithms, maybe we could put them to work the other way.
TUFEKCI: I mean, if somebody offered me, here are 20 smart, awesome people who disagree completely with you and really make good points, I'd actually want to know. Right now, you know, it's not easy to find things to read that are good and smart and that disagree with you. Because, by definition, you think, I'm so smart, therefore if they disagree with me, they're not. Maybe our algorithms could be used to nudge us, push us in the opposite direction.
TUFEKCI: Almost like, sort of, nutrition labels, right, that you could look at and say, hey, this is good food for me, and this is not. But right now, because the internet is unfortunately based on ads, what they're asking is, how do we create the pleasant, mellow environment that makes us click on ads and buy stuff?
NNAMDI: Glad you...
TUFEKCI: Which is a pity, given the, you know, potential.
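Tufekci's "nudge in the opposite direction" could look something like the following re-ranker, which reserves a share of feed slots for the strongest posts from outside the reader's usual viewpoint. It is purely illustrative: the stance and quality labels assume data no real platform exposes.

```python
def diversified_feed(posts, my_stance, diversity_share=0.3):
    """Re-rank a feed so a fixed share of slots goes to the strongest
    posts that disagree with the reader. 'stance' and 'quality' are
    hypothetical labels invented for this sketch."""
    agree = sorted((p for p in posts if p["stance"] == my_stance),
                   key=lambda p: p["quality"], reverse=True)
    disagree = sorted((p for p in posts if p["stance"] != my_stance),
                      key=lambda p: p["quality"], reverse=True)
    # "20 smart, awesome people who disagree with you and make good
    # points": surface only the best of the opposing posts.
    n_disagree = max(1, int(len(posts) * diversity_share))
    return disagree[:n_disagree] + agree
```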
NNAMDI: Glad you made that point, because my final question is this: these algorithms are good at making a quantitative assessment of what's important. They can quickly identify the stories that are garnering the most clicks on the web and surface them on our news feeds. But what about important stories that cannot be quantified? Stories that have not been read by very many people, but should be. You say that these algorithms reward content that has already been rewarded.
TUFEKCI: Absolutely.
NNAMDI: What do you mean?
TUFEKCI: Absolutely. I mean, these are rich-get-richer systems. If you're already, sort of, on the rise, then you're gonna be seen by more people, and more people are gonna react to it. And then Facebook's gonna show it to more people. So if you take off, you're gonna dominate, like the ALS ice bucket challenge. Once something spikes, it can completely take over the feed, because, you know, it's just a feedback loop. On the other hand, if something doesn't spike, it can be completely buried.
TUFEKCI: So it increases the distance between the things that, you know, dominate the news and the things that maybe should have a chance, but never even had one. If you never get the chance, you just get buried very quickly. So there are very steep inequality regimes, so to speak. Which is unfortunate, once again, because it's an open system. It has the potential to do so much more, in terms of bringing important and essential news and information from each other to each other.
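The rich-get-richer dynamic she describes is easy to demonstrate with a toy simulation -- a Pólya-urn-style sketch, not any platform's actual code: each round, a post is shown with probability proportional to the engagement it already has, and being shown earns it more engagement. Every post starts equal, yet most runs end with one or two posts dominating and the rest buried.

```python
import random

def simulate_feed(n_posts=5, rounds=2000):
    """Toy rich-get-richer loop: visibility is proportional to past
    engagement, and visibility generates new engagement."""
    engagement = [1] * n_posts                 # every post starts equal
    for _ in range(rounds):
        # the feed shows one post, weighted by accumulated engagement
        shown = random.choices(range(n_posts), weights=engagement)[0]
        engagement[shown] += 1                 # being seen earns engagement
    return engagement

print(simulate_feed())  # e.g. [1204, 310, 402, 57, 32]: early leads snowball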
NNAMDI: Zeynep Tufekci is a professor in the School of Information at the University of North Carolina and a faculty associate with the Harvard Berkman Center for Internet and Society. Thank you so much for joining us.
TUFEKCI: Thank you for inviting me.
NNAMDI: We're gonna take a short break. When we come back, a high-profile hack of celebrities' racy images has many people wondering just how secure cloud storage services are. We'll get some tips for protecting your content online. It's "Tech Tuesday." I'm Kojo Nnamdi.