Join us for our weekly review of the politics, policies, and personalities of the District of Columbia, Maryland and Virginia.
Online platforms like Facebook and Twitter gave people a place to connect and share ideas. And because they reach global audiences, the private companies that run them are at the center of an international debate over competing conceptions of free speech. Kojo looks at how these tech companies are shaping their content policies, and what those policies mean for interpretations of free speech around the globe.
- Jeffrey Rosen, professor of law, George Washington University Law School; legal affairs editor, The New Republic; president, National Constitution Center
- Zeynep Tufekci, fellow, Princeton Center for Information Technology Policy; assistant professor, department of sociology and School of Information, University of North Carolina, Chapel Hill.
MR. KOJO NNAMDIFrom WAMU 88.5 at American University in Washington, welcome to "The Kojo Nnamdi Show," connecting your neighborhood with the world. It's Tech Tuesday. Supreme Court justices have spent 200 years debating the 1st Amendment. And their rulings have come to define what free speech means in the U.S. But it may actually be leaders in the corporate boardrooms of Google, Facebook and Twitter who are making the most important decisions about freedom of speech in the years ahead.
MR. KOJO NNAMDIAs judge and jury of the social Web, the people who write the rules and enforce them are defining hate speech in the digital age. They're dealing with regimes that want to target dissenters and dictate through censorship, and all the while, they're trying to make a profit. But for better or for worse, the future of online speech could be in their hands. Here to discuss that is Jeffrey Rosen. He is a professor of law at George Washington University, legal affairs editor at The New Republic and president of the National Constitution Center. Jeffrey Rosen, good to see you again.
PROF. JEFFREY ROSENOh, it's great to be here.
NNAMDIAlso joining us from studios at Princeton University is Zeynep Tufekci. She's a fellow at the Center for Information Technology Policy at Princeton University and a professor at the University of North Carolina, Chapel Hill. Zeynep Tufekci, thank you for joining us.
PROF. ZEYNEP TUFEKCIThank you for inviting me.
NNAMDIYou too can join this conversation at 800-433-8850. Do you trust that private companies like Facebook and Google will make the right decisions about free speech and privacy? Do you believe that free speech should be absolute, or should we be willing to sacrifice free speech so that the Internet can be, well, a safer place? 800-433-8850. You can send us email to email@example.com or send us a tweet, @kojoshow, using the #TechTuesday.
NNAMDIJeffrey Rosen, in the U.S., we celebrate our right to free speech. The Supreme Court has historically been in charge of deciding what free speech means. We all know you can't yell fire in a crowded theater, but how about the online theater, so to speak? Today, much of our speech is happening on the Web and in a global marketplace. So who's in charge of deciding the online rules of free speech?
ROSENWho's in charge? Often 22-year-olds wearing flip-flops and T-shirts in Silicon Valley.
ROSENIt's really a remarkable thing. It's the people who are responsible for making hate speech decisions at the leading companies, like Google and Facebook and Twitter, who are more powerful than any king or president or Supreme Court justice, as you said in your introduction. I think this is the most fascinating topic. I've had the privilege of meeting these people over the past couple of years in the course of writing a bunch of articles about them.
ROSENAnd I'm struck by how conscious they are of the magnitude of their own decisions. I talked recently, for The New Republic, to Dave Wilner, who's in charge of content for Facebook, and he's 28 years old. He joined Facebook five years ago as the guy who was in charge of the helpdesk, you know, at midnight just responding to user requests. A couple of years later, he was promoted to lead the six-member content policy team.
ROSENHe now supervises hundreds of people distributed across the world. And the way it works is, on Facebook as on Google, if a user complains about a particular item, say the "Innocence of Muslims" video that caused such a controversy a few months ago, there will be an initial decision by the first responders in India or Austin or Dublin, in the case of Facebook, and if the decision is tough, it will get escalated up to Dave Wilner and his colleagues.
ROSENSo he is ultimately responsible for whether to take something down or leave it up. And when it came to the "Innocence of Muslims" discussion, he decided that because Facebook's policies allow speech that criticizes institutions but ban attacks on religious groups, there was nothing to ban, because the "Innocence of Muslims" video criticized the Prophet but didn't say, I hate Muslims. Therefore, he left it up. And in retrospect, the decisions that he and the people at Google made seemed better, more nuanced, more protective of speech than those made by President Obama.
NNAMDIWould you characterize that as the general mindset that people like Dave, fairly unknown figures at Facebook and Twitter, bring into debates over free speech on the Web?
ROSENI think they are marinated in the American 1st Amendment tradition. Not all of them are lawyers. Dave Wilner was an anthropology major in college. He never even went to law school. But he read John Stuart Mill, and he understands the basic American legal principle that speech can only be banned when it's intended to and likely to cause imminent violence or lawless action.
ROSENThat's not to say that Facebook and Google have adopted 1st Amendment standards entirely. Their hate speech guidelines do allow the suppression of speech that would be constitutionally protected under the 1st Amendment, including speech that says I hate members of a religious group, which is permissible under the U.S. Constitution, though not under Facebook's guidelines.
ROSENBut I do think that Wilner, Nicole Wong, who's the legal director at Twitter and had previously been the decider at Google, and the people at Google now are more inclined to protect speech than their colleagues in Europe, for example, where there's a much broader tradition of banning speech that offends the dignity of particular groups.
NNAMDIZeynep Tufekci, we're talking about American companies operating in countries around the world, and while Americans might consider the 1st Amendment to be the gold standard of free speech, you say that our free speech culture is actually baffling to much of the world. What makes the U.S. an outlier in that respect?
TUFEKCIIt really is an outlier, but I want to start by contextualizing that: it's not true that we ban no speech except incitement to imminent violence. The U.S. does have very strong intellectual property rights, and that leads to a good deal of speech being suppressed. Now...
NNAMDIOh, that's something we'll get to in more detail later in this broadcast.
TUFEKCIRight. So the objection usually is, well, that's just the way we do it. But if you look at the rest of the world, the U.S. really is an outlier. In fact, Europe, Canada, most of the rest of the world, not just, sort of, Third World countries, have various legal protections against what is sometimes called group libel or hate speech or speech that is seen as just undesirable in the public sphere. And also, I'm a social scientist, and just before coming to the show, I got curious about how much support there is in the U.S. for the 1st Amendment as it is generally understood.
TUFEKCIAnd the General Social Survey, which is the gold standard of social surveys, has some recent questions about this. And somewhat to my surprise, I found that even in the U.S., if you put the 1st Amendment to a vote today, it would lose. Free speech as it is defined now is fairly popular among educated U.S. elite groups, and that's probably a very small portion of the, you know, entire globe.
TUFEKCINow, I'm fairly close to that position myself, but I have these discussions again and again with people around the world outside of U.S. elite circles. And when I explain that in the U.S., you know, we don't have group defamation laws, we don't have any kind of government controls on free speech when it comes to hate speech...
TUFEKCIPeople are just baffled. We really are the minority when it comes to the world, and in the United States, the elites are the minority when it comes to public opinion on free speech. And in fact, historically, the current version of the 1st Amendment is also recent. So, I mean, there are many good reasons to defend the U.S. 1st Amendment as it's constituted.
TUFEKCIBut it's not right to just look at the rest of the world and say, "Why don't they get the message?" when it's clear that this anything-goes approach, in which speech in the public sphere cannot be restricted by the government, is a recent and historically minority position, even in the United States.
NNAMDI800-433-8850 is the number to call. We're discussing free speech on the Internet. Are you from a different country than the U.S.? How do the laws around speech and censorship in your home country compare to those in the U.S.? 800-433-8850. Zeynep, you mentioned that videos from YouTube and other things can be taken down because of copyright violations. Why do you think that we as a culture are more comfortable with that kind of censorship?
TUFEKCIWell, this is a historical development. The U.S. is home to big media companies and, you know, sort of traditional content providers that have lobbied for and have gotten very strong protections for intellectual property rights. And it's not just that these intellectual property rights are used to prevent, you know, illegal copying, which is a complex other matter.
TUFEKCIThere are many cases, you know, if you're a scholar trying to write about, say, the Walt Disney corporation, you may be facing an intellectual property lawsuit; the most famous sociological study of it could not use Mickey Mouse on the cover. So you can see intellectual property rights are applied much more strictly and broadly, and for longer, compared to the rest of the world.
TUFEKCIBut again, this is historically contingent. It's only in 1989 or so that the U.S. decided to enforce copyright law globally and to adopt copyright protections globally, because the U.S. had arrived as the, you know, main producer of content in the world. So a lot of these things we sometimes think of as absolute rights that are self-evident to us are the results of long historical debates in which the position that seems self-evident to us now just was not self-evident, say, 20 years ago, or is not self-evident outside of a small circle.
NNAMDIIn case you're just joining us, Zeynep Tufekci is a fellow at the Center for Information Technology Policy at Princeton University and a professor at the University of North Carolina, Chapel Hill. We're talking about free speech on the Internet. Joining us in studio is Jeffrey Rosen. He is a professor of law at George Washington University, legal affairs editor at The New Republic and president of the National Constitution Center.
NNAMDIYou can call us at 800-433-8850. Have you ever encountered harmful speech online? How did it affect you? 800-433-8850. Jeffrey, we may think of China or Iran as governments with a heavy hand in censorship. So why is it actually Europe's stance toward harmful speech online that has you and some of these tech companies worried?
ROSENBoth are worrisome. It was troubling that in December, the U.N.'s International Telecommunication Union approved a proposal by China and Russia and Tajikistan, among others, to create ominous international norms and rules standardizing the behavior of countries in cyberspace, which would allow for top-down censorship. But I am concerned about the European approach, which, as Prof. Tufekci rightly says, is more broadly adopted than the American approach.
ROSENWe are indeed an outlier. I'm concerned about it because I think it will allow not only dictatorships but democratic governments to suppress a great deal of speech critical of themselves in the name of avoiding dignitary harms. Let me give a concrete example. The European Union is currently debating a sweeping new right called the right to be forgotten, and it comes from France.
ROSENLe droit à l'oubli, the right of oblivion, which is very French; like, you know, the French want to be forgotten, and Americans want to be remembered. But the right is actually quite sweeping. It would allow anyone, any data subject, to demand the deletion of any data concerning that subject, unless a European privacy commissioner determined that it was necessary to protect literary, scientific or journalistic purposes.
ROSENAnd if Google and Facebook guess wrong about whether the speech is necessary for public purposes, then they're liable for up to 2 percent of their annual income, which in Google's case was about $40 billion last year. That's up to $800 million per incident that they could be liable for if they guess wrong about whether the speech is protected. Obviously, Google and Facebook will have a huge incentive to remove content as soon as anyone objects to it.
ROSENAnd we see this all the time, where a government official in a country like Thailand will demand that a YouTube video criticizing the king come down because that's illegal in Thailand. Now, it will be possible for a European politician who's running for parliament to say that a particular item offends their dignity. And if there's any uncertainty about whether or not a European privacy official would allow it, Google and Facebook may have to take it down or face horrendous fines.
ROSENSo I agree that there are very different norms here, and the European tradition has a very respectable and important history. But I, like some of the tech companies, am concerned that, in practice, if the European approach is codified and given the weight of huge fines behind it, the Internet will become a far less open and free place than it is now.
NNAMDIWhat criteria do the Web companies currently use for assessing the harm in online hate speech, Jeffrey?
ROSENThey have their own content policies, which broadly ban hate speech that attacks groups on the basis of race, religion, ethnicity, gender and so forth. So that's why they will allow speech that says I hate the pope, but won't allow speech that says I hate Catholics. But more broadly, the companies have pledged to respect the laws of the individual countries in which they do business.
ROSENSo initially, Yahoo, for example, had taken the position that it didn't have to remove Holocaust denial material, which is illegal in France and Germany, because it wanted to adopt the American approach. But when a French judge threatened to fine them a lot of money, they took the material down. And now, if material is clearly illegal in a particular country, it will be removed. So Google, for example, in 2007, had a squabble with the Turkish government.
ROSENThere were videos on YouTube posted by Greek football fans accusing Kemal Ataturk, the founder of modern Turkey, of being gay. This is illegal in Turkey. It's an insult to Turkishness. So Google took down the videos in Turkey and IP blocked them for Turkish users. That wasn't enough for a Turkish judge who wanted the videos removed around the world.
ROSENSo as a result, Google itself was blocked in Turkey for several years. That just shows that when the law is clear, the companies will remove the material. But in close cases, they want to protect speech and allow political criticism, and sometimes governments will retaliate by trying to block them as a result.
NNAMDIZeynep, speaking of Turkey, your home country, where authorities actively censor content online: one of the consequences, in addition to what Jeffrey said, was a two-year ban on YouTube. That ban ended in 2010. But at what point do laws prohibiting defamation or hate speech become more extreme bans on online expression, in your view?
TUFEKCIWell, I mean, I'm in no way defending the YouTube blockage. That was ridiculous, and it was just embarrassing, nothing more; just indefensible. But part of the issue there was that Google had not opened offices in Turkey at the time and was not paying taxes. That was part of the story, and I think it's linked to the story we're talking about: the tussle between governments and these companies, which are operating in their countries without, you know, being subject to their laws.
TUFEKCINow, in the case of, say, China or, you know, Iran, I think it's very easy for us to say, very good, don't obey Iranian censorship laws, and don't obey Chinese censorship laws. Or don't go along with Turkish blocking based on ridiculous laws. But at some point, you really come back to the question that Professor Rosen has written about in his article, too: at what point do you make these companies exempt from all governments' policies?
TUFEKCIAt what point does a country have a right to determine what kind of speech and what kind of freedom of assembly policies it's going to have? Because speech online is never just speech online. It's also freedom of assembly. It's not just freedom of speech. It's a place where, increasingly, a lot more public discourse is taking place. And in his article, Professor Rosen puts it as this tension between civility and democracy.
TUFEKCIBut I would put it as a tension between civility and free speech, because civility can also be part of what's necessary for democracy. I'm not in favor of the current laws in Europe, the, you know, Holocaust denial laws, and I think the right to be forgotten law really is, exactly as he is describing, just draconian in the way it's been shaped. But on the other hand, I do see why the right to, sort of, have a civil public space might be valuable and desirable in a place, let's say, post-genocide Rwanda, or in a place where ethnic tensions are high (unintelligible).
NNAMDIWell, allow me to ask you this question then...
NNAMDI...because you've mentioned, and Jeffrey Rosen has mentioned, that an Internet user in Germany, say, might find speech denying the Holocaust more harmful than a user here in the U.S. So how active a role should Web companies have in regulating speech...
TUFEKCIThat's the trick.
NNAMDI...that might ignite ethnic (unintelligible).
TUFEKCIRight. So in the case of the "Innocence of Muslims" video, for example, Google just took it off YouTube without even a court order, which I was against. I mean, I thought that was just patronizing. If Egyptians really wanted to take that off their own YouTube, they should at least have had to go to court and do something, rather than Google just, you know, unilaterally pulling the plug. I understand how they did it.
TUFEKCII understand the tension, and I understand it was a difficult moment. But a lot of the time, what we end up having is these companies just kind of making up stuff as they go along. And as a result, in places where they have offices on the ground, like Twitter and Google now have in Europe, they're going along with lots of things, almost defensively. And in places where they have nobody on the ground, they're just not going along with the government.
TUFEKCIYou could say it's because of their First Amendment beliefs or also because they don't care. Their business model doesn't really require for them to care. And at some point, again, I'm not saying at all that, you know -- in fact, I'm happy, and I fight against this when there's, you know, Iranian government or the Egyptian government or the Turkish government tries to block speech online.
TUFEKCINinety-nine percent of the time, I'm thinking it's a good thing that these companies aren't listening to them. But, on the other hand, I'm constantly bothered by the idea that, you know, however many times he may have read and assimilated John Stuart Mill, a 22-year-old or 28-year-old, in a very sort of fairly rich country without the kind of sometimes ethnic tensions or other tensions and also not necessarily -- I mean, these companies -- these kids -- almost kids, you know, youngsters, 20 something, sort of doing this, they come from a very particular place in life.
TUFEKCIAnd a lot of things about civility that are essential for, say, women to be in public space safely, especially in Middle Eastern countries, are not their concerns. And I'm bothered by the idea that the best we can come up with is just a few hundred people who are moderating content, 400 million...
NNAMDIGot to interrupt because we've got to take a short break. Jeffrey Rosen, hold that thought. We'll get right back to it. If you have called, stay on the line. We'll get to your calls. The number is 800-433-8850. Intriguing question, do you think you should have the right to completely erase your past from the Web? 800-433-8850. I'm Kojo Nnamdi.
NNAMDIWelcome back to our conversation on Tech Tuesday about who gets to decide what's free speech on the Internet and what standards they're using. We're talking with Zeynep Tufekci. She is a fellow at the Center for Information Technology Policy at Princeton University and a professor at the University of North Carolina, Chapel Hill. She joins us from studios at Princeton.
NNAMDIHere with us in our Washington studio is Jeffrey Rosen. He is a professor of law at George Washington University, legal affairs editor at The New Republic and president of the National Constitution Center. Before we went to that break, Jeffrey Rosen, you were about to say?
ROSENI agree with Prof. Tufekci that it is troubling that young people at these tech companies have so much power. That was the central point of this article, "The Delete Squad": that the trust-us model doesn't work in the U.S. constitutional tradition, and it may not be sustainable when it comes to Google and YouTube.
ROSENBut I do think that Google and YouTube actually did a better job at protecting free speech in the "Innocence of Muslims" context than either the president of Egypt, with his demand that the video be banned everywhere, or President Obama, who, at the U.N. in a separate speech, called on YouTube to re-examine its conclusion that the video didn't violate its own content policies.
NNAMDIBut you say that video tested the fine line between what is and what is not hate speech on the Web.
ROSENThat's absolutely right.
TUFEKCIAlso, Google took it down.
ROSENIf I could...
TUFEKCIGoogle took down the video.
ROSENThey didn't take it down worldwide. They took it down temporarily in Libya and Egypt because of concerns about violence on the ground, and it was restricted entirely in India and Russia because Google concluded that it was illegal there. But they did not remove it around the globe, because they concluded that to remove it would mean the millions of people who wanted to link to news stories about the video would be denied access to news.
ROSENAnd in retrospect, it turned out that Google's decision to leave it up in most places in the world and to restrict it only temporarily in some places was more accurate than the administration's conclusion that the video had caused violence, because it emerged that an English-language version of the video had been in circulation since July, calling into question the claims that the video had caused the riots.
ROSENSo although trust-us may not work, Google and YouTube made a pretty nuanced decision, actually, under tremendous pressure, and I think they deserve praise for protecting free speech more than the president of the United States did.
NNAMDIWell, Zeynep, talk about this. A lot of Americans were shocked...
NNAMDI...at the prospect that an anti-Islam video could incite violent protests abroad. After all, in the U.S., distasteful speech regularly makes its way into the public sphere. But do people from some other cultures see any meaningful distinction between a government allowing speech and a government endorsing what is said in that speech?
TUFEKCIRight. So to go back to what happened with that video: as Professor Rosen says, it had been around since July. And if you watch the whole thing, it's so ridiculous. If anybody really watches it, it's one of those you-can't-make-this-up levels of ridiculous. But the way it spread, in terms of incitement, was not through YouTube. It was a television station in Egypt that picked it up in snippets and showed it and made the claim that it was endorsed by the government of the United States.
TUFEKCINow, here's the part where the culture clash comes into play. In lots of parts of the world, if you make a video like that and you put it on YouTube, there would be some potential legal action. Even if it wasn't just, sort of, pure blasphemy, there would be some potential way to take that down or do something, whereas in the United States, in the 1st Amendment tradition, there's really nothing a prosecutor could plausibly do.
TUFEKCISo to people, especially in a place like Egypt that's just coming out of, you know, 30 years of very heavy censorship, the idea of this video, which the television station promoted as having been endorsed by the U.S. government and as being shown in, you know, movie theaters around the U.S., they just couldn't make sense of it.
TUFEKCISo it wasn't the YouTube video; it was the incitement, the active incitement by a television station that misrepresented the story and used the fact that there's a huge culture gap between the U.S. and the rest of the world in terms of what kind of speech gets regulated. That was the incitement. And I think that particular case shows the importance of a global free speech culture developing, and of different countries trying to understand each other, rather than, you know, what The Delete Squad may or may not do. That's the key thing because playing it...
NNAMDIJeffrey Rosen, before I go to the phones.
ROSENI think it shows the importance of American constitutional standards being adopted for the world, of a kind of American free speech imperialism, if I may, because when you look at the company that most protects speech, Twitter defines incitement according to U.S. constitutional standards. It prohibits only direct, specific threats of violence against others.
ROSENAnd only a standard that narrow, which, it's true, is not supported throughout most of the world, or even in much of the United States, would prevent the heckler's veto that we saw in this case, because American courts generally do not allow speech to be banned based on the predicted effect on the audience.
ROSENThere has to be both intent to incite and the likelihood of success. And it took America 200 years to adopt this standard. The companies are not formally bound by it because they're not the government. But I think only a standard like this could avoid confusion and suppression like this in the future.
NNAMDIOn to the phones. Here's Tiffany in Huntingtown, Md. Tiffany, thank you for waiting. You're on the air. Go ahead, please.
TIFFANYHi, Kojo. I was calling because my feeling is that companies like Yahoo, Google, Facebook are self-regulated, but they're also regulated by the people. If we don't like the way Google is handling something, what they're choosing to allow and not allow, we can just go to Yahoo or Bing or that type of thing. I feel like that's just the way capitalism works, and that letting the people regulate, and then self-regulation, works better than trying to let our legislative system deal with it.
TUFEKCIMay I say something to that?
NNAMDIPlease go ahead.
TUFEKCIYes. So we're obviously -- I mean, I think a lot of us are in agreement that it's not about legislators. It's about the companies. But there is a very particular phenomenon in online spaces, and this is why this issue is very contentious. It's called network effects. In effect, if you're in Egypt right now and you want to do politics, you have to be on Facebook, because that's where the audience is.
TUFEKCIAnd network effects refers to a situation in which whoever becomes dominant has such an overwhelming advantage that, you know, an alternative social network used for politics in Egypt right now is just not viable, because you want to be where everybody is. If you're selling something, you're going to go to eBay, because that's where the buyers are. If you're not on the first or second page of Google search results, you're going to lose a huge amount of traffic.
TUFEKCIAnd that is exactly why people are concerned about what these companies allow or not. I have seen instances where Facebook pulled very significant political speech in the Middle East, including in Egypt, either based on an erroneous intellectual property claim or because of its own terms of service. And in effect, it just did not get out. So I wish there were, you know, lots of alternatives to Facebook in lots of countries for politics...
NNAMDIBecause you're arguing that in some cases, college kids have no place else to go.
TUFEKCINot just college kids. Again, the military council, you know, the SCAF as it was called in Egypt, when it took over the country, it set up a Facebook page to post its communiqués. So if you wanted to have access to that or communicate about it, you just had to go to Facebook. And I think that's why, I mean, this goes back to the point that these companies are incredibly crucial in regulating the shape of our online publics, because of network effects. There just aren't viable alternatives to them.
ROSENI think that's a very good point, that in certain countries, if you're not on a particular platform, you're not in the action; in India, it's Google's Blogger service. But the caller makes a very interesting suggestion. Can the net regulate itself? Could individual communities of users in Egypt or India decide for themselves what they think is likely to cause violence on the ground or to offend community norms? That's the optimistic story.
ROSENIt's like the activism against SOPA and PIPA, where the net rose up to stop illiberal laws that suppress free speech. Unfortunately, I'm not so optimistic about the potential for self-regulation when it comes to speech, for some of the reasons that we've been discussing. The truth is that the idea that you can only ban speech if it's likely to cause violence is not popular in Europe or even in the United States. As Professor Tufekci very interestingly pointed out, if you put the First Amendment to a vote, that's not what the community would vote for.
ROSENSo I fear, for that reason, you need some neutral party that's actually enforcing these standards. There's one other alternative that we haven't discussed, and that is an algorithm. Some of the companies, like Facebook and Google and Twitter, are thinking about the possibility of algorithms that could predict whether a given piece of content is likely to cause violence in a particular region, based on patterns of violence in the past.
ROSENAnd they hope that if the machine can tell them when to remove the speech, it'll solve some of the trust-us problem that we're seeing now. I fear, though, that, as we've been discussing, these decisions are so hard, they're so contextual -- you know, we're reasonably disagreeing in particular cases about what the right answer was -- that the thought that it could all be solved by a machine seems to me to be too optimistic as well.
NNAMDITiffany, thank you very much for your call. We move on to Jennifer in Woodbridge, Va. Jennifer, your turn.
JENNIFERThank you so much for taking my call. It's more of a comment, I guess, based on the thought that came up regarding what one of your panelists said about the individuals who have created these companies. And in many respects, they created these companies when they were in their late teens. And I applaud their efforts. They are very intelligent kind of people. But at the same time, they may not have the world experience and just the interest in other cultures and what's going on.
JENNIFERAnd I know when I was in my teens, I was very laser-focused on what I wanted to do and who I was and what I was interested in and, as a result, stepped on many people's toes in the years to come because I wasn't necessarily considerate. And so I think part of the issue that really concerns me is that these companies are so large, they don't necessarily care about the...
NNAMDIWell, allow me to put Jeffrey...
JENNIFER...about the feelings -- I'm sorry, go ahead.
NNAMDIBut, Jeffrey Rosen, you're putting free speech imperialism in the hands of people who may not yet be mature enough to be fully sensitized to, or competent in, other cultural norms.
ROSENWell, I completely agree with the caller. First of all, that if there is to be a right to be forgotten, I want it to apply to my teens and 20s as well.
ROSENThere's lots that I'd like to forget and be forgotten. And it's certainly true that we grow in wisdom and maturity. And the kind of decisions that you might make if you're 22, as the caller suggests, are not the same as the one that a Supreme Court justice would make at the age of 70. But the question of whether we need more cultural sensitivity, as the caller suggests, or more -- I'm being, obviously, tongue-in-cheek by talking about imperialism.
ROSENBut I'm quite serious when I say the alternative is a principled decision on the part of the companies that this is the right principle -- to preserve speech throughout the Net, and just to impose that by technological or judicial fiat. I think I come down more on the idea that we do need a little bit of free speech imperialism, because to be culturally sensitive in many of these cases would have meant banning that "Innocence of Muslims" video across the world, allowing a heckler's veto merely because some governments wrongly, you know...
NNAMDIWell, allow me to complicate matters just a little bit more. Zeynep, the founders of Reddit say the forum stands for freedom of speech, but its standards ended up allowing forums that included inappropriate photos of young women and minors. Can online communities defend content like that under the banner of free speech?
TUFEKCIThat's a very good example of something that shows that free speech doesn't stand by itself; it often comes into clash with other things, and you have to figure out who you're going to make pay the price. Now, I think for political speech, we're just 100 percent on the same page.
TUFEKCIBut what happened in the case of Reddit was that Reddit allowed for the creation of forums that were dedicated to "CreepShots" and "Jailbait," which were photos of teenagers -- who knows how many of them, what percent of them, were underage -- that were taken without their consent in compromising positions: upskirt photos, people going up the stairs, all sorts of teen athletes.
TUFEKCIAnd the Reddit founders, for a very long time, defended those forums as just free speech. Now, the idea that as soon as you're in a public space you have no right to privacy, and some creep might come and take a photo of you -- you know, an upskirt photo -- and post it on the Internet, and that this gets defended as free speech, I think shows how important it is who gets to make these regulations, because I don't really know too many sane women who would defend that.
TUFEKCIAnd I think what happened in the case of Reddit is that this is just a misreading of the creeps' rights to take these photos as some kind of perverted ideal of free speech or First Amendment when it's not, because the freedom to assemble, the freedom to participate in public spaces is also of value. And you're going to say, well, you can't really stop these things, you know, that people who have cameras say, maybe you cannot stop these things.
NNAMDIWell, allow me to...
NNAMDILet me pile on a little bit more because, just after the Boston Marathon bombing, many Reddit users went online to identify possible suspects in surveillance photos. In that process, they pointed out people who were entirely innocent, many of whom were minorities. How can unregulated speech reinforce harmful biases?
TUFEKCIWell, I mean -- I just want to sort of finish with the Reddit case. Reddit is where the president of the United States goes to do an Ask Me Anything session. So we're not just talking about, you know, are we going to ban all this off the Internet? We're talking about a platform where the president of the United States took questions -- does it have any responsibility to create an environment which isn't so hostile to women, in some cases?
ROSENI chuckled with appreciation because you're right to pile on. You're asking the tough questions, and both of them arise in these two Reddit examples. No one denies that clearly illegal speech that involves defamation or child pornography or intentional harassment should be restricted. The question of who's responsible for taking it down can be tough in some circumstances.
ROSENAnd Reddit, like other Internet service providers in the U.S., generally enjoys immunity under Section 230 of the Communications Decency Act, which says that if they don't exercise editorial control, then they're not responsible for content posted by others. That's not to say that, when they're given notice that content is clearly illegal, they won't -- and shouldn't -- remove it. So I think we're just talking here about the procedures that they used and how responsive they were upon notification.
ROSENYour second example is really important, Kojo, because it reminds us that the Net can be both a well-regulated space for the marketplace of ideas and a mob -- an absolute mob that leads to the targeting, on racial and other ethnic grounds, of the innocent, who, in some circumstances, may have their lives destroyed or commit suicide as a result. And the ability of unregulated mobs to pile on without any restrictions is scary and needs to be addressed.
ROSENNow, how to stop a mob, it's difficult. You know, you can prosecute individuals who say, rise up against this particular suspect as a kind of incitement. But when much of the mob is anonymous and is engaging in this sort of mob activity in real time, you can't avoid tragedies like we saw after Boston where one of the innocent people actually did kill himself.
ROSENSo I think the main -- what I take away from this really important example is let's not romanticize the Net as a place of -- the flowering of cool, Jeffersonian discourse. I mean, it can be -- little pockets of that can exist, and it can also be a place for really scary mob activity.
NNAMDIGot to take a short break. If you've called, stay on the line. We'll try to get to your calls. You can also send email to firstname.lastname@example.org or send us a tweet, @kojoshow, using the #TechTuesday. I'm Kojo Nnamdi.
NNAMDIWelcome back to our Tech Tuesday conversation on free speech and the Internet. We're talking with Jeffrey Rosen, professor of law at George Washington University, legal affairs editor at The New Republic and president of the National Constitution Center. Zeynep Tufekci is a fellow at the Center for Information Technology Policy at Princeton University and a professor at the University of North Carolina, Chapel Hill.
NNAMDIAnd you can call us at 800-433-8850 or send email to email@example.com. Jeffrey, Facebook relies on users to identify undesirable content. What that means, the site receives more than 2 million requests to remove information every week. How are companies like Facebook able to make thoughtful decisions about content when they're faced with that kind of workload?
ROSENIt's a great question, and I've wondered about the answer. A few years ago, in 2007, I got to go to the YouTube headquarters and to see the first responders in action. And as I said, everyone looks the same. They're all wearing T-shirts and flip-flops, and they're 22 years old. And my host at Google said, try to identify the first responders, see if you can spot them. And I looked -- everyone was wearing the same uniform, all hunched over their laptops -- and I couldn't. Everyone looked the same.
ROSENAnd then he said, the only way to tell them apart is to look at the porn flickering on their laptops. Basically, they're the first responders who get the user objections at the YouTube headquarters. The content is put right on their screens, and they're making split-second decisions, based on their own impulse, about whether it violates community standards, and only in tough cases will they escalate it up the chain so it'll reach someone like Nicole Wong or Dave Willner and so forth.
ROSENSo how do you make thoughtful decisions? Dave Willner, as I described in this piece for The New Republic, is so worried about this question -- about the volume of material and the difficulty of making nuanced decisions -- that he wants to make it as objective as possible. He had started off by giving the first responders more discretion. But his current effort is to write the community standards so that these 22-year-olds can make a decision based on the four corners of the user objection and really make it like an algorithm.
ROSENIt's a very Silicon Valley approach to the solution. Sometimes it works, and sometimes it doesn't. But you're really right. That number, 2 million requests a week to remove stuff, billions of pieces of content posted every month, the sheer volume makes it unrealistic to imagine that you can have a mini Supreme Court with nine, you know, judges adjudicating each of these millions of requests.
NNAMDIWe got an email from Steve in Arlington, Zeynep, who writes, "Persons or groups might be offended by a statement like 'all believers in a given religion are violent,' which is merely an opinion. Those same persons or groups might also be offended by a statement that says Christians or Muslims or Rastafarians -- believers in a given religion -- committed such and such acts of violence in 2012, which could be completely true. So hate speech laws," says Steve, "could be used to keep important facts from being disclosed." What do you say to that, Zeynep Tufekci?
TUFEKCIWell, obviously, you know, the line between merely offensive speech and hate speech -- there is no clear line. And it's also clear that you can use hate speech laws for censorship, against just opinions you don't like. But the fact that they can be used that way doesn't really mean there's never any relevance to hate speech laws.
TUFEKCIIf you look at the history of mass killings -- recent history, like the Rwandan genocide -- you find that radio, Radio Rwanda, played a key role in inciting this broad environment of hate. And it would have slipped past the imminence standards; it would have slipped past, you know, a target-place-method kind of standard, because it was just broad, discriminatory, insulting talk about the Tutsis. And that created part of the groundwork for what came afterwards.
TUFEKCISo, in fact, it was so severe that the United States government itself considered taking out Radio Rwanda as a foreign policy action. It came to that. So let's just sort of project, say, 10 or 15 years down the line. The things that I'm worrying about really aren't minor instances of offensive speech, which, you know, some people just want to shut down and others want people to be able to say -- in those cases, I think, it's pretty clear that the U.S. First Amendment culture is just broadly superior.
TUFEKCIBut there are going to be cases in which we are going to see certain kinds of hate speech that are not imminent violence by U.S. legal standards but that, given the cultural and political and ethnic context in another place in the world -- not here, not in Palo Alto, not in D.C. and not in California -- are laying the groundwork for really undesirable consequences. Or let's say, that's the exact...
NNAMDIAnd that's the algorithm that Jeffrey doesn't quite trust to figure that out.
TUFEKCII don't think so -- algorithms are decisions people make that they write into computer code. There's nothing about algorithms that is objective outside of what people have programmed into them. So if you relegate it to an algorithm, all you've done is put into code what your decisions are, and you've kind of removed your own discretion from it, so that doesn't solve anything. So there are a lot of instances where I'm just sort of thinking, what are these companies going to do?
TUFEKCIAnd, by the way, they do. They do censor hate speech when it occurs in Europe, because they have offices in Europe. Twitter recently didn't totally take down, but did censor, trending topics that were anti-Semitic. I don't think they should have done it, to be honest, but I can understand it. It's a business decision for them. They have people on the ground in France.
TUFEKCIAnd that's just what they're going to do. So I don't even trust these companies to be these arbiters of the First Amendment the way Prof. Rosen thinks they will be. I actually expect them to ignore cases where they don't have a business interest or staff on the ground and to cave in cases where they do have people on the ground. And I'm not really sure that's great either.
NNAMDIYour turn, Jeffrey Rosen.
ROSENOnce again, that's not what Twitter did. It actually only removed the "un bon juif" tweets that were clearly illegal under French law, where it had no choice because it has to abide by local laws. It left up the rest. And we were struck to find that, over a short period of time, the anti-Semitic thread turned into a pro-Semitic thread as the Net community was able to switch it. But I was glad to see Prof. Tufekci basically saying that she is closer to the American standards than even Google and Facebook are.
ROSENOur last caller correctly said that the Google and Facebook standards allow the banning of speech when you say, I hate all members of this religious group because they're all violent -- that would be banned under the Facebook standards, even though it could be quite relevant to political discourse. And it sounds like we agree that that sort of speech should be left up.
ROSENIf we're talking about what's going to contribute to a Rwandan genocide, that's a completely legitimate discussion under current U.S. standards. And if the companies focus on whether or not speech is likely to incite violence, rather than on whether it is likely to offend the dignity of a particular group, then I think speech would be much better off in the long run.
NNAMDIHere is Steven in Baltimore, Md. Steven, you're on the air. Go ahead, please.
STEVENHi, Kojo. I have a quick question. One of your guests keeps referring to the 20-somethings in flip-flops and T-shirts making the decisions. I would just like for them to ask, or answer, who should make the decisions, because there are nine folks in Washington with a few law degrees that I don't think make great decisions. So you can't put Google and Yahoo back in the box. So who should?
NNAMDIYou know, a lot of this, Steven, is because we tend to know so much more about those nine folks in Washington, D.C., than we do about the 20-somethings in flip-flops who are making these decisions. Jeffrey.
ROSENI think Steven's question is exactly right. That's the hard question we're discussing during this show. Who should decide? Who should the deciders be? I noted at the beginning of this New Republic piece that all of these 22-year-olds could, only half-jokingly, be referred to as the deciders. Nicole Wong, who used to be at Google and made all these decisions for that company -- her colleagues during the Bush administration affectionately called her "the decider" 'cause she had the ultimate power of review over those 22-year-olds.
ROSENBut I agree with you, Steven, that someone's got to decide. They're young, but they're not doing a terrible job right now. Seventy-year-old justices in Washington, who know far less about technology and about Web norms, might make less nuanced decisions than they do.
ROSENI just -- I think we all agreed in this discussion that whenever tremendous power is concentrated in a small group of people without accountability and transparency, that's potentially troubling. And we just have to start thinking about what oversight mechanisms are necessary to ensure this power is exercised responsibly.
NNAMDIZeynep Tufekci, same question to you. Last question -- we're almost out of time. Who should be the deciders?
TUFEKCIWell, I mean, obviously, there is no easy answer. I think, to begin with, these companies are relying on employing very few people compared to how many users they have. Facebook has a couple thousand employees for a billion users. And that really has to change, given how important they have become. And they have to become more accountable and transparent in their decisions. I think the biggest problem we have is that we really don't know what exactly is going to work and how our public sphere is going to work.
TUFEKCISo it's -- I think that without an easy answer, the first steps are accountability to transparency and a lot more people, and not just 22 year olds, who get to decide and understand and challenge how these things work. Sometimes you can have Facebook or Google take a decision to censor what is, in effect, political speech based on intellectual property claims. And they don't get back to you for two months, three months, while speeches suppressed because they just don't have enough staff to deal with the complexity. That's right.
NNAMDIAnd we don't have enough time to continue the conversation. Thank you so much. Zeynep Tufekci is a fellow at the Center for Information Technology Policy at Princeton University and a professor at the University of North Carolina, Chapel Hill. Zeynep Tufekci, thank you for joining us.
TUFEKCIThank you for inviting me.
NNAMDIJeffrey Rosen is a professor of law at George Washington University. He is legal affairs editor at The New Republic and the president of the National Constitution Center. Jeffrey Rosen, in 10 seconds or less, what is the National Constitution Center?
ROSENWe promote bipartisan debates about constitutional issues. We're a beautiful museum on Independence Mall in Philadelphia and America's town hall. And I hope that everyone will come and visit us on the Web and in Philadelphia and participate in these great conversations.
NNAMDIJeffrey Rosen, thank you for joining us. And thank you all for listening. I'm Kojo Nnamdi.