The World Wide Web has long been known as the wild, wild west of content. Almost anything goes — from lewd videos and doctored photos to troublemaking “trolls” who harass web forums with provocative speech. In recent weeks, sites like Reddit, 4chan and even Twitter have come under fire for hosting and posting abusive language and unauthorized content. The incidents have renewed debate about civility online, privacy and the role websites play in hosting objectionable content. Kojo explores bad behavior on the web, its evolving role on the Internet, and how companies use legal and moral guidelines to keep it in check.
- Jeff Steele Co-owner and Administrator, D.C. Urban Moms and Dads
- Jules Polonetsky Co-chair and Director, Future of Privacy Forum
- Annemarie Dooling Online Audience Development Strategist; former Senior Community Editor, Huffington Post
- Nate Cardozo Staff Attorney, Electronic Frontier Foundation
MR. KOJO NNAMDIFrom WAMU 88.5 at American University in Washington, welcome to "The Kojo Nnamdi Show," connecting your neighborhood with the world. It's "Tech Tuesday." Well, they say almost anything goes on the internet, and incivility on the web has reached new highs, or lows, depending on your viewpoint -- from nude celebrity photos to harassment on Twitter and even the dissemination of execution videos. The freewheeling nature of the internet has forced websites to question their hands-off approach to what users post.
MR. KOJO NNAMDIMoves by sites like Reddit and 4chan to shut down forums where users were posting illegal, private or even grisly material follow similar efforts by some news sites to protect themselves from so-called trolls, who harass and disrupt web forums. While being a creep, a troll or a troublemaker on the web can be disturbing, privacy advocates say it's protected speech. But when does this protection go too far? What responsibilities do websites have for hosting and posting provocative or objectionable content? And what drives all this bad behavior on the web?
MR. KOJO NNAMDIIt's our "Tech Tuesday" conversation. And joining us in studio to have it is Jules Polonetsky, Co-Chair and Director of The Future of Privacy Forum. Jules, thank you so much for joining us.
MR. JULES POLONETSKYGreat to be with you.
NNAMDIAlso joining us in studio is Jeff Steele, co-owner and administrator of D.C. Urban Moms and Dads. Jeff, thank you for joining us.
MR. JEFF STEELEThank you. Happy to be here.
NNAMDIYou too can join the conversation. Give us a call at 800-433-8850. You can send email to email@example.com. You could shoot us a tweet @kojoshow using the hashtag techtuesday. Joining us from NPR studios in New York is Annemarie Dooling. She's an online audience development strategist, former Senior Community Editor at the Huffington Post. Annemarie, thank you for joining us.
MS. ANNEMARIE DOOLINGHappy to be here, Kojo.
NNAMDIAnd joining us by phone from San Francisco is Nate Cardozo. Staff Attorney with the Electronic Frontier Foundation. Nate Cardozo, good to have you aboard.
MR. NATE CARDOZOThanks for having me on.
NNAMDIAgain, you can join the conversation by giving us a call at 800-433-8850. Or sending email to firstname.lastname@example.org. What drives all this bad behavior on the web? Annemarie, incivility on the internet is nothing new, but recently, we've had some high profile examples of particularly egregious and even illegal behavior on web forums. As someone who has built a career managing and moderating online audiences for companies like The Huffington Post, Yahoo and Salon, is bad behavior inevitable as long as the web remains open?
DOOLINGWell, I'd say it's nothing new. I think that as long as there are places for people to find like-minded people, we'll always have arguments. We'll always have fights. And it really depends on what you are trying to accomplish by having comments or forums or message boards or whatever your chosen form of user communication is. So if we're talking about something like a news website, you know, you really have the responsibility to make sure the contributions that accompany your news and your supposedly factual articles have some sort of decorum. Or some sort of resemblance to being civil. And some sort of fact checking going on there.
NNAMDIJeff, much of this online troublemaking is attributed to what are called trolls. It's a funny term for a not so funny definition. As the long time moderator of D.C. Urban Moms and Dads, how do you encounter these people at your site?
STEELEWell, we're lucky that we don't deal with the outrageous types of abuse that have been in the press lately. But we do deal with the nearly constant problem of people just being rude. I'm a big believer in the report and block strategy. Reporting is just a way for users to report inappropriate content and for people like me to do something about it. And blocking is a way for users to block inappropriate posters. Our site doesn't allow that, but things like Twitter make that very easy for users to block other users.
NNAMDIAnnemarie, how do you define trolls and what drives their nastiness online?
DOOLINGThe term troll is actually very funny, as you just mentioned, because I think that it's been used in strange ways lately. It seems like it's been used for anyone who just simply doesn't agree with you. Or who is momentarily uncivil in a conversation. But an actual troll, actual trolling means that someone has a harmful intent before they even engage with you. They're coming there to derail a conversation, to make problems for a writer. To make problems for a message board and the other users there.
DOOLINGSo that's actually what trolling is, and there are different ways to combat that, as we just mentioned. But it seems like lately we have a lot of misuse of the term, and that can also lead to problems when we figure out how to combat it.
NNAMDI800-433-8850 is our number. Do you make comments at websites or participate in online discussions? Have these forums ever become nasty or abusive? And how have you responded? Give us a call. 800-433-8850 or send email to email@example.com. Jules, Twitter recently said it was evaluating its user policies following abusive comments made to Robin Williams' daughter after his death. And sites like Popular Science, the Chicago Sun-Times and Gawker Media's Jezebel have either taken down or pared back their online comment sections because of these problems.
NNAMDIWhat responsibilities do companies have when user generated content becomes abusive or even illegal, in the case of the nude celebrity photos that were posted recently?
POLONETSKYWell, I'm the father of a young teen, so for me, a turning point was PewDiePie, the most followed poster on YouTube, who's got, I think, 30 million or so people watching him play video games. He turned off comments and said, you know what, this just isn't worth the trouble. I've seen this both as a manager of policy at AOL, where we had to deal with this at large scale, as well as the manager of a very small community listserv in Potomac. And I think it's interesting to see companies come into this space and initially have sort of an everything's-open, we're-all-for-free-speech attitude.
POLONETSKYAnd then you start seeing some of the excesses. And they start realizing that they do want to make sure that their environment is one where users aren't chased away. Where celebrities don't have to quit because they feel they're being abused or where the average person isn't attacked just because they're female and they're commenting about feminism or whatever the topic is. So, I think we're slowly seeing companies step up, hire moderators, start using community software that can help the community take responsibility. Very early, if you remember the days of AOL instant messenger.
POLONETSKYWe had a feature called "eviling," where, if somebody did something offensive, you could sort of give them a charge. And if many people complained, the community would slow down your ability. You couldn't post as often. And that was a very effective kind of self-policing that I think we're starting to see. But there's probably a lot more that can be done.
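The "eviling" mechanism Polonetsky describes -- community complaints slowing down how often an offender can post -- can be sketched roughly as a complaint-driven rate limiter. This is an illustrative sketch only; the class name, thresholds, and doubling rule here are invented for the example, not AOL's actual implementation.

```python
import time

class EvilMeter:
    """Sketch of complaint-driven throttling: each complaint a user
    collects doubles the minimum wait between their posts."""

    def __init__(self, base_interval=5.0):
        self.base_interval = base_interval  # seconds between posts with no complaints
        self.complaints = {}                # user -> complaint count
        self.last_post = {}                 # user -> timestamp of last accepted post

    def complain(self, user):
        # Another community member flags this user's behavior.
        self.complaints[user] = self.complaints.get(user, 0) + 1

    def min_interval(self, user):
        # Hypothetical rule: each complaint doubles the wait, capped at an hour.
        return min(self.base_interval * 2 ** self.complaints.get(user, 0), 3600)

    def try_post(self, user, now=None):
        # Accept the post only if the user has waited long enough.
        now = time.time() if now is None else now
        if now - self.last_post.get(user, float("-inf")) < self.min_interval(user):
            return False  # throttled
        self.last_post[user] = now
        return True
```

The point of the design, as described on air, is that no single complaint silences anyone; the throttle only bites as complaints accumulate, which makes the self-policing resistant to one-off grudges.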
NNAMDIWell, Jules, for years, Section 230 of the Communications Decency Act of 1996 has provided immunity for websites that publish content provided by their users. But that was passed nearly 20 years ago, which, on the internet, is a lifetime. Should this landmark legislation be reconsidered, revised maybe?
POLONETSKYSection 230 has probably been the most critical thing for the development of an internet where there is lots of content, because if my colleague, with his phenomenal local site here, had to review every single thing that was posted on his site to make sure that it was legal, that it met some standard, or he was on the hook, you know, he couldn't be in business. We couldn't have Google generating the immense amount of material that it has if some lawyer had to go through every single piece of content. So, Section 230 has been, you know, strong protection, strong freedom from liability.
POLONETSKYBut that doesn't mean that those sites can't, on their own, say you know what, I don't want to dignify hate speech. I don't want to dignify attacks on women. I don't want to be legally responsible. And let's note, there are some places where harassment can be so bad that it is legally actionable. It's one thing for me to say, I hate this group. I hate that group. My internet service provider or my, you know, AOL or Twitter might say, we don't want to allow that. But if I actually go after an individual and I say, I'm gonna kill you. Or I'm gonna kill all people like you.
POLONETSKYOr I persist in harassing you in a way that really is emotionally harmful, the laws can take that into account and consider that to be a legal violation.
NNAMDINate Cardozo, I'd like to get your thoughts on whether Section 230 of the Communications Decency Act needs to be updated. Is this law working, despite the proliferation of unauthorized porn and abusive content online?
CARDOZOSo, I think we need to look at the issue of CDA 230 from two different perspectives. The first is, you know, sites are free to, as the other guests have discussed, sites are free to institute moderation. And that works. That works very well for a lot of different forums online. But CDA 230 also works very well. It is the only reason why we're able to have things like Wikipedia based in the United States. Why Yelp exists, why Craigslist exists. Why YouTube has a comment section in the first place. Without CDA 230, all of those really, truly innovative speech forums and drivers of the economy, especially here in Silicon Valley, would simply disappear.
CARDOZOMany of the proposals that we've seen to update CDA 230 would really gut it, and would not allow sites like Yelp to even exist. So, before we do anything to CDA 230, we need to look at not just the intended consequences, but the unintended consequences of making intermediaries liable for (unintelligible) .
NNAMDIWell, let's talk about that more specifically, because Washington state passed a law that made websites criminally liable for providing access to offensive materials posted on other sites. Can you tell us about the case that the Internet Archive brought against Washington state when this law passed?
CARDOZOAbsolutely. So, the law was passed with the absolute best of intentions. It was a law passed to combat child sex trafficking, something which everyone here is going to agree should not be taking place and should not be taking place online. But the way the law was worded, it criminalized the posting, either direct or indirect, of any solicitation for child sexual activity. Our client, the Internet Archive, at archive.org, runs the Wayback Machine. Your listeners may or may not be familiar with this.
CARDOZOBut this is an automated service that crawls the internet periodically and archives the entire internet. And you can go back and look at historical versions of almost any website on there, going back all the way to the late '90s. So it's really a remarkable resource, and it's an accredited library. And a fantastic institution. And their entire mission was put at risk by this Washington state law. And New Jersey, actually, passed an identically worded law, so we brought suit on behalf of the Internet Archive in both Washington and New Jersey.
CARDOZOAnd had the law declared unconstitutional. It impermissibly interfered with the Internet Archive's mission and free speech. Because with that sort of criminal liability, without an intent requirement, it would have made the Internet Archive's work untenable.
NNAMDINate Cardozo's a Staff Attorney with the Electronic Frontier Foundation. He joins us by phone for this "Tech Tuesday" conversation on bad behavior on the web. Annemarie Dooling is an online audience development strategist, former Senior Community Editor at the Huffington Post. She joins us from NPR studios in New York. Jeff Steele is co-owner and administrator of D.C. Urban Moms and Dads. He joins us in our Washington studio, along with Jules Polonetsky, co-chair and director of the Future of Privacy Forum.
NNAMDIYou can join the conversation by calling 800-433-8850. Have you ever encountered a troll at a website? Did you report the bad behavior? 800-433-8850. You can send email to firstname.lastname@example.org or you can go to our website, kojoshow.org. Ask a question or make a comment there. Jules, I don't think lawmakers anticipated things like revenge porn when they crafted the Decency Act in 1996, but this particular privacy intrusion on the web has gotten renewed attention, following the nude celebrity photos that were released last month. Can you give us an idea of what this problem is and how it's grown for websites?
POLONETSKYYou know, it's clear that whether it's a hacking attempt like this that seems to be really targeting, you know, women in an incredibly offensive way -- I called it a sex crime the other day. Some people said that's too strong. But I think showing our revulsion, that this isn't a cool, technical thing or this isn't a, you know, ha ha, let's all take a laugh at this, but really recognizing that it's a, you know, disgusting thing.
POLONETSKYWe clearly have this phenomenon where people get their hands -- here, hopefully it's pretty easy. These people got it there, you know, illegally. But we've got other cases where it's a bit of a challenge. A picture that you've taken, you've got some legal copyrights to. And so you can already go after a picture that you took that's posted online and say, hey, that's mine. You don't have the right to put that out there. I'm sorry that my ex or boyfriend or whoever it is took it.
POLONETSKYBut the challenge is there are pictures that other people take of you. It's their picture. They took it legitimately. And now the relationship is over or they're just, you know, cruel people and it's out there. We can leverage some of the current laws to go after those posters for intentional infliction of emotional harm. I think legal experts have really made the case that these are areas where the harm is so direct and so intentional that we can go ahead and get criminal action against those folks.
POLONETSKYBut look, the sites, you know, can and should step up. And, you know, you talked about people reporting. I think a lot of people think, well, someone else will report it or, you know, I reported it and I didn't see anything happen. So, you know, why should I do anything about it? It's important for listeners to understand that it's very often the combination of reports, you know. Anybody can report something they don't like. The algorithms usually, you know, can ignore that sort of thing.
POLONETSKYBut when a whole bunch of people say this is bad, the systems are usually pretty good at responding and taking things down. Sometimes it takes too long, but hitting that button is actually a good way to be civically responsible and show that you disagree with content and take policing into your hand.
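Polonetsky's description of how reporting works -- a single report does little, but reports from many distinct users push an item over a review threshold -- can be sketched in a few lines. This is an illustrative sketch under invented assumptions (the threshold value and function names are hypothetical); real systems also weigh reporter history, content type, and other signals.

```python
from collections import defaultdict

REVIEW_THRESHOLD = 5  # hypothetical: distinct reporters needed to escalate

reports = defaultdict(set)  # item_id -> set of distinct reporting user ids
review_queue = []           # items escalated for human review

def report(item_id, reporter_id):
    """Record a report; escalate once enough distinct users have reported.
    Returns True only on the report that triggers escalation."""
    reports[item_id].add(reporter_id)  # duplicates from one user don't count twice
    if len(reports[item_id]) >= REVIEW_THRESHOLD and item_id not in review_queue:
        review_queue.append(item_id)
        return True
    return False
```

Counting distinct reporters rather than raw reports is what lets the system ignore a lone user mashing the button -- which is why, as Polonetsky says, hitting that report button still matters even when nothing visibly happens right away.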
NNAMDIWhere are we in terms of legislating against revenge porn? And what are some of the complications?
POLONETSKYWell, look, there are those who argue that somebody took a picture, it's their picture, and they've got the right to disseminate it. And so, you know, we've seen law enforcement sometimes being shy about taking action. Wait, was this stolen? Was there a hack? Well, then why is it my business to get involved in, you know, your dispute with your ex? We see law enforcement perhaps not taking these seriously enough, not having the ability to go ahead and make the case, contact, you know, the intermediaries and pursue them in a real open way.
POLONETSKYI think if we start seeing some real obvious prosecutions, we'll see the people who perpetrate this, you know, becoming aware. Now, look, there are intermediaries who help shield these folks. Right? They don't disclose where they took the pictures from. And the law does allow -- there are a number of areas where we probably have states that need to make it more clear that these are examples of intentional infliction of emotional harm.
POLONETSKYThere have been a couple of good model proposals extended. And I think we can do this without violating free speech or without stepping on people's legitimate, you know, interest in expressing the content that they have.
NNAMDIGot to take a short break. When we come back we'll be continuing this conversation on internet creeps, bad behavior on the web. We're still asking you to join us by calling 800-433-8850. If you have already called, stay on the line. We will get to your call. Have you ever been a troll yourself at a website? Why did you do it? You can send email to email@example.com or shoot us a tweet, @kojoshow, using the hashtag TechTuesday. I'm Kojo Nnamdi.
NNAMDIWelcome back. It's Tech Tuesday. Should websites take away the ability to remain anonymous on the web? Give us a call, 800-433-8850. We're discussing bad behavior on the web. We're talking with Jules Polonetsky. He is co-chair and director of the Future of Privacy Forum, Jeff Steele is co-owner and administrator of D.C. Urban Moms and Dads, Annemarie Dooling is online audience development strategist and former senior community editor at the Huffington Post, and Nate Cardozo is staff attorney with the Electronic Frontier Foundation.
NNAMDIWe got an email from Brock, in Silver Spring, who says, "I think one of the main problems with the online comments is people who feel empowered by anonymity. Hiding behind a made-up name, or even 'Anonymous,' and an avatar provides people the shield to say whatever they want without personal accountability. If people were required to post their real names and their pictures, I believe the incidents of trolling would decrease.
NNAMDIThis would remove that shield and allow people to respond to the person, instead of a comment meant to enrage everyone else on the forum." Jeff Steele, I will start with you. How do you manage trolls at your website? Do you think that there should be anonymity? You do allow anonymity on your website.
STEELEYeah, over 90 percent of our posters post anonymously. Of course, that enables you to engage in anti-social behavior easier. But the other side of that is it allows you to hide from it a bit. It's hard to harass an anonymous person. You can maybe criticize a post of theirs, but you can't follow them from thread to thread. You can't chase them elsewhere around the internet. So this anonymity also allows a lot of protection from abuse. That's the other side that I think often gets lost in this kind of discussion.
NNAMDIAnnemarie, you say that taking away anonymity is not a realistic solution. Why?
DOOLINGNo, it's not. And as we just mentioned, taking away anonymity, it's harmful to the people who are actually having a civil discussion. The people who don't want to be bothered. So when we're talking about taking away anonymity -- and I think anyone who's spent more than 10 minutes with their family on Facebook can back this up -- what we're really talking about is that we want to unmask people and we want to make them liable for the things that they're saying online.
DOOLINGAnd if we back up just a minute and talk about moderation again, because I think a few people just mentioned that it's easy to moderate. It's sort of a nice, in-between from unmasking people to not having any comments. Moderation's difficult. Especially if you're a news site and your primary job is journalism or you're a small website or you're a small community. Moderation is very, very difficult and very expensive. And there are not a lot of tools right now that help people do this.
DOOLINGSo, for example, at the Huffington Post we had maybe 30 to 40 people whose job it was to look at comments. That was a team of developers, community managers, people who worked on our algorithm to filter comments, people who moderated comments. A huge, huge team. If I'm looking at something like Salon, where we have not quite the volume of comments, but quite a lot of active users, we have like two people who moderate comments. So it's a lot of time.
DOOLINGIt's very difficult and I think this is where we get into an area where people want to unmask and attach names. Because then we try to figure out, well, if we can't moderate, how can we actually stop people from speaking? And then you get into this weird area where you're actually trying to curate morals. And it's not something that's easy to do.
NNAMDINate, the right to anonymous speech on the web is a bread and butter issue for you at the Electronic Frontier Foundation. What's the danger that you see in removing anonymity on these web forums?
CARDOZOSo, from our perspective, anonymity is not just important, it's fundamental. We think that, you know, something like Facebook is free to adopt a real names policy and make sure that people's actual names are attached to everything they say. But that's just one small corner of the web. You know, our democracy has always valued anonymity, all the way going back to the Federalist Papers, which were published, of course, anonymously.
CARDOZOAnd anonymity allows people to do the difficult and brave work of democracy. It allows people to advocate for controversial viewpoints for gay marriage or for gun rights or for marijuana legalization or, you know, a Tea Party activist might not want to attach his or her name to everything he or she says online. Anonymity is really what allows our democratic society to float new ideas and to function.
CARDOZOSo absolutely forums that are seeing a problem or that want to be family-friendly, are free, of course, to reject anonymity. But you can't force a forum to reject anonymity. You cannot ban anonymity online -- at least not in this country.
NNAMDIJules, where do you come down on this anonymity issue?
POLONETSKYYou know, I think it's clear that we need locations where anonymity is feasible, whether it's people who can demonstrate against their government, like we've seen in the Mid-East or whether it's people who want to express identities that they don't want to be held to, but that are freeing for them. But I do think there's also a place -- and a healthy place -- for identifying speech, whether it's a forum that wants to maintain a certain discourse and thinks that having users identify will do so. Or where it's something like Facebook, where I want to know who I'm interacting with in a real world manner.
POLONETSKYI think it's important that we have both those there. I do think the challenge of large-scale moderation can't be overstated. It's very easy sometimes to say, well, how come Facebook didn't take that down or Twitter didn't take that down. We had hundreds of moderators around the world, 24/7, with technology at AOL, and we were still behind. And today, you know, the Facebooks and Twitters literally do have substantial teams. And there are always mistakes.
POLONETSKYSomeone reports something and says, hey, this is porn. And the moderator quickly looks and takes it down and it's someone breastfeeding. And all of a sudden the community of moms wants to know why is Facebook taking down breastfeeding pictures.
NNAMDISeen that one.
POLONETSKYOr, you know, you name it. Or just consider this, you know, in the early days at AOL, we didn't allow you to use the terms -- and I'll apologize -- but dyke or kike, these were considered very offensive terms. You couldn't register a screen name. And then we had people complaining and saying, well, wait a second, I'm using that term to express my identity.
POLONETSKYThis is who I am. I am proud dike. I am, you know, I'm reclaiming what someone use a pejorative. And so I said, okay, I'm going to update our moderating standards and I'm going to tell our moderators who are in the Philippines or in India or in Ohio that beware of these terms, these are bad and offensive terms. However, if they seemed to be used in a positive, self-referential manner, given the context of that conversation and the kind of chat room it's in, I mean, forget about it. You end up having, you know, mistakes. So I agree with Annemarie, this isn't easily done.
NNAMDIOn to the telephones. Here's Richard in Alexandria, Va. Richard, you're on the air. Go ahead, please.
RICHARDYeah, hi. I just wanted to give a defense on anonymity on more of a content side. I'm a -- I grew up in the internet generation and I have used forums, anonymous forums for the past few years. And one of the benefits of it is that you -- people -- that users aren't judged for past content. So if I post and it is not well received by the community, then I can just post something later, like five minutes later, without being judged on that. And that isn't to say that there aren't abuses, obviously. I just -- from a user perspective anonymity is, I mean, it's, I don't know, very good.
NNAMDICare to comment on that Annemarie?
DOOLINGI would agree with him, but I also think that if your content is not being well received by a community, then maybe it's just not the community for you. And as a side caveat to what I think the caller was mentioning, he's talking about identity and he's talking about clout within specific communities. So if we look at something like Reddit, where you can, you know, spend your time on Reddit, cultivate your content, cultivate who you're speaking to, and create your own persona there, those are the kind of communities where we get back into responsibility and people creating their own leadership.
DOOLINGAnd then we sort of snowball into this area of, are platforms responsible for who their self-professed leaders are? Are we then responsible, as a platform, on, let's say, Salon or Yahoo or the Huffington Post, for who the top commenters are if they're chosen by the communities themselves?
NNAMDIThank you very much for your call, Richard. Nate, Jules was mentioning earlier the policies at AOL that included not using certain words, except when people started using words like dyke proudly. Facebook is actually facing this anonymity issue right now because of its requirement that everyone use his or her real name on its service. Apparently that requirement is forcing longtime drag queens to reveal their identities online. Can you tell us about that case?
CARDOZOAbsolutely. So Facebook has always had a real names policy. In order to sign up for a Facebook account, you agree to the terms of service, that you use your legal name as your Facebook identity. You can have a different name for a page, rather than a profile, but to have a profile you need to use your real name. Recently, in the last couple of weeks, it seems like one or more people have been systematically reporting the names or the profiles of high-profile drag queens and other members of the LGBTQ community, mostly in and around San Francisco, but also nationwide.
CARDOZOAnd Facebook has been suspending their accounts. And in order to reactivate their accounts, Facebook requires them to submit identification. Facebook says it doesn't have to be government issued, but it does need to contain a date of birth. And that essentially means government issued, at this point. So in order for someone like Sister Rosa, a famous member of a performance art group here in San Francisco, called The Sisters of Perpetual Indulgence, to reclaim her Facebook account she had to submit her driver's license with the name Michael Williams.
CARDOZOSo now her Facebook profile says Michael Williams. She's willing to do that because she's very well known in the community and the name Michael Williams is not totally unknown to the people around her, but that's actually quite dangerous for a number of people in the LGBTQ community. There are real consequences to Facebook outing people as trans online. And I don't think that that can be overstated. Right? Anonymity or pseudonymity allows people of marginalized populations a level of protection online which really hasn't existed in the real world.
CARDOZOThese people are free to be who they are using a pseudonym online, which is actually sometimes the pseudonym they also use in real life. Or at least in portions of their real life. So Facebook is taking a lot of heat for this right now. And I think rightly so. And EFF, my organization, is adding its voice and calling on Facebook to -- if not completely abandon its real names policy -- to at least have some flexibility for people who do have more than one identity or whose public persona might not match their real name.
POLONETSKYYeah, well, I'm sympathetic to both sides here. I mean, on one hand, Facebook has long worked with celebrities who sometimes are known by a name that's not their real name, but frankly, you know, that's who they're known as. And, you know, they've been able to identify that, yes, this is, you know, such and such, Madonna or Lady Gaga or whatever -- I don't know if they're -- those are the right examples, but folks who are probably better known. The question is, how do you scale that to people who are, you know, using -- I guess, a celebrity name or maybe that's not the right term for it.
POLONETSKYBut, you know, a name that they've chosen to use as a real recognized persona so that others can interact with them. But how do you not drop the entire concept then? I already get all kinds of friend requests from, like, you know, people. I'm like, who is that? Do I even know this person? And, you know, I like being able to -- for my own privacy, make sure that the content I post is only available to people that I actually know, which ends up being broken down if everyone starts using some pseudonym.
POLONETSKYAnd so Facebook has been able to cultivate a certain identity and community that has enabled it to grow. Here I guess the question is how do -- how can they scale, how can they accommodate what clearly is, you know, a legitimate use of a name that is your identity, that you've assumed as an identity, that is a legitimate identity, but how do we come up with some framework that doesn't completely drop the entire ability of anyone to just completely become anonymous on Facebook, which would undermine sort of the value of Facebook to a lot of us?
NNAMDIAnother suggestion for dealing with the question of anonymity comes from Robert, in Fairfax, Va. Robert, you're on the air. Go ahead, please.
ROBERTHi, Kojo. Yeah, so a lot of forums have addressed this -- specifically Something Awful -- they charge a small fee when you initially register with the forum, that is designed to disincentivize trolls, who often take advantage of the free nature of most forums. Like, say, for example, $10 when you initially register. That makes it economically unfeasible to continuously troll, and then be banned, and then come back and pay another $10 just to be banned again. So the level of discourse is elevated within that kind of community.
NNAMDIJeff Steele, is that something you'd want to do?
STEELEIt's something we might have to consider in the future. Right now, at our scale, we're able to continue as we are. But I can see that our current practices will not scale forever. So that's a good idea. I hadn't heard that before, but that makes a lot of sense.
NNAMDIHow do you currently manage trolls or users who are abusive or aggressive at your site?
STEELEWe really depend on our users. We have a -- each message has a report button. And we really count on the other users to report inappropriate content, and then either my wife or myself will review it. We don't always agree that it's inappropriate. We leave a lot of messages there. But if we see something that's inappropriate, we'll remove it.
NNAMDIAs one of the few identifiable people on your site, which most people use anonymously, it is my understanding that you have been the target of some vitriol on your site.
STEELEYeah. Just about everyone who uses a username -- and that's one reason there are so few, I think -- becomes a target of vitriol. And I have too. I've had people wish harm on myself or my children. I've had people post about me on other websites and then try to post personal information along with that. None of it has really amounted to anything. I wouldn't consider it abuse, more just an irritant. But that's part of the territory.
NNAMDIGot to take a short break. If you have called, stay on the line. Robert, thank you for your call. If you'd like to call, the number is 800-433-8850. Does the Internet make it easier to truly express our feelings, even if they're offensive or happen to be mean? What do you think? 800-433-8850. You can send email to firstname.lastname@example.org or shoot us a tweet @kojoshow. I'm Kojo Nnamdi.
NNAMDIIt's Tech Tuesday. We're discussing Internet creeps and bad behavior on the Web with Nate Cardozo, staff attorney with the Electronic Frontier Foundation. Jules Polonetsky is co-chair and director of the Future of Privacy Forum. Jeff Steele is co-owner and administrator of D.C. Urban Moms and Dads. And Annemarie Dooling is an online audience development strategist and former senior community editor of the Huffington Post. You can call us at 800-433-8850. Annemarie, you suggest verifying people's knowledge as a way to control bad behavior on the Web. What do you mean by that?
DOOLINGYeah. You know, I take a lot of examples from what I'm seeing very open communities do. And one of my favorites is Reddit. So I really like to look at the different variations of groups on Reddit. Of course there are some subreddits that are absolutely horrible and are not areas that I identify with or want to have any relation to. But then I look at something like the science subreddit, which has actually gone out of its way to have the different community members contact them and verify their knowledge in some way.
DOOLINGSo whether that's an ID from their job, whether that's maybe a copy of their diploma, an email address that associates them with maybe a school or a business. And this is how they are elevating the conversations by people in the science field, so that you're actually having some form of conversation with people who know what they're talking about. And especially in fields like science, you can have a lot of problems around this. You mentioned before that Popular Science closed their comments.
DOOLINGAnd I think they were right to do that, because, you know, they have a journalistic integrity that they're trying to uphold. And they have random people leaving facts in these comment threads that they can't possibly fact check. And there was actually an article suggesting that reading these additional "facts" as caveats to the journalism above the comments can change your opinion of what you're reading in the actual article itself. So giving people some sort of method to say, okay, we're going to have a conversation about science. Let's see who we actually want to participate.
DOOLINGSo everyone can actually have a say. But, you know, curating voices is one way to really help the conversation stay on track.
NNAMDIIs that what happens at Salon.com, where I know you engage online with scientists about climate change, which can become a very heated discussion? Is that how you verify users there?
DOOLINGYeah. Exactly. So we've done it a little less intensely than I think the science subreddit does it. We actually started because we had a lot of users who were flagging posts in the sustainability section. So instead of looking at the people who were the outwardly troublesome -- people who were cursing, people who were yelling -- I actually contacted a few of them because I wanted to figure out why these people were being so upset about this coverage. And it turns out they weren't upset about the coverage. They were upset because there were quiet, pleasant trolls within the comments section who were leaving facts that were just not true, at least as far as we know with the science available today.
DOOLINGSo if I had just relied on flags or very shallow moderation, I probably would have been getting rid of the bulk of my commenters who were actually related to science and have a vested knowledge in this field and have an actual interest in furthering these conversations. So after a few emails with them, naturally having to calm them down a little bit and talk them through what was happening, we were able to host weekly open discussions with them. So now they lead the conversation. We trust them to flag and we trust them to self moderate.
NNAMDINate Cardozo, this email we got from Sheena. "On several occasions when using the Internet, I have encountered a great deal of racist hate speech on various message boards and comments sections. In Europe, these types of activities are highly frowned upon and occasionally lead to prosecution for the perpetrator. Why is that course of action more prevalent in Europe? And do you ever see this occurring in the U.S.? If not, what are other ways to combat trolls?"
CARDOZOSo the difference of course is the First Amendment. The European Union and the constituent states there do not have a system of free expression as robust as ours. In the United States, you do not have a right to have your feelings not hurt. And conversely, trolls do actually have a First Amendment right to say really mean, nasty things online. It's the old saying, right? I might not agree with your viewpoint, but I will defend to the death your right to say those terrible things that you want. This is why the ACLU defended the right of the KKK to march in Skokie -- or actually the Nazi Party -- anyway, the Skokie case.
CARDOZOYou know, in the United States, we've made the -- we've made the bargain, in the First Amendment, that the right to speak does not include a right to be heard and it does not include a right to be left alone. So a prosecution of hate speech or racist speech in the United States would run very quickly up against the First Amendment. On Twitter, right, you have a block button. You can block someone who's saying nasty things to you. You don't have to go on to a hate-speech website. You don't have to go to Stormfront or the KKK. And you don't have to read, right?
CARDOZOBut shutting down hate speech can get very problematic very quickly, because that substitutes one person's moral judgment as to what's good and proper for another's. So, you know, how do we strike that bargain here? We strike the bargain in favor of free expression.
NNAMDIJules, over-legislating the Web in order to maintain decent behavior certainly can get us into dangerous territory. Some people would say, just look at China. But it feels like much of the responsibility of maintaining clean, civil content is left to site moderators and the users themselves. Is that about right?
POLONETSKYThere's a danger in asking government bureaucrats, who have very different views perhaps, depending on where they live or what their own background is. You know, it's not trivial to look at what, as you mentioned, China or Russia or frankly a lot of countries around the world are doing, who in the name of civility are saying, don't rock the boat, don't incite violence, don't criticize the king or the queen or the current president. You know, I like using the example -- you've seen some airlines recently land because, you know, people are complaining about leaning back seats or, you know, getting into some sort of dispute. Or somebody's putting on some religious garment and they think, oh, my god, it might be terrorism.
POLONETSKYAnd you know, I'm sure they mean well. But when you ask, sort of, the government and the range of government views or bureaucrats, or even just people, you know, running an airline, to make nuanced decisions, one person's hate speech is another one's, you know, argument about what the future of the country ought to be like.
POLONETSKYIt is imperfect, but the government getting involved or locking people up -- unless we actually have danger, you know, intentional harassment, like revenge porn, like threatening to kill you, like persistently going after one person and, you know, doing things that verge on violence -- the tradeoff of free speech is that we sometimes have some really ugly, disgusting speech that we need to either ask intermediaries to deal with or ask users to deal with or ignore or have systems that vote up or down. It's not perfect, but there really is a true freedom principle here that we need to protect.
NNAMDIApart from your day job, it's my understanding you moderate your own listserv. Can you tell us about some of the challenges you've faced keeping discourse civil there?
POLONETSKYYou know, even though we're a small community listserv, potomacjewish.com -- it's really sort of serving the various synagogues or organizations that are affiliated in the area -- there are very different views. And I have a pretty strict set of rules. I say, look, this is just for local things. And my view of local is, it's got to be local. But somebody else's view is, well, wait a second. I've got to tell you what Obama is doing to ruin the country. Or I've got to tell you my opinions about Israel. Or that person's view is hateful and, you know, is anti this or anti-Jewish. And so, I have a perspective. And, you know, people sign up. And if they don't like it, they can create their own listserv or send mails to their friends.
POLONETSKYBut you get a lot of feedback. You get a lot of criticism. You end up being the bad guy. It's a thankless job. If you're doing it at a big commercial site, hopefully you get paid. If you're doing it, you know, like our friend here -- it's a labor of love and, you know, you can make some money. But the people on the front lines -- let's put our hearts out to anybody who's been the moderator at any one of these sites. You really try to do your best, but somehow you're too liberal, you're too conservative.
POLONETSKYAt AOL, we'd be accused of being Democrat or Republican or pro-choice or anti-choice, because of decisions we'd make at the same time, the same day sometimes, by, you know, about the same issue. And so it's a no-lose proposition and hats off to the people who really stand up and do this work.
NNAMDIHere's Anthony in Washington, D.C. Anthony, you're on the air. Go ahead, please.
ANTHONYYeah. To go back to the anonymity piece, I've seen a lot of my Facebook friends drop their last name and go to their first and middle names. And sometimes, you know, it catches me off guard, I don't know who they are until I start looking at their pictures or something and remembering. Is that a trend? Is it permissible? And why is it -- why are they doing it?
NNAMDIWell, let me tell you about a trend, Anthony. Or at least what Robin in Bethesda, who emailed us, said is a trend. Robin writes, "Every high school junior and senior across the country uses pseudonyms on Facebook. Beginning in the middle of their junior year, they know that colleges are looking them up. So everyone changes their names for about one year." This is what you have to look forward to in your future, Jules?
POLONETSKYIt's true, the research shows that colleges or employers or frankly everybody, right? I mean, who doesn't go to a meeting or meet someone new and, you know, you Google them or you check them out. You want to know what they look like. And so it's interesting to see how some of the young people are, you know, beating the system or taking control. They have a profile where they're friends with their parents and it seems to be pretty tame. And then they have the other profile their friends are connected to, where, you know, the real action happens. On Facebook, there's no way to indicate your best friends, and so people say they're married, you know, to someone.
POLONETSKYAnd, you know, that's fine. But they're not. They're actually, you know, teen girls and they want to indicate that they're their best friends. And so at the end of the day, we can plan. But users take control and sometimes, you know, turn these tools into what reflects the way they really want to live.
NNAMDIThat's on the lighter side. On the darker side, Nate, this conversation takes a darker turn when we talk about the responsibility of websites to moderate content like the videos that were recently posted of the two journalists who were beheaded by ISIS. Should websites make every effort to keep this kind of content from spreading on their platform? Should there be legal ramifications if they don’t?
CARDOZOSo I'll give you a lawyer answer to the first half of your question. It depends. Websites need to make a decision about what type of website they're going to be. Is this a journalistic platform or is this a family-friendly discussion forum? You know, if someone is posting beheading videos to D.C. Urban Moms and Dads, I would think that D.C. Urban Moms and Dads would have a very strong incentive to take that down. That said, if someone is posting a beheading video to ProPublica to talk about whether journalists should engage in an embargo on reporting journalist abductions, that's a very different story, right?
CARDOZOAnd that leads to the second half of your question. Should there be ramifications for a website that doesn't take these down? And I think the answer is a resounding no. There are absolutely legitimate uses for people posting really terrible conduct for the purpose of commentary, for the purpose of reporting and for the purpose of discussion about, you know, the causes of and solutions to the underlying conduct, the beheading itself. You know, beheading someone is a terrible thing and is illegal around the world. Posting a beheading video is not illegal in the United States, nor should it be. The uses of doing so frankly outweigh the problems associated therein.
CARDOZOOf course, where you post it is a different story altogether. And, you know, if I'm posting a beheading video to D.C. Urban Moms and Dads, I should face the social consequences of that, perhaps being banned from the forum. But if I post it to a Reddit discussion group that's talking about whether journalists are in danger around the world and how to solve that problem, it's a very different conversation.
NNAMDIAnnemarie, you've had this experience. So what happens when you ban people from commenting at a website?
DOOLINGWell, a lot of the activity, as I mentioned before, that people are doing that gets them banned is something that they might do in their normal life. And I mean, you can't moderate morals. It's something that a lot of people are proud of. And like we mentioned before with Facebook, you know, you'll see people with their avatars holding their babies and hugging their grandmothers. And they're just acting like monsters and screaming at each other in comment threads. So people are proud of their online behavior, or they're just not aware of what they look like from the viewpoint of other people.
DOOLINGSo when you ban them, it's almost like a personal attack on who they are. And I've actually banned people and had them send me Word documents of all of their comments or, as they call it, their contributions, because they do feel ownership in the website that they're commenting on. And even in some extreme cases, we've had people show up at the office to plead for their accounts to be restored. And just recently, we actually had someone -- not at the office I work in, thankfully -- go to a different office associated with one of these websites to look for me, to ask why he was banned.
POLONETSKYYeah, I think we need to recognize that these aren't just comments. These really are people's way of, like, their identity. I mean they might be mild-mannered office workers, but here they are, you know, king funny commentator and they're living out lives that they aren't living elsewhere. And it's part of what makes the Internet great. You can have an identity that is very different than who you are in real life, that has really built out. And people are very attached to it, for years and years.
NNAMDIJules Polonetsky is co-chair and director of the Future of Privacy Forum. Jeff Steele is co-owner and administrator of D.C. Urban Moms and Dads. Annemarie Dooling is an online audience development strategist and former senior community editor of the Huffington Post. And Nate Cardozo is staff attorney with the Electronic Frontier Foundation. Thank you all for joining us. Jeff, just quickly, in 10 seconds, is there a new technology out there that could help flag trolls and copyrighted photos and indecent material that you're looking at?
STEELEI'm not aware of any. I think that still stays with humans at this point.
NNAMDIThank you all for listening. I'm Kojo Nnamdi.
NNAMDIComing up tomorrow on "The Kojo Nnamdi Show," kids growing up in the military. WAMU's special correspondent, Kavitha Cardoza, explores the emotional and educational needs of soldiers' children. Then at 1:00, the vegan life. A growing interest in plant-based dishes is fueling a veggie-focused revolution in both vegetarian and omnivore kitchens. "The Kojo Nnamdi Show," noon till 2:00 tomorrow on...