The recent decision by Popular Science magazine to stop posting readers’ comments on its website raises new questions about why and how we post comments online. Tech Tuesday explores how comments influence readers and create communities and how algorithms and moderators decide whose comments get posted and rise to the top.
- Ashley Anderson, Professor, Colorado State University
- Stephen Roy, Vice President of Marketing, Disqus
- Kate Myers, Product Manager for Social Media, NPR
MR. KOJO NNAMDIFrom WAMU 88.5 at American University in Washington, welcome to "The Kojo Nnamdi Show," connecting your neighborhood with the world. It's "Tech Tuesday." First, the Huffington Post closed the door on anonymous commenters. If you want to comment, you'll have to verify your identity before you post. And Popular Science magazine shut down the comments section of its website altogether, saying negative comments make some people think its reporting is flawed.
MR. KOJO NNAMDINPR chose a middle path, moderating comments before posting them on certain sections of the website so it can weed out those that are inflammatory or off topic. Media outlets and other businesses are eager to cultivate communities of people who want to discuss their product, whether it's news or a blog or the latest fashion. The challenge is keeping the conversation both open and civil. As with any public discourse, websites don't want detractors to derail the debate.
MR. KOJO NNAMDIThis "Tech Tuesday," we're exploring how website owners are using algorithms, ground rules, and pre-screening to keep their commenting areas lively and engaging without letting trolls and spam bots hijack the conversation. Joining me in studio is Kate Myers. She is Product Manager for Social Media at NPR. Kate Myers, thank you for joining us.
MS. KATE MYERSThank you, Kojo.
NNAMDIJoining us by phone from Colorado State University is Ashley Anderson, Professor of Journalism and Technical Communication at Colorado State. Ashley Anderson, thank you for joining us.
PROF. ASHLEY ANDERSONThanks for having me.
NNAMDIAnd joining us from studios at KQED in San Francisco is Stephen Roy, Vice President for Marketing at Disqus. Stephen Roy, thank you for joining us.
MR. STEPHEN ROYThanks, Kojo. Great to be with you here, today.
NNAMDIYou, too, can join this conversation. Give us a call at 800-433-8850. Do you agree with Popular Science's decision to shut down online comments on its website? Do you post comments on websites? What motivates you to comment on something you've read? 800-433-8850 or you can go to our website and post a comment or ask a question. kojoshow.org. Send us an email to firstname.lastname@example.org or a tweet at kojoshow using the #TechTuesday.
NNAMDIAshley Anderson, you're co-author of a study that influenced the Popular Science decision to shut down its online comments. You found that someone who's already skeptical about the science presented in an article can become even more doubtful after reading negative comments about the article. Could you explain in more detail what the study focused on and what you found?
ANDERSONSure. So, we conducted an experiment, an online experiment exposing a national sample to a story about nanotechnology, which is very much an emerging technology. It's a largely unknown technology. And we found that people who already are supportive of that technology interpret the news story differently if they saw those uncivil comments. And we found that exposure to uncivil comments elevates the levels of risk or the perception that the technology is harmful among people who already have low support for that technology.
ANDERSONAnd we called this "The Nasty Effect."
NNAMDIIt seems to be not only negative comments, but nasty attack words that influence readers. Can you give us some examples?
ANDERSONSure. I'll try to...
NNAMDIYes, this is a family show. Go ahead.
ANDERSONAppropriate for the radio. But, so, in our experiment, we used words like, you're an idiot, and things like that, where we had the commenters -- the sort of pseudo commenters in the story that, you know, we made up -- attacking each other. And so we sort of thought of that as something that's deriding the opposition, if that makes sense.
NNAMDISteve Roy, what do you think about Popular Science's decision to end commenting on its website and the Huffington Post's recent decision to stop allowing anonymous comments?
ROYWell, I think, first of all, there's no one size fits all approach to not only enabling comments, but how you treat comments. In the case of Popular Science, you know, they may have a very valid case. You know, science, in many ways, is under attack. So, actually, I view their move as a reaction to the war on science, not necessarily a war on comments. So, the idea of simply shutting comments down on their site does not mean that they're gonna shut down discussion around their content or the science that they produce in that publication.
ROYSo, it's one measure that they can take to maybe remove some of the confusion, but it's not a silver bullet. And the other thing I would point out, more broadly about science and how science chooses to communicate with the public, is that they're quite challenged in their ability to communicate the validity, efficacy and value of what they're doing. The last 10 to 20 years in the science world -- that's been one of their number one challenges, whether it's climate change, biomedical research, GMOs.
ROYThat has been a major challenge for them -- getting their story through to the public. People don't value and appreciate science in the way that they used to. You know, 50, 60, 70 years ago, scientists were rock stars and their word was inherently accepted as truth. And 50, 60, 70 years ago, you could name five, ten top scientists that you knew in the world. Today, you know, the Nobel committee came out with their Nobel Prize winners for physics, and I could hazard a guess that maybe one percent of the world population could name the two gentlemen who were awarded prizes today.
ROYSo, I don't see that as a war on comments, necessarily, but a broader issue the scientific community has in communicating with their public. I'd also like to point out that I took another look at their statement, and they did give themselves a caveat where they said they would enable comments in certain select cases. And that may be the right approach for them. There may be certain pieces of...
NNAMDIWhat do you think about the Huffington Post approach? What's the difference between posting a comment anonymously and having your identity verified first? We got a tweet from Jason who said, "Posters should be required to use their real names. Anonymity lets people say anything without consequence." Steve?
ROYRight. So, we hear that a lot at Disqus. Disqus is used by three million websites around the world, so we have a critical mass of evidence to look at, whether or not things like verified identity actually have a marked improvement on the nature of discourse. And actually, quite the opposite is true. Facebook came into this space a couple years ago, and of course, Facebook comments, it forces you to use your verified identity. And on sites that use Facebook, there was no appreciable decline in the number of either nasty or controversial comments.
ROYWe, as a company, actually firmly believe in the power of pseudonymity. We believe that there is a place for people to not have to use their verified identity in order to say things that they could not otherwise say in the public sphere. And that could include folks that are whistleblowers, those could include folks that simply don't want their employer to discover a particular view they have. And there are more benign issues, or reasons why people may wanna not use their real name.
ROYOnline communities, if you're in one, it feels like it has a personality, and some of those personalities, some of the tone of those sites are gonna differ from site to site. For instance, we were at the Leading Mom Blogger Conference earlier this summer in Chicago, and in that community, when this issue came up of anonymity, they said absolutely not. Real names and photos were necessary to be accepted by that community.
ROYIf you look at folks that maybe are talking sports, you know, using a pseudonym or a nickname actually makes the experience more fun. I mean, if I'm on a Green Bay Packers fan website, I may have more fun, I may be more accepted by the community if I use the handle the ghost of Brett Favre versus my real name, where people can track that I'm from Boston, and they may question why is a guy from Boston commenting about a football team in Wisconsin?
ROYSo, there's no one size fits all. But, ultimately, we believe the requirement of verified identity will reduce the volume, and we actually think the quality, of discussion. We did our own piece of research about a year ago into the issue of pseudonymity, anonymity and real identity. And our data showed that 60 percent of comments using some form of a pseudonym actually had a positive signal. And positive signals, for us, we can measure in terms of the number of replies.
ROYBut also the number up votes. We enable people to vote on comments as kind of a sorting mechanism. So, that's the take I'd have on that.
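The sorting mechanism Roy describes -- replies and up votes as positive signals that push comments to the top -- could be sketched roughly as follows. This is a minimal illustration in Python; the class, the weighting, and the function names are hypothetical, not Disqus's actual ranking algorithm:

```python
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    upvotes: int = 0
    downvotes: int = 0
    replies: int = 0

def engagement_score(c: Comment) -> float:
    # Hypothetical weighting: a reply suggests the comment started a
    # conversation, so it counts a bit more than a bare upvote.
    return c.upvotes - c.downvotes + 1.5 * c.replies

def sort_comments(comments: list[Comment]) -> list[Comment]:
    # Highest-signal comments rise to the top of the thread.
    return sorted(comments, key=engagement_score, reverse=True)
```

Real systems typically also factor in recency and commenter reputation, but the core idea is the same: community signals, not just chronology, decide what readers see first.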
NNAMDIStephen Roy is Vice President of Marketing for Disqus. He joins us from studios at KQED in San Francisco. Ashley Anderson is a Professor of Journalism and Technical Communication at Colorado State University. She joins us by phone from Colorado State. In our Washington studios, Kate Myers, Product Manager for Social Media at NPR. We're inviting your calls at 800-433-8850. Do you think websites should moderate the comments people post, or just let the conversation flow unrestricted? You can also send us a tweet at kojoshow. Kate, about a year ago, NPR decided to moderate comments on certain parts of its website, meaning a moderator reads the comments when they come in and decides whether to post them. What prompted that decision?
MYERSSo, we've been moving over the last five years, with our commenting system, towards a little bit more moderation. We started off with allowing comments to be posted without being reviewed by a moderator. And we found that there was a large volume of comments that violated the rules, and that the time and energy it took for us to review all of those, and the negative impact that having those comments published had on the conversation, was significant.
MYERSSo we've been taking a look at ways that we can both positively encourage comments of quality, so good quality comments to enter the conversation and then influence the conversation, and also ways that we can prevent some of the damage that comes from comments that violate the rules. So going back to what Stephen had said, though, every community has a personality. Every community has a certain amount of rules. And the rules that we set on npr.org go toward fostering the kind of community that we want to encourage and the kind of conversation that we want to encourage around our content.
MYERSSo we try to make sure that we can have a civil conversation that allows people's voices to be heard while still keeping the conversation on topic, not allowing people to make personal attacks, and really allowing people to engage on the content of the...
NNAMDIInquiring minds want to know which sections have moderated comments and which don't.
MYERSSo, right now, on NPR's site, we do moderate comments prior to publishing on all of our news, which includes our international news, most of our science coverage, most of our arts and life and music coverage. And books are moderated after they are published, only when someone flags an inappropriate comment. We draw the line somewhere between science as science -- climate science -- and food as science. Food science.
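The section-by-section policy Kate Myers describes -- pre-moderation for news and most science, arts, and music coverage, and flag-driven review for books -- amounts to a routing table. Here is a hypothetical sketch in Python; the section names and the helper function are illustrative, not NPR's actual system:

```python
# Hypothetical routing table based on the policy described on air.
MODERATION_MODE = {
    "news": "pre",           # reviewed by a moderator before publishing
    "science": "pre",
    "arts-and-life": "pre",
    "music": "pre",
    "books": "post",         # published first, reviewed only when flagged
}

def needs_premoderation(section: str) -> bool:
    # Default to the stricter mode for sections not explicitly listed.
    return MODERATION_MODE.get(section, "pre") == "pre"
```

The appeal of a table like this is that the editorial policy stays in one place, so drawing the line between "climate science" and "food science" is a one-line change rather than a code change.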
NNAMDIOn to the telephones. Here is Chris in Chicago, Illinois. Chris, you're on the air. Go ahead, please.
CHRISHi, thanks for hearing my questions and comments. I'd like to ask all of the guests what they think of the idea that attack comments and negative comments online can be used by individuals or groups who wish to disrupt the discussion. Miss Anderson's research, as I understand it, showed that those kinds of comments do, in fact, drive people away who have serious contributions to make. And the analogy in the real world is to -- some of the people who showed up at the town meetings for health care with guns and to disrupt and to shout down people. I see a very clear parallel between...
NNAMDIOkay. Allow me to have first Ashley Anderson respond, and then our other panelists. First you, Ashley Anderson.
ANDERSONSure. So, I think that, you know, looking at comments online and the incivility that's happening online and comparing it to offline discussions -- it's important to think about it in this larger context. So, typically, in the past, people who wanted to contribute to an environment of contentious politics might have been activists who were going to things like town halls. Now they have a new forum for those outlets. There are stimulating, heated discussions in online sources, which are potentially seen by a broader range of people.
ANDERSONAnd those are subsequently having an effect on how people are interpreting the topics in the news that they're seeing. And potentially the news itself.
NNAMDIThat's the part that I actually find fascinating, because the parallel that our caller Chris drew to people who go to a function in person in order to disrupt it -- I don't know if any studies have been done, but there has never been any significant literature about how those people might affect the opinions of people who show up at the same town hall meeting. The impression I get from your study is that this does have an effect on how people think about the issue itself. Is that correct, Ashley Anderson?
ANDERSONSure. That's correct. And I think that, you know, one of the broad takeaways from our study is that this new integrated environment where online media combines news stories alongside discussions of news consumers -- that new integrated environment has the potential to change the meaning of the story and how people think about issues in the news. But then there are also some things to consider there where it may be certain predispositions that people bring with them to reading that news story and reading those comments, you know, affect their interpretation differently.
NNAMDIYour turn, Kate Myers. Same question.
MYERSWell, what we've been trying to do at NPR is sort of the flip side of that. Not so much focused on the disruption that negative comments cause, but really trying to make and hold the space for people with expertise to participate in the conversation. And what can you do to really focus on comments from people who have something great to add to the conversation? So, in that case, you need to discourage those uncivil comments that we're talking about in order to really make a good safe space to have that good expert conversation.
MYERSWhether it's the author of the piece, someone with a personal story to share, or someone else with some external expertise that they can bring in and impact the conversation.
ROYSo, one of the main mistakes we see some website publishers make is letting those bad, inflammatory comments go unchecked. And the analogy I make is somewhat similar to the broken windows approach to community policing. Where if you let small symbols of injury or assault to a public space -- and comments, in some ways, are a public space -- go unchecked, it sends the message to others that this space is open ground. That we don't fix the stuff that's broken, we don't kick out the people that misbehave, and that actually becomes a self-fulfilling prophecy where you get more and more of those...
NNAMDIBrawls allowed here.
ROYRight. Right. It does. It does. So, the importance of active participation from not only the moderator, but the community itself -- and that's another phenomenon that we see across Disqus, is a certain "Cheers" effect starts to take place in really active communities. And what I mean by that is there typically are three or four regular visitors to the site, who are posting or interacting four to five times a week, and there's something about that dynamic that actually turns those people into, essentially, community beat cops.
ROYWhere they themselves are policing those comments, which has a completely different effect than the moderator or the editor themselves knocking down what someone may have to say. And then finally, you know, one point I'll make is, and we're starting to get into the issue of trolls and people that are there just to say nasty things. You know, we define a troll as anyone who posts a comment really with the sole intention of provoking a response from others, any response.
ROYAnd when you respond to them, you give them exactly what they want. So, what we teach people is, A, ignore them, if the substance is not offensive or off topic. If it is completely offensive or off topic, we instruct people to delete the comment and also email the reader with the reason why. Because sometimes they may not understand that, hey, this is a place for fact-based discussions, or whatever the conditions you'd like to put around that conversation may be. But there are simple steps that you can take to really soften the blow of one or two comments that, like a bad apple, tend to stick out if they're not thrown out of the basket or pushed down farther in the basket.
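Roy's two-step rule -- ignore mere provocation, but delete offensive or off-topic comments and email the author the reason -- boils down to a small decision function. A sketch, with the judgment calls (what counts as offensive or off topic) left as inputs rather than guessed at:

```python
def moderate(comment_text: str, offensive: bool, off_topic: bool) -> str:
    """Apply the troll-handling steps described on air: starve mere
    provocation of a response; for comments that actually break the
    ground rules, remove them and tell the author why, so they learn
    what the space is for."""
    if offensive or off_topic:
        return "delete and email the reason"
    return "ignore"
```

The interesting part is precisely what the booleans hide: deciding whether a comment is offensive or off topic is where human moderators, community flagging, or classifiers come in.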
NNAMDIAnd on that note, I have to take a short break. When we come back, we'll continue this conversation about managing online comments. But you can still join the conversation right now by calling 800-433-8850. How do you feel about requiring people to verify their identity before they post comments on a website? 800-433-8850. Send email to email@example.com or send us a tweet at kojoshow. I'm Kojo Nnamdi.
NNAMDIIt's a "Tech Tuesday" conversation on managing online communities with Ashley Anderson, Professor of Journalism and Technical Communication at Colorado State University. Kate Myers is Product Manager for Social Media at NPR. And Stephen Roy is Vice President of Marketing with Disqus, a company that makes software to manage online comments. And when we took that break, Stephen Roy was talking about trolls. I'd like to hear both you Kate Myers and Ashley Anderson on the subject of so-called trolls, a common problem on commenting sites. How do you describe trolls, and what do you do to deal with them, Kate?
MYERSSo, at NPR, we talk about trolls as really anyone who posts something to provoke a reaction and really try to take control of the conversation. However, a lot of people in journalism, when they publish something, are trying to really make an impact on the conversation. So, are opinion columnists trolls, trying to help drive that particular conversation?
MYERSSo, in our community rules, where we try to minimize the way that trolls can take over and really redirect the conversation, we ask people, again, to starve the trolls, essentially. Not give them the reaction that they want, but also to make sure that they keep the conversation on topic and focused on issues, not on people and personalities. So, as Ashley pointed out in her research, "you are an idiot" is something that would violate our rules, but "what you are saying doesn't make any sense, and here's why" is actually a part of the conversation that we'd want to encourage.
NNAMDIAshley Anderson, anything in your study about trolls?
ANDERSONWell, so there are varying definitions of what incivility is and what trolling is in the scholarly literature. And, you know, again, as I mention in our study, we really thought of incivility as when a commenter derides another entity, whether that be another commenter or the story itself or an actor in the story. And so, you know, as Kate mentioned, there may be several reasons why people are posting nasty comments such as these.
ANDERSONYou know, for instance, she mentioned how it basically enhances that conflict that's in the story or in the discussion that's happening, and so they're hoping it's gonna draw people in and provoke even more response, as she mentioned.
NNAMDIOn to Moaz in Hyattsville, Maryland. Moaz, you're on the air. Go ahead, please.
MOAZHey, thank you, Kojo, for taking my call. I just want to pose this subject: if we allow people to moderate the comments and there are no rules, I mean, we may go down the road of censorship. I mean, we'll say we need to moderate this because it's hurtful, it's rude. And then we end up censoring the whole point the person wants to make, and if there are no clear rules, that can be abused. I mean, I think we need to establish the rules and then we can go and, you know, talk about this.
NNAMDIKate Myers, Moaz suggests that you are on a slippery slope unless there are clearly established rules.
MYERSI actually completely agree with Moaz. Every community has to have clear rules, whether they are written or unwritten. And those rules have to be enforced consistently. Those rules can be enforced by the community at large or by the authorities -- the moderators and the hosts of the community. Every community, whether it's npr.org, whether it's newyorktimes.com, whether it was the comments that were open on Popular Science, even Reddit and 4chan, all have their own community rules that are enforced in their own way.
NNAMDIMoaz, thank you very much for your call. Steve, you're…
ROYYeah, I just had one more point about that, just to echo what Kate was saying, is that every online community has its own social structure, just like there are social constructs offline, whether it's, you know, when you know you're at a dinner versus at work or at a meetup or at church, you're gonna behave slightly different, simply because of the conditions and the expectations that are set in each of those settings. And again, that's why it is incumbent on every single online community to make clear what those expectations are and what they're not.
ROYAnd again, that has the effect of really steering conversation in different ways. And if it's left unaddressed, it creates a vacuum. And most vacuums are filled by either nothing or typically bad things. So, the importance of filling in that gap and saying, hey, this is a place for x, y, and z, but not a, b, and c is really, really important. Cause in some ways, you know, the technology today allows for more than our culture is willing to reject. And so the culture is really in the process of catching up with what the technology can allow.
NNAMDIStephen, your company, Disqus, makes a free software embed that manages comments for the websites that choose it, including ours at this show. What percentage of people who visit websites view the comments, and how long do commenters typically spend on a website?
ROYSo, according to our data -- and we look at somewhere around eight billion page views a month across all the sites that use Disqus -- we see that close to 60 percent of the time, people are at least viewing comments or leaving a comment on any given page. And that really is a big, big change, because most website creators and publishers really viewed anything below the article as kind of a nice-to-have and almost a throwaway. Literally and figuratively, the basement of the webpage.
ROYSo, you're seeing tremendous amounts of time and engagement being spent there, and on average, for us at Disqus, we see over six minutes spent, in terms of people reading the comments, voting on comments, and also leaving a comment as well. So, what that is starting to do is it's starting to change the perception of comments as kind of a wasteland for publishers, where it's a thing I have to manage and not necessarily in a great way. It doesn't make me feel good, because most of the comments, if they're left unattended, can either be of little value or negative.
ROYBut what we're seeing is that more publishers are looking for ways to make money off of that part of their sites, because, by definition, people that are commenting -- leaving comments, reading comments -- are more engaged; they're in a conversation. And that actually makes them more attractive for advertisers, as well. So, we're seeing that shift not only in our business, but in other companies that play in the comment space, like Gawker Media, for instance.
NNAMDIIs that also an indication, Steve, that comments bring traffic to a lot of these websites, even though people like me deny reading them, but read them all the time?
ROYAbsolutely. Absolutely. I mean, you know, there's a real popular buzz word in social media and digital media companies of user generated content. And that's essentially the audience writing back to you. And that's exactly what comments are. They are an additional form of content, and if more and more publishers can view them that way and treat them that way, in terms of curating them, moderating them, doing a better job of cultivating the folks that leave comments, they can actually become something that is really distinct.
ROYAnd actually makes your site stand out from others, because, you know, unfortunately, in the publishing world, we live in a space where creating original content that stands out is becoming really, really difficult. But a community, that conversation -- you just cannot inherently clone that. 'Cause it forms organically and you can't fake it. And I hear that from more and more readers and people I talk to. My cab driver on the way over here -- you know, I told him what I was doing today, and the first thing out of his mouth was, I love reading the comments. Sometimes they're better than the articles.
ROYAnd we hear that time and time again. But I also think it's symptomatic of a change in the expectations that an audience has. And I actually think this started with call in radio, where the idea was that you, the viewer or the listener, could participate in the conversation, and that's exactly what web comments are. They're the audience looking to participate in the conversation, and at many times, try to rewrite the article, in some senses.
NNAMDIOn to John in Alexandria, Virginia. John, your turn.
JOHNYes, hi. I belong to a gaming community wherein we discuss games. And because we (unintelligible)
NNAMDIOh, you're breaking up on me, John. Are you in a stable position?
JOHNUh, no, I'm not. I'm driving.
NNAMDIOkay, but, try, okay. Let's see if we can hear you. You may have to pull over. But go ahead.
JOHNOkay, I'm pulling over now. So, I belong to a gaming community wherein we discuss games, and because we have no other mechanism to talk, we do a pretty good job of policing our own threads of discussion. Now, because of that, we actually have threads of discussion that are pretty good, and the issue becomes that, to be part of that conversation, you have to read through all of it, so community members have volunteered to actually curate the content.
JOHNAnd what actually happens is the quick points from the comments get added to the content itself so that not everyone has to read the entire chain (unintelligible)
NNAMDIYou're breaking up again, but you said enough so that we understand exactly the point you're making. I'd like to hear you on that, Kate Myers, especially how do you reward good quality comments?
MYERSYeah, we found that one of the best ways to encourage good quality comments is to do exactly what John just said. Include them as a meaningful part of the content itself. So, in that case, you know, in NPR's context, we look at it as, we look for additional stories that might come out of the comments. We may ask people to tell us their personal story and include that in an additional piece of content. And we really, when we take a look at this, we're looking at ways that we can use the comments to illustrate or to really add to the journalism that we're doing.
MYERSBy putting that level of -- by saying that we value your contribution so highly that we're gonna incorporate it into our content, we tend to get a higher quality of user generated content.
NNAMDIAnd what are the advantages of having a reporter, or the author of a piece or a post, respond to the comments his or her post generates? How does author participation change the dynamic of commenting?
MYERSIt really adds a whole other dimension, both to the article itself and to the conversation. Because when the people who participate in the conversation know that the expert who has written the article is going to get in there, and actually is listening to what they have to say, and is willing to respond and talk back to them, it definitely increases the quality of conversation -- we know that, just because we have to delete fewer bad-apple comments there.
MYERSAnd we know that it definitely -- our authors that choose to engage in that tend to be more, tend to report good things from there, that they appreciate the conversation that they engage in. And it really adds back into the journalism.
NNAMDIComments on that, Steve?
ROYWell, from the user's standpoint, it's actually just a real buzz. It's a real thrill to have the writer themselves respond to you. So, there's a motivation there and there's a real value back to the user in seeing that. But for the publication itself, again, it sends the message that this isn't a vacant lot, that people are watching, and that someone with real expertise in the subject -- i.e., the person that wrote the article -- is willing to engage in the discussion, in some cases take on criticism, and -- to Kate's point -- actually look for potentially new sources of content.
ROYWe, you know, recently did an independent study of our user base and asked them what were among their main motivators for contributing. And one of the top three was to contribute something that they felt the author had missed -- and not necessarily in a negative way, like, hey, you should have said this, but, hey, this is something that you may want to look at in the future.
ROYSo there's a lot of benefits to it, and it's, again, one of the basic best practices that any website could undertake to improve the quality of the discussion. And, again, there's tremendous value back to the outlet 'cause they're getting a better sense of who their readership is, who the experts in the readership are, and how to re-approach those folks for further engagement. And sometimes the best discussions -- oftentimes the best discussions are between the authors and the readers themselves.
NNAMDIAshley Anderson, we got this email from Chris in Columbia, Md.: "There's considerable evidence that certain organizations are either using automation or are paying individuals to post comments on issues such as climate science." Did your study uncover or indicate anything of this nature?
ANDERSONNo. We -- you know, we haven't looked at that in particular. I do know of -- there are organizations out there that are working on actually sort of countering misinformation that's out there about climate change. And so there are tools that are, you know, that basically, you know -- we started this conversation by talking about how, you know, this is potentially a war on science.
ANDERSONAnd so then the question becomes, you know, how do we produce quality and factual discussions in these comments sections while keeping it in mind that science is an uncertain entity in and of itself and also, you know, respecting people's opinions? And so I think it's certainly a challenge. But, no, we didn't, you know, specifically look at that question of people organizing around the comment section.
NNAMDIHave you run into that issue at all, Kate Myers?
MYERSWe've seen it not so much on the climate side of things, but we've definitely seen campaigns organized both coming out of Iran and Syria looking at some of our international news. We see it on topic -- on really a wide range of topics based on, you know, just based on how the organization comes out. They're usually pretty easy to sniff out. You can usually take a look at what the person has posted in the past or what -- if the comment is repeated on other places on the Internet, it's easy enough to see that it's a campaign.
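The giveaway Myers describes -- the same comment text showing up again and again -- can be sketched as a simple duplicate check: normalize each comment and count repeats. This is an illustrative example only, not NPR's actual tooling; the function names and threshold are invented.

```python
from collections import Counter

def normalize(text):
    """Lowercase and collapse whitespace so near-identical copies match."""
    return " ".join(text.lower().split())

def find_campaigns(comments, threshold=3):
    """Return normalized comment texts posted at least `threshold` times."""
    counts = Counter(normalize(c) for c in comments)
    return [text for text, n in counts.items() if n >= threshold]

comments = [
    "Vote NO on the bill!",
    "nice article",
    "VOTE NO ON THE BILL!",
    "  vote no   on the bill! ",
    "ok",
]
print(find_campaigns(comments))  # prints ['vote no on the bill!']
```

In practice a real system would also compare against comments seen elsewhere on the web, as Myers mentions, but the core idea is the same: coordinated campaigns leave a repetition signature that organic conversation does not.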
ROYWe see some of that. In other forms, the culprits are sometimes people trying to make a name for themselves, whether they're selling a product, so they're pitching and throwing links to their company into a comment feed. And that type of stuff very quickly gets snuffed out by the community, and they get voted down or essentially shamed out of the forum. So it's definitely something that's there. It's occasional. But a lot of the tools that we've built let you quickly block folks like that and find them.
ROYAgain, I think the case in point, you know, we enable people to see a history of their commenting behavior. So with a quick look at where they've contributed in the past -- whether it's been of substance or a repetitive refrain about a product pitch or a certain, you know, stump speech they have about a political issue -- it's easy to build a bit of a profile on that person and moderate them accordingly.
NNAMDIGot to take a short break. When we come back, we'll continue our Tech Tuesday conversation on managing online comments. You can call us at 800-433-8850. Have you ever been influenced by a comment you read on a website either positively or negatively? Send us a tweet, @kojoshow, using the #TechTuesday, or email to firstname.lastname@example.org. I'm Kojo Nnamdi.
NNAMDIWelcome back. It's a Tech Tuesday conversation on managing online comments. We're talking with Kate Myers, product manager for social media at NPR. Stephen Roy is vice president of marketing for Disqus, which is a company that makes software to manage online comments. And Ashley Anderson is professor of journalism and technical communication at Colorado State University. We're inviting your calls at 800-433-8850. Here now is Tom in Washington, D.C. Tom, you're on the air. Go ahead, please.
TOMYes. Good afternoon. I think there's a real possibility for censorship when a site like NPR uses a forum like Disqus to curate comments, A, because the community rules are often vague, and, B, there's often no recourse if a comment is denied. I know at NPR, in my own personal experience, the ombudsman has decided not to get involved if a comment is not acceptable to Disqus. And I just think, you know, the curation could be limited to the so-called seven dirty words and let free speech play out the way it normally does. Thank you.
NNAMDIOK. First, you, Kate Myers.
MYERSSo I would take issue with the fact -- I will say I don't believe that at NPR, we have no resource for -- or recourse for some of those comments. The way that we take a look at community management is that we are hosting a conversation and publishing it on our site, so we believe that it is not censorship to set the rules of what we're going to publish on our site. We want to encourage a conversation that is on topic, that is on point, and that really gets at the heart of the conversation.
MYERSThat said, there are different standards for conversation based on where the conversation happens. So we know that on our Facebook page, we are more likely to let the community handle itself because we are publishing onto Facebook's platform. You take a look at a community like Reddit where the conversation is the main part of the content there, and that conversation happens with those moderation rules that are set by that particular Reddit or sub-Reddit.
MYERSSo we don't go in and do any kind of moderation on -- we don't do any really serious moderation on places that aren't ours because on our site, that's really where we want to encourage -- that's where we are hosting and publishing those comments.
NNAMDIAshley Anderson, a lot of people equate unrestricted online commenting with the First Amendment right to free speech. What are your thoughts about freedom of expression in the realm of online commenting?
ANDERSONIt's certainly, you know, a fine balance that we need to strike where we don't -- you know, we want to maintain that freedom of speech while also having productive conversations that, you know, offer free and frank expression for both sides of the issue but without people taking other people down. And, you know, I -- it's definitely a fine balance.
ANDERSONAnd I think that, you know, every community handles it differently. And, you know, I think we need to start thinking about journalism as a conversation. And we've been talking a lot about that today already, but thinking about how we can engage quality conversation while sort of minimizing those other non-productive comments.
NNAMDII don't know if Tom is still on the line. Tom, are you still there?
TOMOh, yes, I am.
NNAMDII'd like to read to you, Tom, a couple of emails on -- an email and a tweet we got. This email is from Lisa. She said, "I used to comment a lot on many webpages, and I will still occasionally read comments on some pages or make the odd comment here or there. But because the tone can get so nasty, vituperative, racist, and sexist so quickly on pages without a dedicated monitor, I gave up.
NNAMDI"Plus, it's really easy to get into an argument with someone about the most innocuous things or have someone call you horrible names, cast doubt on your parentage and worth as a human being, and it can be really upsetting. Often, it isn't worth it. It's a shame because there are some pages with great discussions, and I used to have really interesting dialogues with people. But I've been burned and had my feelings hurt one time too many. It just isn't worth it."
NNAMDIAnd this email we got from Kelly: "The Washington Post does not moderate its comment sections, and it's awful. It's the land of trolls. I think it does a disservice to the writers and the paper. I avoid the Post website because of the poorly moderated comments." What would you say in response, Tom?
TOMWell, I don't deny that that can be a problem. I guess the issue is, you know, one person's troll is another person's valid dissenter. And I just think that the rules are vague, and deciding who's a troll or not is just left up to, you know, an employee at Disqus who, you know, may agree or not agree with the position being taken. And I think it's too loose, and I think it constrains a lot of speech.
NNAMDIYeah. But what would you say...
ROYIf I could...
NNAMDIWhat would you say is the alternative, Tom, just letting everyone have their say regardless and just eliminate the seven words?
TOMOh, no. I think the way -- I think there is an element in the way that Disqus does it which is good in that, you know, comments get kind of ranked according to, you know, whether they are liked or not.
TOMAnd you can read the first five or 10 comments. You don't have to go to number 187 that's a troll comment. And that works fine.
NNAMDIOK. Thank you for your call. Here's Stephen Roy.
ROYYeah. If I could just correct something, Disqus does not moderate comments on the sites that use our service. We give...
NNAMDIWell, I'm going to ask you to explain exactly what Disqus does because website owners can set public rules for commenters. But they can also set private filters through Disqus and other software. How does Disqus enable users to catch and set aside comments with certain keywords or phrases, like ones that start with the word you? How does that work?
ROYSo, for instance, there are blacklists and whitelists, and so whitelists means basically, if someone's commented once and it was a quality comment, you could basically say to them, you get a speed pass. Your comments get automatically posted. A blacklist is exactly the opposite. Someone who's maybe said something negative in the past, their comment is held in a filter.
ROYThere are also restricted words that any publisher can set up. In the case of, you know, the phrase "you are," as was said before, if you're starting your sentence off with those words, it's generally a sign that something bad is about to precede it or follow it. So you could set up words like that or any number of words, and it's all dependent on the site.
ROYBut just to be clear, Disqus is not playing the role of editor and publisher across all the sites. We simply give each site the tools to create those conditions on their own. They moderate the sites. They approve comments. We don't play a role in that.
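The triage Roy outlines -- whitelisted commenters get a "speed pass," blacklisted ones are held in a filter, and comments hitting restricted phrases wait for a moderator -- can be sketched in a few lines. This is a hypothetical illustration, not Disqus's actual code; the function and rules are invented.

```python
def triage_comment(author, text, whitelist, blacklist, restricted_phrases):
    """Return 'publish', 'hold', or 'review' for an incoming comment."""
    if author in blacklist:
        return "hold"              # held in the moderation filter
    lowered = text.lower()
    for phrase in restricted_phrases:
        if phrase in lowered:
            return "review"        # e.g. comments containing "you are"
    if author in whitelist:
        return "publish"           # "speed pass": auto-posted
    return "review"                # unknown commenters wait for a moderator

# A site that flags comments containing "you are":
print(triage_comment("alice", "Great piece, thanks!", {"alice"}, set(), ["you are"]))
# prints "publish"
```

The key design point from the discussion is that each site, not the platform, supplies its own lists and phrases, so the same machinery yields very different moderation policies from site to site.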
NNAMDIExplain the algorithms Disqus uses to sort and score comments. Why are the first few comments a viewer sees so important?
ROYSure. So I think Tom, the caller, was referring a little bit to this. Typically most commenting feeds are sorted by chronology. And typically the first one, two, or three comments tend to not be that great, and they're also the first thing other commenters see. And so it kind of conditions the mindset of what that dialogue is about. So if there are bad comments there, there's going to be more bad dialogue.
ROYSee, we actually have a bit of a filtering system, and it's a sort order we call Best. And what Best means is essentially we're using an algorithm on the back end where we're not only counting votes -- upvote and downvote ratios -- but we're also looking at the reputation of that user, whether or not they've been flagged for spam or inflammatory language in the past.
ROYAnd what we do is we simply float those comments to the top of the viewing, and we float the lower-quality comments to the bottom. So, again, it's not a perfect system. It's not a censorship system. We don't delete comments automatically that have a lower score. But it's simply a crowdsourcing mechanism that floats those, you know, to the top.
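A "Best" sort of the kind Roy describes can be sketched as a score that blends a comment's vote ratio with its author's reputation, then sorts descending. The formula and weights here are invented for illustration and are not Disqus's actual algorithm; nothing is deleted, low scores just sink to the bottom.

```python
def best_score(upvotes, downvotes, author_flags):
    """Blend vote ratio with author reputation (spam/abuse flags)."""
    votes = upvotes + downvotes
    ratio = upvotes / votes if votes else 0.5   # neutral when unvoted
    reputation = 1.0 / (1 + author_flags)       # flagged authors score lower
    return ratio * reputation

def sort_best(comments):
    """comments: list of (text, upvotes, downvotes, author_flags) tuples."""
    return sorted(comments, key=lambda c: best_score(c[1], c[2], c[3]), reverse=True)

thread = [
    ("First! lol", 1, 9, 0),
    ("Thoughtful counterpoint with sources", 40, 2, 0),
    ("BUY MY PRODUCT", 20, 1, 5),   # good ratio, but heavily flagged author
]
for text, *_ in sort_best(thread):
    print(text)
```

Note how the flagged author's comment sinks despite its strong vote ratio: reputation acts as a multiplier on crowd signal, which is the crowdsourcing-plus-history idea Roy sketches.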
NNAMDIHere now is Molly in Washington, D.C. Molly, your turn.
MOLLYHi. Somebody kind of brought this up just a second ago, specifically about The Washington Post, but I stopped reading the comments because things just get so horrible. A lot of times I wonder if -- maybe this is too cynical -- if the paper, you know, writes stories that are really controversial in order to get a whole lot of hits on their website. I just wondered if anybody had any thoughts about that. And I can take my answer off the air.
NNAMDIWell, you know, the nature of news is that the news business is about stories that are not what you would expect to be happening in your communities about things going wrong. And so by nature they're supposed to be controversial, and the controversy is, a lot of times, what does attract readers to a newspaper. But the notion that a newspaper is simply planting controversies or any news media are planting controversy simply for the purpose of attracting comments, I think, is a little bit beyond the pale. That doesn't happen, Kate Myers, not a great deal.
MYERSYeah, definitely. There's an attempt to do journalism that has an impact on the conversation and really gives people something to talk about, whether that's writing more stories about it or responding to it via comments.
NNAMDISteve, how has commenting evolved in recent years? And from your standpoint, what does it take for a website to have a healthy commenting community?
ROYSo, you know, the irony for us at Disqus is that comments, per se, aren't all that interesting to us, particularly if they're not connected to each other. If they're one-dimensional, the comments don't matter as much as standalone items. But when they are connected, when people are replying to each other, when there's a personality to those comments and to that community, you actually start to see the particular culture of that site emerge.
ROYIt has a tone, a feel. The show has a tone. The Washington Times has a particular tone. And that tone conditions the environment, so comments, when they're connected, feel more like a community versus a string of disconnected comments. So that's the main change that we see developing. In terms of best practices -- I think I mentioned a couple of these -- one is don't leave that space for comments a vacant lot.
ROYYou know, you don't see a lot of good things in vacant lots. The same is true in comments. The second thing is actually having participation from the moderators, not leaving bad comments unchecked. And the third thing I'd leave with is to really know your regulars. There are typically, on any site that's active, you know, your -- to use the "Cheers" analogy, there are a couple people that continually come back, and to really reward them with engagement by the author, but also maybe using their comments or their ideas as the source for future pieces.
ROY'Cause what happens there is not only do those folks repeat and come back often again and again, but they start to police the community on their own. They kind of do the job of moderation for and with you, which is a tremendous benefit.
NNAMDIStephen Roy is vice president of marketing with Disqus. Stephen, thank you for joining us.
NNAMDIAshley Anderson is professor of journalism and technical communication at Colorado State University. Ashley Anderson, thank you for joining us.
NNAMDIAnd Kate Myers is product manager for social media with NPR. Thank you for dropping by.
MYERSThank you, Kojo.
NNAMDIAnd thank you all for listening. I'm Kojo Nnamdi.