The news that Facebook tried to influence the emotions of unsuspecting users by tweaking their News Feed has sparked a public outcry. The company wanted to see whether people who read more positive posts from friends would respond more positively themselves. A Facebook data scientist says the company routinely filters what shows up in your News Feed, but adds that he understands the concern. We explore research ethics in industry and academia and the growing interest in when and how emotion can be contagious.

Guests

  • Jennifer Golbeck, Professor of Information Studies and Director of the Human-Computer Interaction Lab at the University of Maryland
  • Clay Johnson, Author, "The Information Diet: A Case for Conscious Consumption" (O'Reilly)

Transcript

  • 12:06:39

    MR. KOJO NNAMDI: From WAMU 88.5 at American University in Washington, welcome to "The Kojo Nnamdi Show," connecting your neighborhood with the world. For a long time it was conventional wisdom. Reading Facebook posts from friends on glamorous vacations, celebrating happy occasions and otherwise enjoying life could make us feel sad about our own monotonous routine.

  • 12:07:08

    MR. KOJO NNAMDI: So Facebook decided to test that hypothesis. Without explicitly telling users, it tweaked half a million people's news feeds to show them more happy or sad posts from friends and see how they reacted. Turns out people respond in kind. Reading happy news makes us happy, according to our own subsequent posts, and reading sad news makes us sad.

  • 12:07:31

    MR. KOJO NNAMDI: But publication of the study led to another finding. Being manipulated by Facebook makes us mad. As Facebook takes heat for playing with people's emotions, defenders insist it's all spelled out in the fine print of the service agreement. But critics say it may be time for new ethical rules for tech companies that conduct ever more research on their unsuspecting users.

  • 12:07:52

    MR. KOJO NNAMDI: Joining us in studio to discuss this is Jennifer Golbeck. She's a professor of information studies and director of the Human-Computer Interaction Lab at the University of Maryland, a reliable guest and occasional host on this broadcast. Jen, good to see you again.

  • 12:08:06

    MS. JENNIFER GOLBECK: Glad to be here.

  • 12:08:07

    NNAMDI: Clay Johnson joins us by phone from Georgia. He is author of "The Information Diet: A Case for Conscious Consumption." Clay Johnson, thank you for joining us.

  • 12:08:17

    MR. CLAY JOHNSON: Thanks for having me, Kojo.

  • 12:08:18

    NNAMDI: If you are interested in participating in this conversation, you can call us at 800-433-8850. Do you think it's OK for Facebook to alter your news feed to study whether certain posts make you happy or make you sad? Give us a call, 800-433-8850. Or send email to kojo@wamu.org. You can shoot us a tweet, @kojoshow. Clay, explain this experiment that Facebook did to see whether its users were influenced by the emotional tone of their friends' posts?

  • 12:08:47

    JOHNSON: Sure. What Facebook did was they took .004 percent of their user base, which sounds like a small number until you realize that Facebook has a billion people. And .004 percent means about the size of Austin, Texas or about 700,000 people. And that was their sample group. And they decided to see whether or not they could promote social contagions through adjusting news feeds, specifically to see if they could make people post more positively or more negatively by promoting positive or negative posts in their own news feed.

  • 12:09:30

    JOHNSON: So Facebook is constantly tinkering with our news feeds, that stream of information you get when you're logging in to the main page of Facebook. And in this case, they looked for posts with negative words, and they promoted those to a certain set of users. And then at the end of the week, they noted that the users that were exposed to more negative posts wrote more negatively.

  • 12:09:57

    JOHNSON: And they did the same thing on the positive side, and they found that people that were exposed to more positive posts wrote more positively. And the interesting thing is that they found that their control group, the people who were exposed to more bland information, didn't write as much at all. And so Facebook has sort of discovered that, in order to keep us engaged with Facebook, they need to keep us either really, really happy or depressed, which I find to be one of the more scary parts of this study.
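
To make the mechanics Clay describes concrete, here is a minimal sketch in Python of how tone-based feed filtering of this sort could work. It is an illustration only: the published study used the LIWC word lists to score posts, while the tiny word lists, function names, and the promote-by-reordering mechanism below are assumptions, not Facebook's actual code.

```python
# Illustrative sketch only: score posts as positive/negative by counting cue
# words, then "promote" posts of one emotional tone in a user's feed.
# Word lists and mechanism are invented; the real study used LIWC word lists.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}


def classify_post(text: str) -> str:
    """Label a post positive, negative, or neutral by counting cue words."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"


def assign_condition(rng: random.Random) -> str:
    """Randomly place a user in one arm of the experiment."""
    return rng.choice(["promote_positive", "promote_negative", "control"])


def bias_feed(posts: list[str], condition: str) -> list[str]:
    """Reorder a feed so posts matching the assigned tone appear first."""
    if condition == "control":
        return posts
    target = "positive" if condition == "promote_positive" else "negative"
    return sorted(posts, key=lambda p: classify_post(p) != target)
```

At the end of the week, the comparison is then between the tone of what each group of users wrote themselves, which is the "respond in kind" result described above.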

  • 12:10:32

    NNAMDI: I should mention that we invited Facebook to join us for this conversation. But a company spokesperson declined. Clay, Facebook routinely uses complex algorithms to decide what we see on our news feed. How was this experiment different?

  • 12:10:47

    JOHNSON: It's not much different, except that it was designed to change our moods. But when we're engaging in these advertising-supported mediums, like Facebook or Twitter or your email, if you use a service like Gmail or Hotmail or Yahoo, when we're engaging in these, it has to be our base assumption that we are being constantly manipulated. We're being manipulated by advertisements and by the tweaking of headlines and those kinds of things.

  • 12:11:18

    JOHNSON: That's the price that we're paying when we're engaging in these free services. Now, the difference here is that, one, they're specifically manipulating our moods. And the desired outcome was to see if we could actually be manipulated in a positive or negative way. The other difference here is that it wasn't really based around advertising. It was based around our news feed, so the information coming from our friends. And there are a lot of social implications that, I think, revolve around this because of that.

  • 12:11:51

    NNAMDI: Jen, the backlash against Facebook over this experiment seems to focus on the fact that it didn't explicitly tell its research subjects, in this case, 600,000 or 680,000 or so Facebook users, that they were participants in this study. What are the legal and ethical norms for a study by a business like this one?

  • 12:12:10

    GOLBECK: So this is a really interesting space. I'm an academic researcher. And I deal every day with what's called the IRB, the Institutional Review Board. All universities have these. A lot of non-profits have them. And they're set up to review the ethics of the research that you're doing that involves human subjects. There are similar boards if you're doing research with animals. And these came about because of some really ethically terrible research that went on in this country.

  • 12:12:37

    GOLBECK: The Tuskegee experiment is the basis of this that started in the 1930s and ran through the 1970s where the U.S. government went and basically did medical experiments on African-Americans, starting with sharecroppers, in the 1930s to study the long-term effects of syphilis. So they got a bunch of subjects. They had syphilis. They told them they would give them free medical care and insurance. And even when penicillin came out as a way to treat syphilis, they didn't treat these patients.

  • 12:13:07

    GOLBECK: And they didn't tell them that they had syphilis. So they were allowing these people to get sick to monitor them. And, in fact, the same group went to Guatemala and infected people with syphilis to see the impacts. There was a huge outrage in response to this. And so the government came up with something called the Belmont Principles.

  • 12:13:26

    GOLBECK: Universities instituted these Institutional Review Boards to make sure that, if people are participating in experiments, that they give what's called informed consent. That's the key issue here, that they're told that they're participating in an experiment. They have the right to leave. They know what their risks are. And Facebook just didn't follow that here at all. There was no informed consent for the people participating.

  • 12:13:48

    NNAMDI: But there's this. Facebook has one bullet point in its data use policy that says it may use information about you -- quoting here -- "for internal operations, including troubleshooting, data analysis, testing, research, and service improvement."

  • 12:14:04

    GOLBECK: Yeah.

  • 12:14:05

    NNAMDI: Is that enough?

  • 12:14:06

    GOLBECK: It's not. So we can get into a lot of the details about scientific standards for this. But I pulled some up. This thing called the Declaration of Helsinki, this is a set of ethical principles for doing research with human subjects. It's required by the National Academy of Sciences, who published this study, that all work that they publish complies with that.

  • 12:14:29

    GOLBECK: And I -- they have a whole section on informed consent, and it says, for people who are giving informed consent, they have to know the anticipated benefits and risks of the study, the discomfort that it may entail, the post-study provisions, any relevant aspects, along with having the right to refuse to participate or to withdraw consent at any time without reprisal. So that one word from Facebook that says your data may be used in research doesn't meet any of these requirements for what it means to give informed consent.

  • 12:14:59

    NNAMDI: Clay.

  • 12:15:00

    JOHNSON: Not only that. Not only that, but also that bullet point was added to their terms of service after this study was posted.

  • 12:15:11

    JOHNSON: So it's sort of neither here nor there. And I would submit that -- so Facebook's -- I took a look at Facebook's terms of service and usage policy and stuff like that. We're talking about 14,000 words of legal text. That is roughly double the size of the United States Constitution and all of its amendments. And we have, you know, we have a legal profession called, you know, constitutional scholars.

  • 12:15:40

    JOHNSON: But we certainly don't have a legal profession of, like, Facebook terms of service scholars, even though Facebook governs more than three -- the Facebook policy governs more than three times the number of people that the United States Constitution does. And so I have a hard time believing that there's anything going on with sort of an explicit informed consent here when we're talking about a checkbox on a sign-up form that says, you agree to these 14,000 words.

  • 12:16:13

    JOHNSON: You know, it would be much better to me if it said something like, you agree to our terms of service, you agree to be experimented on, and you agree -- part of the issue that's not being brought up in this discussion is the impact on minors, whether or not Facebook actually conducted these studies on minors in their social network, people that are maybe 16, 17. I as a parent certainly would not want for my child to be subjected to a psychological test without my consent first...

  • 12:16:49

    NNAMDI: Especially if it involves emotional manipulation.

  • 12:16:52

    JOHNSON: Right. And then the -- you know, if those checkboxes were a little bit more explicit on the registration page, then maybe Facebook might have a case that this was informed consent. But I'm really not buying it. However, I still think that there's a bit of personal responsibility here at play. And the real moral of the story is that we, as consumers of information, have to understand that, you know, we are being manipulated.

  • 12:17:23

    JOHNSON: I remember I founded a company called Blue State Digital, and we did all the digital work for the Obama campaign. And before that, we did all the digital work for the Howard Dean campaign. And all the way back in 2004, we were running tests on our email list to see, you know, who -- you know, let's see, if we change the donate button from red to blue, will that get more people to donate?

  • 12:17:45

    JOHNSON: If we change the subject line from, you know, these particular characters to those particular characters, will we get a better response rate? And that kind of thing happens to you when you're using the Internet probably hundreds of thousands of times per day. You're engaged in some form of test like that. And some of them are, in fact, to put you in a mood to do something, right? That's what advertising is. And we have to sort of take a step back and, I think, accept some responsibility here and say, hey, this is the price that we're paying, and it means that we need to be conscious consumers of information, not just passive consumers and assume that everything's on the up and up.

  • 12:18:22

    NNAMDI: And in case you're just...

  • 12:18:24

    JOHNSON: We're being manipulated all the time.

  • 12:18:24

    NNAMDI: In case you're just joining us, we're talking about Facebook's psychology study with Clay Johnson, author of "The Information Diet: The (sic) Case for Conscious Consumption," and Jennifer Golbeck, professor of Information Studies and director of the Human-Computer Interaction Lab at the University of Maryland. You can call us at 800-433-8850. Should Facebook or Google or Netflix tell you if they're using you in a study? 800-433-8850. You can send email to kojo@wamu.org. Here is Kimberly in McLean, Va. Kimberly, you're on the air. Go ahead, please.

  • 12:19:01

    KIMBERLY: Hi, thanks for taking my call. I'm a psychotherapist, so I'm listening to this and reacting to it from not just the ethical or legal implications but also just the human cost. I see people every day in my practice who leave my office depressed and sad. And to know that Facebook, you know, would -- I don't know if they're -- Facebook is assuming that all people are just, sort of, in a neutral state at all times and you can manipulate them up and down. But people who already are suffering from depression or loneliness or isolation, to manipulate them with the possibility of making their sadness and their depression even worse, it's just unconscionable.

  • 12:19:42

    NNAMDI: Okay. Thank you very much for your call. Facebook's findings in this one-week study, Jen, contradicted the conventional wisdom that reading lots of upbeat posts from friends makes us feel bad because we think we're missing out. What did Facebook find out about how people react to positive and negative posts? And there obviously wasn't a control factor for people who might be therapeutically suffering from some form of depression.

  • 12:20:07

    GOLBECK: Yeah, I mean, I think that that's a great point. And you can take Kimberly's point about what happens if you have people who are depressed and they may be going to Facebook to try to feel better and you make them feel worse. You can carry that out to some not-unreasonable extents from what happened here and it gets really concerning. So, yeah, what Facebook found out and the reason they did this study was that there is this conventional wisdom that in -- is supported by quite a bit of research, that says, we feel like our life is not as good as our friends' lives when we see the curated things that they post on Facebook.

  • 12:20:42

    GOLBECK: This showed that if people post sad things, that you'll post sad things too. And the same goes for posting happy things. The interesting thing is that scientifically, there's an interesting insight in this research and it's not that the research shouldn't have been done but there are ways that Facebook could have done this without intentionally manipulating people.

  • 12:21:04

    NNAMDI: As we said, Facebook was invited to join this conversation but a company spokesperson declined. So allow me to speak for Facebook. What other way could we have done this? Had we told people in advance that we were looking to see how they respond to sad or happy or negative or positive posts, then that would have mucked up the whole thing.

  • 12:21:20

    GOLBECK: I think you actually could've done this without needing any informed consent. You have 1.4 billion people on Facebook. You have these abilities that they used here to detect when people are posting happy things and when people are posting sad things. I bet, if you went through 1.4 billion profiles, you'd find some people whose friends just happen to be posting a lot of sad things and those would naturally, with the regular algorithm, show up on their feed and other people whose friends posted happy things.

  • 12:21:49

    GOLBECK: And then you could see how people reacted to the un-manipulated version of what their friends would be sharing. You could get the same results from that without intentionally trying to make people feel happy or sad. And that would dismiss a lot of the ethical problems that came out of this study.
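
Here is a rough sketch of the observational alternative Jen describes: rather than altering anyone's feed, find users whose unmanipulated feeds already skew happy or sad and compare what those users go on to post. The thresholds, data shapes, and the reuse of the classify_post helper from the earlier sketch are illustrative assumptions, not a description of Facebook's systems.

```python
# Observational design sketch: split users by the natural tone of their feeds,
# then compare the tone of their own subsequent posts. All thresholds and
# data structures here are invented for illustration.
from statistics import mean


def feed_tone(posts: list[str]) -> float:
    """Share of positive posts minus share of negative posts in a list."""
    if not posts:
        return 0.0
    labels = [classify_post(p) for p in posts]  # helper from the earlier sketch
    return (labels.count("positive") - labels.count("negative")) / len(labels)


def split_by_natural_exposure(users: dict[int, dict]) -> tuple[list[int], list[int]]:
    """Group users whose unaltered feeds happen to skew happy or sad."""
    happy_exposed, sad_exposed = [], []
    for user_id, record in users.items():
        tone = feed_tone(record["feed_posts"])
        if tone > 0.2:
            happy_exposed.append(user_id)
        elif tone < -0.2:
            sad_exposed.append(user_id)
    return happy_exposed, sad_exposed


def mean_own_tone(users: dict[int, dict], group: list[int]) -> float:
    """Average tone of the posts written by the users in one group."""
    return mean(feed_tone(users[u]["own_posts"]) for u in group)
```

Comparing mean_own_tone across the two groups would stand in for the manipulated conditions, at the cost of weaker causal claims, since exposure was not randomly assigned.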

  • 12:22:07

    NNAMDI: Well, let me take it someplace else. Academic researchers such as yourself use human subjects all the time.

  • 12:22:13

    GOLBECK: Yeah.

  • 12:22:13

    NNAMDI: What kind of consent are they required to get before starting their experiments?

  • 12:22:18

    GOLBECK: So we go through about a two-week process for the kind of research that I do on social media, which is similar to this study. We have to fill out an application with our IRB, the ethics board on campus. They review it and we say, what are the potential risks to the subjects? How do we mitigate those risks? Is there any deception? Because you may want to do the study without telling people that you're manipulating them. And you're allowed to do studies like that, but then you have to have a way to go back afterwards and say, look, we actually manipulated you here, and deal with the potential impacts that'll have on people. And Facebook didn't do that.

  • 12:22:55

    GOLBECK: So we have a lot of forms to fill out. And we make what's called an informed consent form that people need to read and sign before they participate in the study, unless you've made special arrangements to deal with them afterwards, that talks about the benefits, the risks and the right to not participate, if they don't want to.

  • 12:23:14

    NNAMDI: Gotta take a short break. When we come back we'll be continuing this conversation and continuing to take your calls at 800-433-8850. Does your mood change depending on the tone of the posts you read on Facebook? You can go to our website, kojoshow.org, ask a question or make a comment there. I'm Kojo Nnamdi.

  • 12:25:23

    NNAMDI: Welcome back. We're talking about Facebook's psychology study, looking at ethics and emotion. The study, of course, looked at emotion. We're talking about the ethics of the study with Jen Gol -- Jennifer Golbeck, she's a professor of Information Studies and director of the Human-Computer Interaction Lab at the University of Maryland. Clay Johnson is the author of "The Information Diet: The (sic) Case for Conscious Consumption."

  • 12:25:46

    NNAMDI: This is for both of you, but I'll start with you, Jen. A prestigious academic journal, the Proceedings of the National Academy of Sciences, published this study. It was co-authored by a Facebook data scientist, a Cornell University professor and his graduate student. What are the rules from -- for publishing in scholarly journals, and were the rules followed here?

  • 12:26:06

    GOLBECK: So journals have their own policies about what needs to be followed. The Proceedings of the National Academy of Sciences has a section specifically describing research involving human and animal participants in studies. And it says, "For experiments involving human participants, authors must include a statement confirming that informed consent was obtained from all participants." So they reviewed this study, and Facebook says, well, we, you know, did an internal review.

  • 12:26:34

    GOLBECK: And Cornell said, our IRB, which is our ethics board, approved this. But it turns out, Cornell is really backing away from that statement at this point, because the researchers at Cornell said, well, we're just getting this data set from Facebook, like, we're not actually running any experiments with Facebook, they're just giving us the data to analyze. And that's a common thing that we do. And that's easy to get ethical approval for 'cause you didn't experiment, you're just analyzing some data.

  • 12:27:01

    GOLBECK: But they actually were involved with Facebook, running these experiments and developing them. And the IRB didn't look at that at all. So the Proceedings of the National Academy of Sciences had a lot of ethical questions, and honestly, I think, that Facebook didn't come anywhere near to meeting those requirements, and the journal may have been a little wowed by the number of subjects and the fact that it was coming from Facebook, that they kind of overlooked these requirements.

  • 12:27:26

    NNAMDI: Clay Johnson.

  • 12:27:27

    JOHNSON: The thing that troubles me is, you know, this experiment was the one that Facebook chose to publish. But Facebook doesn't have any legal obligation to be transparent with the public about all of the experiments that it runs. And, in fact, the response to this particular study has provided Facebook an incentive to be less transparent, moving into the future.

  • 12:27:54

    JOHNSON: There's no oversight over Facebook as to, you know, there's no board that Facebook actually legally has to go to, to say, hey, can we target, say, 435 members of Congress with pro-choice or pro-life messages and see if we can make Congress vote more pro-choice or pro-life. That's something that they are legally allowed to do.

  • 12:28:16

    JOHNSON: And the implications of that seem to be quite, you know, sort of, puzzling to me. What if Mark Zuckerberg just really hated a guy he went to high school with and decided to target him with very negative messages for a year to make him very, very, very depressed, right? Those are the kinds of questions that we need to be asking when you're dealing with a network that's run by a private corporation, that has a billion people in it.

  • 12:28:43

    GOLBECK: And I'd like...

  • 12:28:43

    NNAMDI: Jen.

  • 12:28:44

    GOLBECK: ...to just follow up on that. I wrote an article in Slate last December about another study that Facebook had published where they had said, you know, if you're on Facebook and you start typing a comment or a status update and then you change your mind and you don't post it, Facebook was actually collecting those unposted pieces of text and analyzing them. And we -- I got a huge response to that article.

  • 12:29:10

    GOLBECK: And again, that's something that Facebook published scientific work on that was, again, ethically questionable. But it makes you wonder exactly, like Clay was saying, what other stuff are they doing behind the scenes that they're not trying to publish in these academic spaces?

  • 12:29:24

    NNAMDI: On to Nick in Washington, D.C. Nick, you're on the air. Go ahead, please.

  • 12:29:29

    NICK: Yeah, hi. I was just -- I was listening and I heard, I can't tell whether this is really like something that you would take to court or not. You know, like a class action lawsuit or is it just something everybody's gonna frown on? But the one gentleman there, I appreciate what he said but he said, well, you know, this -- I wouldn't want my children to be subjected to this, and so forth. But we need to be -- you know, educated consumers. And I was thinking, that's kind of a -- you can't really be an educated consumer at this point with Facebook.

  • 12:29:58

    NICK: People are hooked. I'm not even a user of Facebook, but my wife is, and she'd never, say, read through all the pages and say, oh, I'm not using it anymore. Like, I don't know how to -- how do you reconcile those two things?

  • 12:30:10

    NNAMDI: How, in this situation, Clay Johnson, can you make an informed decision? How can you be an informed consumer if you're looking at pages that are in excess of the U.S. Constitution?

  • 12:30:23

    JOHNSON: So, I think -- right. I think we have to take two approaches. That we, you know, just have to make a base assumption that we're being constantly manipulated in these situations. Now, that's sort of an easy, easy out and easy answer. Not a great answer to this because you don't even know when you're -- if, you know, the good kinds of manipulation that -- not the good as in, you know, capital "B", capital "G", you know, force for good. But the advanced forms of manipulation are the case that you don't even know that you're being manipulated.

  • 12:30:52

    JOHNSON: So you don't just go, like, hey, I'm being manipulated. But I think that we need to have a, sort of, a skeptical approach to the information that we're getting through these online services. But I also think that Facebook needs to do a better job of getting informed consent from its users. And that's maybe breaking down the privacy policy and the terms of use into a few separate check boxes, so that you can -- you basically have to check in the registration process that, yes, I agree to this. I agree to have my data collected and sold to advertisers and I agree to have my news feed subject to psychological tests.

  • 12:31:37

    JOHNSON: And that kind of thing, I think, that Facebook ought to do that. And in terms of the legal side of things, I generally think that people probably ought to consider -- the participants in this study ought to consider if they, you know, know who they are, ought to consider a class action lawsuit against Facebook. And...

  • 12:31:56

    NNAMDI: As a matter of fact, we got an email from Jonathan who asks, "Is there a way to find out if you were part of Facebook's test group?" Do you know, Jen Golbeck?

  • 12:32:03

    GOLBECK: That data is kept private and anonymized. So they are sharing, at least according to the journal article, they make some data available with -- not with any user names or anything like that. So, I think, if they were to take Clay's suggestion and someone file a class action lawsuit, they could probably get that information from Facebook, but it's definitely not available now.

  • 12:32:22

    NNAMDI: But it's not just Facebook. We had a caller, Doug, who couldn't stay on the line, who says, "Google manipulates search results so that this is not a matter of all or nothing but examining what purpose the manipulation serves." How do we do that?

  • 12:32:34

    GOLBECK: Yeah. That's a great point and in line with a lot of what Clay was suggesting. Marketing and advertising is designed to manipulate you into wanting to buy a product, right? Google absolutely changes their search results in the order that they show things in. And their goal is to show you the most relevant piece of information. I was at Google once and they said, the ideal way that Google would work is that you think of the thing you want and it instantly pops up on your screen, you don't even need to bother searching. You don't need to pick, it just shows up there.

  • 12:33:03

    GOLBECK: And they get more money if they're giving you better results. With Facebook, the interesting thing that I've seen in these studies is that, when they're publishing this scientific work, they're really interested in how do we get people more engaged with Facebook? How do we get them to post more? How do we get them to stay on the site more? Because that's how they're going to get more money with you coming to Facebook more. So it's not necessarily that they want to serve you better. They just want to keep you consuming as much as possible.

  • 12:33:34

    JOHNSON: Right. And the reason I suggest a lawsuit isn't because I, you know, I sort of, you know, want to take the torches over to Menlo Park and burn down, you know, Facebook headquarters or anything like that. But it's because there's -- we're dealing with a billion people here. And there's really not a lot of law and not a lot of, you know, thought around how we're being governed in these social network societies. And I'd like to see the courts take more cases like this on so that we either get more thought from the judicial branch of government, or the legislative branch of government. We can have more of a dialogue as a society around this stuff.

  • 12:34:15

    NNAMDI: We're taking your calls at 800-433-8850. Do you think emotion is contagious online? You can shoot us a tweet, @kojoshow. Or send email to kojo@wamu.org. Here is Christian in Bethesda, Md. Christian, your turn.

  • 12:34:31

    CHRISTIAN: Hi. Thank you, Kojo, for having me on here. I just wanted to point out that there's an important distinction between the emotion that the poster is having and the fact that they're posting to me the -- a more -- it may be more representative that the person's feeling more comfortable to share their more positive feelings when they see more positive statuses around. And the same thing goes with their negative feelings as well.

  • 12:34:59

    CHRISTIAN: I also think that a more robust interpretation of the findings that they had would be what you mentioned about the fact that people with more emotional statuses or any of them tend to post more. So in that sense, they're able to -- they understand now that they can, you know, manipulate posters and bring them in by having, you know, more emotional statuses on their web page.

  • 12:35:26

    NNAMDI: What do you say to that, Jen, not necessarily affecting your overall mood, just the things you post?

  • 12:35:31

    GOLBECK: Yeah. I mean, Christian actually makes a really interesting point about these two studies that we keep contrasting. Does seeing lots of good stuff from your friends make you feel worse about your life? Or does it actually make you feel good? I think he hits on an interesting point there, that I may feel like, man, my life is so boring compared to my friends off on their tropical vacations and at their parties, but I'm not going to...

  • 12:35:52

    NNAMDI: I can't bring them down with my post. I got to post something up, too.

  • 12:35:54

    GOLBECK: Exactly, right? I'm going to post in the same tone as my friends. And if all my friends are like, their dog died and they lost their job, I'm not going to be like, hey, check out my vacation pictures.

  • 12:36:03

    NNAMDI: I got to empathize.

  • 12:36:05

    GOLBECK: That's right. So it could be that both things are true. I may see lots of good things and feel worse about my life, but I'm going to post in the spirit of what my friends are posting.

  • 12:36:13

    NNAMDI: Clay Johnson, the question at the heart of this study is whether emotion is contagious online and can be spread either inadvertently or by design. You mentioned earlier how this is also important in politics. But we've seen through the Edward Snowden leaks that the government collects a lot of data on us. Is there any chance that it could or would use emotional manipulation? And if so, in what way?

  • 12:36:38

    JOHNSON: Well, I think the first question that raises in my mind, Cornell University was one of the universities that helped to produce this particular study. This story broke on Saturday. I think it's amazing that actually The Onion broke this story, the A.V. Club of The Onion broke this story.

  • 12:36:59

    GOLBECK: A great article, yeah.

  • 12:37:00

    JOHNSON: Yeah. And on that day, the United States Army Research Office was listed as a sponsor of the research. Now, once the story broke out, the -- somebody took the United States Army out of Cornell's press release, saying that they were a sponsor. And I don't know whether that was an error on Cornell University's part. I certainly have never been a part of a non-profit or institutional study where a sponsor's name was inserted into it erroneously but -- or whether or not the Army just called and said, hey, we don't want to be associated with this.

  • 12:37:40

    JOHNSON: But I have to go back to when the State Department earlier this year got caught setting up a corporation in Cuba to basically do what Twitter does in order to incite revolt in Cuba. So it's clear that government is keen on using social media in some way to manipulate people. It's clear that government has a large dataset called PRISM. We know that for a fact.

  • 12:38:12

    JOHNSON: And whether or not government was actually involved in this particular study remains to be seen. It's unproven. It was true on Saturday and false on Sunday. But it definitely causes me to -- you know, saying government is trying to make people happy or sad on Facebook before Snowden's leak probably would have been the ravings of a crazy man. But now, after the PRISM story and Snowden's leak, I think a little bit of paranoia around this stuff is probably warranted.

  • 12:38:46

    NNAMDI: Jen.

  • 12:38:46

    GOLBECK: I may actually have some true insight to add to that one. So I didn't see the citation about which Army project it was. But the Army does have a very big network science research center, which I am a part of and some researchers at Cornell, along with about a hundred people, are part of. This could have been part of that project where we're interested in how do we analyze networks and use that to better understand people. The Army is interested in that not to manipulate and interact with the public but to see how to better understand war fighters, right, people on the ground.

  • 12:39:22

    GOLBECK: And we can't have access to that kind of data. So we academic researchers work with whatever we can get, which often ends up being social media research. Now, the Army's supposed to say, yes, that study you did, totally put our name on it. But it's a pretty quick review process from them, so it could be that these researchers were funded on this big network science project, analyzing social networks. The Army said, OK, go ahead and state your affiliation with us, and then once they saw the ethical issues backed off.

  • 12:39:51

    NNAMDI: On to Eric in Fairfax, Va. Eric, you're on the air. Go ahead, please.

  • 12:39:57

    ERIC: Hey Kojo. Hey, all I was gonna say is, if you don't like Facebook, don't use it. It's like when you're watching something idiotic on TV like the Kardashians, turn it off.

  • 12:40:08

    NNAMDI: Well, it's a little more than that actually, Eric. Do you use Google? Do you use any other service at all? Do you go online at all?

  • 12:40:15

    ERIC: Oh, yeah. Yeah. I mean, I have...

  • 12:40:18

    NNAMDI: Well, wait a minute, if you do that, then data is being collected on you and that data is being used to do research, on occasion to manipulate you. How do you deal with if...

  • 12:40:30

    ERIC: I know. I know they're doing that. Actually, when I check something out on the internet, they generally send me information about it. I've got to admit, you guys, I don't care. I don't know what's wrong with me, but I just simply don't care. Kojo, I'm sure that's wrong, okay? But I don't.

  • 12:40:46

    NNAMDI: Oh, but you're fine. That means you can happily Facebook.

  • 12:40:51

    JOHNSON: I think you're with the majority of Americans, and the majority of the people on Facebook don't actually care. Now I think a lot of them don't actually know or haven't thought through this stuff because, well, they've got other stuff to think through. They've got their jobs to think through and stuff like that.

  • 12:41:05

    NNAMDI: Well, we got to...

  • 12:41:06

    JOHNSON: But I also...

  • 12:41:06

    NNAMDI: We got an email from someone who I guess wishes to remain anonymous who says, "Advertising companies have been employing top psychology for decades. I hope people are really listening when your guest says understand that you are constantly being monitored and your data is being collected for manipulation. However, I'd like to question why it has to be like this. If citizens don't take action now to take back the internet with laws to protect us, this manipulation will continue to escalate."

  • 12:41:34

    NNAMDI"As it is, it is difficult to tell fact from fiction. What a great way to control the population." Easy to say laws to protect us, more difficult to do, isn't it, Jen?

  • 12:41:44

    GOLBECK: So hard, it's so hard. And this is just really the tip of the iceberg. Here we're talking about, like, data that was gathered. Kojo, you and I have talked, and I have a TED talk about this, about all these crazy computational things we can do to find out information that people don't even want to share. It comes from the same space. And, you know, I think we're starting to see, we're about 10 years into the popular social media.

  • 12:42:08

    GOLBECK: We're starting to see places try to legislate it. So we had the Spanish ruling about the right to be forgotten on Google. Germany has had a few rulings about people not being able to keep digital photographs of other people even if they are private and taken consensually. I think we absolutely are going to have to move into a space where there's more legislation in place, right? Maybe that'll come from the legislative branch.

  • 12:42:33

    GOLBECK: But I think we're more likely to get it from the judicial branch through lawsuits about this. But there's just no really good way to understand, like, what's permissible and what isn't, what rights we have over our data and what we hand away.

  • 12:42:45

    NNAMDI: Got to take a short break. If you've called, stay on the line, we'll try to get to your calls. If you'd like to call, call us at 800-433-8850. Do online posts about political or ethical issues ever influence you to change your own views? 800-433-8850 or you can go to our website, kojoshow.org. Ask a question or make a comment there. I'm Kojo Nnamdi.

  • 12:44:50

    NNAMDI: We're discussing Facebook's psychology study with Jennifer Golbeck, professor of information studies and director of the Human-Computer Interaction Lab at the University of Maryland; and Clay Johnson, author of "The Information Diet: The Case for Conscious Consumption." Clay, how could the notion of emotional contagion online be used in the future for the public good? You know someone who's working on something called a mood map.

  • 12:45:16

    JOHNSON: Yeah, Deanna Zandt in New York is creating the weather map, and she's focusing on, hey, can we find and spot, like, these contagions through public posts on Twitter and on Facebook? Can we actually start charting to see whether or not there's a depression outbreak in New York City or a happiness outbreak in Austin, Texas? Can we start tracking people's emotions and see if these things are maybe geographically contained or, you know, constrained or, you know, social proximity constrained, to start tracking how these things are spread.
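
As a rough picture of what a mood map like the one Clay describes might compute, here is a short sketch that averages post tone by city and day. The record fields, the scoring, and the reuse of classify_post from the first sketch are assumptions for illustration, not Deanna Zandt's actual project.

```python
# Sketch of a mood-map style aggregation: average post tone by city and day
# to look for geographic clusters of happiness or depression. Field names
# and scoring are invented for illustration.
from collections import defaultdict

SCORES = {"positive": 1.0, "neutral": 0.0, "negative": -1.0}


def mood_by_city_and_day(posts: list[dict]) -> dict[tuple[str, str], float]:
    """posts look like {"city": "Austin", "date": "2014-07-03", "text": "..."}."""
    totals: dict[tuple[str, str], list[float]] = defaultdict(lambda: [0.0, 0.0])
    for post in posts:
        key = (post["city"], post["date"])
        totals[key][0] += SCORES[classify_post(post["text"])]  # earlier helper
        totals[key][1] += 1
    return {key: total / count for key, (total, count) in totals.items()}
```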

  • 12:45:50

    JOHNSON: And maybe, you know, there's another study done in New Zealand, I think, where people started, like, creating suicide prediction algorithms to start looking at someone's posts and to see whether or -- they made some serious traction to see whether or not they were able to see if someone was at risk for a serious depression incident or suicide. And I think, you know, if we can go along those lines and say, hey, this person is at risk for suicide.

  • 12:46:20

    JOHNSON: Let's start advertising a suicide prevention hotline at them. I think those, you know, could be very good things to start taking a look at. But the real thing here, I think, the lesson learned is that we have no -- we, as the user, as the subject of these experiments, have no transparency, none at all. We don't even know that we were part of the study. Nobody knows that they were a part of the study and we don't know what studies are being done on us.

  • 12:46:46

    JOHNSON: And I think that that's something that we really need to start taking into consideration here. I get the idea that, hey, if you don't like Facebook, don't like -- then don't use it. But to some extent, I think as we travel through time and travel forward, these services are going to be as necessary as automobiles in order for us to get jobs and to live productive lives in society. It's going to be hard for us to not use them as we change to a more knowledge-based and more digital economy.

  • 12:47:16

    JOHNSON: So I think that it's really worth considering, hey, what kind of transparency do we need to have around ourselves and these experiences -- experiments being run on us?

  • 12:47:27

    NNAMDI: We have a suggestion in the form of a tweet. I'm not sure I understand what it means, but I hope you do, Jen. "There should be a common end-user license agreement paradigm, similar to something like standardized licenses such as Creative Commons, GPL, et cetera."

  • 12:47:41

    GOLBECK: Wow. So the Creative Commons, for example, is an open way of licensing software. So I can say anyone can take this and do whatever they want with it, it's free. Or you can reuse it but not commercially or if you do reuse it, you have to attribute it to me. So you could imagine something like that with informed consent, right? So you are -- by using Facebook, you're going to participate in some experiments.

  • 12:48:11

    GOLBECK: And we might manipulate X and Y and Z. We'll look at this kind of data. Here's the rights that you have. We'll keep your data anonymous. We'll protect your identity and you just sort of generally understand, okay, here's what I'm getting myself into. I don't know the details of the specific experiment, but I know generally what's going to happen. And also, you're given the right to withdraw and opt out, which is a critical component of all informed consent.

  • 12:48:37

    GOLBECK: So it could be that we create a set of kind of standard informed consent forms that people can read and agree to when they use a service like Facebook or Twitter or Google, and they're going to be experimented on.
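
One way to picture this standardized-consent idea is as a small machine-readable descriptor, loosely analogous to how Creative Commons encodes license terms in a fixed set of clauses. Everything below, the field names included, is hypothetical; no platform offers such a descriptor today.

```python
# Hypothetical machine-readable research-consent descriptor, loosely modeled
# on the idea of standardized license terms. All field names are invented.
from dataclasses import dataclass


@dataclass
class ResearchConsent:
    may_reorder_feed: bool = True          # experiments may change ranking
    may_analyze_posts: bool = True         # aggregate analysis of content
    may_target_emotions: bool = False      # e.g., the kind of study discussed here
    keeps_data_anonymous: bool = True
    allows_withdrawal: bool = True

    def plain_language(self) -> list[str]:
        """Short sentences a user could read before agreeing."""
        lines = []
        if self.may_reorder_feed:
            lines.append("Experiments may change the order of items in your feed.")
        if self.may_analyze_posts:
            lines.append("Your posts may be analyzed in aggregate for research.")
        if not self.may_target_emotions:
            lines.append("Studies may not try to change how you feel.")
        if self.allows_withdrawal:
            lines.append("You may opt out of research at any time.")
        return lines
```

A user agreeing to a descriptor like this would at least know the categories of experimentation they were accepting, which is the gap the guests identify in the current terms of service.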

  • 12:48:47

    NNAMDI: Here's another tweet we got, Clay Johnson, from someone who says, "Folks need to stop complaining when they are using a free service like Facebook and signing privacy policy and end user license agreement." It's a common axiom in the internet age that if you're not paying for the product online, you are the product. How, Clay, does that apply in this case and what are the lessons for people who use free social media platforms?

  • 12:49:13

    JOHNSON: Well, again, I think we need to be better products and we need to, you know, we need to be a little bit more informed about that. I think that that's a wise education. But, look, is it too much to ask that these companies start acting like decent human beings? Like, you know, that's really all I'm asking for here is, hey, can we tell people if we're going to intentionally try and make them depressed every once in a while?

  • 12:49:38

    JOHNSON: Can we tell people if we're going to make them happy? The moral implications of this are very, very serious. Again, you know, out of 680,000 people, what if somebody during that week did commit suicide? Should Facebook be on the hook for that? That, to me, seems like it's worthy of thinking through. And I bet you that nobody on the Facebook data team wants to be responsible for that.

  • 12:50:05

    JOHNSON: But I think that they ought to be thinking of that in terms of public health and public safety. I think part of the issue here is that we don't have mental health parity when it comes to physical health, right? We say, oh, I'm going to the doctor because I've got the flu, but we don't say, oh, I'm going to the doctor because I've got the depression. And because there's a stigma assigned to mental health.

  • 12:50:29

    JOHNSON: But maybe if we started, you know, would it be acceptable if Facebook said, well, we gave 600,000 people the flu? No. It wouldn't be acceptable, it would be outrageous. And I think that this is, you know, we're treading a line here and we're heading to a place where, if we looked at things through the public health perspective, we'd see something really, really scary and nerve-racking.

  • 12:50:51

    NNAMDI: Indeed we have a caller, Jay, who couldn't stay on the line, who says, "Any instances of people becoming depressed and/or committing suicide during the period of this test, any way to relate that to Facebook?" I don't think we have that information since we don't know who was involved in the test. But I think Paul in Arlington, Va. begs to differ with that point of view. Paul, you're on the air, go ahead please.

  • 12:51:13

    PAUL: Yes. I should say, first of all, I have very little interest in Facebook and I haven't looked into this subject thoroughly. But it strikes me as being so minor and so vague, the types of manipulation that people are complaining about, particularly in contrast to the real much, much larger forms of manipulation of public opinion and emotion that the news media routinely commit, that I find it hard to understand why people are getting so upset.

  • 12:51:41

    PAUL: And I'd also say, psychological studies have shown that most people go through their lives floating on a cloud of unjustified happiness and optimism. So I would say that probably the negative...

  • 12:51:52

    NNAMDI: Wait, let me back up for a second. "Floating on a cloud of unjustified happiness and optimism," why is it unjustified?

  • 12:52:01

    PAUL: Because things are pretty bad. So I would say that...

  • 12:52:07

    NNAMDI: I think Clay finds that hilarious.

  • 12:52:09

    GOLBECK: My life is also awesome.

  • 12:52:11

    JOHNSON: Yeah, I'll enjoy sitting on my cloud. You know, I think it's important to note that, again, that this is the study that has been disclosed. And while it may be minor and banal, like, what about the ones that are not being disclosed? You know, maybe those are a lot more exciting. But we have literally no transparency here over the probably hundreds of thousands of experiments that we're going through every day. And that needs to change.

  • 12:52:41

    NNAMDI: Well, one more comment from someone who apparently agrees with Paul. Shirley writes, "I think people are overreacting to the Facebook study. Sure, we can take the slippery slope argument about the effects on deeply depressed individuals, but no one is saying that this manipulation of feeds had significant impacts on users. For the media to get outraged about this is a little hypocritical, in my opinion."

  • 12:53:03

    NNAMDI"Commentators are constantly showing stories to enrage viewers and listeners. They do it because they want their audience to feel those largely negative emotions." No, actually what they want is ratings. "How is the Facebook study such a horrible trespass?" Well, Jen, you wrote last year about another instance of Facebook monitoring users without their knowledge.

  • 12:53:23

    NNAMDI: In that case, Facebook was saving and studying the things people start to type into their status update but then change their mind and delete. What happened there?

  • 12:53:32

    GOLBECK: Yeah. So they wanted to see what they call self-censorship. Why are people self-censoring? And they studied this by collecting the text that you would start to type in the box. Now, the studies that they published said all they did was count how much you typed. They didn't actually read the content. But, of course, they had to collect that content, and though they didn't publish anything saying what they had looked at, we have to wonder whether or not they were actually reading that text and they just didn't publish about it.

  • 12:54:02

    GOLBECK: This gets to the point that Clay keeps making, we have no idea what they're doing with this data. And I think people were really surprised. It's one thing if Facebook is analyzing and manipulating us with things we have intentionally given them. It's another thing if they're analyzing and potentially manipulating us with content that we have explicitly decided not to put on Facebook and they're able, with some technology, to grab from us anyway.

  • 12:54:27

    NNAMDI: Here is Marshall in Gainesville, Va. Marshall, you're on the air, go ahead please.

  • 12:54:33

    MARSHALL: Hi, Kojo. I have a, well, what I think is a simple solution to a lot of these things -- a lot of these topics you've brought up. You can tell me why it's not simple or your panel can. Why not a generic rating system on some of these social or all websites for that matter, just an A, B or C rating? You know what you're getting into when you're on the site. It's right there on the screen for you.

  • 12:54:55

    MARSHALL: It would open up a -- it would make it a lot easier for people to know what they're, you know, what's being seen and what's not. But it would also open up a lot of competitive opportunities for other social media websites who are trying to get their foot in the door. Obviously, Facebook is, you know, the Cadillac of social media. But it gives other people opportunities to say, well, you know, we offer the same thing, but we're not going to study your information.

  • 12:55:15

    NNAMDI: But what would be the A, B or C rating you suggest be evaluating?

  • 12:55:22

    MARSHALL: The level of -- the depths of probing, I suppose, of what -- how much information is being collected. And that information obviously could be, you know, regulated by the FCC or whoever regulates this sort of thing. But...

  • 12:55:37

    NNAMDI: Jen Golbeck, how difficult would that be? How complicated would that rating system likely be?

  • 12:55:42

    GOLBECK: I mean, the overall idea is a good one and it's something that we've seen with people talking about, like, privacy policies, for example. Does this policy clearly spell out your rights or is it really vague? The issue here is that Facebook has literally this one word that says "we may use your data for research" in this long list of things and you have no idea what they're doing. You know, the argument that they're making here, well, people consented because they agreed to this.

  • 12:56:05

    GOLBECK: That's the same argument they made in the study that we just talked about where they're collecting your posts that you aren't posting. Can we really understand that that's what they're doing from that one word? We can't. And as Clay has been saying and I'm agreeing with, they're probably doing all sorts of things that they're not publishing. So we just can't tell from the privacy policies and terms of service how much they're probing and how much they're manipulating.

  • 12:56:29

    NNAMDI: And you have to be able to know that in order to...

  • 12:56:31

    GOLBECK: Exactly.

  • 12:56:31

    NNAMDI: ...present a...

  • 12:56:32

    JOHNSON: And the other reason why it's complicated is because, look, it's not just you. The value of Facebook isn't you as the -- to you, it's not you as the user, it's all the people that you're connected with. So it's not like any other website, where you can just go use another website. You can't just, you know, like, get ticked off at Slate and start reading Salon.com or get ticked off at the New York Times and start reading the Washington Post.

  • 12:56:55

    JOHNSON: If you get ticked off at Facebook and start using -- go back to MySpace or something like that, which is a preposterous thought, then, you know, you're losing your relationships with all the people that are on Facebook. And with every new person that you become friends with on Facebook, that gets harder and harder and harder to do. So those ratings are something, but it's not a solution.

  • 12:57:14

    NNAMDI: What's your level of optimism here, Jen Golbeck? Are we likely to see tech companies ask users more explicitly for permission to run experiments on us?

  • 12:57:22

    GOLBECK: I think it's extremely unlikely. Unless we actually get this either through legislation or through the courts, get some real guidance about what's acceptable and what isn't, I think we're going to see a lot more of this going on without the kind of transparency that I think most of us would want.

  • 12:57:38

    NNAMDI: Clay Johnson, 10 seconds or less, will we be seeing greater transparency from tech companies about how they use our data?

  • 12:57:44

    JOHNSON: No, and I think this is the precipice of the next PRISM-type thing. I think we're going to see, you know, a story in the next five years about how government used social networks to topple other regimes. I think that we're on the verge of some really scary stuff.

  • 12:58:01

    NNAMDI: Thank you for getting my weekend off to such a bright start. Clay Johnson is the author of "The Information Diet: The Case for Conscious Consumption." Jennifer Golbeck is a professor of information studies and director of the Human-Computer Interaction Lab at the University of Maryland. Jen, always a pleasure.

  • 12:58:14

    GOLBECK: Indeed.

  • 12:58:15

    NNAMDI: Clay, thank you for joining us.

  • 12:58:17

    JOHNSON: Thanks for having me.

  • 12:58:17

    NNAMDI: And thank you all for listening. I'm Kojo Nnamdi.
