In the 2016 election, 41.3% of eligible voters did not cast a vote.

What exactly does a poll do, and why does it matter? We hear from Goucher College’s Mileah Kromer on the inner workings of one of Maryland’s own polling operations.

Plus, there are more than 100 million eligible voters in our country who do not vote at all. What does the Knight Foundation’s 100 Million Project tell us about America’s non-voters?

Produced by Julie Depenbrock

Guests

  • Mileah Kromer Director, Sarah T. Hughes Field Politics Center and Associate Professor of Political Science, Goucher College; @MileahKromer
  • Evette Alexander Director of Learning and Impact, Knight Foundation

Transcript

  • 12:00:03

    KOJO NNAMDIYou're tuned in to The Kojo Nnamdi Show on WAMU 88.5, welcome. Later in the broadcast, staying safe and having fun during a scary pandemic Halloween. But first, what exactly do polls tell us and why does it matter? In this tense election season, we're exploring one of Maryland's own polling organizations. Our country's largest voting bloc might surprise you: non-voters. There are more than 100 million eligible voters in our country who do not vote at all.

  • 12:00:34

    KOJO NNAMDIWhat does the Knight Foundation's 100 Million Project tell us about America's non-voters? First, on the issue of polling, joining us now is Mileah Kromer, the Director of the Sarah T. Hughes Field Politics Center and Associate Professor of Political Science at Goucher College. Mileah Kromer, thank you for joining us.

  • 12:00:55

    MILEAH KROMERHi. Good afternoon. Thanks so much for having me.

  • 12:00:58

    NNAMDIFor listeners who may not know, what is the Goucher College poll?

  • 12:01:02

    KROMERThe Goucher College poll is a survey of Maryland residents and voters sponsored by Goucher College in Baltimore County, Maryland. We've been around since 2012. We poll on a variety of policy issues facing the state, and we do a little bit of horse race polling when election season comes around.

  • 12:01:21

    NNAMDIWhat kind of work goes into polls like the ones you just described?

  • 12:01:27

    KROMERWell, there's a lot of things that go into it. We use the Goucher College poll primarily as a teaching tool. And so I teach a course on survey research methods and research methodology and political science and state politics. And so students in those classes often help me research and through the Goucher poll I help them learn how to write good survey questions.

  • 12:01:47

    KROMERFrom 2012 until just recently we used to house an on-campus survey center where students were trained and actually conducted the calls. We have moved on from that. Now a professional call center actually conducts our calling for us and we are refocusing our efforts to using the Goucher poll to teach students how to analyze and write about data. And that's something that's really consistent with our new curriculum.

  • 12:02:14

    NNAMDIWe see a lot of national polls. What we see less of are polls focusing on state races and issues. How challenging is it to accurately poll at the state level?

  • 12:02:24

    KROMERI don't think it's any more challenging than what you have nationally, but I do think it's important for there to be a statewide poll in every single state. Now, a lot of small colleges like Goucher College have taken up this effort, and we are the only one here in Maryland. And I think it's a public service to the state. That's the spirit I had when I founded the Goucher poll: I wanted to make sure the people of Maryland had a voice in the policy issues facing the state, and that we were going to ask important state-level policy questions for the reporters and the media here in Maryland to cover.

  • 12:03:03

    NNAMDIMileah, what effect does it have gathering this data? In other words, what does a poll exactly do?

  • 12:03:11

    KROMERSo that's a great question. There are different methodologies for different polls. For a traditional telephone poll, which is what the Goucher College poll is, we draw what we call a dual-frame random sample of Marylanders. That means we call both cell phones and landlines. A common mistake I hear from people is that all polls only call landlines. Well, that's not true, and it hasn't been for a while -- nearly every reputable polling firm now is more cell-heavy than landline. As a matter of fact, the Goucher College poll is primarily a cell phone poll, and that's because most Marylanders use their cell phones and don't have landlines. I don't think you could even reach a respondent under the age of 40 on a landline.

  • 12:03:55

    KROMERAnd so you have these types of telephone polls that we use. We do random digit dialing, which I think is very pertinent to our discussion today: we capture not just voters but also non-voters, because we contact residents in general and then screen down to identify registered voters and then likely voters.

  • 12:04:15

    KROMERNow, telephone polls are clearly not the only polls in the game anymore. You have large online panels -- some probability-based and some non-probability -- that reach respondents through email. Some polls do text-to-web, where they send out a text message that drives respondents to a website where they can answer a poll. So there are a bunch of different methodologies. The most common one, though, is still random digit dialing or registration-based sampling of telephones.

  • 12:04:48

    NNAMDIMileah, why is it so important how you word the questions in a poll?

  • 12:04:54

    KROMERThat is the art of public opinion polling, certainly, because little changes in wording or phrasing cause a big difference. I'll give you an example from our most recent Goucher College poll. The movement to defund the police -- that policy issue has been at the top of people's minds across the nation since the summer. We asked respondents in the Goucher College poll two questions: one, if they approved or disapproved of the movement to defund the police; and then a second question asking if they approved or disapproved of reallocating police funding to social services like housing, education and healthcare. And you could see the dramatic difference in results.

  • 12:05:54

    KROMERThe majority of Marylanders were actually supportive of reallocating the funds to more social programs, versus only a small percentage of Marylanders who were actually supportive of defunding the police. Now, a lot of advocates for the quote/unquote defund the police movement are actually advocating for just a reallocation of police funding to social services. But you can see how a change in words or phrasing gets completely different answers in terms of public opinion.

  • 12:06:14

    NNAMDIWhat are some of the questions you asked in polls for this election cycle?

  • 12:06:18

    KROMERFor this election cycle, we did a battery on COVID-19, as well as a battery on policing. And so the COVID-19 stuff, I think the most important finding was how divided Marylanders were on whether they would take an FDA approved vaccine. And that is a really tough thing for public health officials, especially now, when you see the new rollouts with who will get the vaccines first here in Maryland. The state government is finally starting to talk about these things. And there's been obviously some back and forth with the federal government of when a vaccine would actually be available.

  • 12:06:54

    KROMERBut the problem we find in our numbers is that a lot of people, about half, say they wouldn't take it if it was available today. And that sort of lack of public trust in public health is really problematic.

  • 12:07:07

    NNAMDIHere's Bob in La Plata, Maryland. Bob, you're on the air. Go ahead, please.

  • 12:07:12

    BOBYeah. Hi. My daughter went to Goucher and participated -- did one year with your poll, her name was Addie, a few years ago -- and she really enjoyed it. But my question was -- I was curious as to why they're still doing presidential polls so close to the election. How is that going to benefit the election when in four days we'll know the results? So that was my question -- what does the electorate gain from polls so close to the election?

  • 12:07:43

    NNAMDIMileah Kromer.

  • 12:07:44

    KROMERThat's a great question. And can I just say, it's wonderful that your daughter is a Goucher alum and worked at the Goucher College poll. That makes me really glad to hear. But yeah, I think that is the pushback people sometimes have about public opinion polls, especially this close to the election. People really want to know the outcome before the last votes are counted. But I will say this: these pre-election polls aren't just asking the horse race. They're also asking what issues are important to Americans, and leading up to an election, understanding the electorate -- better understanding what's on the American voter's mind -- is always valuable.

  • 12:08:28

    NNAMDIThank you very much for your call. Peter in Arlington emails to ask, "How do we know a call is coming from a non-partisan poll? In other words, how do we know it's not spam?"

  • 12:08:41

    KROMERThat is another big issue that pollsters are facing. I think all of us have received a phone call offering some sort of car warranty or some other spam, and so it's become a problem for pollsters. A reputable pollster should identify right up front over the phone who they're calling for. So, for example, the Goucher College poll's intro is, "Hello, my name is --, and I'm calling on behalf of Goucher College," or "I'm calling on behalf of the Goucher College poll" -- just to let people know upfront.

  • 12:09:14

    KROMERYou'd have to ask each individual survey center what their intro looks like, but they really should be disclosing who the poll sponsor is right up front, especially for what we call public polls. That's the sort of thing that we do, that Monmouth University does, that Marist does, CNN -- the polls that are funded for public consumption.

  • 12:09:36

    NNAMDIThe reputation of polling, Mileah, took a big hit in 2016. Many polls indicated a solid win for Hillary Clinton. What happened? And how do you respond to polling skeptics?

  • 12:09:51

    KROMERRight. So everybody loves to talk about 2016, but it seems that nobody wants to recognize what happened in 2018. I'll talk about 2016 first. There were clearly some misses in key swing states by even the best pollsters, and in a really close election, a small miss makes a big difference. The election was decided by a handful of states and a handful of votes in those states. And the overall message was that many pollsters didn't weight by education. AAPOR, the American Association for Public Opinion Research, did a large study after the election cycle, and they found that a lot of the pollsters that missed were overestimating the share of college-educated voters in the electorate, particularly in those key swing states. So pollsters have adjusted for that.

  • 12:10:46

    KROMERAnd not only that, there's been a push to have more statewide polling done. National polls are great, but statewide polls in key swing states really give you a clearer picture of what's going on. The amount of polling done in individual states, especially those key swing states, has increased dramatically since 2016, so we're getting a clearer picture, and they've adjusted for the mistakes. But I'll say this: some of the adjustments were made for the 2018 polling cycle, and the polls in 2018 accurately predicted the blue wave election. So yes, there were some mistakes in 2016, but a lot of those mistakes were corrected for 2018. And I guess we're going to find out in a week if we've corrected for it in 2020.

  • 12:11:36

    NNAMDIWe have less than a minute left in this segment, but what's the difference between a poll and an election forecast?

  • 12:11:42

    KROMERYeah. So polls are inputs for forecasts. These larger forecasts you see -- the really famous ones, FiveThirtyEight and The Economist and Larry Sabato's Crystal Ball -- use polls as part of their indicators, but they also include other indicators in their electoral map forecasts. So I think the best thing people can do is certainly look at these forecasts, which are done by experts, but also look at polls in the key swing states. How close is it -- is it within the margin of error? That's the number I would be looking for.

  • 12:12:17

    NNAMDIGot to take a short break. When we come back, we'll talk about the 100 million Americans who don't vote and the reasons why. I'm Kojo Nnamdi.

  • 12:13:10

    NNAMDIWelcome back. Joining us now is Evette Alexander, Director of Learning and Impact for the Knight Foundation. She led research and analysis on the 100 Million Project, a study of non-voters in the United States. Evette Alexander, thank you for joining us.

  • 12:13:24

    EVETTE ALEXANDERThank you for having me and for covering this topic.

  • 12:13:27

    NNAMDI100 million Americans who did not vote. What's the effect of 100 million people not voting in the 2016 election -- that many people simply not participating in their own democracy?

  • 12:13:43

    ALEXANDERYes. That's the question that the foundation was asking as well. And there's certainly a deep effect when it comes to not only the outcome of the election and who is represented in elected government, but whose concerns they advocate for. And this is a chronic problem in the U.S. We've had turnout hovering around the 50 to 60 percent mark in recent decades. Part of the reason there's a big impact is that the non-voting population doesn't exactly mirror the active voting population -- those who show up most consistently to the polls.

  • 12:14:19

    ALEXANDERSo, you know, compared to those who are habitual voters, non-voters tend to have a bit lower income, are a bit younger on average, less likely to hold a college degree, which is an important factor that the previous person you had was talking about. They're less partisan typically. And while they're majority white, there's also more people of color among non-voters than among habitual voters. And so those are the voices that tend to be less represented in our elected government. And our democracy would be stronger with their participation.

  • 12:14:49

    NNAMDIWhat is the 100 Million Project and what is its goal?

  • 12:14:53

    ALEXANDERSure. So the 100 Million Project is a landmark study -- the largest study of non-voters that we know of. We surveyed 12,000 of them across the country and in 10 swing states. We undertook the project for three reasons. First, the fact that non-voters don't look exactly like those who vote consistently and are therefore underrepresented, which I mentioned previously.

  • 12:15:15

    ALEXANDERSecondly, non-voters are such a huge segment of the electorate. Like you mentioned, about 43 percent of eligible voters sat out the 2016 election, and they're not very well understood. Political polling and the parties themselves typically focus on what they call likely voters, which we heard about earlier in the segment -- those who have a history of showing up to the polls or certain characteristics that make them likely to show up. So non-voters are a group we don't really understand very well, particularly the chronically disengaged, those who show up very seldom or not at all. And they were the focus of this study.

  • 12:15:43

    NNAMDIObviously a big part of this conversation is uncovering the process involved in research and polls. So, Evette, what went into this study and how is it representative of the non-voting population?

  • 12:15:56

    ALEXANDERSure. It was a bit of a difficult group to survey. We made sure that we had a 4,000-strong national sample that was representative. We weighted it to what we knew statistically from the Census Bureau and other data sources about what the non-voting population should look like in terms of ages and races. And we also did a comparison survey of 1,000 active voters and 1,000 young eligible voters -- 18-to-24-year-olds who were eligible to vote. We wanted to compare the attitudes and behaviors of non-voters to these active voters to find some clues as to their lack of participation.

  • 12:16:39

    NNAMDIWhat do we know about non-voters, especially in terms of the reasons they choose not to vote?

  • 12:16:46

    ALEXANDERSure. It's a good question. So first of all, the study reveals that persistent non-voters are by no means a monolithic group. They're as varied as American society itself. There's no one-size-fits-all description of a non-voter, nor is there a single reason why they don't participate. But one of the goals was to understand voter turnout challenges a bit better -- why people don't vote.

  • 12:17:08

    ALEXANDERAnd we learned about that in two different ways. One, we asked non-voters directly why they don't vote. Responses varied, but the top reason was that they don't like the candidates -- that was 17 percent of respondents. They also said that they feel their vote doesn't matter, followed by not feeling informed enough on candidates or issues. Then we also learned about some of the underlying reasons non-voters don't participate by comparing their views and behaviors with those of active voters.

  • 12:17:33

    ALEXANDERAnd here's what we found. Non-voters have a lack of faith in our election system and in the impact of their vote. Large numbers of them don't believe the election system actually works -- that their votes will be counted or tallied fairly or accurately. They have concerns about the Electoral College system and why the popular vote, for example, doesn't win the election. Some believe the system is rigged or the outcome is predetermined. So that was one point where they differed from voters.

  • 12:18:03

    ALEXANDERNon-voters also engage less with news, and so they're left feeling under-informed. They tend to bump into political news as they go about their lives rather than actively seeking it out like active voters. So when they come across information on candidates or issues, it's often on, for example, a social media feed, where somebody's opinion sits on top of an article. And only about half say they feel informed enough on candidates and issues to cast a vote come election time, compared to 80 percent of active voters.

  • 12:18:32

    ALEXANDERWe also found that they are very politically diverse, so there's room for both parties to benefit from increased turnout, depending on which segment of non-voters is mobilized and in which states. Finally, to me, one of the biggest and most concerning findings of the study was that, at least at the start of the year, young eligible voters looked even more like non-voters in terms of their attitudes and behaviors than non-voters themselves did.

  • 12:18:53

    ALEXANDERSo let me explain what I mean by that. You know, they reported that they were less interested in politics, less interested in news and less interested in voting in 2020 than those with a record of not voting. They were also the group that had the most difficulty with the voting process itself. However, we did a new college poll in August. So we've seen how their views have shifted a bit.

  • 12:19:16

    NNAMDIYeah, I was about to say, because the narrative these days is that young people are likely to turn out in larger numbers in this 2020 election than they have in previous elections. But back to the issue of polling. Mileah Kromer, we received an email from Elliot, who said, "The polling in battleground states in 2016 was horrible. Almost every poll put Clinton ahead of Trump in battleground states. And Trump ended up winning the election with over 300 electoral votes. I don't believe polls at all." Mileah, it underscores the point you were making that some people are not noticing any difference between how polls were conducted in 2016 as opposed to 2018. But here is Owen in Frederick, Maryland. Owen, you're on the air. Go ahead, please.

  • 12:20:01

    OWEN (CALLER)Hi, there. Yeah, so I originally wanted to ask about how you account for apathy in polling -- how do you incorporate the people who say they just don't care about a topic, whether it's for an election or for, like, a referendum sort of ballot?

  • 12:20:23

    NNAMDIMileah Kromer.

  • 12:20:25

    KROMERLet me just push back a little bit on the first email you received. There were some misses, but if you actually look at a lot of the swing state polling, the momentum was behind Trump, and he had pulled within the margin of error in several of those key swing state polls. Once you're within the margin of error, it's anybody's game, and things broke really heavily on Trump's side that night. I just think that's an important correction to make.

  • 12:20:55

    KROMERBut to the other caller's point -- the issue of apathy, right? There are a couple of different things. We're hoping that the likely voter screens we're using are eliminating folks who don't care, who have no interest in voting, from the analysis on the backend, especially when we're doing horse race polling. But this idea of apathy does create a problem for policy polling, because you're trying to craft a question about policy issues.

  • 12:21:27

    KROMERAnd as the other guest has mentioned, some people just don't pay attention -- they are very casual observers of politics. So when you're formulating these policy questions, you have to be very careful that you're specific enough, that you're cueing them in on an attitude, but you're not giving them so much information that you bias them -- so they're just responding to the cues within the question. That goes back to the art of writing really good policy questions: you have to give enough information to cue people in to expressing an attitude, but not so much that you're biasing the results.

  • 12:22:05

    NNAMDIEvette Alexander, in the one minute we have left: this election already seems to be setting records when it comes to early and mail-in voting. Is there an expectation that the non-voting bloc is getting smaller?

  • 12:22:16

    ALEXANDERI think we will see some record turnout this year; certainly early indications show that. We had a poll in August of college students in which 71 percent said they were absolutely certain they were voting in the upcoming election. That was much higher than we were seeing prior to COVID and prior to the racial reckoning happening in the country. And those numbers were higher among Democrats and women than among Republicans or male students.

  • 12:22:40

    NNAMDIOkay. And I'm afraid that's all the time we have. Mileah Kromer, thank you for joining us.

  • 12:22:45

    KROMERThanks for having me.

  • 12:22:47

    NNAMDIEvette Alexander, thank you for joining us.

  • 12:22:49

    ALEXANDERThank you so much.

  • 12:22:50

    NNAMDIComing up next, staying safe and having fun during a scary pandemic Halloween. I'm Kojo Nnamdi.
