Last week, U.S. News and World Report dropped George Washington University from its all-important “Best Colleges” list over inflated data. Some are questioning the value of those rankings and asking why potentially more useful information about graduation rates and post-graduation salaries isn’t collected or shared consistently. As college application deadlines near, we look at what information is and isn’t available to students.
- Kevin Carey Director, Education Policy Program, New America Foundation
- Mark Schneider Visiting scholar, American Enterprise Institute; Vice President, American Institutes for Research
MR. KOJO NNAMDIGeorge Washington University found itself in something of a pickle recently. The school acknowledged it had overstated the number of incoming freshmen in the top 10 percent of their high school class. U.S. News and World Report promptly dropped GW from its all-important "Best Colleges" issue for this year. So what's behind these mysterious rankings? A lot of patchy data, as it turns out. Since many high schools don't have class rankings, no two schools can report using consistent data.
MR. KOJO NNAMDIThe incident raises a long-standing debate that has intensified in our increasingly data-driven world. Are rankings based on academics the most important measure of a school, or should we be looking at other information like graduation rates or job prospects? It's a question a lot of students and parents are likely to be discussing at holiday dinner tables as application deadlines near. And joining us to discuss it is Kevin Carey. He is the director of the Education Policy Program at the New America Foundation. Kevin, thank you for joining us.
MR. KEVIN CAREYGood to be here.
NNAMDIAnd joining us from studios in Albuquerque, N.M. is Mark Schneider, a visiting scholar at the American Enterprise Institute and the vice-president at the American Institutes for Research. He served as the U.S. Commissioner of the National Center for Education Statistics from 2005 to 2008. Mark Schneider, thank you for joining us.
MR. MARK SCHNEIDERMy pleasure.
NNAMDIMark, I'll start with you. George Washington University's problems stemmed from what the school reported about its incoming freshmen, how many students were ranked in the top 10 percent of their graduating high school class. But that statistic is not as simple as it sounds. Why not?
SCHNEIDERWell, first of all because many schools don't calculate them anymore. And the reason for that goes back to all kinds of issues with regard to how we want to judge our students, high school students and whether or not telling a student they're in the top 10 percent or 5 percent is a good thing or not. So it's not a standard metric that is required to be reported. And what George Washington did was -- GWU -- was they creatively interpolated the scores based on some other information.
SCHNEIDERBy the way, GWU is not the only university that's misreported data. MR University recently got into hot water for misreporting its SAT scores. They too said, oh, this was an innocent mistake, though the direction of the error was clearly in their favor.
NNAMDIKevin Carey, is there any way to compare two colleges based on the class rank of their incoming freshmen using consistent and comparable data?
CAREYNot really. As Mark said we're relying on high schools themselves to make academic judgments about students. And there are thousands of high schools in America that all have different standards and different ways of grading students. So it is, at best, an approximation of the quality of the incoming class.
NNAMDI800-433-8850 is the number to call if you'd like to join this conversation. Do you think rankings like U.S. News and World Report's "Best Colleges" are important? Do you use them? 800-433-8850. How important are U.S. News and World Report's best college rankings in your view? Kevin first.
CAREYWell, they're pretty important because there are really no other rankings that rival the US News rankings from a public awareness standpoint. And clearly the colleges care about the rankings because if they didn't they wouldn't cheat on them. They are, for better or for worse, probably more for worse, the definitive list of status in our higher education system. And I would argue that's mostly what they measure, not quality but status.
NNAMDIMark Schneider, same question.
SCHNEIDERSame answer. It -- yes, I think the US News and World Report ranking system -- first of all, it only covers a handful -- a relative handful of institutions compared to the hundreds and hundreds of campuses that are out there. And it does distort a lot of decision making because people want to be in the top 10 to top 25. And a lot of bragging rights go along with that. But the differences are just arbitrary.
SCHNEIDERYou know, ranking one versus ranking three doesn't matter. And the numbers of students that are involved or enrolled in those small numbers of schools is actually distorting a lot of the discussions about what we really need in terms of measuring the quality of the schools that most students go to.
NNAMDIKevin, so what do the US News and World Report rankings actually tell us?
CAREYWell, they use a lot of different statistics to compile these rankings. It's partly based on a survey of presidents and deans. It's partly based on things like funding and class size and the percent -- SAT scores, the percentage of people in the top 10 percent of their class. But I would argue that all of those different statistics more or less represent three things: how famous you are, how exclusive you are and how wealthy you are. So a rich, famous, exclusive college will always do well in the rankings, which is why Harvard and Princeton are always number one and number two or tied.
NNAMDIWhat then, if those are the strengths of the ranking, are their weaknesses?
CAREYWell, wealth, exclusivity and fame don't really tell you anything about the quality of the education you're going to get, nor the value that you'll get as a parent or a student taking out loans or paying for college. So it's not a bad idea to go to a very well known high-status institution. If you can go to Harvard it's -- you'll probably do okay for yourself. But for the vast majority of people who aren't in that position and actually need to find a college or university that will provide them with a high quality education, help them get a job and not charge them so much money that they're burdened by terrible student loans, the US News rankings really have very little to say.
NNAMDIBut there are other best college rankings out there. Newsweek, for instance, does its own version. How did US News and World Report come to be the definitive list?
CAREYBy accident, pretty much. Back in 1983 the editors of U.S. News -- it was a news magazine at the time -- were kind of sitting around thinking about story ideas. And they had the bright idea of sending out a survey to -- actually not all colleges, just some of the top colleges -- asking them who they thought were the best colleges. They reported the results and it turned out that people were very, very interested in this question. Because 1983 was kind of a moment in the history of our higher education system when the market was really becoming national.
CAREYYou had a lot of people going to college. College degrees were becoming worth more in the job market. We had deregulated the airline and the telecommunications industry so you could fly to college on the other side of the country and call back to your parents for less money than you used to. And so people really were very interested in what the "best colleges" were. And US News provided a kind of pseudoscientific answer to that question. And they got there first and they've been there ever since.
NNAMDIWe're talking with Kevin Carey. He joins us in our Washington studio. He is the director of the Education Policy Program at the New America Foundation. Mark Schneider joins us from studios in Albuquerque, N.M. He is a visiting scholar at the American Enterprise Institute. He's also vice-president of the American Institutes for Research. Mark served as the U.S. Commissioner of the National Center for Education Statistics from 2005 to 2008.
NNAMDIWe're talking about rating colleges and inviting your calls at 800-433-8850. How did you choose your college? Did you rely on data such as its academic ranking? Should schools be required to make post-graduation salaries and graduation rates public? 800-433-8850. Mark, we've just, I guess, punctured the balloon of the best colleges ranking. So what other objective information do students have when making this major decision about where to attend college?
SCHNEIDERWell, unfortunately there are some data out there that students could use, like graduation rates, but they're badly measured. And one of the projects that I'm working on now with a bunch of states is to try to get information about the wage outcomes of graduates available to students before they enroll. So you should know, for example, that if you major in a certain program at a certain university your likely wages may be $22,000 a year, while with a different major from the same university your likely wage outcome is maybe $50,000. Or how much a political science program at one university costs you per year compared to another.
SCHNEIDERThese data are -- we can get to these data. It's hard work, and then we have to figure out how to make them accessible to students, but we can do that. And this is -- it's called know before you go. You know, what's the likely outcome, in terms of wages, that you'll experience after you graduate from a certain program. And this you could see as fundamental because it should advise you about how much money you should borrow. If you're going to borrow 50 or $60,000 and your expected wages are $25,000, you're probably going to be in a world of hurt. If you borrow $20,000 and your expected wages are $50,000, that's a damn good investment.
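The rule of thumb Schneider describes here, comparing what you borrow against what you can expect to earn, can be sketched as a simple check. This is a minimal illustration, not his project's method; the one-year's-salary threshold is a common heuristic that we assume for the example, not something stated in the conversation.

```python
def borrowing_check(total_debt: float, expected_annual_wage: float) -> str:
    """Flag a likely-unaffordable student debt load.

    Heuristic (an assumption, not from the transcript): borrowing
    more than roughly one year's expected salary is risky.
    """
    ratio = total_debt / expected_annual_wage
    if ratio > 1.0:
        return f"risky: debt is {ratio:.1f}x expected wages"
    return f"reasonable: debt is {ratio:.1f}x expected wages"


# Schneider's two examples:
print(borrowing_check(60_000, 25_000))  # a world of hurt
print(borrowing_check(20_000, 50_000))  # a damn good investment
```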
NNAMDIWell, Kevin, it's my understanding that maybe a lot of schools don't even quite know who's graduated or not.
CAREYYeah, schools don't do a very good job of keeping track of their students because from a financial perspective it doesn't really matter to them how many people they graduate. It only matters how many enroll because that's who pays tuition. And if you're a public institution usually your state support is based on enrollment. But the great thing is that we don't even actually have to require schools to tell us what the earnings of their graduates are because the government already knows.
CAREYThe data that Mark was talking about comes from federal and state databases, I think state databases in the case of Mark's project. But if you think for example about the Social Security Administration, every year I think up until this year, everyone gets a letter in the mail from the Social Security Administration saying here's how much money you made every year since you started paying into the system. And based on that here's how much money you're going to get from Social Security.
CAREYSo we already know how much people are making. All we need to do is find out where they went to college, which is a pretty trivial thing to do from a data perspective and we can calculate on behalf of colleges and students what the earnings are rather than relying on colleges to do it themselves. And frankly they either will do a bad job or will refuse to do it at all.
NNAMDIWell, you pointed out that -- oh, please go ahead, Mark.
SCHNEIDERYeah, and the nice part about using these administrative data is that they can't be cooked, right? A university doesn't control these, so a university can't say, oh, 90 percent of our graduates are employed making on average $100,000 a year, because that is verifiable from datasets that the schools don't control.
NNAMDIOn to the telephones. Here is Kevin in Reston, Va. Kevin, you're on the air. Go ahead, please.
KEVINHi. Yeah, I just wanted to share a comment. I'm two years out of college right now, and when I was a high school senior applying to colleges I was lucky enough to be in the top 10 percent of my high school class and a National Merit Scholar, two items that colleges typically recruit for. And in the college application process I received about 70 pounds of college advertising in the mail -- I did weigh it -- and about $800,000 in scholarship applications. But these were scholarships to schools that weren't necessarily academically top-tier but did offer a full ride, such as Arizona State, Ohio State, Texas A & M, large schools like that.
KEVINAnd what I found is that in a lot of the college guides and application guides they would list the schools and list the exact number of national merit scholars at these particular institutions, and the exact percentage of students in their top 10 percent. But what the college guides did not explain and what I find a lot of kids who at the time were applying and kids who are applying now don't understand, is that those numbers are not indicative of academic success or the quality of the academic institution or the quality of the education or even the usefulness of the degree in finding a job.
KEVINIt's simply indicative of the amount of money that the schools have that they're willing to essentially bribe the people to come there and in turn increase their rankings. It's really -- you know, it's become a game. It's become an arbitrary ranking. And I think a lot of people are not necessarily knowledgeable about this, and they just see the numbers and go, oh wow, that must be good, I should go there. When, you know, it's more about fit, it's more about academics, class size, that type of thing. So...
NNAMDIWell, I'm glad you made that point because, Kevin Carey, you made the point earlier that schools are not as interested in who's graduating as who's coming in and who's able to pay dollars. And Kevin used the term bribe people to come there. There's a lot of bribing and schmoozing that goes on in getting people to come there. You point out that some colleges actually employ Disney consultants.
CAREYSo the -- yes, we should expect that colleges will act like anyone who's in the business of selling something. They will put their best face on the data. They're not going to be objective about themselves, and it's not really reasonable to think they will be. Part of that is the 70 pounds of brochures that our caller just talked about. And part of that is the campus tour, this kind of experience when you visit and they bring you around.
CAREYAnd in fact, a couple of years ago the Chronicle of Higher Education wrote a whole story about how there are tour consultants now who got their training at Disney World in the kind of amusement park experience. And have brought that experience to revamping the college tour, which suggests perhaps that colleges are more like amusement parks than anyone would like to admit.
NNAMDIGot to take a short break. Thank you very much for your call, Kevin. When we come back, we will continue this conversation. If you have called stay on the line, we will get to your calls. We're talking about rating colleges and taking your calls at 800-433-8850. What data on programs do you think schools should collect? What data on programs do you think they should be sharing? And tell us about your own experience.
NNAMDIHow did you choose your college? Did you rely on data such as its academic ranking? 800-433-8850, send us email to email@example.com, a tweet @kojoshow, or simply go to our website, kojoshow.org, join the conversation there. I'm Kojo Nnamdi.
NNAMDIWelcome back. We're discussing rating colleges with Mark Schneider, visiting scholar at the American Enterprise Institute and vice president at the American Institutes for Research, and Kevin Carey, director of the Education Policy Program at the New America Foundation. Mark, most people understand that it's important to consider what kind of salary they're likely to make after graduation, but you say a return on investment calculation is more important. Can you explain what that means?
SCHNEIDERWell, we need to take into account how much you spent in terms of time and actual money getting your degree. So we could do a return on investment calculation that takes into account how many years it took you to get your degree, what's the probability that you're actually going to earn a degree, how much you spent on achieving that degree, and then we can look at the wages. And actually, once you have those kinds of numbers, it's not very hard to compute a return on investment.
SCHNEIDERI've been doing this for some institutions and the return on investment in some schools is lower than the cost of money. So in other words, you invest say $50,000 in getting a degree, and you would have been much better off taking that money and investing it in a savings bond actually, or, you know, money market, compared to how much it's costing you to borrow that money. The numbers from the student perspective can be quite scary.
SCHNEIDERWe need to push this down to the level of the program, and what I think about the return on investment number is that it's a summary statement of all the costs that you've borne in earning that degree relative to what your likely outcome is. So it's a metric that's very powerful in helping students understand the likely payoff of their decisions in terms of colleges or degrees or programs.
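The inputs Schneider names -- cost, years to degree, completion probability, and wage outcomes -- imply a calculation along these lines. This is a rough sketch under invented illustrative numbers; the state-level models his projects use are more involved, and the 20-year horizon and no-degree wage baseline are assumptions added for the example.

```python
def expected_roi(annual_cost: float, years_to_degree: float,
                 completion_prob: float, wage_with_degree: float,
                 wage_without_degree: float, horizon_years: int = 20) -> float:
    """Rough program-level return on investment.

    All inputs are illustrative assumptions: cost includes tuition plus
    forgone wages while enrolled; the payoff is the wage premium over a
    no-degree baseline, discounted by the chance of actually finishing.
    """
    total_cost = years_to_degree * (annual_cost + wage_without_degree)
    annual_premium = wage_with_degree - wage_without_degree
    expected_gain = completion_prob * annual_premium * horizon_years
    return (expected_gain - total_cost) / total_cost


# E.g. a 4-year program: $10k/yr in direct costs, 60 percent completion,
# a $50k wage versus a $30k no-degree baseline, over 20 working years.
roi = expected_roi(10_000, 4, 0.6, 50_000, 30_000)
```

As Schneider notes, the point of collapsing these factors into one number is comparison: the same formula run at the program level can reveal degrees whose return is lower than the cost of borrowing the money.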
NNAMDIHow difficult would it be for all colleges and individual programs to collect and provide information like graduation rates and post-graduation salaries for different majors?
SCHNEIDERWell, it turns out not to be that hard, except that we've, up till now, lacked the political will to do this. We have a federal system that collects really archaic measures of graduation rates that have nothing to do with what schools look like today. But the fact of the matter is that most states actually collect data that can help us track graduation rates for individual students, which can then be aggregated up to specific kinds of students.
SCHNEIDERSo we know what -- we can get much more accurate graduation rates. We can track the time it took on average for a student to earn a degree in a given program, and we could calculate the true cost of getting that degree. So I'm working with, for example, Tennessee and Texas, Virginia, all of whom have the data systems that allow us to do these kinds of ROI calculations, the return-on-investment calculations at the program level.
SCHNEIDERAnd one of the things actually that we'll discover -- that we have discovered already, and that will become increasingly clear, and is actually worth a lot of thought -- is that most estimates are that even while we have eight percent unemployment in the United States, there are about three million jobs that are sitting empty because of a skills mismatch between the students being turned out by colleges and the needs of employers.
SCHNEIDERAnd one of the things that we've discovered is that technical degrees, both from community colleges and from four-year schools, command a wage premium, because these are the jobs that people need to fill right now. And some of the numbers are actually quite astounding in terms of the size of that premium, and students should know that. We've tended to think about associate degrees and technical degrees as also-rans, because we tend to think that liberal arts are the things that matter the most, and in fact liberal arts are important.
SCHNEIDERI don't want to be in a position of denigrating them, but at the current time we have a skills shortage, and technical degrees are commanding quite a significant premium in many cases.
NNAMDII want to tell our callers to stay on the line, because I'd like to pursue this thought for one second. We will get to our calls, but isn't a task force in Florida recommending, or has recommended using lower tuition as an incentive for students to study particular majors? Mark, what do you think of that idea?
SCHNEIDERSo I think a lot right now is up to the states. They run the state campuses where most students attend. They have more regulatory authority than the federal government. They actually already have the employment data, the wage data. So states could push really hard on this if they want to, and an increasing number of states want to do this and will be doing this.
SCHNEIDERI think what we're going to end up with is alternate experiments to see how this works. Lower tuition might be one; rewarding schools -- or programs -- with additional revenues, state revenues for example, for placing students in good jobs may be another. I'm actually much more in favor right now of simple information. I think it's a little early to think about regulations based on employment outcomes.
SCHNEIDERBut I believe that we're probably in a good enough position right now that parents and students, legislators, governors, should know what the wages look like, what the economic returns from graduates in different programs are. I'm a little bit hesitant right now, and probably for the next few years, about whether or not we should start regulating on the basis of this.
NNAMDIKevin, the federal government requires for-profit colleges to report employment information. What kind of information is reported?
CAREYWell, actually, exactly the information we've been talking about. The federal government -- for all programs at for-profit colleges, and actually all certificate and job-focused programs at non-profit colleges, including community colleges, so they're under this umbrella too -- calculates the ratio of debt to income. It looks at how much money you borrowed and how much money you're earning afterwards, which is just another form of the return on investment calculation that Mark was just talking about. And if your ratios are very bad over a certain number of years, it will kick you out of the federal financial aid programs.
CAREYThe idea being that the taxpayers should not be subsidizing overly expensive degrees that don't lead to jobs where people have enough money to pay their debts back.
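The debt-to-income test Carey describes reduces to a simple ratio. This sketch is illustrative only: the actual federal gainful-employment rule uses specific debt-payment-to-earnings thresholds and multi-year measurement windows that this simplified version does not reproduce, and the 12 percent cutoff below is an assumed example value, not the regulatory threshold.

```python
def debt_to_income_ratio(annual_debt_payment: float,
                         annual_income: float) -> float:
    """Share of a graduate's income going to student-loan payments."""
    return annual_debt_payment / annual_income


# A program whose typical graduate pays $4,800/yr on loans
# while earning $30,000/yr:
ratio = debt_to_income_ratio(4_800, 30_000)

# An assumed example threshold -- programs above it would, in spirit,
# risk losing federal aid eligibility under a rule like the one described.
EXAMPLE_THRESHOLD = 0.12
fails_test = ratio > EXAMPLE_THRESHOLD
```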
NNAMDIOnto the telephones. There are a lot of people who would like to get in on this conversation. We will start with Rick in Arlington, Va. Rick, your turn.
RICKHi Kojo. Thank you. I did a lot of research on the U.S. News rankings of law schools some years ago by talking to people at U.S. News and others. I'd just like to share a couple of observations with you. One is, I think your people missed at least one reason, or two, why the U.S. News rankings have prevailed at all levels. The first was, they were presented as news. People didn't have to search them out. They were in magazines that came to their home and they had that authority.
RICKThe other reason is the rankings are quite plausible. They're very specific. You look at them, Harvard is on top, Amherst is up there, et cetera. You say, gee, this looks right. What people don't realize is there are a hundred different rankings that could have looked right to people. So that's how they got their foothold. But the more fundamental point that I learned is that every measure they use, and I was looking at law schools, though it's pretty true of undergraduate schools as well, was either seriously flawed or biased or open to manipulation by the law schools or the colleges and universities.
RICKOne common bias, for example, was a favoritism toward small schools, because some measures, like library resources and others, were per capita, and that doesn't really make sense. For example, the law school rankings originally counted library books per capita, when what you want is the total books in the library, not books per student.
RICKIf I can share just one anecdote with you that captures...
RICK...the perversion of it. I gave a talk at a law school that will remain nameless that was in the fifth tier of U.S. News, the very bottom tier, and the people hated the rankings and they loved my talk. And then about two months after I got back, I got a message that the dean had sent around to the faculty boasting about how this law school had risen to the fourth tier. What the message didn't say was that U.S. News had changed its ranking system so there were only four tiers. (unintelligible)
NNAMDIThank you very much for your call. We got an email from Mike who says part of the GWU story not widely known is that GW Law School gives each of its grads a stipend for one year after graduation if they can't find employment, and that way they become a freebie for one year to a law firm or whatever, and GW Law can boost their stats on the number of law students who obtain a job, a widely-used statistic by law schools to attract students. Care to comment, Kevin?
CAREYWell, what I think this shows is that while everybody in higher education likes to complain about the U.S. News rankings, in fact, the rankings are really just a way of empiricizing (sic) a status hierarchy and a set of values that the higher education establishment itself holds dear. If they all really agreed that these rankings were meaningless, then consumers wouldn't care and U.S. News couldn't sell them.
CAREYBut in fact this cheating that goes on, the competition, either at professional schools or at the undergraduate level, there have been cases where presidents have gotten bonuses written into their contracts if they could increase their U.S. News rankings. So the industry has embraced the values underneath the rankings, and so the rankings endure. The magazine doesn't, but the rankings do.
SCHNEIDERBut this is why I believe we need to move away from these kinds of surveys into much more rigorous and objective data that schools can't cook and that they can't lie about.
NNAMDIHere's Steve in Silver Spring, Md. Steve, you're on the air. Go ahead, please.
STEVEThanks, Kojo. I'm an education professor at a university, and I have dozens -- and there are likely hundreds -- of criticisms of these rankings, yet they endure for the reasons that were said. But some of the critiques, just briefly, that haven't been mentioned: we have only 20 or so indicators reduced to one number. Now, there's lots of variation in the 20, but the reduction to one number gives a false precision, and, you know, out of the 20 indicators picked, you could name another hundred that should be in there.
STEVEAnd on top of that, this is as much a qualitative issue as a quantitative issue, and nowhere are qualitative discussions included. And lastly, this is just driving the universities around the country, and they have no choice but to get on the bandwagon, and like was said, you know, promotions and tenure of presidents depend on these rankings to the exclusion of almost everything else. We're doing education a disservice with these rankings.
NNAMDIMark Schneider, are schools indeed being painted into a rankings corner, so to speak?
SCHNEIDERYeah. Rankings are inevitable, and I don't -- I could imagine ways that we could get out of it, but people are always going to compare, and I think the question is, what are they comparing on. There is a fundamental point in the last caller's comment about how many dimensions there are and how you wrap them up into a single ranking or a single number, and there's no reason to do that.
SCHNEIDERYou know, some people may value some dimensions much more highly than others, and actually we could easily disaggregate these numbers and allow people to do their own weighting systems to figure out what is valuable to them. But again, we keep coming back to this point that -- and Kevin has articulated it very clearly -- there's a lot riding on these rankings. The winners love them, bragging rights go along with them, and so does the industry in general -- but not the entire industry.
SCHNEIDERRemember, there are hundreds and hundreds of schools that most students attend that don't make it into these rankings. And what we really need to be mindful of is not only the, you know, the top schools -- we really have to be mindful of how to help students make choices among the regional campuses, the comprehensive campuses that most students attend.
NNAMDIHere's Chris in Silver Spring, Md. Chris, you're on the air. Go ahead, please.
CHRISHi Kojo. I wanted to talk about what people think of as education, which is what actually happens in the four or five years that you're actually on campus and in the university. The problem with the U.S. News and World Report rankings is that it's all input data as some of your guests have noted. It's all about how much money you have, and all the inputs.
CHRISAnd then the problem with your guests' idea about this return on investment, which I think is, to be honest, a very silly idea, and it would be just as superficial and subject to abuse and gaming as others, as the U.S. News ranking, is it's focused all on outputs, and neither gets at what actually happens in the college. Like what kind of education are the students getting, and I guess I'd love to hear your guests talk about the national survey on student engagement, which is...
NNAMDIOkay. I'm glad you brought that up because we're running out of time. So I'll start with you, Mark. College, I guess our guest is saying is about more than academics. Shouldn't that figure into decisions as well?
SCHNEIDERWell, so I don't -- Kevin actually is a bigger fan of NSSE, the National Survey of Student Engagement, than I am. But I believe that we are getting closer to measurements of what students actually learn while they're in school, and we're probably going to make a lot of progress on that. So let me go back to my silly idea...
NNAMDIYou got about 30 seconds, yes. Go ahead.
SCHNEIDEROkay. So the reason that I talk about ROI is because the wages that students earn are an objective indicator of the value of their learning. And remember, when students are asked why they go to college, 90 percent of them say for a good job and career.
NNAMDIAnd Kevin, the issue of student engagement as a factor?
CAREYWell, you know, that survey is very interesting, and here's what's interesting: hundreds and hundreds of colleges around the country have administered this survey -- it's a survey of best educational practices -- to their students, but they keep the data to themselves. They will not release it. So this is one of the reasons we are stuck with rankings that focus on inputs: to the extent that colleges have information about the quality of their own education, they will not make it public, and so there are no such rankings.
NNAMDIKevin Carey. He's the director of the Education Policy Program at the New America Foundation. Kevin, thank you for joining us.
NNAMDIMark Schneider is a visiting scholar at the American Enterprise Institute and vice president at the American Institutes for Research. He served as the U.S. Commissioner of the National Center for Education Statistics from 2005 to 2008. Mark, thank you for joining us.
NNAMDIAnd thank you all for listening. I'm Kojo Nnamdi.