In recent years, the Obama administration pursued a national rating system for colleges and universities. The complexities of implementing such a system led it to abandon the effort, but some of the data that would have been used to construct the system is now available online as the College Scorecard at collegescorecard.ed.gov.
The rating system would have attempted to assess the “value added” by higher education; that is, whether a student is better off for having attended one college or university rather than another. It sought to answer an important question: Are some colleges and universities better than others at adding value for their students?
Added value is, of course, a complex and subjective measurement, which is why the U.S. Department of Education discontinued the project. Harvard Business School certainly adds a different sort of value than the Juilliard School’s dance program. But the question is still fundamentally interesting and important.
It’s also important to control for the types of students an institution admits. Students who enter with high test scores and high class rankings might reasonably be expected to have a head start on those with lower scores or less advantaged backgrounds. Simply looking at the achievements of graduates, without considering where they started, arguably measures only whom a college chooses to admit, not the value the college adds.
So, a college or university that only admits students already predicted to be highly successful might add less value than institutions that admit students from less advantaged backgrounds.
Although the federal government has abandoned this rating system, the Economist, an international news magazine, has now tackled one aspect of it. The question the Economist asks is this: Do graduates of a particular college or university do better or worse financially than we would have statistically expected them to do? In other words, if we predict how financially successful a student ought to be based on test scores, are actual graduates doing better or worse than the prediction?
The Economist surmises that institutions whose graduates are earning more than predicted might be said to have “added value,” at least in this one area. Nearly all private and public colleges and universities were included in the Economist study. Two-year schools such as community and technical colleges were not included, and neither were for-profit schools. (The rankings can be found at economist.com/collegevalue.)
According to the Economist, Winona State University, right here in southeast Minnesota, is in the upper quarter nationally in terms of the financial value added to its graduates.
According to this report, the median income of Winona State graduates is about 6 percent higher (roughly $2,189 more per year) than would have been predicted. By this measure, Winona State University is the highest-ranked public university in Minnesota, placing above even Princeton and Yale. Given this extra earning power, it’s no surprise that WSU graduates have a nation-leading 94 percent non-default rate on student loans; the “value-added bonus” identified by the Economist gives Winona State graduates more financial flexibility.
Obviously, there are many other ways to measure added value. The big one, of course, is “What have students learned and mastered?” But there are others. Are graduates engaged in their communities? Are they making a positive difference in the world? Are graduates satisfied with their lives?
All of these measures of value — and more — are important. The Economist rankings don’t tell the whole story, but they tell an important part of the story, one that prospective students should be considering as they choose their educational path. In at least this one area of value — better earnings than predicted — the state of Minnesota offers one of the best values in higher education in the nation right here at Winona State University.