Imagining A Robust Ranking of Christian Universities
Can the US News Ranking system be adapted for Christian Liberal Arts Institutions?
This week my friend Chris Gehrz, in his ongoing series of how to choose a college, did a deep dive on the annual college rankings from US News. Here’s a taste:
Each college profile at U.S. News also collects in one place the statistics I mentioned in the last two chapters, like net price, graduation rate, and typical student debt. But the second thing to understand about U.S. News and its competitors is that each system emphasizes some criteria and deemphasizes (or ignores) others. While USN has started to pay more attention to colleges’ graduation rates and economic “return on investment,” it still considers the academic caliber of student a school attracts and how much its faculty are paid, neither of which necessarily reflects the teaching or learning that happens there. Most controversially, one-fifth of its ranking points continue to come from subjective peer assessment: U.S. News still asks college leaders to rank rival institutions on a five-point scale. More and more of those leaders are skipping that survey, but for now, a school’s reputation among people who have never studied or taught in its classrooms or labs can still count for more than its graduation rate.
As Chris observes, 20% of an institution’s overall score comes from peer reputation. I remember when I became Chief Academic Officer at Warner Pacific in the mid-90s. One day I received this survey in the mail. It listed several hundred private liberal arts institutions and asked me to score them on a number of criteria, including academic quality. There were certainly other criteria, likely including faculty strength, student performance, and financial viability — frankly, who remembers after 30 years!
I knew almost nothing about most of the schools on the list. Sad to say, I gave responses on those I knew and left the rest blank. Realizing how tautological the peer evaluation is, I didn’t participate in future years. I came to loathe the US News rankings, although I took a perverse pride when my school did well.
According to US News, here is the breakdown of their criteria for schools whose incoming students submitted SAT or ACT scores (allowing a measure of student quality).
Peer evaluation is still 20% of the score. The six-year graduation rate for students entering as freshmen is another 16%. If that rate is improving over time, there’s another 10%. Pell Grants factor in for a total of 11%, split between the six-year graduation rate of Pell recipients and the ratio of that rate to the overall graduation rate (a higher ratio earns more points). Financial resources and faculty salaries each get 8%. There’s 4% for having a low student-faculty ratio and 3% for a favorable balance between full-time and adjunct faculty.
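To make the arithmetic concrete, the weights named above can be combined as a simple weighted sum. This is only a sketch: the weights below are the ones stated in this post (they total 80%, since US News’s remaining factors aren’t listed here), and the 0–1 sub-scores for the example school are invented for illustration.

```python
# Sketch of a US News-style weighted composite, using only the weights
# named in the post (they sum to 0.80; remaining US News factors omitted).

WEIGHTS = {
    "peer_assessment": 0.20,
    "six_year_grad_rate": 0.16,
    "grad_rate_improvement": 0.10,
    "pell_measures": 0.11,             # 11% split across two Pell measures
    "financial_resources": 0.08,
    "faculty_salaries": 0.08,
    "student_faculty_ratio": 0.04,
    "full_time_faculty_share": 0.03,
}

def composite_score(subscores: dict[str, float]) -> float:
    """Weighted sum of normalized (0-1) sub-scores, scaled to 100."""
    return 100 * sum(WEIGHTS[k] * subscores.get(k, 0.0) for k in WEIGHTS)

# A hypothetical school scoring mid-range (0.5) on every factor:
example = {k: 0.5 for k in WEIGHTS}
print(round(composite_score(example), 1))  # 40.0 of the 80 points covered here
```

The point of the sketch is that the weighting is linear: a school strong on the 20% peer-assessment factor can offset weakness everywhere else, which is exactly the self-perpetuating dynamic discussed below.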
I’ve already commented on the problem with peer evaluation and its self-perpetuating character. But this is true for most of the other measures as well. Standardized test scores and graduation rates are highly correlated with admissions selectivity. Well-endowed institutions will have more financial resources and higher faculty salaries.
Or to say it the other way around — medium-sized Christian liberal arts institutions have moderate admissions standards, lower endowments, smaller overall budgets, and lower relative faculty salaries. Their graduation rates run a bit higher than those of public universities, but not overwhelmingly so. They enroll a relatively larger percentage of students on Pell Grants, but those students likely won’t be retained or graduate at the same rate as their non-Pell peers.
But wait, you say, didn’t my Christian University just release info from their marketing department about how well they did on the latest survey? Yes, they did. But what most schools shared was their ranking among Regional Liberal Arts Institutions. The National list is more prestigious, but competing in the Regional category lets schools brag about being #33 on a US News list.
Since Chris released his piece on Tuesday, I’ve been pondering the problem of ranking Christian Universities. Last Friday I received an email from Dahn Shaulis who has a website called highereducationinquirer.org. He asked me, “What do you see for the future of religious-based schools?” I responded that it was an almost impossible question to answer given the complexity of the Christian Higher Ed sector.
Last February, I wrote about one attempt to create a meaningful ranking system of religious schools. Here’s part of what I wrote:
Glanzer uses a number of variables to rate the various institutions to create an Operationalizing Christian Identity Guide (OICG). These range from self-identification as a Christian college, expecting the constituents (students, faculty, staff, trustees) to be Christian, required Bible or Theology Courses in a Religion department, required or voluntary chapel, placing student expectations in religious language, and whether there are student groups supporting other faiths.
These measures did a good job of distinguishing formerly religious schools and Catholic schools from Christian Universities. But I don’t think they do much to distinguish between Christian schools (although the authors do think having four required Bible classes is “more Christian” than having two).
I would add some of the US News criteria here. The ratio of full-time to adjunct faculty is important. Graduation rates, including those for underrepresented populations, would make my list (accrediting bodies are asking for this already). The item comparing graduate salaries to those of high school graduates (ROI) is important but worth less, given the service orientation of many Christian University students.
I would love to see a comparative measure on the size of the core general education requirements. While it’s possible for these to be too large, a healthy balance of graduation requirement credits devoted to general education would be good to have. I would examine student-faculty ratios in the first two years and again for the later years. To me, smaller classes at the introductory level are key to the success measures.
In terms of institutional strength, I’d naturally look at the percentage of the operating budget devoted to instruction (library resources, athletics, and student services would be a different measure). The stability of that percentage over time would be worth tracking. Enrollment change over time is another factor. Too much rapid growth or a long term decline both present challenges. An item I dealt with in my book is examining the undergraduate population as a percentage of total enrollment. The larger this is, the healthier the institution.
A measure of diversity — in undergraduate population, faculty and staff, and perspective — would be great to have, although the latter component is harder to measure. So is the ongoing church involvement of students after graduation. I would look at the percentage of young alumni who give anything to the annual fund as an indirect measure of ROI (if they didn’t value their experience, they won’t give).
I haven’t said anything about hiring Christian faculty or having a robust chapel and spiritual life program, mostly because they tend to be found in some measure in most Christian Universities. Given the theme of my book, I’d likely try to work out some demerit measures for being in the news as Culture Warriors.
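The criteria proposed over the last several paragraphs could be assembled the same way US News does it, as a weighted scorecard with a demerit subtracted. To be clear, everything in this sketch is hypothetical: the weights are placeholders I’ve invented, not values from the post, and the metric names are shorthand for the measures described above.

```python
# Sketch of a scorecard built from the criteria proposed above.
# All weights are hypothetical placeholders; the culture-war demerit
# is subtracted from the weighted total.

PROPOSED = {
    "christian_identity": 0.15,         # Glanzer-style OICG measures
    "full_time_faculty_ratio": 0.10,
    "grad_rate_all": 0.10,
    "grad_rate_underrepresented": 0.10,
    "roi_vs_hs_grads": 0.05,            # weighted low: service-oriented students
    "gen_ed_share": 0.10,
    "intro_student_faculty_ratio": 0.10,
    "instruction_budget_share": 0.10,
    "enrollment_stability": 0.05,
    "undergrad_share": 0.05,
    "diversity": 0.05,
    "young_alumni_giving": 0.05,
}

def score(subscores: dict[str, float], culture_war_demerit: float = 0.0) -> float:
    """Weighted sum of 0-1 sub-scores, scaled to 100, minus any demerit."""
    base = 100 * sum(PROPOSED[k] * subscores.get(k, 0.0) for k in PROPOSED)
    return max(0.0, base - culture_war_demerit)
```

Whatever the final weights, writing the scheme down this way forces the hard question the post raises: which measures actually distinguish Christian universities from one another, rather than simply rewarding wealth and selectivity again.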
This is a beginning list. And I’m not sure how I’d gather all the metrics described above. I hope you’ll add things I’ve missed in the comments below.
Who knows: maybe there’s another book in me yet!
Merry Christmas! Wishing you the best for the new year!