Last week a small, Boston-based firm prepared to release a report about the financial status of nearly 1,000 American colleges and universities. Inside Higher Ed prepped an article on Edmit’s report.
In this post I’ll try to summarize my best understanding of what occurred, then explore what on Earth this might mean for post-secondary education. As always I’m eager to hear your thoughts.
Edmit built an analytical model of campus financial sustainability. According to IHE, this rested on “four primary variables: investment return on endowment funds, tuition prices, tuition discounting and faculty and staff member salaries.” Edmit fed IPEDS data from “946 private colleges” into the model, added qualitative information, then produced a report on each school’s likely financial future.
A key detail of those college reports concerned the chance of each one closing:
The projections used qualitative and quantitative data, from federal sources, to estimate how long before the net expenses for the 946 private colleges exceeded their net assets. After that, the model assumed the colleges would fail, because no enterprise can continue to operate without taking in enough to pay its bills. The model provided that information in a single number so it would be accessible. That number was the estimated time until closing for each college. [emphases added]
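Edmit never released its code, so purely as an illustration of the kind of "runway" calculation described above: project each year's net tuition (after discounting) plus endowment returns against salaries, draw down net assets by any deficit, and count the years until assets run out. Every parameter name and number here is my own assumption, not Edmit's actual model:

```python
def years_until_at_risk(net_assets, tuition_revenue, endowment_return_rate,
                        endowment, discount_rate, salaries, max_years=50):
    """Estimate years until cumulative net expenses exceed net assets.

    A rough sketch of the runway logic the article describes, NOT
    Edmit's actual (unreleased) model. All parameters are illustrative.
    """
    assets = net_assets
    for year in range(1, max_years + 1):
        net_tuition = tuition_revenue * (1 - discount_rate)  # tuition after discounting
        investment_income = endowment * endowment_return_rate
        surplus = net_tuition + investment_income - salaries
        assets += surplus  # deficits erode net assets
        if assets < 0:
            return year  # the model assumes the college fails here
    return None  # projected to last beyond the horizon ("indefinitely")
```

A college running a steady annual deficit would get a single number of years, the "estimated time until closing"; a college running surpluses would project out indefinitely, as the Mount Ida example below shows the model doing.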
In 2019 Edmit further developed this analytical tool in-house, supplementing it with an open source project:
The company’s co-founders planned to publish the projections on Github, a platform for open-source projects, under its own logo and the Inside Higher Ed banner. The source code was available, and a lengthy explanation was planned, saying the list was an early measure, that its developers were seeking feedback and potential improvements, and that students and parents shouldn’t base college decisions on it.
Inside Higher Ed started researching the Edmit effort and offered an opinion piece to the firm. This journalistic work alerted campuses and academic organizations to the analytical tool’s existence. Some reacted with lawyers and legal challenges to both Edmit and Inside Higher Ed. As a result Edmit called off the release, and IHE published the somewhat disappointed article linked above.
What were the colleges’ arguments against this forecasting tool?
Some questioned the data as inaccurately describing a given campus’ future:
Pete Boyle, a spokesman for NAICU, said via email[:] “How much sense does it make that four short-term data points can define a college’s long term future, and that colleges do not change and adapt to challenges over time?”
There were other criticisms about the project’s formal features:
The report on the methodology lacked specificity, explanation and breadth, Boyle and others said. The supporting data regarding school closures were questionable, the background research on the choice of explanatory variables and method was lacking, and supporting arguments for choosing the variables were absent.
As one university’s counsel reportedly put it:
“It would be reckless for a respected higher education publisher such as Inside Higher Ed to make such predictions based on old, incomplete, and inaccurate data and an admittedly flawed model,” the lawyer [for Herzing University] wrote.
Another college went further than charging recklessness: “Utica [College] threatened to sue if Inside Higher Ed published an article on Edmit’s projections.”
Perhaps the strongest criticism was that publishing such forecasts could harm some of the colleges being researched. (From the IHE article: “One college president emailed with the subject line ‘IHE Article Puts Students and Colleges at a Greater Risk?'”) For example, a prediction that a campus was likely to close in, say, ten years would seriously depress student applications, faculty and staff applications and morale, and charitable giving… all of which would speed the institution’s decline. Put another way, the public act of observing a college could alter its status (on Twitter I called this a kind of Heisenberg effect). In this view, Edmit’s research could close campuses.
As Paul Fain speculates,
Others may have felt their colleges were on the brink of collapse and had to fight against unflattering media coverage with every available resource or risk that collapse accelerating.
Or as commentator Karen Gross argues, the “[l]ist would close off admissions substantially and quickly, well in advance of demise…”
Are these arguments correct? To an extent we can’t fully assess them, since the Edmit analytical tool is still in the dark. But working with what we have, we do have to wonder if such a report could have done harm to colleges already teetering on the financial brink. If we arrive at that conclusion, then keeping the analysis from the light of day was the correct action.
On the other hand… to begin with, Edmit and its data advisory group stand by their data collection and analysis.
Ducoff and Manville… tried to avoid false positives, such as by not requiring a cash cushion for colleges in the forecasts. That means the model was too conservative in some cases. For example, Mount Ida was projected to last indefinitely.
They also argued for some proven accuracy, based on recent history:
While the projections might not fully capture the financial health of some colleges, Edmit had evidence that the forecasts could be accurate. That’s because several colleges included in the modeling tool have shut down during the last several years. Almost all of those colleges had precarious finances, according to the projections.
Here are the model’s estimates for how long it would be before those colleges would have been at risk of closing: Southern Vermont College (four years), Green Mountain College (six years), Marylhurst University (six years), Concordia College of Alabama (six years), Marygrove College (seven years), Newbury College (seven years) and Grace University (seven years).
Remember, too, the open source supplement, which would give people the chance to improve both data and model. Or to fork their own.
Further, the Edmit group argues that students should have access to such information, given the important decisions they make about attendance. I might put it another way. If someone attends a college and it collapses during or after their studies, wouldn’t they have preferred to have known the risks ahead of time? Try this question on for size: is it unethical to block access to such reports from students deciding where to enroll?
If your college is so on the brink that this report could bring it down, you should seriously think about responsibly closing your college via merger or teach-out… We think it is horrible when an airline conceals financial problems and leaves its passengers stranded. Concealing financial problems from students is several times worse.
Perhaps we can reconcile these opposing claims of help versus harm, open against discretion. Maybe a public entity, rather than a private one, could take responsibility for data gathering, analysis, and publication? There might be something of a precedent in a new Massachusetts law, which gives that state more authority to suss out campuses’ financial health. It also seems to have provisions for doing so out of the public eye, when necessary. Could state governments pick up the Edmit tool and apply it sub rosa, letting officials quietly contact colleges and universities to either help them survive or wind them down with a minimum of harm? Or a (post-Trump, post-DeVos) Department of Education could conduct such an analysis. Alternatively, this might be a function performed by non-state actors, such as accrediting agencies. They might have more flexibility, especially when it comes to public records laws. (sibyledu, a fine commentator on this blog, has some good thoughts here.)
So we have three choices:
- Continue as things are now;
- Edmit publishing their model;
- A public or private agency publishing such a model.
On a personal and professional note, I confess to reading this story with growing alarm. As a futurist, I also gather quantitative and qualitative data about higher ed, and use it to help everyone involved think more effectively about what’s next for colleges and academia. Readers know my forecasts are sometimes dark.
But that work is aimed at the entire higher education sector, or large swathes of it. No single campus has (so far!) accused me of harming its fortunes through my research. This may be due to American academics’ tendency to not think as members of an industry or sector; instead, we usually see ourselves as part of one institution (Tweet College) or a single profession (biology). Academics outside the United States may disagree with my forecasts, but none have charged me with offering harm to their universities. When I consult with individual campuses, they usually keep my work in-house. I’ve had to sign NDAs for several.
And yet this might be too rosy a view. One passage from the IHE article struck me:
“To look 10 years down the road in higher education is dangerous (what will happen with HEA, for example?),” [Pete Boyle, a spokesman for NAICU] said, referring to the long-delayed reauthorization of the federal Higher Education Act. “And to look 50 or 100 years down the road is worse.”
Dangerous. Quite a word to use here. He’s not saying such forecasting is difficult to do, as many would charge (including myself!). Boyle isn’t deeming the work merely unreliable, but actually dangerous.
I don’t know if we’ll see more of this danger charge levied in the near future. It might seem an ill-considered charge, given the demand for more information and transparency about higher ed. Forecasters also might suffer the traditional fate of not being listened to – I don’t mean ignored, but simply unread and unheard.
However, the danger charge might recur, given the increasing fragility of a big chunk of American higher ed. People in leadership or supporting positions at challenged colleges could conclude that bad press is to blame for their troubles, either honestly or because it’s much easier to dun media than to save institutions. Those of us who augur potential decline in the sector could be targeted, accused of making the situation worse. Some might oppose our efforts to increase knowledge and boost conversation, charging that they do the opposite. Books like this forthcoming one might take hits as being deleterious – not to the debate, but to higher ed’s health.
We futurists might be wise to be very, very careful.
In the meantime I will insist, despite everything, on supporting conversation. Please add your thoughts to the comment box below.