When professor Susan VanderPlas learned the Statistics Department was among those on the University of Nebraska-Lincoln’s $27.5 million chopping block, a flurry of emotions hit.
Shock. Anger. Confusion.
The professionally trained statistician set out to understand the metrics guiding the university’s decision making. She and others in the department spent days poring over the limited data available, noting where they saw potential errors.
The longer they looked, the longer the list grew — and the more frustrated VanderPlas became.
“I would fail my students for an analysis like this,” she said.
VanderPlas is among a growing chorus of university voices raising doubts about the data and metrics that UNL used to identify its proposed budget cuts — one of the largest single-year cuts in more than a decade.
There are concerns about inappropriate cross-department comparisons; missing citation and award information; misclassified faculty; and plain bad math. Adding to the frustration, they say, is a lack of transparency on the part of university leaders.
UNL administrators have steadfastly defended the methodology, a standoff that came to a head during a recent hearing on the cuts.
“Another word for metrics is statistics,” statistics professor Christopher Bilder, hired when the department was established in 2003, told members of the Academic Planning Committee. “And guess what, we’re the experts in statistics. … And unfortunately, what we have found is that what the university has done is use faulty data, done faulty analyses and reached faulty conclusions with respect to this approach.”
If the university doesn’t believe its statistics faculty, Bilder said, it should bring in other statisticians to conduct an external review.
That seems unlikely.
The university has stated it’s too late to change the metrics, given the tight timeline for the budget reduction process. And UNL leaders’ faith in the metrics and corresponding analysis remains unshaken.
“I trust the data analyst expertise that we have in the Office of Research and Innovation, as well as the data analyst expertise with Ph.D.s in data analytics and our Institutional Effectiveness and Analytics Office to know how to generate those assessments in a way that are statistically sound,” said Mark Button, executive vice chancellor.
Chancellor Rodney Bennett plans to bring UNL’s budget cuts to the Board of Regents for final approval in December.
Even if Bilder and others save their department, they say the process will leave lasting damage.
“I’m at a university where the leadership doesn’t understand the data that they’re using to make decisions,” VanderPlas said. “What kind of leadership is that? I don’t expect everybody to be a statistician, but I do expect them to think critically about what they’re doing.”
A question of transparency
The exact calculations that led Bennett to propose cutting six programs as part of a $27.5 million budget reduction haven’t been released publicly. UNL published a list of metrics that were included in the calculus, including things like awards, instructional efficiency, anticipated demand and faculty publications.
But that list doesn’t detail how the data was compiled or how calculations were conducted, a fact that statistics faculty said raises alarm.
On Sept. 2, the UNL Faculty Senate passed a resolution calling on Bennett and Button to release the complete metrics and processes. The request was denied, according to Faculty Senate President John Shrader.
The Flatwater Free Press submitted an open records request to the university on Sept. 18, seeking the complete metrics used for academic program review, any spreadsheets produced during that review and any scores released to departments as part of the budget reduction process.
Initially, the university said it would fulfill the request or provide an update by Oct. 3. It then pushed the date back to Oct. 9. That day, it extended the deadline again to Oct. 24 — the same date the Academic Planning Committee makes its budget reduction recommendations to the chancellor.
In an Oct. 13 interview, Button said it’s unlikely the university will release the full data analysis underpinning the proposed cuts.
“We’ve been very public about the way that we’ve constructed the metrics themselves and each unit has received their data analysis,” he said. “There are some proprietary considerations about how we’ve developed this. There is some information that I would say are things that we believe are best kept within the institution for our own decision-making purposes.”
Missing data
Among the metrics used by UNL to assess programs are Scholarly Research Index scores. Those scores, generated by the company Academic Analytics, are intended to measure scholarly productivity and compare departments and programs within the same field to see how they stack up.
Department scores are calculated using individual faculty scores, which consider the number of journal articles, awards, conference presentations and other metrics. The metrics are weighted differently depending on the faculty member’s discipline.
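Academic Analytics’ exact formula and weights are proprietary, so the sketch below is only an illustration of that general approach; the metric names, weights and functions are hypothetical, not the company’s.

```python
# Illustrative only: Academic Analytics' real formula and weights are proprietary.
# The metric names and per-discipline weights here are hypothetical.
DISCIPLINE_WEIGHTS = {
    "statistics": {"articles": 0.5, "citations": 0.3, "awards": 0.1, "talks": 0.1},
    "education":  {"articles": 0.4, "citations": 0.2, "awards": 0.2, "talks": 0.2},
}

def faculty_score(counts: dict[str, int], discipline: str) -> float:
    """Weighted sum of one faculty member's productivity counts."""
    weights = DISCIPLINE_WEIGHTS[discipline]
    return sum(w * counts.get(metric, 0) for metric, w in weights.items())

def department_score(all_counts: list[dict[str, int]], discipline: str) -> float:
    """Department score as the average of its faculty members' scores."""
    return sum(faculty_score(c, discipline) for c in all_counts) / len(all_counts)
```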
Universities across the country use the data to guide their decision making, but it has its limitations. After learning that UNL had used the scores, Sarah Zuckerman, a professor in the Department of Educational Administration, looked at her own profile.
She said the Academic Analytics database recorded only three conference presentations from her in the last nine years — roughly 7% of the national presentations she had actually given.
“I was like, ‘That’s called a semester for me.’ It was really shocking,” said Zuckerman, whose department is also slated for elimination.
It captured about 17% of her citations and 80% of her peer-reviewed publications, she said.
Button said department leaders are working with faculty to ensure that the Academic Analytics data is accurate.
He noted that Academic Analytics is always one year behind, as its researchers need time to develop and analyze the datasets. If a faculty member published a book six months ago, for example, it wouldn’t yet be counted. And some metrics, such as books published, are credited toward a score for longer than others, Button said.
The university is too far along in its process to make changes based on any faculty corrections to their Academic Analytics profiles for this budget cycle, Button said, but he added that UNL is committed to ensuring the data is as accurate as possible.
“I would say that first, we would want to make sure that there is something missing or inaccurate in the data … If it were determined that that was the case, then of course, we would work in partnership with that faculty leader and our Office of Research and our partners at Academic Analytics to address it,” Button said.
Faculty concern over the Academic Analytics data doesn’t end at the individual level.
Heike Hofmann, a statistics professor, said comparing scores across departments at the same university doesn’t make sense: what counts as a “good” score varies by discipline, making such comparisons effectively apples to oranges. If the scores were used as intended — comparing the statistics department to other statistics departments — UNL’s department more than holds its own, according to the department’s analysis.
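Hofmann’s point can be made concrete with a small sketch. The within-field standardization below is only an illustration, not UNL’s or Academic Analytics’ actual calculation, and the numbers are invented:

```python
# Minimal sketch with made-up numbers: a department's raw score only means
# something relative to peer departments in the SAME field.
from statistics import mean, stdev

def z_within_field(dept_score: float, peer_scores: list[float]) -> float:
    """Standard deviations above (+) or below (-) the field's own average."""
    return (dept_score - mean(peer_scores)) / stdev(peer_scores)

# A raw 1.2 looks "worse" than a raw 2.0 from another discipline, yet relative
# to its own field it is above average while the 2.0 sits below its field's norm.
print(round(z_within_field(1.2, [0.8, 0.9, 1.0, 1.1, 1.3]), 2))  # approx +0.94
print(round(z_within_field(2.0, [1.9, 2.2, 2.4, 2.6, 2.8]), 2))  # approx -1.09
```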
Statistics faculty have said the way UNL is using Scholarly Research Index scores is inappropriate and goes against the company’s own guidance.
Anthony Olejniczak, director of the Academic Analytics Research Center, said the company as a matter of policy does not comment on how any specific institution uses its data.
He provided a copy of Academic Analytics’ guidelines, which state in part that the database “should not be deployed as a punitive tool to assess faculty members nor to deprive faculty members or units resources. Academic Analytics is most effectively used in positive ways.”
In the last decade, a number of faculty organizations across the country have pushed their universities to cancel or curtail use of Academic Analytics’ databases. In 2016, Georgetown University announced it would drop its subscription after its provost said the data was inconsistent across disciplines and inadequate in capturing faculty contributions.
As concerns grew, the company itself changed its messaging, according to the Chronicle of Higher Education. The Chronicle reported the company backed away from claims that individual analyses of underperforming faculty could help universities save money, and instead pivoted to broader assessments of departments and universities.
An uncertain future
In 2021, the future of the statistics department looked bright. The university had just approved a new undergraduate program — the first for the department — and seemed well situated to build a long-term pipeline of talent for its graduate programs.
The first cohort started in 2022. But a year before their scheduled graduation date, the proposed budget cuts came to light. And the new undergraduate program, which had previously been seen as a win for the department, was instead counting against it.
That’s because the new undergraduate students were counted in UNL’s degree completion calculations before any of them had a chance to graduate, VanderPlas said. That contributed to a precipitous drop in the department’s degree completion ratio in 2024, the last year of data used in the budget reduction calculations.
“It really just looks terrible,” VanderPlas said.
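The arithmetic behind that drop is simple to illustrate. The sketch below uses made-up enrollment figures, not UNL’s, to show how a cohort of new majors who haven’t yet had time to graduate deflates a completions-per-student ratio:

```python
# Hypothetical numbers, chosen only to illustrate the mechanism VanderPlas
# describes; they are not UNL's actual enrollment or completion figures.
def completion_ratio(degrees_awarded: int, students_enrolled: int) -> float:
    return degrees_awarded / students_enrolled

# Graduate-only department: say 10 degrees awarded among 40 enrolled students.
before = completion_ratio(10, 40)        # 0.25

# After launching an undergraduate major: 60 new majors count as enrolled, but
# none can have graduated yet from a program that young.
after = completion_ratio(10, 40 + 60)    # 0.10

print(before, after)  # the ratio drops sharply even though output is unchanged
```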
Button said that UNL is continually changing degrees and programs, and factoring in those changes is part of the broader assessment of each program. He acknowledged a new degree program might score lower in terms of degrees awarded, but added that it would be more likely to score higher in other metrics tracking the growth of student credit hours and majors.
For VanderPlas, the proposed cuts cast a dark shadow over what should have been a momentous time in her career. In early September, the statistics professor received a National Science Foundation CAREER award, a prestigious five-year grant that the university has publicly celebrated even as her job hangs in the balance.
VanderPlas has made the statistics department’s analysis of the metrics available online, prompting questions from colleagues about whether she’s trying to get fired. Her response is simple.
“They can only fire me once, but I’d rather they fire me for doing reproducible statistics than to fire me because they can’t do statistics,” she said. “One of these is better than the others. But I would rather not be fired at all.”