When Khalid Sayood and one of his students at the University of Nebraska-Lincoln set out to study concussions, they had an obvious group of people to study: the Cornhuskers football team.
To make things even simpler, a colleague who had already collected data on some football players said they could use it for their new study.
In the data – measurements of the players’ brain activity taken at the beginning of the season, and then again after some had sustained concussions – Sayood and his student hoped to find signals that would make diagnosing the brain injuries faster and cheaper.
In a paper they published about their research, they say they succeeded, finding a way to identify the Husker football players who had concussions with over 99% accuracy.
That early success opened up tantalizing possibilities, suggesting a potential path to more effective diagnosis of head injuries, both for Cornhuskers and for athletes across the country.
More effective diagnosis is a big deal, because each year, there are as many as 3 million concussions from sports and recreation in the U.S., according to the University of Pittsburgh Medical Center. Roughly 300,000 come from football alone.
On average, about 13,600 traumatic brain injuries happened per year in Nebraska from 2014 to 2018, according to a report from the Nebraska Department of Health and Human Services, which under state law must collect information on the injuries. Responding quickly and properly to concussions – brain injuries that keep the brain from working normally – is key to preventing the injury from getting worse, according to the health department.
With much more work, Sayood says, other scientists could build on their study to develop a new medical device that would be useful in settings with fewer resources than Division I college football.
But after they published their paper, the university’s research ethics board spotted problems with it: The researchers didn’t have the proper permissions to use the football players’ data, and the board said they published information that could be used to find out which players participated. The ensuing investigation and its consequences – including the retraction of the paper – mean the once-promising research may have hit a dead end for now.
“I find the actions of the [board] to be very objectionable, very harmful both to the university and the research enterprise,” Sayood said. “It’s been a horrible experience.”
The research project was born of the idea that the human brain is too complex – and each person’s brain too unique – to be able to find signs of trauma like a concussion by comparing the brain activity of someone who just had a head injury with the brain activity of someone who hasn’t.
Any differences scientists saw could be because people’s brains work differently, not because of a concussion. So Sayood, an electrical engineering professor at UNL, and his student wanted to look for differences in the same person’s results before and after they had a concussion, to be more confident that changes were due to trauma.
They used electroencephalogram (EEG) readings, which measure the brain’s electrical activity.
The players who participated likely would have worn caps fitted with electrodes that recorded how their brains were working while they completed a simple memory exercise.
The scientists compared EEG readings of football players taken before the beginning of the season with readings taken after some of those players had concussions. Additional EEGs of the players who didn’t have concussions served as another comparison.
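The logic of a within-subject comparison can be sketched in a few lines of code. The snippet below is a toy illustration on synthetic signals, not the authors' actual algorithm, which is not detailed here: it flags a change when a player's follow-up EEG spectrum drifts far from that same player's baseline, rather than comparing one person's brain activity against another's.

```python
import numpy as np

rng = np.random.default_rng(0)

def band_power(signal: np.ndarray) -> np.ndarray:
    """Crude spectral summary: magnitudes of the first 16 FFT bins."""
    return np.abs(np.fft.rfft(signal))[:16]

def change_score(baseline: np.ndarray, followup: np.ndarray) -> float:
    """Relative distance between a player's own before/after spectra."""
    b, f = band_power(baseline), band_power(followup)
    return float(np.linalg.norm(b - f) / np.linalg.norm(b))

# Synthetic stand-ins for EEG traces from one player.
baseline = rng.standard_normal(1024)
unchanged = baseline + 0.1 * rng.standard_normal(1024)  # same activity, small noise
shifted = 0.5 * baseline + rng.standard_normal(1024)    # substantially altered activity

print(change_score(baseline, unchanged))  # small: consistent with baseline
print(change_score(baseline, shifted))    # large: flagged as changed
```

Because each player serves as his own control, differences between individuals' brains drop out of the comparison, which is the advantage the researchers were after.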
Sayood and his student used EEG data from eight football players that other UNL scientists had collected for earlier, previously published research. They published their results in December 2021 in the journal “Neurotrauma Reports.”
The method that Sayood and his student developed identified the players who had recently had concussions with 99.5% accuracy, they wrote in their paper. But because they only had data from such a small number of players, confirming whether their method would be helpful in the real world would take more work and more study participants.
Because there’s not currently an objective, clear way to diagnose when a person has had a concussion, a lot of researchers are working on finding more definitive tests, including by using EEG, said Jeffrey Tenney, a pediatric neurologist at Cincinnati Children’s Hospital in Ohio. When he led a group of colleagues who assessed the current scientific research on the approach, however, they found that there wasn’t yet enough evidence that EEG was useful for diagnosing concussions.
“It’s quite possible someone might eventually find there is a way to use this,” Tenney said, and he can see the appeal. But a high-quality study to establish that hasn’t been done yet. Researchers would need to study hundreds of people in a well-controlled environment, and take into account any medications or other brain disorders the participants might have.
“It’s not going to win us any Nobel Prizes,” Sayood said of his Nebraska-based research. But, he said, the method of comparing the EEGs of participants before and after they had a concussion to find the signs of trauma, rather than comparing a group of people without concussions to people with concussions, was original. It’s also relatively cheap.
But the data Sayood and his student used for their research had a problem he didn’t know about at the time.
According to typical rules for research involving people, scientists must ask an ethics watchdog called an Institutional Review Board (IRB) to review their research plans and get the board’s approval before starting the work.
One of the most important ethical principles IRBs consider is informed consent for the participants, said Abbey Lowe, a bioethicist at the University of Nebraska Medical Center in Omaha. “People should get the chance to understand what you’re asking of them, then say yes or no to it.”
Consent is “a huge part of what makes human research ethical,” Lowe said. If researchers collect data for a study, but then decide they want to do different research with it, the participants’ consent for the original study doesn’t automatically carry over to the new one. “If the study changes, it’s likely that there are things that need to happen so that participants can say yes or no.”
The colleagues who collected the data and shared it with Sayood said that it had been deidentified – stripped of all information that could identify the players who participated – and that it could therefore be used without going through the IRB first.
But the data files weren’t deidentified, as Sayood would later discover: They included the players’ birthdates, which could be traced back to them.
The university IRB would come to a very different conclusion about whether the researchers could use the data without approval.
Soon after the paper was published, UNL received a complaint questioning the researchers’ access to the data they used, said Dan Hoyt, a research integrity officer for the university.
Research compliance staff began looking into the dataset and saw that while it had been originally collected as part of an IRB-approved study, the new paper described work that had not been part of the sanctioned plan. They contacted Sayood with questions.
When Sayood sent the dataset to the staffers investigating the matter, he discovered that it contained the players’ birthdates, and so was not deidentified.
“I went back and looked and said, ‘Uh oh.’”
In addition, the paper contained a table with information on the eight football players whose data was used in the study, including each player’s age on the day he was scanned, given to two decimal places. The IRB said this could be used to identify the players in combination with other publicly available information, such as team rosters.
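Why an age given to two decimal places is so identifying comes down to quick arithmetic: two decimal places of a year is a window of under four days, so anyone who could guess roughly when a player was scanned could narrow his birthdate to a handful of dates and check it against public records. The age and scan date below are invented for illustration; they are not taken from the paper.

```python
from datetime import date, timedelta

YEAR_DAYS = 365.25  # average days per year

def birthdate_window(scan_date: date, age_years: float, precision: float = 0.005):
    """Range of birthdates consistent with an age reported to two decimal places.

    An age like 19.37 really means 19.37 +/- 0.005 years.
    """
    earliest = scan_date - timedelta(days=(age_years + precision) * YEAR_DAYS)
    latest = scan_date - timedelta(days=(age_years - precision) * YEAR_DAYS)
    return earliest, latest

# Hypothetical player: age 19.37 on a guessed scan date.
lo, hi = birthdate_window(date(2017, 8, 1), 19.37)
print((hi - lo).days)  # prints 3 -- the birthdate is pinned to a few days
```

Combined with a roster listing names and birthdates, a window that small would single out one player, which is the kind of linkage the board was worried about.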
The full ethics board met to assess the study, and determined that the issues amounted to “serious noncompliance,” Hoyt said. In March of 2022, the IRB wrote to Neurotrauma Reports, requesting the journal retract the paper.
Sayood appealed the decision, but when the IRB met to re-review the case, they came to the same conclusion. The journal retracted the paper last April. It was one of nearly 5,000 retractions issued in 2022.
As is common practice, the paper remains online, but with a heading and watermarks declaring it “Retracted,” and a notice that explains the reason for the retraction. At the IRB’s request, the journal also redacted data from the table with information about the players who participated.
The thing that frustrated Sayood the most, he said, is that he’d hoped other researchers would pick up on the method to detect concussions and, after confirming it worked, build simple, cheap devices that could be used, say, on the sidelines of a high school football game. But that doesn’t seem likely anymore.
“Now they’ve got this stupid ‘retracted’ thing written all over it, so I’m sure anybody who’s actually going to do something will ignore this paper,” he said.
Indeed, a doctor who contacted Sayood because he wanted to use the work “ghosted” the researcher after the IRB got involved, he said.
Besides the retraction, the university hasn’t imposed any additional consequences on the researchers. But the retraction makes it difficult for Sayood or his student to advance the research themselves. The IRB ordered Sayood to destroy the data on the players, which he did, so they can’t continue the work or try to repeat it.
Even if they could repeat the study, Sayood is not sure another journal would publish it, because the results are already online, in the retracted paper, and so wouldn’t be novel. Yet they can’t use a retracted paper as the basis for new work, he said. “We are trapped.”
“We did do something wrong,” Sayood said of using the dataset with the birthdates. But he also said that the identifying information wasn’t a factor in their research, it “just sat there,” until the compliance office asked him for the dataset.
Sayood doesn’t see why the paper had to be retracted, unless as a punishment for not following regulations. The published table couldn’t be used to identify the participants, he said, because the dates on which the players were scanned were not published, so it wouldn’t be possible to count back from those dates to find the players’ birthdates.
Hoyt said that even without the information published in the table, the board was “still very likely” to have ordered the paper’s retraction, because the data used for the study had not been deidentified, and because the IRB had not approved the work beforehand. IRBs at other institutions, including Harvard, have requested retractions for similar reasons.
“It’s not something that’s a routine practice, but it does happen,” Hoyt said of making requests to retract papers. “Nobody takes the notion of retracting an article lightly.”