A scientific fraud. An investigation. A lab in recovery.
Science is built on trust. What happens when someone destroys it?

Daniel Heinz clicked through each folder in the file drive, searching for the answers that had evaded him and his lab mates for years.
Heinz, a graduate student in Brenda Bloodgood's lab at the University of California, San Diego (UCSD), was working on a Ph.D. project, part of which built on the work of a postdoctoral researcher who had left the lab and started his own a few years prior. The former postdoc studied how various types of electrical activity in the mouse hippocampus induce a gene called NPAS4 in different ways. One of his discoveries was that, in some situations, NPAS4 was induced in the far-reaching dendrites of neurons.
The postdoc's work resulted in a paper in Cell, landed him more than $1.4 million in grants and an assistant professor position at the University of Utah, and spawned several follow-up projects in the lab. In other words, it was a slam dunk.
But no one else in the lab, including Heinz, could replicate the NPAS4 data. Other lab members always had a technical explanation for why the replication experiments failed, so for years the problem was passed from one trainee to another.
Which explains why, on this day in early April 2023, Heinz was poking around the postdoc's raw data. What he eventually found would lead to a retraction, a resignation and a reckoning, but in the moment, Heinz says, he was not thinking about any of those possibilities. In fact, he had told no one he was doing this. He just wanted to figure out why his experiments weren't working.
To visualize the location of NPAS4, the lab used immunohistochemistry, which tags a gene product with a tailored fluorescent antibody. Any part of the cell that expresses the gene should glow. In his replication attempts, Heinz says he struggled to see any expression, and when he saw indications of it, the signal was faint and noisy. So he wanted to compare his own images to the postdoc's raw results rather than the processed images included in the 2019 Cell paper.
He clicked through each file folder until he found a batch of images that looked like they came from the appropriate imaging session, Heinz recalls. Then he sifted through them, trying to find one that resembled the images in the published paper.
Eventually, Heinz says, he recognized a dendrite section that looked like the mirror image of a dendrite from one of the figures. In the paper figure, the image illustrated that NPAS4 appeared only in the dendrites of some neurons. In the raw image, however, it seemed the signal was not restricted to the dendrites but instead filled entire cells.
Heinz immediately knew something was wrong, he says. The raw image looked more like a section of tissue from a mouse engineered to express green fluorescent protein (GFP) in a subset of neurons. Immunohistochemistry is much messier. Antibodies are notoriously dirty and bind to more than what they are designed to target. There is often background fluorescence that makes it harder to pull out a signal from the noise. But there was almost no noise in this image.
Heinz says he suspected that the postdoc had used the GFP fluorescence in the figure but called it the immunohistochemistry data. If his suspicions were correct, it meant the postdoc's data did not support his story that NPAS4 was induced in the dendrites. It meant the lab had been heading down a dead-end path. It meant the postdoc had faked data.
In recent decades, scientific misconduct (formally defined as the falsification, fabrication or plagiarism of data) has lurched into the spotlight. Investigations have uncovered fraudulent data at the foundation of a prominent Alzheimer's disease theory, toppled presidencies at elite universities and shuttered entire families of journals. Fake studies sully both the scientific record and the public's opinion of science, and they waste time and tax dollars.
The exact prevalence is unknown (it is difficult to conduct a census because cases surface only when the perpetrator is caught, says Lex Bouter, professor emeritus of methodology and integrity at Amsterdam University Medical Center and Vrije Universiteit Amsterdam), but in surveys, about 4 percent of researchers admit to ever having falsified or fabricated data. "Usually, things go right," Bouter says. It can be hard to fathom when things go wrong.
This is in part because trust and science are intimately intertwined, making scientific misconduct "very much akin to acts of perversion," argued philosopher of science Michael Ruse in a 2005 essay. For most scientists, "to fake your results is just not understandable." Plus, a romantic mythology often surrounds scientists. They are portrayed as "more than human, being like gods in their creativity, and also as less than human, being deprived in their work of the passions, attitudes and social ties given to ordinary men," wrote Robert K. Merton, the founder of the sociology of science, in a 1970 essay. As a result, he wrote, the public idealizes and idolizes them.

Framed in this way, it can seem implausible for scientists to ever fake data. But the scientific world is not powered by curiosity alone: It also runs on a credit system, Merton argued. The scientists who create new knowledge are rewarded with recognition. Jobs, funding, and sometimes awards and fame, follow. Under the credit system, misconduct starts to make more sense.
And when misconduct does occur, it creates a fallout zone in the lab. Certainly it did for Bloodgood's group. That's because misconduct is not just a scientific betrayal; it's a personal one as well, says C.K. Gunsalus, director of the National Center for Principled Leadership and Research Ethics at the University of Illinois Urbana-Champaign. "It's very hard for a lab to recover."
This article is about that recovery, and what happens to the people left behind. So we won't name the person who committed fraud in the Bloodgood Lab, or in any others. (The postdoc did not respond to requests to comment for this story.) Fraud happens every year, in labs all over the world. This story could be about anyone.
The postdoc whose work Heinz called into question joined Bloodgood's lab in January 2015, shortly after finishing his Ph.D. During her own postdoctoral work, Bloodgood, associate professor of neurobiology, worked in a lab that was investigating activity-regulated genes, with a focus on immediate early genes, which are induced in cells right after the arrival of an outside signal. And they studied an immediate early gene that was specific to neurons: NPAS4. In some experiments, Bloodgood noticed whiffs of NPAS4 in the neuropil of the hippocampus.

The postdoc's first project in Bloodgood's lab was to explore this phenomenon. He claimed he had found that one kind of electrical stimulation induces NPAS4 in the cell body, another kind induces it in the dendrites, and that the different flavors of NPAS4 interact with DNA in different ways. That is the story he told in the Cell paper, the product of years of work.
These were "really exciting results," says Pei-Ann Lin Acosta, a graduate student in Bloodgood's lab at the time, who is now a management consultant. Acosta was working on a similar project, but she used optogenetic stimulation instead of electrophysiology. Yet despite the overlap between her work and the postdoc's, Acosta says, she never managed to replicate his results.
The group investigated several potential causes for the failed replications. First, Bloodgood says she chalked it up to Acosta's inexperience (she was a new graduate student, after all). Then, the team ran out of the initial supply of antibody they used to tag NPAS4, and they struggled to find an effective replacement. Eventually, Bloodgood suggested the postdoc and Acosta work side by side at the lab bench so they could figure out what was going wrong, Acosta recalls. He was "marginally helpful," Acosta says, but they never discovered the source of the problem. She felt so frustrated that sometimes she cried, and eventually she switched to another lab.
Something similar happened to Andre DeSouza. He transferred into Bloodgood's lab in the third year of his Ph.D. His project also built on the postdoc's work. The postdoc had compared three amounts of electrical stimulation: 0, 0.1 and 100 hertz; DeSouza says he wanted to test smaller increments of stimulation to find the threshold that would trigger NPAS4 expression.
Like Acosta, his first step was replicating part of the postdoc's work. And as with Acosta, it never happened, he says. After a few years of failed replications and dead-end troubleshooting, compounded by some personal issues, DeSouza dropped out, leaving the Ph.D. program with just his master's degree. "It sucks to feel like, 'Oh, I was not a good scientist,' and then realize, like, 'Oh, I was trying to do something that was just never really going to work,'" DeSouza says.
Once Heinz had found the smoking gun in the postdoc's raw data, it took him a couple of weeks to make "damn sure that I was right," he says.

First, he needed to work through the logic of what he saw and what it meant, and "put it outside of my brain." He started a document and spelled out each issue he found, attached screenshots, recorded file names and walked through what evidence would refute or support his hypothesis.
"Intuitively, perhaps, I didn't have any doubt. But that's not enough," Heinz says. "I needed to be able to convince the very critical part of myself that there was no chance that what I was finding was not real."
He scheduled a meeting with Bloodgood on 13 April to share what he had uncovered, as much as that scared him. "What I was terrified of was the monumental nature of the accusation, in that I was afraid that I would be right, that it would be true, and that many people's lives and careers would be ruined," Heinz says. "I was just really feeling the horror, the horror of the consequences of what I'd found."
Heinz was also tormented personally. The postdoc was his close friend, he says, and he knew this revelation could destroy his career.
At the start of the meeting, Heinz got right to it and told Bloodgood he had found "a really big problem" with the postdoc's paper.
"Oh no," Bloodgood remembers saying as a feeling of heaviness sank in.
Heinz says he walked Bloodgood through his findings, telling her it appeared the postdoc had intentionally falsified the image. Bloodgood "didn't disagree" with Heinz's findings, she says, but she wanted to give the postdoc a chance to explain.
In the back of his mind, Heinz had been hoping that Bloodgood would "point out the obvious stupidity in what I was saying." The fact that this didn't happen shook him, he remembers, but as he was leaving the office, Bloodgood called out to him. He stopped and turned around. "Danny, it might not feel like this now," he remembers her saying, "but someday, looking back, you'll be glad you did this."
The next week, Bloodgood spoke with the postdoc on Zoom. She says she showed him slides with the raw images Heinz had found and the figure from the paper, and she asked for an explanation. The postdoc said it must be a mistake, Bloodgood recalls, though she thought he sounded nervous. "This nervousness is either because he feels put on the spot," Bloodgood remembers thinking, or because "he feels like he's been caught."
The postdoc emailed Bloodgood two days later. In that email, Bloodgood says he admitted he had manipulated images in one figure, but he stood by the findings the images represented. And he offered up an excuse: He wrote that he had felt pressure to produce a beautiful paper. But Bloodgood didn't trust him anymore, she says. She asked him to send a spreadsheet detailing the name and location of every image file that had gone into making the figure and step-by-step instructions on how he had analyzed them. He complied, and Heinz got to work.
On 4 May, Bloodgood and her lab met for their weekly lab meeting. Normally, someone presented data from the experiments they had been working on. But that day, Bloodgood broke the news of the manipulated images instead.
When Bloodgood finished speaking, the room fell silent, says Chiaki Santiago, a current graduate student in the lab. Santiago says she sensed both sadness and shock in the silence, but also an odd sense of closure. The dendritic NPAS4 antibody experiments had been a "trap" for years, she says, and now they finally had an answer. The group wasn't incompetent; they had been chasing a false signal. Knowing that felt at least like "a path forward to truth."
Before the meeting disbanded, Bloodgood gave everyone a chance to ask questions and share reactions. More than one person expressed concern for the postdoc, several people present at the meeting recall: The trainees understood the gravity of what the postdoc had done, and that the consequences could "devastate a person," says Anja Payne, a graduate student in the lab at the time. How was he doing? they wondered out loud. Was he suicidal, and did he need an emergency intervention?
Bloodgood, hearing this collective goodwill, felt a "huge warmth to the people in my lab," she says.
Then, Santiago says, the lab trainees went out to lunch and took a walk on the beach together. "That was, like, perfect," she adds. "It was very soothing and calming and a great reminder that this isn't the end-all be-all; we're going to figure out how to fix this, and we're going to figure out ways to work through this together."
Weathering someone else's scientific misconduct can become a defining moment of a career, if not the defining one. For Kate Laskowski, it shaped the way she runs her lab.

Toward the end of 2019, Laskowski, assistant professor of evolution and ecology at the University of California, Davis, had just opened her lab when she discovered that three papers she had published in collaboration with a prominent spider biologist contained falsified data: The biologist had collected the data, and Laskowski had analyzed it. In the end, she retracted the papers and published a blog post detailing everything that had happened.
The experience did not sour Laskowski's feelings about science, but it did shape her lab in "profound ways," she says. She tells her students, "We live in a glass house; everything we do is going to be public," she says. "I never want to relive this. And I know that the only reason I survived is because I was so transparent and open." For example, her lab manual is available on her lab website and outlines detailed expectations for lab notebooks, data storage and analysis, and file organization. The top of the manual states the key mantras: "Don't be a jerk" and "Don't fabricate/fudge/alter data."
A close brush with a colleague's misconduct left Edward Ester with a lingering worry. In 2015, a few years after he started a postdoc, he says his former Ph.D. adviser told him a graduate student in the lab (and Ester's close friend) had been accused of fraud in several of his papers. Ester took a closer look at some of the work he had done with the student, found evidence of data falsification in two papers and retracted them, he says.
Today Ester is assistant professor of psychology at the University of Nevada, Reno. When he first opened his lab there, he says he was "very paranoid" about his trainees making mistakes and spent a lot of time doing data analysis that he should have assigned to a student. Ester's lab has transparency policies that are similar to Laskowski's, and even now he finds himself "perhaps more of a helicopter [adviser] than I need to be in some instances."
The experience also instilled in him a cynicism about the incentive structure in science, he says. When a scientist's worth is measured by their h-index and grant dollars, that can "encourage fraud that might not otherwise occur. Because for some people, I think it's just out of desperation. Or, for some people, it's a desire to be the best, but they want to get there too fast, or they don't care how they get there," he says. "If you create perverse incentives, you're going to create perverse behaviors."
But Ester doesn't let this awareness ruin his daily experience as a scientist, he says. He keeps his attention focused on his own work and his own lab, which is the only thing he can control. The structural flaws necessitate "a lot of vigilance" from individual researchers to ensure fraud doesn't occur.
These consequences are amplified when the person faking data is a principal investigator. In 2005, a group of molecular biology graduate students at the University of Wisconsin-Madison discovered their PI had faked data in several grant applications. After months of deliberation, they turned her in, and the lab was shut down. Three of the students left with their master's degrees. Three others switched to new labs to finish their Ph.D.s, including Mary Ann Allen, who says she initially wanted to leave research and get a computer science degree instead, because she couldn't imagine trusting another stranger to be her adviser.
She stayed in biology only because a friend recommended his former adviser at the University of Colorado Boulder, she says. Allen moved to the new lab and is now a research associate professor at the university's BioFrontiers Institute. She migrated from molecular biology into computational biology (in part, she says, because of the field's data- and code-sharing norms), teaches responsible conduct of research courses to trainees and upholds transparency policies in her own lab.
Her trust in other scientists still ebbs and flows. "I was under the impression nobody committed misconduct. And then you go through this situation, and you start to wonder if everybody does," Allen says.
As Heinz worked through the reanalysis of the postdoc's paper, it became clear that hundreds of images were not accounted for in the spreadsheet the postdoc had sent, Bloodgood says. She asked the postdoc to send the missing images, and a few weeks later, on 9 June, he did.

Yet when Heinz looked at the images' metadata (the immutable bits of information marking when an image was taken, and on what microscope), he discovered that what Bloodgood describes as "an overwhelming majority" had been taken within the past few weeks. The postdoc, it seemed, had faked more data to cover his tracks. This was awful news, Bloodgood says, but it carried an "echo of relief," because it meant the group could stop investigating. Nothing could be explained away, and it left Bloodgood with only one choice, she says.
Bloodgood called an emergency meeting on Thursday, 15 June. Santiago, Heinz, Payne and the other trainees piled into Bloodgood's office around her computer, and she broke the news. The postdoc could not be relied on to help correct the paper, she told them; it had to be retracted. Also, Bloodgood said she had emailed the postdoc and told him that on the next Tuesday she would notify the National Institutes of Health (NIH), Cell, her department chair and his department chair about what he had done. If he wanted to be the one to tell his chair, he would need to do so before then.
For Heinz, the discovery of the second fraud shifted him from "feeling guilt to feeling anger," he says. If you give someone an opportunity to fix a mistake, and "they try to take advantage of you, that's a different level of betrayal."
Bloodgood was angry, too, she says. "It didn't have to be this way," she remembers thinking. "There are so many interesting things to discover in biology. You don't have to make things up."
On 15 June 2023, the postdoc confessed to the University of Utah, according to an "admission of research misconduct" statement and the university's misconduct report, which The Transmitter obtained through a public records request. He admitted to manipulating images of NPAS4 using Photoshop, and he admitted to fabricating data in a set of genetic knockout experiments he never performed. He also admitted to incorporating fabricated data throughout the paper to increase the sample size of different experiments. He then used the fraudulent data in several NIH grant applications that led to more than $1.4 million in funding, and in his job application talk that landed him an assistant professor position. Finally, he admitted to sending Bloodgood images that he took after the paper was published "in an initial attempt to conceal my misconduct," he wrote. "In truth, this misrepresentation was a falsification of the research record."
The University of Utah and UCSD conducted separate investigations, and both found that the postdoc had committed research misconduct, as did the U.S. Office of Research Integrity (ORI). The postdoc resigned from his position and entered a voluntary settlement agreement with the ORI: He agreed to be supervised by two to three senior faculty members for the next five years when conducting federally funded research.
On 12 June 2024, Bloodgood and the postdoc's other co-authors retracted the 2019 paper from Cell, after UCSD concluded its investigation. "We do not stand by the conclusions drawn in this paper and are retracting it," the retraction notice states. "We apologize to the scientific community for any loss of time, resources, and/or morale caused by this publication."
When people trust someone, they make themselves vulnerable to being hurt, says Karen Frost-Arnold, professor of philosophy at Hobart and William Smith Colleges. Philosophers describe the feeling that comes from someone taking advantage of that vulnerability as "disrespected in your personhood," Frost-Arnold says. "It can feel very dehumanizing."

One component of the healing process involves trying to understand why the betrayal happened, Frost-Arnold says. The betrayed look at themselves and wonder why they trusted the wrong person. They look at the betrayer and try to decipher their motivations. If the motivations are unclear, they may chalk it up to a random act of cruelty from a bad person. And lastly, they look at structures and institutions: What is it about science in general, or their lab specifically, that allowed this to happen?

In the 18 months since the postdoc's fabrication came to light, members of Bloodgood's lab have wrestled through what happened to them and what it means for the rest of their careers. Heinz can't write off the whole episode as a bad person being bad, because he knew the postdoc to be good, thoughtful and caring, he says. "This is not just, 'There are some bad apples.' It's this specific person who I couldn't have believed could have done this, did this. And it forces a different interrogation of the causality."
Instead, Heinz has taken a close look at the scientific institution. He views its incentive structure as a "moral hazard" for scientists, he says, because it tells them that the only way to advance is to take big swings, but most big swings are misses, not home runs. So some may feel compelled to cut corners or fudge results to propel their status and career. Heinz sees a system in which labs push toward "personal brand goals" instead of biological truths. He loves being a scientist, he says, but he's not sure if he can protect himself from the dissonance that comes from existing within the scientific world. Heinz hopes to defend his Ph.D. at the end of the year, and he doesn't know yet what he'll do next. He says he wants to do something that feels like a valuable contribution to society: maybe that's in science; maybe it isn't. But "I almost certainly would have been doing a postdoc if all of this hadn't happened."
Bloodgood is still working through what lessons might be generalizable, she says. She doesn't want to treat all of her trainees like they may be faking data, because "it would be terribly unfair to them, and it would be an awful way for me to go through my life." Still, her baseline trust in other scientists has dropped, she says, and she finds herself less confident in someone's results when she perceives them to be an ambitious person.
The fraud in her lab has "extinguished a spark that I had for science," Bloodgood says, and reigniting it has been "elusive."
Santiago says she has also lost that spark. Last summer, when all of this took place, she had an internship at Neurocrine Biosciences. She noticed that when the Neurocrine scientists tried to replicate a finding reported in the academic literature, it often failed. This observation, combined with the fraud unfolding in her home lab, caused Santiago to lose faith in academia. Industry incentivizes rigor, she says, because drugs are tested in humans, and vast amounts of money are at stake. But in academia, she sees researchers wed themselves to exciting stories that might not be true.
After the internship, while attending a seminar at UCSD by a visiting professor, she remembers thinking, "I don't know if that's real." Later that semester, she read a grant she had written for an assignment a few years prior. Her writing carried both excitement and pride as she described the lab's work and the experiments she planned to do. She had even used an exclamation point. As she reread her old writing, Santiago says she realized that enthusiastic version of herself was gone, and the realization made her cry.
Yet she hasn't lost all hope. After the second meeting with the lab, that terrible one in which the members learned that the postdoc had continued to fake data, Santiago left campus and drove north on Interstate 5 to get back to her internship. She had 30 minutes until her next meeting, so she exited the highway and made a detour to Bird Rock Coffee, a coffee shop across the street from where the marshland meets the beach. Santiago remembers she sat on a stool on the patio, drank her coffee, stared at a great blue heron that was hanging out in the marsh and wondered absently about how much energy it took for the bird to stand on one leg. And this moment, she says, somehow reassured her that everything was going to be alright.
For Payne, it took months to fully process the postdoc's fraud, she says. The events unfolded not long before she was to defend her thesis and move across the country to Virginia, to start a postdoc at the Janelia Research Campus. She remembers sitting in Bloodgood's office, thinking, "I can't internalize this right now." Her main thought was to defend, graduate and "get out, get out, get out," she says.
At first, Payne says, she had felt compassion for the postdoc. "Truthfully, my reaction to it was a little bit this sense of like, 'There but for the grace of God go I,'" she says. But in Virginia, when it was over and she had physically left it behind, she had more time to think. Then she started to feel angry. By October, she felt afraid; she worried that there was "something missing about my understanding of science," she says. She had once felt that all scientists had "the same goals," but after the fraud she doubted that.
Payne says she realized that she was grieving, something she had experienced when her brother died during her first quarter of graduate school. In the aftermath of that loss, she joined a graduate student grief group at UCSD to help herself cope. Eventually, she began to facilitate the group alongside a counselor.
While leading that group, she says she learned that working through grief is not about fully healing; the loss of her brother might forever feel raw. Instead, what she needed was to find a way to tolerate the wound in her life. "You truly do just have to get to a place of acceptance or go crazy. There's not an in-between," she says.
Payne says she doesn't expect to understand why the postdoc did what he did, and she also does not expect that her attitudes about science will look the same as they did before. She considers her recovery from the fraud to be an "upward spiral" that will vary each day and won't be linear.
Now, she says, she sees that the only thing she can control is the rigor of her own work. There is no way to prevent fraud. This realization is still painful at times, but she has accepted it. "It is just, unfortunately, a feature of humanity that we have to contend with," she says. You have to "look the beast in its ugly face."
If you or someone you know is having suicidal thoughts, help is available. Here is a worldwide directory of resources and hotlines that you can call for support.