this post was submitted on 09 Feb 2024

United States | News & Politics


A scientific paper that raised concerns about the safety of the abortion pill mifepristone was retracted by its publisher this week. The study was cited three times by a federal judge who ruled against mifepristone last spring. That case, which could limit access to mifepristone throughout the country, will soon be heard in the Supreme Court.

The now-retracted study used Medicaid claims data to track E.R. visits by patients in the month after having an abortion. The study found a much higher rate of complications than similar studies that have examined abortion safety.

Sage, the publisher of the journal, retracted the study on Monday along with two other papers, explaining in a statement that "expert reviewers found that the studies demonstrate a lack of scientific rigor that invalidates or renders unreliable the authors' conclusions."

It also noted that most of the authors on the paper worked for the Charlotte Lozier Institute, the research arm of anti-abortion lobbying group Susan B. Anthony Pro-Life America, and that one of the original peer reviewers had also worked for the Lozier Institute.

Mary Ziegler, a law professor at U.C. Davis and an expert on the legal history of abortion, says: "We've already seen, when it comes to abortion, that the court has a propensity to look at the views of experts that support the results it wants." The decision that overturned Roe v. Wade is an example, she says. "The majority [opinion] relied pretty much exclusively on scholars with some ties to pro-life activism and didn't really cite anybody else even or really even acknowledge that there was a majority scholarly position or even that there was meaningful disagreement on the subject."

[–] SatanicNotMessianic@lemmy.ml 4 points 9 months ago

Same. I’ve also had peer reviews that pointed out that I spelled Erdős’ name incorrectly as Erdos. I had another where I grew so irate over Reviewer #2’s critique of my lack of explanation that I turned a ten-page paper into a 53-pager, which was then accepted. I’ve also seen absolutely blatant inattention, and I’ve definitely been told to add coauthors because of their seniority/role or current lack of pubs.

I’m completely with you on the academic publication industry. I sympathize with younger researchers now, who are in a far more pay-to-play environment than I ever was. We’d always build publication fees into our funding because we felt obligated to make all of our work open access (being government funded, but also just morally), but we were a big-money institution that had that kind of flexibility. $10k is nothing on a $5M grant. But now there are so many journals that exist only to churn out papers for the publish-or-perish culture, and no one seems to take seriously the fact that they go unread and are just checking a box.

99% of the time, I’m sure it doesn’t matter. It’s just flotsam. But there should be a way of gauging a paper’s potential importance, both by journal ranking and maybe by topic. I’m really not going to call out some overseas researcher who is just trying to keep their job by publishing in a backwater journal, but it’s like that old saying that a lie can travel around the world while the truth is still putting on its shoes. Or that Ashkenazi story about the rabbi emptying a pillow full of feathers to illustrate how a damaging lie is impossible to recover from.