
246 points by rntn | 3 comments
N_A_T_E ◴[] No.43795965[source]
Is there any path forward to fixing the current reproducibility crisis in science? Individuals can do better, but that won't solve a problem at this scale. Could we make systemic changes to how papers are validated and approved for publication in major journals?
replies(12): >>43796160 #>>43796211 #>>43796313 #>>43796358 #>>43796415 #>>43796725 #>>43796906 #>>43796908 #>>43796955 #>>43797084 #>>43797605 #>>43797627 #
cogman10 ◴[] No.43796908[source]
Yes, but it costs money. There's no solution that wouldn't.

IMO, the best way forward would simply be to run every study twice with independent researchers (ideally teams that have no contact with each other beyond the shared protocol). That certainly doubles the cost, but it's really about the only way to catch bad actors early.

replies(1): >>43797273 #
1. JadeNB ◴[] No.43797273[source]
> Yes, but it costs money. There's no solution that wouldn't.

True, although, as you doubtless know, as with most things that cost money, the alternative costs money too (for example, in funding experiments chasing after worthless science). It's just that we tend to set aside the costs we have already priced in. So I tend to think a useful approach in settings like this is to find ways to make those costs more visible, to build the will to address them.

replies(1): >>43798359 #
2. cogman10 ◴[] No.43798359[source]
This is a flaw of capitalism.

The flaw being that cost is everything, and in particular that the initial cost matters far more than the true cost. This is why people don't install solar panels or energy-efficient appliances.

When it comes to scientific research, proposing a higher-cost study to guard against false results or data manipulation will be seen as a bug. Bad data and results that make for a flashy journal paper (room-temperature superconductivity, for example) bring in more eyeballs and prestige for the institute than a well-done study that shows negative results.

It's the same reason public/private cooperation is often a broken model for government spending. A government agency will happily pick the road builder that puts in the lowest bid, then eat the cost later when that builder inevitably needs more money because the initial bid was a fantasy.

Making costs more visible is a good goal; I just don't know how you accomplish that when surfacing those costs will be seen as a negative by anyone in charge of the budget.

> for example, in funding experiments chasing after worthless science

This is tricky. It's basically impossible to know in advance which experiments will be worthless. Further, a large portion of experiments will turn out to be worthless (like 90% of them).

An example of this is superglue. It was originally investigated as a replacement for the glass canopies on jet fighters. While researchers were running refractometry experiments on it and other compounds, the glue ruined the machine by bonding its parts together. Funnily, it was known to be highly adhesive even before the experiment, but the "maybe we can sell this as a glue" thought didn't occur to anyone until after the machine was wrecked.

A failed experiment that led to a useful product.

How does someone budget for that? How would you start to surface that sort of cost?

That's where I think the current US grant system isn't a terrible way to do things, provided more guidelines are put in place to enforce reproducibility.

replies(1): >>43798568 #
3. JadeNB ◴[] No.43798568[source]
> > for example, in funding experiments chasing after worthless science

> This is tricky. It's basically impossible to know in advance which experiments will be worthless. Further, a large portion of experiments will turn out to be worthless (like 90% of them).

I don't mean "worthless science" in the sense of "doesn't lead to a desired or exciting outcome"; such science can still be very worthwhile. I mean "worthless science" in the sense of "based on fraudulent methods." That might accidentally arrive at the right answer, but the answer, whether wrong or accidentally right, has no scientific value.