
253 points lnyan | 3 comments
joshuamcginnis[dead post] ◴[] No.41870262[source]
[flagged]
bayindirh ◴[] No.41870334[source]
When asking these kinds of questions, I always remind myself of "The Usefulness of Useless Knowledge" [0].

On the other hand, I believe that researching how animals think, behave and "work" in general is a very important part of being human. They're alive too, and they keep defying the prejudices we hold about them. We need to revise a great deal of what we know about animals, and about other living things in general.

[0]: https://www.ias.edu/sites/default/files/library/UsefulnessHa...

replies(1): >>41870398 #
joshuamcginnis ◴[] No.41870398[source]
So what exactly are your criteria for when a study should or should not be publicly funded?
replies(3): >>41870448 #>>41870504 #>>41870599 #
bayindirh ◴[] No.41870448[source]
Good question.

I think if there's already a large corpus of research supporting a hypothesis, any study that merely retests it without adding anything significant can be disqualified from funding. If you challenge the hypothesis, or add something significant to its dark areas, you could be funded.

Moreover, if your research fails to prove that hypothesis, or proves the exact opposite, that should also be printed/published somewhere, because failing is equally important in science.

In short, tell us something we don't know in a provable way. That's it. This is what science is.

This is what I think about your question with my Sysadmin/Researcher/Ph.D. hats combined.

replies(1): >>41870503 #
joshuamcginnis ◴[] No.41870503[source]
Thanks for your kind response! Are you familiar with the Replication Crisis? What happens when most of the "hypotheses" being challenged can't be properly replicated in the first place?

And what happens when the primary means of funding is attached to the volume of papers rather than their quality or impact, as I believe is generally the case here in the US?

https://en.wikipedia.org/wiki/Replication_crisis

replies(1): >>41870579 #
1. bayindirh ◴[] No.41870579[source]
Hey, no problem. Yes, I'm familiar with it, and I work in/with projects which aim to create reproducible research (Galaxy, Zenodo, etc.). If you tell me "I can make this unreproducible paper reproducible, with a different process (or the same one), and share the whole pipeline from dust to result", I'll tell you to go for it, and fund you.

In the end, if something is not reproducible and you're testing the reproducibility of that thing, you're illuminating a dark area of that hypothesis.
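
To make "sharing the whole pipeline from dust to result" a bit more concrete, here is a minimal sketch in plain Python; the file names and the provenance record format are assumptions for illustration, not anything Galaxy or Zenodo prescribes:

    import hashlib, json, platform, random, sys

    def sha256(path):
        # Hash the input file so readers can verify they start from the same data.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def run_pipeline(input_path, seed=42):
        # Fix the random seed so the analysis is deterministic.
        random.seed(seed)
        # ... the actual analysis would go here; we fake a single number ...
        result = {"estimate": random.random()}
        # Record everything needed to rerun and compare the computation.
        provenance = {
            "input_sha256": sha256(input_path),
            "seed": seed,
            "python": sys.version,
            "platform": platform.platform(),
        }
        with open("result.json", "w") as f:
            json.dump({"result": result, "provenance": provenance}, f, indent=2)

    if __name__ == "__main__":
        run_pipeline(sys.argv[1])

The point is simply that anyone with the same input file, seed and recorded environment can rerun the analysis and check whether they get the same result.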

Measuring the quality of research and its impact is not something I'm very familiar with, to be honest, and I'm not from the US, so I can't tell how universities push their people. However, publish or perish is a real problem everywhere in the world.

We used to treat citation counts as important, then citation rings cropped up. We valued paper counts, then professors started lending their names to papers in their areas in exchange for "free" advisory roles. Now we have more complex algorithms/methods, but I'm more of a research institute person than an academic these days, and I don't know how effective these things are anymore.
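
For what it's worth, even the simpler of these metrics are easy to state. A small sketch of the h-index, assuming you just have a list of citation counts per paper:

    def h_index(citations):
        # Largest h such that at least h papers have h or more citations.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
        return h

    print(h_index([10, 8, 5, 4, 3]))  # prints 4

The gaming problems described above apply to a single number like this just as much as to raw citation or paper counts.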

But hey, I do research for fun and write papers now and then, just to keep myself entertained and to give myself reasons to learn something new.

replies(1): >>41870614 #
2. joshuamcginnis ◴[] No.41870614[source]
Fair enough, and all great points. I think we're more aligned than not on the fundamentals here. Folks seem to be reacting negatively to my even raising these questions, without my having made any judgment on the merit of the study myself.
replies(1): >>41870774 #
3. bayindirh ◴[] No.41870774[source]
Yes, we agree on the fundamentals. The reality is that the dynamics of academia are very different from those of the private sector, especially startups. So how research works in academia is a bit of an unknown for people who aren't interested in this line of work, or who don't know how these things are done in general.

In short, the value proposition of a piece of research looks very different depending on the lens you're looking at it through.