Author: Étienne Fortier-Dubois
Interesting position. A counterargument would be the phenomenon of reputation traps in scientific research. A good example is the Fleischmann and Pons cold fusion announcement. After the initial media frenzy and the rush to replicate the results, there was an equally frenzied denunciation when the first few groups failed to replicate easily. By the time a few groups did report anomalies consistent with replication, the news cycle had moved on. And now, if a mainstream researcher decides to study "cold fusion" (or even the rebranded "low-energy lattice-confined nuclear reactions"), the knee-jerk response is that the person is a crank and that it was all "debunked" decades ago. Meanwhile the anomalous observations continue to accumulate in the margins, but the phenomenon gets only a tiny fraction of the energy it deserves (especially compared to the huge hot fusion projects, which are going nowhere fast).
A positive example of the phenomenon is Mendel's data on genetic inheritance. Subsequent statistical analysis of his data showed it was too perfect to be real; he most likely "hand-polished" the numbers in his notes. At the time, this may have helped convince other people that the theory was real, though if it had been discovered that he was cherry-picking data, who knows what would have happened. The same is true of Millikan's measurement of the charge of the electron with his very fiddly oil-drop apparatus, which took years to refine (IIRC: I know at least one of these seminal particle-property experiments involved the experimenter hand-picking the runs that showed the "correct" result and discarding the rest).
"It is sometimes said that great innovators — startup founders, genius artists etc. — have to be somewhat self-delusional. Not too much, or they’ll just waste their time on impossible dreams. But they have to somehow believe in their own crazy idea to a degree that’s perhaps just slightly unreasonable. Normal people, i.e. you and me, would easily give up, because we’re not self-delusional. But then that’s why we’re not great innovators."
What a coincidence: YouTube just served me a video of Jensen Huang (co-founder and CEO of Nvidia) saying basically the exact same thing: https://www.youtube.com/watch?v=wH4cv1e1MvU (Jensen Huang: Don't Start A StartUp)
1. Interviewer: If you could do it all over again, what type of company would you found? Any differences?
2. Jensen: I wouldn't.
3. Truth is, it was a million times harder than I had thought, founding a company.
4. If I had known how much pain, suffering, and humiliation I had just signed up for...
5. I wouldn't have done it. Nobody would.
6. That's the superpower of an entrepreneur: not knowing.
7. They say, "How hard can it be?", rather than actually answering that question.
8. And once the answer starts arriving anyways, the only reason they keep going...
9. ... is self-delusion. Keep saying to yourself, "Well, how hard can the *next* thing be?"
10. And I still have to do that, every day. Nvidia Omniverse? How hard can it be? Retire? Nah, I'm still good for another bout, how hard can it be?
11. And that's why I wouldn't do it again if I could. It's just too much.
(Also, I'm surprised you didn't mention the famous case of George Dantzig solving two famous unsolved problems in statistics as a grad student, because his professor had written them on the blackboard and Dantzig assumed they must be homework problems. So when he couldn't get a solution, he just kept trying until he did, because he assumed that there must *be* a solution.)
Unfortunately, most scientists are caged by the granting systems that provide their funds. I completely agree that crazy ideas should at times be pursued and can lead to more innovation (or maybe just a cool learning experience), but when you have five years to 'accomplish' all the mundane experiments you proposed in your grant, you had best stay within the lines and get it done if you want another five years of funding, and all that comes with it: grad students, money to publish in open-access journals, and so on.
The necessary (or, ~"adequately powerful") domains of knowledge have existed for a very long time: Epistemology, Ontology, Logic, Phenomenology (perhaps plus semantics, linguistics, semiotics, etc).
What we need is a ~"scientific" method for metaphysical, non-deterministic matters, like the one science kinda has for physical, deterministic affairs (I say "kinda" because, lacking capability in the metaphysical realm, science's method is incomplete: our existence is in the metaphysical realm).
Cost/benefit analysis? Alchemy doesn't seem to have contributed anything to knowledge, and in fact probably diverted many good minds down centuries' worth of blind alleys. The deliberately fake anti-gravity machine, built to give physicists a KITA, is a popular theme in sci-fi. The position that makes more sense to me is that universal knowledge keeps increasing and it's just a question of who says "Eureka" first. That is said to apply to the double helix of DNA, the calculus, and Darwin vs. Wallace. On balance, I'd say that the kind of sniff test proposed by Gell-Mann (is it bizarre? does it contradict known physics? does it explain anything?) is what makes a scientist.