Sorites paradox, but for bits of evidence in the Bayesian prior.
Just as a heap of sand stops being a heap when it gets small enough, the difference between "science" (not just modern science, but everything from Newton and Galileo onwards) and "brute force" is how much evidence is already available before we test whatever hypothesis we're testing.
Scientific research these days requires a lot of prior information, things humanity has collectively learned, as a foundation. We put a lot of weight on our Bayesian priors for whatever hypothesis we're testing.
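To put a toy number on "weight on the prior" (every figure below is invented purely for illustration), the same experimental result lands somewhere very different depending on how much you already knew going in:

    # Toy Bayes update; all probabilities here are made up for illustration.
    def posterior(prior, p_evidence_if_true, p_evidence_if_false):
        # P(H | E) for a binary hypothesis via Bayes' theorem.
        p_evidence = prior * p_evidence_if_true + (1 - prior) * p_evidence_if_false
        return prior * p_evidence_if_true / p_evidence

    # Same evidence, different starting knowledge:
    print(posterior(0.50, 0.8, 0.3))  # uninformed prior   -> posterior ~0.73
    print(posterior(0.95, 0.8, 0.3))  # well-informed prior -> posterior ~0.98

Same likelihoods, very different posteriors; the accumulated prior does a lot of the lifting.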
Sorites even applies to your own attempt to mock it, as the difference between humans and dolphins is "just" a series of genetic changes. Absolutely they're different, and it's obvious why you chose that example, but even then it's a series of distinct small changes, each so small that it's easy to blur them together and treat them as a continuum, much as we do with water even though that's also discrete molecules.
Humanity massively predates the modern scientific method. It took millennia of mistakes to go from the Greeks being wrong about four elements to the roughly 90 naturally occurring elements, and from there to finding the nucleus (1911) and that it's made of protons and neutrons; only then did we get logical positivism (late 1920s), and it was only around WW2 (just before it, Karl Popper in 1934) that we switched to falsifiability.
Each of those is a grain on the heap. We know the fields of work, we know the space of possibilities within the paradigm, and the shape of the research can be to constrain that space without finding the answer directly: a divide-and-conquer approach that shrinks the space which then has to be brute forced.
These days we can even automate much of the more obviously brute-force parts, which is why e.g. CERN throws away so much detector data before it even reaches the "real" data processing system, and why SETI automatically filters out any signals that appear to come from within our own solar system before doing the rest of the work.
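As a minimal sketch of that shape (the domain and the filters below are invented stand-ins, nothing like CERN's or SETI's actual pipelines): cheap constraints drawn from what you already know discard most candidates before the expensive step ever runs.

    # Toy "constrain first, brute-force the remainder" pipeline.
    # Finding primes stands in for the real problem; the cheap filters stand in for
    # prior knowledge / automated triggers, the expensive check for the costly analysis.

    def passes_cheap_filters(n):
        # Deliberately lossy, like any trigger: it also drops the genuine hits 2 and 5.
        return n % 2 == 1 and n % 5 != 0

    def expensive_check(n):
        # Only run on survivors; trial division as a stand-in for the costly part.
        return n > 1 and all(n % d for d in range(3, int(n ** 0.5) + 1))

    candidates = range(2, 10_000)
    survivors = [n for n in candidates if passes_cheap_filters(n)]  # most data discarded here
    hits = [n for n in survivors if expensive_check(n)]             # brute force only what's left
    print(len(candidates), "->", len(survivors), "->", len(hits))

The interesting part is the ratio: the automated filtering does most of the volume reduction, and losing a little real signal in the process is the usual price of making the brute-force step tractable at all.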
Re: the Sorites paradox, I have a definition of a heap that I think is workable.
It's not a specific number; it's "a collection of items becomes a heap when some indeterminate number of items are obscured by other items on top of them, thus making the total number of items uncountable without disturbing the heap".
Therefore it depends on factors beyond just the number of grains of sand; if you have 1000 grains spread out on a surface so they can all be distinctly counted, that's not a heap. But if you have 1000 grains gathered together, some on top of each other, then it becomes a heap.