
Thursday, July 03, 2008

The End Of Theory?

Chris Anderson, the editor in chief of Wired Magazine, last week wrote an article, which you can find at the Edge, proclaiming

“The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.”
Anderson claims that our progress in storing and analyzing large amounts of data makes the old-fashioned approach to science – hypothesize, model, test – obsolete. His argument is based on the possibility of analyzing data statistically with increasing efficiency, for example data on online behavior: “Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.”

This, he seems to believe, makes models entirely unnecessary. He boldly extends his technologically enthusiastic future vision to encompass all of science:

“Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on.

Now biology is heading in the same direction... ”

The examples he provides rely on statistical analysis of data. It doesn’t seem to occur to him that this isn’t all of science. It strikes me as necessary to actually point out that the reason we develop models is to understand. Fitting a collection of data is part of it, but we construct a model to gain insight and make progress based on what we have learned. The point is to go beyond the range in which we have data.

If you collect petabytes over petabytes about human behavior or genomes and analyze them running ever more sophisticated codes, this is certainly useful. Increasingly better tools can indeed lead to progress in various areas of science, predominantly in those areas that are struggling with huge amounts of data and that will benefit a lot from pattern recognition and efficient data classification. But will you ever be able to see farther than others standing on the shoulders of a database?

If you had collected a googol of examples for stable hydrogen atoms, would this have led you to discover quantum mechanics, and all the achievements following from it? If you had collected data describing all the motions of stars and galaxies in minuscule detail, would this have led you to conclude space-time is a four-dimensional continuum? Would you ever have understood gravitational lensing? Would you ever have been able to conclude the universe is expanding from the data you gathered? You could have assembled the whole particle data booklet as a collection of cross-sections measured in experiments, and whatever you do within that range you could predict reasonably well. But would this have let you predict the Omega minus, the tau, the Higgs?

Anderson concludes

“The new availability of huge amounts of data, along with the statistical tools to crunch these numbers, offers a whole new way of understanding the world. Correlation supersedes causation, and science can advance even without coherent models, unified theories, or really any mechanistic explanation at all.”

With data analysis only, we might be able to discover hidden knowledge. But without models science cannot advance beyond the optimal use of available data – without models the frontiers of our knowledge are set by computing power, not by ingenuity. Making the crucial step to identify a basic principle and extend it beyond the current reach is (at least so far) an entirely human enterprise. The requirement that a model be not only coherent but also consistent is a strong guiding principle that has pointed us in the direction of progress during the last centuries. If Anderson’s “kind of thinking is poised to go mainstream,” as he writes, then we might indeed be reaching the end of theory. Yet, this end will have nothing to do with the scientific method becoming obsolete, but with a lack of understanding of what science is all about to begin with.

PS: I wrote this while on the train, and now that I am back connected to the weird wild web I see that Sean Carroll wrote a comment earlier with the same flavor, as did Gordon Watts. John Horgan wrote about problem solving without understanding, and plenty of other people I don't know added their opinions. This immediate resonance indeed cheers me up. Maybe science will have a chance. Leaves me wondering whether writing articles that cross the line from provocation to nonsense is becoming fashionable.


See also: Models and Theories.

55 comments:

  1. It just sounds to me like methods for handling complexity: numerical methods, like the methods used for weather forecasting. If you can't build a model due to the complexity, just throw a supercomputer at the data. Nothing particularly new about it.

  2. Hi Andrew,

    I could have understood had Anderson been enthusiastic about complexity and the progress that can be made in this area due to computational power, but this was not the point of his article. A numerical analysis of a complex model still is based on a model (just read the first sentence of the Wikipedia entry you link to). In this case, it is just no longer possible to analytically make predictions with the model you have. What Anderson is aiming at instead is to do without a theory, without a model, just use the data, and forget about the scientific method. Go, read his article, I am not exaggerating. Best,

    B.

  3. Oh, I only read the top bit!! I thought that was the article! He he.

    It was interesting. That kind of technique is no doubt useful for some purposes - where there isn't really an underlying model to be found. I can imagine Google or Amazon find it useful for monitoring the behaviour of millions upon millions of people on the internet - maybe that's something for which you can't find a simple model. It's similar to intelligent agents technology. I actually did some work with neural networks once and that's very similar - you just throw a lot of data at your neural network and it "learns" to classify, but at the end of the learning process you don't really understand what it's doing! I'm sure Google use something like that for their page ranking algorithm (it's not all backward links).
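
    Something like this toy sketch (scikit-learn, with random data purely for illustration; nothing like Google's actual systems):

        # Train a small neural network on data generated by a hidden rule.
        # It classifies well, but inspecting the fitted weights yields no insight.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 20))               # "throw a lot of data at it"
        y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)  # the hidden rule to be learned
        net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=1000).fit(X, y)
        print(net.score(X, y))      # high accuracy on the training data...
        print(net.coefs_[0].shape)  # ...but a (20, 32) weight matrix explains nothing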

    But you're right as far as physics is concerned - there's always some underlying simplicity in nature to be found.

  4. Yeh, the people and the media love that stuff. The End of theory, the End of science, the End of history, the End of the world. They make a good sell.

  5. Science is currently going in the exact opposite direction. Huge amounts of data have been gathered in, e.g., the biosciences, but still we don't know much about how biological systems work. Trying to find insight from a huge pile of data produces noise - accidental correlations and similar artefacts.

    To make progress we need predictive models. What use is it for us to notice - after the fact - that something happened, when we need to model, predict and act?

    For example, in finding the right action in a potential worldwide pandemic, or in fighting climate change, data is just junk by itself if we can't make predictive models.

    Replies
    1. You're probably right about the worldwide pandemic there

  6. Hmm. Even ignoring the unreasonable effectiveness of mathematics, isn't Anderson pretending that these "statistical algorithms [which] find patterns where science cannot" are somehow not models? Somehow not being used to generate and test hypotheses? I think that's where the silliness comes in.

    The greater point, that "correlation is enough," that one can derive useful models that can't be expressed analytically, is a reasonable one. Just like automated computation offers a new way to visualize mathematical systems, checking for patterns and exceptions in large datasets is a new and increasingly useful tool. I understand it's even been used to test poor, flawed quantum mechanics.

  7. 200+ years of punctilious record keeping and intense economic analysis cannot predict the next trading day's Dow-Jones average. No economist ever became wealthy off his own investments.

    A sine wave can be arbitrarily closely fitted by an algebraic polynomial until one ventures outside the fitted period. Then, the fit explodes. Intrinsic analysis need not evolve understanding or even valid extrapolation.
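
    A minimal numerical sketch of that blow-up (NumPy; the polynomial degree and sample points are arbitrary choices):

        import numpy as np

        x = np.linspace(0, 2 * np.pi, 50)         # one fitted period of the sine
        coeffs = np.polyfit(x, np.sin(x), deg=9)  # least-squares polynomial fit
        print(np.polyval(coeffs, np.pi))          # ~0: excellent inside the period
        print(np.polyval(coeffs, 4 * np.pi))      # far from sin(4*pi)=0: the fit explodes outside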

  8. uncle al: The real world is non-perturbative?

  9. It's frightening. I have a lot of "number crunchers" here where I work, and I can see that the trend is going to promote their approach. Just make huge collections of numbers and never, never try to deeply interpret them.
    Numbers without interpretation are nothing. But I cannot explain this simple concept to them, they just don't like it... You need to work your brain to be able to interpret data.
    Without resorting to Quantum Mechanics, just think of the geocentric representation, with epicycles: it was a better way to predict the motions of the "mobile stars" than Copernican theory, and it fit remarkably well with the experimental data. An "intelligent agent" would have chosen this representation over the heliocentric one... And we'd never have discovered Newtonian Gravity.

  10. Chris Anderson probably holds a stake in some startup that purports to do such data mining in a new way.

    And there is no limit to human stupidity.

  11. Hi Bee,

    “It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature.”

    -Neils Bohr

    “I want to know how God created this world. I am not interested in this or that phenomenon, in the spectrum of this or that element. I want to know His thoughts; the rest are details.”

    -Albert Einstein

    I guess it all depends on what one thinks science is or what it’s meant to accomplish. If one holds with Bohr for instance then Anderson is doing good science; that is if the sole ambition of science is to make predictions based on given situations, then that indeed is all that would be required. However, if one has the expectations for science that Einstein held, then this data mining correlation approach just won’t cut it, unless of course the computers themselves are made to be intelligent, as in sentient. One must of course be reminded that when Einstein uses the word God he actually means simply nature and not any anthropomorphic conception of one.

    Best,

    Phil

  12. Hi Andrew,

    I am not sure what you are saying with reference to the intelligent agent site you link to, esp. since it starts with explaining that there are two different ways the word is used. Did you actually mean to refer to genetic algorithms? Please keep in mind that there you have an underlying model, that which sets how the algorithm works. You don't know exactly where it goes, but you know how it adapts.

    Yes, you can throw a lot of data at a network, and you don't understand what it's doing but examine the outcome - one can do that. But how much can you learn from that is what I am asking? I was not saying this is useless, but it strikes me that Anderson is looking roughly as far as the tip of his own nose. Best,

    B.

  13. Hi Phil,

    If one holds with Bohr for instance then Anderson is doing good science; that is if the sole ambition of science is to make predictions based on given situations, then that indeed is all that would be required.

    I think you have missed the point of my writing. The predictions that science is aiming at are not constrained to predictions that lie within the data set you already have. If you've dropped a thousand apples to the ground, you can predict the next apple will also drop to the ground. That's what you can do with analysing data. But what data analysis cannot do for you is make the leap to say, hey, the moon is moving on its path because of the same gravitational force that lets the apple fall to the ground. A prediction that you'd aim at would e.g. be that additional planets in the solar system would affect the motion of the other ones. That is the kind of prediction we are looking for; it comes from generalizing an underlying concept, it comes from a hypothesis, and goes through a model. If you claim you don't need it, you throw out an essential part of science. I do not think that Bohr in the quote was constraining 'what we can say about nature', to what we can say about nature based on the data we have gathered - and then stop there, because we don't need any scientific method anymore. Best,

    B.

  14. There's another point against the "crunching" of data as a means to the progress of science without theory. Anderson "forgets" that all data are interpreted even before they are collected (in the design of the experiments according to existing or proposed theories and modes of thinking burdened with the whole history of science). He is being naive or else promoting some kind of "uniform thinking".
    If you say you only have to process the data with detail, you're admitting there's only one way to process the data while you're not allowing the possibility to reduce the experimental data according to different patterns. The same statistical population can have totally different meanings if you change the variables considered important (or their definition), the cause for the effect, or the origin supposed for a correlation. Science is a language and you should be able to read the same symbols (data) with different idioms (theories).

  15. Hi Bee,

    “I do not think that Bohr in the quote was constraining 'what we can say about nature', to what we can say about nature based on the data we have gathered - and then stop there, because we don't need any scientific method anymore.”

    Inasmuch as I cannot speak for those who have passed on with absolute certainty, I can’t dispute you. However, statements like the one I offered of Einstein’s were made in reaction to how he considered Bohr and his followers to be thinking at the time. I also firmly agree with what you hold as a definition for science. On the other hand, once science has come to the point where it is only considered to be describable within a statistical context, and if that is truly considered to be the only one relevant, how can we be so certain that Anderson’s program is not all that we are left with? For me it comes down to one of those throwing the baby out with the bath water scenarios.

    Oh yes, I did leave out the first part of Bohr’s statement so as to give it a more general context, for it actually reads as follows:

    “There is no quantum world. There is only an abstract physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature.”

    I would ask then in this more precise context, what would you consider to be Bohr’s idea of the limitations we are stuck with?

    Best,

    Phil

  16. Hi Phil,

    Well, I am more with Einstein than with Bohr as you probably know, but still Bohr's attitude (or what I interpret it as) I can relate to, in contrast to Anderson's. I would guess what Bohr meant to express is the pragmatic point of view that the task of the physicist is first and foremost to describe nature, to try to say something about nature. Yet the way we obtain new insights and go beyond what we can already say is based on abstractions, it is based on models that allow us to see farther. It is this concept that Anderson seems to deny is essential.

    Einstein's point of view to me expresses that this pragmatic approach might allow us to describe natural phenomena to high precision but it fails to take into account the relevance of inspiration that drives scientists, the search for the big picture. I too think that this inspiration, together with the philosophical questions, is an essential aspect of physics and it is a mistake to leave it out. Either way, I don't think this has anything to do with what Anderson wrote. Best,

    B.

  17. Dear Arun,

    Chris Anderson probably holds a stake in some startup that purports to do such data mining in a new way.

    Possibly... Leaves me wondering why the Edge promotes such writing. On their website it says "The mandate of Edge Foundation is to promote inquiry into and discussion of intellectual, philosophical, artistic, and literary issues, as well as to work for the intellectual and social achievement of society." As much as I try, I can't find anything in Anderson's article that is either intellectual, philosophical, artistic or literary. It strikes me as the kind of writing typical of bloggers that originates in a sudden thought that, if followed further, would crumble into dust, so you better write it down immediately and publish it before you realize it's nonsense. Best,

    B.

  18. Hi Anonymous,

    “If you say you only have to process the data with detail, you're admitting there's only one way to process the data while you're not allowing the possibility to reduce the experimental data according to different patterns.”

    Yes, and yet in quantum mechanics for instance, when we are looking for a wave is that not what we find, and when looking for a particle the same? This has also been extended to: if we are looking for both, that is also what we find. To agree with J.S. Bell, when theories constructed in such an ambiguous and imprecise fashion are generally accepted, we end up with opinions on the future and fate of science such as the one Anderson holds.

    Best,

    Phil

  19. "What Anderson is aiming at instead is to do without a theory, without a model, just use the data, and forget about the scientific method." - Bee, second comment

    Feyerabend's Against Method, 1975, argues that in fact there isn't a scientific method; scientists just use whatever method proves most useful for the task.

    E.g., if you look at Archimedes' proof of buoyancy, it's entirely fact-based. There is no speculative assumption or speculative logic involved at all, so you don't need to test the results. It's a fact-based theory. (The water pressure at equal depths in water will be the same in free water as in a location with a floating object above the point in question. Hence, the weight of the water which is displaced by the floating object must be precisely the same as the weight of the floating object. QED.)

    Modern physics uses a different kind of theory. The fact that speculative assumptions play a role in modern theories, instead of entirely factual input, means that the theory must make checkable predictions to make it scientific.

    E.g., you could speculate about extra dimensions and then search around for twenty years trying to make a falsifiable prediction from the resulting 10^500 possible extra dimensional universes.

    Even if it did make a falsifiable prediction which was confirmed, that wouldn't prove the theory, because (as with Ptolemaic epicycles and the earth centred universe) the theory may be a good approximation only because it is a complex model selected to try to fit the universe. E.g., string theory starts with a 2-d spacetime stringy worldsheet and adds 8 more dimensions to allow conformal symmetry. This gives 10 dimensions, and since only 4 spacetime dimensions are directly observable, string theory compactifies 6 dimensions with a Calabi-Yau manifold.

    This is mixing facts with speculation in the theoretical input, rather in the way that epicycles were added to the Ptolemaic universe to incorporate corrections for observed phenomena.

    The successes of the Ptolemaic theory in predicting planetary positions were not due to its basic theoretical correctness (it's a false theory), but were due to the fact that it was possible to make a false theory approximate the actual planetary motions due to the factual input.

    So even if string theory was a success in making a validated falsifiable prediction, that wouldn't necessarily be a confirmation of the speculative assumptions behind the theory, but just a confirmation due to the factual input which constrained the endlessly adjustable framework to correctly model reality. String theory is constrained to include 4 spacetime dimensions and spin-2 gravitons because of its selected structure.

    There is really a larger set of string theories with an infinite number of alternatives, and the selection of the 10^500 universe variants of M-theory with 10/11 dimensions from that set is based on the anthropic principle.

    The more anthropic constraints (4 observable macroscopic spacetime dimensions, spin-2 hypothetical gravitons, a small positive cc, etc.) you apply to the basic stringy idea, the smaller the landscape size. But even if you got the landscape size down to 1 universe, it would only be a speculative model constrained by empirical data via ruling out all the stringy universe models which are wrong. There will be no evidence that the best string model is the real universe; it might be just like the best version of Ptolemaic epicycles.

    So I think it's best to try to search out physics empirically, instead of starting with a mixture of speculation constrained by facts.

    A fact-based theory is quite possible. Back in 1996, I spotted that the central error was made when Hubble discovered the acceleration of the universe in 1929, but reported it instead as a constant ratio of recession velocities to distances.

    If only he had recognised that the distances were (in spacetime) times past, he would have got a constant ratio of recession velocities to times, which has units of acceleration (unlike Hubble's reported constant velocity/distance, which has units of 1/time). With this acceleration, he would have predicted the acceleration of the universe in 1929, which was discovered in 1998. This would also have led to the correct theory of quantum gravity back in 1929, because the observable outward radial acceleration of the mass of the universe implies an outward force, which by Newton's 3rd law is accompanied by an inward reaction force, allowing you to predict the graviton induced coupling constant G.

    Hubble law: v=Hr

    a = dv/dt = d(Hr)/dt = H(dr/dt) + r(dH/dt) = Hv + 0 = rH^2 ~ 6*10^{-10} ms^{-2} at the greatest distances.

    If Hubble had reported his recession constant as v/t = a, then since t = r/v, we have a = v/(r/v) = (v^2)/r = (Hr)v/r = Hv = rH^2, exactly the same result as that given above by differentiating v = Hr.
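
    As a quick numerical check of the quoted magnitude (a sketch assuming H ~ 70 km/s/Mpc; it checks the arithmetic only, not the interpretation):

        H = 70e3 / 3.086e22  # Hubble constant in 1/s (70 km/s per megaparsec)
        c = 3.0e8            # speed of light, m/s
        r = c / H            # "greatest distances", taken here as the Hubble radius, m
        print(r * H**2)      # = c*H, roughly 6.8e-10 m/s^2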

    It was really a tragedy for physics that Hubble reported his constant in terms of distance only, not time (he was ignoring spacetime).

    This really messed up the development of physics, with people ignoring the physical mechanism and spending years applying metrics to fit the data instead of seeing the physical meaning and mechanism at play.

    So from my perspective (as far as I'm concerned, everyone else is a crackpot when it comes to the acceleration of the universe, since they won't listen to the facts but are entirely prejudiced in favour of obfuscation and error), Chris Anderson has hit the nail on the head.

    Physicists since the 1920s have stopped constructing physical theories from factual evidence. It's pretty clear that gauge theory is correct in that fields are due to randomly exchanged field quanta (which in large numbers approximate to the classical field expressions), so the lack of determinism for the path of an electron in an atom is due to the random exchanges of Coulomb field quanta with the proton, causing the motion of the electron to be non-classical (analogous to the Brownian motion of a small piece of a pollen grain due to random impacts of air molecules).

    This is a factual mechanism for wave phenomena appearing on small scales, because the existence of vacuum field quanta can be experimentally demonstrated in the Casimir force, in pair-production from gamma rays exceeding 1.022 MeV energy when they enter a strong force field near a nucleus, and the discovery of the weak gauge bosons in 1983.

    However, the mainstream sticks to a non-mechanism based discussion of the lack of determinism of the electron's motion in the atom.

    They don't want mechanistic theory, they think it has been disproved and they just aren't interested.

    People think that empirical equations, like the Schroedinger and Dirac equations, that numerically model phenomena and make predictions, make physical understanding of mechanisms unnecessary.

    The basic problem I believe is that people believe now in a religious way that the universe is mathematical, not mechanism based. Any argument in favour of a mechanism is therefore seen as a threat to the mathematics by these ignorant believers.

    It's pretty funny really to see Feynman's attacks on mathematical religion:

    ‘The same situation exists with electrons: when seen on a large scale, they travel like particles, on definite paths. But on a small scale, such as inside an atom, the space is so small that there is no main path, no ‘orbit’; there are all sorts of ways the electron could go, each with an amplitude. The phenomenon of interference [due to random exchanges of photon field quanta between the electron and the proton] becomes very important, and we have to sum the arrows to predict where an electron is likely to be.’ - R. P. Feynman, QED, Penguin, London, 1990, page 84-5.

    This statement by Feynman and the accompanying Feynman diagram in the text showing an electron randomly exchanging photons with a proton in a hydrogen atom, and thereby undergoing a non-classical orbit, proves that the mainstream quantized (QED) Coulomb field model is the cause of the loss of determinism for the trajectory of an electron in an atom. The electron can't go in a smooth circular or elliptical orbit because on small scales the exchanged photons of the quantized electromagnetic field cause its direction to fluctuate violently, so its motion is non-classical, i.e. non-deterministic or chaotic.

    To me this is a major find, because I'm interested in the mechanisms behind the equations. Nobody else seems to be, so in this sense I agree that theory is coming to an end. Or rather, the search for deep mechanism-based factual theories is being replaced by half-baked philosophy of the Copenhagen 1927 variety, where questions that lead towards investigations into physical mechanisms are simply banned.

    What you do is you get some famous physicist to stand up and rant that nobody will ever know mechanism, that it is unhelpful. He doesn't say that when a mechanism is discovered, it might be very helpful in predicting the strength of gravity. He just attacks it out of ignorance. Everyone else then applauds, because they naively think the guy is defending empirical equations.

    Actually, it's pretty clear that empirical equations are useful at hinting at underlying mechanisms, and are not an alternative to mechanisms, contrary to Feynman's fear that mechanistic theory may somehow replace the associated mathematical equations:

    ‘It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of spacetime is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities.’ - R. P. Feynman, The Character of Physical Law, November 1964 Cornell Lectures, broadcast and published in 1965 by BBC, pp. 57-8.

    This kind of fear that empirically based mathematical equations are in danger from deeper mechanistic models is groundless. All you get from discovering the mechanistic model is a clearer understanding of the equations plus predictions of empirical constants in those equations. You don't replace or destroy the equations, you just add more useful relationships and predictions.

  20. Hi Nige,

    Yes, I do even agree with Feyerabend that speaking of THE scientific method is already a constraint on plurality that neglects the fact that during our history the discovery of successful theories hasn't strictly followed any method. As far as I am concerned the reason for this is simply that science is a human enterprise, it is based on creativity and works best when left as unconstrained as possible - at least as long as the ethics in the community is intact.

    However, I don't think Feyerabend had in mind to throw out human hypothesizing and replace it with database sifting. I am more than willing to let any approach be pursued to obtain whatever insight there can be obtained, my problem with Anderson's writing is that he argues this would make other approaches obsolete. Best,

    B.

  21. So if number crunching and data mining is the be-all and end-all,

    I guess we should be able to make a perfect souffle every time, or the perfect cup of tea,

    or make the perfect gin & tonic. But this still raises the question of which gin & tonic is the best.
    And how we make the 'best' gin is still about hypothesis and testing, and purely subjective.

  22. But hey if data mining can cure tooth decay, and/or cancer
    I'll die a happy man.

  23. "I am more than willing to let any approach be pursued to obtain whatever insight there can be obtained, my problem with Anderson's writing is that he argues this would make other approaches obsolete." - Bee

    Thanks for responding, Bee.

    I fear that if it hasn't already done so, then the ability to store, interpolate, and extrapolate directly from vast quantities of data with computers will eventually make other approaches (theory and mechanism) obsolete, simply because it's faster and easier, and more suited to large groups of sociable physicists who want to get the day's work done, submit a paper and then go off down to the pub for the evening. This is a different culture from the situation innovators like Copernicus, Kepler, and Einstein were in, worrying about problems for long periods until they managed to find an answer. (String theorists may say that they are deep thinkers like Einstein, but really they are missing the point: Einstein tackled real problems and made falsifiable predictions, he didn't tackle imaginary unification speculations with a lot of collaborators for twenty years and fail, or if he did - in his old age - he wasn't respected scientifically for doing that.)

    The old way to formulate a theory was to start with hard facts from nature, e.g. get some data, plot it, and then find equations to summarise it in a convenient way (as Kepler did using Brahe's data, and as Galileo did using his measurements of fall times for balls), and then link up the different equations into one theory (as Newton did when linking up Kepler's equations of planetary motion with Galileo's equation for falling objects). Another example of this type is Balmer's line spectra formula, which was eventually generalized by Rydberg and finally explained by Bohr's atomic model.

    But if you have a means to store and analyze trends in large amounts of data, you don't necessarily get beyond the first stage, or find the correct equations at that stage. What happens instead is that people find very complex empirical equations with lots of arbitrarily adjustable parameters to fit to the data, and don't have the time or interest to move beyond that. In addition, sometimes the empirical equations can't ever lead to a theory because they are actually wrong at a deep level although numerically they are good approximations.

    I came across this in connection with air blast waves. Early measurements of the way the peak overpressure (of up to about 1 atmosphere) falls behind a shock front were roughly approximated by the expression

    P_t = P(1 - t)*exp(-t),

    where P is peak overpressure and t is normalized time (time from the arrival of the shock front at the location, in units of the overpressure duration). Later it was found that at higher peak overpressures the fall rate was faster, so the exponential term was made a function of the peak pressure. But the curve shape was found to be slightly in error when the blast was simulated by numerical integration of the Lagrangian equations of motion for the blast wave. Brode eventually found an empirical fit in terms of a sum of three exponential terms, each with different functions of peak overpressure. It was very complicated.

    However, it seems that the correct curve doesn't actually contain such a sum of exponential terms, and it is very much simpler than Brode's sum of three exponentials:

    P_t = P(1 - t)/(1 + 1.6Pt).

    The denominator is a theoretical model for the fall in pressure due to the divergence of the expanding blast wave after the shock front has passed the location of the observer.
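
    A minimal sketch comparing the two curve shapes (Python; the 1.6 coefficient comes from the expression above, while P and the time grid are arbitrary illustrative values):

        import numpy as np

        P = 1.0                                     # peak overpressure, atmospheres
        t = np.linspace(0.0, 1.0, 5)                # normalized time after shock arrival
        exp_fit = P * (1 - t) * np.exp(-t)          # early exponential approximation
        rational = P * (1 - t) / (1 + 1.6 * P * t)  # simpler divergence-based form
        for ti, a, b in zip(t, exp_fit, rational):
            print(f"t={ti:.2f}  exp={a:.3f}  rational={b:.3f}")

    Both forms agree at the shock front (t = 0) and at the end of the positive phase (t = 1); the difference is in how fast the pressure falls in between.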

    The problem with modern scientific research which creates vast amounts of data that can be analysed by computer is that it's too easy to fit the data to a false set of equations to a good approximation, just by means of having a lot of adjustable parameters.

    The resulting equations then lead nowhere, and you can't get a really helpful theory by studying them.

    Take the question of fundamental particle masses. There is plenty of empirical data, but that doesn't lead anywhere. In some respects there is too much data.

    Mainstream efforts to predict particle masses are terrible lattice QCD calculations. The masses associated directly with up and down quarks are only about 3 and 6 MeV, respectively. The proton's mass is 938 MeV, so the real quark masses are only about 1% of the mass of the observable particle. The other 99% is from virtual quarks, so since QCD involves complex interactions between strong field quanta (gluons) and virtual quark pairs, there is no simple formula and the calculations have to involve various convenient assumptions.

    The theory of particle masses is the mass-providing Higgs-type field and quantum gravity (since inertial and gravitational masses are the same, according to the equivalence principle).

    If the mass-giving field is quantized into units, then the array of observed particle masses may depend on how different standard model particles couple to discrete numbers of mass-giving field quanta.

    The temptation is for large groups of sociable physicists to collaborate together to run computer simulations, predictions, and data analysis, coming up with very complex empirical fits to existing theories.

    So the data deluge does seem to reduce the prospects for new theoretical breakthroughs. Another example is the effects of radiation at low dose rates. If you look at data from the nuclear bombings of Japan, far more people were exposed to lower doses than very high doses, so the cancer risks of lower doses are actually known with greater statistical accuracy. However, it is politically incorrect to try to work out a theory of radiation effects based on well-established data, because of prejudices. So they just build up a massive database and improve the dosimetry accuracy, but no longer plot the data on graphs or try to analyse it.

    Most physicists fear the crackpot label, and strive not to be too innovative or to solve problems which the mainstream groupthink prefers to sweep under the carpet and ignore.

  24. Hi Phil,

    Albert Einstein:

    “I want to know how God created this world. I am not interested in this or that phenomenon, in the spectrum of this or that element. I want to know His thoughts; the rest are details.”

    Neils Bohr:

    “There is no quantum world. There is only an abstract physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature.”

    With A.Einstein I have no problem, but I will appreciate it greatly if you will remind me what N.Bohr's contributions in physics are.

    Hi nige,

    “E.g., if you look at Archimedes' proof of buoyancy, it's entirely fact-based. There is no speculative assumption or speculative logic involved at all, so you don't need to test the results.”

    Your interpretation is exactly the opposite of the actual story. It is the classical (ancient) example of the axiomatic top -> down deductive derivation made in the bath. Therefore “you don’t need to test the” conclusions. The usual behavior of a human when he happens to receive the message from Him is to start running naked in the streets. A.Einstein suffered the same fate.

    However, the top -> down approach usually requires the preliminary stage of the complementary bottom -> up phenomenological data processing. Consider for example the chain: T. Brahe -> J.Kepler -> I.Newton. The modern analogy: T.D.Lee and C.N.Yang -> M.Gell-Mann and R.P. Feynman -> S. Weinberg and A.Salam ->??? (Relativistic QM).

    Regards, Dany.

  25. The last time I checked, we, as humans, were still primarily made of carbon, not silicon. So unless we've gone extinct, then models, theories, and above all, the scientific method won't go extinct, either. But I guess it's possible that we could evolve into silicon-based lifeforms, thus enabling us to uncover Nature with raw data alone.

  26. "Of being bold," to being bold.

    Formulating goes on all the time. :)

  27. Hi Dany,

    “With A.Einstein I have no problem, but I will appreciate it greatly if you will remind me what N.Bohr's contributions in physics are.”

    I’m not sure where you’re going with this other than perhaps to indicate that I mistakenly wrote Neils instead of what it should have been, Niels Bohr.

    Best,

    Phil

  28. Hi Phil,

    “I’m not sure where you’re going with this other than perhaps to indicate that I mistakenly wrote Neils instead of what it should have been”

    To arrange the question I used copy-paste and missed the typo.

    I get the impression that you are interested in the history of science/physics. The story with N.Bohr is a mystery for me. I doubt that he was familiar with analytical mechanics and adiabatic invariants, and I don’t understand why A. Einstein, E. Schrödinger and W. Heisenberg were interested in discussing anything with him. I addressed you with the hope that you might provide an explanation.

    Regards, Dany.

  29. Hi Dany,

    “I get the impression that you are interested in the history of science/physics. The story with N.Bohr is a mystery for me. I doubt that he was familiar with analytical mechanics and adiabatic invariants, and I don’t understand why A. Einstein, E. Schrödinger and W. Heisenberg were interested in discussing anything with him.”

    Inasmuch as he was one of the coauthors of the Copenhagen interpretation and also the founder and head of its Institute for Theoretical Physics, I don’t know why you would hold such notions. As for adiabatic invariants, they were already a part of the old quantum theory, and as for analytical mechanics, that approach is more closely associated with Feynman’s later-developed path integral formulation of QM. You are however correct that Bohr was mainly responsible for the old model and it was Heisenberg, Schrödinger, de Broglie and others (the young blood) that developed and ushered in the new one. This however is not to say that Bohr didn’t understand the work, although he was more closely associated with the philosophical implications of the new physics, which is the aspect of Bohr that was so in contrast to Einstein’s thinking.

    If you are truly interested in all this and the period, Antony Valentini (formerly of PI) and Guido Bacciagaluppi more recently wrote a book called Quantum Theory At The Crossroads: Reconsidering the 1927 Solvay Conference, which lends one more insight into the pivotal period and the issues, some of which are still largely misunderstood and at the heart of the current ontological debate involving the foundations of quantum theory.

    Also, it should be acknowledged that this is nonetheless off topic and has no real place in this thread.

    Best,

    Phil

  30. Hi Phil,

    Do you never sleep? I was about to say the same -

    Dany: I am absolutely not interested in why you think Einstein wouldn't have been interested in talking to Bohr or whatever, so would you please stick to the topic, thanks.

    -B.

  31. Hi Bee,

    Yes, I am having one of those (rare) sleepless nights. However, I’m going to finish my cup of tea and try and get a few more hours in. Also, once again my apologies for the drift off topic.

    Best,

    Phil

  32. No problem. I just didn't want that pointless squabble to go on forever. Get some sleep :-)

    -B.

  33. It's funny how people who don't really know what science is always try to attack the scientific method.

  34. Hi Phil,

    Thank you very much for the ref. But my God, it is 553 pages!

    Hi Bee,

    “would you please stick to the topic”

    Bee, good morning! I will be glad to keep silent since I have no idea what the topic is.

    Regards, Dany.

  35. Hi Cynthia,

    Cynthia: "But I guess it's possible that we could evolve into silicon-based lifeforms, thus enabling us to uncover Nature with raw data alone".

    Do you remember "Data" in Star Trek? Well, guys in their thirties should remember him. He was an android that gradually developed human skills: emotions, imagination etc. Data had the ability to analyze vast amounts of data very, very fast. Well, in the end Data became a professor of physics at Cambridge. So you could do it the other way around. You could build machines with human skills. This way you could get the correct mathematical and physical models by the analysis and interpretation of raw data. Why not? The machines have already defeated us at chess.

    Regards

  36. Hi nige,

    “E.g., if you look at Archimedes' proof of buoyancy, it's entirely fact-based. There is no speculative assumption or speculative logic involved at all, so you don't need to test the results.”

    Your interpretation is exactly the opposite of the actual story. It is the classical (ancient) example of the axiomatic top -> down deductive derivation made in the bath. Therefore “you don’t need to test the” conclusions. ...
    - Dany.

    Dany, I've read Archimedes On Floating Bodies. Archimedes wasn't floating in his bath.

    Maybe you need to be a bit careful before you muddle things up and then accuse other people of getting the story wrong.

    In the case of floating bodies, the mass of water displaced is equal to the mass of the floating object.

    But in the case of Archimedes in the bath (or rather the problem he was thinking about at the time when he saw water being displaced over the edge of the bath: the gold crown whose density he had to ascertain for the King, to make sure that it wasn't alloyed with silver), what happens is entirely different.

    The volume of water displaced is equal to the volume of the object submerged.

    So for a floating body, Archimedes' law of buoyancy is that the mass of water displaced is equal to the mass of the floating object, but in the case of a submerged gold crown (or Archimedes submerged in his bath), it is the volume of water that is displaced which is equal to the volume of the waterproof portion of the object submerged.

    There is a big difference. The episode of Archimedes in the bath was concerned with the gold crown of King Hiero II. Archimedes had to find its density. Weighing it was easy, but he then had to find a way of ascertaining its volume, and that's what he did in the bath. His books "On Floating Bodies" were of no use in this regard, since they dealt with mass, not volume. Even if gold did float, which of course it doesn't, the law of buoyancy would have been of no use in determining its volume because the water displaced by a floating object only tells you the mass of that floating object, not the volume. He could find the mass easily by weighing the crown. What he wanted to find, when he had the Eureka moment, was the volume of a crown whose shape is very complex and way beyond easy volume calculations.

    In any case, this was a fact-based proof. It wasn't speculative. The life of the crown-maker depended on the outcome of whether it was just gold or was actually adulterated with silver. Archimedes wasn't speculating in finding the conclusion. He observed when sunk in the bath that his volume was squeezing water out, displacing a volume of water equal to his own volume. This was a fact.

    (Actually Archimedes' work suggested an analogy in the context of the big bang and the gravitational field. Since the gravitational field, composed of gravitons exchanged between masses, is filling the voids between masses, you would expect important effects on the graviton field due to the recession of masses in the big bang. It turns out that predicts gravitational field strength.)

    Archimedes' book On Floating Bodies Book 1 begins with postulate 1, which is a simple statement that:

    "... a fluid ... is thrust by the fluid which is above it ..."

    This is just the observed consequence of gravity. I.e., the weight bearing down on a stone within a column of stones depends on the total weight of stone in the column above the particular height of interest. This addition of vertically distributed weights bearing down is not speculative: it is an empirical fact and can be demonstrated to hold for water, since the weight of a bucket of water depends on the depth of the water in the bucket.

    Archimedes' Proposition 5 is that:

    "Any solid lighter than a fluid will, if placed in a fluid, be so far immersed that the weight of the solid will be equal to the weight of the fluid displaced."

    His proof of this is to consider the pressure underneath the solid, showing that if you drop a floating object on to the water an equilibrium is reached:

    "...in order that the fluid may be at rest...",

    whereby the pressure equalises along the bottom at similar depths, whether measured directly below the floating object or to one side of it.

    This is Archimedes' proof of the law of buoyancy for floating objects. He bases everything on postulates, and it is mathematically right like Einstein's relativity. Notice that it does not explain in terms of mechanism what is causing upthrust. It is a brilliant logical and mathematical argument, yet it does not explain the mechanism.

    The mechanism is quite different: upthrust is caused physically by the greater pressure at greater depths acting upwards against the bottom of the floating ship or other object.

    The reason for the equality between the weight of displaced liquid and the upthrust force is simply that the total upward force is equal to the water pressure (which is directly proportional to depth under the water) multiplied by the horizontal area, and this product of pressure and area must be exactly equal to the weight of the ship in order that the ship floats (rather than either sinking or rising!). Hence we can prove Archimedes' proposition 5 using a physical mechanism of fluid pressure.
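
    A worked numeric check of that equilibrium (a sketch with made-up dimensions for a flat-bottomed barge):

        rho, g = 1000.0, 9.81           # water density (kg/m^3), gravity (m/s^2)
        A = 20.0                        # hypothetical flat bottom area, m^2
        m_ship = 10000.0                # hypothetical ship mass, kg
        d = m_ship / (rho * A)          # equilibrium draft: pressure x area = weight
        upthrust = rho * g * d * A      # upward force from pressure at depth d
        print(d, upthrust, m_ship * g)  # draft 0.5 m; both forces 98100.0 N

    The displaced volume A*d then has mass rho*A*d = m_ship, which is exactly proposition 5 recovered from the pressure mechanism.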

    The same mechanism of buoyancy explains why helium balloons rise. Air density at sea level is 1.2 kg/m^3, so the pressure of air bearing down on the top of a balloon is slightly less than that bearing upwards on the base of it. The net pressure difference between the top and bottom of the balloon depends on the mean vertical extent of the balloon, but the upthrust force depends on the mean horizontal cross-sectional area, as well as upon the vertical extent.

    So an elongated balloon doesn't experience any difference in upthrust when rotated to minimise its vertical extent, because although the vertical pressure gradient is then minimised, the horizontal cross-sectional area is increased by doing so, which cancels out the effect of the change in the pressure difference. Only if it is actually unanchored and free to ascend will the shape have an effect on the buoyancy (by causing drag forces).

    There is a massive difference between the ingenious logic used by the mathematician to get the right answer, and the physical mechanism of buoyancy. Archimedes avoided speculations, observed that the pressure at a given depth under water is independent of whether there is or is not a floating object above, and from this proved that if there is a floating object above a given point, whatever its weight is, it must be exactly the same weight as that of the water which is displaced.

  37. Nige: Would you please hold back on your elaborations about Archimedes? This isn't a discussion forum. I've written a post about Anderson's article (see top of the page), and I would really appreciate if you could stick with the topic. Thanks,
    B.

  38. After eventually reading through the article, it seems to me that there are two major points in Anderson's reasoning that are a bit inconsistent.

    For example, when he says that "The new availability of huge amounts of data, along with the statistical tools to crunch these numbers, offers a whole new way of understanding the world.", I do not see how this is that different from what our brains do every second, sifting through the huge amount of input provided by our senses. Yes, we recognise patterns, but this is just the first step in a "scientific" analysis that tries to throw away coincidences and establish real, causal patterns, or models. The availability of huge amounts of electronic data and the capacity to analyse them seems to me to be just this step, transferred to the big world of data. It's not clear to me a priori why this should supersede the next step of model building, and Anderson does not give a justification either.

    And when he writes about Google, "This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear.", and with respect to Venter, "All he has is a statistical blip — a unique sequence that, being unlike any other sequence in the database, must represent a new species." - I wonder, how can one use applied mathematics or recognise unique sequences without having a model, albeit a rudimentary one, of the data one is about to analyse in the first place? One could argue that this implies a trivial definition of "model", but then, isn't the claim that "All models are wrong" either trivial or completely misleading?

    On the example of the CluE projects that will include "simulations of the brain and the nervous system and other biological research that lies somewhere between wetware and software", I can just concur with Juha Haataja's comment: As far as I know, neuroscientists for example have huge amounts of data about, say, brain activity already now, and are starting to look for good models to make sense of these data. This was one of the motivations for the foundation of the Frankfurt Institute for Advanced Studies, and probably for many other institutes that try to bring together neuroscientists and researchers from physics or other sciences.

    As I see it, the data mining techniques can be a great opportunity to enormously widen our capacity to sift through huge amounts of data and to find patterns within them - but I don't see at all how and why this could supersede the next step of model building to make sense of these patterns.

    Actually, models can tell us that some complex features we may see in data are there just by chance, or may be emergent, and that there may be no point in trying to figure out a deeper meaning behind the pattern. This week I heard an interview on German radio on the occasion of the 125th birthday of Franz Kafka, where the interviewee was wondering who is responsible for the "kafkaesque" effects of globalisation, the inner workings of the stock market, or the rising prices of oil and food, obviously convinced that there are actual people responsible for and steering all this... Here, models can tell us that this need not be the case.

    Best, Stefan

  39. Hi Bee,


    “Would you please hold back on your elaborations about Archimedes? This isn't a discussion forum. I've written a post about Anderson's article (see top of the page), and I would really appreciate if you could stick with the topic.”

    You are of course correct that there is little that relates Anderson’s thoughts on the future of science to Archimedes’ development of the concept of specific gravity. However, there is a connection with Archimedes which I think does apply and which you might find of interest. It involves what is known as “Archimedes’ Method”, a mathematical method that could be considered the precursor of Leibniz’s/Newton’s calculus, and that only became known with the discovery of the so-called Archimedes Palimpsest in 1906. This was a mechanical method of analysis that he considered to be used only as a guide to truth, which one would afterwards need to demonstrate through “double reductio ad absurdum” to be considered proven. It is interesting how he explained why the method was useful in the first phase of discovery:

    “I thought fit to write out for you and explain in detail in the same book the peculiarity of a certain method, by which it will be possible for you to get a start to enable you to investigate some of the problems in mathematics by means of mechanics. This procedure is, I am persuaded, no less useful even for the proof of the theorems themselves; for certain things first became clear to me by a mechanical method, although they had to be demonstrated by geometry afterwards because their investigation by the said method did not furnish an actual demonstration. But it is of course easier, when we have previously acquired, by the method, some knowledge of the questions, to supply the proof than it is to find it without any previous knowledge.”

    I find it therefore interesting that more than 2200 years ago, what could be considered to be truth in terms of method, what place those methods serve, and how they should be considered legitimate were also a concern. Although I would disagree with Anderson that his data mining approach could ever replace the scientific method(s) as the only means required, as with Archimedes’ Method it can perhaps lend some insight as to what the answer might be and from what direction it could be approached.

    Best,

    Phil

  40. Hi Bee,

    Come on! You remind me of our D. Itzik, Speaker of the Knesset (to avoid misinterpretation – I consider her a remarkable individual, woman and politician).
    Nige’s and Phil’s comments are 10^23 times more interesting than Anderson’s ideas.

    Regards, Dany.

  41. I like that Stefan continues to push the topic here.

    Even though I see 't Hooft as jumping ship here on string theory, his work can be highlighted in regard to what we see in the use of our computers to assist SETI and LIGO operations from our desktops.

    I do not have to remind one of what neutrinos have accomplished in our views of the cosmos, or how we perceive the cosmos, as to this new window (gamma-ray view of our sun), as we see tscan computer programming validating the information in those neutrino detectors.

    So there is an "overseeing potential" here in regard to data collection. A new view of the cosmos that includes gamma-ray detection.

  42. Plectics, by Murray Gell-Mann

    It is appropriate that plectics refers to entanglement or the lack thereof, since entanglement is a key feature of the way complexity arises out of simplicity, making our subject worth studying.

    See his site here.

    Gerard ’t Hooft: “No ‘Quantum Computer’ will ever be able to outperform a ‘scaled up classical computer.’”

  43. Dear Stefan,

    Thanks for your interesting comment.

    The interviewee was wondering who is responsible for the “Kafkaesque” effects of globalisation, the inner workings of the stock market, or the rising prices of oil and food, obviously convinced that there are actual people responsible for and steering all this... Here, models can tell us that this need not be the case.

    If the effects are unwanted, we are responsible for not steering them. That's what models can help us with. Best,

    B.

  44. ...you just have to open your eyes:)

    Newton's Translation of the Emerald Tablet


    It is true without lying, certain and most true. That which is Below is like that which is Above and that which is Above is like that which is Below to do the miracles of the Only Thing. And as all things have been and arose from One by the mediation of One, so all things have their birth from this One Thing by adaptation. The Sun is its father; the Moon its mother; the Wind hath carried it in its belly; the Earth is its nurse. The father of all perfection in the whole world is here. Its force or power is entire if it be converted into Earth. Separate the Earth from the Fire, the subtle from the gross, sweetly with great industry. It ascends from the Earth to the Heavens and again it descends to the Earth and receives the force of things superior and inferior. By this means you shall have the glory of the whole world and thereby all obscurity shall fly from you. Its force is above all force, for it vanquishes every subtle thing and penetrates every solid thing. So was the world created. From this are and do come admirable adaptations, whereof the process is here in this. Hence am I called Hermes Trismegistus, having the three parts of the philosophy of the whole world. That which I have said of the operation of the Sun is accomplished and ended.

  45. Curiously, there really can't be any "theory" to show us a way if modal-realist type ideas (like Tegmark's) are really "true" - all descriptions exist, and we are just in the one that acts like this. There'd be no "reason why", no underlying "conceptual scheme" etc, there would only appear to be.

  46. This comment has been removed by the author.

  47. Hi Neil,

    “Curiously, there really can't be any "theory" to show us a way if modal-realist type ideas (like Tegmark's) are really "true" - all descriptions exist, and we are just in the one that acts like this. There'd be no "reason why", no underlying "conceptual scheme" etc, there would only appear to be.”

    If you are talking about hypotheses like many worlds, many universes or the anthropic principle, they are all explanations that would have us accept the answer “because” to the question “why”. As far as I’m concerned, the belief that data mining is all that’s required to further science is simply much the same.

    I feel that for many, when the answers become too difficult, they simply deny the relevance of the question. I wouldn’t refer to this as science; I would call it intellectual defeatism.

    Best,

    Phil

  48. Giotis,

    Now if this character named "Data" has quantum intelligence, not just digital intelligence, then he may indeed gain the upper hand over carbon-based intelligence. But if he's any reflection of a quantum computer, then he's got a ways to go before becoming a full-fledged member of reality.

  49. Mmhh - I just got a subscription to Wired magazine, and this article by Anderson (which in fact is on the cover) makes me wonder if this was such a good idea. Clearly Anderson understands very little about science.

  50. Hi Plato,

    “No ‘Quantum Computer’ will ever be able to outperform a ‘scaled up classical computer.’”

    G. ’t Hooft didn’t point out who told him that. In the best case (that it is a scientifically justified statement) it is a no-go-type assertion, which usually turns out to be wrong.

    But O.K., so what? Science is not a soccer or tennis championship. Let’s suppose for a moment that the human brain is the natural realization of a quantum computer. I know that I can’t compete with the most primitive calculator in doing arithmetic. But I haven’t yet heard of a computer that reproduced Archimedes’ derivation. Now suppose our achievements in the implementation of quantum computers lead to a conventional (classical) computer which will be able to derive the GR equations. What difference does it make what we call it? (“what we can say about nature”).

    THAT is evolution, THAT is life, THAT is science.

    I don’t fear the “competition”. If A. Einstein used about 6% of his brain’s capacity to axiomatically derive the GR equations of motion, we have a long way to go, since we are integrated into the same feedback loop.

    Regards, Dany.

    P.S. Hi Cynthia,

    “The last time I checked, we, as humans, were still primarily made of carbon, not silicon. So unless we've gone extinct, then models, theories, and above all, the scientific method won't go extinct, either. But I guess it's possible that we could evolve into silicon-based lifeforms, thus enabling us to uncover Nature with raw data alone.”

    My son (the same one that likes the pigeons/squabs) has now experimentally implemented a symbiosis between PS I and GaAs (to have the right to call himself a master in biology and physics).

  51. Wired is wrong,

    http://www.builtonfacts.com/2008/07/08/approximately-a-power-series

    "Picard’s great theorem guarantees that the limit will approach any given value depending on which direction you approach from."

    Data will do you no good at all.
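
    As a minimal numerical sketch of that directional dependence (assuming the function under discussion is the textbook example f(z) = exp(-1/z^2), which has an essential singularity at z = 0; the linked post may use a different example), one can evaluate it along three rays into the origin and get three entirely different “limits”:

        import cmath

        # f(z) = exp(-1/z^2) has an essential singularity at z = 0:
        # what it tends to as z -> 0 depends on the direction of approach.
        def f(z):
            return cmath.exp(-1.0 / z**2)

        t = 0.05  # distance from the origin
        for label, direction in [("real axis", 1 + 0j),
                                 ("imaginary axis", 1j),
                                 ("45 degrees", cmath.exp(1j * cmath.pi / 4))]:
            print(label, "->", f(t * direction))

        # real axis:      ~0 (flat to all orders; every real Taylor coefficient vanishes)
        # imaginary axis: exp(+1/t^2), astronomically large
        # 45 degrees:     modulus 1, wildly oscillating phase

    Along the real axis the function is numerically indistinguishable from zero, which is the sense in which no pile of sampled data alone would recover it.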

  52. A lot of interesting arguments, and yet the reason the Wired idea is wrong is very simple: it assumes that “the data are out there”, while in fact data result from how and where we look. It assumes that data drive hypotheses, and while this may be true to some degree, it is hypotheses that drive what kind of data are collected.

    Without a hypothesis, what are you going to measure in order to even correlate something with something?

    Infants that haven't yet got the notion of object permanence (an occluded object continues to exist) will fail to look behind the occluding screen for the object. They just haven't got the hypothesis that allows them to collect the right data.

    I think the Wired article is dumb beyond belief. I like the discussion here much better.

  53. To me this "Google" approach is an extension of the cycle of knowledge.

    Today's research is largely hypothesis-driven, which relies entirely on pre-existing knowledge. The problem here is that in this way nothing new can be discovered, because the answer is always already formulated in the question.

    Then there is the non-hypothesis-driven approach. Most great discoveries were actually INITIATED in this way. Darwin had no hypothesis in mind when he started his investigations. Mendel had no hypothesis to test. Nüsslein-Volhard had no hypothesis to test. Newton also had no hypothesis – how could he?

    Non-hypothesis-driven science starts with an idea or theory and, after the right experiments/data analysis, allows one to formulate a hypothesis which can then be tested. The problem with the non-hypothesis-driven approach is that one can still only find out things one can formulate and think of (after Wittgenstein: "The limits of my language mean the limits of my world").

    Then there is a third approach, which (not surprisingly) is not at all considered part of the cycle of knowledge: the accident. The list of accidental discoveries is long, famous examples being radioactivity, penicillin, and (closer to my own research in developmental biology) the Spemann organizer. The researchers had something completely different in mind when either the outcome of the experiment surprised them or something went wrong.

    In these cases there was obviously no theory at hand. The researchers were just sufficiently awake to formulate one and start the cycle of knowledge.

    And this is where the “Google” approach (or whatever Anderson calls it) kicks in. The only thing this Google approach does is force luck by sifting through gargantuan amounts of data, providing previously unthought-of correlations and – based on those – coming up with theories. This is perfectly valid and will surely lead to many, many new discoveries which will allow us to formulate new theories.

    Very excitingly to me, this approach allows us to integrate “luck” into the cycle of knowledge and provides a rational means of entering it. It surely does not put “old-fashioned” theory and hypothesis aside.
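
    To make that “forcing of luck” concrete, here is a minimal sketch in Python, with deliberately made-up data (the sample sizes and the threshold are arbitrary illustrative choices, not from any real study), of why a mined correlation can only start the cycle of knowledge rather than finish it:

        import numpy as np

        # Screen a table of measurements for strong pairwise correlations,
        # the way a data-mining pass might. The data here are pure noise.
        rng = np.random.default_rng(0)
        n_samples, n_vars = 100, 500
        data = rng.standard_normal((n_samples, n_vars))

        corr = np.corrcoef(data, rowvar=False)   # n_vars x n_vars correlation matrix
        i, j = np.triu_indices(n_vars, k=1)      # each unordered pair of variables once
        hits = np.abs(corr[i, j]) > 0.35         # arbitrary screening threshold

        print("pairs screened:", len(i))                        # 124750
        print("'discoveries' in pure noise:", int(hits.sum()))  # typically a few dozen

    Even pure noise yields some strong-looking correlations when enough pairs are screened, so each unthought-of correlation is a candidate hypothesis that still has to survive an independent test, exactly the hand-off into the cycle of knowledge described above.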


