

The Ever Evolving Relation between Physics and Mathematics

This was entered in a contest by the Foundational Questions Institute (fqxi.org) to explain the mysterious connection between physics and mathematics. It did not win. It presents a general survey of how the relation evolved.

Edward MacKinnon

Eugene Wigner’s famous challenge to explain “The Unreasonable Effectiveness of Mathematics in the Natural Sciences” (Wigner 1967) set a precedent for how this should be done. First explain what mathematics is, next explain what physics is, and then consider how the two are related. This challenge can be given different interpretations. In a Platonic interpretation, mathematics is a system of objective truths, some of which we have discovered. The correspondence between this idealized mathematics and physical reality as it exists objectively is indeed puzzling. As Wigner expressed it: “It is difficult to avoid the impression that a miracle confronts us here.” (Wigner, p. 229). In the polar opposite, an operational perspective, mathematics supplies a box of tools which physicists can use to solve problems. Since the tools are plentiful, and more can be fashioned as needed, the correspondence is not particularly problematic. However, it becomes puzzling when the mathematics used leads to physical consequences that physicists never anticipated. This article will attempt to steer a middle path between these two extremes. The long and winding path stretches from the origins of science to contemporary cosmology. It presents a schematic account of the way scientists and others came to use mathematics in physical accounts. It concludes with some reflections on the role of mathematics in physics. To make the historical connection between mathematics and developing science intelligible, a third term is needed: language. Mathematics was used to fit reality as categorized and described. The basic relation was between physical concepts and mathematical concepts. Though concepts cannot be reduced to the terms that express them, any analysis of concepts rides on analyses of linguistic usage.
The Coevolution of Mathematics and Classical Physics

Arithmetic posed a problem for the scientists of early Greece and Alexandria. The representation of numbers by letters made even the elementary processes of addition, subtraction, and multiplication difficult, while division was prohibitively difficult. They focused on geometry as a tool. Neither these scientists nor their later Islamic and Hindu counterparts ever developed the idea that qualities could admit of quantitative representations, an essential step in developing mathematical relations and laws. The following account of how the quantification of qualities was introduced into science summarizes a more detailed development presented elsewhere (MacKinnon 2012, chap. 2). Attaching quantities to qualities has a counterintuitive nature that received a systematic justification in Aristotle’s ordered list of categories: substance, quantity, quality, relation, etc. This represented a conceptual ordering grounded in normal linguistic usage. Thus ‘red’ cannot be properly predicated of an idea, an emotion, or a point. It can only be predicated of an extended substance. Any talk of the quantity of a quality perverts the natural ordering. This conceptual ordering carried over to measurements. Aristotle insisted that everything must be measured “by some one thing homogeneous with it, units by a unit, horses by a horse, and similarly times by some definite time” (Aristotle, Physics 223b14). The Scientific Revolution was led by people who believed that mathematics could be used in the treatment of all physical qualities. What occasioned this transition? Medieval scholastic philosophers and theologians accepted Aristotle’s categorical system, which they learned from Boethius and used in the nominalism/realism debates. Yet accepted theology seemed to require the concept of the quantity of a quality. Grace was regarded as a quality, albeit a supernatural one.
The degree of sanctifying grace, or lack thereof, determines one’s place in heaven or hell, something vividly depicted in Dante’s Divine Comedy. Thomas Aquinas first introduced the idea of the quantity of a quality in a theological context (Summa Theologiae, I, q. 42, a. 1, ad 1). He distinguished between quantity per se, or bulk quantity, and quantity per accidens, or virtual quantity. Virtual quantity, or the quantity of a quality, can have magnitude by reason of the effect of its form. Thus one with greater strength can lift heavier rocks. Subsequently the Nominalists changed the issue from a metaphysical concern about the intensification and remission of qualities to an essentially linguistic issue. What are the proper criteria for predicating ‘strong’ or ‘weak’ of the qualities a thing has? The mathematization of motion was developed chiefly by the 'Calculators' of Merton College in Oxford and later by Nicole Oresme and others at the University of Paris. These calculations were based more on abstract considerations rooted in the system of categories than on empirical data. In these works there is no mention of actual measurements or units or, until the sixteenth century, any application to falling bodies. The assignment of values was arbitrary. What concerned the Calculators was the ratio of quantities of qualities. However, this treatment of the intensification and remission of qualities introduced a conceptual language that made discussions of measurement possible, and stimulated mathematical analyses of functional relations and of the representation of varying qualities. This is most clearly seen in Oresme’s graphical representation of the Merton theorem of uniformly difform motion. The quantity of motion under the line, AD, representing uniformly difform motion (or constant acceleration) is equal to the quantity of motion under the line EG, representing uniform motion.
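The Merton theorem that Oresme's diagram encodes can be checked numerically: the "quantity of motion" under the line representing uniformly difform motion equals that under the line representing uniform motion at the mean speed. A minimal sketch in Python (the acceleration and duration are arbitrary illustrative values, not drawn from the medieval texts):

```python
import numpy as np

# Uniformly difform motion: speed grows linearly, v(t) = a*t (Oresme's line AD).
# Uniform motion at the mean speed: v = a*T/2 throughout (Oresme's line EG).
a, T = 2.0, 10.0                          # arbitrary illustrative values
t = np.linspace(0.0, T, 100_001)
v_difform = a * t
v_uniform = np.full_like(t, a * T / 2)

# "Quantity of motion" = area under each line (trapezoidal sums).
dt = np.diff(t)
area_difform = np.sum((v_difform[:-1] + v_difform[1:]) / 2 * dt)
area_uniform = np.sum((v_uniform[:-1] + v_uniform[1:]) / 2 * dt)

print(area_difform, area_uniform)         # both are approximately a*T**2/2 = 100
```

In the continuum the equality is exact: the area under v(t) = at from 0 to T is aT²/2, which is precisely the mean speed aT/2 sustained for the time T.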
Galileo adapted this representation to prove that the distance traversed by a falling body is proportional to the square of the time. This representation presents new conceptual problems. Thus, F represents the instantaneous velocity at the midpoint. However, this must be an infinitesimal distance divided by an infinitesimal time. Development of this line of thought led to Newton’s fluxions. Further, the quantity of motion for difformly difform motion is the area under an irregular curve. Attempts to determine such quantities led to Leibniz’s integral calculus. After the fall of Constantinople (1453) some Greek scholars moved to Italy and taught Neo-Platonic philosophy in the academies established by noble families. They studied Plato’s Timaeus, where the astronomer Timaeus presents an extended explanation of the order of the universe. This extends from the Demiurge fashioning preexistent matter in accord with mathematical forms to an explanation of illness and disease as resulting from a lack of proportion among the four elements. A fusion of this idea of the universe as an embodied mathematical system with the biblical account of creation supported the idea that physical reality could be understood in terms of ideal mathematical forms. Qualities could be given quantitative expressions and be assigned numerical values. Originally, the assignment of values was arbitrary, since the mathematics treated ratios of quantities. Calculus, as it was developed by Newton, Leibniz, and their immediate successors, relied on physicalistic notions of motion, acceleration, and vanishing quantities. Newton justified treating curves as continuous on the grounds that a curve is generated by a moving point. The idea of a function was shaped by motion studies, leading to the Euler-Lagrange definition of a function as a relation between a dependent and an independent variable. Morris Kline (1972, p.
616) summarized the situation: “Far more than in any other century, the mathematical work of the eighteenth was directly inspired by physical problems”. Basic concepts in physics were constrained by the requirement that they could be represented mathematically. Before presenting his axioms Newton carefully defines basic physical concepts: quantity of matter; quantity of motion; vis insita; impressed force; centripetal force, its absolute quantity, its accelerative quantity, and its motive quantity. The mathematical treatment of heat was blocked by the fact that heat was not an additive property. A distinction between heat and temperature, the introduction of thermometers, a distinction between overt and latent heat, and the realization that different substances have different specific heats enabled a mathematical treatment of heat phenomena. In the period from 1600 to 1800 the relation between physics and mathematics was one of coevolution. Descartes, Newton, Leibniz, Euler, the Bernoullis, Lagrange, Laplace, Ampère, Fourier, and many more made contributions to both mathematics and physics. The table illustrates the branches of mathematics that emerged from physics.

Physics → Mathematics
Mechanics → Calculus, Differential Equations, Calculus of Variations
Acoustics, Thermodynamics → Fourier series
Thermodynamics, Electrodynamics → Complex analysis
Elasticity, Hydrodynamics, Electrodynamics → Partial differential equations

The co-evolution of physics and mathematics reached a branching point in the early nineteenth century. Two men led the differing developments. Laplace pioneered a new style of mathematical physics. In place of Lagrange's analytic mechanics Laplace developed a style that Poisson later dubbed 'physical mechanics'. Even by the standards of his time his math was not rigorous. He used approximations and power series in which he regularly dropped terms that were deemed insignificant on physical grounds. He treated math as a tool, not a system.
His younger contemporary, Cauchy, instituted a program of developing calculus with no explicit reliance on physical notions. This led to the abandonment of physicalistic reasoning and geometric foundations in favor of analysis, and eventually set-theoretic foundations. To a man, classical physicists relied on the older Euler-Lagrange mathematics rather than a calculus based on analysis or set theory. The only woman prominently involved, Emmy Noether, was too much a disciple of Hilbert to accept such sloppy mathematics. The reasons for this general reliance seem clear. Physicists framed problematic situations in terms that had established mathematical expressions. Terms like ‘force’, ‘acceleration’, ‘momentum’, ‘energy’, ‘temperature’, ‘work’, ‘entropy’, ‘charge’, ‘current’, and many more had gradually been shaped to meet two constraints. They must be part of a coherent system that is capable of describing experiments, reporting results, and supplying a conceptual framework for interpreting mathematically formulated theories. The mathematical constraint is that the mathematics used in a theory should have a basic consistency. Experimenters even more than theoreticians relied on the older mathematics. A standard technique in experimental practice is to vary some quantity under the experimentalist’s control and discover what changes this produces in some other variable. The goal was to express the relation through a function. So they relied on the notion of a function as the relation between a dependent and an independent variable. The idea of a function as a mapping of every element, s, of a domain, S, onto an element, f(s), of a target domain, T, did not support a reliance on physical intuition.

Quantum Perturbations

The development of quantum physics introduced new foundations for physics. One aspect of this revolution is less studied.
The three-way binding of physical reality, language, and mathematics characterizing classical physics began to unravel with the advent of quantum physics. Physicists and philosophers generally treat this change by developing interpretations of quantum mechanics, while neglecting the role of language. We will begin by considering Bohr’s teaching on the role of language in quantum physics. This summary is based on more detailed developments given in MacKinnon 1985, 1994, and 2012, pp. 131-147. Bohr consistently treated the mathematical formalism of quantum mechanics as an inferential tool, not a theory to be interpreted. “Its physical content is exhausted by its power to formulate statistical laws governing observations obtained under conditions specified in plain language.” (Bohr 1963, p. 12). He never invokes the reduction of the wave packet featured in the ‘Copenhagen interpretation’ of quantum mechanics. (See Howard 2004.) A simplified summary brings out the basic ideas. In the mid 1920s physicists were troubled by apparent contradictions in reports of atomic experiments. Thus reports of interference and diffraction experiments relied on the wave properties of light. X-ray reports and Compton scattering seemed to require Einstein’s light-quantum (or photon) hypothesis. Scattering of electrons from nickel crystals and other substances extended this wave-particle duality to electrons. Bohr eventually resolved such problems by something of a Gestalt switch from properties of bodies to the limits of valid applicability of the concepts used in experimental reports. The crucial concepts involved, ‘position’, ‘momentum’, ‘energy’, ‘frequency’, and ‘wave-length’, are classical concepts. These are embedded in what Bohr calls ‘plain language’, i.e. the specialized extension of ordinary language used in physics. Any use of this extended ordinary language to describe experiments and report results must be unambiguous.
An unlimited extension to quantum contexts produced the contradictions embodied in talk of wave-particle duality. Bohr argued that all such contradictions could be avoided by restrictions on linguistic usage. Thus the cluster of concepts centering on the concept ‘particle’ could be used in some situations, while the complementary cluster, centering on ‘wave’, could be used in other contexts. Which is appropriate is effectively determined by the choice of the experimenter. Idealized experiments, such as the single-slit/double-slit experiment or Einstein’s photon from a box suspended by a spring, are apt tools for clarifying the conceptual issues. This doctrine of complementarity changes the classical reality-language-mathematics interrelation. If the classical terms that physicists had shaped to express appropriate aspects of physical reality could in quantum contexts only be used in restricted experimental situations, then any simple concept-reality correspondence was no longer tenable. Nevertheless, Bohr relied on descriptive reports couched in classical terms as the basis for setting up and interpreting the mathematics of quantum mechanics. To solve particular problems one sets up a Hamiltonian or Lagrangian using classical terms. Then one replaces classical terms by quantum operators, such as p → −iħ∂/∂q and E → iħ∂/∂t. Dirac and Schwinger extended this method by taking measurements expressed in classical terms as a basis for developing the mathematical formulation of quantum mechanics. (See MacKinnon 2007.) The functional adequacy of this measurement-based interpretation of quantum mechanics is limited. It does not cover advances in particle physics and cosmology. There is a more fundamental problem concerning the relation of mathematical formulations to reality. Quantum mechanics is the most successful theory in the history of physics. Its correspondence with reality must go beyond its functional use as an inferential tool.
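The substitution procedure described above (set up a classical Hamiltonian, then replace p by the operator −iħ∂/∂q) can be illustrated symbolically. The following sketch applies it to the harmonic oscillator; the Gaussian trial function and the zero-point energy are standard textbook material, not taken from this article:

```python
import sympy as sp

q = sp.symbols('q', real=True)
hbar, m, omega = sp.symbols('hbar m omega', positive=True)

# Classical Hamiltonian H = p**2/(2m) + (m*omega**2/2)*q**2; quantize by
# replacing p with the operator -i*hbar*d/dq acting on a wave function f(q).
def H(f):
    return -hbar**2/(2*m) * sp.diff(f, q, 2) + sp.Rational(1, 2)*m*omega**2*q**2*f

# Gaussian trial function, the oscillator's ground state.
psi0 = sp.exp(-m*omega*q**2/(2*hbar))

# H psi0 = E psi0: the polynomial terms cancel, leaving the zero-point
# energy E = hbar*omega/2.
E = sp.simplify(H(psi0) / psi0)
print(E)
```

The point of the sketch is only the mechanical character of the recipe: the Hamiltonian is written in classical terms, and the quantum equation is obtained by operator substitution.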
There have been many attempts to give this formalism a realistic interpretation. None has won general acceptance. Here I will take a more modest approach and consider how some distinctively quantum equations were developed. Many of the key breakthroughs in developing quantum formulas have a common feature. Someone developed an equation by guesswork or heuristic reasoning and then struggled to determine the physical significance of the formula developed. Max Planck tried to relate the thermodynamic concept ‘entropy’, his dissertation topic, to radiation. He eventually derived Wien’s radiation law from the formula d²S/dU² = −α/U, where S is entropy, U energy, and α a constant. Improved measurements indicated a significant discrepancy between Wien’s law and the frequency distribution in black-body radiation. Planck modified his derivation by beginning with the formula d²S/dU² = −α/[U(β + U)], where β is a new constant. This led to an expression for the radiant energy which, in modern notation, reads E = (8πhν³/c³)/(e^(hν/kT) − 1). Planck proposed this at a meeting in Berlin on October 19, 1900. The experimentalist H. Rubens spent the evening checking Planck’s formula against his measured results and found agreement on every point. So too, after correcting an error, did Lummer and Pringsheim. In the next few months Planck developed three different derivations of his equation, none of them quite adequate. He originally resisted Einstein’s interpretation of energy quantization. Louis de Broglie’s thesis presented a flawed mathematical formulation of wave-particle duality (MacKinnon 1976). His famous formula, λ = h/p, occurs only once, as an approximate expression for the length of stationary phase waves in a gas under equilibrium conditions, not as an expression of wave-particle duality. Though Schrödinger was stimulated by de Broglie’s work, he suppressed any reliance on wave-particle duality, or on a wave interpretation, in the first of his quantization papers, which we will cite as Q1, . . ., Q4 (MacKinnon 1980).
In Q1 he used the Hamilton-Jacobi equation to develop an equation, H(q, (K/ψ)∂ψ/∂q) = E. To give this a physical interpretation he postulated that the ψ-function must be real, a condition that was not met in his subsequent papers. His solution of the Schrödinger equation for the hydrogen atom was developed with such elegance and precision that it soon became a stable part of physics. Yet in Q2 he referred to the derivation of his equation as “an unintelligible transformation for an incomprehensible transition”, an elegant term for a lucky guess. The way Heisenberg developed his quantum formulation is still disputed. My reconstruction, which Heisenberg accepted (his letter to me commenting on the first draft of MacKinnon 1977 is available in Sources for the History of Quantum Physics), argued that his breakthrough relied on the forward Bohr Correspondence Principle, a way of using classical formulations to guess quantum formulations, and on the virtual oscillator model of atoms. Bohr, Kramers, and Slater had developed this by replacing a collection of radiating atoms by a collection of virtual oscillators. The conclusion this supported, that energy and momentum were only conserved statistically, was quickly refuted by two different experiments. Heisenberg’s adaptation of this model, which he did not explicitly cite, necessitated a non-commutative algebra. Thus a transition from a state n to a state n−2, symbolized by a(n, n−2), could be treated as a virtual transition from state n to state n−1 followed by a second transition from state n−1 to n−2. To keep the transitions in a proper order Heisenberg postulated that a(n, n−1) a(n−1, n−2) ≠ a(n−1, n−2) a(n, n−1). Only later did Born give this a matrix formulation. The next great mathematical innovation in quantum mechanics was Dirac’s relativistic wave equation. Its origin was explained in an interview with Thomas Kuhn: “It came from just playing with the equation rather than trying to introduce the right physical ideas.
A great deal of my work is just playing with equations and seeing what they give”. (Sources for the History of Quantum Physics, Interview 3, p. 5). The equation he was playing with was the Klein-Gordon equation. His play took the form of introducing novel 4 × 4 matrices that are not functions of space and time to factor the equation into two linear equations. In his Nobel prize address Steven Weinberg (1980) revealed how he developed his unified theory of electromagnetic and weak interactions. He was trying to develop a theory of strong interactions. He relied on quantum theory, special relativity, and renormalization. The novel feature introduced was the extension of space-time symmetries to internal symmetries. This required a gauge transformation and broken symmetries. The account did not work, because it required massless bosons for the interactions. “At some point in the fall of 1967, I think while driving to my office at MIT, it occurred to me that I had been adapting the right ideas to the wrong problem. It is not the ρ meson that is massless: it is the photon.” (Ibid., p. 517). Switching the formulation to the right problem led to electroweak unification. We complete this survey of serendipitous mathematics-physics connections with a brief look at string theory. In 1968 Gabriele Veneziano tried to guess a formula that would give the probabilities of scattering of two particles at different energies and angles. He came across the beta function that Euler had developed some two centuries earlier. Only much later was this interpreted as describing an ultra-small string. The historical account is from Weinberg (1992, p. 213). The formula is given in Kaku (1993, p. 714). Edward Witten, a leading string theorist, summarized this situation by saying that “string theory is a part of twenty-first century physics that fell by chance into the twentieth century”.
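Dirac's factorization of the Klein-Gordon equation, mentioned above, can be verified numerically: any four 4 × 4 matrices satisfying the anticommutation relations {γ^μ, γ^ν} = 2η^{μν}I make the linear operator γ^μ p_μ square to the Klein-Gordon operator. A sketch using the standard Dirac representation (with ħ = c = 1; the numerical momentum values are arbitrary illustrative choices):

```python
import numpy as np

# Pauli matrices and 2x2 blocks.
I2 = np.eye(2, dtype=complex)
Z2 = np.zeros((2, 2), dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Dirac representation of the gamma matrices.
g0 = np.block([[I2, Z2], [Z2, -I2]])
gs = [np.block([[Z2, s], [-s, Z2]]) for s in (sx, sy, sz)]
gammas = [g0] + gs
eta = np.diag([1.0, -1.0, -1.0, -1.0])   # metric signature (+, -, -, -)

# The anticommutation relations {gamma_mu, gamma_nu} = 2*eta_mu_nu * I.
for mu in range(4):
    for nu in range(4):
        anti = gammas[mu] @ gammas[nu] + gammas[nu] @ gammas[mu]
        assert np.allclose(anti, 2 * eta[mu, nu] * np.eye(4))

# Hence the linear operator squares to the Klein-Gordon operator:
# (gamma^mu p_mu)^2 = (E**2 - |p|**2) * I.  Arbitrary illustrative momenta:
E, px, py, pz = 3.0, 1.0, 0.5, 0.25
slash_p = E * g0 - px * gs[0] - py * gs[1] - pz * gs[2]
assert np.allclose(slash_p @ slash_p, (E**2 - px**2 - py**2 - pz**2) * np.eye(4))
```

The anticommutation relations are the whole content of the trick: once they hold, the square of the linear operator reproduces E² − p², which is why the factorization required matrices rather than numbers.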
Mathematics, Physics, and Reality

The idea of an objective correspondence between physical reality, as it exists independent of our knowledge of it, and mathematical forms has an almost mystic appeal. In Plato’s perspective, the idea was intelligible. His allegory of the cave is supplemented by the analogy of the divided line. The lower two levels, concerned with opinion, treat images through conjectures and familiar objects through beliefs. The man who escapes from the cave can ascend to the two higher levels concerned with knowledge. The third level, concerned with understanding of mathematical forms and structural relations, follows the method of hypotheses. The fourth and highest level is concerned with the knowledge, achieved by dialectical reason, of the ideal forms of the good, the true, and the beautiful. Since the lower levels participate in these ideal forms it makes sense to speak of a correspondence between ideal mathematical forms and objective reality. Mathematics has advanced far beyond the figures and numbers that Plato’s contemporaries could treat. Might the Platonic ideal be redeveloped in terms of an objective relation between mathematics at a foundational level and physics at a foundational level? This seems highly unlikely. In the late nineteenth century, set theory was proposed as a foundation. It supplied a unified account of the abstract theoretical presuppositions of mathematics as a whole. Almost all the objects mathematicians treat can be described as sets. However, a series of twentieth century developments undercut the idea that mathematics rests on a foundation. Gödel’s incompleteness theorem showed that it is not possible to prove, by established logical principles, the consistency of any axiomatic system strong enough to generate the natural numbers. Alonzo Church’s work on recursive functions showed that there is no general decision procedure capable of determining in a finite number of steps whether a mathematical statement is provable.
Turing’s machine, an idealized mechanical procedure for determining whether a proof violates inference rules, leads to the halting problem. The Turing machine stops when it solves a problem. If it does not solve the problem, it does not halt. There is no general procedure for determining in advance whether it will ever halt. Paul Cohen showed that the two most disputed axioms of Zermelo-Fraenkel set theory, the axiom of choice and the continuum hypothesis, are independent of the other axioms. This allows for various possible axiomatic set theories, no one of which is demonstrably superior. Finally, the Löwenheim-Skolem theorem showed that any first-order axiomatic system with an infinite model admits of radically different interpretations. Set theory is not a foundation for mathematics in the sense that it supports the edifice. The edifice remained secure while the purported foundations trembled. Other foundations have been developed. In Abraham Robinson’s (1966) non-standard analysis the real numbers, ℜ, are embedded in a much larger set, *ℜ, that includes infinitely large numbers whose inverses are infinitesimals, something standard set theory does not allow. Smooth infinitesimal analysis and category theory also supply alternative foundations. None of these was introduced on the grounds that it fits reality better than its competitors. They were developed to improve the coherence of the sprawling edifice that is modern mathematics. Modern mathematics does not supply any foundation that fits the idea of an objective correspondence between mathematics and physical reality on a foundational level. Nor does physics supply the type of foundation needed. The projected Theories of Everything would not explain very much. The standard model of particle physics relies on experimental information for the values of some basic parameters such as masses and coupling constants. The hope is that a TOE would furnish these values. However, if theories of the multiverse are correct, the values of these parameters are cosmic coincidences.
In the anticipated schematism, a TOE is at the base of a nested series of effective theories, each representing some aspect of reality within a specified energy range. The hoped-for natural bridge linking mathematics and physical reality on a foundational level is missing support on both ends of the bridge. The established correspondence between mathematics and physical reality rests chiefly on the successful use of mathematics in physical theories. In my view, the only intelligible basis for a deeper connection comes from an extrapolation of the methods that have proved successful in physics. A schematic view of the previous historical survey supplies one basis for such an extrapolation. The survey begins with structures in language. Categorization is a basic feature of all natural languages. It provides maximum information with the least cognitive effort. Coding an object or event as a member of a class supplies a basis for drawing inferences (Rosch 1999). Basic categorical structures in natural languages are determined by familiar objects: tree, dog, man, bird, with prototypes serving as exemplars. A robin is a bird; a penguin is kind of a bird. Subordinate categories are determined by differences in objects: a pear tree, an oak tree, a pine tree. There are various types of superordinate categories. These are underdetermined by the things classified and require some sort of supplemental specification. There is a vertical ordering, beagle < dog < mammal < animal, determined by some system of classification, such as a theory of species and genera. There is a superordinate classification of events, e.g., getting dressed, going to the market, fixing dinner. The details that figure in these classes of events are culture specific. This categorical system supports a variety of informal, or material, inferences. If they are formalized they are represented by enthymemes. There is smoke. There must be fire. This dog is dangerous. He is a pit bull.
The ubiquitous role that categorization plays in language usage and informal inferences becomes apparent when human reasoning is contrasted with computer reasoning. Computers are vastly superior to normal humans in making calculations, following complex nested rules, and processing data. Yet they are distinctly inferior in pattern recognition and common sense inferences. Suppose that an AI robot is instructed to go into a crowded room, pick up the yellow ball, put it in the small blue box, and avoid bumping into anything. It would begin by scanning the whole room and then attempting to impose a variety of geometrical forms on selected portions. Ask it some simple common sense questions. Can you push things with a string? Can you pull them with a stick? Can a crocodile play basketball? Before answering, the computer would begin processing all the information it has on ‘pull’, ‘things’, and ‘string’. Confront a normal seven-year-old with the same problems and they are quickly solved. Thanks to the complex connections among perception, cognition, and locomotion, she quickly recognizes the objects referred to and easily navigates her way around obstacles. The common sense questions pose no challenge because she can easily infer what strings, sticks, and crocodiles can and cannot do. The lived world is such a highly structured reality that it is almost impossible to get to reality independent of human structuring. Husserl’s bracketing and the tortured dialectic of Heidegger’s probe into the meaning of being bear witness to the difficulties involved. This conceptual structuring carries over to every level of human knowledge. Theories of fundamental particles or of cosmology are never tested by confrontation with objective reality or subjective impressions. They are tested against structured data. The development of physical science relied on the imposition of mathematics on reality as conceptually structured.
The primitive forms involved using numbers on things that could be distinguished as units and geometrical forms on things perceived, or idealized, as circles, squares, and triangles. Such mathematics, once launched, invited further development. Classical Greeks and their immediate successors could not go far in arithmetic because of their representation of numbers by arbitrary letters. They had notable success with geometry. The next development considered was the mathematical representation of quantities of qualities. This initiated a development that culminated in the Scientific Revolution. A key feature of the new scientific method was the introduction of carefully defined physical concepts that fit into a coherent representation and admitted of mathematical representation. The expansion of the method to new domains like heat and electricity required the introduction of new concepts that supported the appropriate mathematical representation. Theories using these concepts supplied idealized models of aspects of reality. As indicated in the earlier chart, these advances led to new branches of mathematics that eventually broke away from their physical moorings. Physicists who mastered these advances developed an intuitive approach to physics grounded in the correspondence between classical reality, or an informal model of reality structured by classical concepts, and mathematical expressions. Relativity theory and quantum physics contravened this simple correspondence. The sustained opposition to special relativity was based in large part on a reliance on the intuitive notion of time. The new mathematics introduced into quantum theory defied any simple correspondence interpretation. What of physical intuition? In Q4 Schrödinger argued that his wave interpretation was superior to a particle interpretation partially because of its intuitive appeal. Heisenberg’s indeterminacy paper gave ‘intuition’ an operational meaning. 
An intuitive understanding of a theory means a qualitative understanding of its consistency and its experimental consequences. Physicists still relied on intuition, but a new, sharpened form of intuition, not on the way quantitative concepts are given mathematical expression. Crucial advances came from introducing new formulas, on whatever grounds, and then testing their consequences. Future advances in physics will undoubtedly lead to mathematical formulations that do not yet exist. Is it meaningful to think of these as somehow already existing and waiting to be discovered? Previously we considered and rejected the idea that mathematical forms have an objective existence independent of physical reality. The only objective alternative seems to be that these forms have some sort of potential existence in physical reality. In the earlier stages of science, concepts of circles and squares were thought to be abstracted from the wheels and boxes in which they were potentially present. Such an explanation may explain the genesis of the concepts ‘circle’ and ‘square’ as idealizations of the structured reality perceived. This does not entail the conclusion that the ideal forms had some sort of existence in a wheel or a box. At a much later stage scientists fit mathematical forms to idealizations of structured reality. Kepler analyzed Brahe’s and his own data to conclude that the planets travel in ellipses. The data was good enough to support this claim, but not good enough to register the deviations from ellipticity in all planetary orbits. Modern physicists extended external symmetries to internal symmetries. But the basic symmetries found in nature are broken symmetries. The to-be-discovered mathematical forms can function in physics only if they can be given a physical interpretation. Any such interpretation is grounded in a structured conceptualization of reality.
The applicability of mathematics to reality hinges on human ingenuity, not on the objective existence of mathematical forms.

References

Bohr, N. (1963). Essays 1958-1962 on Atomic Physics and Human Knowledge. New York: Wiley.
Howard, D. (2004). Who Invented the "Copenhagen Interpretation": A Study in Mythology. In S. D. Mitchell (Ed.), Philosophy of Science: Proceedings of the 2002 Biennial Meeting (pp. 669-682). East Lansing, MI: Philosophy of Science Association.
Kaku, M. (1993). Quantum Field Theory: A Modern Introduction. New York: Oxford University Press.
Kline, M. (1972). Mathematical Thought from Ancient to Modern Times. New York: Oxford University Press.
MacKinnon, E. (1976). De Broglie's Thesis: A Critical Retrospective. American Journal of Physics, 44, 1047-1055.
MacKinnon, E. (1977). Heisenberg, Models, and the Rise of Matrix Mechanics. Historical Studies in the Physical Sciences, 8, 137-188.
MacKinnon, E. (1980). The Rise and Fall of the Schrödinger Interpretation. In Studies in the Foundations of Quantum Mechanics (pp. 1-57). East Lansing, MI: Philosophy of Science Association.
MacKinnon, E. (1985). Bohr on the Foundations of Quantum Theory. In A. P. French & P. J. Kennedy (Eds.), Niels Bohr: A Centenary Volume (pp. 101-120). Cambridge, MA: Harvard University Press.
MacKinnon, E. (1994). Bohr and the Realism Debates. In J. Faye & H. Folse (Eds.), Niels Bohr and Contemporary Physics (pp. 279-302). Dordrecht: Kluwer.
MacKinnon, E. (2007). Schwinger and the Ontology of Quantum Field Theory. Foundations of Science, 12, 295-323.
MacKinnon, E. M. (2012). Interpreting Physics: Language and the Classical/Quantum Divide. Dordrecht, New York: Springer.
Robinson, A. (1966). Non-Standard Analysis. Amsterdam: North-Holland Publishing Company.
Rosch, E. (1999). Principles of Categorization. In E. Margolis & S. Laurence (Eds.), Concepts: Core Readings (pp. 189-206). Cambridge: The MIT Press.
Schrödinger, E. (1926).
Quantisierung als Eigenwertproblem (Erste Mitteilung) [Quantization as an Eigenvalue Problem]. Annalen der Physik, 79, 361-376.
Weinberg, S. (1980). Conceptual Foundations of the Unified Theory of Weak and Electromagnetic Interactions. Reviews of Modern Physics, 52, 515-523.
Weinberg, S. (1992). Dreams of a Final Theory (1st ed.). New York: Pantheon Books.
Wigner, E. P. (1967). The Unreasonable Effectiveness of Mathematics in the Natural Sciences. In Symmetries and Reflections (pp. 222-237). Bloomington, IN: Indiana University Press.