
Monday 9 January 2023

The Philosophy of Science


The solar eclipse of May 29, 1919, forced a rethink of fundamental laws of physics

By Keith Tidman


Science aims at uncovering what is true. And it is equipped with all the tools — natural laws, methods, technologies, mathematics — that it needs to succeed. Indeed, in many ways, science works exquisitely. But does science ever actually arrive at reality? Or is science, despite its persuasiveness, paradoxically consigned to forever wending closer to its goal, yet not quite arriving — as theories are either amended to fit new findings, or they have to be replaced outright?

It is the case that science relies on observation — especially measurement. Observation confirms and grounds the validity of contending models of reality, empowering critical analysis to probe the details. The role of analysis is to scrutinise a theory’s scaffolding, to better visualise the coherent whole, broadening and deepening what is understood of the natural world. To these ends, science, at its best, has a knack for abiding by the ‘law of parsimony’ of Occam’s razor — describing complexity as simply as possible, with the fewest suppositions needed to get the job done.

To be clear, other fields attempt this self-scrutiny and rigour, too, in one manner or another, as they fuel humanity’s flame of creative discovery and invention. They include history, languages, aesthetics, rhetoric, ethics, anthropology, law, religion, and of course philosophy, among others. But just as these fields are unique in their mission (oriented in the present) and their vision (oriented in the future), so is science — the latter heralding a physical world thought to be rational.

Accordingly, in science, theories should agree with evidence-informed, objective observations. Results should be replicated every time that tests and observations are run, confirming predictions. This bottom-up process is driven by what is called inductive reasoning: where a general principle — a conclusion, like an explanatory theory — is derived from multiple observations in which a pattern is discerned. An example of inductive reasoning at its best is Newton’s Third Law of Motion, which states that for every action (force) there is an equal and opposite reaction. It is a law that has worked unfailingly in uncountable instances.

But such successes do not eliminate inductive reasoning’s sliver of vulnerability. Karl Popper, the 20th-century Austrian-British philosopher of science, considered all scientific knowledge to be provisional. He illustrated his point with the example of a person who, having seen only white swans, concludes all swans are white. However, the person later discovers a black swan, an event conclusively rebutting the universality of white swans. Of course, abandoning that particular generalisation has little consequence. But what if an exception to Newton’s universal law governing action and reaction were to appear, instead?

Perhaps, as Popper suggests, truth, scientific and otherwise, should therefore only ever be parsed as partial or incomplete, with hypotheses carrying different truth-values; our striving for unconditional truth remains a task in the making. This is of particular relevance in complex areas: the nature of being and existence (ontology); or universal concepts, transcendental ideas, metaphysics, and the fundamentals of what we think we know and understand (epistemology). (Areas that also attempt to reveal the truth of unobserved things.)

And so, Popper introduced a new test of truth: ‘falsifiability’. That is, all scientific assertions should be subjected to the test of being proven false — the opposite of seeking confirmation. Einstein, too, was more interested in whether experiments disagreed with his bold conjectures, as such experiments would render his theories invalid — rather than merely provide further evidence for them.

Nonetheless, as human nature would have it, Einstein was jubilant when his prediction that massive objects bend light was confirmed by astronomical observations of light passing close to the sun during the total solar eclipse of 1919, an observation that required revision of Newton’s formulation of the laws of gravity.

Testability is also central to another aspect of epistemology. That is, to draw a line between true science — whose predictions are subject to rigorous falsification and thus potential disproof — and pseudoscience — seen as speculative, untestable predictions relying on uncontested dogma. Pseudoscience balances precariously, depending as it does on adopters’ fickle belief-commitment rather than on rigorous tests and critical analyses.

On the plus side, if theories are not successfully falsified despite earnest efforts to do so, the claims may have a greater chance of turning out true. Well, at least until new information surfaces to force change to a model. Or, until ingenious thought experiments and insights lead to the sweeping replacement of a theory. Or, until investigation explains how to merge models formerly considered stubbornly incompatible, yet valid in their respective domains. An example of this last point is the case of general relativity and quantum mechanics, which have remained irreconcilable in describing reality (in matters ranging from spacetime to gravity), despite physicists’ attempts.

As to the wholesale switching out of scientific theories, it may appear compelling to make the switch, based on accumulated new findings or the sense that the old theory has major fault lines, suggesting it has run its useful course. The 20th-century American philosopher of science, Thomas Kuhn, was influential in this regard, coining the formative expression ‘paradigm shift’. The shift occurs when a new scientific theory replaces its problem-ridden predecessor, based on a consensus among scientists that the new theory (paradigm) better describes the world, offering a ‘revolutionarily’ different understanding that requires a shift in fundamental concepts.


Among the great paradigm shifts of history are Copernicus’s sun-centred (heliocentric) model of planetary motion, replacing Ptolemy’s Earth-centred model. Another was Charles Darwin’s theory of natural selection as key to the biological sciences, informing the origins and evolution of species. Additionally, Einstein’s theories of relativity ushered in major changes to Newton’s understanding of the physical universe. Also significant was recognition that plate tectonics explain large-scale geologic change. Significant, too, was development by Niels Bohr and others of quantum mechanics, replacing classical mechanics at microscopic scales. The story of paradigm shifts is long and continues.


Science’s progress in unveiling the universe’s mysteries entails dynamic processes: One is the enduring sustainability of theories, seemingly etched in stone, that hold up under unsparing tests of verification and falsification. Another is the implementation of amendments as contrary findings chip away at the efficacy of models. And another is the revolutionary replacement of scientific models as legacy theories become frail and fail. All are reasons for belief in the methods of positivism.


In 1960, the physicist Eugene Wigner wrote what became a famous paper in philosophy and other circles, coining the evocative expression ‘unreasonable effectiveness’. This was in reference to the role of mathematics in the natural sciences, but he could well have been speaking of the role of science itself in acquiring understanding of the world.


Monday 26 September 2022

Where Do Ideas Come From?


By Keith Tidman

Just as cosmic clouds of dust and gas, spanning many light-years, serve as ‘nurseries’ of new stars, could it be that the human mind similarly serves as a nursery, where untold thought fragments coalesce into full-fledged ideas?

At its best, this metaphor for bringing to bear creative ideas would provide us with a different way of looking at some of the most remarkable human achievements in the course of history.

These are things like Michelangelo’s inspired painting, sculpting, architecture, and engineering. The paradigm-shifting science of Niels Bohr and Max Planck developing quantum theory. The remarkable compositions of Mozart. The eternal triumvirate of Socrates, Plato, and Aristotle — whose intellectual hold remains to today. The piercing insights into human nature memorably expressed by Shakespeare. The democratic spread of knowledge achieved through Gutenberg’s printing press. And so many more, of course.

To borrow from Newton (with his nod to the generations of luminaries who set the stage for his own influences upon science and mathematics), might humbler souls, too, learn to ‘stand on the shoulders of such giants’, even if in less remarkable ways? Yet still to reach beyond the rote? And, if so, how might that work?

I would say that, for a start, it is essential for the mind to be unconstrained by conformance and orthodox groupthink in viewing and reconceiving the world: a quest for patterns. The creative process must not be sapped by concern over not getting endeavours right the first or second or third time. Doubting ideas, putting them to the test through decomposition and recomposition, adds to the rigour of those that optimally survive exploitation and scrutiny.

To find solutions that move significantly beyond the prevailing norms requires the mind to be undaunted, undistracted, and unflagging. Sometimes, how the creative process starts out — the initial conditions, as well as the increasing numbers of branching paths along which those conditions travel — greatly shapes eventual outcomes; other times, not. All part of the interlacing of analysis and serendipitous discovery. I think that tracing the genealogy of how ideas coalesce informs that process.

For a start, there’s a materialistic aspect to innovative thought, where the mind is demystified, no longer some unmeasurable, ethereal other. That is, ideas are the product of neuronal activity in the fine-grained circuitry of the brain, where hundreds of trillions of synapses, acting like switches and routers and storage devices, sort out and connect thoughts and deliver clever solutions. Vastly more synapses, one might note, than there are stars in our Milky Way galaxy!
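As a rough, back-of-the-envelope check on that comparison (a minimal sketch using commonly cited order-of-magnitude estimates, not figures drawn from this essay):

```python
# Rough order-of-magnitude comparison: synapses in a human brain versus
# stars in the Milky Way. Both ranges are commonly cited estimates.
synapses_low, synapses_high = 1e14, 1e15   # ~100 trillion to ~1 quadrillion synapses
stars_low, stars_high = 1e11, 4e11         # ~100 to 400 billion stars

print(f"Synapses outnumber stars by roughly a factor of "
      f"{synapses_low / stars_high:,.0f} to {synapses_high / stars_low:,.0f}")
# -> a few hundred to roughly ten thousand times more synapses than stars
```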

The whispering unconscious mind, present in reposed moments such as twilight or midnight or simply gazing into the distance, associated with ‘alpha brain waves’, is often where creative, innovative insights dwell, being readied to emerge. It’s where the critical mass of creative insights is housed, rising to challenge rigid intellectual canon. This activity finds a force magnifier in the ‘parallel processing’ of others’ minds during the frothy back and forth of collaborative dialogue.

The panoply of surrounding influences helps the mind set up stencils for transitioning inspiration into mature ideas. These influences may germinate from individuals in one’s own creative orbit, or as inspiration derived from the culture and community of which one is a part. Yet, synthesising creative ideas across fields, resulting in multidisciplinary teams whose members complement one another, works effectively to kindle fresh insights and solutions.

Thoughts may be collaboratively exchanged within and among teams, pushing boundaries and inciting vision and understanding. It’s incremental, with ideas stepwise building on ideas in the manner famously acknowledged by Newton. Ultimately, at its best the process leads to the diffusion of ideas, across communities, as grist for others engaged in reflection and the generation of new takes on things. Chance happenings and spontaneous hunches matter, too, with blanks cooperatively filled in with others’ intuitions.

As an example, consider that, in a 1959 talk, the Nobel prize-winning physicist, Richard Feynman, challenged the world to shrink text to such an extent that the entire twenty-four-volume Encyclopedia Britannica could fit onto the head of a pin. (A challenge perhaps reminiscent of the whimsical question about ‘the number of angels fitting on the head of a pin’, at the time intended to mock medieval scholasticism.) Feynman believed there was no reason technology couldn’t be developed to accomplish the task, and the challenge was met, through the scaling of nanotechnology, two and a half decades later. Never say never, when it comes to laying down novel intellectual markers.

I suggest that the most fundamental dimension to the origination of such mind-stretching ideas as Feynman’s is curiosity — to wonder at the world as it has been, as it is now, and crucially as it might become. To doggedly stay on the trail of discovery through such measures as what-if deconstruction, reimagination, and reassembly. To ferret out what stands apart from the banal. And to create ways to ensure the right-fitting application of such reinvention.

Related is a knack for spotting otherwise secreted links between outwardly dissimilar and disconnected things and circumstances. Such links become apparent as a result of combining attentiveness, openness, resourcefulness, and imagination. A sense that there might be more to what’s locked in one’s gaze than what immediately springs to mind. Where, frankly, the trite expression ‘thinking outside-the-box’ is itself an ironic example of ‘thinking inside-the-box’.

Forging creative results from the junction of farsightedness and ingenuity is hard — to get from the ordinary to the extraordinary is a difficult, craggy path. Expertise and extensive knowledge are the metaphorical cosmic dust required to coalesce into the imaginatively original ideas sought.

A case in point is the technically grounded Edison, blessed with vision and critical-thinking competencies, experiencing a prolific string of inventive, life-changing eureka moments. Another example is Darwin, prepared to arrive at his long-marinating epiphany into the brave world of ‘natural selection’. Such incubation of ideas, venturing into uncharted waters, has proven immensely fruitful.

Thus, the ‘nurseries’ of thought fragments, coalescing into complex ideas, can provide insight into reality — and grist for future visionaries.

Monday 11 September 2017

Chaos Theory: And Why It Matters

Posted by Keith Tidman

Computer-generated image demonstrating that the behaviour of dynamical systems is highly sensitive to initial conditions

Future events in a complex, dynamical, nonlinear system are determined by their initial conditions. In such cases, the dependence of events on initial conditions is highly sensitive. That exquisite sensitivity is capable of resulting in dramatically large differences in future outcomes and behaviours, depending on the actual initial conditions and their trajectory over time — how follow-on events nonlinearly cascade and unpredictably branch out along potentially myriad paths. The idea is at the heart of so-called ‘Chaos Theory’.

The effect may show up in a wide range of disciplines, including the natural, environmental, social, medical, and computer sciences (including artificial intelligence), mathematics and modeling, engineering — and philosophy — among others. The implication of sensitivity to initial conditions is that eventual, longer-term outcomes or events are largely unpredictable; however, that is not to say they are random — there’s an important difference. Chaos is not randomness; nor is it disorder*. There is no contradiction or inconsistency between chaos and determinism. Rather, there remains a cause-and-effect — that is, deterministic — relationship between those initial conditions and later events, even after the widening passage of time during which large nonlinear instabilities and disturbances expand exponentially. Effect becomes cause, cause becomes effect, which becomes cause . . . ad infinitum. As Chrysippus, a third-century BC Stoic philosopher, presciently remarked:
‘Everything that happens is followed by something else which depends on it by causal necessity. Likewise, everything that happens is preceded by something with which it is causally connected’.
Accordingly, the dynamical, nonlinear system’s future behaviour is completely determined by its initial conditions, even though the paths of the relationship — which quickly get massively complex via factors such as divergence, repetition, and feedback — may not be traceable. A corollary is that it is not just the future that is unpredictable: the past — history — also defies complete understanding and reconstruction, given the mind-boggling branching of events occurring over decades, centuries, and millennia. Our lives routinely demonstrate these principles: the long-term effects of initial conditions on complex, dynamical social, economic, ecological, and pedagogical systems, to cite just a few examples, are likewise subject to chaos and unpredictability.
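A minimal numerical sketch of this deterministic-yet-unpredictable behaviour can be given with the logistic map, a standard textbook example of chaos (chosen here purely for illustration; it is not discussed in the essay itself). Two starting values that differ by one part in a billion soon produce trajectories bearing no resemblance to one another, even though every step is fully determined:

```python
# Sensitive dependence on initial conditions, illustrated with the logistic
# map x -> r*x*(1-x) in its chaotic regime (r = 4.0). Each step is perfectly
# deterministic, yet a difference of one part in a billion in the starting
# value swamps the calculation within a few dozen iterations.
r = 4.0
x_a, x_b = 0.400000000, 0.400000001   # initial conditions differing by 1e-9

for step in range(1, 51):
    x_a = r * x_a * (1.0 - x_a)
    x_b = r * x_b * (1.0 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: x_a = {x_a:.6f}  x_b = {x_b:.6f}  gap = {abs(x_a - x_b):.6f}")
```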

Chaos theory thus describes the behaviour of systems that are impossible to predict or control. These processes and phenomena have been described by the unique qualities of fractal patterns like the one above — graphically demonstrated, for example, by nerve pathways, sea shells, ferns, crystals, trees, stalagmites, rivers, snowflakes, canyons, lightning, peacocks, clouds, shorelines, and myriad other natural things. Fractal patterns, through their branching and recursive shape (repeated over and over), offer us a graphical, geometric image of chaos. They capture the infinite complexity not just of nature but of complex, nonlinear systems in general — including man-made ones, such as expanding cities and traffic patterns. Even tiny errors in measuring the state of a complex system get mega-amplified, making prediction unreliable, even impossible, in the longer term. In the words of the 20th-century physicist Richard Feynman:
‘Trying to understand the way nature works involves . . . beautiful tightropes of logic on which one has to walk in order not to make a mistake in predicting what will happen’.
The exquisite sensitivity to initial conditions is metaphorically described as the ‘butterfly effect’. The term was made famous by the mathematician and meteorologist Edward Lorenz in a 1972 paper in which he questioned whether the flapping of a butterfly’s wings in Brazil — an ostensibly minuscule change in initial conditions in space-time — might trigger a tornado in Texas — a massive consequential result stemming from the complexly intervening (unpredictable) sequence of events. As Aristotle foreshadowed, ‘The least initial deviation . . . is multiplied later a thousandfold’.

Lorenz’s work that accidentally led to this understanding and demonstration of chaos theory dated back to the preceding decade. In 1961 (in an era of limited computer power), he was performing a study of weather prediction, employing a computer model for his simulations. Wanting to run a simulation again, he rounded the variables from six to three digits, assuming that such an ever-so-tiny change couldn’t matter to the results — a commonsense expectation at the time. However, to Lorenz’s astonishment, the computer model produced weather predictions that radically differed from the first run — all the more so the longer the model ran using the slightly truncated initial conditions. This serendipitous event, though initially garnering little attention among Lorenz’s academic peers, eventually set the stage for chaos theory.
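The spirit of that accidental discovery can be reproduced in a few lines. The sketch below is only an illustration: it uses the Lorenz equations he published in 1963 (with his now-standard parameters), not the larger 1961 weather model, together with a simple Euler integration. One run starts from a ‘six-digit’ value, the other from the same value truncated to three digits, and the two forecasts eventually part company.

```python
# Reproducing the flavour of Lorenz's rounding experiment with the 1963
# Lorenz equations (sigma=10, rho=28, beta=8/3), integrated by simple Euler
# steps. One run uses a six-digit initial condition, the other the same
# value truncated to three digits; the trajectories eventually diverge.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

full = (1.000123, 1.0, 1.0)    # 'six-digit' starting value
short = (1.000, 1.0, 1.0)      # same value rounded to three digits

for step in range(1, 3001):
    full = lorenz_step(*full)
    short = lorenz_step(*short)
    if step % 1000 == 0:
        print(f"t = {step * 0.01:4.0f}: x(full) = {full[0]:8.3f}  "
              f"x(rounded) = {short[0]:8.3f}  gap = {abs(full[0] - short[0]):.3f}")
```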

Lorenz’s contributions came to qualify the classical laws of Nature represented by Isaac Newton, whose Mathematical Principles of Natural Philosophy three hundred-plus years earlier famously laid out a well-ordered, mechanical system — epically reducing the universe to ‘clockwork’ precision and predictability. It provided us, and still does, with a sufficiently workable approximation of the world we live in.

There was no allowance, in those descriptions, for indeterminacy and unpredictability. That said, an important exception to determinism requires venturing beyond the macroscopic systems of the classical world into the microscopic systems of the quantum mechanical world — where indeterminism (probability) prevails. Today, some people construe the classical string of causes and effects, and its clockwork-like precision, as perhaps pointing to an original cause in the form of some ultimate designer of the universe, or more simply a god — predetermining how the universe’s history is to unfold.

It is not the case, as some have too ambitiously thought, that all humankind needs to do is get cleverer at acquiring deeper understanding, dismissing any notion of limitations, in order to render everything predictable. In line with that ambitious reasoning, the 17th-century Dutch thinker, Baruch Spinoza, asserted,
‘Nothing in Nature is random. . . . A thing appears random only through the incompleteness of our knowledge’.


*Another example of chaos is brain activity, where a thought and the originating firing of neurons — among the staggering ninety billion neurons, one hundred trillion synapses, and unimaginable alternative pathways — result in an unpredictable, near-infinite sequence of electrochemical transmissions. Such exquisite goings-on may well have implications for consciousness and free will. Since consciousness is the root of self-identity — our own identity, and that of others — it matters that consciousness is simultaneously the product of, and subject to, the nonlinear complexity and unpredictability associated with chaos. The connections are embedded in realism. The saving grace is that cause-and-effect and determinism are, however, still in play in all possible permutations of how individual consciousness and the universe subtly connect.