Untying the Gordian Knot:
Process, Reality, and Context
(Contemporary Whitehead Studies)
by Timothy E. Eastman (Author)
Publication Date: December 10, 2020
In Untying the Gordian Knot: Process, Reality, and Context, Timothy E. Eastman proposes a new creative synthesis, the Logoi framework—which is radically inclusive and incorporates both actuality and potentiality—to show how the fundamental notions of process, logic, and relations, woven with triads of input-output-context and quantum logical distinctions, can resolve a baker’s dozen of age-old philosophic problems. Further, Eastman leverages a century of advances in quantum physics and the Relational Realism interpretation pioneered by Michael Epperson and Elias Zafiris and augmented by the independent research of Ruth Kastner and Hans Primas to resolve long-standing issues in understanding quantum physics. Adding to this, Eastman makes use of advances in information and complex systems, semiotics, and process philosophy to show how multiple levels of context, combined with relations—including potential relations—both local and local-global, can provide a grounding for causation, emergence, and physical law. Finally, the Logoi framework goes beyond standard ways of knowing—that of context independence (science) and context focus (arts, humanities)—to demonstrate the inevitable role of ultimate context (meaning, spiritual dimension) as part of a transformative ecological vision, which is urgently needed in these times of human and environmental crises.
Editorial Reviews
"Timothy Eastman, eminent space scientist associated for many years with NASA and an important philosopher of science, has here produced a work of enormous significance. Cutting through a "Gordian Knot" of philosophical and scientific problems ranging widely from the mind-body issue, the nature of consciousness, freedom of the will, and the reality of temporal process, to the nature of quantum theory and the quantum measurement problem (to name a few), Eastman shows how an emphasis on physical context and employment of what he calls the relational logoi framework resolves such problems in a parsimonious and elegant way. The book displays astounding erudition producing a "consilience" of streams of evidence across numerous scientific and philosophical disciplines. Process philosophers and scholars working in the American pragmatist tradition will be especially drawn to this project as it resonates profoundly with central ideas found in Whitehead, Hartshorne, and Peirce."
- George W. Shields, University of Louisville
“We rightly marvel at the achievements yielded by the evolution of physics, from the Aristotelian paradigm to the mechanical paradigm to the field paradigm and finally to our current, stubbornly bipolar paradigm of quantum mechanics and relativity theory—that infamously double-edged instrument by which we define nature’s innermost and outermost extremes via mutually exclusive ontologies. This book charts a novel and compelling path forward toward a coherent relation of these incompatible fundamental theories—a path whereby naïve object-oriented realism is redefined as inherently contextual and relational—a groundbreaking synthesis of the ideas of Peirce, James and Whitehead along with modern physics, complex systems, information theory, semiotics and philosophy.”
- Michael Epperson, California State University Sacramento
"Timothy Eastman's book draws from and draws together many sources, from the humanistic to the scientific, inspired especially by the process philosophy of Whitehead and the semiotic vision of Pierce. Calling on these sources and inspirations, it presents an informed and informative synthesis in an integrative approach. It illuminates its fundamental notions of process, logic, and relations in a wide-ranging exploration; yet it is marked by a spirit which grants our fallibility, even as it proposes an ordered vision of things. It is engaging and illuminating in its impressive range of reference. Here we find a very thoughtful and synthetic voice that speaks in a constructive spirit. It witnesses to a new adventure of ideas, calling on the work of many thinkers who are cooperators in the field of constructive thought. Crossing boundaries between disciplines often kept apart, it is engaging and illuminating in its impressive range of reference."
- William Desmond, Katholieke Universiteit Leuven
* * * * * * * * * *
BIOGRAPHICAL DETAILS
Timothy E. Eastman
Sciences and Exploration Directorate - NASA Goddard Space Flight Center
Director, Space Physics and Plasma Sciences - Plasmas International
Dr. Timothy E. Eastman is a senior scientist at NASA’s Goddard Space Flight Center and a consultant in plasma science. He has more than 40 years of experience in research and consulting in space physics, space science data systems, space weather, plasma applications, public outreach and education, and philosophy of science. This work has included serving as a research scientist at Los Alamos National Laboratory, Branch Chief for Magnetospheric Physics at NASA Headquarters, senior research scientist and faculty member at both the University of Iowa and the University of Maryland, and Program Director for Magnetospheric Physics at the National Science Foundation.
Dr. Eastman discovered the Low-Latitude Boundary Layer (LLBL) of Earth’s magnetosphere (1976), and discovered gyro-phase bunched ions in space plasmas by analyzing energetic ion distribution functions near Earth’s bow shock (1981). The LLBL work, in particular, was further explored by two major multi-spacecraft missions – the Cluster spacecraft mission of the European Space Agency and NASA’s current Magnetospheric Multiscale Mission. He has published over 100 research papers in space physics, philosophy, and related fields.
In addition to an extensive research career, Dr. Eastman provided key leadership of the nation’s research programs in space plasma physics while program manager at NASA Headquarters (1985-1988) and NSF (1991-1994). Through this program leadership, he was co-developer of key foundations for major international and interagency projects, including the International Solar Terrestrial Physics program, the Interagency Space Weather Program, and the Basic Plasma Science and Engineering program. He created and maintains major websites for plasma science and applications at plasmas.org and plasmas.com, and is lead editor of a book in the philosophy of physics entitled Physics and Whitehead, published in 2004 by SUNY Press. Dr. Eastman’s interest in philosophy and philosophy of science extends over three decades, with several journal publications in philosophy in addition to the SUNY volume; further, he is on the International Advisory Boards for Process Studies and Studia Whiteheadiana (Poland), and is lead editor of the volume Physics and Speculative Philosophy, published in 2016 by De Gruyter.
* * * * * * * * * *
Physics and Whitehead: Quantum, Process, and Experience
Leading scholars explore the connections between
quantum physics and process philosophy.
(SUNY Series in Constructive Postmodern Thought)
Publication Date: January 8, 2009
* * * * * * * * * *
Physics and Speculative Philosophy:
Potentiality in Modern Science
(Process Thought Book #27)
by Timothy E. Eastman (Editor),
Michael Epperson (Editor),
David Ray Griffin (Editor)
Publication Date: February 22, 2016
Through both an historical and philosophical analysis of the concept of possibility, we show how including both potentiality and actuality as part of the real is compatible with experience and contributes to solving key problems of fundamental process and emergence. The book is organized into four main sections that incorporate our routes to potentiality: (1) potentiality in modern science [history and philosophy; quantum physics and complexity]; (2) Relational Realism [ontological interpretation of quantum physics; philosophy and logic]; (3) Process Physics [ontological interpretation of relativity theory; physics and philosophy]; (4) on speculative philosophy and physics [limitations and approximations; process philosophy]. We conclude that certain fundamental problems in modern physics require complementary analyses of certain philosophical and metaphysical issues, and that such scholarship reveals intrinsic features and limits of determinism, potentiality, and emergence that enable, among other things, important progress on the quantum measurement problem and new understandings of emergence.
* * * * * * * * * *
Big Bang: A Critical Analysis
by Hilton Ratcliffe (Author),
Timothy E. Eastman (Author),
Ashwini Kumar Lal (Author),
R. Joseph (Author)
Publication Date: October 26, 2011
A Word of Caution: The book "Big Bang: A Critical Analysis," which I have listed under Tim Eastman's name, asks questions of contemporary science. But rather than painting the study of cosmology as an area of "conspiracy theory," I would much rather approach sundry cosmological theories such as the Big Bang as one of many theoretical options that science is working through in answering contemporary queries of speculative science. The questions being asked in this book are no different than the ones being asked elsewhere amongst cosmologists, metaphysicians, quantum physicists, astrophysicists, etc. There is no "deep state" holding back information or forcing information into earlier molds of scientific theory. Hence, I personally feel the book's blurb should be couched not in the language of conspiracy but in the language of continual scientific inquiry, as new ideas, instrumentation, technologies, and sciences come out to play with older and newer data sets. Thank you. - Russ Slater
Book Blurb
The theory that has come to be known as the Big Bang was originally proposed by a Catholic priest, to make the Bible scientific. Critics have subsequently referred to the Big Bang theory as religion masquerading as science. Nevertheless, the Big Bang model is the generally accepted theory for the origin of the universe. Nonetheless, findings in observational astronomy and revelations in the field of fundamental physics question the validity of the 'Big Bang' model, including the organization of galactic superstructures, the Cosmic Microwave Background, distant galaxies, gravitational waves, red shifts, and the age of local galaxies. Admittedly, the Big Bang research program has generated considerable research and there has been some confirmation for many hypotheses. However, outstanding questions remain and substantial alternative cosmology models, which also have been fruitful, remain viable and continue to evolve. Unfortunately, there has been a concerted effort to prevent research into alternate cosmologies. The Big Bang has become a sacred cow which must not be questioned.

One of the greatest challenges facing astrophysics is the derivation of remoteness in cosmological objects. At large scales, it is almost entirely dependent upon the Hubble relationship between apparent brightness and spectral redshift for large luminous objects. However, this data has questionable validity. The assumption of scale invariance and universality of the Hubble law allowed the adoption of redshift as a standard calibration of cosmological distance. A major problem is that the Big Bang model implies the existence of a creator. Why the Universe should have had a beginning, or why it would have been created, cannot be explained by classical or quantum physics. To support the Big Bang, estimates of the age and size of the cosmos, including claims of an accelerating universe, are based on an Earth-centered universe with the Earth as the measure of all things, exactly as dictated by religious theology. However, distance from Earth is not a measure of the age of faraway galaxies. The Big Bang cannot explain why there are galaxies older than the Big Bang, or why fully formed galaxies continue to be discovered at distances of over 13 billion light years from Earth, when according to Big Bang theory no galaxies should exist at these distances.

To support the Big Bang, red shifts are purposefully misinterpreted based on Pre-Copernican geocentrism with Earth serving as ground zero. However, red shifts are variable, affected by numerous factors, and do not provide measures of time, age, or distance. Nor can Big Bang theory explain why galaxies collide, why rivers of galaxies flow in the wrong direction, or why galaxies clump together creating great walls of galaxies which took from 80 billion to 150 billion years to form. Big Bang theory requires phantom forces, constantly adjusted parameters, and ad hoc theorizing to explain away and to cover up the numerous holes in this theory. Finally, if at first there was a singularity, then the Big Bang was not a beginning, but a continuation.
Table of Contents
1. Big Bang? A Critical Review - A. K. Lal and R. Joseph
2. Cosmic Agnosticism, Revisited - Timothy E. Eastman
3. Anomalous Redshift Data and the Myth of Cosmological Distance - Hilton Ratcliffe
4. Multiverse Scenarios in Cosmology: Classification, Cause, Challenge, Controversy, and Criticism - Rüdiger Vaas
5. An Infinite Fractal Cosmos - R. L. Oldershaw
6. Different Routes to Multiverses and an Infinite Universe - B. G. Sidharth
7. The Origin of the Modern Anthropic Principle - Helge Kragh
8. Cosmos and Quantum: Frontiers for the Future - Menas Kafatos
9. Infinite Universe vs the Myth of the Big Bang: Redshifts, Black Holes, Acceleration
* * * * * * * * * *
RELATED READINGS
The Quantum of Explanation: Whitehead’s Radical Empiricism
(Routledge Studies in American Philosophy) 1st Edition
by Randall E. Auxier (Author) and Gary L. Herstein (Author)
Publication Date: March 21, 2019
The Quantum of Explanation advances a bold new theory of how explanation ought to be understood in philosophical and cosmological inquiries. Using a complete interpretation of Alfred North Whitehead’s philosophical and mathematical writings and an interpretive structure that is essentially new, Auxier and Herstein argue that Whitehead has never been properly understood, nor has the depth and breadth of his contribution to the human search for knowledge been assimilated by his successors. This important book effectively applies Whitehead’s philosophy to problems in the interpretation of science, empirical knowledge, and nature. It develops a new account of philosophical naturalism that will contribute to the current naturalism debate in both Analytic and Continental philosophy. Auxier and Herstein also draw attention to some of the most important differences between the process theology tradition and Whitehead’s thought, arguing in favor of a Whiteheadian naturalism that is more or less independent of theological concerns. This book offers a clear and comprehensive introduction to Whitehead’s philosophy and is an essential resource for students and scholars interested in American philosophy, the philosophy of mathematics and physics, and issues associated with naturalism, explanation and radical empiricism.
The Philosophy of Hilary Putnam
(Library of Living Philosophers #34)
by Randall E. Auxier (Editor)
Douglas R. Anderson (Editor)
Lewis Edwin Hahn (Editor)
Publication Date: June 30, 2015
Hilary Putnam, who turned 88 in 2014, is one of the world’s greatest living philosophers. He holds the position of Cogan University Professor Emeritus at Harvard University. He has been called “one of the 20th century’s true philosophic giants” (by Malcolm Thorndike Nicholson in Prospect magazine). He has been very influential in several different areas of philosophy: philosophy of mathematics, philosophy of language, philosophy of mind, and philosophy of science. This volume in the prestigious Library of Living Philosophers series contains 26 chapters original to this work, each written by a well-known philosopher, including the late Richard Rorty and the late Michael Dummett. The volume also includes Putnam’s reply to each of the 26 critical and descriptive essays, which cover the broad range of Putnam’s thought. They are organized thematically into the following parts: Philosophy and Mathematics, Logic and Language, Knowing and Being, Philosophy of Practice, and Elements of Pragmatism. Readers will also appreciate the extensive intellectual autobiography.
Quantum Mechanics: And the Philosophy of Alfred North Whitehead
by Michael Epperson (Author)
Publication Date: September 18, 2018
Physics of the World-Soul: Whitehead's Adventure in Cosmology
by Matthew T. Segall (Author)
Publication Date: April 29, 2019
Whitehead was among the first initiates into the 20th century's new cosmological story. This book brings Whitehead's philosophy of organism into conversation with several components of contemporary scientific cosmology, including relativistic, quantum, evolutionary, and complexity theories, in order both to exemplify the inadequacy of the traditional materialistic-mechanistic metaphysical interpretation of them and to display the relevance of Whitehead's cosmological scheme to the transdisciplinary project of integrating these theories and their data with the presuppositions of human civilization. This data is nearly crying aloud for a cosmologically ensouled interpretation, one in which, for example, physics and chemistry are no longer considered to be descriptions of the meaningless motion of molecules to which biology is ultimately reducible, but rather themselves become studies of living organization at ecological scales other than the biological.
"Hi Russ. I just finished a logic book of the sort you describe, although it does incorporate the Boolean —but remember Whitehead gave Boole an algebraic interpretation in 1898, and Langer expressed Whitehead’s theory of extensive connection as an algebra that melded with his interpretation of Boole in 1937. I have operationalized that system adding set-theoretical functions and bringing it into a regimentation process of natural language that Langer and Whitehead do not have." - Randall Auxier
* * * * * * * * * *
Tim Eastman - Untying The Gordian Knot
Process, Reality and Context of the Quantum Theoretical Process
ZoomCast held on April 13, 2021
Introduction by Russ Slater
Today's Zoomcast went well as I sat in and listened. I didn't expect to be introduced to 65+ scholars from all kinds of process fields, nor to be asked several times to share with Tim Eastman and the group some of the thoughts I had typed into the chat (many thanks to Randall Auxier for his helpful insights). In my group introduction I spoke about what I hoped to find in today's meeting held by John Cobb and the Cobb Institute. I discovered that relational process logic is actually being applied to quantum computing logic, which, of course, will have a lot of import for greentech ecological societies, especially in contrast to the reductionistic, Boolean effort to apply silicon logic to quantum-field logic.
R.E. Slater
April 13, 2021
For the lecture itself, Tim Eastman provided a summary of his discussion, which is shown below. I don't believe my own notes would be as helpful as Tim's, as I am presently learning and self-educating. Overall, my question was how to apply Whitehead's process-relational thought to the area of relational quantum computing logic. Essentially, it is being researched, but since the technology is in the early stages of application, it will be growing into itself over the next 100 years, hopefully by utilizing all that Process Thought may provide regarding perceived reality, time studies, possibility and actuality, etc., within the quantum realm of quarks+ and light. - res
Notes - Russ Slater
Tim Eastman's discussion spoke to current studies in the quantum theoretic process as opposed to transaction-based, old-order logic systems. It is worth remembering that Alfred North Whitehead was first an English mathematician before becoming the philosophical thinker who developed process philosophy and theology.
In any Semiotic System there will always be the Problem of Perspective: how it applies to ontology (causation --> emergence) and to epistemology (the constraints of knowledge).
SpaceTime - Is it a derivative of relationality? In process thinking it is. Therefore, "Yes".
The problem of Scientism - the postulation that science is complete in itself as a closed-ended system; in Whitehead it is not:
As in Science, so too Life events, Philosophy, Theology, Psychology, Sociology, etc., are always circumscribed within existential, phenomenological, epistemological, and contextual roots or contexts. Can anything ever be free of context? Frankly, "No." As such, it is important to identify and acknowledge applicable contexts. Kurt Gödel's incompleteness theorems say, roughly, that (i) any consistent formal system rich enough for arithmetic contains true statements it cannot prove within itself, and (ii) no such system can prove its own consistency; informally, closed systems cannot fully ground themselves, and open systems are rightly understood as incomplete. - res
Generally, one must always ask the question, what is the context of the area of study:
- Science - claims to be context free but is contextualized by its area of study
- Similarly the Arts and Humanities are contextualized within the Anthropological realm
- Spiritual/Religious contexts will always be found in questions of cosmological metaphysics and ontology
Dyadic Logic, based in Boolean Logic, forces Quantum spin to be either up or down, 1 or 0, yes or no; a transactional Boolean system is never both. Boolean Logic does not negotiate the "Excluded Middle" or "Double Negations"; these areas are neglected in it.
Non-Boolean Logic may be known as "Free Logic." (Question: Is this similar to "Fuzzy Logic"? No, but there are similarities between the two, including the area of "Free Association.") Here potential relations correlate with potential outcomes.
As an example, Limited Potential Relations run through Constrained Contexts will yield Limited Potential Outcomes.
Basically, we are describing Process Logic as "the physics of potentiality or possibility moving to actual actuality." This is how Whitehead would describe "the Real": as the physics of "possibilism."
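To make the contrast concrete, here is a minimal sketch in Python. It is entirely my own illustration, not Eastman's or Epperson's formalism; the ToyQubit class and its amplitudes are hypothetical. A Boolean bit is always a determinate 0 or 1 (the excluded middle in action), while a toy two-level quantum state carries weighted potentialities for both outcomes until a measurement actualizes one of them.

```python
import random

# Toy contrast (my own illustration, not Eastman's formalism):
# a Boolean bit is always a definite 0 or 1, while a two-level "quantum"
# state carries amplitudes over *both* outcomes -- potentiality -- until
# a measurement actualizes one of them.
class ToyQubit:
    def __init__(self, amp0, amp1):
        norm = (abs(amp0) ** 2 + abs(amp1) ** 2) ** 0.5
        self.amp0, self.amp1 = amp0 / norm, amp1 / norm  # normalized amplitudes

    def measure(self):
        # Actualization: one definite outcome, weighted by |amplitude|^2.
        return 0 if random.random() < abs(self.amp0) ** 2 else 1

boolean_bit = 1                      # always determinate: no middle ground
superposed = ToyQubit(1, 1)          # equal potentiality for 0 and 1
print([superposed.measure() for _ in range(10)])  # a mix of 0s and 1s
```

Before measure() is called, the state is neither 0 nor 1 in the Boolean sense, which is the sort of "potential relation" the notes above point to.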
Question 1. When we quantize measured or non-measured outcomes, do we limit the quantum relational process transaction? This would be described as a Transactional Process.
Question 2. Or do we approach Quantum Logic by not limiting it to the transactional process at all? This would be described as the Possibilist Process.
How is Plasma Physics different from Quantum Physics? The former speaks to the electromagnetic realm; the latter to the additional quantum realms of matter and force (besides electromagnetism there are gravity, the weak and strong nuclear forces, and possibly a fifth quantum force).
Bottom line: George Boole did not include states for potentiality and possibility (in Whiteheadian terms, prehension and concrescence). These give us the theory of potency, involving the ability to measure general actuality, or Real Time. All real events have potentiality for possibility.
We might redescribe Real Time as Relational Realism (per my comment to the group - res).
- SpaceTime --> The World is not a collection of things but a collection of events.
- There are three Categories of Events:
- Momentary or Temporal Events,
- Limited Duration Events,
- Persistent Events.
- Real Time is contextualized in the subject-matter event in relationship to all corresponding previous and contemporaneous events
- Remember to include the panpsychic and panexistential root/context with any real process system
- Methodology must not limit ontology
Restated, the possible-actual precedes the actual-actual. This is similar to Stephen Hawking's description of a photon of radiation that, in effect, explores its possible paths to its destination before it actually travels there, then "looks back" on the path traversed in order to commit from its stage of potentiality to its stage of actuality. This is the weirdness of quantum physics.
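One standard way to put this picture into symbols is the sum-over-paths rule: every possible path contributes a complex amplitude, and the actual detection probability is the squared magnitude of their sum, so the merely possible paths leave an interference trace in the actual outcome. The snippet below is a minimal two-path toy of my own, with un-normalized amplitudes chosen only for illustration; it is not drawn from Eastman's talk.

```python
import cmath

# Two possible paths to one detector: add amplitudes first, then square.
def detection_probability(phase_difference):
    amp_path1 = 0.5                                     # amplitude via path 1
    amp_path2 = 0.5 * cmath.exp(1j * phase_difference)  # amplitude via path 2
    return abs(amp_path1 + amp_path2) ** 2

print(detection_probability(0.0))       # 1.0  -- paths reinforce (constructive)
print(detection_probability(cmath.pi))  # ~0.0 -- paths cancel (destructive)
print(0.5 ** 2 + 0.5 ** 2)              # 0.5  -- summing "actual paths" classically would give this
```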
Quantum Stages of Development
- Standard or Fundamental Quantum Physics to -->
- Anticipatory Quantum Systems to -->
- Model-Dependent Systems.
Summary: Memory-based, anticipatory complex systems --> Interleaved Complex Possibilities
Context? The process-relational perspective
Definitions
Wikipedia - Semiotics (also called semiotic studies) is the study of sign processes (semiosis), which are any activity, conduct, or process that involves signs, where a sign is defined as anything that communicates a meaning that is not the sign itself to the sign's interpreter. The meaning can be intentional, such as a word uttered with a specific meaning, or unintentional, such as a symptom being a sign of a particular medical condition. Signs can communicate through any of the senses: visual, auditory, tactile, olfactory, or gustatory [sic, feelings].

The semiotic tradition explores the study of signs and symbols as a significant part of communications. Unlike linguistics, semiotics also studies non-linguistic sign systems. Semiotics includes the study of signs and sign processes, indication, designation, likeness, analogy, allegory, metonymy, metaphor, symbolism, signification, and communication. Semiotics is frequently seen as having important anthropological and sociological dimensions; for example, the Italian semiotician and novelist Umberto Eco proposed that every cultural phenomenon may be studied as communication. Some semioticians focus on the logical dimensions of the science, however. They examine areas belonging also to the life sciences, such as how organisms make predictions about, and adapt to, their semiotic niche in the world (see semiosis). In general, semiotic theories take signs or sign systems as their object of study: the communication of information in living organisms is covered in biosemiotics (including zoosemiotics and phytosemiotics). Semiotics is not to be confused with the Saussurean tradition called semiology, which is a subset of semiotics.

Semiotics and the Philosophy of Language (1986) - PDF, General Editor, Thomas A. Sebeok, Indiana University Press

Abstract - Human-machine symbiotics, in its wider conception, extends beyond the production game. It is about the symbiosis of hand and brain, a productive interplay between the user and the machine, and an interactive interplay between objective knowledge and the tacit dimension. Central to this conception is the design of 'machines with purpose', an alternative vision to that of the instrumental rationality embedded in the computer. The paper reflects back upon the 1970s conception of symbiotics, exploring its evolution over the last four decades. As we now enter the world of cyber realities and fragmented selves on the one hand, and the world of cultural diversities and pluralities on the other, we ponder whether neuroscience offers a route to a holistic symbiotics, which is even more relevant to the digitally mediated world we live in.

Wikipedia - In computer science, contextualization is the process of identifying the data relevant to an entity (e.g., a person or a city) based on the entity's contextual information.

Definition - Context or contextual information is any information about any entity that can be used to effectively reduce the amount of reasoning required (via filtering, aggregation, and inference) for decision making within the scope of a specific application. Contextualisation is then the process of identifying the data relevant to an entity based on the entity's contextual information. Contextualisation excludes irrelevant data from consideration and has the potential to reduce data across several aspects, including volume, velocity, and variety, in large-scale data-intensive applications (Yavari et al.).

Usage - The main usage of contextualisation is in improving the processing of data. It can reduce the amount of data based on the interests of applications, services, or users, and can improve the scalability and efficiency of data processing, querying, and delivery by excluding irrelevant data. As an example, ConTaaS facilitates contextualisation of data for IoT applications and could improve processing for large-scale IoT applications across various Big Data aspects, including volume, velocity, and variety.
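As a rough sketch of what such contextual filtering can look like in code (my own toy example; the record fields and the contextualise helper are hypothetical and not drawn from ConTaaS or Yavari et al.):

```python
# Toy contextualisation: keep only the records relevant to an entity's context,
# so downstream reasoning and transport work on far less data.
records = [
    {"city": "Oslo",  "topic": "air quality", "value": 41},
    {"city": "Lagos", "topic": "traffic",     "value": 88},
    {"city": "Oslo",  "topic": "traffic",     "value": 12},
]

def contextualise(data, context):
    """Return only the records that match every attribute of the given context."""
    return [r for r in data if all(r.get(k) == v for k, v in context.items())]

# An application interested in Oslo traffic sees one record instead of three.
print(contextualise(records, {"city": "Oslo", "topic": "traffic"}))
```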
Systems theory is the interdisciplinary study of systems, which are cohesive groups of interrelated, interdependent parts that can be natural or human-made. Every system is bounded by space and time, influenced by its environment, defined by its structure and purpose, and expressed through its functioning. A system may be more than the sum of its parts if it expresses synergy or emergent behavior.

Changing one part of a system may affect other parts or the whole system. It may be possible to predict these changes in patterns of behavior. For systems that learn and adapt, the growth and the degree of adaptation depend upon how well the system is engaged with its environment. Some systems support other systems, maintaining the other system to prevent failure. The goals of systems theory are to model a system's dynamics, constraints, conditions, and to elucidate principles (such as purpose, measure, methods, tools) that can be discerned and applied to other systems at every level of nesting, and in a wide range of fields for achieving optimized equifinality.

General systems theory is about developing broadly applicable concepts and principles, as opposed to concepts and principles specific to one domain of knowledge. It distinguishes dynamic or active systems from static or passive systems. Active systems are activity structures or components that interact in behaviours and processes. Passive systems are structures and components that are being processed. For example, a program is passive when it is a disc file and active when it runs in memory. The field is related to systems thinking, machine logic, and systems engineering.
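To give the "more than the sum of its parts" idea one concrete, runnable face, here is a small sketch of my own (not part of the Wikipedia entry) using Conway's Game of Life: every cell follows the same trivial local rule, yet a five-cell "glider" emerges that travels across the grid as a coherent whole, a behavior no individual cell possesses.

```python
from collections import Counter

# Conway's Game of Life step: a cell is alive next tick if it has exactly 3
# live neighbours, or has 2 live neighbours and is already alive.
def step(live_cells):
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live_cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):  # after four steps the pattern reappears shifted by (1, 1)
    state = step(state)
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True: the glider "moved"
```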