Thursday, March 12, 2015

Confusing the Biblical God of Time with the Greek God of Time: Classic Theism & Divine Timelessness - What It Is and Isn't from a Non-Process Point of View


An Example of Unwarranted Theological Speculation: Divine Timelessness

by Roger Olson
[comments by R.E. Slater]
February 19, 2015

“The God of the philosophers is not the God of Abraham, Isaac and Jacob.”
And it can have negative consequences for Christian practices such as prayer.

- Roger Olson

In my immediately preceding post [see next article below] I argued that far too much Christian theology includes unwarranted speculation—especially about God. Under pressure from Greek ontology, traditional “classical theism” has generally agreed that the God of Abraham, Isaac and Jacob (the Yahweh of the Bible) is somehow (the details are variously expressed) “outside of time,” such that temporal sequence, the passage of past into present into future (or future into present into past), is known to God but not experienced by God.

Put another way, on this classical theistic view God’s eternity means (in relation to time) “simultaneity with all times.” In this view, explained best (most scholars agree) by Boethius, God exists in an “eternal now.” For him, our future has already happened. This is not just a claim about God’s foreknowledge; it is a claim about God’s being. It is not merely epistemological; it is ontological.

I have taught Christian theology on both the undergraduate and graduate levels for thirty-four years. I can say without hesitation that the vast majority of my students think this is biblical theology and orthodoxy. They think this view of God’s eternity, timelessness, being “outside of time,” is part-and-parcel of Christianity, the Christian faith, even the gospel itself. Try questioning it and you will experience, as I have, strenuous pushback.

I suspect many people think two things about this traditional view of God’s eternity and time:

1 - First, it is necessary for God’s transcendence; if God is somehow “within time” or if time is “real for God,” then (God forbid!) God must be limited.

2 - Second, denying it undermines God’s power and sovereignty and makes God unreliable. The future, scary to us, becomes scary to God, too. As one theologian expressed it, denial of the classical view makes God a “pathetic, hand-wringing God who has to wait fearfully to see what will happen.”

And yet…

Nowhere does the biblical story of God, the biblical narrative that identifies God for us, and upon which classical Christian theology claims to be based, say or even hint that God is “outside of time” or “timeless” or that all times are “simultaneously before the eyes of God.”


  • This view of God’s eternity entered into Christian theology from Greek philosophy which regarded time as imperfection.
  • Greek philosophy was notoriously negative with regard to time. Hebrew thought was not; it regarded time and history as the framework for God’s action.


Many Christian theologians, not all of them liberal, began to see this, especially in the nineteenth century, and began to deconstruct the Greek influences in Christian theology insofar as they were extra-biblical and tended to de-personalize God.

Among the earliest to begin this deconstruction project were Richard Watson (Methodist), I. A. Dorner (Lutheran-Reformed), and even Karl Barth (Reformed). (Barth’s belief about God’s relation to time was complex and ultimately ambiguous, but there can be no denying that he incorporated time into God even if he called God’s time “divine temporality.”) (Many conservative theologians “blame” liberal Adolf von Harnack for this de-Hellenizing project, but he was not alone.)

And yet…

As Christian philosophers Nicholas Wolterstorff and Keith Ward (among others) have strenuously pointed out, a “timeless being” cannot interact with temporal beings. Such a being, freed entirely from the flow of time, cannot “act” in time at all; it cannot even know that “today is February 18, 2015.” Most importantly, as Christian philosopher Dallas Willard emphatically pointed out in The Divine Conspiracy, such a God cannot be affected by our prayers.

So how do we preserve God’s transcendence while denying God’s timeless eternity?

This has been answered many times by many philosophers and theologians. The answer is “creation as kenosis.” This is explained in many books but is especially clear in The Work of Love: Creation as Kenosis edited by theologian-scientist John Polkinghorne (Eerdmans, 2001). Polkinghorne’s own contribution to the book is best: “Kenotic Creation and Divine Action.” The point is that God’s entry into time with us (T. F. Torrance in Space, Time, and Incarnation) is voluntary.

Why would God do this? Out of love for the sake of having real relationships with temporal creatures made in his own image and likeness. A “timeless” God who is not temporal cannot be affected by anything temporal beings do. The great medieval scholastic theologians knew this and described all God’s relations with temporal creatures as external, not internal. In other words, God remains unchanged, unaltered in every way [sic, divine impassibility]. They drew the natural, reasonable conclusion that an eternal God, one for whom our past, present and future are all “now,” must be immutable and even impassible in the strongest senses possible. He can act but not react. (Whether he can even act within time is strongly questioned by Wolterstorff and Ward and other Christian philosophers.)

God’s transcendence lies in the fact that he remains self-sufficient in his eternal nature and character; time does not erode or add to God’s being. And the voluntarily temporal God remains omnipotent and everlasting. Time does not limit or threaten his being or power.

The conclusion of many modern Christian thinkers (and here I am excluding process theologians), such as Nelson Pike, whose 1970 book God and Timelessness is a devastating logical critique of the traditional view, is that the classical view of God’s eternity (“outside of time,” “eternal now-ness”) is pure philosophical-theological speculation, unrelated to the God of the Bible and alien to any religion that values an interactive God. It tends to make God remote, uninvolved, inaccessible and non-relational (except within himself).

---

I often wonder about the possible connection between this classical view of God’s eternity (viz., “outside of time”) and many Christians’ view of prayer. When I ask my students if they have heard the saying that “Prayer doesn’t change things; it changes me,” almost all confess that they heard it in church or home or somewhere among Christians. That prayer changes me is not contested; that prayer can change the mind of God is clearly biblical. To say otherwise is to reduce numerous passages of Scripture to mere imagery, “anthropomorphism,” and guts petitionary prayer [reducing it to a meaningless, worrisome task]. I suspect that the idea of God as “outside of time,” “eternal now,” “non-temporal,” contributes to and supports that false idea of prayer.

---

So why did this traditional view of God’s eternity (“timelessness”) invade Christian theology which claims to be based on the Bible or at least governed by the Bible? I suspect the Greek “logic of perfection” contributed to it—the idea that God is perfect, time is imperfect, therefore God must be timeless.

A more biblical way of looking at the subject would run: God is perfect; love is perfect; love between God and created beings requires freedom-in-and-for relationship; freedom-in-and-for relationship (between God and created beings) requires time; therefore it is more perfect for God to enter into time than to remain outside time.

---

At this point in a conversation about this subject many Christians will ask “Why can’t God enter into time and be outside of time also?” But think about that language. How can a non-temporal being “enter into time” while also remaining outside of time? If what has been said is true, then only “part of God” would be in real relationship with free creatures. There would still be “part of God” for whom the future has already happened. If the future has already happened at all, for anyone, even for a “part of God,” then history’s unfolding including our ability to affect God (e.g., with prayer) is a charade. The idea that this problem can be solved by positing that God both enters into time and remains outside of time is simply another way of saying God is timeless. It doesn’t solve anything.

But more importantly, the logic of the Bible, the flow of the biblical narrative, God’s story with us, never says or even hints that God is “outside of time.” It only says that God is everlasting and hints (very strongly) that God is the Lord of time. All that the biblical narrative requires is that God be omnipotent within time and able to guarantee his victory over whatever forces oppose him.

If there is some aspect of God that remains above or outside of time, I do not know how to speak of that, how even to imagine that, how to make that consistent with God within time (and God in real relation with creatures in time). Therefore, I leave that aside, simply lay it down, and ignore it. It doesn’t function in a truly biblical theology except as a metaphysical compliment some people feel compelled to pay to God. But I don’t feel so compelled by the Bible or by reason.

I realize that the temporal view of God goes against the bulk of Christian tradition, but the traditional, classical view is derived from alien [(non-biblical, non-Jewish)] sources. It is a foreign body within biblical theology. “The God of the philosophers is not the God of Abraham, Isaac and Jacob.” And it can have negative consequences for Christian practices such as prayer.

Theologian Robert Jenson is widely considered one of the most astute contemporary Christian theologians. Many of his books (such as God after God: The God of the Past and the Future as Seen in the Work of Karl Barth [1969] and The Triune Identity: God According to the Gospel [1982]) strongly point toward a temporal view of God. Page 4 of The Triune Identity powerfully expresses what I (and other non-process theologians who embrace a temporal view of God) believe:

“God may be God because in him all that will be is already realized,
so that the novelties of the future are only apparent and its threats
therefore not overwhelming. Or God may be God because in him
all that has been is opened to transformation, so that the guilts of
the past and immobilities of the present are rightly to be
interpreted as opportunities of creation.”

- Robert Jenson


* * * * * * * * *


Theology and Speculation

by Roger Olson
February 17, 2015

The issue here is the legitimacy of speculation in theology. What is speculation? In this context, if not in all contexts, “speculation” is making truth claims without clear warrant—reasonable grounding in relevant data. In theology “relevant data” are revelation/Scripture, tradition, reason (logic) and experience. “Experience” is intersubjective experience, not private experience.

Some years ago I came to the conclusion that much Christian theology, truth claims made by Christian theologians, is speculation—as opposed to clear exegesis of Scripture and tradition using reason and experience as guidance mechanisms and tools of interpretation.

Now I need to give examples. It seems to me that calling the Holy Spirit “the bond of love between Father and Son” in the immanent Trinity (like much talk about the immanent Trinity) is speculation. And yet, probably due to the influence of Augustine (De Trinitate) it has become commonplace in especially Western Christian theology.

Another example is theories of atonement: how exactly Christ’s death on the cross made possible human salvation. All theories of the atonement seem speculative to me.

Of course, my examples reveal that I think much “tradition” is itself speculative compared with Scripture itself. Most Protestants, anyway, claim to believe that Scripture trumps tradition and where tradition departs from Scripture it is to be held more lightly and with less authority.

Over the years I have gotten in the habit of reading every book of theology with a kind of critical principle about speculation: “Is this truth claim made by this theologian actually warranted by revelation/Scripture?”

I would guess that as much as half or more of all that I read in books of theology is speculation.

Now, having said all that, let me also say that speculation is not always bad. The human mind seeks answers and it is natural for theologians to guess at possible answers when revelation/Scripture is not clear. I don’t blame theologians for attempting to answer the question “Why did Jesus have to die in order for us to be saved?” but I recognize much of most theological answers as speculation.

Some years ago I was at a banquet honoring a world-class theologian’s sixtieth birthday. During his talk after the toasts he reminisced about his life and career and specifically referred to a falling-out he had with another theologian who was present. He said directly to the other theologian, “Let’s all just remember: It’s only guesswork anyway.”

Naturally I was a bit shocked as I had spent much time and effort studying that theologian’s work. And yet I was somewhat relieved at the same time because I had come to suspect that much of his theology was, indeed, “guesswork” or what I am calling here speculation.

Again, to me that does not nullify the value of that theologian’s contributions to theology. I only wish he and others labeled their prescriptive truth claims “speculation” rather than putting them forward as “truth.”

As I said, speculation is not bad or wrong, but it should not carry the weight, authority, that truth grounded directly in revelation/Scripture carries. And yet what often happens is people adopt an entire theological project as truth and are reluctant or unwilling to recognize much of it as speculation when, in fact, that is what it is.

Theologians (and others) should be more open and transparent about this. They should label their truth claims with degrees of speculation—from that which is closer to the data (“justified speculation”) to that which is farther from the data (“guesswork only”). My own reading of Christian theology has led me to believe that most theologians (other than fundamentalists) actually know these differences between truth claims but, for whatever reasons, are reluctant to label them.

What difference does all this make? Well, it should be obvious to people involved in theological work. It is very common for graduate students, for example, to latch on to a theologian and his or her system and treat it as truth itself. Arguments break out and people are demeaned, insulted, marginalized, even excluded because they don’t buy into the “truth system” that, in fact, is half or more speculation.

The twentieth century saw a renaissance of the doctrine of the Trinity in Christian theology—from Barth to Boff, theologians (almost) all piled onto the Trinity bandwagon, offering their own contributions. I have read much of that literature and concluded that at least half of what is said about the Trinity in those numerous tomes of modern theology is speculation. Some of it I would consider warranted speculation but much of it is sheer guesswork. Who can really know the inner workings of the Trinity?

Now don’t get me wrong; I love theological speculation. I’m sure I do it! But a greater degree of humility ought to accompany speculation in theology (than is usually the case).

A case in point is the debates about the so-called “decrees of God”: supralapsarians, infralapsarians, Amyraldians, Arminians. So much of what went on in Protestant scholasticism was sheer speculation, and yet it led to numerous divisions among Christians. Another case is, as already mentioned, theories of the atonement. The Bible overflows with images and metaphors. Attempting to develop a model that explains why Jesus had to die is natural, but elevating one model to dogma or rejecting others as wrong amounts to elevating speculation to timeless truth.

There are minds attracted to speculation. When a theological question seems important to someone (or a group) and yet revelation/Scripture is not as clear as they wish it were (often the case) they often look around for a “Bible teacher” or theologian who “has the answer” and then latch onto that person as if he or she thought God’s thoughts after him. They then treat that Bible teacher’s or theologian’s “teachings” as God’s truth itself. They do not want to hear “This is my opinion,” or “This is how it seems to me.” So often the favored Bible teacher or theologian treats his or her doctrinal formulations just like the proverbial fundamentalist preacher who pounds the pulpit saying “The Bible saaaayyyys…” when, in fact, the Bible doesn’t say that at all!

These people have trouble distinguishing between revelation/Scripture and a theologian’s speculative answers to their pressing questions. I have seen this happen with students (and theologians) enamored with the theology of Karl Barth; they become upset when anyone questions Barth’s veracity on a subject. “That’s just Barth’s speculation” invites a glare and an argument—as if Barth’s Church Dogmatics were dropped down out of heaven. I have seen the same happen with people devoted to Paul Tillich’s Systematic Theology.

In fact, both Barth and Tillich were prone to speculation. I suspect they knew it. But too often they did not couch their truth claims in such a way as to indicate “this is speculation.”

I favor theological works that admit speculation is just that—speculation. Unfortunately, they are too few.


Exploring Evolution Series - Mankind Enters a New Anthropocene Era



This article is from the In-Depth Report 400 PPM: What's Next for a Warming Planet


CO2 Levels for February Eclipsed Prehistoric Highs
http://www.scientificamerican.com/article/co2-levels-for-february-eclipsed-prehistoric-highs/?utm_source=nextdraft&utm_medium=email

Global warming is headed back to the future as the CO2 level reaches a new high

March 5, 2015 |By David Biello
More and more carbon dioxide molecules are accumulating in Earth's atmosphere. | Astronaut photograph from the International Space Station, courtesy of NASA.

February is one of the first months since before months had names to boast carbon dioxide concentrations at 400 parts per million.* Such CO2 concentrations in the atmosphere have likely not been seen since at least the end of the Oligocene 23 million years ago, an 11-million-year-long epoch of gradual climate cooling that most likely saw CO2 concentrations drop from more than 1,000 ppm. Those of us alive today breathe air never tasted by any of our ancestors in the entire Homo genus.

Homo sapiens sapiens—that’s us—has subsisted for at least 200,000 years on a planet that has oscillated between 170 and 280 ppm, according to records preserved in air bubbles trapped in ice. Now our species has burned enough fossil fuels and cut down enough trees to push CO2 to 400 ppm—and soon beyond. Concentrations rise by more than two ppm per year now. Raising atmospheric concentrations of CO2 to 0.04 percent may not seem like much but it has been enough to raise the world's annual average temperature by a total of 0.8 degree Celsius so far. More warming is in store, thanks to the lag between CO2 emissions and the extra heat each molecule will trap over time, an ever-thickening blanket wrapped around the planet in effect. Partially as a result of this atmospheric change, scientists have proposed that the world has entered a new geologic epoch, dubbed the Anthropocene and marked by this climate shift, among other indicators.
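The figures quoted above can be sanity-checked with a little arithmetic. Here is a minimal Python sketch using only the numbers in the article (400 ppm today, a 450 ppm goal, growth of roughly 2 ppm per year); the variable names are my own, not anything from the source:

```python
# Back-of-envelope check of the article's CO2 figures (illustrative only).

current_ppm = 400.0    # concentration reported for February 2015
target_ppm = 450.0     # level broadly linked to ~2 degrees C of warming
growth_per_year = 2.0  # "more than two ppm per year"

# 400 ppm expressed as a percentage of the atmosphere:
percent = current_ppm / 1_000_000 * 100
print(f"{current_ppm:.0f} ppm = {percent:.2f}% of the atmosphere")

# Rough years remaining before the 450 ppm threshold at the current rate
# (an upper bound, since the growth rate itself has been increasing):
years_to_target = (target_ppm - current_ppm) / growth_per_year
print(f"~{years_to_target:.0f} years to {target_ppm:.0f} ppm")
```

This is how 400 ppm becomes the "0.04 percent" in the paragraph above, and why the 450 ppm goal looked only a few decades away at 2015 emission rates.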



We aren't done yet. Greater concentrations will be achieved, thanks to all the existing coal-fired power plants, the more than a billion cars powered by internal combustion on the roads today, and yet more clearing of forests. That's despite an avowed goal to stop at 450 ppm, the number broadly (if infirmly) linked to an average temperature rise of no more than 2 degrees C. More likely, by century's end enough CO2 will have been spewed from burning long-buried stores of fossilized sunshine to raise concentrations to 550 ppm or more, enough to raise average annual temperatures by as much as 6 degrees C in the same span. That may be more climate change than human civilization can handle, along with many of the other animals and plants living on Earth, already stressed by other human encroachments. The planet will be fine though; scientists have surmised from long-term records in rock that Earth has seen levels beyond 1,000 ppm in the past.

The current high levels of CO2 have spurred calls, most recently from the National Academy of Sciences, to develop technologies to retrieve carbon from the atmosphere. The U.N. Intergovernmental Panel on Climate Change relies for that on growing plants, burning them instead of coal to produce electricity, capturing the resulting CO2 in the smokestack and burying it—or in the argot: BECCS, bioenergy with carbon capture and storage, a few examples of which are scattered around the globe. Other schemes range from artificial trees to scour the skies of excess CO2 to fertilizing the oceans with iron and having diatoms do the invisible work for us.

Climate change is inevitable and, if history is any guide to what can be expected, so, too, may be regime change. A few years of diminished rainfall and attendant bad harvests have been enough in the past to fell empires, such as in Mesopotamia or China. The world's current roster of nations struggles to hash out a global plan to cut the pollution that causes climate change, which currently stands at 90 pages of negotiating text. In addition, one nation has submitted its individual plan (or "individual nationally determined contribution," INDC in the argot) to accomplish this feat—Switzerland.

The plans of China, the European Union and the U.S. are already broadly known, if not formally submitted. Together, they are both the biggest steps ever taken to address global warming and likely insufficient to prevent too much climate change, scientific analyses suggest. The E.U., U.S. and China remain reliant on fossil fuels and the world is slow to change that habit thus far. In fact, China has become the world's largest polluter and millions of Chinese have lifted themselves out of poverty with the power from burning more and more coal, a trick India hopes to follow in the near future.

For the Swiss, the bulk of pollution comes from driving cars and controlling the climate inside buildings. Their long-term plan is "to reduce per capita emissions to one–1.5 tonnes CO2-equivalent," the INDC states. "These unavoidable emissions will have to be eventually compensated through sinks or removals." In a world that spews more and more CO2 but needs to get to below zero emissions, bring on those sinks and removals. In the meantime the sawtooth record of rising atmospheric CO2 levels moves ever upward and March 2015 will likely be the name of the next month to boast levels above 400 ppm.

* * * * * * * * * * * *

Mass Deaths in Americas Start New CO2 Epoch

http://www.scientificamerican.com/article/mass-deaths-in-americas-start-new-co2-epoch/?WT.mc_id=SA_Facebook

A new proposal pegs the start of the Anthropocene to the Little Ice Age and the Columbian Exchange

Mass deaths after Europeans reached the Americas may have allowed forests to regrow, reducing atmospheric concentrations of carbon dioxide and kicking off a proposed new Anthropocene geologic epoch. | Courtesy of NASA
"Placing the Anthropocene at this time highlights the idea that colonialism, global trade and the desire for wealth and profits began driving Earth towards a new state," argues ecologist Simon Lewis of Leeds University and the University College of London. "We are a geological force of nature, but that power is unlike any other force of nature in that it is reflexive, and can be used, withdrawn or modified."

Lewis and Mark Maslin, a geologist at UCL, dub the decrease in atmospheric carbon dioxide the "Orbis spike," from the Latin for world, because after 1492 human civilization has progressively globalized. They make the case that human impacts on the planet have been dramatic enough to warrant formal recognition of the Anthropocene epoch and that the Orbis spike should serve as the marker of the start of this new epoch in a paper published in Nature on March 12. (Scientific American is part of Nature Publishing Group.)


The Anthropocene is not a new idea. As far back as the 18th century, the first scientific attempt to lay out a chronology of Earth's geologic history ended with a human epoch. By the 19th century, the idea was commonplace, appearing as the Anthropozoic ("human life rocks") or the "Era of Man" in geology textbooks. But by the middle of the 20th century, the idea of the Holocene—a word which means "entirely recent" in Greek and designates the most recent period in which the great glacial ice sheets receded—had come to dominate, and incorporated the idea of humans as an important element of the current epoch but not the defining one.

That idea is no longer sufficient, according to scientists ranging from geologists to climatologists. Human impacts have simply grown too large, whether it's the flood of nitrogen released into the world by the invention of the so-called Haber-Bosch process for wresting the vital nutrient from the air or the fact that civilization now moves more earth and stone than all the world's rivers put together.

Researchers have advanced an array of proposals for when this putative new epoch might have begun. Some link it to the start of the mass extinction of large mammals such as woolly mammoths and giant kangaroos some 50,000 years ago or the advent of agriculture around 10,000 years ago. Others say the Anthropocene is more recent, tied to the beginning of the uptick of atmospheric CO2 concentrations after the invention of an effective coal-burning steam engine.

The most prominent current proposal connects the dawn of the Anthropocene to that of the nuclear age—long-lived radionuclides leave a long-lived record in the rock. The boom in human population and consumption of everything from copper to corn after 1950 or so, known as the "Great Acceleration," roughly coincides with this nuclear marker, as does the advent of plastics and other remnants of industrial society, dubbed technofossils by Jan Zalasiewicz of the University of Leicester, the geologist in charge of the group that is advocating for incorporating the Anthropocene into the geologic time scale. The radionuclides can then serve as what geologists call a Global Stratotype Section and Point (GSSP), more commonly known as a "golden spike." Perhaps the most famous such golden spike is the thin layer of iridium found in rock exposed near El Kef, Tunisia, that tells of the asteroid impact that ended the reign of the dinosaurs and thus marked the end of the Cretaceous Period about 66 million years ago.



Lewis and Maslin reject this radionuclide spike because it is not tied to a "world-changing event," at least not yet, though it is a clear signal in the rock. On the other hand, their Orbis spike in 1610 reflects both the most recent CO2 nadir as well as the redistribution of plants and animals around the world around that time, a literal changing of the world.

Much like the golden spike that marks the end of the dinosaurs, the proposed Orbis spike itself would be tied to the low point of atmospheric CO2 concentrations around 1610, as recorded in ice cores, where tiny trapped bubbles betray past atmospheres. Further geologic evidence will come from the appearance of corn pollen in sediment cores taken in Europe and Asia at that time, among other indicators that will complement the CO2 record. Therefore, scientists looking at ice cores, mud or even rock will find this epochal shift in the future.



The CO2 drop coincides with what climatologists call the Little Ice Age. That cooling event may have been tied to regenerated forests and other plants growing on some 50 million hectares of land abandoned by humans after the mass death brought on by disease and warfare, Lewis and Maslin suggest. And it wasn't just the death of millions of Americans, as many as three-quarters of the entire population of two continents. The enslavement (or death) of as many as 28 million Africans for labor in the new lands also may have added to the climate impact. The population of the regions of northwestern Africa most affected by the slave trade did not begin to recover until the end of the 19th century. In other words, from 1600 to 1900 or so swathes of that region may have been regrowing forest, enough to draw down CO2, just like the regrowth of the Amazon and the great North American woods, though this hypothesis remains in some dispute.

Whether in 1610, 1944 or 50,000 B.C., the new designation would mean we are living in a new Anthropocene epoch, part of the Quaternary Period, which started more than 2.5 million years ago with the advent of the cyclical growth and retreat of massive glaciers. The Quaternary is part of the Cenozoic, or "recent life," Era, which began 66 million years ago, which is, in turn, part of the Phanerozoic ("revealed life") Eon, which started 541 million years ago and encompasses all of complex life that has ever lived on this planet. In the end, the Anthropocene might supplant its old rival the Holocene. "It is only designated an epoch, when other interglacials are not, because back in the 18th century geologists thought humans were a very recent species, arriving via divine intervention or evolving on Earth in the Holocene," Lewis argues, but scientists now know Homo sapiens arose more than 200,000 years ago in the Pleistocene epoch. "Humans are a Pleistocene species, so the reason for calling the Holocene an epoch is a relic of the past."

Maslin suggests downgrading the Holocene to a stage within the Pleistocene, like other interglacial spans in the geologic record. But Zalasiewicz disagrees with this bid to get rid of the Holocene. "I don't see the need," he says. "Systematic tracing of a Holocene / Anthropocene boundary globally would be a very illuminating process in all sorts of ways."

The changes wrought by humans over the course of the last several centuries, if not longer, will echo in the future, whether in the form of transplanted species, like earthworms or cats, crop pollen in lake sediments or even entire fossilized cities. Still, whether the Anthropocene started tens, hundreds or thousands of years ago, it accounts for a minute fraction of Earth's history. And this new epoch could end quickly or endure through millennia, depending on the choices our species makes now. "Embracing the Anthropocene reverses 500 years of scientific discoveries that have made humans more and more insignificant," Maslin notes. "We argue that Homo sapiens are central to the future of the only place where life is known to exist."


A Theory of Consciousness - Network Theory Sheds New Light

Since I have spent some time redefining Genesis 1-3 in terms of human consciousness (sic, How God Created by Evolution: A Proposed Theory of Man's Evolutionary Development), I wish to continue uncovering scientific origin theory about how this "sense of the human self" developed and evolved: not simply biologically, as regards our physical being, but also sociologically, as primitive clans and tribes living and working together gave rise to a new "sense of self." Today's article continues this subject.





Which came first - human consciousness or eusociality?
Or, put another way, which created which?

Edward O Wilson





http://relevancy22.blogspot.com/2013/09/index-science-religion.html





* * * * * * * * * * * * *



The black dots correspond to the 264 areas of the cerebral cortex that the researchers probed, and the lines correspond to the increased strength of the functional connections between each of these brain areas when subjects consciously perceive the target. The "hotter" colors are associated with stronger connections. This figure illustrates that awareness of the target corresponds to a widespread increase in the strength of functional connections (Credit: Marois / Godwin).

Network theory sheds new light on
origins of consciousness
http://medicalxpress.com/news/2015-03-network-theory-consciousness.html

by Melanie Moran
March 11, 2015


Where in your brain do you exist? Is your awareness of the world around you and of yourself as an individual the result of specific, focused changes in your brain, or does that awareness come from a broad network of neural activity? How does your brain produce awareness?

Vanderbilt University researchers took a significant step toward answering these longstanding questions with a recent brain imaging study, in which they discovered global changes in how brain areas communicate with one another during awareness. Their findings, which were published March 9, 2015, in the Proceedings of the National Academy of Sciences, challenge previous theories that hypothesized much more restricted changes were responsible for producing awareness.

"Identifying the fingerprints of consciousness in humans would be a significant advancement for basic and medical research, let alone its philosophical implications on the underpinnings of the human experience," said René Marois, professor and chair of psychology at Vanderbilt University and senior author of the study. "Many of the cognitive deficits observed in various neurological diseases may ultimately stem from changes in how information is communicated throughout the brain."

Using graph theory, a branch of mathematics concerned with explaining the interactive links between members of a complex network, such as social networks or flight routes, the researchers aimed to characterize how connections between the various parts of the brain were related to awareness.

"With graph theory, one can ask questions about how efficiently the transportation networks in the United States and Europe are connected via transportation hubs like LaGuardia Airport in New York," Douglass Godwin, graduate student and lead author on the research, said. "We can ask those same questions about brain networks and hubs of neural communication."

Modern theories of the neural basis of consciousness fall generally into two camps: focal and global. Focal theories contend there are specific areas of the brain that are critical for generating consciousness, while global theories argue consciousness arises from large-scale brain changes in activity. This study applied graph theory analysis to adjudicate between these theories.

The researchers recruited 24 members of the university community to participate in a functional magnetic resonance imaging (fMRI) experiment. While in the fMRI scanner, participants were asked to detect a disk that was briefly flashed on a screen. In each trial, participants responded whether they were able to detect the target disk and how much confidence they had in their answer. Experimenters then compared the results of the high-confidence trials during which the target was detected to the trials when it was missed by participants. These were treated as "aware" and "unaware" trials, respectively.

Comparison of aware and unaware trials using conventional fMRI analyses that assess the amplitude of brain activity showed a pattern of results typical of similar studies, with only a few areas of the brain showing more activity during detection of the target than when participants missed seeing it. The present study, however, was interested not simply in what regions might be more activated with awareness, but how they communicate with one another.

Unlike the focal results seen using more conventional analysis methods, the results via this network approach pointed toward a different conclusion. No one area or network of areas of the brain stood out as particularly more connected during awareness of the target; the whole brain appeared to become functionally more connected following reports of awareness.
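The "whole brain becomes more connected" finding can be sketched as a simple comparison: for each condition, average the off-diagonal entries of a region-by-region functional-connectivity matrix and see which condition's average is higher. The matrices below are made-up numbers for illustration only, not the study's measurements:

```python
# Toy sketch of a global-connectivity comparison (hypothetical values,
# not the study's data). Each matrix entry [i][j] is the functional
# connection strength between brain regions i and j.
def mean_connectivity(matrix):
    """Mean of the off-diagonal entries of a square connectivity matrix."""
    n = len(matrix)
    total = sum(matrix[i][j] for i in range(n) for j in range(n) if i != j)
    return total / (n * (n - 1))

aware   = [[1.0, 0.6, 0.5],
           [0.6, 1.0, 0.7],
           [0.5, 0.7, 1.0]]
unaware = [[1.0, 0.2, 0.3],
           [0.2, 1.0, 0.1],
           [0.3, 0.1, 1.0]]

print(mean_connectivity(aware) > mean_connectivity(unaware))  # prints: True
```

In the toy case the "aware" matrix averages 0.6 against 0.2 for "unaware": connectivity rises everywhere rather than in one privileged region, which is the signature the researchers report.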

"We know there are numerous brain networks that control distinct cognitive functions such as attention, language and control, with each node of a network densely interconnected with other nodes of the same network, but not with other networks," Marois said. "Consciousness appears to break down the modularity of these networks, as we observed a broad increase in functional connectivity between these networks with awareness."

The research suggests that consciousness is likely a product of this widespread communication, and that we can only report things that we have seen once they are being represented in the brain in this manner. Thus, no one part of the brain is truly the "seat of the soul," as René Descartes once wrote in a hypothesis about the pineal gland, but rather, consciousness appears to be an emergent property of how information that needs to be acted upon gets propagated throughout the brain.

"We take for granted how unified our experience of the world is. We don't experience separate visual and auditory worlds, it's all integrated into a single conscious experience," Godwin said. "This widespread cross-network communication makes sense as a mechanism by which consciousness gets integrated into that singular world."