Co-authored with P. Murali Doraiswamy, M.D.; Rudolph E. Tanzi, Ph.D.; and Menas Kafatos, Ph.D.
It would be reassuring to most people to discover that the universe is constructed to favor life. If the human race isn't a freakish outcome of highly improbable chance events, we have every right to see the universe as our home. But this psychological reassurance strikes physicists and biologists as wishful thinking; the bulwark of modern science, from the most minuscule events at the quantum scale to the Big Bang itself, is the assumption that creation is random, without guidance, plan, mind or purpose.
Only very slowly has such a blanket view been challenged, but these new challenges are among the most exciting possibilities in science. We'd like to outline the argument for a "human universe" with an eye to understanding why the human race exists. This question is too central to be left to a small cadre of professional cosmologists and evolutionary biologists; everyone has a personal stake in it.
The most accepted theory of the large-scale structure of the universe is Big Bang cosmology, which has achieved impressive results. Yet when you try to model the universe, you can't escape the problems surrounding what seems like a simple act: observing it. Measuring the cosmos is intricately interwoven with limits imposed by the process of observation itself. As you go back in time or ahead into the future, as you reach so far into space that light takes billions of years to reach Earth, any possible model encounters horizons of knowledge at some ultimate, faint observational limit. Beyond such a horizon, observation is blocked, and so are physics, mathematics and the human mind.
For example, with the Big Bang theory, light cannot be used to observe further back in time or across immense distances to arrive close to the very beginning itself. The first instant of the Big Bang remains forever hidden from the present. Knowledge about the early universe has to be inferred. We can examine the parts that scattered after the Big Bang, but we cannot grasp the whole. Thus, our observational limitations prevent us from verifying cosmological theories to arbitrary accuracy by any observational test. So the Hubble telescope, marvelous as it is for sending back photos of distant galaxies, can't reveal reality independent of cosmological theory. Theory cannot be verified with complete certainty, which means that important topics like the expansion of the universe and the evolution of galaxies are our own mental constructs; they reflect who we are as observers, not independent reality.
Fine Tuning in Cosmology
What science can see and infer about space and time is certainly fascinating. We want to touch upon the inexplicable fact that the cosmos fits together with the smallest and largest aspects fine-tuned beyond anything that pure chance can explain. This fine tuning is discussed mathematically, in a language beyond the reach of non-scientists. Yet as soon as anyone ventures to suggest a creation that departs from randomness, two bad things happen. Religionists leap into the breach with God, and in reaction scientists become hotly defensive. We aren't out to add to either of these bad things. But we can't ignore the human implications of what we're about to discuss, because God and science will both be forced to take new shapes.
The most basic aspect of fine tuning is the consistency of the cosmos, which is the smoothest of cream soups compared with lumpy oatmeal. The universe we observe is essentially flat, which has given rise to the "flatness problem." Being nearly flat today, the universe must have been exactly flat close to the time of the Big Bang itself, to one part in 10^50 (a 1 followed by 50 zeros, an unimaginably vast number). Why? The usual interpretation, proposed in the '80s, is that early on, the universe was in an inflationary state, washing out any departures from flatness on extremely short time scales of 10^-35 seconds. (Imagine one of those whirling paintings sold at carnivals, with the colors swirling outward with incredible force; not a single drop would leap up off the paper.) In more general terms, it would appear that the universe followed the simplest possible theoretical construct (flatness) in its large-scale geometry.
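The reasoning behind that "one part in 10^50" can be made concrete. In simplified cosmological models, any deviation from flatness grows with time, so a modest tolerance on flatness today translates into an extraordinarily tight one near the beginning. The sketch below uses illustrative numbers (a 1 percent tolerance today and a growth factor of 10^48), not a full cosmological calculation:

```python
# Illustrative sketch of the flatness problem (not a full cosmological model).
# Assumption: in simplified models, the deviation from flatness |Omega - 1|
# grows with time. Then near-flatness today forces an extreme bound early on.

deviation_today = 0.01   # |Omega - 1| at most ~1 percent today (illustrative)
growth_factor = 1e48     # illustrative growth of the deviation since early times

# Required deviation at the early time, under the growth assumption:
deviation_early = deviation_today / growth_factor
print(f"required early deviation from flatness: {deviation_early:.0e}")  # 1e-50
```

The point of the sketch is only the arithmetic: dividing a small tolerance by a huge growth factor yields the famously tiny allowed deviation of one part in 10^50.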
The inflationary model was developed to account for the flatness of the universe and also supposedly solves the "horizon problem." That problem arose because, looking in all directions, the universe is remarkably homogeneous, as reflected in the microwave background radiation that fills it; the temperature of this radiation is constant if one looks at different parts of the sky, to 1 part in 10^6. Such consistency isn't easy to explain. Observations indicate that the background radiation filling all space was emitted around 100,000 years after the beginning, meaning that opposite sides of the sky at that time were separated by approximately 10 million light years. How could two opposite parts of the sky be so similar to each other if information had no chance to get from one to the other? Imagine a hot pancake fresh off the griddle that you tear into pieces and fling into the air. One hundred thousand years later, all the pieces have the same temperature as one another, even though they never came into contact again; this is like the horizon problem.
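The mismatch in the horizon problem follows from simple light-travel arithmetic. Using the figures quoted above (radiation emitted about 100,000 years after the beginning, opposite patches about 10 million light years apart at that time), a signal moving at light speed could have crossed only a small fraction of that separation. This back-of-the-envelope sketch ignores cosmic expansion, which only deepens the puzzle:

```python
# Horizon-problem arithmetic, using the figures quoted in the text.
# Working in light years: light travels 1 light year per year by definition.

t_emission_years = 100_000   # years after the beginning when the radiation was emitted
separation_ly = 10_000_000   # separation of opposite patches of sky at that time

# Maximum distance a light signal could have traveled since the beginning:
horizon_ly = t_emission_years * 1   # 1 light year per year

# Fraction of the separation that was causally reachable:
fraction = horizon_ly / separation_ly
print(f"light-travel horizon: {horizon_ly:,} light years")
print(f"patch separation:    {separation_ly:,} light years")
print(f"causally reachable fraction: {fraction:.2%}")  # 1.00%
```

On these numbers, only about 1 percent of the distance between opposite patches could have been bridged by any signal, yet their temperatures match almost perfectly.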
Yet the biggest fine tuning is the value of the so-called cosmological constant, introduced by Einstein as part of general relativity to keep the universe stable so that it would not collapse back on itself. The idea was proposed before Edwin Hubble discovered the expansion of the universe, which revealed a dynamic cosmos with no need for stabilization. Einstein later called the cosmological constant his biggest blunder. Today we don't believe it was a blunder. The cosmological constant is a value that describes the density of energy everywhere in empty space. Such constants, like gravity and the speed of light, are necessary for the mathematical computations of physics to work. In this case we are talking about the dark energy in empty space that cannot be seen.
In recent decades the cosmological constant has been reintroduced because current observations seem to indicate that the universe not only is expanding but is also accelerating in its expansion. The standard model of particle interactions predicts a value that is 10^122 times larger than the actual observed value. Had the value been what standard particle theory predicts, the universe could not exist in its present form. This is known as the "cosmological constant problem."
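The size of that mismatch can be checked with order-of-magnitude figures commonly cited in the literature: the observed vacuum (dark) energy density is of order 10^-9 joules per cubic meter, while naive quantum-field-theory estimates come out around 10^113 joules per cubic meter. A minimal sketch, using these rounded figures:

```python
import math

# The "cosmological constant problem" expressed as a ratio, using
# order-of-magnitude figures (not precise measured values).

observed_density = 1e-9     # J/m^3, observed vacuum energy density (order of magnitude)
predicted_density = 1e113   # J/m^3, naive quantum-field-theory estimate (order of magnitude)

# How many orders of magnitude separate prediction from observation:
mismatch_orders = math.log10(predicted_density / observed_density)
print(f"prediction exceeds observation by about 10^{mismatch_orders:.0f}")  # 10^122
```

The 122-order-of-magnitude gap is often called the worst prediction in the history of physics, which is exactly the point the paragraph above is making.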
This last problem has shaken our confidence that we can rely upon observation in the normal sense if we want to grasp what the universe is. Regular matter (i.e., atoms and molecules) contributes 4 percent or less of the total density of the universe right now. Therefore, if one insists on exact flatness, one needs to introduce unknown forms of "dark matter" (around 25 percent) and "dark energy" (around 70 percent) to make a flat universe. Worse still for cosmologists, a non-zero cosmological constant requires unknown physics. The mathematical model for a flat universe is simple in its initial assumptions, but the underlying physics required to maintain it is complex and even unknown.
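The bookkeeping behind a flat universe is simple: the fractions of the total (critical) density must add up to roughly 1. Using the rounded percentages quoted above, a quick sketch:

```python
# The cosmic energy budget quoted in the text (rounded figures).
# A flat universe requires the components to sum to the full critical density.

budget = {
    "ordinary matter (atoms, molecules)": 0.04,  # ~4 percent, the part science can see
    "dark matter": 0.25,                         # ~25 percent, unknown form of matter
    "dark energy": 0.70,                         # ~70 percent, unknown physics
}

total = sum(budget.values())
for name, fraction in budget.items():
    print(f"  {name}: {fraction:.0%}")
print(f"total fraction of critical density: {total:.2f}")  # ~0.99, i.e. nearly flat
```

The arithmetic makes the dilemma vivid: the 4 percent we understand is dwarfed by the 95-plus percent that had to be postulated to keep the flat model consistent.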
Let's translate the dilemma into everyday terms. Only 4 percent of the universe, meaning all the stars, galaxies, planets, light, heat and interstellar dust, fits into science. We are perched as if on the cherry that tops an ice cream sundae, trying to make the whole dessert conform to being like a cherry, since that's the only world we know. But the universe refuses to be a cherry, and what it insists on being may be inconceivable, for unlike a bacterium that may have floated onto the top of an ice cream sundae through the air, we were born on the cherry, are made of its substance and can think only in terms of our small, specific surroundings.
Yet fine tuning has always lurked on the edges of standard physics, being ignored only because for a century, observation was triumphant, carrying theory along with it. You can do wonders with subatomic particles, relativity and quantum calculations before you have to worry about events that occurred over 13 billion years ago. The universe "as it presents itself" was good enough, as it has been for a long, long time. But the numbers are inescapable, and on every side they point to a universe that is fine-tuned at the smallest and largest levels. An ancient Indian proverb, "as is the large, so is the small," can be put in modern terms: "As is the microscopic, so is the macroscopic."
This similarity defies randomness. Pure chance is the clumsiest, most inelegant and least probable way to explain a fine-tuned cosmos, which means that it isn't good science. In an ironic twist, the numbers game of modern physics has revealed that the numbers match too well. It's like a bingo game where the machine spits out the same ball millions of times in a row. How can that be? More importantly, why does creation fit together so seamlessly? We'll look for plausible answers in the next post.
To be continued...
Deepak Chopra, M.D., is the author of more than 70 books, 21 of which were New York Times bestsellers. Murali Doraiswamy, M.D., is a professor of psychiatry at Duke University Medical Center in Durham, N.C., and a leading physician in the areas of mental health, cognitive neuroscience and mind-body medicine. Rudolph E. Tanzi, Ph.D., is the Joseph P. and Rose F. Kennedy Professor of Neurology at Harvard University and the director of the Genetics and Aging Research Unit at Massachusetts General Hospital (MGH). He is the co-author (with Deepak Chopra) of Super Brain: Unleashing the Explosive Power of Your Mind to Maximize Health, Happiness, and Spiritual Well-being (Harmony). Menas Kafatos, Ph.D., is the Fletcher Jones Endowed Professor in Computational Physics at Chapman University. He is the co-author (with Deepak Chopra) of the forthcoming book Who Made God (And Other Cosmic Riddles) (Harmony).
For more, visit deepakchopra.com.