Thursday, February 19, 2009

I Am a Hologram, and So Can You!

Paradigm shifts are rare in science, as anyone who has read Thomas Kuhn's The Structure of Scientific Revolutions knows. The authorities of the scientific establishment, working within a particular frame of reference, are often reluctant to eschew the accepted scientific dogma and adopt new modes of thinking. This is not entirely a bad thing, of course, since skepticism is the basis of science; and when theories explain the majority of the evidence, they deserve to be accepted. However, when the dogma becomes so entrenched that new information not conforming to the prevailing theories is simply tossed out, the schema has grown too rigid to accommodate dissenting ideas. A current trend in physical science (not all of it fringe) is the emerging concept of the universe as a hologram. Since Newton, scientific materialism has been the conceptual framework of physical science, envisioning the universe as a sort of enormous machine with interrelated parts, in which all natural phenomena can be reduced to physical laws of nature. In the observable universe, this paradigm has proved remarkably resilient in its explanatory power. But, as it turns out, at the very smallest level, this view of the universe explains essentially nothing, and quantum mechanics must be employed, to the great confusion of most scientists, since the reconciliation of these two theories is among the holy grails of particle physics. According to some scientists, it seems that the model of the universe as a hologram does in fact reconcile some of the discrepancies, not only between Newtonian physics and quantum mechanics, but also with string theory, the black hole information paradox, and, if the bolder assertions are true, even more extra-natural phenomena.

The universe as a hologram has recently found itself in the news because of the GEO600 gravitational wave detection experiment going on in the German countryside. The experiment was originally designed to detect the enormous gravitational waves sent out by very massive, dense bodies such as neutron stars and black holes. What the researchers didn't expect to find were traces of background noise that may inform physicists of the fundamental limits of space-time itself - where it ceases to behave like the smooth continuum that Einstein described in his theory of relativity and dissolves into a sort of grainy picture filled with little bits of information about our universe. From the New Scientist article:

According to Hogan, the holographic principle radically changes our picture of space-time. Theoretical physicists have long believed that quantum effects will cause space-time to convulse wildly on the tiniest scales. At this magnification, the fabric of space-time becomes grainy and is ultimately made of tiny units rather like pixels, but a hundred billion billion times smaller than a proton. This distance is known as the Planck length, a mere 10^-35 metres. The Planck length is far beyond the reach of any conceivable experiment, so nobody dared dream that the graininess of space-time might be discernable.

That is, not until Hogan realised that the holographic principle changes everything. If space-time is a grainy hologram, then you can think of the universe as a sphere whose outer surface is papered in Planck length-sized squares, each containing one bit of information. The holographic principle says that the amount of information papering the outside must match the number of bits contained inside the volume of the universe.
The researchers at GEO600 had no idea that what they were detecting had any real significance until Craig Hogan, a physicist at Fermilab in Batavia, Illinois, who had previously predicted that such noise exists, informed the research team that the noise implies we are all living in a giant hologram: that our entire universe and everything in it is a 3D reflection of what is a 2D reality - just as a hologram is a 2D surface appearing to be a 3D image. I am no physicist so I won't attempt a more in-depth explanation (the article linked above provides a pretty good one), but suffice it to say that if Hogan is right, all of the "information" of the universe is stored in Planck-length-sized bits on a distant 2D surface, and the universe we inhabit every day - from atoms and molecules to stars and galaxies - is a sort of "reflection" of those 2D bits, grainy at a scale much larger than the Planck length. This is useful for understanding the "wholeness" of the universe rather than simply the reductionist version of "leaves for the tree." The theory actually dates further back than the GEO600 experiment, to at least the mid-1970s, when work by Jacob Bekenstein and Stephen Hawking showed that black holes emit radiation that conveys no information about their interior. Once a black hole is gone, all of the physical information about its existence is gone with it. This violates the principle that physical information cannot be destroyed, a puzzle known as the black hole information paradox. According to the holographic universe theory, the information is actually contained in those bits of information on the sphere - the very noise that GEO600 may be detecting.
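As a very rough back-of-the-envelope sketch of the "sphere papered in Planck-sized squares" picture, one can count how many one-bit "pixels" would fit on a boundary the size of the observable universe. The radius and Planck-length figures below are my own round approximations, not values from the article:

```python
import math

# Rough figures (my own approximations, good to an order of magnitude):
PLANCK_LENGTH = 1.6e-35   # metres
R_UNIVERSE = 4.4e26       # metres, approximate radius of the observable universe

# One bit per Planck-length-sized square papering the boundary sphere:
area = 4 * math.pi * R_UNIVERSE**2
bits_on_surface = area / PLANCK_LENGTH**2

print(f"bits on the boundary ~ 10^{math.log10(bits_on_surface):.0f}")
```

This crude counting, which ignores the conventional factor of 1/4 in the black-hole entropy formula, lands around 10^124 bits - an absurdly large number, but finite, which is the whole point of the holographic bound.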

The researchers are quick to point out that this is not "proof" that such a model of the universe is accurate, but that the evidence that it should be taken seriously is mounting. The implications for the scientific community are, of course, staggering, but what of the metaphysical ones? Major discoveries in science are often accompanied by major philosophical reinterpretations of the universe - heliocentrism, evolutionary theory, the modern genetic synthesis, and relativity have all affected humanity's outlook on its place in the universe. What of this theory? After some research, I found it extraordinarily interesting that, even though evidence supporting the holographic theory has begun to accumulate only recently, the notion has been around among scientific and philosophical minds alike for several decades now.

The holographic model of the universe gained traction in the 1970s as the University of London's David Bohm, a protege of Einstein's who also worked with J. Robert Oppenheimer in the 1940s and 50s, became frustrated by modern theoretical physics' inability to explain natural phenomena encountered in quantum physics. It was also explored around the same time by Stanford neuropsychologist Karl Pribram, who believed that the prevailing views on human consciousness could not account for certain functionalities in human perception. If the holographic paradigm of the universe ever supplants the materialist one, it will be a prime example of multiple discovery within the memetic zeitgeist of the human scientific experience - all the more so because the discovery was made by two researchers working in different fields. Unless, of course, you believe some of the more outrageous claims about the theory: that the mind sciences and physics are merging into one field.

Prior to the GEO600 experiment, the most recent synthesis of the holographic universe was put on display in Michael Talbot's book The Holographic Universe, written in 1991. Talbot writes in the intellectual tradition of Fritjof Capra and Frank Tipler, in the sense that much of his work is characterized by an attempt to reconcile various physical phenomena with supernatural mysteries and mystical religious experiences. Capra, in his Tao of Physics, wrote of the striking similarities between what modern particle physics revealed about the nature of the cosmos and the mystical veins in Eastern religions, especially Buddhism, Hinduism, and Taoism. Tipler believes that the Omega Point theory, which posits a quantum singularity at the end of the universe, implies that life after death and the immortality of the conscious mind are realities. In fact, Tipler says in the introduction to his book The Physics of Immortality that "theology is a branch of physics, that physics can infer by calculation the existence of God, and the likelihood of the resurrection of the dead at the end of time..." What all of these theories have in common, of course, is the unique human penchant for inserting ourselves into the cosmos when most of the scientific evidence points the other way: the universe is a cold, unknowing place, a place of Aristotle's unmoved mover, where physical laws operate as if the universe were a giant machine and all the particles cogs. Quantum physics shows us, though, that merely the act of observing particles can change their behavior, so removing conscious perception from the equation is equally erroneous, since we too are a part of the universe. So, is a middle ground possible? There is certainly a great deal of evidence not explained by the materialistic view of the universe. How can electrons thousands of light years away from each other affect one another?
Alain Aspect's 1982 experiment seems to defy Einstein's theory of relativity by showing that "communication" between particles travels faster than the speed of light. From another article on the holographic principle:

Aspect's experiment is related to the EPR experiment, a thought experiment devised by Albert Einstein and his colleagues Podolsky and Rosen in order to challenge Quantum Mechanics on the grounds that its predictions appeared to contradict Special Relativity.

Aspect and his team discovered that under certain circumstances subatomic particles such as electrons are able to instantaneously communicate with each other regardless of the distance separating them. It doesn't matter whether they are 10 feet or 10 billion miles apart.

Somehow each particle always seems to know what the other is doing. The problem with this feat is that it violates Einstein's long-held tenet that no communication can travel faster than the speed of light. Since traveling faster than the speed of light is tantamount to breaking the time barrier, this daunting prospect has caused some physicists to try to come up with elaborate ways to explain away Aspect's findings. But it has inspired others to offer even more radical explanations.

University of London physicist David Bohm, for example, believes Aspect's findings imply that objective reality does not exist, that despite its apparent solidity the universe is at heart a phantasm, a gigantic and splendidly detailed hologram.

The article goes on to give the example, within the framework of the holographic principle, of watching a fish swimming in a fish tank on two television screens, each showing a slightly different angle. One would assume, if the screens were the only information available, that there were two separate fish, but once closer observation revealed their corresponding movements, one would realize the two images were of the same fish. Perhaps these corresponding electrons are not "communicating" but are actually the same electron, separated by our perception of reality as three dimensional. How can human perception be reconciled with physical reality? Is there a true bridge between mind and matter, or is human consciousness some cosmic accident? According to Talbot at least, the holographic principle of the universe seems to answer, or at least answers better than other theories, many of these questions. From perusing just a few of the many articles on the web about the holographic principle, I've found an extraordinary list of phenomena it purports to explain:

- Like Fritjof Capra's, this theory is one of "interconnectedness," and at its foundation the less conservative proponents appear to believe that it will inevitably lead to a new field of science combining neurobiology, psychology, cognitive science, physics, cosmology, astronomy, and possibly even theology.

- String theory and the black hole information paradox will be reconciled and the information from the black holes will be encoded on the 2D "pixels" on the spherical universe (which apparently conforms to the number of dimensions predicted by string theory....I honestly have no idea what this means).

- Consciousness can be said to be a product of the universe that humans have somehow accessed, rather than a material phenomenon localized entirely within the brain (going even further, this implies that the human mind can manipulate matter, opening the door for all sorts of science fictiony things like telekinesis, telepathy, etc.)

- Objective physical reality is an illusion, created by these pixels on the outer "surface" of the universe, conforming to the Hindu concept of reality as Maya, an illusion produced by the self.

- Synchronicity, the occurrence of meaningful coincidences (in time), could be explained as a product of the conscious mind, similar to the correspondence between subatomic particles at various distances (in space).

OK. So all of this sounds a little ridiculous. And it sort of is. To be fair, most of the scientists currently working within the new paradigm don't subscribe to the outlandish claims made by some. This is simply another theory that fits the physical findings, and no philosophical claims need be made until more information is available. This could merely be the most recent incarnation of the scientifico-religious compromise that has been the norm for centuries, from the early Christian attempt to reconcile Biblical dogma with Neoplatonic philosophy, to the medieval Scholastic attempt to maintain Church teaching while submitting to the authority of Aristotelian thinking, to Capra and Tipler finding links between particle physics and cosmology on the one hand and Eastern philosophy and the Christian concept of an afterlife on the other. It is, historically speaking, nothing new. Still, it's attractive to think of one theory with so much explanatory power. Whether it has a scientific basis or is New Age nonsense, the odd phenomena people claim to encounter every day do exist, even if only in our minds, and anything that provides answers to the unexplainable deserves at least a once-over.

Wednesday, February 18, 2009

Can We All Just Tone Down the Socialism Rhetoric Please?

I don't want to go into some politico-economic diatribe here, but I've honestly been getting a little disheartened at the slinging around of the words "socialist" and "socialism" in the context of the stimulus package, the bail-out of Wall Street, and the Obama presidency in general. Here's the cover story from last week's Newsweek. Here's another one from yesterday on Fox News. From the usual suspects like Sean Hannity and Bill O'Reilly, to the McCain/Palin campaign trail rhetoric, to every scared ideologue of the Right, "socialism" has become the new terrorizing watchword of the young 2009. The problem I have isn't really the political nature of the mud-slinging or the partisanship but the basic lack of understanding of the vocabulary used to describe the situation. Ask the average American what socialism means, and you'll probably get a response something akin to "government controls the economy" or "the opposite of capitalism" or "that thing the Soviets did" - and all of them would be, by and large, wrong.

Dictionary and encyclopedic definitions can often shed light on meaning, but even here there is a major disconnect between what the term is supposed to mean and what it is perceived to mean: most emphasize the Marxist and Soviet varieties, which are but two out of many. Socialism, arguably, has a much broader meaning than capitalism. Capitalism at its most basic level is a system whereby all means of production (companies, land, etc.) are privately owned and operated, and the investment of money into these means of production is a source of wealth itself. Socialism, however, has a larger economic meaning in the sense that the means of production are distributed among a greater variety of individuals, and could be controlled, depending on the society, by groups of individuals (co-ops or syndicates), by the public at large through democratic means, or by a state hierarchically from the top down. The problem with the misleading rhetoric coming out of most mainstream media is that the focus is on the latter at the expense of all other meanings. In fact, the original envisioning of socialism did not even include the statist variety - this was essentially created by Lenin and Stalin. It does not reflect well on the economic intelligence of our society as a whole that a seventeen-year-old blogger seems to understand this far better than the supposed experts in the field. The perception of socialism as a particularly heinous form of government control comes, I assume, from the fact that the galvanizing enemy against which the United States defined itself for 60 years was the U.S.S.R. - the Union of Soviet Socialist Republics. However, this is only one form of socialism, the statist variety, and a brief history of the movement shows that most socialists in the world today reject it. One could argue, in fact, that anything statist is, by definition, not socialism, since true socialism requires some form of cooperative ownership and control.
State control of all means of production is really like having one corporation run everything.

Here's where the current situation comes into play. What is going on now - pumping money into corporations, banks, state and local governments, and the economy at large to create jobs - is not really socialism, since the control of the capital and the means of production hasn't actually changed. A better term for this would be something like oligarchic hypercapitalism or plutocratic hypercapitalism or interventionist hypercapitalism. True socialism would involve something along the lines of firing all of the heads of the companies getting bail-outs and giving ownership equally to the workers of those companies, regardless of how much money they have invested, or even whether they have money invested at all. Or it would mean giving all taxpayers an equal stake in the companies, with decision-making power at the ballot box and the money to be returned in kind as the companies rebuild. At least the Newsweek article gets one thing right: it points out that Republican administrations spend a great deal of money as well, and that Reagan and both Bushes actually showed a net increase in spending during their terms. If the stimulus package is socialism, people on the Right should take solace in the fact that Obama has at least this one thing in common with his Republican predecessors.

Friday, February 13, 2009

Idle Thoughts

I used to write poetry. I've been writing random things for almost as long as I can remember. Little stories about aliens and dinosaurs when I was in elementary school, pretend newspaper articles, and when I was around 14 I actually completed a novel that was an absurd amalgam of basically every trope from Star Wars, Star Trek, and probably any other science fiction novel/movie/television show I had seen or read up to that point. As I've mentioned elsewhere, I feel like it took me a really long time to settle on a writing style that really fits my voice, and it was not to be found in fiction or poetry but in essays and research. But for a long time I thought I wanted to be a novelist - indeed, assumed that I eventually would be - and it wasn't a lack of trying or of confidence that shook this desire from me; it was my own wearied feeling that this simply wasn't for me, a hard feeling to swallow when you've been heading in a certain direction for years. Poetry came a little bit later and coincided with the time in my life when I thought my interests would be steered from the writing of literature to the study of it. This was from somewhere between late junior year in high school and around mid sophomore year in college. I don't really know why I started writing, but I filled a couple of notebooks in the three or so years I wrote. Most of the poems, as could be expected from a 17 year old, were pretty terrible. The notebooks are probably somewhere at my parents' house; I honestly don't know. However, while rifling through an old box of notes from college the other day, I happened to find a few poems that I turned in for a creative writing class I took in the summer of 2001. Usually, when I find things I wrote in years past I'm confronted with a feeling somewhere between embarrassment and pride - not in the work itself, but in how far I've come as a writer since then. However, I read this poem I wrote that summer and was actually rather impressed with what I had done.
It doesn't really "mean" anything as far as I can tell, and if I remember correctly I was just playing around with language when I wrote it. So, for all you literary critics who read this blog, don't feel like you need to look too deeply into it. I just thought I'd share since, as I said, I'm kind of surprised that I wrote this.

Idle Thoughts

Upwards reaching,
Inner is towards.
Many who may know me outwards
Are changing inwards.

Through one naked whisper
The ghostly heart is warmed--
Unaccustomed to the cold
Conversation, love is freed.

Returning those gray laughs
Was a subconscious eternity
Of the softest afternoons.

Glimpse the mastery enclosed:
Should I dare remain my
Own weakness,
Or change the mask I wear?

Precisely where childhood
Crossings were, punctured walls
Between fears were not.

"Never sigh," I thought just once,
Numbly wanting someone.
Love wasn't really found.
Never whisper what you say.

Love is the most covetous desire,
Erupting over mountains, but
Beginning in night's shadows.

Thursday, February 12, 2009

Demythologizing Lincoln and Darwin: A Very Short Historiography

Today marks the 200th birthdays of two of the 19th century's, and probably any century's, greatest figures - Abraham Lincoln and Charles Darwin. Rather than going into some detailed rambling of the historical importance of these two men as is usually my wont, I'd like to discuss the mythologizing of historical figures and the damage that can often occur when we replace the complexity of a person's character with the stagnant archetypes of our own era's perceptions. To foist upon any historical person traits that are not theirs but ours is, to paraphrase Orson Scott Card, "to kill them all over again." It is not uncommon, of course, for someone's persona to take on a life of its own following his death, especially when the life was so full and the deeds so grand. This disservice begins, I believe, when a person's life can be boiled down to one moment or one act or one thought, and Lincoln's and Darwin's are often described this way: Lincoln freed the slaves; Darwin discovered evolution. End of story. This distillation obscures the complexities of these men's lives and allows, in mythological fashion, for the blanks to be filled in as we will, and depending on our perspectives, allows us to deify or demonize at will. Lincoln was the "Great Emancipator"; Lincoln was a "war-criminal." Darwin was a brilliant naturalist; Darwin was an anti-Christian demagogue. These definitive labels do an injustice not only to the men themselves but to the historical record of which they are a part.

Lincoln scholarship has gone through numerous phases, but the best interpretations have been the ones that ignore the current trends and attempt to get to the heart of the man himself, recognizing the strengths and weaknesses, the flaws and the personal growth that arises from confronting them. Henry Louis Gates, Jr., director of Harvard University's W.E.B. Du Bois Institute for African and African American Research, has described the writings and historical scholarship related to Lincoln as similar to those of Christ, in the sense that he has been "all things to all people," and Gates is quick to point out that, next to Christ, more books have been written about Lincoln than about any other figure in history (his count is over 15,000). How can we sift through all of these to dig out the "true Lincoln," and is such a feat even possible? The short answer, I think, is that it is not. History is a selective narrative of past events, and every historian will inevitably emphasize that which they believe is most important for understanding the person and the times. The problem is when we leave out important aspects of a person's character because they don't "fit" with our preconceived notions of who they were. Christopher Columbus is an extraordinary example of selective scholarship obscuring basic facts about a man. Sure, he was a master sailor, a cunning businessman, and a daring adventurer, and he has been mythologized to the point that we have named cities after him, built statues of him, and granted him a holiday; but he was also so intent on the acquisition of gold and the conversion of indigenous peoples to Christianity that his actions in the New World can be described as nothing short of genocidal. Lincoln's actions, similarly, cannot be boiled down to "he freed the slaves," because this ignores realities that were characteristic not just of Lincoln but of most white men and women of mid-19th century America.
Lincoln, while he disagreed with the institution of slavery, did not believe that blacks should be given full citizenship in the U.S., which included voting rights, the right to marry whites, the right to sit on juries, and so on. And while the Emancipation Proclamation is undoubtedly one of the most significant documents in American history, we must not overlook the political calculus in its inception. Some modern scholars have admirably demonstrated that the legal rationale behind the document lay in the war powers granted to the president during the Civil War, and that as enemy combatants, southern slave owners' property was considered the property of the United States because of their state of rebellion. Therefore, all slaves in states that had seceded became property of the U.S., and their freedom was granted at the will of the new property owners. This, of course, did not include slaves in the border states of Missouri, Kentucky, West Virginia, or Maryland, since the citizens there were not in a state of rebellion and thus the U.S. had no property rights over them.

The misunderstandings of Darwin are equally multitudinous, and this often has more to do with a misunderstanding of evolution than with the man himself. Perhaps the greatest general myth about Darwin is that he "discovered" evolution. Evolutionary ideas had been around for centuries, discussed in primitive fashion in the Classical Age of Greece and Rome as well as in the Middle Ages and Early Modern times. What Darwin did was provide a mechanism by which evolution occurred - the theory of natural selection. The prevailing notions of life in Darwin's era were dominated by essentialism and uniformitarianism, though the evolutionary theory that did hold sway among some naturalists was the Lamarckian version, which said that changes acquired during an organism's lifetime were then passed on to its offspring. Darwin, informed by the thinking of Thomas Malthus's An Essay on the Principle of Population, corrected this view by describing change as occurring within a population rather than an individual, and by arguing that only inherited traits that helped organisms survive would be passed on. One standard example given to explain the difference between Lamarckian and Darwinian evolution is that of the giraffe's neck: it's not the case that a giraffe's neck gets a little longer over a single giraffe's lifetime, whose offspring's neck gets a little longer, and so on until the neck is as long as it is now. Rather, in a given population of giraffes, the organisms with longer necks are better suited to survive and thus pass along their traits, while the short-necked giraffes die out without passing on theirs. Darwin himself admittedly did not know what was being passed on, and it was the better part of a century before the discovery of DNA and the field of genetics would give evolution its modern synthesis. But what about the man Darwin?
For one thing, he is associated with atheism, not just by fundamentalist Christians out to vilify him, but by atheists themselves who wish to claim him as some sort of patron saint. Darwin began his life on the track to becoming a country clergyman in the Anglican Church, as his father wanted, but his restless spirit tended to set him on the wandering path, and this in part led him to the circumnavigational voyage of the Beagle. While he later called himself an agnostic, he was never quite able to completely relinquish the idea of God, and he attended church with his family well after his formulation of the theory of evolution by means of natural selection. Even more mythologized than this is the oversimplified account of his arrival at the theory of evolution. Most have heard the story of Darwin's examination of the variety of finches on the Galapagos Islands, how he wondered how the wild differences in the size, shape, and function of their beaks arose, and how he came, in a flash of epiphany, to the conclusion that descent with appropriate modifications must account for these anomalies. However, as science writer David Quammen pointed out in this month's National Geographic, his greater clues came three years earlier, in 1832, while viewing the assortment of fossils in Patagonia, Argentina. The diversity and differentiation based on geographic and geological distribution in the fossil record were his first clues. It is also important to note that Darwin is not even the only figure of his era to come to conclusions regarding evolution: Alfred Russel Wallace, halfway around the world in Indonesia and Malaysia, came to very similar conclusions after noting the differences in bio-geography between the "Asian" animals west of the Makassar Strait and the "Australian" animals east of it.

Perhaps the mythologizing of historical figures is not all bad. Lincoln has become such an iconic figure of what it means to be a good president that he has set a standard by which all other presidents are judged. Lincoln should also be a paragon of personal growth and change, as he softened many of his views on the relationship between blacks and whites by the end of his life. In fact, the last public speech of his life detailed his plan to grant full citizenship to black veterans of the Civil War and to upstanding black men such as his friend Frederick Douglass. One person in the crowd at that speech, John Wilkes Booth, found this personal growth to have gone too far. We should be wary, though, of anything that shrinks the life of any one man or woman to a monolithic action or thought, because this removes from them everything that makes them human: flaws, idiosyncrasies of personality, internal conflicts, and the agony of personal decisions which will have consequences for all.

Tuesday, February 3, 2009

The Probability of the Improbable

"It is likely that unlikely things will happen."

A recurrent conversation I seem to have with people is on the balance between analysis and intuition in daily life. I consider myself an incredibly analytical person, always taking in every scrap of information about something and viewing a problem from all angles before I arrive at a decision or a solution. Still, even though I find it difficult not to rationally observe all the information at my disposal, I actually have a rather embarrassing habit of impulsively choosing things on occasion. And it's not dependent on the size of the decision either: sometimes I agonize for several minutes in the morning over what pair of pants to wear, but I have made decisions about trips to take and major purchases because they "felt" right. If this sounds ridiculous, it is; but I find it interesting that many, if not most, people make decisions this way. Perhaps even more interesting is how good the track record for making decisions this way can be. Given this interpretation of human behavior, one thing that's especially humorous to me is when something is especially counterintuitive. So, here's a little anecdote I ran across while recently perusing the book How to Think about Weird Things, which illustrates how intuition can sometimes be way off:

When we try to judge the probabilities involved in events, we're often wrong. Sometimes we're really wrong because the true probabilities are completely counter to our intuitive "feel" for the odds. Mathematician John Allen Paulos offers this surprising example of a counterintuitive probability:

"First, take a deep breath. Assume Shakespeare's account is accurate and Julius Caesar gasped 'You too, Brutus' before breathing his last. What are the chances that you just inhaled a molecule which Caesar exhaled in his dying breath? The surprising answer is that, with a probability better than 99 percent, you just did inhale such a molecule. For those who don't believe me: I am assuming that after more than two thousand years the exhaled molecules are uniformly spread about the world, and the vast majority are still free in the atmosphere. Given these reasonably valid assumptions, the problem of determining the relevant probability is straightforward. If there are N molecules of air in the world and Caesar exhaled A of them, then the probability that any given molecule you inhale is from Caesar is A/N. The probability that any given molecule you inhale is not from Caesar is thus 1 - A/N. By the multiplication principle, if you inhale three molecules, the probability that none of these three is from Caesar is [1 - A/N]^3. Similarly, if you inhale B molecules, the probability that none of them is from Caesar is approximately [1 - A/N]^B. Hence, the probability of the complementary event, of your inhaling at least one of his exhaled molecules, is 1 - [1 - A/N]^B. A and B (each about 1/30th of a liter, or 2.2 x 10^22 molecules) and N (about 10^44 molecules) are such that this probability is more than .99. It is intriguing that we're all, at least in this minimal sense, eventually part of one another."
I'd want to see a study on how long it takes the molecules contained in 1/30th of a liter to diffuse into a body of gases the volume of the Earth's atmosphere, as well as an estimate of the number of molecules no longer free in the atmosphere, to be sure this is accurate. I guess that's just my analytical nature. However, I'm willing to give the calculation the benefit of the doubt. I guess that's just my intuitive nature.
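For anyone who wants to check Paulos's arithmetic, the formula is easy to evaluate numerically. Here's a small Python sketch using his figures; one wrinkle is that the naive expression 1 - (1 - A/N)**B fails in ordinary floating point, because A/N is so small that 1 - A/N rounds to exactly 1:

```python
import math

# Paulos's figures: A = molecules Caesar exhaled in his last breath,
# B = molecules you inhale in one breath, N = molecules of air in the world.
A = 2.2e22
B = 2.2e22
N = 1e44

# A/N ~ 2.2e-22 is far below machine epsilon, so (1 - A/N) evaluates to
# exactly 1.0 and the naive formula wrongly returns 0. Using log1p and
# expm1 keeps the tiny quantity accurate:
p = -math.expm1(B * math.log1p(-A / N))

print(f"P(inhaled at least one of Caesar's molecules) = {p:.4f}")
```

With these numbers B x (A/N) is about 4.84, so the probability works out to roughly 1 - e^-4.84, or about 0.992, agreeing with Paulos's "more than .99."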