Conservation of Ignorance: A New Law of Nature
Author: John Horgan
John Horgan is a science journalist who has knocked many scientists over the course of his career and yet stubbornly thinks of himself as a nice guy. For a critical albeit weirdly selective take on Horgan’s work, check out his Wikipedia page, which harps on his 1993 article “The Death of Proof” and his attacks on racist pseudoscience. You can learn more about John’s work and read essays like the following at johnhorgan.org.
Hoboken, February 25, 2023. My ideas often begin as jokes. A notion pops into my head, and I toy with it for my own amusement and perhaps that of colleagues and students. (I like dangling goofy ideas before my students, to see if they bite.) I don’t take the idea seriously or expect anyone else to.
Now and then, the more I play with an idea, the more sense it makes. That happened with the end of science a while back, and it’s happening with another idea that occurred to me recently, which I call conservation of ignorance. Let me lay it out for you.
My original inspiration, or goad, was an idea espoused by physicist Leonard Susskind: conservation of information. Susskind calls it the most fundamental of all laws of physics, beating even conservation of energy. Conservation of information decrees that the universe at any moment bears the imprint of everything that has happened and will happen; there is only one past and one future associated with any given present. Yes, this is hard-core determinism. Free will? Fuhgeddaboudit.
What about quantum mechanics, which says an electron can follow many different paths? Doesn’t the probabilistic nature of quantum mechanics contradict determinism and hence conservation of information? Not according to Susskind. His reasoning is tortuous, but it reduces to saying that because the Schrödinger equation and other quantum formulas are deterministic, reality must be too. This strikes me as confusing the map with the territory.
Although I have doubts about conservation of information, let’s just assume, for fun, that it’s true and see where it takes us. It implies something much more interesting than dumb old determinism; it implies that knowledge is conserved. Now, physicists will squawk that when they talk about “information,” they don’t mean stuff stored in brains, books and hard drives; they’re talking about strictly physical items, like protons and planets and DNA. I shouldn’t conflate physical things, which are objective, with our representations of them, which are subjective.
But that’s precisely what physicists do! Determinism and its corollary, conservation of information, have less to do with reality, whatever that is, than with physicists’ mathematical models of it. Moreover, if you are a real determinist, you must accept that everything is determined, from the big bang and the origin of life right up to all the jibber-jabber of the so-called information age; it was all latent in our cosmos at its inception. Supposedly.
And you can’t draw a line between “true” and “false” forms of information. Information theory, invented by mathematician Claude Shannon in 1948, makes no such distinctions. Information is information, whether it encodes Newtonian mechanics or conspiracy theories spouted by numbskulls on Twitter. Shannon once told me that his theory “can’t and wasn’t intended to address” meaning, let alone truth.
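Shannon’s point can be made concrete: his measure of information depends only on symbol frequencies, never on what the symbols assert. A minimal sketch (the example sentences are mine, chosen so the true claim and the false one are anagrams of each other):

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Bits per character, computed from symbol frequencies alone."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

true_claim = "the earth orbits the sun"
false_claim = "the sun orbits the earth"  # same characters, opposite meaning

# Identical character frequencies, hence identical entropy; the measure
# cannot tell the true statement from the false one.
print(shannon_entropy(true_claim) == shannon_entropy(false_claim))  # True
```

The equality holds exactly because the two strings contain the same characters with the same counts; Shannon’s theory, as he said, has nothing to say about which one is right.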
Now here is a crucial point: If something is conserved, that means there is a finite amount of it that remains constant over time. The form of information can change, just as the form of energy changes (from potential to kinetic, for example); but as Susskind puts it, “information is never lost.” That’s a positive way to put it. A negative way would be to say that information never increases; we must pay for each apparent gain in information, or knowledge, with a proportional loss of knowledge elsewhere, that is, a gain in ignorance.
The flip side of conservation of information, in other words, is conservation of ignorance; if you buy the former, you’re stuck with the latter. Once this notion occurred to me, I started finding corroboration of it everywhere. The uncertainty principle provides the most straightforward example: the more we know about where an electron is, the less we know about where it’s going.
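The uncertainty principle makes that trade-off quantitative: the product of the uncertainties in an electron’s position and momentum can never fall below a fixed floor. In standard notation (this is the textbook Heisenberg bound, not anything specific to Susskind’s argument):

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

Squeeze the position uncertainty Δx toward zero and the momentum uncertainty Δp must balloon, and vice versa; precision bought on one side is paid for on the other.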
I also see conservation of ignorance glinting within one of John Wheeler’s riffs on the meaning of quantum mechanics. The way we observe an electron, the legendary physicist noted, determines what it does and even what it is; “reality” depends on how we probe it. Musing over this fact, Wheeler proposed a disturbing analogy. When we probe nature, we are playing a special “surprise” version of Twenty Questions. In this version of the game, Wheeler leaves the room while his friends—or so Wheeler thinks—agree on a person, animal or thing that he must guess with yes-or-no questions.
But Wheeler’s friends play a trick on him; they don’t agree on a communal answer in advance. When Wheeler comes back into the room and asks the first friend, “Is it alive?”, only then does the friend think of an answer, like “whale,” and reply, “Yes.” When Wheeler asks the next friend, “Is it female?”, the friend thinks, “King Charles,” and says, “No.” And so on. Each friend must think of an answer consistent with the previous replies; otherwise, there are no constraints on answers.
Wheeler points out that the answer “wasn't in the room when I came in even though I thought it was.” In the same way, reality, before the scientist interrogates it, is undefined; it exists in an indeterminate limbo. “Not until you start asking a question, do you get something,” Wheeler says. “The situation cannot declare itself until you've asked your question. But the asking of one question prevents and excludes the asking of another.”
I’ve highlighted that final sentence, because its implications are heavy. Scientists want to believe that they are discovering the truth about reality. But Wheeler is suggesting that neither “truth” nor “reality” exists in a precise, defined way before we start asking our questions; our questions define reality and truth.
In other words, there are many possible “truths” and “realities.” (If Wheeler is right, we should utter terms like “truth,” “reality” and “knowledge” ironically, with scare quotes.) And when we settle on one “truth,” we become oblivious to others. We become, you might say, ignorant of those other “truths,” some of which might be better than ours, that is, richer, more beautiful, more conducive to human flourishing. (Wheeler’s 20-questions analogy also suggests that we should doubt the theory that inspired it, quantum mechanics!)
A celebrated psychology experiment comes to mind. The psychologist asks you to watch a video of six people passing a basketball back and forth and to count the number of times the ball changes hands. Many subjects watching the video fail to notice a man in a gorilla suit strolling s-l-o-w-l-y through the circle of basketball-passers. The lesson is that when we pay attention to one thing, we become oblivious to other things that might be at least as important. I didn’t see the gorilla when I first watched the video 20-plus years ago.
As I say above, I first thought of conservation of ignorance as a joke, an absurd implication of an absurd proposition: conservation of information. And yet the more I think about conservation of ignorance, the more it haunts me. It makes all too much sense. Our lives, after all, begin and end in oblivion. Conservation of ignorance is also forcing me to reconsider my faith in human progress, and especially in scientific and technological progress. What price are we paying for all our “discoveries”?
We, the plugged-in denizens of this logorrheic age, are awash in information. We stare at our smartphones, desperately trying to keep track of dozens, scores, hundreds of basketballs whipping back and forth. Meanwhile we are blind to marvels and perils in our midst. As the tide of pseudo-knowledge rises, so does our all-too-real ignorance of things that matter. And oblivion awaits.
Further Reading:
For more on conservation of ignorance and other ideas related to quantum mechanics, see my brand-new book My Quantum Experiment, which you can read online for free.
Addendum: Mathematician/cryptographer Whitfield Diffie responded to this column as follows:
In explaining Hamming codes to me in the late 1960s, my mentor Roland Silver taught me a concrete form of 20 Questions in which the answer is a number between 0 and 999. (The error-correction issue is how many numbers you can determine if one of the answers can be a lie.) Suppose we play this 20 Questions with and without Wheeler's twist. To make things simpler, suppose we play the game with twenty friends, each of whom tosses a coin. In the expected version, when I leave the room, my friends all toss their coins and come up with a number. In Wheeler's version, they wait until I come back and toss their coins in response to my questions: is the first bit 0; is the second bit 0; and so on. The difference is only when the coins were tossed. The difficulty my friends would have tossing their coins in my presence without my seeing what they were doing suggests an intermediate form. Suppose they have all tossed their coins ahead of time but, rather than looking at the tossed coins, have covered them up. When I ask a question, each discreetly uncovers his coin and answers based on whether it was heads (yes) or tails (no). Does it really matter when the coins were tossed and when the results were looked at?
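Diffie’s closing question invites a small simulation (the function names and the 10-bit range are my choices, not his). Both protocols draw on the same stream of coin tosses, one per question, so given the same source of randomness they produce identical transcripts; from the questioner’s side, nothing distinguishes coins tossed in advance from coins tossed on demand:

```python
import random

BITS = 10  # ten yes/no questions pin down a number in 0..1023, covering 0..999

def pretossed_game(rng: random.Random) -> int:
    """Expected version: friends toss all coins before the questioner returns."""
    coins = [rng.randrange(2) for _ in range(BITS)]
    # Each question "is bit i 0?" just reveals the i-th pre-tossed coin.
    return int("".join(str(c) for c in coins), 2)

def on_demand_game(rng: random.Random) -> int:
    """Wheeler/Diffie twist: each coin is tossed only when its question is asked."""
    answers = []
    for _ in range(BITS):
        answers.append(rng.randrange(2))  # tossed at question time
    return int("".join(str(a) for a in answers), 2)

# Same randomness in, same transcript out: the questioner cannot tell
# when the coins were tossed, only what the answers were.
print(pretossed_game(random.Random(42)) == on_demand_game(random.Random(42)))  # True
```

The two functions differ only in when the tosses happen relative to the questions, which is exactly Diffie’s point: the answer’s statistics, a uniform draw from 2**BITS possibilities, are unchanged.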