On the long history of binary computing and the current movement toward quantum.
Reading time: 23 minutes
tl;dr Always wanted to learn how all of human communication can be distilled to 1s and 0s? I cover this and how, in quantum, we’re moving towards near-infinite computing power.
The past eight decades of computing, from the WWII codebreakers at Bletchley Park to the Amazon Alexas of today, all have something deeply in common: binary code.
All those late-night texts, the memes, the creators whose like buttons you smash—all the most human modes of expression we have available to us today in digital format can be expressed, totally and completely, in a series of 1s and 0s.
The journey to how we got here, to reducing human languages down to binary, is a compelling one, and it crosses huge swaths of the globe, various philosophies of human expression from ancient India to Aristotle, and, of course, the ever-turning wheel of mathematics.
But here’s the thing: our technological society is at a turning point.
You’ve probably heard the words quantum computing get thrown around, and you may have filed it in your brain as just a better computer. But what quantum computing represents is a paradigm shift in binary determinism.
We’re introducing the concept of uncertainty into the root of computing.
But I’m getting ahead of myself. If you want to understand quantum computing and all its possible transformative uses for humanity, you first need to understand how it is that we can distill our species down to 1s and 0s.
Binary starts, perhaps, with yarrow.
The yarrow herb grows all across the northern hemisphere, but for our story, we’ll zoom in on the perennial yarrow from 10th-century BC China.
In Western Zhou-period China, yarrow represented longevity, vitality, and spiritual connection. It was used to treat a whole barn full of ailments, and in fact, its Latin name, Achillea millefolium, is derived from the Greek hero Achilles, who, according to legend, used the herb to salve the wounds of his soldiers during the Trojan War.
The ancient Chinese name for yarrow is shī 蓍, but it’s often combined with cǎo 草, which means grass or herb. In Zhou-era China, shī 蓍 was more often combined with ruò 蒻, the symbol for the soft and tender parts of a plant, which was used for divination.
The Western Zhou period introduced the Mandate of Heaven, which posited that a leader’s authority was granted by heaven and contingent on their moral virtue. Also emergent in this time were ideas emphasizing the interconnectedness and interdependence of opposing forces and the dynamic balance that exists within the natural world.
You may be familiar with the concepts of Yin and Yang, which were first introduced in the Yijing (Book of Changes). The terms used to refer to the shady (yin) and sunny (yang) sides of a hill or river, but over time Yin and Yang came to represent all things that came in opposites or complemented each other in unique ways.
“The concept of Yin and Yang is at the core of Chinese cosmology, which holds that the universe originated from a primordial chaos. This chaos then divided into the two cosmic forces of Yin and Yang,” writes JeeLoo Liu in her excellent book, An Introduction to Chinese Philosophy: From Ancient Philosophy to Chinese Buddhism. “All things in the universe are produced through the interaction between these two forces. In the process of transformation, Yin and Yang can replace each other or they can be mutually transformed into each other. It is said that when Yin and Yang are in harmony, there is peace and prosperity; when they are out of balance, there is disaster and chaos.”
As Yin and Yang were combined with The Mandate of Heaven, the Zhou era witnessed the development of a complex system of rituals, many of which were focused on communication with the natural world, ancestors, and deities.
To determine if you were on the path to virtue, you’d need 50 humble yarrow stalks and a skilled diviner. Using one stalk to receive guidance from the natural world, the diviner would intuitively divide the remaining 49 stalks into two piles, symbolizing the interactions between yin and yang forces.
I use the word intuitively here to mean that the diviner listened to the natural world, trying to have no preconceived opinion about how to divide the stalks, letting in whatever forces may be, introducing what we might call entropy.
After the stalks were divided into two piles, they were grouped into smaller piles in much the same manner, then counted to derive numerical values, which were translated into yin (broken - -) or yang (solid —) lines. The whole process was repeated six times in total, leading to a hexagram (six stacked lines) that looked like one of these:
Each of the 64 possible hexagrams in the Yijing had corresponding judgments, said to be written by the deity Fu Xi, which informed the seeker about how they should live their life.
According to legend, Fu Xi and his sister Nuwa were the first humans created by the gods. They are often depicted with intertwined snake-like bodies, symbolizing their role in creating humanity and establishing Chinese civilization. Fu Xi is credited, in addition to the creation of the Yijing, with the invention of writing, fishing, and the domestication of animals.
But, remember the quote from Liu’s book? “In the process of transformation, Yin and Yang can replace each other or they can be mutually transformed into each other.”
In addition to the solid (—) and broken (- -) lines above, a hexagram could also contain changing lines—old yin changing to young yang or old yang to young yin—that indicated a dynamic situation and led to a second, related hexagram.
These changing lines were a vital part of the divination, keeping close to the needs of the natural world and humanity’s interaction with it. It was nuanced beyond just black and white.
Today, however, the divination is often done much faster, with a coin flip for each line of the hexagram, eliminating the dynamism of a changing yin or yang. 64 possible combinations from a 6-bit system of binary choices.
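Those 64 combinations are easy to verify: six independent binary choices multiply out to 2^6 outcomes. A minimal sketch of the coin-flip method (the function and line labels are mine):

```python
import random

# Modern divination: one coin flip (a binary choice) per line, six lines.
def cast_hexagram():
    return [random.choice(["yin (- -)", "yang (---)"]) for _ in range(6)]

hexagram = cast_hexagram()
print(len(hexagram))   # 6 lines per reading
print(2 ** 6)          # 64 possible hexagrams
```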
I think something is lost in the spinning mechanisms of money, but maybe that’s just me.
Thousands of years later, in 1654, a boy named Aisin Gioro Xuanye was born inside the Forbidden City, the imperial palace of the Chinese emperor.
And across the world, in Le Mans, France, Joachim Bouvet was born under the absolute rule of the Sun King, Louis XIV.
From an early age, Bouvet was drawn to science, to mathematics and astronomy. For a long time in the West, religion—or, more specifically, cloistered Catholicism—was the only path that existed for intellectuals who wished to dedicate themselves to a life of studies. So, Bouvet joined the Jesuit order.
Xuanye, meanwhile, at the age of six, faced the death of his father, the Shunzhi Emperor. He was named Kangxi: peaceful prosperity. At 14, he shed the weight of his advisors and ran the empire for himself. He led various military campaigns, extending Chinese control to outer regions like Tibet, Taiwan, and Xinjiang. He used the spoils of war to reduce taxes, promote cultural growth within the sciences, and even sponsor literary works, dictionaries, and encyclopedias.
Most importantly for Joachim Bouvet, the Kangxi emperor invited Jesuit missionaries to come within the walls of the Forbidden Palace and share their knowledge of the world. Bouvet, perhaps weary of the old Roman walls of Le Mans or the Julien Cathedral in which he worshipped seven times a day, according to the Liturgy of the Hours, decided to travel, to be part of King Louis XIV’s mission to spread French prestige globally (and, of course, to evangelize their religious beliefs).
Caught between the Sun King and Kangxi, Bouvet found himself appointed to the Chinese Imperial Observatory, sharing his findings, however indirectly, with Kangxi.
Of course, the sharing of knowledge went both ways, however much the history books would love to teach otherwise. “In the heart of the Celestial Empire, I have discovered a treasure of wisdom and knowledge,” Bouvet wrote in a letter. “The Chinese sages possess a profound knowledge of the divine essence and of the universe.”
Bouvet got to stargaze, China got French textiles and wine, and France got tea, silk, and porcelain. For a little while, everyone seemed happy.
But tensions were rising.
Remember the yarrow stalks? The divinations? The Catholics didn’t much like what they referred to as ancestor worship, believing it blasphemous to their one true god. In 1704, Pope Clement XI issued a decree condemning Chinese rites and prohibiting their practice by Catholics. The Kangxi emperor saw this as an affront to his generosity, to his sovereignty and to Chinese culture. He expelled the Catholic missionaries from the country.
And yet, he allowed some of the Jesuits, including Bouvet, to stay. The Jesuits were trying to straddle a middle ground, taking on the policy of accommodation where they believed Chinese Christians could still participate in divination as socio-cultural practice while holding to faith in Jesus. As the decades wore on, though, even their presence in China would fade and extinguish.
By the end of Xuanye’s reign, in 1722, China had become one of the most powerful empires in the world. After his death, any tolerance for Western ideals vanished, and doctrines of self-sufficiency ruled.
But politics aren’t our focus here. However interesting the controversies of the East and the West may be, we’re looking for a single moment in time—that moment when Bouvet, the displaced stargazer, stumbled on a little book called the Yijing.
The Book of Changes captivated him, twisting his brain in knots. He developed a pubescent philosophy that came to be known as Figurism, which suggested that the ancient Chinese text contained veiled references to Christian teachings. Given that the Yijing predated Jesus by 1,000 years, Figurism was not widely accepted, even among Bouvet’s fellow Jesuits.
Bouvet did, however, find one sympathetic ear—or at least one very interested in unlocking the mysteries of the Yijing. Through a mutual acquaintance, Bouvet and the German polymath Gottfried Leibniz were introduced. Though they never met in person, Bouvet’s correspondence with the co-inventor of calculus would go on to lay the foundations for modern computing.
Before his correspondence with Bouvet, Gottfried Leibniz had been embroiled in a daunting quest: to create a universal characteristic that could represent all human knowledge in a clear, logical, and consistent manner. The language would need to be based on a system of symbols and allow for expressions of even the most complex ideas in a vastly simplified form.
Leibniz hoped that a universal language would facilitate communication and learning across vastly different cultures and disciplines. He wanted to eliminate misunderstandings once and for all. Fundamentally, he believed that if everyone could reason together more effectively, we could discover capital-T Truths, as set out into the world by the Christian god.
That’s a vast oversimplification of Leibniz’s philosophies, which contain multitudes about monads as indivisible substances that make up the fabric of reality through pre-established harmony, each with their own perspective of the universe. But it will have to do for now.
When Leibniz encountered the Yijing, sent to him by Bouvet and accompanied by Figurism’s flawed postulations of Western and Eastern ideologies amounting to much the same thing, he saw an opportunity to build his universal language out of indivisible 1s and 0s: his deterministic versions of yin and yang.
Leibniz was particularly interested in the possibility of using binary numbers to encode logical relationships and speed up calculations. In his 1703 paper, “Explication de l'Arithmétique Binaire,” he provided a detailed account of the rules for carrying out these operations, laying out the complete foundation for modern binary.
0 + 0 = 0 and

0 + 1 = 1, but

1 + 1 = 10. (Not ten; one zero.) Unlike base-10 numbers (decimals, which were invented in ancient India, heavily developed in the North African Islamic world by figures such as Al-Khwarizmi, and imported into the West by the Italian scholar Fibonacci), binary is a base-2 system. So, much like when adding 15 + 5 in decimals, where you carry over a 1 to equal 20, when we add 1 + 1 in binary, we need to carry over the 1, which equals 10.

As you might imagine, binary numbers get pretty lengthy pretty fast (567 in decimal is 1000110111 in binary), which is cumbersome to write by hand.
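Leibniz’s carrying rules are exactly what a longhand binary adder does. Here’s a minimal sketch using only the three rules above (the function name is my own):

```python
def add_binary(a: str, b: str) -> str:
    """Add two binary numbers written as strings of 1s and 0s,
    using Leibniz's rules: 0+0=0, 0+1=1, 1+1=10 (carry the 1)."""
    result, carry = [], 0
    # Walk both numbers from the rightmost digit, carrying as needed.
    for i in range(1, max(len(a), len(b)) + 1):
        da = int(a[-i]) if i <= len(a) else 0
        db = int(b[-i]) if i <= len(b) else 0
        total = da + db + carry
        result.append(str(total % 2))
        carry = total // 2
    if carry:
        result.append("1")
    return "".join(reversed(result))

print(add_binary("1", "1"))  # → "10"
```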
In Leibniz’s view, binary was more a philosophical advance than a mathematical one, and it didn’t get much if any critical acceptance in his lifetime. He’d go on to advance mathematics much more in the realm of decimals, even inventing an early—if clumsy—calculator.
Leibniz did, however, write Bouvet about binary. His friend seemed to understand: they saw creation as 0—ex nihilo, “from nothing”—and God as 1—whole, complete. From the dark side of the hill out into the sun.
Nearly 150 years after Leibniz’s binary paper came a 19th-century English logician named George Boole, who also didn’t pay much attention to binary. And yet, his work was vital to the first computers.
To some, Boole led a rather quotidian life. Born in Lincoln, England, as the eldest of four and raised in a humble home in the Unitarian Christian faith, Boole was, from an early age, fascinated by the mind and how it makes connections.
Perhaps this passion was fostered by his father, John. The elder Boole was a talented violinist, and when he wasn’t making shoes for his trade, he was often tinkering with building telescopes or diving nose-deep into some classic mathematical work. No doubt young George found himself in his father’s workshop, discussing human ideas that far surpassed both Booles’ lack of formal education.
At 14, George Boole became a town prodigy, translating poetry from Greek and publishing it in the local newspaper. In his lifetime, Boole would go on to learn Latin, French, and Italian in addition to Greek, all in order to study what he considered the classic works. By 16, he was working as an assistant teacher throughout Lincolnshire, and by 19, he had founded his own elementary school, Mr. Boole’s Academy, to teach children what his own father had instilled in him: how to learn.
Boole found that through teaching the young, he could continue his own education, specifically in mathematics and logic. He was deeply involved in the intellectual community, giving public lectures on his nascent ideas of a logical system beyond the walls Aristotle had set out so many centuries before.
Boole, in his late twenties, began to put his ideas to paper, and in 1847, at just 32, he published The Mathematical Analysis of Logic . . . which was mostly ignored, probably due to its debut in a relatively obscure journal.
Interestingly, Boole epigraphs the work with an Aristotle quote, placed in the original Greek:
"All sciences have a connection with the common people, but not to the same extent. By 'common people,' I mean those who are unable to demonstrate what they learn; not those who do demonstrate or those to whom demonstrations are made.”
In other words, the “common” or “vulgar” people should be able to access the sciences just as much as the academic elites. It was vital to Boole that science and reason remained accessible, and although I don’t find his work the most easy to read in 2023, I do want to present some of his ideas here.
Boole believed that human logic could be distilled down into a purely mathematical representation, and he set out to do so in this book. His reasoning, based heavily on Aristotle’s categorical propositions, went something like this (plus some animals from me):

All x are y. All kittens are cats. This can be represented as xy = x, which is like saying, “all the kittens that are also cats are the same as all the kittens.”

No x are y. No kittens are dogs. In algebra, we can say xy = 0, or “there is nothing that is both a kitten and a dog.”

Some x are y. Some kittens also have blue eyes. This one can be said as xy ≠ 0. In other words, the number of kittens that also have blue eyes is not zero.

Some x are not y. Not all kittens have blue eyes. x − xy ≠ 0. The difference between the total number of kittens and just the kittens who have blue eyes is not zero.
Now, if you think that all sounds a little . . . abstract and useless . . . you’d not be alone. Boole’s ideas were humored in his lifetime, and many respected his intelligence, but no one really related his logic to, well, anything. It all felt like a neat party trick at the time.
Boole’s father passed away just a few short months after the publication of The Mathematical Analysis of Logic. Boole, needing a way to better support his family, started teaching at the college level at Queen’s College, Cork (now University College Cork) in Ireland. Boole, who lacked his own university degree, must have been pleased with the appointment.
A few years later, in 1855, Boole (at 40) was married to Mary Everest (yes, that Everest—niece of the man the mountain was named for), and they had five daughters together. Somewhere in the midst of caring for all his family, Boole found the time to write one last book: An Investigation of the Laws of Thought.
If his first book was his floundering with brand new ideas, this book was a cementing of those thoughts in far more detail. And, for our story, this is where Boole laid out the ever important conjunction, disjunction, and negation in mathematical terms.
Later mathematicians would also go on to add one more operation: exclusive disjunction, or XOR.
These operations and their corresponding equations have been regarded as Boole’s most important contribution to logic: distilling the complexity of human reason to a series of true or false equations that, when evaluated in series, can mathematically determine the merit of an argument.
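In modern terms, those operations reduce to a few one-line functions. A sketch using Python’s own operators, not Boole’s original notation:

```python
# Boole's three operations, plus the later XOR, over truth values:
def AND(x, y): return x and y    # conjunction: true only if both are
def OR(x, y):  return x or y     # disjunction: true if either is
def NOT(x):    return not x      # negation: flips the value
def XOR(x, y): return x != y     # exclusive disjunction: true if they differ

# Evaluating an argument is just evaluating equations in series.
# "All kittens are cats" fails only if something is a kitten AND NOT a cat:
kitten, cat = True, True
print(NOT(AND(kitten, NOT(cat))))  # → True
```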
Mostly, Laws of Thought reads like a mathematical work, filled with abstract definitions and formal outcomes. But its logic is kind of a shitty way to express the varied nuances of consciousness, assuming it can all be distilled to bitwise operations.
Boole seems to recognize this: in the final chapter he attempts to transform vague human experience into tangible matter, delving into the harmony between free will and established principles of thought. He wonders if some truths don’t surpass the scope of mathematical law.
To some, with this final chapter, he appears to invalidate his own careful reasoning. His ideas, though interesting, failed to take off in his time: what entity on earth could think in such a calculating manner?
I wonder if Boole would have admired the cruel logic of his own premature death. Rain, a walk to school, too long in wet clothes—all summing up to pneumonia and his passing. But it’s dangerous to reduce even a single man, taking away his body in favor of the philosophies he brought to the world.
Let’s recap. We have a way to represent numbers in binary. And now we can express logic in a binary (true or false) way. Although Boole and Leibniz contributed to vastly different fields, they shared a vision: a way for humans to reason together more precisely.
Before the 1930s, the word computer referred to a human being who calculated equations for any given profession—more like an accountant than a machine.
It was Claude Shannon, the “father of information theory,” who put the two together.
Shannon was a war baby, born right before the Americans’ involvement in the first World War. He grew up, initially, in a booming economy, but by the time he got to high school, everything had crashed.
I can only speculate so far about what went on in Shannon’s mind, but I do wonder if something of the efficiency—the leanness—of the Great Depression influenced his thinking about machines.
Calculating machines at the time did exist, but they were fully analog. We’ve skewed the word analog today to mean something more physical than digital, but it originally came from the Greek—analogos, meaning proportional—and was later adopted into English in the fields of electronics and signal processing. In these contexts, analog refers to a continuous representation of a quantity, like an electrical signal that varies in amplitude or frequency to represent information.
When I say the earliest machine computers were analog, what I mean is they were finicky as fuck. By representing numbers on contraptions such as gears or levers or electrical signals, which could all be in various granular states at any given time, inputting and reading information from the machine was difficult and unclear.
In other words, their signal (useful information) to noise (gibberish) ratio was far too low to be practical. They resembled, in the most frightening regard, a sort of Rube Goldberg machine for math. They got answers right, sure, but the slightest malfunction would send them spinning.
What Shannon, a native Michigander, did while studying electrical engineering at the University of Michigan was happen to read George Boole, a dusty old logician whose ideas weren’t garnering much notice. And when Shannon moved on to MIT for his PhD in mathematics, he had the chance to work on a machine called the Differential Analyzer.
The Differential Analyzer was a beast, chock full of carefully aligned gears to represent various numbers and logical methodologies. Shannon, after some time with it, suggested binary.
What if, he said, we could represent all the numbers in simple yes/no states? It would be far more accurate. And furthermore, what if we used electrical circuits to perform Boolean operations, from which you can build addition, subtraction, multiplication, and so on?
It was, in short, a set of brilliant ideas. And the world was never the same.
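One way to see Shannon’s insight concretely is the half adder, a standard textbook circuit (not Shannon’s own example) that performs Leibniz’s binary addition using nothing but two of Boole’s gates:

```python
def half_adder(a: int, b: int):
    """Add two single bits using only logic gates:
    XOR yields the sum bit, AND yields the carry bit."""
    return a ^ b, a & b

print(half_adder(0, 1))  # → (1, 0)
print(half_adder(1, 1))  # → (0, 1), i.e. Leibniz's 1 + 1 = 10
```

Chain enough of these together and you get arithmetic on numbers of any size, all from true/false circuitry.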
Shannon's work on digital circuit design, begun in his 1937 master's thesis and continued at Bell Labs, paved the way for the first electronic digital computers, such as the Electronic Numerical Integrator and Computer (ENIAC). The introduction of the digital computer (from the Latin digitus, for the fingers we count on to represent definite states) took the world by storm—but we’ll get into all that in another essay.
The point, for now, is this: without Shannon, I’m not sure we’d all be here reading this article online. He’s a rare bottleneck in history, a figure who seemed to meet the precise circumstances needed to link together the logic of thinkers whom history had summarized in far too general detail. I’m sure the digital computer would have been invented eventually. But who knows how many more years it would have taken?
“The fundamental problem of communication,” said Shannon, in his seminal work, A Mathematical Theory of Communication, published just after the second World War, “is that of reproducing at one point either exactly or approximately a message selected at another point.”
In the exactness of binary, Shannon found a way to do just that. He tuned the noise to signal.
After Shannon, computers were, quite literally, history. Classical computing since 1940 has been a matter of scientific advancement in the infinitesimal: making binary transistors smaller and smaller to fit more and more of them on a single chip. The power of your smartphone comes directly from the number of bits on its chip and how fast those bits can switch between 1 and 0. (Incidentally, Shannon also popularized the word bit for this context.)
More processing power necessarily uses more energy. And if you haven’t noticed, energy is a dwindling resource. Sure, there have been huge advancements in classical computing that allow machines to do more with less. But at heart, all those words you type are converted, tediously, to 1s and 0s and processed in strings of ever-increasing length.
What if we could transform our human language not into 1s and 0s, but into something more, well, uncertain? What if future computers need, at their heart, entropy?
At last we come to quantum mechanics, the behavior of matter at the tiniest scale: the subatomic.
The concept of quantum computing can be traced back to the early 1980s: physicist Richard Feynman, in his talk “Simulating Physics with Computers,” wondered if a computer that mirrored the behavior of subatomic particles couldn’t better represent quantum physics than a classical computer. Around the same time, another physicist, Paul Benioff, drew up the theoretical schematics: a Turing-complete machine that ran on quantum particles alone.
Unlike the binary bits in classical computers, which can only be 1 or 0 at any given time, quantum computers are made of qubits, which can, in theory, hold infinitely many states in what’s called superposition.
I say in theory, because when you measure a qubit, its superposition collapses.
You might remember something of Schrödinger’s cat, both alive and dead in the box, until we lift the lid and collapse the possibility of both into a single state. Now, imagine we did even more: found a way—without looking—to drop food into the box (that could be either ingested or not), to throw glitter into one side of the box (which now holds a cat with either glitter on it or not), and generally abused the poor creature with increasingly deranged experiments.
Do you see what I mean now by a theoretical infinite number of states? Obviously, this doesn’t totally work with a cat, because inside the box, the cat really is one of those combinations. But with subatomic particles, a superposition—that ability to hold all things possible—is absolutely real.
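Numerically, a single qubit can be sketched as a pair of amplitudes whose squared magnitudes give the measurement probabilities (a toy model, not how real quantum hardware is programmed; the function name is mine):

```python
import math, random

def measure(alpha: complex, beta: complex) -> int:
    """Collapse the state alpha|0> + beta|1> into a classical bit.
    Probability of reading 0 is |alpha|^2; of reading 1, |beta|^2."""
    assert math.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1.0)
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition collapses to 0 or 1 with equal likelihood:
amp = 1 / math.sqrt(2)
samples = [measure(amp, amp) for _ in range(10_000)]
```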
The idea of quantum computing was met with a lot of interest, but the fundamental problem of quantum particles—their inherent unobservability—remained a dilemma. We needed a way to observe without observing. To choose without choosing, much like a diviner might divide piles of yarrow stalks, making waves without a single presupposition.
This, in a much more complicated and scientific way, is what the early 1990s brought in the form of quantum algorithms, proving the viability of quantum computing. Peter Shor’s algorithm, which provided a practical method for quantum computers to factor large numbers, was among the first. It used quantum gates analogous to the AND, OR, NOT, and XOR logic gates in a classical computer and leveraged the property of subatomic particles that stumped even Einstein: entanglement.
Einstein said it best: entanglement is spooky action at a distance. Sometimes, in nature, electrons entangle with each other, and their states become dependent on one another, even if those electrons are separated by large distances. (In fact, we haven’t found a maximum distance.)
If you observe an entangled qubit, it still collapses the superposition of its partner, but the connection can give vital information. By manipulating these connections, scientists have been able to strategically maximize the probability of getting the overall correct measurement of the qubit performing the calculation.
It’s not about 1 or 0; it’s about sensing your way through to the place in nature that has the most information to give. The world of technology is never far from the physical realities of the natural world, quantum computing maybe most of all.
In 1996, Lov Grover designed a way to efficiently search unsorted databases with a quantum computer. He proved that, because of a qubit’s ability to work in parallel—very unlike traditional bits, which work linearly, one operation after the other—a database of N items can be searched in roughly √N steps instead of N, at just about any scale of data.
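The scale of that speedup is easy to put in numbers: a classical search over N unsorted items needs on the order of N checks, while Grover's method needs roughly (π/4)·√N quantum queries. A quick back-of-envelope calculation:

```python
import math

def grover_queries(n: int) -> int:
    """Approximate number of Grover iterations to find one item among n."""
    return math.ceil(math.pi / 4 * math.sqrt(n))

# A million unsorted items: ~1,000,000 classical checks vs:
print(grover_queries(1_000_000))  # → 786
```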
The quantum computer of today, to me, resembles a bit of a yarrow stalk divination. We’re trying to make quantum computers more reliable, but they’re incredibly sensitive to their environments, ultimately consorting with the natural world to get back their information.
I know, I know—quantum computers are infinitely more advanced than the yarrow stalk divinations. They’re not magic. But I still think it’s a helpful comparison: we have to find a way to tune out the noise in favor of signal. To let go of preconception.
Even quantum computing, something that seems magical to many, is ultimately a tool created by humans for humans. And if we want to see it used well, we need to pay attention to the advances.
Do you see how long it took just to get to binary on the machine? How long it took to distill reason into something consumable and arbitrary?
In 2019, Google announced quantum supremacy. They solved, with a 53-qubit quantum computer, a problem that would be impractical for a classical computer—of any size—to attempt.
Since then, we’ve made huge progress on more barriers: cooling, compression, error-checking, networking between machines, and even using quantum computers to observe their own patterns of entanglement.
If you take nothing else away from this essay, I hope you see how quickly we’re moving. How the centuries it took to get binary on the machine can be compared to the mere decades it’s taken to stabilize quantum computing.
Sure, we’re still a long way off from all of us having quantum smart goggles, but every time we sleep, someone somewhere is inventing.
One last tale, this time from ancient India.
In a region untouched by the Yijing philosophies of ancient China, 3rd-century BC India, there lived a mathematician named Pingala.
It’s speculated that Pingala may have been the younger brother of the well-known Sanskrit grammarian Pāṇini, but this can’t be confirmed. Either way, linguistics and mathematics stayed close to each other at the time.
Pingala, at some point, took a deep interest in Sanskrit poetry—specifically the rhythms, rhymes, and sounds of the words. His treatise on Sanskrit prosody is called the Chandahśāstra, which is a Sanskrit term that can be broken down into two parts: "Chandas" (छन्दस्), referring to the meter of poetry, and "Śāstra" (शास्त्र) which more or less translates to scientific exposition—a systemic knowledge in a particular field.
In this “Science of the Poetry Metric,” Pingala focuses on rhythm, exploring various combinations of heavy (Guru) and light (Laghu) syllables.
For my English audience, you might remember studying Shakespeare’s iambic pentameter—the alternating of unstressed and stressed syllables. Pingala came up with a way to represent these Guru and Laghu syllables and made a sort of counting system out of them called the Mātrāmeru.
For example, if we wanted to count zero to seven, we could write:
0: L (laghu)
1: G (guru)
And so on. It might look similar to something we’ve seen a lot of in this essay: binary. It has its slight differences, but the general principle is the same. Pingala’s work is the first known recording of binary in its barest form.
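Treating laghu as 0 and guru as 1, any count maps onto a syllable pattern. A sketch in that spirit (Pingala’s actual procedure ordered the digits differently, so this is an illustration, not a reproduction):

```python
def to_syllables(n: int) -> str:
    """Write n in binary, with L (laghu) for 0 and G (guru) for 1."""
    return bin(n)[2:].replace("0", "L").replace("1", "G")

print([to_syllables(i) for i in range(8)])
# → ['L', 'G', 'GL', 'GG', 'GLL', 'GLG', 'GGL', 'GGG']
```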
Which means the rhythm of human speech—specifically Sanskrit poetry—is one of our first recorded advancements in computing. It’s an instance of what scientists today might call “multiple discovery”—when unrelated people in unrelated places uncover the same truth about the world.
But here’s what Pingala didn’t account for in his rhythms: the breaths. The line breaks. The intake of air and expectation. The waiting and the uncertainty.
The beauty of poetry comes from the line break, and much has been studied about it. It’s a punctuation mark unlike any other across all media:
Take this poem by Craig Santos Perez:
Sure, the line breaks are highly exaggerated here compared to most poetry, but the point remains: there’s a beauty to the uncertainty of human thought, and, by extension, the world around us.
I have to believe that a quantum system constructed not of certainty but out of uncertainty at its root is one that will, in the generations to come, be more adept at expressing the extreme nuances of human life.
As we look at scientific innovation, I find organizations that fund it are a little binary in their thinking: what purpose will this have? Will it solve world hunger? Will it make us money?
Leibniz, Boole, and so many others—they dreamed of a world that would use their discoveries, but imagined it so differently from the one we actually have. Their ideas lived quietly for centuries before finding a use case in machines none of them could ever have expected.
The past is not linear up to now. Leibniz never saw Pingala, but that doesn’t mean they both didn’t invent binary. Progress in the human race, much like a quantum computer, happens in parallel. Innovation, by its nature, is uncertain.
And we often break it when we observe it, when we skew it toward what we believe is needed. It’s tempting to trace pathways across the past to now, but there’s an infinite amount of information and inspirations we’ve lost, that go on living in some entangled part of the minds of those we encounter.
If there’s anything I take away from the remarkable stories of binary and the fast advancements in quantum, it’s this: we need open ecosystems of innovation, because we can’t predict the future or very well remember the past. At least not alone.
If we want to solve human problems, we’ll need as many humans working on them from as many different walks of life as possible. We’ll need a high tolerance for advancements that feel useless, even for centuries, and a high ability to look back at the past with new eyes to move toward the future.
I’ve talked a lot about men in this chapter. History isn’t very good at remembering the women, however vital they’ve been to the story. For example, Boole’s wife, Mary Everest, went on to become a mathematician in her own right after George’s death, and the couple’s five daughters each had distinguished careers of their own.
In the next essay, I’m going to talk about the women who got us where we are now, the programmers who were able to abstract binary into a language that humans could speak to commune with the machine. It was always women misbehaving who gave us machines that can follow our instructions.
I’ve got a lot of essays planned beyond this next one, too. I’ll take my time getting the research right, but it’s fair to say you can get excited.
If you want to be sure to get all my updates, subscribe to my monthly newsletter.
‘Til next time!