When you press a key, watch a video, or call a friend, your device doesn’t understand “A,” “blue,” or “hello.” Deep inside, it only speaks one language: two tiny symbols, 0 and 1. In our last post we met Boolean algebra — the arithmetic of true/false, on/off, 0/1. Now let’s take a journey to see how those two little symbols grew into the foundation of our entire digital universe.

Long before smartphones, early engineers asked: “If a telegraph wire can only be on or off, how can we send numbers?” Humans use the decimal system (ten digits, 0–9). Machines needed a base that matched their physical reality. Two states meant base-2, or binary.

Here’s the simple process that converts any decimal number to binary: divide the number by 2, write down the remainder, and repeat with the quotient until you reach 0. The remainders, read from last to first, are the binary digits. Let’s use 42 as an example:

42 ÷ 2 = 21, remainder 0
21 ÷ 2 = 10, remainder 1
10 ÷ 2 = 5, remainder 0
5 ÷ 2 = 2, remainder 1
2 ÷ 2 = 1, remainder 0
1 ÷ 2 = 0, remainder 1

Reading the remainders from bottom to top gives 101010 — and indeed 32 + 8 + 2 = 42.
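If you’d like to watch that process run, here’s a minimal Python sketch of the same repeated-division idea (the function name to_binary is just for illustration; Python’s built-in bin() does the same job):

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary string by repeated division."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the next binary digit
        n //= 2                  # keep dividing the quotient by 2
    return "".join(reversed(bits))  # remainders are read bottom-up

print(to_binary(42))  # 101010
print(bin(42))        # 0b101010 (built-in cross-check)
```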

Numbers were easy. But how do you tell a machine to print “Hello”? In the 1960s, a group of engineers created a universal dictionary: ASCII, the American Standard Code for Information Interchange. Each character — A, B, C, a comma, a space — got its own 7-bit binary number. They even arranged it cleverly. Uppercase A–Z line up from 1000001 to 1011010 so computers can compare and sort letters quickly. It’s like giving every letter a house number on the same street.
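You can peek at those house numbers yourself. Here’s a small Python sketch that prints each character of “Hello” alongside its code and its 7-bit binary pattern:

```python
# Print each character of "Hello" with its ASCII code and 7-bit binary form.
for ch in "Hello":
    code = ord(ch)  # the character's numeric code
    print(f"{ch} -> {code:3d} -> {code:07b}")

# H ->  72 -> 1001000
# e -> 101 -> 1100101
# l -> 108 -> 1101100
# l -> 108 -> 1101100
# o -> 111 -> 1101111
```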

ASCII worked fine in the English-speaking world. But soon computers had to handle accented letters, Asian scripts, scientific symbols, even emojis. The answer was Unicode — a code space large enough to give a number to every character in every language.
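The same trick still applies: every character, from an accented letter to an emoji, boils down to a number, just drawn from a far larger range. A tiny Python check (code points shown in the usual U+ hexadecimal notation):

```python
# Unicode assigns a code point to every character, not just English letters.
for ch in ["A", "é", "你", "😀"]:
    print(f"{ch} -> U+{ord(ch):04X} ({ord(ch)})")

# A -> U+0041 (65)
# é -> U+00E9 (233)
# 你 -> U+4F60 (20320)
# 😀 -> U+1F600 (128512)
```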

Text is just the beginning. Every image you see on a screen is stored as numbers too. Think of colors: any pixel can be described by how much red, green, and blue it contains. Each of those values is typically stored as an 8-bit binary number from 0 to 255; combine the three and you get a color. The wallpaper on your screen may look like one seamless picture, but behind it the screen is actually a huge grid, like a giant chessboard made of millions of tiny squares, each assigned its own color. If a screen has fewer pixels, each pixel covers more space and the result is a blockier, less detailed image (fans of old Mario games will remember those chunky squares). When a screen has far more pixels, each one is smaller and the image becomes sharp and lifelike because the details are spread across many tiny points.
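As a concrete sketch (assuming the common 8-bits-per-channel scheme, so each value runs from 0 to 255), here’s how a few familiar colors look as raw binary:

```python
# A few colors as (red, green, blue) triples, assuming 8 bits per channel.
colors = {
    "red":    (255, 0, 0),
    "purple": (128, 0, 128),
    "white":  (255, 255, 255),
}

for name, (r, g, b) in colors.items():
    # Each channel becomes an 8-bit binary number; a pixel is the three together.
    print(f"{name:6s} -> {r:08b} {g:08b} {b:08b}")

# red    -> 11111111 00000000 00000000
# purple -> 10000000 00000000 10000000
# white  -> 11111111 11111111 11111111
```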

Even audio can be digitized. Microphones pick up sound waves, and an analog-to-digital converter (ADC) samples the wave thousands of times per second. Each sample becomes a binary number, creating the digital audio you listen to every day. The idea is simple: take a continuous audio signal and break it into discrete slices. Each slice is then assigned a value that represents the signal’s height at that point. If the wave is divided into many slices, the recording captures subtle variations and sounds more accurate; if it’s divided into only a few, the result is rough and less clear. The figure below shows an example of how a continuous wave is converted into discrete steps.
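To make the idea concrete, here’s a rough Python sketch of the sampling-and-quantizing step. The tiny sample rate and bit depth are only for illustration; real audio commonly uses 44,100 samples per second at 16 bits per sample:

```python
import math

SAMPLE_RATE = 8   # samples per second (deliberately tiny, just for illustration)
BIT_DEPTH = 4     # bits per sample -> 2**4 = 16 possible levels
FREQ = 1          # a 1 Hz sine wave standing in for a sound signal

levels = 2 ** BIT_DEPTH

for i in range(SAMPLE_RATE):
    t = i / SAMPLE_RATE                       # time of this sample in seconds
    value = math.sin(2 * math.pi * FREQ * t)  # continuous wave, between -1 and 1
    # Quantize: map the range [-1, 1] onto the integers 0 .. levels - 1.
    sample = round((value + 1) / 2 * (levels - 1))
    print(f"t={t:.3f}s  wave={value:+.2f}  sample={sample:2d}  bits={sample:04b}")
```

With only 8 samples and 16 levels the wave comes out rough; raise the sample rate and bit depth and the digital version hugs the original curve more closely, which is exactly the trade-off described above.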

From a telegraph’s on/off clicks to Unicode’s millions of characters, from RGB pixels to sampled sound waves, it all traces back to one simple idea: representing information with 0 and 1. Those two symbols have built the language of computers and, in turn, our entire digital lives.

But have you ever wondered where all these 0s and 1s are actually kept? What physical medium holds them? After all, it’s impossible to separate the digital world from physics: every bit must be stored and processed in some physical medium. That’s a story for another episode, so stay tuned!
