Comparison · 7 min read

Morse code vs binary: similarities and key differences

Both encode information with two symbols, but the timing rules, alphabet size, error tolerance, and bandwidth are completely different.

People often describe Morse code as "the original binary," but the comparison is loose. Here's a side-by-side that respects both.

What they have in common

  • Two-symbol alphabet. Dot/dash, or 0/1.
  • Variable-length encoding. In Morse, common letters are short (E = single dit, T = single dah). Binary systems use Huffman coding for the same reason: give the most frequent symbols the shortest codewords.
  • Encodable by hand. A trained human can produce both with a key/keyboard.
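The frequency-to-length idea is easy to see in code. This is a minimal sketch using a hypothetical excerpt of the Morse table and rough English letter frequencies (the figures are approximate, not from the article):

```python
# Sketch: Morse assigns shorter codes to more frequent letters,
# the same idea Huffman coding formalizes for bits.
# MORSE is a hypothetical excerpt, not the full alphabet.
MORSE = {"E": ".", "T": "-", "A": ".-", "N": "-.", "J": ".---", "Q": "--.-"}

# Approximate relative frequency in English text (rough figures, %).
FREQ = {"E": 12.7, "T": 9.1, "A": 8.2, "N": 6.7, "J": 0.2, "Q": 0.1}

for letter in sorted(MORSE, key=lambda c: -FREQ[c]):
    code = MORSE[letter]
    print(f"{letter}: freq {FREQ[letter]:5.1f}%  code {code!r} ({len(code)} symbols)")
```

The most frequent letters (E, T) come out with one-symbol codes; the rarest (J, Q) take four.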

Key differences

Timing matters in Morse

Binary is a sequence of discrete bits. Morse is a stream where the duration of each signal carries meaning: a dot is 1 unit, a dash is 3, the gap between elements within a letter is 1, a letter gap is 3, a word gap is 7. Two adjacent dits and a single dah only differ in timing.

This is why /translate/ talks about WPM (words per minute) and renders a rhythm bar that animates with the audio — the same letters at different speeds are still the same letters, but the listener has to track the tempo.

Alphabet size

Binary is 1-bit: each symbol carries at most one bit of entropy. Morse symbols carry roughly 1 bit each, but the alphabet (the set of dot-dash strings that map to a character) is large: 26 letters, 10 digits, ~20 punctuation marks, plus prosigns. To encode "A" in binary you need a chosen encoding like ASCII (7 bits, usually stored in 8). To encode "A" in Morse you need one dot then one dash (.-).
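The "A" example is two print statements away. A minimal sketch comparing the two representations:

```python
# "A" through two lenses: its 8-bit ASCII pattern vs its 2-symbol Morse code.
ascii_bits = format(ord("A"), "08b")  # ASCII code point 65, zero-padded to 8 bits
morse_a = ".-"                        # dot, dash

print(ascii_bits)  # 01000001
print(len(ascii_bits), "binary symbols vs", len(morse_a), "Morse symbols")
```

Both are two-symbol encodings; Morse just spends far fewer symbols per character because its alphabet maps whole characters, not fixed-width codes.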

Error tolerance

Binary has no inherent error correction. A single flipped bit changes the character. Morse, transmitted by ear, is highly redundant: humans can guess missed dots from rhythm and context. CW operators routinely copy with 80–90% accuracy under noise that would defeat raw binary at the same bandwidth.
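The single-bit fragility is easy to demonstrate. One XOR against an ASCII byte silently produces a different letter:

```python
# A single flipped bit turns one ASCII character into another:
# flip bit 1 of "A" (0x41) and you get "C" (0x43).
original = ord("A")          # 0b01000001
corrupted = original ^ 0b10  # XOR flips the second-lowest bit
print(chr(original), "->", chr(corrupted))  # A -> C
```

Nothing in the raw bitstream flags the corruption; detecting it requires an added layer (parity, checksums, error-correcting codes).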

Bandwidth

A 20 WPM Morse signal occupies about 100 Hz on the air. A 9600-baud binary modem occupies 2,000+ Hz. For low-power long-distance work, Morse delivers more meaning per hertz than almost anything else.
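The 20 WPM figure can be sanity-checked with the standard PARIS convention (one "word" = 50 dit units), which gives the well-known formula dit = 1.2 / WPM seconds. A back-of-envelope sketch:

```python
# Back-of-envelope Morse timing at 20 WPM using the PARIS convention:
# one word = 50 dit units, so dit length = 60 / (50 * WPM) = 1.2 / WPM seconds.
wpm = 20
dit_seconds = 1.2 / wpm

# Fastest keying pattern is a string of dits: 1 unit on, 1 unit off.
keying_rate_hz = 1 / (2 * dit_seconds)

print(f"dit = {dit_seconds * 1000:.0f} ms")       # dit = 60 ms
print(f"keying rate ~ {keying_rate_hz:.1f} Hz")   # ~8.3 Hz fundamental
```

With realistic key-click shaping, the occupied bandwidth is a modest multiple of that ~8 Hz fundamental, consistent with the ~100 Hz figure above.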

Compatibility with computers

Binary is the native language of digital systems. Morse is encoded for human ears — to make a computer "speak Morse" we have to translate to/from another encoding (usually ASCII or UTF-8). That's what tools like our /morse.json map do.
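That translation layer is just a lookup table in each direction. This is a minimal sketch; the `MORSE` dict here is a hypothetical four-letter excerpt standing in for a full map like /morse.json:

```python
# Minimal Morse <-> text bridge via a character map.
# MORSE is a small hypothetical excerpt, not a full table.
MORSE = {"S": "...", "O": "---", "E": ".", "H": "...."}
REVERSE = {code: letter for letter, code in MORSE.items()}

def encode(text: str) -> str:
    """Text to Morse, with single spaces between letter codes."""
    return " ".join(MORSE[c] for c in text.upper())

def decode(code: str) -> str:
    """Morse back to text, splitting on the letter gaps."""
    return "".join(REVERSE[token] for token in code.split())

msg = encode("SOS")
print(msg)          # ... --- ...
print(decode(msg))  # SOS
```

Note that the computer never handles timing here; it round-trips between ASCII/UTF-8 strings and a symbolic dot-dash notation, leaving the rhythm to the audio layer.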

Summary table

| Property                   | Morse                    | Binary                  |
|----------------------------|--------------------------|-------------------------|
| Symbols                    | dit, dah (+ gaps)        | 0, 1                    |
| Timing-sensitive           | Yes (relative durations) | No (clocked discretely) |
| Variable-length characters | Yes                      | Yes (in Huffman/UTF-8)  |
| Human-decodable by ear     | Yes                      | No                      |
| Native to computers        | No                       | Yes                     |
| Bandwidth (per unit info)  | Very low                 | Higher                  |
| Standard year              | 1865 → 1909 (ITU)        | 1948 (Shannon)          |

So: Morse and binary aren't ancestors of each other. They're independent answers to the same question (how do you encode information with two symbols?), tuned for different transmission media — air-modulated sound versus electrical state.


Tags: compare, binary, encoding