Overflow
There was a man named John Titor. Or so the story goes.
In 2000, someone appeared on an internet forum claiming to have arrived from the year 2036. He diagrammed how time travel worked, predicted future events, and wrote that his mission was to retrieve an IBM 5100 and bring it back. He needed a hidden capability of the machine to debug legacy computer systems in his own time.
Absurd. But one detail was strange. The IBM 5100 really did have a hidden capability. An undocumented ability to emulate the System/370 mainframe. Bob Dubke, one of its original engineers, confirmed this years later. Even within IBM, only a handful of people knew. How did an anonymous poster on an internet forum know about it? Nearly all of Titor's predictions were wrong. But the IBM 5100 claim remains unexplained. Many people learned about this through Steins;Gate. I'm one of them.
2036, the year Titor supposedly came from. Two years after that, another time bomb is waiting. January 19, 2038, 03:14:07 UTC. The UNIX clock rewinds. UNIX time counts the seconds since the epoch, midnight UTC on January 1, 1970, as a signed 32-bit integer. The maximum value is 2,147,483,647. One second past that, the value flips to -2,147,483,648, and the clock jumps back to 20:45:52 UTC on December 13, 1901. A 137-year rewind.
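You can watch the rewind in a few lines of C. A minimal sketch, assuming a host with a 64-bit `time_t` so the wrapped value can still be printed as a date (signed overflow is undefined in C, so the wrap is done explicitly through unsigned arithmetic):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* The last second a signed 32-bit counter can hold. */
    int32_t t32 = INT32_MAX;                    /* 2,147,483,647 */
    time_t t = (time_t)t32;
    printf("before: %s", asctime(gmtime(&t)));  /* Tue Jan 19 03:14:07 2038 */

    /* One more tick. Wrap via unsigned arithmetic to avoid undefined
       behavior; the conversion back is what the hardware does anyway. */
    t32 = (int32_t)((uint32_t)t32 + 1);         /* -2,147,483,648 */
    t = (time_t)t32;
    printf("after:  %s", asctime(gmtime(&t)));  /* Fri Dec 13 20:45:52 1901 */
    return 0;
}
```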
The Year 2000 problem came first. Years were stored as two digits, so 2000 was indistinguishable from 1900. The world panicked. The fix cost an estimated $300 to $600 billion globally. No major catastrophe actually happened, but that was because people spent years fixing it. The U.S. Naval Observatory's website displaying the date as "January 1, 19100" was a charming example of what slipped through.
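The mechanics of "19100" are mundane. In C, `struct tm`'s `tm_year` counts years since 1900, and code that pastes a literal "19" in front of it prints "19100" the moment the counter hits 100. Whether the Observatory's site did exactly this is my assumption, but it is the classic shape of the bug:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    time_t now = time(NULL);
    struct tm *t = localtime(&now);

    /* tm_year counts years since 1900: 99 in 1999, 100 in 2000. */
    printf("buggy:   19%d\n", t->tm_year);      /* "19100" in 2000 */
    printf("correct: %d\n", 1900 + t->tm_year);
    return 0;
}
```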
Was storing two digits programmer laziness? Knowing the constraints of the era, it's hard to say yes. In the 1960s, memory cost roughly a dollar per bit. Eight dollars per byte. Eight thousand dollars per kilobyte. Punch cards had a hard physical limit of 80 columns. Every column mattered. Using two digits instead of four to save two bytes was a rational decision. Some people saw the problem coming, but most programmers never expected the software to last until 2000.
The 2038 problem has the same root. The nature of the constraint is different: not memory cost, but the choice of bit width. But the structure of the decision is the same: "the failure date will outlast my career." When 32 bits were deemed sufficient, 2038 was 68 years away.
The migration to 64-bit time on major operating systems is mostly done. The problem is the devices that will never be updated. Embedded systems with no firmware update path, still ticking away in 32-bit time, scattered everywhere. When 2038 arrives, they'll jump back to 1901.
Overflow isn't just a UNIX problem. NTP's timestamp counts seconds since 1900 in 32 bits and wraps in 2036. GPS broadcasts its week number in a 10-bit field, so it resets every 1,024 weeks; rollovers actually happened in 1999 and 2019. Clocks everywhere are quietly counting toward their own limits.
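The GPS arithmetic fits in a few lines. A sketch that ignores leap seconds (real GPS time runs ahead of UTC by the leap seconds accumulated since 1980) and uses the GPS epoch, January 6, 1980, expressed as a UNIX timestamp:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* GPS epoch (January 6, 1980, 00:00:00 UTC) in UNIX time. */
    const long long GPS_EPOCH = 315964800LL;
    const long long WEEK = 7LL * 24 * 3600;

    long long weeks = ((long long)time(NULL) - GPS_EPOCH) / WEEK;

    /* The legacy navigation message carries the week number in 10 bits,
       so receivers only ever see it modulo 1024. */
    printf("full week:      %lld\n", weeks);
    printf("broadcast week: %lld\n", weeks % 1024);
    return 0;
}
```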
People plan for their own careers, not beyond. When I estimate data volume for a new service, I do the math in my head through retirement. The rest is someone else's problem. Programmers in the 1960s thought the same thing. Engineers in the 2000s still do.
Nobody knows who John Titor really was. But when the UNIX clock rewinds in 2038, there might be a post on some forum. "Told you so."