Welcome to Hitting the Books. With fewer than one in five Americans reading just for fun these days, we've done the hard work for you by scouring the internet for the most interesting, thought-provoking books on science and technology we can find and delivering an easily digestible nugget of their stories.
Humble Pi: When Math Goes Wrong in the Real World
by Matt Parker
The start of the 21st century was a time of excitement and trepidation for the world. On one hand, we sat on the cusp of an entirely new millennium, filled with countless possibilities. On the other hand, there was a small chance that the whole of modern human civilization would come crashing down around us because coders had for years used a shorthand method to denote the current date, and our computer systems might not have been able to differentiate between the year 2000 and the year 1900. We dodged a bullet when the Y2K bug fizzled out the first time. Will we be as lucky in 2038, when we once again face a similar threat?
Math is hard, and even the brightest minds of our generation can get it wrong. But in the modern world, something as simple as a rounding error can pose a significant threat with incalculable consequences, as author Matt Parker illustrates in his hilarious and insightful collection of mathematical mistakes, Humble Pi.
At 3:14 a.m. on Tuesday, January 19, 2038, many of our modern microprocessors and computers are going to stop working. And all because of how they store the current date and time. Individual computers already have enough problems keeping track of how many seconds have passed while they are turned on; things get worse when they also need to keep completely up-to-date with the date. Computer timekeeping has all the ancient problems of keeping a calendar in sync with the planet plus the modern limitations of binary encoding.
When the first precursors to the modern internet started to come online in the early 1970s, a consistent timekeeping standard was required. The Institute of Electrical and Electronics Engineers (IEEE) threw a committee of people at the problem, and in 1971 they suggested that all computer systems could count sixtieths of a second from the start of 1971. The electrical power driving the computers was already coming in at a rate of 60 hertz (vibrations per second), so it simplified things to use this frequency within the system. Very clever. Except that a 60-hertz system would exceed the space in a 32-digit binary number in a little over two years and three months. Not so clever.
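That overflow arithmetic is easy to check for yourself. Here's a minimal C sketch (my illustration, not from the book), assuming the tick count was treated as an unsigned 32-bit value, which is the reading that matches the two-years-and-three-months figure:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Assumption: the 32-bit tick counter is treated as unsigned. */
    double seconds = (double)UINT32_MAX / 60.0;   /* 60 ticks per second */
    double days    = seconds / 86400.0;
    printf("A 60 Hz 32-bit counter overflows after ~%.0f seconds\n", seconds);
    printf("That is ~%.1f days, or ~%.2f years\n", days, days / 365.25);
    return 0;
}
```

Running it gives roughly 71.6 million seconds, or about 2.27 years: just past two years and three months.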
So the system was recalibrated to count the number of whole seconds since the start of 1970. This number was stored as a signed 32-digit binary number, which allowed for a maximum of 2,147,483,647 seconds: a total of over sixty-eight years from 1970. And this was put in place by members of the generation who in the sixty-eight years leading up to 1970 had seen humankind go from the Wright brothers inventing the first powered airplane to humans dancing on the moon. They were sure that, by the year 2038, computers would have changed beyond all recognition and no longer use Unix time.
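To see exactly where that limit lands on the calendar, here's a short C sketch (my illustration, not Parker's code) that hands the 32-bit maximum to the standard library:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* 2^31 - 1: the largest value a signed 32-bit second counter can hold. */
    time_t limit = 2147483647;
    char buf[64];
    strftime(buf, sizeof buf, "%A, %B %d, %Y at %H:%M:%S UTC", gmtime(&limit));
    printf("32-bit Unix time runs out on %s\n", buf);
    return 0;
}
```

It prints Tuesday, January 19, 2038 at 03:14:07 UTC, which is exactly the moment this excerpt opens with.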
Yet here we are. More than halfway there and we're still on the same system. The clock is literally ticking. Computers have indeed changed beyond recognition, but the Unix time beneath them is still there. If you're running any flavor of Linux or a Mac, it is there in the lower half of the operating system, right below the GUI. If you have a Mac within reach, open up the app Terminal, which is the gateway to how your computer actually works.
Type in date +%s and hit Enter. Staring you in the face will be the number of seconds that have passed since January 1, 1970. If you're reading this before Wednesday, May 18, 2033, it is still coming up on 2 billion seconds. What a party that will be. Sadly, in my time zone, it will be around 4:30 a.m. I remember a boozy night out on February 13, 2009, with some friends to celebrate 1,234,567,890 seconds having passed, at just after 11:31 p.m. My programmer friend Jon had written a program to give us the exact countdown; everyone else in the bar was very confused as to why we were celebrating Valentine's Day half an hour early.
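Jon's actual countdown program isn't reproduced in the book, but a sketch in the same spirit takes only a few lines of C; it converts both milestones mentioned here back into calendar dates:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* The 1,234,567,890-second party in 2009 and the 2-billion-second
       mark in 2033, both from the text above. */
    time_t milestones[] = { 1234567890, 2000000000 };
    for (int i = 0; i < 2; i++) {
        char buf[64];
        strftime(buf, sizeof buf, "%A, %B %d, %Y at %H:%M:%S UTC",
                 gmtime(&milestones[i]));
        printf("%11lld seconds -> %s\n", (long long)milestones[i], buf);
    }
    return 0;
}
```

The output confirms the dates in the story: Friday, February 13, 2009 at 23:31:30 UTC, and Wednesday, May 18, 2033 at 03:33:20 UTC (around 4:30 a.m. in British summer time).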
Celebrations aside, we are now well over halfway through the count-up to destruction. After 2,147,483,647 seconds, everything stops. Microsoft Windows has its own timekeeping system, but macOS is built directly on Unix. More importantly, many significant computer processors in everything from internet servers to your washing machine will be running some descendant of Unix. They are all vulnerable to the Y2K38 bug.
I don't blame the people who originally set up Unix time. They were working with what they had available back then. The engineers of the 1970s figured that someone else, further into the future, would fix the problems they were causing (classic baby-boomers). And to be fair, sixty-eight years is a very long time. The first edition of this book was published in 2019, and occasionally I think about ways to future-proof it. Maybe I'll include "at the time of writing" or carefully structure the language to allow for things to change and progress in the future so that it doesn't go completely out of date. You might be reading this after the 2-billion-second mark in 2033; I've allowed for that. But at no point do I think about people reading it in 2087. That's sixty-eight years away!
Some steps have already been taken toward a solution. All the processors that use 32-digit binary numbers by default are known as 32-bit systems. When buying a new laptop, you may not have paused to check what the default binary encoding was, but Macs have been 64-bit for nearly a decade now, and most commonly used computer servers have gone up to 64 bits as well. Annoyingly, some 64-bit systems still track time as a signed 32-bit number so they can play nicely with their older computer friends, but for the most part, if you buy a 64-bit system, it will be able to keep track of time for quite a while to come. The largest value you can store in a signed 64-bit number is 9,223,372,036,854,775,807, and that number of seconds is equivalent to 292.3 billion years. It's times like this when the age of the universe becomes a useful unit of measurement: 64-bit Unix time will last for twenty-one times the current age of the universe, until (assuming we don't manage another upgrade in the meantime) December 4 in the year 292,277,026,596 CE, when all the computers will go down. On a Sunday.
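The age-of-the-universe comparison checks out. Here's the back-of-the-envelope version in C (my sketch, assuming a universe age of roughly 13.8 billion years, which is not a figure given in the excerpt):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* The largest signed 64-bit value, read as a count of seconds. */
    double max_seconds = (double)INT64_MAX;  /* 9,223,372,036,854,775,807 */
    double years = max_seconds / (365.25 * 24 * 60 * 60);
    /* Assumption: the universe is about 13.8 billion years old. */
    printf("64-bit Unix time lasts ~%.1f billion years\n", years / 1e9);
    printf("That is ~%.0f times the age of the universe\n", years / 13.8e9);
    return 0;
}
```

It reports about 292.3 billion years, roughly twenty-one universe-ages, matching the figures above.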
Once we live in an entirely 64-bit world, we are safe. The question is: will we upgrade all the multitude of microprocessors in our lives before 2038? We need either new processors or a patch that will force the old ones to use an unusually big number to store the time.
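If you're curious which camp a given machine falls into, one rough check (my sketch; it only covers the C library's own clock, not every counter buried in firmware) is the width of time_t:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* The width of time_t decides whether this system's clock
       can count past January 19, 2038. */
    if (sizeof(time_t) >= 8)
        printf("time_t is %zu bytes: good past 2038.\n", sizeof(time_t));
    else
        printf("time_t is %zu bytes: this system hits the wall in 2038.\n",
               sizeof(time_t));
    return 0;
}
```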
Here is a list of all the things I've had to update the software on recently: my lightbulbs, a TV, my home thermostat, and the media player that plugs into my TV. I am pretty certain they are all 32-bit systems. Will they be updated in time? Knowing my obsession with up-to-date firmware, probably. But there are going to be a lot of systems that will not get upgraded. There are also processors in my washing machine, dishwasher, and car, and I have no idea how to update those.

It's easy to write this off as a second coming of the Y2K "millennium bug" that wasn't. That was a case of higher-level software storing the year as a two-digit number, which would run out after 99. With a massive effort, almost everything was updated. But a disaster averted does not mean it was never a threat in the first place. It's risky to be complacent because Y2K was handled so well. Y2K38 will require updating far more fundamental computer code and, in some cases, the computers themselves.
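For contrast, the millennium-bug failure mode looked something like this (an illustrative sketch, not code from any real system):

```c
#include <stdio.h>

int main(void) {
    /* The classic Y2K pattern: store only the last two digits of
       the year and bolt "19" on the front when displaying it. */
    int stored = 99;                  /* meaning 1999 */
    int next   = (stored + 1) % 100;  /* rolls over to 0 */
    printf("The year after 19%02d comes out as 19%02d\n", stored, next);
    return 0;
}
```

The year after 1999 comes out as 1900, which is precisely the ambiguity armies of programmers spent the late 1990s patching away.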
From HUMBLE PI: When Math Goes Wrong in the Real World by Matt Parker, publishing on January 21, 2020 by Riverhead, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC. Copyright © 2019 Matt Parker. First published in Great Britain as HUMBLE PI: A Comedy of Maths Errors by Allen Lane, an imprint of Penguin Random House UK, 2019.