Even if you do not remember the “Millennium bug” or “Y2K problem” – and I guess, for some younger readers, this would be considered ancient history – I am sure that you have heard about it. In the late 1990s we were all sure that chaos would occur as the year 2000 dawned. The problem was that any code written to process dates might have used just two digits to store the year and, hence, when “99” clocked around to “00”, confusion would result. We fully expected power to go off and planes to fall out of the sky. An entire “profession” of “Y2K consultants” appeared and, I imagine, made a lot of money.
In the event, it was an anti-climax. The millennium dawned and the world carried on much as it had before. Whether this was because all the preventative work fixed the problem or the problem never really existed on a large scale in the first place, we will probably never know. I did not hear of a single system failing, but it seemed to be a valid concern at the time.
Still, it is good to have that behind us. But, it is all going to happen again and could be worse this time …
Since computers work in binary and the Y2K bug was a decimal-digit problem, I did wonder whether it would come to anything. Yes, I have heard of Binary Coded Decimal. But I thought that a binary wrap-around might be a more difficult problem, which was bound to occur sometime. It looks like I was right.
Many systems use a 32-bit value to track the date and time by simply counting seconds. That should be OK, as it gives capacity for 136 years or so, which sounds like a long way in the future. However, that rather depends on the starting date for this date/time format. The good news in this respect is that the start date – the Unix epoch – is the beginning of 1970, which would mean that we can rest easy for the remainder of this century.
However, this assumes that an unsigned 32-bit value is used. As far as I can tell, it is usually a signed value – I have no idea why – and this reduces the capacity to 68 years. So, it should All Go Wrong in 2038 sometime.
I have no idea what systems are likely to fail, but broadly, anything based on or related to UNIX will be in trouble. By that time, I will be about 80, and probably less interested in the workings of embedded systems, but I may be concerned about the integrity of medical instrumentation.
Anyway, we have lots of time to get this fixed, just like we did last time …
I'll be in my late 60s by then. Assuming I'm retired, I like to joke that I plan on coming out of retirement to help fix Y2038 bugs, but the kicker is: my hourly rate will be $2038.00.
It is a shame that 32-bit Linux hasn't provided a solution for this (last I checked). I believe the only solution presented to date is to use 64-bit CPUs. Perfectly valid in the PC world, but I have no doubt there will be millions upon millions of 32-bit embedded Linux systems still active in that timeframe. Each product will have to be carefully evaluated for flaws and the impacts of those flaws.
One thing I started doing many years ago: any new protocol or file format or API involving the typical "seconds since Unix epoch" timestamp, I always define as a 64-bit value, even if the underlying OS only provides 32-bit values. I believe this is particularly critical with communication and data exchange protocols. It at least means the data exchanged between devices/computers will be ready for any underlying improvements.
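A sketch of what this practice might look like on the wire. The struct and function names here are hypothetical illustrations, not from any real protocol: the timestamp field is always carried as a 64-bit big-endian count of seconds since the Unix epoch, regardless of how wide the host's `time_t` happens to be.

```c
#include <stdint.h>

/* Hypothetical wire format: the timestamp is always eight bytes,
   most significant byte first, independent of the host's time_t. */
typedef struct {
    uint8_t seconds_be[8];   /* 64-bit seconds since the Unix epoch */
} wire_timestamp;

/* Serialise a 64-bit second count into the wire format. */
void wire_timestamp_pack(wire_timestamp *w, int64_t secs)
{
    uint64_t u = (uint64_t)secs;   /* well-defined even for negatives */
    for (int i = 0; i < 8; i++)
        w->seconds_be[i] = (uint8_t)(u >> (8 * (7 - i)));
}

/* Recover the 64-bit value on the receiving side. */
int64_t wire_timestamp_unpack(const wire_timestamp *w)
{
    uint64_t u = 0;                /* assemble in unsigned arithmetic */
    for (int i = 0; i < 8; i++)
        u = (u << 8) | w->seconds_be[i];
    return (int64_t)u;
}
```

The point is that a value such as 4,102,444,800 (1 January 2100) round-trips intact even between hosts whose native `time_t` is still 32 bits; only the local conversion to calendar time needs fixing later, not the stored or transmitted data.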