Y2038 and the "32-bit" time clock
Posted By: Wolfspirit, on host 206.47.244.92
Date: Thursday, January 6, 2000, at 20:45:00
In Reply To: one problem. . . posted by shadowfax on Monday, January 3, 2000, at 09:10:47:

Erm. I think we're getting our technical terminology confused here. This topic raises a great number of issues too detailed to discuss fully here... like, for example, Unix's software dependence on a "time_t" counter on mainframes, as opposed to using a Real Time Clock (RTC) on PC platforms. Another issue is that the width of the data registers in 16-bit or 32-bit processors has *nothing* to do with the timekeeping registers in an RTC. Anyway, I don't want to stress out everyone while they read this, so I'll just address your points separately and fervently hope my explanations are clear enough.


> first, according to that, 16 bit operating systems such as win3.11 should have already bit it, because if in only 38 years 32 bits were gonna run out of room, 16 bit would have run out of room a long time ago.

I wasn't implying that DOS or Win3.1x use a 2^16 integer datatype (time field) similar to Unix's. For one thing, it's physically impossible. A 2^32 (signed) integer covers ±2.1 billion seconds, a total span of about 136 years... but a time field only 2^16 wide tops out at 65,536, which is inadequate to represent even *one* day of seconds (86,400 of them). So the DOS and Windows platforms do not track time like mainframes do (more on that below).
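To put rough numbers on that, here's a tiny C sketch (the constants are just illustrative arithmetic of mine, nothing taken from DOS itself):

    #include <stdio.h>

    int main(void)
    {
        /* Seconds in one day versus the largest 16-bit value. */
        unsigned long secs_per_day = 24UL * 60UL * 60UL;  /* 86,400 */
        unsigned long max_16bit    = 65535UL;             /* 2^16 - 1 */

        /* Largest positive value of a signed 32-bit counter. */
        long max_32bit = 2147483647L;                     /* 2^31 - 1 */

        printf("One day = %lu seconds; 16 bits top out at %lu.\n",
               secs_per_day, max_16bit);
        printf("Signed 32 bits = %ld seconds, about %ld years.\n",
               max_32bit, max_32bit / (365L * 24L * 60L * 60L));
        return 0;
    }

That last line works out to roughly 68 years in either direction from the epoch, which is where the ~136-year total span (and the year-2038 rollover for a 1970 epoch) comes from.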


> second, that's assuming that modern BIOSes are set to 1980 as a start date. Most of them are not. (what would be the point of setting a bios to a date that hasn't occurred for 20 years?)

You're skipping a step here. In PCs, the BIOS by itself does not maintain the system date; that job is handled by the CMOS RTC chip on the motherboard. When the system is booted, the O/S obtains the system date from the RTC via the BIOS. The O/S typically converts that date to the number of days since January 1, 1980, plus the number of seconds since midnight of the current day. In order to retain backwards compatibility with existing PC software and existing clock architecture, the 1980 date had to be maintained. Moreover, any time counter on any system *has* to be initialized from some given start-point as a reference; thus 1980 is as good as any. In the end, whether the reference epoch for an O/S starts in AD 1601 (for NT) or in 1858 (VMS) or in 1980 (DOS) is irrelevant and arbitrary; the timer has to start *somewhere*.
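To make that "days since 1980" idea concrete, here's a minimal C sketch of such a conversion (the function and its arguments are my own illustration, not an actual BIOS or DOS interface):

    #include <stdio.h>

    /* 1 if the given year is a leap year, 0 otherwise. */
    static int is_leap(int year)
    {
        return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
    }

    /* Count the days from January 1, 1980 up to the given date. */
    static long days_since_1980(int year, int month, int day)
    {
        static const int month_days[12] =
            { 31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31 };
        long days = 0;
        int y, m;

        for (y = 1980; y < year; y++)
            days += is_leap(y) ? 366 : 365;
        for (m = 0; m < month - 1; m++)
            days += month_days[m];
        if (month > 2 && is_leap(year))
            days++;                      /* add Feb 29 of this year */
        return days + (day - 1);
    }

    int main(void)
    {
        /* Today's date, for instance: January 6, 2000. */
        printf("Days since 1980-01-01: %ld\n",
               days_since_1980(2000, 1, 6));
        return 0;
    }

The point isn't the exact code; it's that the day count is *computed relative to* the chosen epoch, so the epoch year itself is an arbitrary reference.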


> third, 64 bit architecture is already here. . .consumers see it in the latest game consoles (nintendo 64), and businesses are already getting it on mainframes.

Well, things begin to get sticky when deciding how to convert to a 2^64 time field. A "64-bit" signed time gives over 290 billion years in the past and future... In design architecture, it's considered unreasonable to limit the time value to some range less than geologic time. (BTW, you also have to be careful in flinging around that term "architecture," because it could refer to design in either hardware OR software... notwithstanding the Nintendo 64. :-)

I believe most current C compiler implementations, created for systems with 32-bit hardware registers, are able to handle a 64-bit software data type... today. But that doesn't resolve the problem of layers upon layers of, say, entrenched older 32-bit business software in current use. You can't just go, "Oh, we'll just recompile the O/S kernel to handle 2^64 time!" It's not that simple, because there are numerous references to time variables in both an O/S and in any applications made for it.

New hardware clocks keep track of elapsed time in units smaller than seconds (like 100 nanoseconds and shorter), and then they pass the time-and-date information up to a higher level where it gets represented in human-readable time. Even with 64-bit design, there will be limitations at this "higher level"... in the software... that will prove a challenge in the next decade(s).
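For a sense of scale, here's one more quick sketch comparing the range of a signed 64-bit counter at one-second ticks versus 100-nanosecond ticks (my own back-of-the-envelope arithmetic, done in floating point to keep it simple):

    #include <stdio.h>

    int main(void)
    {
        double max_64bit     = 9223372036854775807.0;  /* 2^63 - 1 */
        double secs_per_year = 365.25 * 24.0 * 60.0 * 60.0;

        /* Range when each tick is one second. */
        printf("1-second ticks: about %.0f years of range\n",
               max_64bit / secs_per_year);

        /* Range when each tick is 100 nanoseconds. */
        printf("100-ns ticks:   about %.0f years of range\n",
               (max_64bit * 100.0e-9) / secs_per_year);
        return 0;
    }

At one-second resolution you get the ~290-billion-year figure; at 100-nanosecond resolution the same 64 bits cover only about 29,000 years. That's the kind of trade-off the "higher level" software has to settle on.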

Wolf "whew -- aren't you glad that most computer operations are completely transparent (invisible) to the average user??" spirit
