The year 2000 issue is an umbrella term for three separate date-related computing issues, each of which can produce misinterpretations or miscalculations.
- The first issue is related to the way computer hardware and software traditionally stored date information. Historically, programmers specified a year using two digits (99) rather than four digits (1999). By assuming the first two digits of the year, they saved precious memory and storage.
This was an economical shortcut that made good sense for programmers twenty-five years ago, but it stopped making sense as the new century approached. Though programming practices have changed in recent years, some computer hardware and software may still have difficulty interpreting years after the turn of the century. And if a computer system stores or works with an unintended date, any calculations or information based on that date can produce incorrect results.
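To see how an unintended date leads to incorrect results, consider this minimal sketch (the function and values are hypothetical) of an elapsed-time calculation that uses only the stored two-digit years:

```python
def years_elapsed_two_digit(start_yy: int, end_yy: int) -> int:
    """Elapsed years computed from two-digit year fields only."""
    return end_yy - start_yy

# A record created in 1965 (stored as 65) and read in 2000 (stored as 00):
elapsed = years_elapsed_two_digit(65, 0)
print(elapsed)  # -65, rather than the correct 35
```

Any downstream logic built on that result, such as an age check or an interest calculation, inherits the error.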
Another contributing factor to this issue is the everyday practice of writing years with only two digits. Though each of us is accustomed to this two-digit shortcut, it forces software to infer which century was intended. An inferred century is simply not as reliable as a clearly specified four-digit year!
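As a concrete illustration of such inference, Python's `datetime.strptime` applies the POSIX pivot convention when it must guess the century behind a two-digit year: values 69-99 are read as 1969-1999, and values 00-68 as 2000-2068. The software's guess may or may not match what the person writing the date intended:

```python
from datetime import datetime

# The %y directive accepts a two-digit year and infers the century.
print(datetime.strptime("99", "%y").year)  # 1999
print(datetime.strptime("00", "%y").year)  # 2000
print(datetime.strptime("68", "%y").year)  # 2068, even if 1968 was meant
```

A birth year of 1968 written as "68" would be silently interpreted as 2068, which is exactly the kind of misinterpretation a four-digit year rules out.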