Bay City, Michigan
Sales (989) 892-9242                 Support (989) 686-8860
Everyone was aware that there was going to be a "computer problem" when the new century rolled around. Few people really understood the scope of the problem, but they knew it was there. The media covered the threat extensively. Fear mongers cashed in with solutions and survival gear. Self-proclaimed experts talked about how our infrastructure was going to dissolve into a quivering mass of wires and burned-up computer chips.
These days nobody even remembers the near-panic state. It was just a non-event. What bothered us then were the myths, hype, and misinformation. Some people had an obvious profit motive and preyed on people's fears. Others simply repeated things they'd heard without understanding what they were saying.
Some folks abused the situation and used it as an excuse or smoke screen to cover other issues. Crying "Y2K" was a great way to get the boss (or the public, if you're in government) to buy you a new computer. Sure they bought! Who could prove it wasn't necessary?
Perhaps we should learn from the past. Look back at a few of our favorite Y2K Myths and see how silly they seem now.
This is how we wrote it up in the last months of 1999:
OUR FAVORITE MYTHS:
- Myth 1: Y2K is a "bug"
- Myth 2: Computers can't tell if year 00 is 1900 or 2000.
- Myth 3: Computers will stop working.
- Myth 4: Smoke detectors are at risk.
- Myth 5: Embedded chips will cause machines to fail.
SOME OTHER DATES:
- Date 1: September 9, 1999 is NOT 9999.
- MYTH 1: Y2K is a BUG.
Despite all the hype and the implication that this whole thing is a bug, IT IS NOT.
The year 2000 dilemma we face today is the result of programmers using 6 digits to store dates. Typically the format is yymmdd or mmddyy, representing year, month and day. Each of these is 2 digits.
The reason that was done is simple economics. Until very recently, computer storage was a very expensive and precious commodity. Today your new $1,200 PC will come with BILLIONS of bytes of disk storage and MILLIONS of bytes of RAM (memory). 30 years ago a large computer cost half a million dollars. It might have an 80 million byte hard drive which itself cost $50,000. Most programmers thought that 128K (thousand) bytes of memory was a lot. You could get more, but it was more costly per byte than disk space!
Programmers in those days spent days devising the most efficient and cost-effective methods of storing data. We were concerned with every byte we used, and we took every measure possible to save a byte here and a byte there.
Before you say that TWO BYTES for the year is not much of a savings, remember that a data file might contain several dates in every record and there could be hundreds of thousands of records in a file. Those two-byte savings translated into pure gold!
Another factor that we considered was plain old speed. Besides being much smaller in those days, computer disks and tapes were much slower than today's equipment. Speed is a function of how much data must be processed. If we could save 2 bytes per record over 200 thousand records, we just might be able to get the payroll out on time.
So the 6 digit year is NOT a bug. It's a well-established and well-considered trade-off based on time, economics and scarce resources.
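The trade-off described above is easy to put in concrete terms. Here's a minimal sketch (in Python, purely for illustration; real systems of the era were COBOL or assembler, and the record counts here are the article's own example figures, not from any real system):

```python
# Pack a date the old way: 6 characters, "yymmdd", keeping only
# the last 2 digits of the year.
def pack_date(year, month, day):
    return f"{year % 100:02d}{month:02d}{day:02d}"

packed = pack_date(1999, 12, 31)
print(packed)  # "991231" -- the century is simply not stored

# The savings at scale: say 3 dates per record and 200,000 records,
# with 2 bytes saved per date by dropping the century.
bytes_saved = 2 * 3 * 200_000
print(bytes_saved)  # 1,200,000 bytes -- real money at 1970s disk prices
```

On hardware where an 80-megabyte drive cost $50,000, a megabyte shaved off one file paid for itself many times over.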
- MYTH 2: Computers can't tell if year 00 is 1900 or 2000
In truth, computers can't tell if the number 00 is even a year at all. It could be anything from a year to an age to a weight to money.
The real problem is software.
Software is the programs that tell the computer what to do. The software is nothing but a list of instructions. It can be written to do anything that a person can describe as a step by step process.
The problem we face is that older software was generally not written to worry about the year being outside the current century. To compute a person's age the program might simply subtract the birth year from the current year, using two digits for both. 99 - 47 = 52 years old.
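The age calculation just described can be sketched in a couple of lines (Python used for illustration; the original programs were typically COBOL):

```python
# The classic two-digit-year age calculation: subtract the
# two-digit birth year from the two-digit current year.
def age_two_digit(current_yy, birth_yy):
    return current_yy - birth_yy

print(age_two_digit(99, 47))  # 1999 - 1947: correct, 52
print(age_two_digit(0, 47))   # 2000 - 1947: wrong, -47 instead of 53
```

Nothing crashes; the program just quietly produces a nonsense answer once the current year wraps to 00.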
One very common solution to the year 2000 problem is to change the programs to be a bit more discriminating. If the date is stored on tape or disk, the program can read the date into memory and convert the two digit year to a four digit year with some degree of confidence.
The problem is that this requires a programmer to find the program, find where the dates are used, devise a reasonable set of rules for date conversion, then change the program.
This solution is not difficult... on a small scale. The problem is that there may be hundreds of programs in a system (like Payroll) and there may be hundreds of systems. What's more, all of the systems may interrelate and exchange data, and all of the potential interactions must be found and checked.
So it's not a difficult fix, just a time consuming one. That's the real problem.
- MYTH 3: Computers will not work on Jan 1, 2000.
Some people seem to think that computers just won't work after January 1, 2000. WRONG. The on-board clock chip has nothing to do with the computer running. Sure, some older clock chips won't handle the date change but the most significant result is simply the wrong date being reported when the computer starts.
Remember that PCs didn't even have clock chips for the first few years. You always had to key in the date and time when you started your computer. Many people didn't even do that. They just ran the things with the same date every day! (Much like a VCR flashing 12:00 forever.)
When "clock/calendar" boards came out they were a luxury item. You bought one separately, and for the first time you automatically had the correct date every time the computer was turned on.
Historically, mainframe computers didn't have clocks either. Part of the power on sequence was to set the date and time.
Once a computer has been booted up and is running, almost all systems maintain the date and time through software as part of the operating system (OS). Most OS's don't care WHAT the date is. They just report it to other programs when requested.
So it's still a SOFTWARE problem. The operating system (Windows for most of the world) may report the date wrong, but it won't stop the computer from physically running.
- MYTH 4: Smoke detectors could fail!
We've heard this one a lot recently. People are warned to check their smoke detectors.
Does this make sense? When you get a new smoke detector all you do is put in a battery. There is no clock, no time display and no reason to have one. All the thing does is detect smoke.
The one possible exception would be a central system that controls smoke detectors in large buildings or complexes. Even then, it is unlikely that there is any clock or date related function.
At most it would be a test routine set to try the system at prescribed intervals. If that IS the case it is most likely based on hours, not dates.
- MYTH 5: Embedded chips will cause equipment failures
This seems to be the bugaboo that frightens most people. It's probably because we don't know where those chips are or how they're used.
Common sense can help here. Coffee makers that have timers may be using an embedded chip. Even so, it's probably just a CLOCK and not a calendar. After all, how many people would need a coffee maker that could start the coffee a week from now?
What about your VCR? Now THAT could be a problem, since most systems will let you program up to a year in advance and base it on the date. I've never programmed a VCR for more than a DAY in advance... I can't imagine programming a YEAR ahead. Do the networks even KNOW what they're going to show 12 months from now?
Since few people know if an appliance or machine has an embedded chip we have to use some common sense to decide if there could be one that would be a problem:
If you are required to set a DATE to use a piece of equipment AND that date is significant to the operation of the equipment, THEN you should ask some questions. If there isn't even a CLOCK then it's unlikely that an embedded chip is calendar controlled.
SOME OTHER DATES
There is a growing awareness of some of the other dates that could be a problem. Some are real. Some have already come and gone.
Here's one we've heard a lot in the past couple of months: September 9, 1999!
This page has been updated since that date and we're pleased to say that the world did not end. For that matter, we didn't hear of a single problem. It was pretty much a NON-EVENT. We'll leave these comments here because they illustrate the misinformation concerning other date myths too.
The 9/9/99 concern was that some software uses all nine's to mark the end of a file or list. That's true but here's why it is unlikely to matter:
- This technique is rare. It's almost never used for files. It is sometimes used internally for tables and lists held in memory.
- COBOL programs, which are often cited as major Y2K carriers due to their age, typically use "HIGH-VALUES" to mark the ends of lists. That value is not 9's but is the highest value that can be represented on the particular computer running the program.
The place we most often encounter the 9's construction is in example programs used in introductory programming courses. It's easy for the novice to understand. Perhaps that's why it's received such attention in the media.
- Lastly, September 9, 1999 is NOT all nines! Software that uses dates for processing purposes stores the date in fixed length fields. That's why we have the Year 2000 problem in the first place: Dates are stored in SIX DIGITS such as "mmddyy".
What that means is that September 9, 1999 is actually stored as "090999". Storing the date as "9999" would be illogical and awkward. The only date that might be a problem would be 99/99/99, which is not very likely!
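You can check the claim yourself. In a fixed six-digit "mmddyy" field, September 9, 1999 comes out fully zero-padded (Python shown purely as a quick way to demonstrate the formatting):

```python
from datetime import date

# Format September 9, 1999 the way a fixed-width mmddyy field stores it.
d = date(1999, 9, 9)
print(d.strftime("%m%d%y"))  # "090999" -- six digits, not "9999"
```

The zero padding is exactly what fixed-length fields require, which is why a run of all nines never appears for that date.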