The Y2K Computer Bug -- May 1999

by Ben Best



    Cryonicists are necessarily "technophiles", people who expect great things from technological progress. But the "Y2K Bug" could be a dangerous pothole in the road to the future. ("Y2K" refers to 2 "kiloyears", i.e., two thousand years, i.e., the year 2000, and a "bug" is a defect in a computer program.)

    Most of what I write in this article concerns computer issues that are seemingly unrelated to cryonics. I justify this on the grounds that cryonicists are survivalists, and the Y2K bug has the potential to threaten the survival of cryonics organizations, cryopreserved cryonicists and even living cryonicists. The forms of the threat and possible defenses against the threat depend upon the reality of the threat. My suspicion that there could be a risk of crisis has motivated me to spend a great deal of time trying to evaluate the problem to determine its seriousness. This has not proven to be an easy task.

    Many scientists have dismissed cryonics as being non-scientific because cryonics is so dependent upon speculations about future technology -- and is therefore not subject to scientific investigation. A similar criticism could be made concerning the impact of the Y2K Computer Bug. No one can have direct experience of what will happen in January, 2000 before January, 2000.

    But whether or not predicting the future is scientific, survival & daily life depend upon predictions about the future -- and science can improve those predictions. Without predictions no one would buy a house, train for a vocation, run a business or even prepare a meal. Most predictions are based on the assumption that the future will be much like the past. But predictions that involve radical deviations from the past are necessarily highly speculative. Sometimes survival depends upon such predictions.

    The Y2K problem can be very frustrating for someone in search of hard facts. Evaluations depend upon indirect evidence. Psychological factors play a significant role in evaluation. For many people the Y2K problem is too boring, too remote, too horrific or too intangible to contemplate. Most people refuse to seriously contemplate their mortality until they are on their deathbed -- or rationalize with wishful thinking about an afterlife. The majority of people who experience a heart attack die before reaching the hospital -- despite ample warning in most cases (death due to denial). Few people want to admit or believe that they are having a heart attack.

    Psychological factors also play a significant role among the alarmists, many of whom have no more information than those who ignore or deny problems. Alarmism is an attention-getting device. Sensationalism makes for big news. Vendors of weapons and "survival supplies" have economic incentives for fear-mongering. So do computer consultants who are selling Y2K computer fixes, financial consultants who are selling Y2K advice and legal consultants who are selling Y2K legal preparedness. The most comprehensive, "objective" and widely-quoted surveys of the prevalence and remediation of Y2K problems come from computer consultants who make large revenues from remediation: The Gartner Group, Capers Jones and Cap Gemini America.

    I struggle with my own psychological biases as I struggle to know the truth. Here I will try to present the best direct & indirect evidence I can muster, in the hopes of clarifying my own understanding and of forewarning those in the cryonics community who will listen to me of possible serious consequences. As frustrating as the Y2K question is, it also affords a unique & fascinating opportunity to examine the interconnected technological underpinnings of modern life. And the unfolding of events in the next year is bound to be dramatic.



    The Y2K bug has arisen from the use in computer programming of 2-digit representations of the year, for example "1984" written as "84". This abbreviation may create a wide variety of problems at the beginning of year 2000 when "99" becomes "00" on some systems, "100" on others and "??" on others. Programs that use dates for event-sequencing may malfunction by interpreting "00" as being before "99". Programs that produce "100" may crash because of insufficient storage allocated to accommodate the extra digit. Programs that sort by date may place events in incorrect order. Programs that recognize a time-stamp of "00" as "1900" may interpret this information to mean that food has spoiled, mechanical maintenance is long-past due, licenses have expired, deliveries have been missed, etc. At first glance this seems like a trivial bug, easy to find & easy to fix -- and inconsequential for applications that do not use date functions. This bug is NOT inconsequential, but the extent of the consequence is a matter of dispute.

    Computer programmers have been blamed for creating this problem out of "laziness", but in the early days of computing the use of two digits was mandated by the high cost of data storage. Moreover, in a business environment time is money, and the entry of 4 digits when 2 digits will suffice results in considerable savings of time & effort that will certainly continue past the year 2000.

    The four most common fixes for Y2K problems are (1) field expansion, (2) code replacement, (3) encapsulation and (4) windowing. Although expanding fields from 2-digits to 4-digits is the most lasting fix, it is rarely done, for the reasons just described. Rather than try to patch up an old system, new replacement systems are often built from scratch -- including many improvements in addition to Y2K fixes. However, these new systems can also include many new bugs, especially if they are built in a rush to meet Y2K deadlines.

    Encapsulation relies on the fact that the correspondence between days-of-the-week and calendar days (including leap-year adjustments) will continue to repeat in a cycle every 28 years for the next hundred years ("time capsules"). Resetting the date from 1999 to 1971 is a workable general solution only for systems without a database and with no data-exchange with other systems. Evansville, Indiana is turning the clock back 28 years to remediate the Y2K Bug in its traffic light system. A more restricted use of encapsulation adds 28 years to a pair of dates before doing time-interval calculation or sorting.

    Windowing refers to the use of software to interpret the century applicable to a two-digit code. In the time-series database I have supported, double-digits equal to or greater than "70" affix the prefix "19", but smaller double-digits affix "20". Thus, "72" becomes "1972", but "65" becomes "2065". Microsoft Excel 97 adds the "20" prefix to double-digits in the range of "00" to "29" and adds the "19" prefix to "30" and greater. Earlier versions of Excel used "20" rather than "30" as the pivot year. MS Access 95 added the prefix "19" to all double-digits, but MS Access 97 is consistent with Excel 97 in using "30" as the pivot year. The strategy of windowing makes sense for a company that sells software and that periodically advances the window, but it could create problems for databases & applications that use a very wide range of dates -- and for applications that must communicate with other applications using a different pivot year.

    The Y2K Bug has been grandiosely called "The Millennium Bug", but it is actually a century bug -- a problem that can recur every 100 years. Of interest to nitpickers, the year 2000 is not the beginning of the 21st century -- the year 2001 is the beginning of both the new century and the new millennium. The Western calendar was the creation of a 6th century monk named Dionysius Exiguus, who omitted the year zero. One practical reason for this is that there is no Roman numeral for zero. But A.D. 1 followed B.C. 1 because "Anno Domini 1" meant "the first year of the Lord". Infants in their first year are rarely said to be zero years old. An ordinal number became transformed into a cardinal number with the move to Hindu-Arabic numerals in the Middle Ages.

    Associated with the Y2K bug is the fact that the year 2000 is a leap year. This could cause problems in programs written by programmers who know that leap years are skipped every century, but who do not know that leap years are not skipped in centuries divisible by 400. The year 1900, although divisible by 4, was not a leap year, but the year 2000 is a leap year. This leap-year mistake can cause problems in programs that interpret 00 as 1900. March 3rd will be a Friday in 2000, but was a Saturday in 1900. Bank vaults may not open, elevators may not run and security systems could lock-out employees on 3-March-2000. Problems could also occur on 31-December-2000 for systems not expecting a 366th day. Such a leap-year problem occurred on 31-December-1996 at the Tiwai Point, New Zealand aluminum smelter when 660 process control computers shut down at midnight. Without computers to regulate temperatures, five pot cells overheated and were damaged beyond repair.

    An End-of-Week (EOW) Rollover problem may occur on 22-August-1999. The Navstar Global Positioning System (GPS) satellites use a 10-bit field for week number. The 1,024th week of operation will begin at midnight of August 21/22, when the week field will be re-set to zero. GPS was created by the US Navy to provide navigational data for ships & planes, but over 10 million commercial ships & planes (and much non-commercial navigation) now depend on GPS. Moreover, the accuracy of the GPS atomic clocks has made them a standard for thousands of financial institutions that calculate interest on large loans by the millisecond. The major problem is expected to occur with pre-1994 receivers. Those most affected will probably be small commercial shippers & fishing companies, trucking firms and recreational users whose receivers may give wrong locations & an incorrect date -- 6-January-1980, the zero date when GPS began.

    And as if there aren't enough problems, the 11-year cycle of solar flares will reach a peak in the first quarter of the year 2000 ("cycle 23", the 23rd recorded cycle). The increase in use of satellite communication probably means that more disturbances will occur this time than occurred 11 years ago. Signal fading of high frequency radio waves is commonly seen in these periods. Cell phones & GPS signals could be adversely affected. Energetic particles can cause bit-flipping in satellite signals and geomagnetic storms can result in atmospheric heating (which caused premature re-entry of Skylab in 1979). A geomagnetic storm on March 13, 1989 induced large currents in high-voltage lines -- resulting in a 9-hour blackout in Quebec. Also, a transformer at a nuclear power plant in New Jersey was damaged beyond repair by overheating.



    Since 1987 I have worked as a computer programmer for an investment division of one of Canada's large banks. For most of the last year I have been primarily concerned with finding & fixing Y2K problems ("remediation"). The banking/investment industry has probably spent a greater proportion of revenues on Y2K remediation than any other industry. Dates play a prominent role in all banking activities & application software. I want to describe my experiences with candor, in the hope that these will provide a more general insight into Y2K issues. People who find computer programming incomprehensible may want to skip the next sections, although I will attempt to be clear & simple.

    I support a system that uses APL computer-language, C-language and an INGRES relational database in a DEC (Digital Equipment Corporation) VAX environment. My project was not only to test & remediate this system, but to convert it from using the non-Y2K-compliant VAXC compiler to the Y2K-compliant DECC compiler. VAXC is based on "K&R" C (the old C-language) rather than ANSI standard C-language and, moreover, 21 format & implementation changes were incorporated into DECC which were unrelated to Y2K. We had to do software searches of thousands of lines of code in hundreds of programs to find & correct these changes. I won't begin to try to describe the complex issues involved in interpreting these 21 changes. Although the entire project was targeted for completion by December, 1998, it was only in December that we finally succeeded in compiling & linking.

    Then I was asked for a test plan. It is unreasonable & impossible to test everything, so I selected what I believed to be the most critical applications -- ones that run on a daily basis. Testing was done in a "Y2K Lab", rooms isolated from production systems where we could change the system dates at will. I somewhat whimsically referred to the Lab as "Tomorrowland", but it was also a "Fantasyland" where we tried to simulate reality. Some of the test data had to be fabricated to resemble short-term financial instruments which could be of use in the year 2000. I suspect that problems in setup of test environments for Y2K simulations may contribute to many mistaken assumptions about the reality of year 2000, and give rise to some very unusual bugs. A Y2K Lab can be a confusing place. When dates can be switched back-and-forth, time-stamps become an unreliable means of identifying files. Many businesses do not have the luxury of Y2K Labs for their testing, however.

    Of the 3 systems we had re-compiled, management decided that one of them would not be tested in the Y2K lab because of its dependence on telephone/modem connections which were too much trouble to install. This application involves transmitting files to clients, but my manager said that if the system fails we can FTP the files. I question this, even if the Internet is functional, but I did not challenge him. That particular system also includes a FORTRAN-code portion that none of us understood. The main test we did on that system was to verify that it would compile. We also did not test importing data from an application on a UNIX platform, because it would have been too much trouble to set up (cost & complexity).

    We began our testing with a system date of December 28, 1999. Almost immediately we experienced problems with our index calculator. Since we had not crossed the 2000-year threshold, I became convinced that the problem lay in the environment. Index calculations depend on previous data, and I suspected that the sudden advancement of the date from December, 1998 had caused problems. I also found evidence that my co-worker had improperly imported test-data. (At least as many errors occur in testing as occur in programming.) The calculator worked fine for December 30th, but failed again on December 31st.

    When we advanced the date to January#3rd, 2000 we experienced real Y2K problems. The application produced "100" and programs would crash because the 2-digit fields could not accommodate the 3rd digit. The most perplexing case of this problem was with an application that our Operations people use for submitting jobs. There was no obvious indication that it was crashing because of a date problem, and it took me a couple of days of tracing before I finally found the reason. I can see how applications that innocuously time-stamp an activity could fail for this reason (every Internet e-mail message is time-stamped).

    My co-worker went on vacation for a month, and I finished testing the daily job-stream to March, 2000, despite the fact that he left me inadequate instructions about the testing procedure (and despite my attempts to badger him into documenting his work). The original schedule had involved testing more dates, to the year 2001, but management decided to stop at March, 2000. When my co-worker returned from his vacation, he resumed testing the index calculator problem that had failed for December, 1999 (despite working properly for year 2000 dates). I was still convinced that the problem lay in the test environment. I went to a manager who agreed with me that we should not waste more time testing the index calculator. But a second manager disagreed, and the testing of the index calculator for December, 1999 was resumed. It was work I did not relish and I had been eager to look for reasons why it was unnecessary.

    After many days of testing we finally traced the problem to an INGRES patch (upgrade) which had accompanied the Y2K-compliant compiler. The relational database was going into an infinite loop of locking and relocking the same table due to a faulty escalation from page-lock to table-lock. This is another example of how new bugs can be introduced into software that is being made Y2K compliant -- addition of "enhancements" that don't work under unusual conditions.

    After placing a work-around in the code, I recompiled the index calculator. However, various adjustments had to be made before moving the systems we had tested from the lab into the production area. One of the systems people told us to recompile everything. We did this (both my co-worker and myself on different days, without realizing that the other was re-compiling), thereby deleting all previous versions of the executables. Then we were told that the instruction to recompile was a mistake, because only the original compilation had been tested on all of the dates. This was true, but the original compilation included the December, 1999 INGRES table-lock problem, without our work-around in the C-code. The systems people had to restore executables from a back-up made the previous month. They were confused about the correct environment because they had not kept careful records of all the changes that had been made. Despite the restriction against recompilation, I insisted on re-compiling the section of code that contained the work-around.

    I can imagine a similar thing happening in other shops. Testing is done on a series of dates, an error is found near the end of the testing and then the error is corrected. But the schedule is too tight to re-test all the earlier dates on the code that has been fixed. If a bug was introduced during the fix, it could be missed.

    The first evening we went into production we had a very serious problem with the database reconciliation. Several years ago I had been forced to implement a kludge when comparing data from our APL database (which was highly accurate) with data from our INGRES database (which was somewhat skewed on the 3rd decimal place). The INGRES patch had added a fix to this inaccuracy along with other enhancements, but we had received no readable documentation of the enhancements. Although we had noticed a slight problem in the lab, we had assumed that this was an artifact of the artificial set-up. Fortunately, this was a problem we were able to correct in an evening.

    The project I have just described was finally completed at the end of May, 1999. All Canadian bank Y2K remediation of "mission critical" systems must be completed before July, 1999. (Projects may shrink to fill the amount of time allotted for their completion.)

    I have now been placed on another (delayed) Y2K project involving an inter-office messaging system used to transmit data concerning brokerage transactions. There is not enough time before July to convert the system from VAXC to DECC. DEC does not guarantee that the VAXC compiler will compile in the year 2000, but we are making the assumption that if we re-compile before the year 2000, the executable will function even if we are unable to compile in the future.

    In this messaging system, when I want to view all messages since 30-May-1999, I use a "/since=99.05.30" qualifier. Messages created in the Y2K lab for dates in 2000 store the year as 100, but if I try to use the qualifier "/since=100.05.30", the syntax is rejected as invalid. If I say "/since=00.05.30" I get nothing -- even if messages have been posted in the lab with timestamps after 30-May-2000.

    The messaging system was created 7 years ago by contractors who have long since left the company. I have had to search about to find the libraries & files needed by the compiler & linker. I find an assortment of object libraries & source code files with the same name in different directories, often with no indication (or documentation) as to which I should use. I cannot necessarily go by the date of the file, because more recent versions may have only been for testing.

    I was able to relink the files and create an executable, but recompiling some of the files and including them in the object libraries caused executables to crash on relinking. I had more than one version of source files & text-library files to choose from, but both the old and the new files resulted in executables that crash.

    My manager asked me if more people could be put on the project, to which I replied with the adage that nine women cannot produce a baby in one month. Brooks' Law (from THE MYTHICAL MAN-MONTH, Frederick Brooks, revised 1995) states that adding more programmers to a late project only makes it later. Six good novelists cannot collectively write a novel as good as one of the novelists alone -- nor as quickly (for comparable quality).



    A large amount of Y2K remediation work is done by programmers reviewing code written by others many years earlier in an unfamiliar application (if not an unfamiliar computer language) that is undocumented. The work is boring, tedious and is done under extreme pressure and deadlines that are driven by external considerations rather than by the amount of work to be done. The temptation to cut corners is tremendous (and not always by rational "triage"), especially in light of the fact that most managers won't know the difference. Users have no interest in projects lacking immediate tangible results when they have problems of their own crying for fixes. A small minority of programmers can do much damage.

    Programmers typically deliver software that has one bug per hundred lines of code, although Lucent (Bell Labs), Motorola and NASA reportedly do much better. Programming is a "creative" practice, but the "creative" logic can easily become convoluted. Only two days ago I spent several hours trying to find a problem in a few lines of code written by an odd-ball contractor. Y2K errors can be very hard to find, and new errors can easily be introduced in the correction process -- especially by someone unfamiliar with the code.

    The most effective way to find Y2K bugs is to reset the system clock in a Y2K Lab or in a production system during off-production hours. But test cases necessarily are only simulations in a simulation environment. An IRS test of a post-2000 date on a PBX (Private Branch eXchange) worked well for 3 days before the system crashed. What if the test had only been conducted for a few hours?

    It is programmers who decide what to test and how to test it. Since 90% of Y2K remediation has been done by in-house programmers rather than by external contractors, Y2K compliance is something reported by employee-programmers to their managers -- not really an "objective" evaluation. Programmers are under pressure from managers who are under pressure from senior officers who are under pressure from regulators or customers to meet deadlines. No one wants to look foolish or incompetent -- problems can be swept under the rug -- until the year 2000. I do believe that my systems will stand up.

    There is an exponential drop in system understanding from programmer to senior manager. Senior managers cannot afford to say that their deadlines have not been met when the consequences are interference from regulators, drop in share prices, loss of credibility in the business community, and humiliation in comparison with competitors who boast of having met deadlines. There are no objective standards for Y2K compliance other than the reality-test that will come in the New Year. (The British Standards Institute states that "Year 2000 conformity shall mean that neither performance nor functionality is affected by dates prior to, during and after the year 2000." Not very helpful.) And I would trust the claims of government agencies least of all. In meeting the March 31, 1999 deadline for "mission critical" systems in the US Government there was a 25% drop in the number of systems designated "mission critical" between November 1997 and February 1999.

    A very large-scale study of 108 IT (Information Technology) directors (including 100 Fortune 500 companies) sponsored by Cap Gemini indicated that "end-to-end" testing (of the kind done in Y2K labs) was planned for no more than 40% of all applications [THE YEAR 2000 SOFTWARE CRISIS, Hayes & Ulrich, page 5]. Errors are certain to have been made in deciding which systems are "mission critical". I know from experience that a cost-risk decision is made for systems which are especially difficult to test (and the costs are much easier to know than the risks).

    Management will be able to point to piles of paper generated in testing and to bundles of money spent on remediation in defending the claim that they showed great diligence. I see many reasons to question the glowing reports of Y2K compliance being published.

    Capers Jones, in his book THE YEAR 2000 SOFTWARE PROBLEM, has attempted to quantify the scope of the problem. 65% of software needing Y2K remediation is written in FORTRAN, COBOL, C-language or PL/I, while 35% is in 450 other languages and dialects. The shortage of trained personnel is severe for PL/I & Assembler and for many of the obscure languages. Jones says that PL/I is in wide use in the oil & energy industry due to its being promoted by IBM in the early 1970s as a business tool. About 30% of US software applications contain at least two languages. Finding people competent in two specific languages, especially when one or both are no longer widely used, can be difficult. And there is also a problem with data in databases, which might contain 2-digit dates and require new software.



    I need to begin this section with a note on how PCs keep track of time. The BIOS (Basic Input Output System) is software that is responsible for performing tasks required to initiate the system when a PC is turned on. It also provides the interface between the operating system and the input/output hardware (hard disk drive, modem, printer, keyboard, mouse, etc.). BIOS is stored in ROM (Read-Only Memory), meaning a memory that is permanently written and cannot be erased or re-programmed. BIOS gets configuration information from a tiny piece of CMOS RAM (CMOS Random Access Memory -- a memory that is always running, even when the PC is turned off). CMOS RAM contains parameters about hardware as well as a 2-digit century (although the century mark is sometimes in ROM). CMOS (Complementary Metal Oxide Semiconductor) is an inexpensive kind of semiconductor that requires so little power that it can remain activated by a tiny battery for years.

    The time/date information is stored in another CMOS memory unit called the Real Time Clock (RTC). Almost all RTCs use a 2-digit year. BIOS receives time/date information from the RTC, expands the year to 4 digits, corrects the century information in CMOS RAM and sends the time/date information to the operating system. Microsoft operating systems that are not Y2K compliant will nonetheless reject "1900" as an invalid year and "correct" the year to "1980".

    In practice, this whole procedure has been far more bug-ridden than my description would imply. One study found that nearly 80% of PCs with BIOS chips manufactured before 1997 could not roll over from 1999 to 2000 and that nearly 15% did not recognize the year 2000 as a leap year [COMPUTER WEEKLY NEWS, 22-May-1997]. The US Nuclear Regulatory Commission now refuses to buy PCs built with Intel chips, because Intel RTCs use a 2-digit year. Motorola maintains a list of its faulty RTC semiconductor devices on its website.

    Some good practical information on PC BIOS Y2K problems, and how to fix them, can be found online.

    Potentially the most dangerous Y2K problem arises from so-called "embedded systems". These are chips that are built into a wide variety of machines, including VCRs, FAX machines, elevators, security systems, ATMs, microwave ovens, industrial controls, vehicles, weapon systems, etc. There are an estimated 25 billion embedded systems installed world-wide, with experts guessing that in the range of 0.1% to 5% could have Y2K problems [PC MAGAZINE, 6-April-1999, page 103]. Most consumer products use microcontrollers, rather than microprocessors. Since microcontrollers do not have an RTC, their likelihood of failure is extremely low (The Gartner Group estimates 1-in-100,000). But industrial machinery & large weapons systems are more likely to include microprocessors, which are susceptible to some of the BIOS/RTC problems I have described. Although the explicit use of dates is rare in industrial controllers, time-intervals & synchronization often use time/date differences.

    Evaluation of the scope or impact of the embedded systems problem is beyond my capability -- and is perhaps beyond the capability of any human being. However, I am particularly limited by my lack of knowledge of electronic machine controllers and Programmable Logic Controllers (PLCs). I direct interested readers to the collections of links available online on the subject.


    Capers Jones has said that malfunctioning software in a microwave oven chip caused a fire that nearly destroyed his neighbor's house. It is well known that medical equipment, industrial machinery and weapons systems can be vulnerable to Y2K malfunction. In most cases, but not all, they simply become inoperable. This is good for weapons systems, but not so good for medical equipment. With 25 billion chips world-wide, even one disastrous malfunction in a million would result in 25,000 crises.

    Finding & testing embedded systems is much more difficult than testing software applications. An offshore oil platform can have 10,000 embedded chips, many of which are underwater and difficult to find, much less access. (By definition, they are embedded systems because they are built into machinery.) Even when it is possible to non-destructively remove chips from machinery, it is often not possible to test the date functions. I will not attempt to enumerate the many failures that have occurred when such tests have been made, but I will mention a few examples.

    Such a test in a Japanese electric company shut the entire plant down. A 1994 test in Phoenix, Arizona crashed the entire traffic system for 3 days. Other examples have been collected online.

    A Cap Gemini study in early 1998 found that only one-sixth of large firms had gotten beyond the stage of inventory & assessment of their embedded systems [CONTROL MAGAZINE, March 1998]. The 3M Corporation has reportedly addressed the embedded chips problem through requests for Y2K compliance letters from chip manufacturers.

    Finding & testing embedded systems has been compared to looking for burned-out light bulbs in Las Vegas -- with the power off (although I think light bulbs would be easier to find & remove!). Some embedded chip "remediations" consist of little more than looking for a real time clock in chips that have time functions. But chips without RTCs can have date functions in firmware. Software clocks stored in ROM, EPROM or ASICs (Application Specific Integrated Circuits) are what Bruce Beach calls "the carbon monoxide of computer death" because there is no practical way to test for them. Nonetheless, software clocks are probably rare in chips.

    Chips without time functions may still have an unused time/date capability that can cause the chip to malfunction. Chips with more capability than is needed for specific applications (including RTCs) can be mass produced more cheaply. By analogy, the typical user of a word processor uses only a tiny portion of the available functionality. Moreover, compliant chips may interact in a non-compliant way -- they cannot be tested in isolation.

    The American Chemical Society (ACS) has warned of chemical spills due to chip control of pumps & valves. After doing in-house remediation, the Occidental Chemical company hired a consulting firm which found ten times as many systems with potential Y2K problems as Occidental's own engineers had found.

    Tava Technologies, one of the largest independent firms doing Y2K remediation of embedded systems (and one with an obvious vested interest in exaggerating the problem), has reported that of the tens of thousands of manufacturing automation systems & components it inspected, more than 20% were either fully non-compliant or non-compliant under certain circumstances. In a large pharmaceutical company with 4,457 embedded systems, Tava reported that 17% of the systems had compliance problems serious enough to cause plant shutdown or production degradation. Of the vendors of the non-compliant devices, 40% had declared the devices compliant and 15% were no longer in business.

    The California State Water Resources Control Board has an excellent summary ( of Y2K embedded systems issues which states that it takes up to 21 months to inventory, fix & test embedded systems in a small-to-medium sized plant, and there are over 30 tests that must be performed to ensure an embedded system is compliant. The summary includes the statement "It will not be possible to test all embedded systems and repair the non-compliant ones".



    Much of the discussion of infrastructure risk is presented in the form of "What if?" questions. Asking what could possibly go wrong is not the same as asserting that these things will go wrong, but it does illuminate interconnections & implications of failures that might not be immediately obvious.

    At the core of infrastructure is electric power. If there is no electricity, there cannot be much banking, communication (phones, Internet, TV, etc.), manufacture, water supply, etc. Frozen foods & vegetables spoil, farm animals freeze to death, railroad switching systems fail, and the possibility of repairing computer problems is poor, although some hospitals & computer facilities have back-up generators. There are four large electrical Interconnection systems in North America, the largest of which covers the eastern two-thirds of the US & Canada. The Western Interconnection covers most of the western third of the US & Canada. Quebec & Texas have their own Interconnections. These systems are mostly independent except that the Eastern Interconnection is dependent upon Hydro-Quebec. Although the Interconnections allow for more efficient use of resources and minimize costs, they also increase the possibility that a failure in one part of the system could incapacitate the whole Interconnection.

    The Eastern Interconnection is dependent on nuclear power for 40% of its electricity. Nuclear plants that have not proven themselves Y2K compliant to the US Nuclear Regulatory Commission will be shut down. Even small failures in January 2000 could trigger a precautionary shutdown, but there is a reasonable chance that caution will mandate shutdowns before January in any case ( This would increase dependence on fossil-fuel plants, which are dependent upon transportation for delivery of oil or coal.

    Like the power industry, the telecommunications industry relies heavily on embedded systems for switching & machinery control. Like the power industry, telecommunications are at the core of the infrastructure that makes the conduct of business possible. An April 1999 report indicated that both AT&T and BellSouth had conducted Y2K dry runs with 100% success (although no mention is made of embedded systems). There are over a thousand telephone companies in the US, 20% of which supply 99% of the lines, but the FCC reports that nearly half of the smaller companies have no formal process of Y2K preparation.

    Water & wastewater may be the most vulnerable of utilities, due to dependence on embedded systems, poor resources for remediation and insufficient awareness or motivation about the problem. Attempts by the US wastewater association to survey its members about Y2K compliance in April, 1999 resulted in responses from only 15% of the largest facilities and less than 1% of the smallest. Only 35% of the respondents expected to complete Y2K remediation before the year 2000. Most water/sewer systems are owned by municipalities, many of which have no Y2K remediation plans. In Britain both water regulators & suppliers reportedly refused to guarantee water supply in early 2000 even to hospitals, but recent reports are more optimistic.

    Computers have greatly increased the efficiency of doing business through the use of "just-in-time" (JIT) delivery mechanisms that reduce the amount of warehousing of merchandise needed by retailers & manufacturers. Scanners at check-out stands can monitor inventory, which can be replenished by Electronic Data Interchange (EDI) phone-line ordering systems. But this increased efficiency comes at the cost of increased vulnerability to problems with suppliers.

    In 1995 employees at a Canadian brake manufacturer unexpectedly went on strike. Within four days General Motors, the world's largest corporation, was forced to shut down production. To investigate its supplier dependencies, GM commissioned its internal audit department to study its 87,000 vendors. The audit not only revealed that 1,000 of the vendors were mission-critical, but that mission-critical suppliers were themselves dependent upon other suppliers. In some cases the dependencies of suppliers upon suppliers went as deep as nine layers -- like a line of dominoes.



    Many people still have the child-like belief that governments are omnipresent, omniscient, benevolent entities that can solve all of our problems. The Federal & Provincial governments in Canada may be among the best prepared in the world, if a judgement is to be made on the basis of per-capita spending & self-reported success.

    But there may be serious problems south of the border. In contrast to some self-reported successes of US government agencies, Senator Robert Bennett (R-Utah, the Chairman of the Subcommittee on Government Management, Information and Technology) released a report ( for the quarter ending in mid-February, 1999, which gave failing grades to the Department of State & the Department of Transportation (DOT). Concerning the failing grade for the DOT, the report noted that "The Federal Aviation Administration's antiquated air traffic control system is a significant part of the problem. Its progress rate makes the horse and buggy look like rapid transit." In the 2-June-1999 Atlanta Journal-Constitution FAA officials reported that the air traffic control network was 92% compliant and that deadlines had been missed only because extra care was taken to ensure the computer systems would recognize the year 2000. When asked why his reports differ from those of the President's Y2K Council, Bennett replied that his committee reports to Congress, but the Council reports to the White House.

    Even Bennett's seemingly tough-minded report seems overly optimistic to me, however. The Department of Defense is given a "C-" grade despite the fact that the expected 100% completion date for mission-critical systems is some time in the year 2000. Grades were assigned on the basis of self-reported completion of self-selected "mission-critical systems" which do not include embedded systems. In the report, only the Department of Education is listed as having completed remediation of embedded systems.

    Even in Canada the Defense Department does not expect to be done until 30-Sept-1999. Given the fact that at least half of computer projects are normally late, that doesn't leave a lot of time. Again, embedded systems may not be included in this assessment.

    Page A3 of the 14-May-1999 issue of the GLOBE AND MAIL contains a statement of concern by Canada's Defense Minister about NATO's Y2K preparedness. Given the fact that NATO programmers have probably been too busy working on "smart bombs" that target the Chinese Embassy in Belgrade to worry about Y2K preparedness, it should be no surprise that NATO is not Y2K compliant. Anyone wishing to worry themselves over the possibility of nuclear war due to embedded systems in missiles is welcome to visit and anyone wanting advice on how to survive a nuclear war might want to visit

    Although all Canadian Provinces seem to be doing Y2K remediation, the same cannot be said of every US State. Many local governments in both countries are doing nothing at all. A 1998 survey of 3,600 American city governments found that 55% believed there was no need for Y2K remediation work on their computer systems.

    Although the Federal Deposit Insurance Corporation (FDIC) is reputedly the watchdog of American banks, the banks are forbidden by law to reveal their FDIC ratings on Y2K compliance, and the FDIC disclaims any certification of Y2K compliance. It is quite a paradox for an agency to enforce Y2K compliance while disclaiming any ability to verify that compliance, but nothing else should be expected from a government agency concerning Y2K problems. And the FDIC may well be suffering from Y2K compliance problems itself. Although the FDIC supposedly insures bank deposits, if Y2K leads to a serious banking crisis the FDIC would not have the resources to come to the rescue. As with the Savings & Loan crisis of the 1980s, a Federal bailout may be necessary (assuming the IRS is not too seriously damaged).

    Although governments attract some very competent programmers, who are drawn to large & glamorous projects, they also attract a very large number of dregs who are too incompetent or too lazy to keep a job in the private sector. Government projects are frequently delayed by bureaucratic wrangling, and Y2K projects are probably no exception. I view reports of government Y2K compliance with particular suspicion.

    If welfare payments are delayed, looting & robberies may increase. There are reports of preparations in Canada to issue welfare cheques by hand, if necessary, although humanitarian concern may be the prime motive. The 19-April-1999 issue of MACLEAN'S magazine reports that Operation Abacus, perhaps the largest Canadian peacetime deployment of military forces in history, will be ready to deal with Y2K problems if necessary. Although the stated intent is to provide the kind of relief that was needed during the 1998 ice storms, they may come in handy if martial law is imposed. Toronto police are prohibited from taking vacations between 27-Dec-1999 and 9-Jan-2000, and Vancouver police have a ban on vacations during a similar period.



    Some experts have estimated that lawsuits due to Y2K problems may be the biggest financial risks that companies will face. The most common lawsuit is likely to be breach of contract due to failure to deliver goods or services because of problems with hardware, software, utilities or suppliers. Injuries or death from malfunctioning equipment in chemical plants, hospitals, etc, can lead to tort suits. Subsequent loss of profitability could lead to further lawsuits by shareholders against company directors. Accounting firms would probably also be sued for failure to adequately disclose Y2K problems.

    Elevators programmed to go to the ground floor and shut down when software tells them that maintenance is 100 years overdue could force employees to climb flights of stairs. An employee who experiences a heart attack in the stairwell of the 8th floor could bring a lawsuit. FAX machines that stamp FAXes with the wrong date could cause misrepresentations where FAXes are used to document diligence. Orders placed on the last trading day of 1996 at the Brussels Stock Exchange were date-stamped 1997, and the Exchange was forced to close for 3 hours at the beginning of 1997. Lawsuits over this mishap have not yet been resolved. Even seemingly small failures can have large legal and financial consequences.
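    The misdated FAX stamp has a well-known mechanism: many date APIs of the era report the year as an offset from 1900, and display code that naively prefixes "19" prints the year 2000 as "19100". The Python sketch below is my own reconstruction of that pattern, not code from any actual FAX machine.

```python
# Illustrative sketch of the widely reported "year 19100" display bug.
# Many C-era date APIs (e.g. the tm_year field of struct tm) return the
# year as the number of years since 1900.

def format_year_buggy(years_since_1900: int) -> str:
    """Build the year string by gluing '19' onto the offset -- correct
    only while the offset is two digits."""
    return "19" + str(years_since_1900)

def format_year_fixed(years_since_1900: int) -> str:
    """Add the offset to 1900 instead of treating it as the last two digits."""
    return str(1900 + years_since_1900)

print(format_year_buggy(99))   # '1999' -- looks fine through 1999
print(format_year_buggy(100))  # '19100' -- the misdated stamp
print(format_year_fixed(100))  # '2000'
```

A stamp reading "19100" is obviously wrong; the more insidious variants silently wrapped to "1900", corrupting sort orders and interest calculations without any visible absurdity.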

    The victims of the lawsuits are likely to turn to insurance companies for coverage, suing them if necessary. Insurance companies would likely countersue or refuse to reimburse on the grounds that Y2K damage was expected. This mass of claims & lawsuits could result in a crisis for many cryonicists who are dependent upon insurance.

    In the US, insurance regulators have approved wording for general liability policies that excludes losses due to computer date problems. In 1997 Nevada passed legislation granting itself (the Nevada government) and all Nevada local governments immunity from lawsuits due to computer date problems. By early 1999, California, Florida, Georgia, Hawaii and Virginia had passed similar legislation, and many more states were considering Y2K immunity laws. Deaths & vehicle damage on 1-January-2000 due to traffic light malfunctions in Nevada and other states cannot be compensated by legal remedy (lawsuit).

    A bill to require a 90-day "cooling-off" period for Y2K lawsuits and to limit punitive damages & executive liability passed both the US House and Senate. Business/technology leaders lobbied hard for the bill, which Clinton had threatened to veto on grounds of consumer protection and of not weakening remediation efforts. A modified version of the law was passed with White House sanction. I know of no similar legislation in Canada.



    According to Capers Jones, the countries with the largest inventory of software are the United States, Japan, Russia, Germany, United Kingdom, Brazil, France, China, Italy and India -- in that order. The US has more than twice the software of Japan, the country with the second largest software inventory.

    While painting a rosy picture of domestic preparation for Y2K, business executives and media alike warn of dangers abroad. The 19-April-1999 issue of MACLEAN'S depicts a world map with estimates from The Gartner Group that at least 50% of company & government computer systems will experience at least one critical failure in Russia, China, Indonesia, Pakistan and many African nations. The estimated 10-15% "critical failure" rate for Canada, the US, the UK, Australia, Switzerland and Sweden also seems serious (in contrast to the reassuring tone in the text of the article).

    Russia, with the 3rd largest software inventory in the world (much of it pirated), has been called "Bangladesh with Missiles". Prior to 1999 there was no Russian government budget for Y2K remediation. In 1998 a spokesman for the Russian Nuclear Power Ministry told RUSSIA TODAY magazine "We'll deal with the problem in the year 2000".

    A January 1999 study by the World Bank reported that only 54 of 139 developing nations had national Y2K policies and that only 21 were actively engaged in remediation. The Gartner Group has predicted that half of all Latin American enterprises will be damaged by Y2K problems. North America is dependent on South America for winter vegetables, among other goods.

    Programmers in France & Germany have been too preoccupied with preparing for a common currency to prepare properly for Y2K. Germany faces additional risk due to the large loans it has made to Eastern bloc countries ill-prepared for Y2K. An additional risk faced by France is that 60% of its electricity comes from nuclear energy -- a high dependence on embedded systems, combined with slow progress on remediation. Italy is believed to be even less prepared than France or Germany.

    Airports in Paris, Spain and Italy are believed to be on the "at risk" list being drawn up by the International Air Transport Association. A spokesman for the airport in Rome said that Y2K remediation would not be resumed until September, when the tourist season is over. KLM has announced plans to cancel flights during the New Year's period to airports believed to have air traffic control risks, and other airlines will undoubtedly follow suit.

    Middle Eastern countries have done little to prepare for Y2K because the year 2000 corresponds to 1420 in the Islamic calendar. But their software often translates Gregorian dates to Islamic ones, and their exposure to problems with embedded systems may be greater, considering their dependence upon the functioning of the oil industry.

    Japan has the second largest inventory of computer software in the world, most of which is customized (rather than packaged, as in the US). Recession and a calendar based on the emperor's birthday have no doubt contributed to the lateness with which a remediation effort has begun in Japan. The Gartner Group until recently had rated Japan's preparedness on a level with Pakistan and Panama.

    At the January, 1999 World Economic Forum in Davos, Switzerland, Sun Microsystems CEO Scott McNealy told a press conference: "People are talking about stockpiling cash, water and canned goods. Given what I and everybody else in the computer industry knows about Asia, it might not be a bad idea to stockpile some computers for the next millennium." In May, 1999, a spokesman for Intel stated that January, 2000 power shutdowns in Japan could jeopardize the American economy. Nearly a quarter of Intel's "mission critical" suppliers are in Japan (

    South Africa has declared December 31, 1999 and January 3, 2000 as additional national holidays to allow time for Y2K remediation before the first business day. Britain has also declared January 3rd a holiday. More countries may make such a declaration in the coming months. For some embedded chips this may allow more time for damage to be done before discovery on the first working day (as in chemical plants), but for others (like utilities) this may give enough time for repair (assuming replacement chips can be obtained on short notice on a weekend -- at a time when others may have the same demands). Monday, January 3rd is already declared as a holiday for Canadians, but it is still a working day for Americans. Clinton refused to make January 3rd a holiday on grounds that the reprogramming required would add to the burden of Y2K work.



    It has been said that Y2K fearmongering is the moral equivalent of yelling "FIRE" in a crowded theatre. If that is true, how can people know, discuss or discover the truth through the fog of a polite & ethical silence? Many Y2K websites show no interest in technical enquiry, but focus primarily on "practical" issues like storage of food, water & weapons. Ironically, "Y2K for Women" ( is such a website. This is in contrast to "Y2K for Kids" ( which reads like a tale of Santa Claus. Who wants to be accused of scaring little children?

    I'm sure that many responsible authorities believe that the most moral thing to do is to reassure the public, even if they privately have doubts. It is quite possible that widespread panic could create a disaster even if the Y2K Computer Bug proved to be a non-event. Or that panic could turn a disaster into a cataclysm. I write what I have written with the thought that I am addressing mature, responsible adults who can make the best possible decisions only if the best possible information is available. I may have a psychological aberration that causes me to be an alarmist, but my sense of a weight of evidence makes me believe that this cannot be the only factor governing my perceptions. To remain silent would be to fail to warn my precious comrades in the struggle for life of possible threats to survival. (I have often puzzled about the proper way to inform people about a fire in a crowded theatre.)



    Whether January, 2000 comes in with a bang or with an anticlimactic whimper, December, 1999 is likely to be a time of craziness unlike anything we have experienced in our lifetimes. Grandiose plans are being made by movements & interest groups of a wide variety of persuasions (peace activists, ecologists, and perhaps even terrorists) to mark the year 2000 with something Colossal. The coming of the Millennium, whether or not it is the "real" Millennium, is bringing out the megalomania in large sections of the population.

    Over ten million people are expected to travel to Rome for the Grand Jubilee of the Incarnation of Christ. Party 2000 scheduled for Southern California has been expected to draw millions, as is Times Square 2000 in New York City. There may be many Millennialists expecting the Second Coming of Jesus Christ. (Seventh Day Adventists & Jehovah's Witnesses were based on movements with failed Millennialist expectations for the years 1843 & 1914, respectively.) Many of these people are already claiming the Computer Y2K Bug as an instrument of God & the Apocalypse.

    Voices in the media crying that there is no Y2K Bug Crisis and that "the only thing we have to fear is fear itself" will become increasingly strident & shrill -- to the point where many people who ONLY fear panic will start to panic.

    Resignations & retirements by computer programmers, managers, executives and directors -- at a time when companies are increasingly demanding that their employees work overtime and forgo vacations -- are likely to create economic problems in addition to the ones actually created by the Y2K Bug. Businesses which have been in denial, procrastination or simply behind schedule with Y2K remediation will be making frenzied efforts at corrections, which are bound to introduce many new errors. Companies that regard themselves as Y2K-ready could become so worried about their suppliers that inter-company relations may become bitter.

    A stock market crash in the Fall (which I fully expect) will only add to the sense of panic & despair. People who had been postponing their "survival" preparations will be preparing with a frenzy -- resulting in shortages which will further exacerbate the panic.



    In early April I attended a Toronto MENSA presentation on Y2K by a programmer who has been supervisor of a Y2K remediation project. He plans to spend the New Year in an underground bunker. I have spoken to a number of computer programmers who plan to be somewhere far away on New Year's Eve, although others trivialize & laugh at the problem. My employer has mandated a moratorium on vacations by programmers in the December/January period, so if some of these programmers want to be far away they may have to quit their job. Ironically, this in itself might create a New Year's software crisis.

    I don't know how bad the Y2K problem is going to be. Sometimes I feel like I am being reckless with my life in not doing more to prepare, and at other times I feel like I am being a paranoid fool. Some reasonable advice can be found on the American Red Cross Website (, but I am more inclined to stockpile provisions for a month, rather than a week. More worried survivalists might want to visit the survival and preparation sections of

    If a severe disaster continues for more than a week, predators may be a greater danger than lack of provisions. If I were really worried, I would quit my job and try to camp out in a remote, hidden location (a late escape from the city would risk ambush on the road). I have decided against that, more because the circumstances that would require it are too horrible for me to contemplate than out of rational provision. And it is so hard to know what is rational. I am betting that social order (even if only martial law) will be restored within a couple of weeks and that there is a reasonable chance I may be able to return to work. I would sooner rely on being well hidden or having secure locks & bolts than on having weapons.

    I see no need to buy expensive "survival foods". There is a lot of condensed nutrition & calories in dried milk, whey protein powder, nuts, canned tuna, canned black beans, pasta, dried fruit & tomatoes and Life Extension Mix. Since it will be winter, there may even be a chance to store fresh fruits & vegetables in the snow. (Diabetics can do the same for insulin.) A full bathtub, a waterbed full of fresh water, and an ample supply of drums can hold a lot of water. Those with houses might have access to hot water tanks & swimming pools. Bleach can be used to purify water.

    Both the Canadian & American governments are printing extra money in anticipation that people may hoard cash. But officials are worried that if there are insufficient armored cars to service ATM machines, people may panic. Banks are already toughening loan policies out of concerns that businesses are not ready for Y2K, and they may toughen even more to ensure adequate liquidity near the New Year.

    Many businesses will stockpile supplies rather than rely on JIT delivery. Some will even stockpile cash so they can cover payrolls.

    Many people who regard the Y2K Bug as a severe crisis are vociferously hostile to the "selfish individualist survivalist mentality". Such people advocate community-spirited co-operation as the only solution. Ideally, cryonicists would be like those Mormons who normally keep a year's supply of provisions on hand at all times. During a crisis lasting a few days, Mormons have been in a position to become beloved by their communities by sharing provisions. Few of us want to be forced into lifeboat ethics. But life is not always kind, and sometimes we find ourselves in lifeboats.



    I have discussed the question of cryonics facility preparedness with Paul Wakfer, former President of CryoSpan. He could think of nothing other than liquid nitrogen which would be essential to stockpile. Because of boil-off, liquid nitrogen is not easy to stockpile. Paul recommends that cryonics organizations ensure that dewars are topped-off late in December, and keep extra liquid nitrogen in a spare dewar. I expressed concern that waiting until the end of December to top-off dewars might entail risk of unavailability due to stockpiling by other businesses, but Paul assured me that other businesses using liquid nitrogen are not in the same situation as cryonics facilities.

    Extra supplies of food & water with provisions for many people living in the cryonics facility for an extended period would probably also be prudent. The facility would be a good place for key members of the organization to "celebrate New Year's" and spend the night. As much as possible, data normally resident in computers should be printed in duplicate or triplicate and stored in different locations.

    Several years ago I argued very vigorously that cryonics organizations should keep their patient funds invested in equities rather than fixed income securities. I believe that for the next year or two, all possible patient care funds should be invested in government T-bills or precious metals. I think the danger of a stock market crash and recession is very great. On the other hand, once the worst is over, equities will be a good investment, because I believe the long-term prospect for technology to increase wealth and improve the human condition is fabulous.



    Peter de Jager is a Brampton, Ontario computer consultant with such renown as the foremost authority on the Y2K Bug that he wrote the article on the subject for the January, 1999 SCIENTIFIC AMERICAN (page 88). This article painstakingly explains the fundamental Y2K issues. In his conclusion de Jager states "I believe that severe disruptions will occur and that they will last perhaps about a month."

    de Jager does not seem to be mincing words when he speaks of "severe disruptions", but such a phrase (like the Gartner Group's "critical failure") has the quality of an astrological prediction -- vague enough to fit any denouement. What we really want to know is: will there be food, water and electricity -- and if not, how long will it take to restore service? Will there be conflagrations, rampant violence or nuclear catastrophes?

    Definite answers are available. You can go to, click on the "predictions" button and select any prediction that suits your fancy -- from the claim that Y2K is a hoax, to assertions that it will be The End Of The World As We Know It (TEOTWAWKI). Most books on the subject of Y2K are negative, but they also tend to be out-of-date if published before 1999. Most websites are negative, but many of these are out-of-date also. Most magazine & newspaper articles are positive, but they report on press releases of successful Y2K application testing of post-2000 dates (something not easily done for embedded systems). The predictions most matching my own expectations can be found at -- and I mostly base my predictions on concern about embedded systems, despite my limited understanding of the subject. I have given many links to embedded system sites to encourage my readers to decide for themselves.

    Considerations such as these have led Dr. Edward Yardeni, chief economist at the investment banking firm Deutsche Morgan Grenfell, to make his well-publicized prediction of a 70% chance of global recession in the year 2000 ( Views of other experts can be found at and a collection of national Y2K progress links are at -- most of which are fairly negative. Some, like Gary North ( are widely regarded as socially irresponsible fearmongers for predicting catastrophic crisis. A site dedicated to Y2K issues in Canada can be found at

    I can appreciate that assembling a "SWAT" team to solve one computer problem could be difficult when there are many such problems amidst widespread fires due to failed equipment compounded by failed fire alarms, failed phone systems, failed water systems and inoperable fire truck ladders. But will it really be that bad? In 1997, Capers Jones predicted that costs of Y2K remediation would be so great that 1998 & 1999 would be years of recession. His prediction was wrong.

    Psychologically, it is hard to accept the possibility of radical deviations from "normality", especially when such deviations have not happened in North America for countless years. But the argument "it hasn't happened before" does not apply to an event that is certain to occur and unlike anything that has happened before. The problem is maddening. I vacillate between feeling like Paul Revere and Chicken Little. But I think the consequences of being overprepared for a disruption are far less severe than those of being underprepared for a disaster.

    Severe problems, if they are to occur, will not begin on the stroke of midnight at the end of 1999. If severe problems have not begun in late December, the problems in January should not be too terrible. Y2K malfunctions with embedded systems are likely to take the form of popping popcorn, beginning with a few pops and reaching a crescendo on 1-January-2000. Fewer application-systems problems will happen before the stroke of midnight; their worst effects will be felt in the first week of January and should be mostly manifested by the end of the first month.

    The best place to start when searching for Y2K information on the web is Peter de Jager's website at Anyone interested in the latest news on Y2K could hardly do better than -- and for those in need of comic relief, there are some good Y2K jokes at


    For an update on information in this article go to The Y2K Computer Bug -- Update September 1999