Project Management Institute

How the Internet reduced Y2K damages

by Capers Jones


AFTER BRACING FOR SIGNIFICANT numbers of failures and problems, the software industry experienced fewer Y2K problems and fewer Leap Year problems than expected. The popular press asserts that the comparatively low number of serious failures means the Y2K problem was trivial. However, analysis of the problems that did occur makes it clear that the Y2K problem was potentially very serious indeed. The most probable reason we experienced fewer Y2K and Leap Year problems than anticipated is that Y2K analysts and software repair teams had unprecedented abilities to share data and communicate via the Internet and the World Wide Web. The availability of detailed Y2K problem lists and possible solutions on the Web greatly facilitated Y2K repairs and readiness. If the Internet had not been available, Y2K problems might have been at least an order of magnitude more common.

In 1994 the software industry embarked on an unprecedented set of software repairs to correct a long-standing problem with the way many computers and software packages stored calendar dates. At roughly the same time that repairs were commencing on the Y2K problem, Internet and World Wide Web usage began to explode across the global business community. Prior to the availability of the Internet's e-mail and web sites, information transfer from country to country and company to company was very limited.

The Y2K problem was the first major international crisis to occur after the Internet was widely deployed. I believe the fact that Y2K problems were fewer and less serious than anticipated is due in large part to the impact of the Internet. Indeed, the use of the Internet in alleviating the Y2K problem can serve as a useful model for dealing with other major international issues.

Risk Management Prior to the Internet. Before the Internet became widely used, research on any kind of business or technical problem required specialized tools, library support, and often assistance from personnel trained in the use of proprietary search tools. While major corporations and government agencies could access information reasonably well, workers in small companies that lacked on-site librarians were very limited in how much information they could gather, even on important topics.

Although research prior to the Internet was reasonably effective, it was slow and took quite a lot of work. Other limitations were not recognized as significant at the time but are now known to be important. In general, gathering data before the Internet required actively seeking and extracting information. There was no effective way to make data on specific topics simply show up, as there is today. In other words, data and information were “pulled” from available sources such as libraries. The concept of “pushing” data, or broadcasting findings to a wide audience, only became feasible by means of the Internet.

Capers Jones (capers@spr.com) is chief scientist for Artemis Management Systems and a leading authority on software development issues.

Dangerous Dates for Software Applications

The year 2000 date problem is not the only calendar problem causing trouble for software applications. There are known date problems that are likely to affect software applications over the next 50 years. Date problems that might impact software include the date at which global positioning satellites (GPS) roll over, the dates at which commodities switch to the Euro, the dates at which the Unix and C libraries roll over, and some hazardous date patterns that have been used for non-date purposes in software applications. In addition, at some point early in the 21st century the number of digits assigned to social security numbers and telephone numbers will run out of capacity.
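To make the Unix and C library rollover concrete: on the many systems that store time as a signed 32-bit count of seconds since 1970, the counter runs out in January 2038. The minimal C sketch below is my illustration, not part of the original article; it shows the limit and where a wrapped counter would land:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    /* A signed 32-bit time_t counts seconds since 1970-01-01 UTC,
       so the last representable moment is 2^31 - 1 seconds in:
       2038-01-19 03:14:07 UTC. */
    time_t last = (time_t)INT32_MAX;
    printf("32-bit limit: %s", asctime(gmtime(&last)));

    /* One second later a 32-bit counter wraps to -2^31, which the
       same library routines decode as a date in December 1901. */
    time_t wrapped = (time_t)INT32_MIN;
    struct tm *t = gmtime(&wrapped);
    if (t != NULL)
        printf("after wrap:   %s", asctime(t));
    return 0;
}
```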

Over the next 50 years at least 100,000,000 software applications globally will need modification because of various date problems. The total cost of these modifications may top $5 trillion.

Starting in August of 1999 and continuing at intervals over the next 50 years, the methods computers and software use for dealing with dates will experience problems because of the historical practice of attempting to conserve storage space. What are some of the known date problems that are going to affect computers and software over the next 50-year period? Between now and roughly the year 2050, a huge amount of effort and hundreds of billions of dollars will be spent on expanding numeric fields in software applications:

Financial fields starting circa 1980

Zip codes starting circa 1985

Date fields starting circa 1999

Telephone numbers starting circa 2012

Social security numbers starting circa 2050.

None of these massive software updates will add useful new features or functions to applications. Their main purpose is merely to allow the applications to continue to operate when dates or information volumes exceed the available sizes of the fields originally set aside to store the information.
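The two-digit year fields behind the Y2K repairs are the canonical example of such a field running out of room. As an illustrative sketch (mine, not the article's), even the standard C library stores the year as an offset from 1900; programs that treated that offset as the final two digits of the year printed "19100" when 2000 arrived:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    time_t now = time(NULL);
    struct tm *t = localtime(&now);

    /* Classic defect: tm_year holds years SINCE 1900, not the last
       two digits of the year, so code that pasted "19" in front
       printed "19100" when the year 2000 arrived. */
    printf("Buggy:   19%d\n", t->tm_year);

    /* Field-expansion repair: add 1900 to recover the full
       four-digit year. */
    printf("Correct: %d\n", t->tm_year + 1900);
    return 0;
}
```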

The software industry is currently dealing with each problem individually as it occurs, rather than seeking general solutions to the fundamental problem. I have recommended that an international symposium be convened on the problem of dates and computers, to address the root causes of all of these problems.

Four possible solutions can be envisioned for the fundamental problem of inadequate date and numeric field sizes:

Developing standard formats for dates that will not expire in short periods

Developing methods for finding hidden or indirect dates with high efficiency

Developing mass-update tools and technologies that can make changes rapidly

Developing improved testing methods to minimize the risks of missed dates.

Unfortunately, the current international standards for dates are not adequate; there are no proven methods for finding indirect dates; and much of the work of making these massive updates remains manual and labor-intensive. Further, testing of software has never been 100 percent effective, and testing for date and numeric fields has seldom been more than 95 percent efficient, and is often worse.
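To illustrate why many of these repairs are stopgaps rather than solutions, consider "windowing," one of the most common Y2K remediation techniques: the two-digit field is left in place and the century is inferred from a pivot year. The C sketch below is my own illustration; the pivot value of 30 is an arbitrary assumption:

```c
#include <stdio.h>

/* Y2K "windowing": keep the two-digit year field but infer the
   century from a pivot. Two-digit years below the pivot are read
   as 20xx, the rest as 19xx. The pivot of 30 here is arbitrary;
   real repairs had to pick a pivot suited to each application's
   data. */
static int window_year(int yy) {
    return (yy < 30) ? 2000 + yy : 1900 + yy;
}

int main(void) {
    int samples[] = {99, 0, 29, 30, 65};
    for (int i = 0; i < 5; i++)
        printf("%02d -> %d\n", samples[i], window_year(samples[i]));
    return 0;
}
```

The weakness is immediate: data whose true years fall on both sides of the pivot cannot be disambiguated, so the window itself must eventually be revisited rather than solving the underlying field-size problem.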

For studies prior to the Internet, information from direct competitors was available only if it had been published or presented at an external conference. Now that Usenet lists, websites, and e-mail are so common, it is a daily occurrence to receive information from scores of sources. For a widespread problem such as Y2K, being able to gather data from a variety of sources was a considerable advantage.

As a long-time employee of several high-technology corporations such as IBM and ITT, circa 1979–1985, I was often involved in dealing with risks associated with software applications. For example, I know that both IBM and ITT mounted large-scale efforts to improve software quality levels. IBM also mounted a large-scale effort to improve maintenance defect repair turnaround as well as to reduce the number of defects.

The starting point for these risk-abatement projects was often a search of the available literature on relevant topics. In the 1970s and early 1980s, performing literature searches required assistance from librarians who were trained in running special search tools. The tools were complex and required careful structuring of the vocabulary and syntax used to guide the searches.

The results returned by search tools were sometimes helpful, but many times the information was of only marginal utility. Therefore it was necessary to repeat the search several times, with each pass using different syntax and keywords. Sometimes more than two weeks would pass before the results were on target and false hits were screened out. At best, about 25 books and 50 or so journal articles would be identified as being relevant.

In addition to searches of the literature, it was also necessary to identify personnel who were skilled in relevant topics such as quality assurance, testing, and software maintenance. IBM had a skills inventory, so personnel with some background in selected topics such as testing could be identified. However, ITT lacked a skills inventory, so identifying personnel with quality assurance skills there required many telephone calls plus site visits to the ITT labs that had quality assurance departments.

Overall, gathering useful information and assembling lists of personnel who could participate in quality improvement efforts took more than six weeks on the part of the team leaders, staff assistants, and librarians.

Risk Management Using the Internet. By fortunate coincidence, use of the Internet began to spread at the same time that the business community was grasping the seriousness of the Y2K problem. Those of us working on the Y2K problem quickly realized that the Internet and the Web were going to be helpful, but only in retrospect can we see how serious the Y2K problem might have been without them. Let's consider how the Internet has become a key tool in risk management, and its particular place in dealing with Y2K risks.

When the Y2K problem began to be taken seriously, about 1994, the Internet, e-mail, and the World Wide Web had already begun to be common business tools. Since a number of Y2K researchers such as Peter de Jager had established websites, it was possible to use search engines to quickly gather a significant quantity of useful Y2K information.

As companies such as IBM, Microsoft, and Computer Associates began to provide Y2K status reports of their key software packages on their own websites, the work of Y2K project managers was greatly facilitated. IBM, for example, provided detailed discussions of the Y2K status of all major commercial applications. This gave Y2K repair managers an excellent head start in planning remediation and testing activities.

E-mail discussions and Usenet lists began to coalesce around Y2K issues. Scores of prominent Y2K consultants and experts such as Bob Bemer, Tom Crouch, Irene Dec, Martyn Emery, Dave Hall, Robin Guenier, Leon Kappelman, John Koskinen, Dick Lefkon, Howard Rubin, Bill Ulrich, and Ed Yourdon were in frequent e-mail contact.

I estimate that by early 1999 no fewer than 500 major websites containing at least 1,000,000 pages of Y2K information were available over the Internet.

In addition, the Internet allowed easy communication across the traditional boundaries of time and occupation. For example, many police and fire department Y2K researchers could quickly establish contact with their peers and colleagues throughout the United States and abroad. Municipal risk managers, Y2K software managers, attorneys, corporate risk managers, and Y2K consultants could easily communicate and deal with technical and social issues.

The Internet was also a primary tool for community action groups in scores of cities around the world. In the U.S., Y2K community meetings took place in both large cities and smaller communities.

But major journals either ignored the Y2K problem or put out information consisting more of “sound bites” than of substance. And because television stations, newspapers, and magazines traditionally do not deal with long-range issues, coverage of the Y2K problem by the media was often trivial and misleading. That changed in 1999, when the Y2K frenzy became “real news.”

By contrast, although the Web and the Internet did contain some shallow pieces and some articles that exaggerated the risks, the volume of solid Y2K information was probably an order of magnitude more plentiful on the Internet than in all other channels of information prior to 1999.

I FIRMLY BELIEVE that the Internet and World Wide Web provided easy access to timely and important information and were a major factor in preventing the Y2K situation from being a whole lot worse. ■


July 2000 PM Network
