Project Management Institute

Lessons learned from the U.S. health benefit exchange projects


Dominic Lepore, IT and Project Management Consultant, Terrapin Consulting

Abstract

This paper examines the lessons learned from the health benefit exchange websites mandated under the U.S. Affordable Care Act (ACA) of 2010. States could either join the federal exchange or implement their own. The federal website and state-run websites such as those in Oregon and Maryland experienced significant project-related issues that hindered their effectiveness, while exchanges in Kentucky and Connecticut launched more successfully. This paper examines the common characteristics of both the failed and the successful projects. Common characteristics of the failed projects include poor understanding of scope, lax change management processes, confusion over roles and responsibilities, and ineffective risk management. Successful projects excelled at scope management and change control, risk management, and stakeholder management, and they included sufficient time for testing.

Summary

The Affordable Care Act of 2010 provided for the establishment of health benefit exchanges (hereafter referred to as exchanges or websites) to facilitate the purchase of health insurance. States could either join the federal exchange or implement their own. Sixteen states and Washington, D.C. opted to develop their own exchanges. On October 1, 2013, the state exchanges and the federal healthcare.gov website were launched. The exchanges were intended to let consumers, in the words of President Barack Obama, “compare … plans side by side, the same way you would … shop for a TV on Amazon” (2013). The results were mixed. The federal site and several of the state websites experienced significant problems, but some websites launched successfully.

This paper examines the lessons learned from the health benefit exchange websites. Three states with failed projects released significant data from status reports prepared during the projects by independent quality assurance contractors. Data from the status reports, along with other publicly available data, were categorized by PMI Knowledge Area.

Based on the analysis, the greatest amount of risk was identified in the Project Scope Management, Project Integration Management, and Project Time Management Knowledge Areas. None of the three states defined scope effectively; each either included features that were not required for launch or omitted functionality that was.

For Project Integration Management, the failed projects suffered from lack of planning and lack of change control. In Vermont, the QA contractor said “there appears to be very little control over changes in the schedule, deliverables or scope. Impact to other project areas is not analyzed or alternatives presented” (Gartner, 2013a, p. 7). In Maryland, there was no Change Control Board even though the request for proposals called for one and the winning proposal included it (Barnickel, 2014, p. 11).

Most schedule estimates were poor, and significant schedule compression occurred toward the end of each project. The compression mostly cut into testing time, and some functionality was released with minimal or no testing.

There were risks in the remaining PMI Knowledge Areas, but not necessarily on all three projects. In human resources, Vermont's vendor had 20 unfilled positions just months before launch, and Oregon had poorly defined roles and responsibilities. Oregon changed its procurement strategy and decided not to hire a systems integrator, and Maryland suffered from a lack of communication. Finally, none of the states managed risk well: risks such as a lack of testing were accepted instead of being mitigated.

On the other end of the scale, Connecticut, Kentucky, and Washington state had successful projects. The analysis shows the successful projects excelled at scope management and change control, risk management, and stakeholder management, and they included sufficient time for testing. For example, under Project Scope Management, these states defined scope aggressively and narrowly. Unlike the federal healthcare.gov site, neither Kentucky nor Washington state required account creation as a prerequisite to shopping for insurance plans.

These lessons learned from both the failed and successful projects provide insight to project managers involved in large public sector projects.

This paper does not evaluate the policy goal of the ACA—that is, insuring more Americans. Nor does it examine aspects of the project outside of the exchanges themselves.

Methodology

The following analysis is based entirely on publicly available data on the exchanges. The three failed projects are the state health benefit exchanges in Maryland, Oregon, and Vermont. The three successful projects are the state health benefit exchanges in Connecticut, Kentucky, and Washington state. The decision to classify a project as a success or failure was based on the opinions of the stakeholders, not the author.

Maryland, Oregon, and Vermont, three of the exchanges that did not function properly at launch, released significant amounts of data about their projects. All three states released portions of status reports prepared contemporaneously by quality assurance (QA) or independent verification and validation (IV&V) contractors on the projects (hereafter, both are referred to as the QA contractor). In addition, Oregon hired a contractor to perform an audit after the scheduled launch date for the website, and Maryland charged its Office of Legislative Audits with doing the same.

The risks identified by the QA contractors were categorized by PMI Knowledge Area from A Guide to the Project Management Body of Knowledge (PMBOK® Guide) – Fifth Edition (2013). For Maryland and Oregon, the number of times each risk appeared in the status reports was tallied to indicate which PMI Knowledge Areas caused the most problems.
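
As a minimal sketch of this tallying step (in Python, with hypothetical risk entries; the actual categorization was a matter of expert judgment applied to each QA report item), the counts by Knowledge Area can be computed as follows:

    from collections import Counter

    # Each risk from the status reports, already mapped to a PMI
    # Knowledge Area (the entries below are illustrative only).
    categorized_risks = [
        "Project Scope Management",
        "Project Integration Management",
        "Project Scope Management",
        "Project Time Management",
        "Project Scope Management",
    ]

    # Tally how often each Knowledge Area appears across the reports.
    for area, count in Counter(categorized_risks).most_common():
        print(f"{area}: {count}")
    # Project Scope Management: 3
    # Project Integration Management: 1
    # Project Time Management: 1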

For Vermont, the QA contractor included a severity rating with each risk. Each severity was assigned a numerical score: Low = 1, Medium = 3, and High = 9. For example, a risk that appeared in three reports, always with Low severity, summed to 3 points, while a risk rated Low in one report, Medium in one report, and High in three reports summed to (1 × 1) + (1 × 3) + (3 × 9) = 31. This example is shown in Exhibit 1 – Example of the Vermont analysis.


Exhibit 1 – Example of the Vermont analysis
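
The Vermont scoring scheme reduces to a weighted sum. The following Python sketch reproduces the Exhibit 1 example using the weights defined above (Low = 1, Medium = 3, High = 9); the function and data names are illustrative, not part of the Gartner reports:

    # Severity weights as defined in the methodology above.
    SEVERITY_WEIGHTS = {"Low": 1, "Medium": 3, "High": 9}

    def risk_score(severities):
        """Sum the weighted severity of one risk across all QA reports."""
        return sum(SEVERITY_WEIGHTS[s] for s in severities)

    # A risk appearing in three reports, always Low: 1 + 1 + 1 = 3
    print(risk_score(["Low", "Low", "Low"]))                      # 3

    # Low once, Medium once, High three times:
    # (1 x 1) + (1 x 3) + (3 x 9) = 31
    print(risk_score(["Low", "Medium", "High", "High", "High"]))  # 31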

Little data has been released on the successful projects. Analysis of the successful projects is based on either meeting minutes from the Exchange Board (Washington state) or interviews with participants (Kentucky and Connecticut).

Results

Maryland Health Benefit Exchange

Testimony by the Maryland Department of Health and Mental Hygiene after the launch of the health benefit exchange noted the most significant problems (Sharfstein, 2014, p. 4):

  • Serious software defects
  • Poorly configured software that crashed on October 1
  • Major challenges integrating different software products.

Exhibit 2 shows the results of the analysis of the Maryland health benefit exchange. Maryland has only released the executive summaries of the monthly status reports prepared by BerryDunn. The executive summaries only include high risks. For example, the February executive summary lists five high risks but says there are 12 risks in total (BerryDunn, 2013, p. 2).


Exhibit 2 – Count of Risks by PMI Knowledge Area – Maryland

It is notable that the QA contractor recorded an average of five high risks per month for the first eight months, and that number jumped to 12 in the month before launch and 15 in the month of launch. It is also notable that the QA contractor sent a letter to the Executive Director of the Maryland health benefit exchange five days before launch that listed eight key reasons why the state should reduce functionality at launch (Leadbetter, 2013). The Executive Director responded that she remained “comfortable with our plan to move forward” (Pearce, 2013). The Executive Director resigned two months after launch (Cox, 2013).

Maryland spent US$90 million on its website before deciding to scrap it and install software from Connecticut's exchange for an additional US$40–50 million (Johnson, 2014).

Oregon Health Benefit Exchange

Oregon has the distinction of never launching a functional website. On September 16, 2013, two weeks before go-live, the government project manager said that while there were issues, “bottom line: we are on track to launch” (First Data, 2014, p. 61). The website subsequently failed system testing for two months. In November, the website launched in the form of a fillable PDF that had to be completed and processed manually. The website was abandoned in April 2014 when Oregon chose to join the federal healthcare.gov site (Wozniacka, 2014). Both the executive director and the lead IT person resigned. The site was estimated to cost US$248 million (Kopta, 2014).

Oregon was hobbled by a convoluted management and oversight structure. The state created an agency named Cover Oregon to work in conjunction with the Oregon Health Authority and the Oregon Department of Human Services while oversight was charged to the Cover Oregon Board, the Department of Administrative Services, the Legislative Fiscal Office, and the federal Center for Medicare and Medicaid Services.

After the failure, Oregon hired First Data to perform an audit. First Data's audit findings rely extensively on reports by MAXIMUS, the QA contractor on the project. First Data summarized the QA data into 15 high risks. After the analysis for this paper was completed, the actual MAXIMUS QA reports, which contain significantly more data, were released.


Exhibit 3 – Count of risks by PMI Knowledge Area – Oregon

The audit report noted three main areas that did not function effectively:

  1. Competing priorities and conflict among agencies involved in the exchange
  2. Lack of universally accepted foundational project management processes and documents
  3. Communication and lack of transparency (First Data, 2014, pp. 2–3).

The audit also pointed out failings such as the lack of formal meeting notes and documentation. For example, the project decision log, which was designed to capture risks, issues, and decisions, had only nine entries.

The project also suffered from two major procurement mistakes. First, the state decided not to hire a systems integrator, which is standard practice for large IT efforts. Second, the state used a time and materials contract for the software vendor's consultants instead of the best-practice fixed-price contract.

The state lacked the experience and personnel to perform the systems integration function and lacked the contractual authority to demand more from the software vendor.

Vermont Health Benefit Exchange

Vermont released 14 QA reports from Gartner. Gartner pointed out project management failings such as:

  • “Plan continues to be reworked and updated, there is no way to report on or assess progress. It is not clear what components will be delivered when. The dates seem to be unrealistic and unachievable. Some dates violate [federal] schedule requirements.” (Gartner, 2013b, p. 4)
  • “Appears to be very little control over changes in schedule, deliverables, or scope. Impact to other project areas are not analyzed or alternatives presented.” (Gartner, 2013b, p. 9)

In reports dated May 14, May 22, June 19, August 14, August 28, September 12, September 29, and (after the scheduled launch date) October 11, Gartner reported that the development, test, pre-production, training, production, and disaster recovery environments were not ready as planned. This resulted in substantial schedule compression and minimal time for testing. Gartner also highlighted understaffing as an issue. Just four months before launch, the state had not filled the roles of Test Manager, Training Lead, Solution Architect, and Technical Architect, and the vendor had 20 unfilled positions with no staffing plan to address the issue.

One poor decision concerned the response to a risk Gartner recorded on August 28 (about five weeks before launch): “It is likely that functionality will not be tested until it is too late” (Gartner, 2013c, p. 4). Vermont chose to accept this risk rather than mitigate it.


Exhibit 4 – Sum of risk ratings by PMI Knowledge Area – Vermont

Vermont's website cost US$19 million. This figure is incomplete, as an additional US$200 million in funding for the state's Medicaid systems is partly intended to support the health benefit exchange.

Summary of the Failed Projects

All three state projects did a poor job of understanding and controlling scope. This, combined with unrealistic schedule estimates, ensured the websites would not launch successfully. Additional problems arose from insufficient staffing in Vermont and confusion over roles and responsibilities in Oregon, and Oregon made two poor procurement decisions. Finally, all three states either did not understand or chose not to believe the amount of risk on their projects and therefore chose the wrong risk response technique.

Summary of the Successful Projects

There is insufficient data on the successful projects to analyze by PMI Knowledge Area. What follows are the areas that the participants felt contributed most to the success of their projects.

Washington state's project governance was nearly the opposite of Oregon's: the state had a single project leader with a single oversight committee (the Exchange Board), and it hired a systems integrator on a deliverables-based contract.

Washington had a clearly defined scope that was realistic and achievable. The state did not include nice-to-have functionality, such as the live chat feature attempted by some of the exchanges. It also avoided the healthcare.gov mistake of forcing users to create an account before browsing insurance plans. Reports indicated that the project had an effective change control process with a focus on feedback from the QA contractor.

Connecticut was also aggressive on scope. Instead of the 14 major functions included in the healthcare.gov website, Connecticut focused on just six. For example, Connecticut did not spend time building a function to collect insurance premiums from customers, leaving that task to the insurers. This management of scope gave the state time to do extensive testing. Connecticut also used a phase-gate approach in which the project did not move forward until the phase-gate exit criteria were satisfied.
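
A phase-gate process can be thought of as a hard check before each transition. The Python sketch below illustrates the idea with hypothetical phases and exit criteria; Connecticut's actual gate criteria have not been published:

    # Hypothetical phases and exit criteria, for illustration only.
    PHASE_GATES = {
        "Design": {"requirements signed off", "architecture approved"},
        "Build": {"code complete", "unit tests passed"},
        "Test": {"system tests passed", "defect backlog triaged"},
    }

    def gate_passed(phase, completed):
        """A phase may exit only when every exit criterion is satisfied."""
        return PHASE_GATES[phase] <= completed  # subset test

    done = {"requirements signed off", "architecture approved"}
    print(gate_passed("Design", done))  # True: the project may advance
    print(gate_passed("Build", done))   # False: the project holds at the gate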

Similar to Washington and Connecticut, Kentucky managed scope aggressively. Users were not forced to create an account before browsing available health plans, as in healthcare.gov, and the renewal function was deprioritized because insurance renewals would not begin for a year.

All three states highlighted extensive testing as key to their projects.

Methodology Bias

The methodology suffers from several sources of bias that affect the analysis:

  1. The definition of risk used by each contractor was not stated and could be different. A high risk in Maryland may not be equivalent to a high risk in Oregon.
  2. The categorization of risks into PMI Knowledge Area is based on expert opinion. A risk categorized in the Project Scope Management area could be categorized into the Project Integration Management area by another expert.
  3. Original project data on scope, schedule, and budget has not been released for any project. Claims of scope, schedule, or budget performance cannot be verified independently.
  4. Data from the successful projects is mostly from involved parties who have an interest in exaggerating the success of the project or their role in it.

References

Barnickel, T. J., III. (2014, April 3). Document review of the Maryland health benefit exchange.

BerryDunn. (2013, February 8). Monthly review report.

Cox, E. (2013, December 7). Maryland health exchange director resigns after questions about vacation. The Baltimore Sun. Retrieved July 27, 2014 from http://articles.baltimoresun.com/2013-12-07/health/bs-md-rebecca-pearce-20131206_1_pearce-own-exchanges-carolyn-quattrocki

First Data. (2014, March 19). Cover Oregon implementation assessment report.

Gartner. (2013a, May 14). Bi-weekly quality assurance status report.

Gartner. (2013b, June 19). Bi-weekly quality assurance status report.

Gartner. (2013c, August 28). Bi-weekly quality assurance status report.

Johnson, J. (2014, April 18). Md. spent $90 million on health exchange technology, according to cost breakdown. The Washington Post. Retrieved July 27, 2014 from http://www.washingtonpost.com/local/md-politics/md-spent-90-million-on-health-exchange-technology-according-to-cost-breakdown/2014/04/18/5f2e7600-c722-11e3-8b9a-8e0977a24aeb_story.html

Kopta, C. (2014, May 16). Cover Oregon insider: “It was off the rails from the time it got started.” KPIC. Retrieved July 27, 2014 from http://www.kpic.com/news/local/259530231.html.

Leadbetter, C. (2013, September 26). Letter to Rebecca Pearce.

Obama, B. H. (2013, September 26). Remarks by the President on the Affordable Care Act. Retrieved July 27, 2014 from http://www.whitehouse.gov/the-press-office/2013/09/26/remarks-president-affordable-care-act.

Pearce, R. (2013, September 30). Letter to Charlie Leadbetter.

Project Management Institute. (2013). A guide to the project management body of knowledge (PMBOK® guide) – Fifth edition. Newtown Square, PA: Author.

Sharfstein, J. (2014, January 14). Testimony before the Committee on Health and Government Operations, Maryland House of Delegates.

Wozniacka, G. (2014, April 25). Cover Oregon board votes to move to the federal exchange. Associated Press. Retrieved July 27, 2014 from http://www.katu.com/news/local/Cover-Oregon-board-votes-to-move-to-federal-exchange-256731961.html

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

© 2014, Dominic Lepore
Originally published as a part of the 2014 PMI Global Congress Proceedings – Phoenix, Arizona, USA
