Hybrid Waterfall Agile Development for the Federal Space

Loraine Boyne, PMP, CSM, Program Manager, CSC

Abstract

Historically, the Waterfall methodology has been ingrained in U.S. federal software development contracts, with requirements determined prior to software development. However, in 2010, focus shifted to “ensuring that results matter more than plans” (Kundra, 2010, p. 33). Agile is better suited for achieving the goal of “breaking projects into more manageable chunks and demanding new functionality every few quarters” (Kundra, 2010, p. 1). However, federal governance directives to produce and deliver plans before results have not changed. How then is it possible for federal projects to be Agile?

This paper presents a case study of how we have addressed this challenge in a current program. The client is a United States federal agency. Programmatic responsibilities center on the development of workflow applications designed to increase the efficiency of agency business processes. To complete this program, we designed and implemented a hybrid Waterfall Agile development methodology that is based on having two development process levels. The first level supports Waterfall development processes and is designed to meet all government requirements. The second level enables iterative Agile application development in a non-government contractor-controlled environment. Work in the Agile environment is bubbled up every quarter into a new release that is deployed into the client's production environment.

Overall, our Government Sponsor has been very pleased with our hybrid methodology. While there are challenges to utilizing Agile principles in a federal context, the benefits far outweigh the additional effort. Direct benefits have included applications that are designed directly to meet end-user needs, quick deployment of new functionalities, and enthusiastic end-user involvement. We are fortunate to have a very knowledgeable and engaged Government Sponsor to serve as our Product Owner. With her support, each version of our software not only meets preset requirements, but also is shaped by the end-users. Consequently, with each release, our applications become increasingly user-friendly and meet changing user needs. This has led to increasing user adoption and deployment of our application into multiple organizations across our Government Sponsor's agency.

Introduction

The U.S. federal government has historically issued contracts for very large software development projects that must meet a predetermined list of requirements. Development of these applications followed a traditional Waterfall approach with requirements and system design being delivered upfront. Then, throughout the development lifecycle, standard Waterfall documentation was required for monitoring progress and passing capstone reviews (Lapham, Williams, Hammons, Burton, & Schenker, 2010). Development occurred with little if any input from end-users. A final release would then be deployed with plans for application maintenance, but few plans for application upgrades or modifications. This system led to a number of application failures and dissatisfied end-users.

As a result of tightening budgets and goals for modular functionality implementations, Agile is now acceptable as an alternative to Waterfall in requests for proposals. Agile has demonstrated the ability to provide mission efficiency at a lower cost while maintaining quality. Moreover, the strategic benefits of remaining responsive to quickly changing user needs are well-aligned with federal objectives (Lapham et al., 2010). Consequently, Agile methods are becoming increasingly accepted by federal clients.

Despite the desire to go Agile, federal governance structures are still designed to govern the development of traditional applications. It is thus necessary to meet Waterfall-structured requirements while leveraging Agile methodologies. The case study presented here describes one methodology for performing Agile development while meeting federal governance requirements.

The Program and Applications

The program discussed in this case study is responsible for the development of two applications for a federal agency. Each application is designed to meet similar, but distinct, operational needs of federal employees and contractors responsible for monitoring the flow of tasks and approvals as they are dispersed and collected throughout the agency. General requirements include the ability to enter data once, assign a task or approval within the appropriate organizational structure, monitor the completion of activities, maintain associated artifacts in a single repository, and report on overall organizational performance with regard to these activities. Given the similarity of the applications and their shared development methodology, we will henceforth refer to them as a single application.

End-user requirements stem from the business processes the application is intended to make more efficient. With key insight into where roadblocks exist in their processes, end-users are invaluable in determining how an application can automate current processes to overcome these hindrances. In addition, it is common for a complete workflow to occur in less than eight hours. Thus, any application attribute that adds steps, even one that costs only a few moments, must be addressed. For these reasons, methods that allow the continual integration of user feedback were required.

As will be discussed further in subsequent sections, each application is scheduled to have quarterly releases. Planning for each new version incorporates lessons learned (Project Management Institute, 2008, Section 10.4.3) from previous versions and leverages end-user feedback in the prioritization of new functionality. In addition, if critical issues are discovered, emergency releases can be deployed.

Team Structure

Individuals supporting the program are divided into three distinct but overlapping teams: the program management team, the development team, and the socialization team. While the teams are responsible for both Agile and Waterfall activities, their structure is driven by Agile principles. Each team is empowered to self-organize and make decisions regarding its activities. End-user representation is present in all teams and is integrated into daily activities (Waters, 2007). Above all, a sense of ownership resides among team members: all teams feel part of one overarching team that works together, communicates regularly, and shares in the success of the program.

Program Management Team

The program management team is responsible for traditional program management activities. These include managing all team activities; coordinating interaction between the teams and the client; developing and maintaining all program documentation; creating Waterfall deliverables; and verifying that contractual requirements are met. The program management team also performs technical activities, including managing the teams' multiple development and test environments and managing application configuration settings. Program management team members also serve as the program's testing team. They are responsible for creating test cases and performing both integration and system testing.

Development Team

The development team is responsible for the actual coding of the application and unit testing. Most development activities are Agile in nature, including daily scrum meetings. However, the development team is also responsible for the completion of the System Design Document required by Waterfall and supports the creation of documents required by the federal Infrastructure Change Control Board.

Socialization Team

The socialization team is responsible for interacting with end-users. Interactions include: presenting application briefings to agency leadership; responding to user questions; troubleshooting user issues; creating user materials such as the application user guide; and training users individually and in groups. The team's awareness of end-user activities is critical to the ongoing development of functional requirements as well as the prioritization of the Agile backlog.

Despite having three teams, there is little hierarchy, resulting in a very flat overall team structure. Each team is self-organizing, with some supervisory roles filled by the Government Sponsor (Product Owner), Program Manager, and Solution Architect. However, each team divvies up its responsibilities on its own, with little, if any, assignment of duties. This is possible due to the small size of our teams. Currently, the program management team consists of four members, the development team of eight members, and the socialization team of four members.

Information Technology in the Federal Environment

Within the U.S. federal government, most departments and agencies have management directives that shape all information technology (IT) acquisitions and development lifecycles. These directives are designed to govern traditional software development projects that proceed chronologically through a series of gates until the software application is complete. These gates form the structure of the System Engineering Life Cycle (SELC) (DHS AMD, 2010). Depending on the size of the project, its SELC can be tailored to reduce the number of required activities and associated documentation. Size is categorized into three levels. Level 1 programs have an annual expenditure level of over USD 1 billion. Level 2 programs have an annual expenditure level of between USD 100 million and USD 1 billion. Level 3 programs have an annual expenditure level of under USD 100 million (Department of Homeland Security, 2010). The program described in this case study is a Level 3 program.
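As a minimal sketch of how these expenditure thresholds partition programs (the function is illustrative; handling of programs falling exactly at the USD 100 million or USD 1 billion boundaries is an assumption, as the directive text gives only ranges):

```python
def selc_level(annual_expenditure_usd: float) -> int:
    """Classify a program's SELC level from its annual expenditure,
    per the thresholds described above. Boundary handling is assumed."""
    if annual_expenditure_usd > 1_000_000_000:    # over USD 1 billion -> Level 1
        return 1
    if annual_expenditure_usd >= 100_000_000:     # USD 100 million to 1 billion -> Level 2
        return 2
    return 3                                      # under USD 100 million -> Level 3

# The program described in this case study is a Level 3 program:
assert selc_level(50_000_000) == 3
```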

The System Engineering Life Cycle

The SELC is a government-mandated methodology that outlines a common system life cycle. The intent of the SELC is to ensure the efficient and effective delivery of agency capabilities. The SELC has nine process stages that include: Solution Engineering, Planning, Requirements Definition, Design, Development, Integration & Test, Implementation, Operations and Maintenance, and Disposition. Each of these stages has a corresponding System Engineering Review and required documentation (DHS AMD, 2010). As demonstrated by the progression of the stages, the SELC is structured around the completion of a standard Waterfall development process. Planning and requirements are delivered upfront and design and development follow. While iteration between stages is allowed, it can result in the inefficient repetition of activities and document creation.

The SELC is designed to manage all program levels and thus has an extensive number of requirements. These requirements can be burdensome for smaller programs. Consequently, tailoring of the SELC is allowed on a case-by-case basis to determine which activities and documents a program should complete. As a Level 3 program, our program was able to significantly reduce the number of federal reviews and documents required. Following PMI guidance (Project Management Institute, 2008, Section 2.4.3), our team worked in conjunction with our Government Sponsor to implement project-appropriate documentation based on the size and scope of our project. This allowed us to both streamline and satisfy Waterfall requirements while taking advantage of Agile benefits.

The Infrastructure Change Control Board

The Infrastructure Change Control Board (ICCB) is responsible for evaluating all changes that are to be made to the agency's technical infrastructure. Changes under the purview of the ICCB include: application code changes; installation, removal, or upgrade of commercial products; system code changes; hardware changes; and other similar changes. Deployment of an application upgrade thus requires ICCB approval. For a change to be considered by the ICCB, a request to appear on the ICCB docket must first be submitted. Advance notice is required to be placed on the ICCB agenda unless the request is considered an emergency. Thus, the time required to appear before the ICCB must be included in the project schedule.

Both the ICCB and the SELC tailoring plan delineate the required documentation for a particular program. The required documentation for Level 1 and 2 programs can be extensive. However, document requirements can be reduced to accurately reflect those needed to communicate and control smaller programs. The tailoring of requirements based on our program size has been key to the success of our hybrid methodology.

Hybrid Waterfall Agile Methodology

As previously described, the program is responsible for developing an application that increases the efficiency of operational business processes. Given the need for responsiveness to the constantly changing nature of these processes, it was determined that the applications must have regular releases that would continually add and shape functionality. In addition, in order to verify that end-user requirements are met, constant feedback was also deemed a critical factor to the program's success. As a result, it was determined that Agile development principles were aligned with the nature of our program. However, given the constraints of this government contract, a pure Agile approach would not have been appropriate. A number of contractual and programmatic constraints exist that would not be met through Agile methods (Lapham et al., 2010). Thus, a hybrid Waterfall Agile methodology was developed to garner the benefits of Agile development while meeting Waterfall-based requirements.

It is not uncommon for a blend of methodologies to be utilized. Hybrid processes that allow development teams to be Agile, while formal product requirements are met, are currently being utilized (McKenrick, 2011). And, as we demonstrate, “for some projects, a hybrid approach is most appropriate to get the best of both worlds: predetermined parameters and the freedom to move within them” (Nee, 2011).

Dual Development Process Levels

Our hybrid Waterfall Agile development methodology is based on having two development process levels, as seen in Exhibit 1. The first level supports Waterfall development processes and is designed to meet all federal requirements. These requirements are met only for each release (not for each sprint); thus, the Requirements and Integration and Test phases occur only quarterly. The second level supports Agile development. Agile methods are used for the Design and Implement stages of the Waterfall methodology and occur in a separate Agile development environment. The final Maintain stage is ongoing and contributes to requirements gathering.

Exhibit 1: Hybrid Waterfall Agile Development Methodology

While the Waterfall and Agile levels will be described fully in subsequent sections, this section describes the basic structure of our methodology. Our primary application has four quarterly releases. Each release is composed of six two-week sprints. Prior to beginning sprint 1, requirements are translated into user stories and entered into our backlog. The backlog is then prioritized by the Product Owner. The Agile development team then evaluates the prioritized backlog and slates user stories for each sprint within the release. Agile methods are then utilized for the completion of six sprints, with each sprint resulting in a functional build of the application. Each build is deployed in the test environment and tested by the Product Owner, team members, and select end-users. Test results are then compiled and user feedback is incorporated into the next sprint.
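The release cadence described above can be summarized in a short sketch (Python; the helper name and the example start date are hypothetical, used only to illustrate the six-sprint structure):

```python
from datetime import date, timedelta

def sprint_schedule(release_start: date, sprints: int = 6, weeks_per_sprint: int = 2):
    """Lay out the two-week sprints that make up one quarterly release."""
    schedule = []
    start = release_start
    for n in range(1, sprints + 1):
        end = start + timedelta(weeks=weeks_per_sprint) - timedelta(days=1)
        schedule.append((n, start, end))
        start = end + timedelta(days=1)
    return schedule

# Six two-week sprints span a minimum of 12 weeks per release:
for n, start, end in sprint_schedule(date(2012, 1, 2)):
    print(f"Sprint {n}: {start} to {end}")
```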

At the end of sprint 6, a final build is subjected to extensive system testing. When complete, required documentation is updated by the program management team and presented to the ICCB. Once accepted, the build is then deployed into the federal staging environment. Additional testing is then completed. When the release build is verified as ready for production, it is deployed as a new version into the federal production environment. End-users then have access to the new version and continue to give feedback on previous and new functionality.

Waterfall – Requirements Phase

Requirements gathering was first held at the beginning of our program. Functional requirements were gathered from users through a working group and then translated into technical requirements. Both functional and technical requirements were then compiled into a requirements traceability matrix (RTM). This original RTM was the basis for the development of the first version of our application. It is worth noting that the original working group was composed of only one type of user. Consequently, original functionality was developed to meet the needs of this group only.

With the deployment of the first version of our application, there was then the opportunity to gain feedback from all users of the application. It was at this point that the need for a working group with representatives from all user types was discovered. Consequently, a new working group was assembled with a diverse set of user types. The result was a set of new functional requirements that addressed the needs of all user types. Requests for future enhancements are now continually collected from this working group as well as from all users.

Interaction Point – Requirements Translation

With two levels, there must be interaction points where Waterfall artifacts are translated into Agile artifacts and vice-versa. Interaction point activities occur only once per release (approximately quarterly). The first interaction point occurs between requirements gathering and design. At this point, functional and technical requirements recorded in the Waterfall RTM are translated into the Agile user stories contained in the backlog. In addition to the user stories developed from the original set of requirements, user stories based on end-user requests are also continually added to the backlog. Members from the socialization team, including the Product Owner, support the creation of user stories.
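A minimal sketch of this translation step, under the assumption that RTM rows and backlog items can be modeled as simple records (all type and field names here are illustrative, not the program's actual tooling):

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One functional or technical requirement: a row of the Waterfall RTM."""
    req_id: str
    description: str

@dataclass
class UserStory:
    """One Agile backlog item, with traceability back to the RTM."""
    story_id: str
    narrative: str
    rtm_ids: list = field(default_factory=list)  # RTM rows this story satisfies
    points: int = 0                              # assigned later by the development team
    status: str = "open"

def to_user_story(req: Requirement, seq: int) -> UserStory:
    """Translate an RTM requirement into a backlog user story."""
    return UserStory(
        story_id=f"US-{seq:03d}",
        narrative=f"As an end-user, I need to {req.description}.",
        rtm_ids=[req.req_id],
    )

backlog = [to_user_story(r, i + 1) for i, r in enumerate([
    Requirement("RTM-014", "assign a task within the appropriate organizational structure"),
    Requirement("RTM-015", "monitor the completion of activities"),
])]
```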

With a set of user stories in the backlog, the release planning meeting is held. The purpose of the release planning meeting is to prioritize and select which user stories will be completed for that particular release. The meeting is led by the Program Manager and the Product Owner; however, the entire team is present. In the meeting, key dates and milestones are discussed. A typical release consists of six two-week sprints and thus lasts a minimum of 12 weeks. However, additional time is added for activities in the Waterfall development process level.

Prioritization of the backlog is also completed in conjunction with the assignment of story points. The development team assigns story points to each user story. With a set release date, the development team's velocity is multiplied by the six sprints to get a rough estimate of how many story points can be completed within a single release (agileSherpa, 2010, Release Planning Meeting). The Product Owner then uses this information to prioritize the backlog such that an achievable set of functionality is selected for completion within this release. Any user story that is not selected for this release remains on the backlog for consideration in a future release. Prior to the conclusion of the meeting, the Solution Architect verifies with his team whether or not the user stories can be completed in the allotted time. Adjustments are made if necessary. The meeting concludes with a finalized prioritized backlog with functionality selected for development during that release.
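The capacity arithmetic above lends itself to a short sketch, continuing the UserStory type from the previous example (the function name and the treatment of stories too large for the remaining capacity are assumptions):

```python
def plan_release(prioritized_backlog, velocity_per_sprint: int, sprints: int = 6):
    """Select user stories for one release. Capacity is the team's velocity
    multiplied by the number of sprints; stories are taken in priority order
    while capacity remains."""
    capacity = velocity_per_sprint * sprints
    selected, deferred = [], []
    for story in prioritized_backlog:
        if story.points <= capacity:
            capacity -= story.points
            selected.append(story)
        else:
            deferred.append(story)  # remains on the backlog for a future release
    return selected, deferred
```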

Agile – Design and Implement Phases

Agile development principles replace the Design and Implement stages of the Waterfall methodology. Entry into the Agile level requires a prioritized backlog with functionality selected for development during that release. While the intent is to have all activities in these phases be Agile, Waterfall deliverable requirements based upon the tailored SELC are also completed by the program management team in response to efforts in this level. Agile development activities utilize a private program-controlled development environment.

Release Planning

The development team also holds its own release planning meeting. In this meeting, each user story is slated for a particular sprint, resulting in a rough sprint schedule. In addition, the self-organized development team divides responsibility for each of the user stories. The sprint schedule and point of contact for each user story are then relayed back to the program management team. It is understood by both the program management team and the development team that the sprint schedule and developer assignments are fluid and may change.

Six Sprints and a Release

Once all release planning is complete, development begins. Each sprint results in a functional build of the application with associated build release notes. The development team is responsible for testing the functionality of the build prior to delivering it to the program management team. The program management team then has the responsibility of independently testing the build to verify that all functionality indicated in the build release notes was completed. They also verify that previously existing functionality is still working as expected. Test results are then delivered by the program management team back to the development team.

For each two-week sprint, a sprint planning meeting and a sprint review meeting are held. After the first sprint, the review meeting for one sprint typically precedes the planning meeting for the next. The sprint review meeting verifies which features and functions have been completed during the sprint (agileSherpa, 2010, Iteration Review). Test results are reviewed, and any discrepancies between the release notes and the test results are discussed. The Product Owner participates in the sprint review by providing feedback on whether the functionality received has met expectations. If expected functionality is not delivered, the reasons why are discussed. The sprint review ends with updating the backlog to indicate completed functionality. Items closed on the backlog are also marked complete on the RTM. The augmented RTM thus tracks the release and sprint in which each requirement was completed.
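A minimal sketch of this backlog-to-RTM synchronization, continuing the earlier UserStory example (representing the augmented RTM as a plain dictionary keyed by requirement ID is an assumption):

```python
def close_story(story, rtm: dict, release: int, sprint: int):
    """At sprint review, mark a completed story done on the backlog and
    record on the augmented RTM the release and sprint in which each
    traced requirement was completed."""
    story.status = "done"
    for req_id in story.rtm_ids:
        rtm[req_id] = {"status": "complete", "release": release, "sprint": sprint}

rtm = {}
close_story(backlog[0], rtm, release=2, sprint=3)
# rtm -> {'RTM-014': {'status': 'complete', 'release': 2, 'sprint': 3}}
```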

The sprint planning meeting is then held soon after the sprint review. Items discussed in the sprint review that may affect the next sprint are evaluated. The Product Owner leads in the reprioritization of user stories if necessary. In addition, there may be a reevaluation of story points assigned to a particular user story. The development team then makes a decision on which user stories are slated for the next sprint. This may or may not reflect the original sprint schedule. The program management team is responsible for updating the sprint schedule with any changes made by the development team. The meeting concludes when all development team members agree that the user stories selected for the sprint are achievable, and commit to their completion (agileSherpa, 2010, Planning Meeting).

Upon completion of the six sprints, a final release build is compiled. This final build is subjected to thorough system testing to verify that all intended functionality prioritized by the Product Owner in each sprint is available and working as intended. The Product Owner is an active participant in testing and is responsible for determining if the build is ready for transition to the Waterfall process level. Once the build is approved, it moves into the Waterfall Integration and Test phase.

Interaction Point – Testing

The second interaction point between the Waterfall and Agile levels occurs between implementation and testing. Testing, in fact, straddles both levels. As previously discussed, unit testing is first performed by the developers in their development environment as functionality is added. A second round of testing occurs at the end of each sprint and is performed by the program management team. At the end of sprint 6, the release build is subjected to extensive system testing to verify that all previous and newly added functionality is working as intended. Once the release build is approved by the Product Owner, it is deployed into the federal staging environment. Testing in this environment, as well as in the federal production environment, is discussed in the following section.

Waterfall – Integration and Test Phase

Integration and testing in the Waterfall level begins when a final release build is deployed into the federal staging and production environments. In order for this to occur, an official change request must be submitted to the ICCB. To comply with ICCB requirements, the following documentation is submitted with each change request:

ICCB Change Request Form: The change request form includes a detailed description of the change, the implementation plan and schedule for the change, how the change will be tested, any risks and mitigations associated with the change, and any security implications.

Implementation Communication Plan: The implementation communication plan delineates all implementation activities and how all stakeholders will be made aware of intended activities. This is inclusive of individuals and their roles in implementation, all systems affected by implementation, and a back-out plan should any changes need to be reversed.

Version Description Document: The version description defines changes to the application that were made for the current release. This document builds upon previous version description documents and refers to other project documentation so as to minimize the maintenance of multiple documents.

Application Diagrams: The application diagrams provide screenshots or wireframes of the functions that are to be implemented. The diagrams highlight all design and process changes.

Integrated Master Schedule: The integrated master schedule (IMS) contains a time-based schedule of all tasks necessary to successfully execute program requirements. The IMS is used to evaluate progress toward meeting program objectives.

Once a release build is approved for integration into the federal environments, it is first deployed into the federal staging environment. In this environment, the application undergoes extensive regression testing. Testing in this phase has two stages. In the first, the program management team creates and executes use cases to determine that all previous and newly added functionality is implemented correctly. They also verify that data integrity has not been compromised. The Product Owner then performs the final rounds of system testing to determine if the build is ready for users. Approval by the Product Owner starts the second stage: user acceptance testing (UAT). UAT is completed by the working group. Socialization team members sit with working group members to verify test results and capture any future enhancements requested.

Once all testing is complete and application functionality is validated, the application is then deployed into the federal production environment. Since users have immediate access to the new release and data integrity must be maintained, testing in this environment is limited.

Subsequent to each release, a “lessons learned” or Release Retrospective meeting is held. In this meeting, lessons learned are discussed and collected as opportunities for improvement. These include positive activities that should be repeated, new activities that should be implemented in the future, and inefficient activities that should be reevaluated. These topics are in line with the common start-stop-continue meeting format (Port, 2010). Findings from this meeting are documented, and improved methods are implemented in the next set of sprints. Key benefits of this meeting include improved quality of work, improved ability to estimate team capabilities and meet expectations, increased productivity, and higher team cohesion and morale (agileSherpa, 2010, Retrospectives).

Waterfall – Maintain Phase

The maintain phase consists of all operation and maintenance (O&M) activities. O&M for our applications consists primarily of adding users and new organizations into the application. Issues uncovered by the socialization team working with users can usually be remedied by a system administrator. Persistent issues are captured and addressed in the subsequent release. In the case of critical issues, an emergency change request can be submitted. A new version can then be released to correct the issue.

Required Resources

To support this hybrid methodology, a number of resources are required for success. These include a distinct testing environment, a dedicated Government Sponsor serving as Product Owner, frequent team meetings, and user working groups.

Testing Environment

In order to alter the federal staging or production environment, a change request must be submitted to the ICCB and approved. To avoid completing this process at the end of each sprint, a separate testing environment is necessary. At the end of each sprint, application builds can be deployed and tested in this environment without additional paperwork. Control of a testing environment is also useful for trying out environment configurations that may not exist in the federal environments: necessary configurations can be determined and validated without a change request. In addition, patches and other environment changes can also be tested. Finally, the test environment allows the simultaneous deployment of multiple builds or releases to compare functionality or to validate that new releases do not unintentionally change previously deployed functionality. To guarantee that the application will work in the federal production environment, it is critical that the test environments mirror the federal production environment as closely as possible.

Dedicated Government Sponsor

Our Government Sponsor's full-time dedication has been critical to the success of our program for a number of reasons. First, she supports Agile development by serving as Product Owner and prioritizing user stories for implementation. Second, she serves as a liaison between our development team and the end-users. As such, she provides critical insight into their business processes and brings user feedback to the attention of our team. Finally, she garners support for the application among users, which increases user adoption and demonstrates success of the application. It is important to note here that these functions require an individual to be dedicated full-time. While success may not require the Government Sponsor herself to serve as Product Owner, a full-time federal employee with insight into user needs is required as a member of the team.

Frequent Team Meetings

With a distributed team, the ability to have multiple team meetings via phone, web-meeting, and other means is critical. These meetings are utilized to keep all team members apprised of team status. While the development team holds its own daily scrum meetings, one meeting is held each day for a varying subset of team members across all three teams. The focus of the meetings varies and some are Agile in nature, while others are Waterfall in nature. Regardless of the type of meeting, there is always time to discuss any issues that may need attention that day. Team meetings include:

Agile – Development Team Meeting: The purpose of this meeting is to discuss the development team's objectives for the week. If a sprint has just completed, this meeting will incorporate the sprint review meeting for the previous sprint and the sprint planning meeting for the next sprint. All team members attend this meeting.

Waterfall – Program Status Meeting: The purpose of the program status meeting is to discuss the progress of the current release and the overall progress of the program. Any risk factors that need to be incorporated into the schedule are discussed along with any risk mitigation plans. An updated integrated master schedule is delivered to our Government Sponsor. Attendees include the program management team, the Government Sponsor, and the Solution Architect.

Waterfall – User Feedback Conference Call: The purpose of this meeting is two-fold. First, any application updates are provided to the users. Updates typically include announcements of known issues that are currently being handled, updates on an upcoming release, and the schedule of upcoming training sessions. Second, this meeting provides a forum for users to ask questions, notify the team of any issues, provide general feedback, or request future functionality. Any new requirements garnered from these meetings are added to the RTM, translated into user stories, and placed in the backlog. New user stories can be prioritized for either the next sprint (in the case of a critical issue) or a future release. Attendees of this call include the Government Sponsor (Product Owner), a representative from the program management team, and end-users.

Agile – Weekly Stand-Up Meeting: The purpose of this meeting is for the development team to provide an update on their tasks for that week. This includes communicating what they have completed, what they are working on next, and any issues that may be affecting their productivity. While Agile recommends daily stand-up meetings of no more than 15 minutes (agileSherpa, 2010, Daily Standup), development updates on other days are provided as part of other meetings. Attendees include the development team, a representative from the program management team, and the Product Owner.

Strategic Planning Meeting: The purpose of this meeting is to discuss strategic initiatives. This includes the implementation of functionality that may need additional planning or research to implement. Strategic initiatives have included topics such as mobile views of the applications, upgrade of the user interface, and future archiving and records management needs. Attendees include the program management team, the Government Sponsor (Product Owner), and the Solution Architect.

Issues Meeting: The purpose of this meeting is to bring user issues to the attention of the development team. These issues are located in the production environment to which the development team does not currently have access. The issue meeting can also be used to discuss test results and any outstanding issues from the last sprint that may still need attention. Attendees include the socialization team, a representative from the program management team, the Solution Architect, and occasionally some members of the development team.

User Working Groups

A critical element of Agile development is incorporating user feedback. This feedback shapes the next sprint and the next release by providing insight into which functionality is most critical, so that it can be implemented first. To gain access to user feedback, it is necessary to have access to users. With the support of our Government Sponsor, our socialization team manages one user working group for each of our applications. The working groups serve multiple functions. First, they constantly provide feedback on the current version of the application in the production environment. Second, they participate in the prioritization of functionality for future application releases. Third, they participate in user acceptance testing. Finally, they are available to support design decisions in cases where it is unclear how to implement a user story. The combination of these functions provides the user guidance necessary to develop an application that meets user needs.

Overcoming Challenges

Implementing a hybrid methodology in a federal environment helps overcome some challenges, while presenting new ones. The following sections describe an example of each type of challenge.

Agility in a Federal Environment

As previously discussed, approval from the ICCB is required to make any changes to the federal staging and production environments. These approvals are time-boxed: any approved changes must take place within a two-week period. Thus, the team has two weeks to uncover any issues, address them, and deploy a solution. To overcome this time constraint, three mitigation strategies have been implemented. First, we secure support from a system administrator prior to appearing in front of the ICCB. Through his support, we gain additional insight and access to logs that can point to possible causes of any issues found. Second, we successfully negotiated for one of our team members to have administrative privileges in the federal staging environment. These privileges allow us to deploy and test fixes as often as needed within the window until all issues are resolved.

Third, and foremost, we employ Agile development principles. With a standard Waterfall approach, our development team would not be able to develop a resolution to system issues in such a short time period. This would result in having to reappear in front of the ICCB to test each fix as it is developed. An Agile approach enables the development team to be made aware of any environment-related application issues and develop fixes within the two-week time frame. Consequently, additional approvals from the ICCB are not necessary and users have access to the application on the date originally intended.

Distributed Development Team

Due to the nature of our federal contract, our development team consists of both contractors and sub-contractors. Each set of developers has multiple work locations that exist in four states. The need for constant communication is thus paramount. The multiple weekly meetings described previously meet this need. We also encourage open communication between all members of our team. We have found it extremely valuable to have testers discuss test results directly with developers and also to have trainers, who interact with the end-users, discuss functional requirements directly with the developers. It is for this reason that the development team assigns a point of contact for each user story being developed.

In addition, release planning meetings often occur in person. Our Government Sponsor, program management team, Solution Architect, and key developers meet to plan each release, discuss prioritization of requirements, and finalize the sprint schedule. Since these meetings occur only a few times a year, the cost of bringing team members together is minimal.

Conclusion

Overall, our Government Sponsor has been very pleased with our hybrid methodology. While there are challenges to utilizing Agile principles in a federal context, the benefits far outweigh the additional effort. Direct benefits have included applications that are designed directly to meet end-user needs, quick deployment of new functionalities, and enthusiastic end-user involvement. We are fortunate to have a very knowledgeable and engaged Government Sponsor to serve as our Product Owner. With her support, each version of our software not only meets preset requirements, but also is shaped by the end-users. Consequently, with each release, our applications become increasingly user-friendly and meet changing user needs. This has led to increasing user adoption and deployment of our application into multiple organizations across our Government Sponsor's agency.

References

agileSherpa. (2010, August). Daily standup. Retrieved from http://www.agilesherpa.org/agile_coach/iteration/daily_standup/

agileSherpa. (2010, August). Iteration review (aka sprint review). Retrieved from http://www.agilesherpa.org/agile_coach/review/iteration_review/

agileSherpa. (2010, July). Planning meeting. Retrieved from http://www.agilesherpa.org/agile_coach/iteration/planning_meeting/

agileSherpa. (2010, July). Release planning meeting. Retrieved from http://www.agilesherpa.org/agile_coach/release/release_planning_meeting/

agileSherpa. (2010, August). Retrospectives. Retrieved from http://www.agilesherpa.org/agile_coach/review/retrospective/

Department of Homeland Security. (2010, January). Department of Homeland Security acquisition management directive (Directive number 102-01, revision number 01). Retrieved from http://www.dhs.gov/xlibrary/assets/foia/mgmt_directive_102-01_acquisition_management_directive.pdf

Department of Homeland Security Acquisition Program Management Division (AMD) and the Office of Chief Information Officer. (2010, September). Appendix B. DHS systems engineering life cycle (SELC) (Part I Version 2.0). Retrieved from https://acc.dau.mil/adl/en-US/245894/file/53585/Appendix_B_Systems_Engineering_Life_Cycle_SELC__VER_2_0_Release_9-21-10.pdf

Kundra, V. (2010). 25 point implementation plan to reform federal information technology management. Retrieved from http://www.cio.gov/documents/25-point-implementation-plan-to-reform-federal%20it.pdf

Lapham, M., Williams, R., Hammons, C., Burton, D., & Schenker, A. (2010). Considerations for using agile in DoD acquisition (CMU/SEI-2010-TN-002). Retrieved from http://www.sei.cmu.edu/library/abstracts/reports/10tn002.cfm

McKenrick, C. R. (2011, October). Agile project management with formal requirements and test case management. Paper presented at PMI Global Congress Proceedings.

Nee, N. Y. (2011, October). Agile: Still the magic bullet, or do you need a blended solution? PMI 2011 Global Congress Proceedings.

Port, L. (2010, October). Implement “start, stop and continue” meetings to improve your operations. In Project Management Ensuring Smooth Navigation, ILTA White Paper. Retrieved from http://www.jdsupra.com/legalnews/implement-start-stop-and-continue-mee-03635/

Project Management Institute. (2008). Manage stakeholder expectations: Outputs. In A guide to the project management body of knowledge (PMBOK® Guide) (4th ed.). Newtown Square, PA: Project Management Institute.

Project Management Institute. (2008). Organizational process assets. In A guide to the project management body of knowledge (PMBOK® Guide) (4th ed.). Newtown Square, PA: Project Management Institute.

Waters, K. (2007, March 4). Agile principle 2: Agile development teams must be empowered. Retrieved from http://www.allaboutagile.com/agile-principle-2-agile-development-teams-must-be-empowered/

This material has been reproduced with the permission of the copyright owner. Unauthorized reproduction of this material is strictly prohibited. For permission to reproduce this material, please contact PMI or any listed author.

© 2012, Erica M Salinas & Loraine A Boyne
Originally published as a part of 2012 PMI Global Congress Proceedings – Vancouver, Canada
