Disciplined Agile

Data Management Workflow – Internal

Now let’s drill down and see what the workflow for a Disciplined Agile® (DA™) approach to data management looks like. First, notice how all of the activities depicted in Figure 1 are collaborative in nature; this is shown by the additional roles beside, or interacting with, the activities. Second, how you address these activities will vary depending on the situation that you face. Our aim here is to explore a baseline from which you can start, but you’ll need to tailor it to address your actual situation.


Figure 1. Internal workflow for Data Management.

Let’s work through each activity one at a time:

  1. Evolve organizational data artifacts. Organizational data management artifacts may include data models, including but not limited to a high-level conceptual model for your enterprise (typically a view within your enterprise architecture); metadata describing common concepts, entity types, and data elements within your organization; master data for critical entity types; and master test data to support database testing across multiple delivery teams. Data managers will work closely with product managers to understand their overall vision for their products and for the organization as a whole, ensuring that the data management strategy aligns with your product roadmaps. Data managers will also work closely with enterprise architects to ensure that data concerns are addressed appropriately in your organization’s architecture and that your data management strategy aligns with your technology roadmap. These collaborations are usually accomplished through working sessions, typically called on an as-needed, impromptu basis. A sample metadata entry for a shared data element is sketched after this list.
  2. Enable delivery teams. Data managers work closely with delivery teams to train, educate, and coach them in data skills. The overall strategy is to enable delivery teams to be as self-sustaining as possible when it comes to data-related activities, offloading as much of the grunt work as possible so that teams become more responsive and data managers can focus on value-added activities such as evolving organizational data artifacts and guidance. The implication is that data managers will need to develop and maintain a training program around fundamental data skills such as data modeling, database design, and data security (computer-based training often proves sufficient for this). They will also need coaching skills so that they can work side by side with delivery teams and help them learn these critical skills.
  3. Support delivery teams. Delivery teams will need help from time to time to address hard database design problems, to gain access to and understand legacy data sources, and to obtain and/or generate test data (a test-data generation sketch follows this list). The DA strategy is for data managers to work collaboratively with the delivery teams, getting directly involved in the actual work and transferring skills as they do so. In a pragmatic take on the sage advice about teaching a person to fish, the goal should be to teach the delivery team how to fish while providing enough fish to sustain them until they become self-sufficient.
  4. Evolve and support data guidance. Delivery teams should follow your organizational conventions around data (and around security, user experience, and so on). The data management team is the source of this data guidance, which should address fundamental issues such as data naming conventions, data security conventions, and your data architecture and design patterns. This guidance should be developed and evolved collaboratively with the delivery teams themselves to ensure that it is understandable, pragmatic, and accepted by the teams. A simple automated check of naming conventions is sketched after this list.
  5. Support and monitor operations. Data managers will work closely with operations managers to monitor your existing production data sources (operations managers monitor far more than just data sources, of course). Ideally this monitoring is fully automated, with dashboard technology used to render critical operational intelligence in real time; a minimal monitoring check is sketched after this list. Note that operational database administration activities are addressed by the IT operations process blade.
  6. Improve data quality. Data managers will guide and collaborate in the data quality improvement efforts of your database administrators (DBAs) and operations engineers as well as your delivery teams, as depicted in Figure 2 below. They will monitor your automated database regression testing (ideally run continuously) and your ongoing data source evolution, implemented as database refactorings that occur on a daily basis. Your data managers will oversee the long-term aspects of database refactoring, in particular the retirement of deprecated database schemas and of the scaffolding required during each refactoring’s deprecation period; a sketch of such a refactoring appears at the end of this section.
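
To make activity 1 a little more concrete, the following Python sketch shows what a metadata catalog entry for a shared data element might look like. The `DataElement` class, its fields, and the example values are hypothetical illustrations, not a format prescribed by Disciplined Agile.

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """Hypothetical metadata entry for a data element shared across delivery teams."""
    name: str                      # canonical, organization-wide name
    logical_type: str              # e.g. "string", "decimal(10,2)", "date"
    definition: str                # business definition agreed with stakeholders
    owning_domain: str             # business area accountable for this element
    sensitivity: str = "internal"  # e.g. "public", "internal", "confidential"
    synonyms: list[str] = field(default_factory=list)  # legacy or team-local names

# An entry a data manager might publish for delivery teams to reuse.
customer_id = DataElement(
    name="customer_id",
    logical_type="string",
    definition="Unique identifier assigned to a customer at first contact.",
    owning_domain="Customer",
    sensitivity="confidential",
    synonyms=["cust_no", "client_id"],
)
```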
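
Activity 3 mentions obtaining and/or generating test data. Below is a minimal sketch, using only Python's standard library, of generating synthetic customer records that delivery teams could use for database testing; the schema, file name, and value ranges are assumptions made for illustration.

```python
import csv
import random
import string
from datetime import date, timedelta

def random_customer(customer_id: int) -> dict:
    """Generate one synthetic customer row; every value is fabricated."""
    name = random.choice(string.ascii_uppercase) + \
           "".join(random.choices(string.ascii_lowercase, k=6))
    signup = date(2020, 1, 1) + timedelta(days=random.randint(0, 1500))
    return {
        "customer_id": customer_id,
        "name": name,
        "signup_date": signup.isoformat(),
        "credit_limit": round(random.uniform(500, 10000), 2),
    }

def write_test_data(path: str, rows: int = 100) -> None:
    """Write a small CSV of synthetic customers for delivery-team testing."""
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(
            handle, fieldnames=["customer_id", "name", "signup_date", "credit_limit"]
        )
        writer.writeheader()
        for i in range(1, rows + 1):
            writer.writerow(random_customer(i))

if __name__ == "__main__":
    write_test_data("test_customers.csv")
```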
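
Data guidance such as the naming conventions mentioned in activity 4 is far easier to adopt when compliance can be checked automatically. The sketch below assumes a hypothetical convention (lower_snake_case names with no "tbl_" or "col_" prefixes); your organization's actual guidance will differ.

```python
import re

# Hypothetical convention: lower_snake_case, no legacy "tbl_"/"col_" prefixes.
NAME_PATTERN = re.compile(r"^[a-z][a-z0-9_]*$")
FORBIDDEN_PREFIXES = ("tbl_", "col_")

def check_name(name: str) -> list[str]:
    """Return a list of convention violations for a proposed table or column name."""
    problems = []
    if not NAME_PATTERN.match(name):
        problems.append(f"'{name}' is not lower_snake_case")
    if name.startswith(FORBIDDEN_PREFIXES):
        problems.append(f"'{name}' uses a forbidden prefix")
    return problems

# Example: lint a handful of proposed names before they reach schema review.
for candidate in ["customer_id", "tbl_orders", "CustomerName"]:
    for issue in check_name(candidate):
        print("guidance violation:", issue)
```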
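
For activity 5, automated monitoring of production data sources can start as simply as scheduled freshness and volume checks whose results feed a dashboard or alert channel. The sketch below is illustrative only: `fetch_metrics` is a stand-in for whatever catalog or query layer your organization actually exposes, and the thresholds are assumed values.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class SourceMetrics:
    name: str
    row_count: int
    last_loaded: datetime

def fetch_metrics(source_name: str) -> SourceMetrics:
    """Placeholder: in practice this would query the data source or its catalog."""
    return SourceMetrics(
        name=source_name,
        row_count=1_250_000,
        last_loaded=datetime.now(timezone.utc) - timedelta(hours=3),
    )

def evaluate(metrics: SourceMetrics, max_staleness_hours: int = 24,
             min_rows: int = 1) -> list[str]:
    """Return human-readable alerts suitable for a dashboard or chat channel."""
    alerts = []
    age = datetime.now(timezone.utc) - metrics.last_loaded
    if age > timedelta(hours=max_staleness_hours):
        alerts.append(f"{metrics.name}: data is {age} old")
    if metrics.row_count < min_rows:
        alerts.append(f"{metrics.name}: unexpectedly few rows ({metrics.row_count})")
    return alerts

for source in ["customer_db", "orders_warehouse"]:
    for alert in evaluate(fetch_metrics(source)):
        print("ALERT:", alert)
```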

Figure 2. A collaborative approach to data quality.

Let’s examine Figure 2 in a bit more detail. Common data quality activities are indicated towards the top (the blue bubbles). Immediately below each activity are the primary role(s) responsible for it – notice how, in an agile environment, data quality is so important that it isn’t left solely to people in data roles. Below the primary roles, in some cases, we indicate secondary roles that may assist with, or support, the activity.
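
To illustrate the database refactoring and scaffolding ideas from activity 6, here is a hypothetical "rename column" refactoring expressed as Python migration steps with embedded SQL. During the deprecation period both the old and new columns exist and a trigger (the scaffolding) keeps them synchronized; once every application has migrated, the old column and the trigger are retired. The table, column, and function names are invented for the example, and the SQL assumes a PostgreSQL-style dialect and a psycopg2-style connection.

```python
# Hypothetical two-step refactoring: rename customer.fname to customer.first_name.

EXPAND = """
ALTER TABLE customer ADD COLUMN first_name VARCHAR(100);
UPDATE customer SET first_name = fname;

-- Scaffolding: keep old and new columns in sync during the deprecation period.
CREATE OR REPLACE FUNCTION sync_customer_name() RETURNS trigger AS $$
BEGIN
    IF TG_OP = 'INSERT' THEN
        NEW.first_name := COALESCE(NEW.first_name, NEW.fname);
        NEW.fname      := COALESCE(NEW.fname, NEW.first_name);
    ELSIF NEW.fname IS DISTINCT FROM OLD.fname THEN
        NEW.first_name := NEW.fname;       -- a legacy writer changed the old column
    ELSIF NEW.first_name IS DISTINCT FROM OLD.first_name THEN
        NEW.fname := NEW.first_name;       -- a migrated writer changed the new column
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER customer_name_sync
    BEFORE INSERT OR UPDATE ON customer
    FOR EACH ROW EXECUTE FUNCTION sync_customer_name();
"""

CONTRACT = """
-- Run only after the announced deprecation period, once nothing reads or writes fname.
DROP TRIGGER customer_name_sync ON customer;
DROP FUNCTION sync_customer_name();
ALTER TABLE customer DROP COLUMN fname;
"""

def apply_step(connection, sql: str) -> None:
    """Apply one migration step using a DB-API connection (psycopg2-style assumed)."""
    with connection.cursor() as cursor:
        cursor.execute(sql)
    connection.commit()
```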