Scrum & BI: Continuously Validate the Quality of the Data

Taking the Stacey Matrix into account, Business Intelligence (BI) projects can be seen as complex: the amount of unknown requirements and technologies exceeds the known. BI projects are complex due to fast-changing information needs and priorities, the existence of many users and customers, the availability and quality of data, the different systems from which source data must be extracted, and continuously changing technologies. Implementing Scrum within a BI environment is therefore challenging.

In a series of blog posts we’ll share the 9 biggest lessons we’ve learned using Scrum within a BI environment, focusing on individuals, collaboration, and the process.

From now on, we’ll share one lesson every week. We appreciate your feedback; we’ll incorporate it and combine all the lessons learned into a whitepaper.

We’ve deliberately chosen to write short blog posts with only one lesson learned per article. This gives us the opportunity to “release early and often” and improve the quality of the final whitepaper.

Our Lessons Learned

  1. Continuously validate the quality of the data
  2. Stop analyzing, start visualizing
  3. The Product Owner breathes Business Intelligence
  4. Invest in test automation
  5. Build the ideal Business Intelligence team
  6. Believe in the servant architect
  7. Incorporate DevOps practices
  8. Arrange a tailor-made Sprint Review
  9. Create actionable insights 

Lesson 1: Continuously Validate the Quality of the Data

When starting a new BI project, the team should aim to deliver a valuable increment from the very first Sprint. This might be a very small feature that tests whether the plan the team drafted during the first Sprint Planning holds up. Although this is difficult to achieve in the first Sprint, it encourages a mindset of releasing early and often.

Alongside realizing the first (small) features, the team should focus from the start on gathering knowledge about the availability of high-quality data and about performance. BI is all about data: it is gathered to create valuable, actionable business insights. Validating the quality of the data and the performance should be done continuously. The Development Team should collaborate with the Product Owner to determine what ‘good enough’ means exactly. Is the data complete, accurate, available, and up to date? Capturing these quality and performance standards in the Definition of Done is good practice.
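The completeness, accuracy, and freshness questions above can be turned into automated checks that run every Sprint. The following is a minimal sketch, not the authors’ implementation; the field names, the amount range, and the 24-hour freshness window are hypothetical stand-ins for whatever standards the team captures in its Definition of Done.

```python
from datetime import datetime, timedelta, timezone

def validate_quality(rows, required_fields, max_age):
    """Run simple completeness, accuracy, and freshness checks on extracted rows."""
    now = datetime.now(timezone.utc)
    return {
        # Completeness: every required field is present and non-null.
        "complete": all(row.get(f) is not None
                        for row in rows for f in required_fields),
        # Accuracy: amounts must fall within a plausible range (hypothetical rule).
        "accurate": all(0 <= row["amount"] <= 1_000_000
                        for row in rows if row["amount"] is not None),
        # Freshness: the newest record must be recent enough.
        "up_to_date": (now - max(row["loaded_at"] for row in rows)) <= max_age,
    }

# Sample extract: the second row misses an amount, so completeness fails.
now = datetime.now(timezone.utc)
rows = [
    {"customer_id": 1, "amount": 120.0, "loaded_at": now},
    {"customer_id": 2, "amount": None,  "loaded_at": now},
    {"customer_id": 3, "amount": 75.5,  "loaded_at": now},
]

checks = validate_quality(rows, ["customer_id", "amount"], timedelta(hours=24))
print(checks)
```

Wiring checks like these into the delivery pipeline makes “the data meets our quality standards” a verifiable part of the Definition of Done rather than a manual judgment call.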

The goal of BI is to create insights. These insights become truly valuable when they are actionable and are followed up on by customers and stakeholders. Trust in the quality and reliability of the data is key here. To build a solid foundation of trust, the BI department needs comprehensive data quality management that shows the status of data quality across all stages, from the source systems to the use of reports.
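One simple form of “quality status across all stages” is record-count reconciliation between pipeline layers. This is a minimal sketch with hypothetical stage names and counts; a real implementation would query each layer’s metadata rather than hard-code numbers.

```python
# Hypothetical record counts per pipeline stage (source system -> staging -> report layer).
stage_counts = {"source": 10_000, "staging": 9_950, "report": 9_950}

def reconcile(counts):
    """Report record loss between consecutive pipeline stages."""
    stages = list(counts)
    return {
        f"{a}->{b}": counts[a] - counts[b]
        for a, b in zip(stages, stages[1:])
    }

losses = reconcile(stage_counts)
print(losses)  # {'source->staging': 50, 'staging->report': 0}
```

Publishing such a reconciliation per load makes data loss visible where it happens, which supports the shared understanding of quality issues and their sources.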

A continuous focus on data quality and performance is therefore a sensible investment. Without it, BI implementations will keep struggling with business confidence in data quality and with a shared understanding within the organization of data quality issues and their sources.

We hope sharing this lesson was useful for you; next week we’ll discuss why you should stop analyzing and start visualizing.


Barry Overeem – Agile Coach at Prowareness
Sander van Schaik – BI Delivery Manager at PGGM