The State of Analytics Governance 2025
Executive Summary
This report contains the findings from Wiiisdom’s survey on The State of Analytics Governance 2025, conducted in 2024. We carried out this research in response to the growing significance of Analytics Governance as a category and the lack of data within this field. The survey was designed to gather feedback from BI professionals and data analytics leaders on their maturity in Analytics Governance, the reality within their organizations, and the trends we expect to see in this category.
The following key Analytics Governance trends emerged from the survey:
- Analytics Governance is a somewhat new and vast category; maturity among organizations is lower than perceived, but its importance is growing, with 75.9% of respondents stating they have implemented governance policies that specifically apply to BI & analytics.
- The Data Governance department seems to be the most appropriate champion for defining Analytics Governance policies as it requires decision-making power and time to be properly implemented. The study shows that other possible roles can also champion the topic.
- Almost two-thirds (66%) of respondents report that identified dashboard errors led to negative consequences for their organization (reputational, financial, etc.).
- BI content testing is the top future investment in Analytics Governance. This is especially true for respondents who stated that they don’t currently implement any form of quality checks to validate BI content for consumption.
Introduction
With innovations like generative AI set to revolutionize the BI & analytics space, a new era of data analytics is emerging. As data can now be transformed into insights in milliseconds, the importance of Analytics Governance has never been higher. Gartner® states that “With growing regulation tied to AI, analytics governance is now required, not a nice-to-have.”1
We understand the frustration caused by inaccurate and poor-performing dashboards and reports, and the issue of trust in analytics: a significant challenge facing organizations of any size, especially in regulated industries such as financial services, life sciences, and pharmaceuticals. All of us have encountered errors in BI content, both noticeable and not, and many of us have experienced fallout from those errors, such as operational outages or reputational damage. Moreover, the adoption rate for BI/analytics tools remains stuck at 20%, which continues to hinder progress. This is a major problem facing organizations today.
This is why we aimed to deliver the first piece of research on this topic: to demonstrate to the market just how big an issue trust in analytics is, how organizations are currently dealing with it, and what their future plans are, and then to apply these lessons to GenAI analytics.
1.Source: Gartner, Hype Cycle™ for Data and Analytics Governance 2024, Guido De Simoni, Andrew White, 18 June 2024. GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and HYPE CYCLE is a registered trademark of Gartner, Inc. and/or its affiliates and are used herein with permission. All rights reserved.
Key Findings
Key findings from the survey on The State of Analytics Governance show the growing importance of Analytics Governance to respondent organizations. However, there remains a gap in ensuring full governance with few organizations establishing automated continuous testing and certification processes for BI artifacts.
20%
of all respondents continuously test once a report or dashboard is in production.
82.8%
of respondents stated that accuracy testing is their number one type of test carried out.
77%
say they encounter issues or inaccuracies in BI content that is already in production at least once or more per month.
95%
of organizations with three or more environments test BI content.
59%
say that BI content testing is the top future investment in Analytics Governance.
66%
say identified dashboard errors lead to negative consequences for the organization.
52.2%
of responding organizations state that they don’t do any proactive monitoring of their BI content.
Study Demographics
We anonymously surveyed BI professionals from the largest organizations in the world, and this report is exclusively based on the completed responses. For a deeper understanding of who participated in the survey, we first gathered information on each respondent’s profession, location, industry, and organization size.
Job Role
Over 80% of respondents are BI Developers or BI & analytics Leaders, roles that predominantly leverage BI and analytics solutions.
Industry
Survey respondents work in various industries, with Financial Services the number one industry and Banking, Insurance, and Technology coming just behind. 33 industries emerged from the results, but the top 12 were the most pertinent2.
2.We decided to remove Freelancers from this analysis.
Location
Most respondents are based in North America (68%) followed by Europe (21%).
Organization size by employees
45% of respondents work in organizations that have between 10,000 and 99,000 employees. Very large organizations of 100,000+ employees also hold a notable share (16%).
Defining Analytics Governance
Understanding the core of Analytics Governance
Analytics Governance is gaining recognition as a distinct category, prompting us to explore its current significance within the industry. The term is an umbrella for different methodologies and sub-categories that all contribute to a better and more trusted data analytics experience. According to Gartner, “analytics governance (including self-service analytics) is the setting and enforcement (with a workflow) of governance policy along the analytics pipeline, from the ETL (start) to the sharing of analytics insight (end).”
As an emerging category and an umbrella term for multiple already-existing fields, we were curious to know what comes to mind when we refer to Analytics Governance. The findings (Figure 1) from our survey align with the contemporary understanding of the term, revealing that respondents associate Analytics Governance with a diverse range of areas.
Figure 1: When you see the term “Analytics Governance”, what does it mean to you?
Permission management and security is the sub-category that comes to mind most often among respondents (71.7%) when thinking about Analytics Governance, which was no surprise: managing who can access what content and what data is critical to safeguarding a BI & analytics platform. It is interesting, on the other hand, that less than half of the respondents answered BI content testing (45.5%). This shows a lack of attention to the topic, considering that quality assurance is one of the most important concerns among BI & analytics leaders today. This study will focus on that domain, aiming to uncover the reasons behind this oversight.
Furthermore, version control came out as the sub-category that comes to mind the least among respondents when thinking about Analytics Governance, even though more than 40% of respondents indicated that a content management process, including version control, has been implemented in their organization (Figure 2). It is crucial for regulatory compliance to ensure that every modification to a dashboard or report is traceable, viewable, and reversible. This process maintains the reliability and accuracy of analytics, underscoring its importance in relation to Analytics Governance.
Figure 2: Have you or has your organization implemented a content management process including version control for BI content?
Analytics Governance exists in practice
Understanding the concept of Analytics Governance is a crucial first step; however, its implementation within organizations is becoming increasingly vital. Survey findings show that 75.9% of respondents’ organizations have implemented governance policies that specifically apply to BI & analytics (Figure 3). This highlights that, despite Analytics Governance being a new category, the methodologies and processes it covers are not new; the fact that they are now being grouped under one category shows its growing importance.
Figure 3: Has your organization implemented governance policies that specifically apply to BI & Analytics?3
3.Due to rounding, percentages may exceed or fall short of 100% by up to 0.1%.
Findings also show that this is true regardless of region, with no noticeable differences between the two main locations of respondents, North America and Europe (Figure 4).
Figure 4: Comparison between organizations in North America and Europe of their implementation of governance policies.
What is the Maturity Level of Analytics Governance in Companies?
Full maturity is yet to come
We asked respondents to rate their maturity in Analytics Governance both at the beginning and end of the survey. This approach aimed to determine if providing deeper insights into their progress would prompt them to reassess the accuracy of their initial maturity assessments. The results (Figure 5) confirm that the maturity in terms of Analytics Governance is lower than perceived. Research carried out by Gartner also states that “governance is still about command and control, and maturity in D&A governance is low. Organizations that mature in D&A governance consistently report higher levels of success. However, only 18% of participating organizations are mature at enterprise scale.4”
4.Gartner, Infographic: Data and Analytics Governance Survey: IT Says Mission Accomplished Business Disagrees, Saul Judah, Andrew White, Anna Tocheva, Stuart Strome, 17 February 2022.
Figure 5: Comparison of respondents’ maturity in Analytics Governance between the start and end of the survey.
Interestingly, despite the awareness of Analytics Governance and its perceived maturity, few organizations govern BI content through continuous testing. While 45% of respondents recognize BI content testing as an Analytics Governance sub-category, only 20% of respondents govern it through continuous testing. Full maturity requires organizations to be at the top of the maturity curve.
Figure 6: The Analytics Governance maturity curve according to Wiiisdom.
Who should lead an Analytics Governance initiative?
With only 20% implementing continuous testing on their BI content, there is a lack of maturity in the field, and work remains for organizations to realize the importance of this sub-category. The first step for organizations is to designate leaders who will establish the different policies and processes around data and analytics.
With this in mind, we wanted to know which department within an organization was currently responsible for defining governance policies because change management is a real challenge. Findings showed that the Data Governance team is the best candidate to lead Analytics Governance initiatives regardless of the respondents’ location (Figure 7). Interestingly, the IT team remains more involved in defining governance policies in Europe than in North America.
Figure 7: Who has been in charge of defining these governance policies?
What is the Current Reality of BI Content Governance?
BI content integrity
Industry analysis states that organizations should establish a certification or watermarking process for BI content accuracy. According to Gartner,
“A common strategy is for professional BI developers to apply watermarks, tags, certification labels, or similar, to content. Certifying content indicates to analytics consumers that the content has been endorsed by professional BI developers, differentiating it from content created by self-service analytics users, and establishing trust.”
Source: Gartner, How to Govern and Scale Existing Self-Service Analytics Initiatives, Georgia O’Callaghan, 7 March, 2024. GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.
At Wiiisdom, we believe data consumers should consume analytics like any other product. Just as food consumers seek healthy and safe goods, data consumers expect accurate and reliable data packaged in the form of a report, dashboard, metric, KPI, etc. Certifying BI artifacts signals quality guarantees, reassuring consumers that the artifacts have been validated. The trust marks, reviews, and product certifications we see in everyday life should have their counterparts in data analytics to provide evidence of quality.
We therefore asked respondents whether their organization has a certification process in place that specifically applies to BI content (Figure 8). Nearly 40% of respondents state that they lack a standardized certification process. Just over 30% claim to certify their BI content manually, and over 50% of these respondents find errors in their dashboards and reports on average once a month. This shows that certifying BI content manually, although a sign of higher maturity, is not enough: the process of validation and certification needs to be automated.
Figure 8: Does your organization have a certification process in place that specifically applies to BI content?
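The validate-then-certify flow described above can be sketched in a few lines. This is a minimal illustration only, not a real BI platform API: the fetch functions, the KPI names, and the "certified" tag are all hypothetical stand-ins for whatever your BI tool and source-of-truth database actually expose.

```python
# Minimal sketch: validate each KPI on a dashboard against its source of
# truth, and only apply a certification label when every check passes.
# All names below are hypothetical illustrations, not a real BI platform API.

TOLERANCE = 0.001  # relative tolerance when comparing dashboard vs. source values

def fetch_dashboard_value(kpi: str) -> float:
    # Stub: in practice this would read the rendered KPI from the BI tool.
    return {"monthly_revenue": 1_204_500.0, "active_users": 8_312.0}[kpi]

def fetch_source_value(kpi: str) -> float:
    # Stub: in practice this would run the source-of-truth SQL query.
    return {"monthly_revenue": 1_204_500.0, "active_users": 8_310.0}[kpi]

def validate_dashboard(kpis: list[str]) -> dict[str, bool]:
    """Compare each KPI shown on the dashboard against its source of truth."""
    results = {}
    for kpi in kpis:
        shown, truth = fetch_dashboard_value(kpi), fetch_source_value(kpi)
        results[kpi] = abs(shown - truth) <= TOLERANCE * abs(truth)
    return results

def certify(results: dict[str, bool]) -> str:
    # Tag the dashboard as certified only when every accuracy check passes.
    return "certified" if all(results.values()) else "validation_failed"

status = certify(validate_dashboard(["monthly_revenue", "active_users"]))
```

Run on a schedule, such a check replaces a manual monthly review with an automated gate: the certification label is earned, and withdrawn, based on evidence.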
BI content inaccuracies
Data quality issues cost the average organization $12.9 million every year, a worrying figure given that data quality is directly linked to decision-making. Data quality encompasses many different elements, and companies have already invested heavily in it because it is a never-ending process. However, just as most of an iceberg lies below the surface, the tip is what navigators see; in the data world, that tip is where data consumers see data presented, enriched, and filtered in a dashboard or a report, and ultimately where decisions are made.
Our survey found that 77% of respondents encounter issues or inaccuracies in BI content that is already in production at least once or more per month (Figure 9). The risks associated with this, whether reputational, financial, regulatory, or operational, only make the importance of Analytics Governance clearer. Unfortunately, many quality issues go unnoticed.
Figure 9: How frequently do you encounter issues or inaccuracies in BI content created for business stakeholders that are already in production?
Furthermore, we then asked respondents if the identified dashboard errors led to any negative consequences for their company, such as internal challenges (like loss of trust or system outages) or external issues (including financial repercussions, damage to reputation, or compliance violations) (Figure 10). 66% of respondents claim these errors led to internal challenges, external issues, or both. Organizations in regulated sectors such as financial services, healthcare, life sciences, and the public sector must accurately report data to regulatory agencies to ensure compliance. Any errors in these reports can lead to significant fines.
Figure 10: Did the identified dashboard errors lead to any negative consequences for your company?
When we compared BI & analytics leaders and non-leaders, 58% of BI leaders said errors led to internal challenges, whereas 36% of non-leaders said there were no consequences. This suggests that non-leaders may not be as deeply engaged with, or affected by, the intricacies of BI & analytics initiatives as their leading counterparts.
How are organizations testing their BI content?
We asked respondents when they test their BI content to better understand how organizations implement Analytics Governance in practice (Figure 11). “We test when content is being deployed” is an unsurprising answer, as it is very common to test content before it is delivered to production; the same applies to “We test during the initial development of dashboards.”
Figure 11: When is BI content tested?
If we compare when respondents test their BI content with the number of business intelligence environments they have, 95% of organizations with three or more environments test BI content. Among organizations with just one environment, however, 40% don’t test at all, 53% test during initial development, and only 7% continuously test once in production. This demonstrates that those with more environments are more likely to test their BI content in some way.
If we take a more global view, only 20% of all respondents continuously test once a report or dashboard is in production. During development, however, the majority of respondents spend at least 10% of their time on testing (Figure 12). This shows that while many respondents recognize the importance of testing during the development phase, they often neglect continuous testing once the reports or dashboards are in production. The gap suggests that despite initial efforts, there is a lack of ongoing quality assurance, which is crucial for maintaining robust and reliable data analytics over time.
Figure 12: For each dashboard, how much time do you devote to testing during development?
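Continuous testing of content already in production can be as simple as a scheduled job that re-runs a report’s key figures and compares them against a known-good baseline. The sketch below assumes hypothetical metric names, a stub snapshot function, and a 5% drift threshold; a real implementation would query the BI platform and run from a scheduler.

```python
# Minimal sketch of continuous testing in production: compare a fresh
# snapshot of a report's key figures to the last known-good baseline and
# flag any metric that drifted beyond a threshold. The baseline values and
# the snapshot function are hypothetical stand-ins for a real BI platform.
import json

BASELINE = {"total_orders": 15_420, "avg_basket": 54.3}

def take_snapshot() -> dict:
    # Stub: in production this would re-execute the report and extract figures.
    return {"total_orders": 15_420, "avg_basket": 48.7}

def detect_regressions(baseline: dict, snapshot: dict,
                       max_drift: float = 0.05) -> list[str]:
    """Return the metrics whose value drifted more than max_drift from baseline."""
    flagged = []
    for metric, expected in baseline.items():
        actual = snapshot.get(metric)
        if actual is None or abs(actual - expected) > max_drift * abs(expected):
            flagged.append(metric)
    return flagged

# In practice this would run from a scheduler (e.g. nightly); here it runs once.
alerts = detect_regressions(BASELINE, take_snapshot())
if alerts:
    print(json.dumps({"status": "alert", "metrics": alerts}))
```

The same check that gates a deployment can thus keep running after it, closing the gap between development-time testing and production monitoring.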
Types of BI content testing
When it comes to testing BI artifacts, multiple types of tests exist because of the number of issues that can arise. Accuracy testing was the number one type of test among survey respondents, which isn’t surprising given it is the test that ensures the data displayed in dashboards and reports is accurate (Figure 13). Other types of tests, including performance, user experience, and user permission testing, follow closely behind, demonstrating the importance of different tests for a trusted, end-to-end analytics experience.
Figure 13: Types of BI content tests organizations are currently running.
BI content monitoring
Respondents were asked how they track and evaluate the performance and accuracy of BI content after it has been delivered to the production environment (Figure 14). Over 50% of respondents state that they don’t do any proactive monitoring, which confirms that very few actually test in production.
Figure 14: How do you track and evaluate the performance and accuracy of BI content after it has been implemented in the production environment?
This study shows the growing importance of Analytics Governance as a category and the different themes that fall within it. It demonstrates the potential consequences of having quality issues in the data analytics layer and the current level of maturity among organizations. Now, we’ll look at the future trends of this rising category.
What is the Future of Analytics Governance?
As the Analytics Governance market continues to evolve, we wanted to understand and depict future trends. We therefore asked respondents where they expect investment in this category to increase (Figure 15).
BI content testing emerged as the number one future investment in Analytics Governance. This is an encouraging statistic to see because of the huge importance of quality for trusted data analytics. Testing is at the heart of Analytics Governance to ensure fewer errors, faster adoption, and more confidence in decision-making.
Given the high number of answers for each category, these findings also highlight that all sub-categories of Analytics Governance are growing in importance, showing that the category as a whole is gaining significance.
Figure 15: The predictions for areas of investment of Analytics Governance.
If we then take a closer look at the organizations that are not controlling the quality of their BI assets, it is compelling to see that investing in BI content testing is a top priority for the future (Figure 16). The validation of BI content is a critical part of Analytics Governance, ensuring dashboards and reports are accurate, up to date, and ready to use.
Figure 16: Predictions for areas of investment of Analytics Governance among organizations who don’t currently test BI content.
Of course, these trends will continue to evolve because innovations like Generative AI are reshaping the data analytics market. More and more users can create insights and metrics inside BI platforms with just a few clicks. If your BI & analytics landscape already resembles the Wild West due to a lack of governance, then as it becomes even more inundated with GenAI insights, ensuring accuracy will be crucial for maintaining trust and compliance, further emphasizing the need for governance.
Conclusion
Wiiisdom surveyed BI & analytics professionals to gain insights about the current state of Analytics Governance within their organizations. Survey results demonstrated that, although it is a new category, organizations recognize its importance for a better data analytics experience and are actively implementing governance policies that specifically apply to BI & analytics. However, there remains a gap in ensuring full governance in the last mile of the data journey, with few organizations establishing automated continuous validation and certification processes for BI artifacts. As the Analytics Governance category continues to mature, it will be interesting to see where organizations put their focus and the processes they put in place.