I’ve seen many organizations run into difficulty defining governance and best practices for managing the lifecycle of BI dashboards. Perhaps you have deployed a self-service BI program – data scientists have access to tools, IT has deployed servers to publish completed dashboards, and employees are eager to leverage visualizations and become more data driven. Job done, right?
This is common, and it occurs because organizations fail to put in place some basic governance practices around data, reports, dashboards, visualizations, and analytics.
I stress the word basic – not overbearing – but recognize that a self-service BI program without basic data governance may provide value, yet it will slow down before achieving its full potential.
Best Practices in Self-Service BI Programs
So let me suggest some starting points.
- Define a lifecycle – I strongly suggest organizations manage dashboards like applications. They are developed, they need to be tested, there needs to be some documentation, users need to be trained, they need to be published, feedback from users needs to be gathered, and enhancements should be prioritized. This doesn’t need to be as onerous as it sounds, but having these disciplines ensures that the dashboards developed provide enough value to be worth the effort. Testing, documentation, and training ensure that dashboards are leveraged by a wider audience and hopefully prevent the duplication of effort in creating similar or derivative dashboards.
- Data scientists need a central place to store source files: spreadsheets, Tableau workbooks, Qlik applications, etc. Ideally these should be stored in a repository where files can be versioned and tagged.
- Servers should be configured to provide the equivalent of dev, test, and production environments.
- IT should consult with data scientists on tools, templates, and where documentation will be stored.
- Documentation should focus on data flows, data definitions, calculations, aggregations, and known data quality issues. Data scientists joining the organization need a strong understanding of their starting points, while dashboard consumers need to be able to use the dashboards and interpret the results. Organizations need to define documentation standards for these audiences and decide when in the lifecycle documentation should be updated.
- Define style guides covering layouts, control choices, common components, visualization types, color palettes, and other design considerations so that there is consistency between dashboards developed by different data scientists. A sketch of what a shared style guide might look like follows this list.
- Testing should focus on insights, calculations, and usability. Do the results look reasonable? Are calculations producing expected values? Are the dashboards usable and intuitive, or complex and clumsy? The lifecycle should allow for iterative review and feedback. Dare I say agile data practices? A sketch of simple calculation tests also follows this list.
- Governance practices should target reviews of the quantity and disparity of dashboards developed. This is really important in order to avoid dashboard landfills: if everyone develops their own dashboard versions that largely perform the same analytics, the sheer quantity will make it more difficult for users to navigate and may produce conflicting results. It will also add complexity when data models or software systems need upgrading.
- Measure usage and impact, because data visualizations and dashboards that aren’t being used should be phased out. The last sketch below shows one simple way to flag them.
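To make the style guide idea concrete, here is a minimal sketch of a shared Python module that data scientists could import or reference when building dashboards. Every name and value in it is an assumption – your palette, layout rules, and chart preferences will differ – but keeping them in one agreed-upon place is the point.

```python
# style_guide.py - a hypothetical shared style guide module.
# All names and values are placeholders; adapt them to your BI tool
# (Tableau, Qlik, Power BI, etc.) and your organization's branding.

# Approved categorical color palette, in display order.
CATEGORICAL_PALETTE = [
    "#1f77b4",  # primary
    "#ff7f0e",  # secondary
    "#2ca02c",  # positive
    "#d62728",  # negative
    "#7f7f7f",  # neutral
]

# Standard layout conventions every dashboard should follow.
LAYOUT = {
    "title_position": "top-left",
    "filter_panel": "left",
    "max_visuals_per_view": 6,
    "font_family": "Arial",
    "font_size_body": 11,
}

# Preferred chart types by the question being answered.
CHART_CHOICES = {
    "trend_over_time": "line",
    "part_to_whole": "stacked bar",
    "ranking": "horizontal bar",
    "distribution": "histogram",
}
```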
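For testing calculations, even a couple of automated checks go a long way. The sketch below assumes the dashboard’s extract and the source system’s export are both available as CSV files; the file names, column names, and tolerance are hypothetical, and the checks would run with a standard test runner such as pytest.

```python
# test_revenue_dashboard.py - a minimal sketch of calculation tests.
# File and column names are hypothetical examples.
import csv

def total(path, column):
    """Sum a numeric column from a CSV export."""
    with open(path, newline="") as f:
        return sum(float(row[column]) for row in csv.DictReader(f))

def test_revenue_total_matches_source():
    # The figure shown on the dashboard should reconcile with the
    # source system within a small rounding tolerance.
    dashboard_total = total("dashboard_extract.csv", "revenue")
    source_total = total("source_system_export.csv", "revenue")
    assert abs(dashboard_total - source_total) < 0.01

def test_no_negative_order_counts():
    # A basic reasonableness check: order counts can never be negative.
    with open("dashboard_extract.csv", newline="") as f:
        assert all(int(row["order_count"]) >= 0 for row in csv.DictReader(f))
```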
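And for measuring usage, most BI servers can export view events. Assuming such an export exists as a CSV with hypothetical "dashboard" and "view_date" columns, a short script can surface candidates for retirement; the 90-day threshold is a policy choice, not a hard rule.

```python
# flag_stale_dashboards.py - a minimal sketch of a usage review,
# assuming a usage_log.csv export with "dashboard" and "view_date"
# columns (both names are hypothetical).
import csv
from datetime import date, timedelta

STALE_AFTER_DAYS = 90  # organizational policy choice

def last_view_dates(path):
    """Return the most recent view date seen for each dashboard."""
    latest = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            viewed = date.fromisoformat(row["view_date"])
            name = row["dashboard"]
            if name not in latest or viewed > latest[name]:
                latest[name] = viewed
    return latest

if __name__ == "__main__":
    cutoff = date.today() - timedelta(days=STALE_AFTER_DAYS)
    for dashboard, last_seen in sorted(last_view_dates("usage_log.csv").items()):
        if last_seen < cutoff:
            print(f"Candidate for retirement: {dashboard} (last viewed {last_seen})")
```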
Evolve the Practice
Not all of this needs to be defined up front. Define these and other practices as needed.