Intro
Data is the new water for today's organisations. It is essential for their survival and growth, as it nourishes every aspect of their operations and strategy. It needs to flow to all users, enabling them to make informed decisions and move away from intuition and guesswork.
The challenge for many organisations has been in making data analytics accessible to all users.
How do you enable the masses with curated and timely data analytics?
This blog post discusses this very topic and suggests a method for amplifying data analytics in your organisation.
A Plan that Created Noise and Chaos
It started with the Data Analytics vision – using data to drive greater insights, improving business value, making data accessible to everyone, and driving a data-driven culture.
First step – set up a dedicated central specialist team to handle data consolidation, cleansing and enrichment, and to provide the analytics. This team would serve the whole organisation.
Great first step, but soon complaints were surfacing from the business users: SLOW delivery, “unable to help my department”, “when will my work get done?”, “they do not understand what we want”… and the list went on.
So what went wrong? The central team consisted of the best Data Engineers, Modellers and BI Specialists, so why the complaints?
Sound familiar? This is the side effect of fully centralised analytics teams.
The Underlying Challenges of Centralised Teams
Centralised team models in large organisations frequently experience “delays” and misalignment with the business stakeholders. Why?
Central teams have a single or very limited point of entry and output; these soon become bottlenecks, causing large backlogs.
This in turn forces the central team to prioritise work, with some work shifting to the backlog (“delayed”) and some progressing ahead.
At this point, noise from the requesting departments increases. This noise starts a chain reaction: the department leaders with the loudest noise may get their work pushed forward at the expense of other work, which slides further back and causes more noise. The cycle continues.
The Bottleneck is the Central Analytics team.
Two options are available at this point: further scale up the team and retain the same central model, or move to a DECENTRALISED model.
The Decentralised Data Analytics Model
Note: the decentralised model in this post is not the same as Data Mesh.
Rather, it is the first foundational step needed if Data Mesh is to be considered.
Whilst a Central Data Analytics team works very well for smaller organisations, that is not the case for larger organisations.
In larger organisations suffering from centralised team fatigue, a stealth decentralised model starts to take shape. This model has a Central Data Analytics team alongside key departments running their own “lightweight” DA function, in some rare cases with the central team’s guidance and collaboration.
This model scales up rapidly across all areas of the business: every department can self-manage its own work and targets, and each can scale up its lightweight DA team as needed. Lots of flexibility, but this model does have its pitfalls and can be very problematic if no controls are put in place.
The downsides of Decentralised models with no controls
With every department creating its own analytics in a silo, there are multiple drawbacks to this approach.
- Duplication of analytics, datasets and integrations across different DA teams
  - there is zero or limited sharing, so rework is high, raising the chance of discrepancies in the same metrics
- Inconsistent metrics
  - individual DA teams may define a business metric with different logic or rules,
    e.g. total sales from the Marketing DA team may differ from total sales from the Operations DA team (see the sketch after this list)
- Inconsistent attributes & reporting
  - teams may group attributes according to how their department sees them rather than in a manner consistent across the business,
    e.g. one team reports website sales as Digital whilst another shows them as Online
- Lack of Data Governance and Compliance
  - teams may not be fully across how data should be secured, or how to meet minimum regulatory requirements, e.g. GDPR
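To make the inconsistent-metrics problem concrete, here is a minimal sketch, assuming hypothetical order data, column names and business rules, of how two DA teams can publish different numbers under the same metric name:

```python
# A minimal sketch (hypothetical data and rules) of two DA teams computing
# "total sales" from the same orders data with different logic.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount":   [100.0, 250.0, 80.0, 40.0],
    "status":   ["completed", "completed", "refunded", "completed"],
})

# Marketing DA team: counts every order placed, refunds included.
marketing_total_sales = orders["amount"].sum()

# Operations DA team: counts only completed orders, refunds excluded.
operations_total_sales = orders.loc[orders["status"] == "completed", "amount"].sum()

print(marketing_total_sales)   # 470.0
print(operations_total_sales)  # 390.0 -- same metric name, different number
```

Both teams are “right” by their own rules, which is exactly why a report from Marketing and a report from Operations can disagree on total sales.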
These and many other issues can erode trust in Data Analytics and also pose a high security and compliance risk.
With so many challenges in an entirely distributed model, something is needed to connect all the distributed pieces together, to guide them whilst allowing them to plot their own journeys.
An improvement would be to add the central Data Analytics Centre of Excellence (CoE) – “The Lighthouse”.

The Lighthouse Distributed Data Analytics Model
The Lighthouse Analytics model enhances the fully distributed model by repurposing the Central Data Analytics/Business Intelligence team as THE LIGHTHOUSE (the Data Analytics Centre of Excellence, CoE).
In this model the Central DA team’s mandate is to provide overall DA guidance and training, manage Data Governance across all data analytics, and manage/provision ENDORSED reusable data assets.
This central team ensures all Business Teams / Departments using and working with Data are:
- handling data in a safe and compliant manner
- aware of their data responsibilities
- given adequate guidance on what they can and cannot do with data
- trained to utilise and extract value from data
- trained to utilise the Data Platforms to achieve their business objectives
- aware of ENDORSED data assets maintained by the Central DA CoE (a sketch of one such asset follows this list)
- aware of global metrics and their definitions
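As an illustration only, with hypothetical module and function names rather than a prescribed implementation, ENDORSED data assets and global metric definitions could be as simple as a shared module owned by the CoE that departmental DA teams import instead of re-implementing the logic:

```python
# endorsed_metrics.py -- hypothetical shared module owned by the DA CoE.
# Departmental DA teams import these functions rather than re-implementing
# them, so every report uses the same endorsed definitions.
import pandas as pd

def total_sales(orders: pd.DataFrame) -> float:
    """ENDORSED global metric: sum of completed orders, refunds excluded."""
    return orders.loc[orders["status"] == "completed", "amount"].sum()

def sales_channel(raw_channel: str) -> str:
    """ENDORSED grouping: web and app sales both report as "Online"."""
    return "Online" if raw_channel in ("web", "app") else "In-store"
```

Whatever the tooling, the design point is the same: one endorsed definition, owned and versioned centrally, reused everywhere.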

In the next post I will cover the details of the Lighthouse CoE’s core functions and how it engages with other departments and users.