The challenge
A government-funded national education programme was running a large-scale randomised controlled trial (RCT) to measure the impact of targeted interventions. The trial generated complex statistical outputs: treatment-versus-control comparisons, effect sizes, confidence intervals, regional breakdowns, and longitudinal progress data across multiple cohorts.
The problem wasn't the data - it was the audience. The findings needed to reach programme managers, local authority leads, and policy-makers who understood education but not statistical methodology. Traditional academic reporting formats (dense tables, technical language, PDF reports) weren't getting the message across. Key stakeholders were either not engaging with the results or misinterpreting them.
The commissioning body needed the data to drive decisions - which regions to invest in, which interventions were working, where to redirect resources. That required a reporting tool that made complex methodology accessible without dumbing it down.
The solution
Kicktag worked directly with the research team in a co-creation process - understanding the RCT methodology, mapping the data structures, and designing visualisations that preserved statistical rigour while being immediately interpretable by non-technical users.
The platform translates treatment effects into clear visual comparisons, shows confidence intervals as intuitive range indicators rather than abstract numbers, and allows users to drill into regional and cohort-level detail. Geographic mapping shows where interventions are having the strongest impact, and timeline views track progress across programme phases.
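To make that concrete, the "range indicator" view is built on a standard statistic: the difference in mean outcomes between treatment and control groups, bracketed by a confidence interval. The sketch below is purely illustrative, using made-up scores and plain Python; it is not Kicktag's implementation, just the kind of calculation such a visual summarises.

```python
# Illustrative only: a difference-in-means treatment effect with an
# approximate 95% confidence interval, the quantity a range indicator
# visualises. Hypothetical scores, not data from the actual trial.
import math
import statistics

def effect_with_ci(treatment, control, z=1.96):
    """Return (effect, lower bound, upper bound) for the mean difference."""
    effect = statistics.mean(treatment) - statistics.mean(control)
    # Standard error of the difference between two independent means.
    se = math.sqrt(
        statistics.variance(treatment) / len(treatment)
        + statistics.variance(control) / len(control)
    )
    return effect, effect - z * se, effect + z * se

treatment = [72, 75, 71, 78, 74, 77, 73, 76]  # hypothetical scores
control = [70, 69, 72, 68, 71, 70, 69, 71]

effect, low, high = effect_with_ci(treatment, control)
print(f"Estimated effect: {effect:+.1f} points (95% CI {low:+.1f} to {high:+.1f})")
```

Presented as a shaded range around a point estimate, a result like this reads as "somewhere between these two values, most likely here", which is far easier for a non-statistical audience to act on than a table of standard errors.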
What "co-creation" means
This wasn't a standard brief-and-build project. Kicktag embedded with the research team over several weeks: understanding the RCT design, debating how to represent effect sizes visually, testing prototype visualisations with actual stakeholders, and iterating until the outputs genuinely served the decision-making process. The result is a tool shaped by the people who use it, not just the people who commissioned it.
Every design decision balanced two competing requirements: statistical accuracy (the researchers needed to trust it) and practical accessibility (the policy-makers needed to act on it). Where standard dashboard patterns didn't fit, Kicktag built custom visualisation components specifically for this project.
The results
The reporting tool became the primary way programme stakeholders engaged with the trial results. Regional leads could see at a glance where interventions were delivering measurable impact. Programme managers used the geographic and cohort views to make resourcing decisions that had previously relied on spreadsheet analysis and committee interpretation of dense reports.
Critically, the research team endorsed the platform - the visualisations accurately represented the underlying statistics, which meant findings could be communicated with confidence rather than hedged with caveats.
Why this matters for agencies
Not every dashboard project is a brand tracker or customer satisfaction survey. Sometimes your clients need something genuinely bespoke - a reporting tool that doesn't exist yet because the data and the audience are unique.
This project shows Kicktag's ability to go beyond standard market research (MR) dashboard templates and co-create something purpose-built for a specific problem. The same approach applies to any agency facing a complex data visualisation challenge: programme evaluation, social research, clinical trials, longitudinal studies, or any project where the methodology is too specialised for off-the-shelf tools.
If your client has asked for something and your first thought is "we'd need to build that from scratch," talk to us. That's exactly the kind of problem we solve.