DCMN · Dashboard Design · Data Visualization
Eliminating 400 Hours of Monthly Reporting Waste
400 hours of manual Excel reporting every month — error-prone, inconsistent, and consuming the strategists most capable of turning data into insight.
Six months of facilitated workshops and user research standardized 50 client accounts onto a single live platform and reclaimed 400 hours of monthly strategist capacity from manual Excel work.
400 hrs/month eliminated. 40 adopters across 5 teams. One standardized report, company-wide. Excel templates are gone.
30+ Hours
per strategist, per month. No longer wasted.
At the start of each month, I had five reports due. Each one could take a full day.
Role
Lead Designer
Timeline
6 Months
Impact
400 hrs/mo
400 hours of monthly strategist capacity reclaimed. I orchestrated the end-to-end redesign — facilitated discovery workshops, user research, and a standardized live dashboard — that unified 50 client accounts onto a single reporting platform and achieved institutional alignment company-wide. Adopted by 40 individuals across 5 internal teams. The Excel templates are gone.

01 — Context & Constraints
The organization
DCMN is a performance marketing agency specializing in offline channels — TV, audio, out-of-home — for growth-stage companies. The business model depends on demonstrating measurable channel performance to clients every month. The reporting process was the primary mechanism for that demonstration. It was broken.
The target user
Janine — a senior marketing strategist and our primary target user — put it plainly: a single client report could consume a full working day, and at the start of each month, she had up to five clients expecting reports simultaneously. This was not an edge case. It was the standard experience for every account manager on the offline marketing teams.
The real constraints
8 hours per report, 50 clients
Each monthly client report required up to 8 hours of manual Excel work — data entry, formatting, chart creation, and export. With 50 active clients, that was 400 hours of error-prone effort every month before a single insight reached a client.
Inconsistent templates across the company
There was no single report standard. Each account manager maintained their own Excel file with their own structure, meaning clients received different-looking reports depending on who handled their account — a quality and trust problem, not just a time problem.
No live data connection
All data was pulled manually from ad platforms and pasted into spreadsheets. There was no API integration, no version control, and no audit trail. A single copy-paste error could silently corrupt a client's performance numbers.
Reporting consumed strategist capacity
The people most capable of turning data into insight — the marketing strategists — were spending the bulk of their monthly cycle doing data entry. The reporting process was cannibalizing the work the reports were supposed to enable.
02 — The Efficiency Map
From Excel Slog to Live Dashboard
Every step in the old process represented time taken away from the work that matters. The left column shows what the team was losing every month; the right column shows what replaced it.
Before — Excel Process
Time lost per client
8+ hours
× 50 clients = 400 hrs/mo
After — Live Dashboard
Time per client
~2 minutes
30+ hours reclaimed per strategist, per month
03 — Discovery & Research
Weekly workshops as the engine
I facilitated weekly workshops via Miro with the product team and internal users throughout the engagement. These weren't status meetings — they were working sessions where we audited the current reports together, surfaced disagreements about what data mattered, and negotiated the information architecture of the new system in real time.
Auditing what existed — before designing anything new
Before committing to a direction, I audited the existing reports and catalogued the most common data points across all client accounts. The audit revealed that roughly 70% of the content was consistent — the inconsistency was in structure, not substance. That meant standardization could build directly on what already existed: the shared data model could be implemented before the UI was designed.


The previous process (left) and workshop artifacts (right)
04 — Stakeholder Management
The Hard Part
The resistance, the prototype that missed, and the political KPI negotiations that shaped the final system.
The resistance
Our Head of Marketing championed the new tool, but some team members quietly ran the old Excel process in parallel — comparing numbers until they fully trusted the data. Rather than forcing a hard cutover, we let the dashboard prove itself. That dual-use period became an unexpected validation phase. When the numbers matched consistently over three cycles, adoption accelerated on its own.
The prototype that looked "too familiar"
My first prototype validated splitting reports into views by funnel stage. But users flagged immediately that it looked and felt too much like the old Excel report — same visual language, same cognitive pattern. The feedback was fair: we had redesigned the data model but hadn't redesigned the experience. The second round addressed layout, color system, and navigation before it went to client testing.
The KPI negotiation
Every team had a different opinion on which metrics belonged on each view. The workshop process was essential here — not to reach consensus by committee, but to surface the underlying disagreement about what "performance" actually meant across channels. Once we had that conversation explicitly, the KPI prioritization became a product decision rather than a political one.
05 — The Dashboard System
The final dashboard split reporting into three distinct views aligned to how strategists actually communicate performance — not how the data happened to be structured in the source systems.
Funnel Overview
5-step marketing funnel, budget breakdown by day, key KPI metrics benchmarked against the previous month and the same month of the previous year, adjustable reporting period and campaign filter.
Channel Performance
Toggle between KPIs via dropdown, clear visual distinction between channel types (Awareness, Volume, Frequency), tooltip showing exact values on hover.
Daypart & Timing
Toggle between daypart and weekday breakdowns, timeline view showing how budget and conversions fluctuated throughout the day, splits for key KPIs by time segment.
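To make "one standardized report" concrete, the sketch below models the three views as a single shared report configuration. It is illustrative only, written for this case study: the type names, KPI keys, and filter options are assumptions, not DCMN's actual schema or implementation.

// Illustrative sketch: one standardized report configuration shared by every
// client account. All names below are assumptions for this write-up.

type Benchmark = "previous_month" | "same_month_previous_year";

interface ReportFilters {
  reportingPeriod: { start: string; end: string }; // adjustable reporting period
  campaigns: string[];                             // campaign filter
}

interface FunnelOverviewView {
  kind: "funnel_overview";
  funnelSteps: [string, string, string, string, string]; // 5-step marketing funnel
  budgetByDay: boolean;                                   // budget breakdown by day
  kpiBenchmarks: Benchmark[];                             // vs. previous month and same month last year
}

interface ChannelPerformanceView {
  kind: "channel_performance";
  channelTypes: ("awareness" | "volume" | "frequency")[]; // visually distinct channel groups
  kpiOptions: string[];                                    // KPIs toggled via dropdown
  hoverTooltips: boolean;                                  // exact values on hover
}

interface DaypartTimingView {
  kind: "daypart_timing";
  breakdowns: ("daypart" | "weekday")[]; // toggle between the two splits
  timeline: boolean;                     // budget and conversions across the day
  kpiSplits: string[];                   // key KPIs split by time segment
}

// Every client gets the same three views in the same order; only the data differs.
interface ClientReport {
  clientId: string;
  filters: ReportFilters;
  views: [FunnelOverviewView, ChannelPerformanceView, DaypartTimingView];
}

Modeling the views as a fixed tuple rather than a free-form list is one way to express the project's core constraint: the structure is identical for every account, and only the underlying data changes.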

Funnel overview view

Channel performance view

Daypart & timing view
Design Principle: Shareable by Default
Every view was designed to be shared live — directly in a client call, not as a static export. This changed the interaction model from "we prepare, they receive" to "we explore together." Strategists reported that client conversations became materially more productive once the data was live in the room.
06 — Business Impact
30+ hours reclaimed
Per strategist, per month. Not hours saved on a spreadsheet — hours returned to campaign strategy, client storytelling, and the analysis the reports were supposed to enable in the first place.
400 hrs/mo — eliminated
Four hundred hours of manual Excel work removed from the monthly cycle across 50 clients. Every hour that was being silently lost to data entry is now billable capacity.
40 adopters across 5 teams
Rolled out to 40 individuals across 5 internal teams, starting with 3 beta testers. The adoption curve was steep once the parallel validation period resolved — no mandate required.
One report, company-wide
A single standardized report template replaced the inconsistent Excel patchwork. Every client now receives the same structure, the same visual language, and the same quality floor regardless of account manager.
Reactive → proactive
Real-time, shareable reports shifted the team's relationship with data from reactive entry to proactive storytelling. Strategists could pull a live view during a client call rather than scheduling a reporting cycle.
Second-Order Outcome
The standardized report structure implemented for this engagement became the template for how DCMN onboarded new clients — institutional alignment that outlasted the project and became operational infrastructure. That's the marker I use to distinguish design that sticks from design that gets replaced.
07 — What I'd Do Differently
I would have built the parallel validation period into the rollout plan intentionally rather than letting it emerge on its own. The insight — that letting teams self-verify accelerates adoption more than a hard cutover — was real and useful, but we stumbled into it. A planned transition protocol would have been faster and less stressful for the team members running dual systems.
I would also have pushed earlier for a client-facing version of the dashboard. We built for internal users first, and the shareable reporting capability was partially retrofitted. The client interaction model — exploring data live in a call — was the most valuable behavioral shift that came out of this project, and it deserved to be a design brief in its own right from the start.