Dashboard Design

What Actually Belongs on a Dashboard


Most dashboards fail. Not because the data is wrong or the technology breaks, but because nobody thought carefully enough about what the dashboard is for. The result is screens full of charts that look impressive in a demo and get ignored within a week. Expensive wallpaper.

Good dashboard design starts with a different question: what decision does this help someone make? If a metric does not connect to an action, it does not belong on the screen. That single filter, applied consistently, separates dashboards that people rely on from dashboards that people tolerate.

We have been designing and building custom dashboards since 2005. Across 50+ applications, the pattern is consistent. The dashboards that stick are the ones designed around decisions, not data. This page covers what that means in practice, and what the design process looks like when you get it right.


What Is Dashboard Design?

Dashboard design is the process of selecting, organising, and presenting data so that a specific audience can assess status, spot exceptions, and take action quickly. It combines information architecture, data visualisation, and interface design into a single view that communicates the most important things first.

A well-designed dashboard answers its core questions within seconds. Not minutes. Not after scrolling. Within the time it takes to glance at the screen and understand whether things are on track or need attention.

The discipline is closer to editing than to decorating. The hard work is deciding what to leave out.


Types of Dashboards and When Each One Fits

Not all dashboards serve the same purpose. The type determines the layout, the refresh rate, the level of detail, and the interaction model. Getting the type wrong is one of the most common reasons dashboards fail.

Executive Dashboards

An executive dashboard shows five to seven key metrics that summarise business health. Revenue, pipeline, customer acquisition cost, churn rate, and one or two metrics specific to the current strategic priority. That is it.

The audience glances at this dashboard for 30 seconds between meetings. Every element must communicate at a glance. Sparklines show direction. Traffic-light indicators (green, amber, red) show status against targets. No tables. No multi-axis charts. Executive dashboards refresh daily or hourly. They answer one question: are we on track?
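As a concrete sketch, the green/amber/red logic is a simple threshold comparison against a target. The 90% amber boundary and the function name below are illustrative assumptions, not a standard:

```typescript
type Status = "green" | "amber" | "red";

// Classify a metric against its target. The 90% amber threshold is an
// illustrative assumption; real thresholds come from the business.
function statusFor(value: number, target: number, amberRatio = 0.9): Status {
  if (value >= target) return "green";
  if (value >= target * amberRatio) return "amber";
  return "red";
}

// Example: revenue of 420,000 against a 450,000 target is amber.
console.log(statusFor(420_000, 450_000)); // "amber"
```

The point is that the thresholds live in one place, so every traffic-light indicator on the dashboard agrees about what amber means.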

Operational Dashboards

Operational dashboards monitor live systems. Server health, order processing, manufacturing throughput, support queue depth. The data refreshes in seconds or minutes, and the primary interaction is exception-based: everything green means things are running normally. Amber or red demands attention.

The layout prioritises scanning. Status indicators are large and colour-coded. Detail is available on click but hidden by default. The goal is to surface problems before they escalate. Real-time dashboards in this category often connect to WebSocket feeds for sub-second updates.
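The exception-based interaction can be sketched as a filter-and-sort: healthy systems drop out of view, and red sorts above amber. The field names here are assumptions for illustration, not a real monitoring API:

```typescript
type Status = "green" | "amber" | "red";

interface SystemCheck {
  name: string;
  status: Status;
}

// Lower number = more severe, so red sorts first.
const severity: Record<Status, number> = { red: 0, amber: 1, green: 2 };

// Exception view: hide everything green, show the worst problems first.
function exceptions(checks: SystemCheck[]): SystemCheck[] {
  return checks
    .filter((c) => c.status !== "green")
    .sort((a, b) => severity[a.status] - severity[b.status]);
}
```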

Analytical Dashboards

Analytical dashboards support exploration. Unlike executive and operational dashboards, they expect the user to spend time filtering, drilling down, and comparing segments. Marketing performance by channel, sales pipeline by region and quarter, customer behaviour across cohorts.

These dashboards have more controls: date range selectors, filters, comparison toggles. The layout accommodates more charts because the user is actively investigating rather than glancing. The danger is scope creep. An analytical dashboard that tries to answer every question ends up answering none of them well.

Sales Dashboards

Sales dashboards track pipeline health, conversion rates by stage, individual and team performance against targets, and deal velocity. The audience is sales managers and leadership who need to know whether the quarter is on track and where deals are stuck.

The most useful sales dashboards highlight exceptions: deals that have been in the same stage too long, targets that are behind pace, conversion rates that have dropped below historical norms. A pipeline visualisation that simply shows total value by stage is less useful than one that flags the three deals most at risk.
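Flagging stuck deals is mechanically simple: filter by time in stage, sort by how badly stuck they are, and cap the list. A minimal sketch, assuming a `daysInStage` field and an illustrative 30-day threshold:

```typescript
interface Deal {
  id: string;
  stage: string;
  daysInStage: number;
  value: number;
}

// Surface the three deals most at risk: longest-stalled first.
// The 30-day threshold and field names are illustrative assumptions.
function atRiskDeals(deals: Deal[], maxDaysInStage = 30): Deal[] {
  return deals
    .filter((d) => d.daysInStage > maxDaysInStage)
    .sort((a, b) => b.daysInStage - a.daysInStage)
    .slice(0, 3);
}
```

Capping the list at three is a design choice: an exception list that grows unbounded becomes the same wall of noise the dashboard was meant to replace.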

Project Dashboards

Project dashboards show delivery status across multiple workstreams: milestones, blockers, resource allocation, and timeline adherence. The link between project dashboards and project visibility systems is direct. The dashboard is the top layer of a system that tracks time, tasks, dependencies, and progress. It summarises that complexity into a view that can be understood in under a minute.

Customer-Facing Dashboards

Some dashboards are the product. SaaS applications that show customers their own data, client portals that display campaign performance, platforms that report usage and billing. Customer-facing dashboards carry the brand, so design quality matters more: it directly affects how customers perceive the product. These dashboards also need to handle a wider range of data volumes and a wider range of user sophistication.


Dashboard Design Principles That Actually Matter

Design principles are easy to list and hard to apply consistently. Stephen Few's Information Dashboard Design remains one of the best references on the subject. These are the principles that make the biggest difference in practice, drawn from two decades of building dashboards for businesses.

One Audience, One Purpose

A dashboard that tries to serve the CEO, the operations team, and the marketing department will serve none of them well. Each audience has different questions, different time horizons, and different tolerance for detail. Build separate views for separate audiences. The data underneath can be the same. The presentation should not be.

The Decision Test

For every element on the dashboard, ask: what decision does this support? If the answer is "it is interesting" or "we have the data", remove it. Interest is not a reason to occupy screen space. If a metric does not connect to something the user can actually do, it belongs in a report, not a dashboard.

Visual Hierarchy and Information Density

The most important information gets the most visual weight. Larger type, prominent position, strong colour contrast. Secondary information is smaller, lighter, positioned lower or to the side. This is visual intelligence applied to data: guiding the eye to what matters first.

Information density is a balance. Too sparse and the dashboard wastes the user's time with scrolling. Too dense and nothing stands out. The right density depends on the audience. An executive dashboard is sparse by design. An analytical dashboard for a data team can be denser because the users are comfortable with complexity.

Colour as a Vocabulary, Not Decoration

Colour should encode meaning, not aesthetics. Establish a consistent colour vocabulary across the dashboard.

Green: on target, healthy, positive trend
Amber: approaching threshold, needs monitoring
Red: below target, requires action
Grey: neutral, informational, no action needed

Use this vocabulary consistently. If green means "on target" in one chart and "marketing department" in another, the dashboard teaches the user nothing. Colour is the fastest visual channel. Do not waste it on branding. The Web Content Accessibility Guidelines (WCAG) also require that colour is never the only means of conveying information, so pair colour with labels or icons for accessibility.
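One way to enforce a single vocabulary is to define it once in code, pairing every colour with a text label so colour is never the sole channel. The hex values and labels below are illustrative placeholders:

```typescript
type Status = "green" | "amber" | "red" | "grey";

// One vocabulary for the whole dashboard: every status carries a colour
// AND a text label, so colour is never the only means of conveying
// information (per WCAG). Hex values are illustrative assumptions.
const STATUS_VOCABULARY: Record<Status, { colour: string; label: string }> = {
  green: { colour: "#2e7d32", label: "On target" },
  amber: { colour: "#f9a825", label: "Needs monitoring" },
  red:   { colour: "#c62828", label: "Requires action" },
  grey:  { colour: "#757575", label: "Informational" },
};
```

Every chart and indicator then imports this one definition rather than choosing its own colours.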

Context over Isolated Numbers

A number by itself is meaningless. Revenue of 420,000 pounds. Is that good? Compared to what? A number with context tells a story: 420,000 pounds, 12% above target, trending up over the last three months.

Every metric on a dashboard needs at least one form of context: a target, a trend, a comparison to previous period, or a threshold. Without context, the user has to remember what "normal" looks like. That is a cognitive load the dashboard should carry.
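That context can be attached mechanically rather than left to the reader. A sketch, assuming each metric record carries its target and previous-period value (the record shape is an assumption):

```typescript
interface Metric {
  value: number;
  target: number;
  previous: number;
}

// Render a metric with its context: delta against target and trend
// versus the previous period, e.g. "420,000 (+12% vs target, trending up)".
function withContext(m: Metric): string {
  const vsTarget = ((m.value - m.target) / m.target) * 100;
  const trend =
    m.value > m.previous ? "up" : m.value < m.previous ? "down" : "flat";
  const sign = vsTarget >= 0 ? "+" : "";
  return `${m.value.toLocaleString("en-GB")} (${sign}${vsTarget.toFixed(0)}% vs target, trending ${trend})`;
}
```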


Common Dashboard Design Mistakes

These are the patterns we see most often when auditing existing dashboards. Each one seems reasonable in isolation. Together, they explain why most dashboards get abandoned.

Mistake: Showing everything because the data exists
What happens: A scrolling wall of charts where users cannot distinguish signal from noise.
The fix: Start with decisions and work back to the five or six metrics that inform them. Leave the rest in a drill-down or a separate report.

Mistake: Choosing the wrong chart type
What happens: Pie charts for 12 categories, 3D bar charts, stacked area charts where the middle layers are impossible to judge.
The fix: Match the chart type to the data and the question. Comparisons need bar charts. Trends need line charts. Many situations need only a simple table or a single number with context.

Mistake: Ignoring mobile entirely
What happens: The dashboard works on a 27-inch monitor and collapses into an unusable mess on a phone.
The fix: Design a mobile-specific view showing the most critical metrics, with appropriate touch targets and readable typography.

Mistake: No refresh strategy
What happens: Yesterday's data when users expect today's, or a real-time feed battering the database every second.
The fix: Match the refresh rate to the decision cycle. Executive dashboards refresh daily. Operational dashboards refresh in minutes or seconds.

Mistake: Treating the dashboard as finished
What happens: Irrelevant metrics accumulate and new ones are missed.
The fix: Plan for quarterly reviews. Audit usage: which charts do people actually click? Prune what nobody looks at.

The Dashboard Design Process

Our approach follows a pattern refined across hundreds of dashboard implementations. It is not complex, but it is disciplined.

1. Discovery: Users and Decisions

We start by interviewing the people who will use the dashboard. Not their managers. Not the person who requested the project. The actual users. We ask:

What decisions do you make regularly?
What information do you wish you had in front of you?
What do you currently check in spreadsheets, emails, or other tools?
How often do you need this information?
What does "something is wrong" look like in your role?

This produces a decision map: a list of decisions linked to the metrics that inform them.

2. Design: Layout, Hierarchy, and Interaction

With the decision map in hand, we design the layout. The most important metrics get the most prominent positions. Related metrics are grouped. The visual hierarchy ensures that the user's eye lands on the right thing first. We prototype in high fidelity because dashboard design is too dependent on real data to judge from wireframes. A sparkline with dummy data looks fine. A sparkline with real data that is flat for six months and then spikes tells you the scale is wrong.

3. Build: Data Connections and Performance

Custom dashboard development connects the interface to live data sources through APIs, database queries, or data pipelines. The architecture depends on the refresh rate. Daily dashboards can query a reporting database. Real-time dashboards need streaming connections, often through WebSockets or server-sent events. Performance matters because a slow dashboard is an unused dashboard. We optimise queries, cache appropriately, and load charts progressively so the most important metrics appear first.
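"Cache appropriately" usually means matching the cache lifetime to the refresh rate: hours for a daily executive view, seconds for an operational one. A minimal TTL cache sketch, not production code:

```typescript
// Caches query results for a fixed time-to-live, so repeated dashboard
// loads within the window reuse the result instead of hitting the database.
class QueryCache<T> {
  private store = new Map<string, { value: T; expires: number }>();

  constructor(private ttlMs: number) {}

  async get(key: string, fetch: () => Promise<T>): Promise<T> {
    const hit = this.store.get(key);
    if (hit && hit.expires > Date.now()) return hit.value;
    const value = await fetch();
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
    return value;
  }
}
```

A daily dashboard might construct this with a TTL of hours; an operational one with a TTL of a few seconds, or skip the cache entirely in favour of a streaming connection.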

4. Test: The Crisis Scenario

Before deployment, we run a crisis scenario. We ask: if something went seriously wrong right now, would this dashboard tell you? Would it tell you quickly enough? This test catches missing alerts, unclear thresholds, and metrics that look fine in normal conditions but fail to highlight problems.

5. Refine: Usage-Driven Improvement

After deployment, we monitor which parts of the dashboard get used and which get ignored. Ignored charts are candidates for removal. Frequently filtered dimensions suggest a need for a dedicated view. Questions that users ask but the dashboard does not answer indicate missing metrics. The best dashboards get better over time because they are maintained, not just built.


When to Build a Custom Dashboard

Off-the-shelf business intelligence tools handle common reporting well. Looker, Metabase, Power BI, and Tableau all produce functional dashboards from standard data sources. If your requirements fit their templates, use them. They are faster and cheaper than custom development.

Custom dashboard development makes sense when the gap between what a template offers and what you need becomes too wide to bridge with configuration.

The dashboard is the product: Customer-facing analytics, client portals, platform reporting. Off-the-shelf tools cannot be embedded into your product with your brand, your interaction patterns, and your data security model.
Business processes are specific: Your workflow does not map to generic dashboard templates. The metrics, the groupings, and the drill-down paths are unique to how your organisation operates.
Performance requirements are strict: Sub-second response times on large datasets. Real-time streaming updates. Offline capability. These requirements often exceed what BI tools deliver.
Integration depth matters: The dashboard needs to trigger actions, not just display data. Clicking a flagged metric should open a ticket, send a notification, or initiate a workflow. That requires application-level integration, not report-level display.

The honest answer is that most organisations start with an off-the-shelf tool and move to custom development when they hit the tool's limits. The transition point is usually when the dashboard needs to do something, not just show something.


Getting Started

If your dashboards are not driving decisions, the problem is almost certainly design, not data. The data is there. The question is whether it is presented in a way that makes the right action obvious.

We have been building custom dashboards since 2005. We start with the decisions your team needs to make, design the interface around those decisions, and build it to perform reliably at scale.


Build Dashboards People Actually Use

If that sounds like what you need, get in touch. No pitch deck: we will look at your current dashboards, discuss what is working and what is not, and outline what a better version looks like.

Book a discovery call →