Engineering Alignment with Metrics

Carl Bergenhem

December 16th, 2024

Metrics dashboards: Do they divide or unite?

It seems obvious, right? In theory, a shared set of metrics on a common dashboard should align the team, getting everyone on the same page.

In reality though, a metrics program with good intentions can have unintended consequences. It is actually far easier to use dashboards to divide your team, especially without a thoughtful and strategic rollout.

In other words, how you approach it makes all the difference.

In this article, we’ll look at two real-world examples, an OKR rollout and a DORA metrics rollout, and see how each divided or united the teams involved.

Story from the trenches: OKR rollout

A large organization (1,400 employees) I worked at wanted to adopt OKRs (Objectives and Key Results) org-wide. The first two attempts to roll them out to the entire organization failed; the third succeeded. What was the difference?

The OKR divide

For the first two attempted rollouts, the concept of OKRs was introduced at an all-hands meeting. It was presented as a one-size-fits-all approach, offered no opportunity to address questions, and set a target for 100% of teams to use OKRs by month’s end. No examples of good and bad OKRs were shared for reference. No educational resources were provided. And no specific OKRs were introduced.

Leadership eventually shared dashboards containing the OKRs for teams to review, but these offered only the leadership view. Without the ability to drill down, teams couldn’t understand how their work impacted leadership’s OKRs.

Worse, there was no way for individual contributors to understand what impact, if any, they had on a single objective or key result highlighted in the leadership OKR dashboard. Teams were confused, OKRs were ignored, and the dashboards died.

Uniting around OKRs

The third rollout was different. Leadership met with only the team leads to introduce OKRs before rolling them out to the rest of the organization.

Team leads developed initial OKRs, received feedback, reworked them, and developed a useful set of OKRs before sharing them with their teams. And the leadership team’s OKRs were presented in each meeting, helping teams connect their OKRs with top-level OKRs.

Everyone understood OKRs and their importance. Eventually, bi-weekly sprints, monthly syncs, and quarterly reviews were driven by the relevant key results that the various teams were trying to achieve.

The BI team created dashboards to help each team focus on influencing their key results. The dashboards and metrics within them were flexible — when a team recognized that a key result wasn’t the right thing to measure, it was either removed or transitioned to a nice-to-know health metric not critical to the team’s work.

It was an effective rollout that united everyone behind the practice of setting quarterly and yearly OKRs.

Story from the trenches: DORA metrics rollout

DORA metrics are four industry-standard metrics (deployment frequency, change lead time, change failure rate, and mean time to recovery) that help assess a team’s software delivery performance.
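To make those definitions concrete, here is a minimal sketch of how the four metrics could be computed from a period’s deployment records. The Deployment structure, its field names, and the failure-tracking approach are illustrative assumptions, not part of any official DORA tooling:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Deployment:
    """Hypothetical deployment record; field names are illustrative."""
    committed_at: datetime               # when the change was first committed
    deployed_at: datetime                # when the change reached production
    failed: bool = False                 # did this deployment cause a failure?
    restored_at: datetime | None = None  # when service was restored, if it failed

def dora_metrics(deploys: list[Deployment], period_days: int) -> dict:
    """Compute the four DORA metrics over a reporting period.

    Assumes deploys is non-empty and every failed deployment has
    restored_at set.
    """
    failures = [d for d in deploys if d.failed]
    return {
        # Deployment frequency: deployments per day over the period
        "deployment_frequency": len(deploys) / period_days,
        # Change lead time: average time from commit to production
        "change_lead_time": sum(
            ((d.deployed_at - d.committed_at) for d in deploys), timedelta()
        ) / len(deploys),
        # Change failure rate: share of deployments that caused a failure
        "change_failure_rate": len(failures) / len(deploys),
        # Mean time to recovery: average time to restore after a failure
        "mean_time_to_recovery": sum(
            ((d.restored_at - d.deployed_at) for d in failures), timedelta()
        ) / len(failures) if failures else timedelta(0),
    }
```

In practice, these records would come from a CI/CD pipeline and an incident tracker rather than being assembled by hand; tools like Sleuth exist to automate exactly this collection.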

In my role at Sleuth, where I coach engineering organizations on how to use the DORA metrics, I’ve seen teams divide and become dysfunctional over the metrics, and I’ve seen organizations flourish and unite around them. Again, division or unification comes down to the degree of strategic planning behind a DORA metrics rollout.

The DORA divide

The effort to introduce DORA metrics within engineering teams typically includes a DevOps or DevEx implementation team that sets up dashboards for other teams to use. Usually this is part of annual or quarterly goals to instrument metrics across the organization, with a big push to get the metrics up and running to establish a baseline. But once this is complete, enthusiasm to continue using the metrics fades.

Teams end up self-interpreting the metrics’ definitions and guessing how to use them. Individual developers across teams receive different explanations of the metrics, and dissent ensues over the merits of the metrics and whether they’re even useful. Inevitably, teams stop paying attention or actively refuse to participate.

This rollout approach risks one of the most divisive scenarios when using dashboards to track DORA metrics: someone external to the team finds the DORA dashboard, thinks they understand the metrics, compares teams to one another based on their DORA metrics, and sets targets they think make sense.

Add to this the worst-case scenario where DORA metrics are used in individuals’ performance reviews. This is extremely divisive, as it moves away from a core DORA concept: DevOps is a team sport, and the teams — not individuals — are responsible for the metrics. Suddenly, a set of metrics meant to empower teams is used against them.

Uniting around DORA metrics

Uniting teams around DORA metrics requires rolling them out to the broader organization by starting small, gathering feedback from the initial pilot teams, and growing organically.

Eventually, a critical mass is reached where the organization knows the common pitfalls teams may run into, making it easier to onboard larger groups of teams at a time.

An important aspect of the uniting approach is that anyone who may see the metrics but isn’t necessarily using them daily receives the proper education and coaching on what the metrics mean. This isn’t just meant for adjacent roles like product or project management. Leadership also needs to be properly educated on what the metrics mean to them. Most importantly, leaders must learn what not to do with the DORA metrics, such as using them in performance reviews.

The end result is an organization that is fully united behind a set of metrics that improves software delivery performance without sacrificing stability.

Your turn: divide or unite?

In both scenarios, taking the proper time to think about how the metrics and dashboards will be used on day one — and months into the future — determines whether the rollout succeeds.

Division originates from rushing to implement the next great idea across the organization, whereas uniting teams requires thinking about why the metrics and dashboards are important and how they can help a specific organization.

Without proper planning to introduce the metrics to all levels of your organization and provide ongoing coaching, you’re planning for dashboards to divide your team.

But with the proper thought and education in place, teams will unite around the dashboards and use them to drive their daily activities. The end result is a more efficient organization that ships features faster, with fewer issues, and delivers more value to its customers.

