Level up your DORA metrics journey

October 11, 2022

You're adopting DORA metrics and you want to take them to the next level. What is that next level? How can you maximize the return on your investment in DORA metrics and improve your software team’s engineering efficiency?

Sleuth’s co-founder and CTO, Don Brown, and head of customer success, Leigh Ann Whitmarsh, discuss four levels of using DORA metrics. They identified these levels through their interactions with thousands of teams working to improve their DORA metrics, as well as their own experience using the metrics.

If you're brand new to DORA metrics: they are four metrics that research has shown correlate with high-performing teams. These metrics give you an idea of how quickly you're able to produce, but also the quality of what you're producing.

The four metrics are:

  1. Deployment Frequency: how often you deploy
  2. Change Lead Time: how long a change takes to go from first commit to deployment
  3. Time to Restore Service, also known as Mean Time to Recovery: when something fails in production, how long it takes to restore health
  4. Change Failure Rate: the percentage of deployments to production that fail
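
To make those definitions concrete, here's a minimal sketch in Python of how the four metrics could be computed from a week of deployment and incident records. The record shapes and field names are assumptions for illustration, not any particular tool's data model:

```python
from datetime import datetime
from statistics import median

# Hypothetical records for one week; in practice these would come from
# your deploy pipeline and your incident tracker.
deploys = [
    # (first_commit_at, deployed_at, caused_failure)
    (datetime(2022, 10, 3, 9), datetime(2022, 10, 3, 15), False),
    (datetime(2022, 10, 4, 10), datetime(2022, 10, 5, 11), True),
    (datetime(2022, 10, 6, 8), datetime(2022, 10, 6, 13), False),
]
incidents = [
    # (started_at, resolved_at)
    (datetime(2022, 10, 5, 11), datetime(2022, 10, 5, 14)),
]
days_in_window = 7

# 1. Deployment Frequency: how often you deploy.
deployment_frequency = len(deploys) / days_in_window  # deploys per day

# 2. Change Lead Time: first commit to deployment, summarized as a median.
change_lead_time = median(deployed - committed for committed, deployed, _ in deploys)

# 3. Time to Restore Service: how long production failures take to resolve.
time_to_restore = median(resolved - started for started, resolved in incidents)

# 4. Change Failure Rate: share of deployments that caused a failure.
change_failure_rate = sum(failed for _, _, failed in deploys) / len(deploys)

print(deployment_frequency, change_lead_time, time_to_restore, change_failure_rate)
```

Real tooling deals with rolling windows, multiple commits per deploy, and incident-to-deploy matching, but the shape of the calculation is the same.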

Let’s get to it so you can move to the next level of your DORA metrics journey.

Level 1: Finding your software engineering pain point

Don Brown: When you talk to people who are new to DORA metrics, what do you find is the entry point into tracking software team engineering efficiency? What pain is driving them?

Leigh Ann Whitmarsh: What I hear the most is change lead time. Teams want to understand how much time it takes from when a developer starts writing code for a feature or change to when that change is actually used by the end users. A lot of people are guessing at this.

It’s better to understand exactly what's happening so that you know specifically how much of your change lead time is actual coding time versus review time, for example. I can't tell you how many times I've seen managers surprised to know that their review time was three times longer than the coding time and was actually the reason for that gut feeling they had that something wasn't quite right.

By starting to measure it, they're able to understand the details, understand specific projects or teams that are eating up time in one place or another. Sometimes it's explained by patterns or even team changes, especially when a team is growing fast and moving fast. Those changes in your team or in your processes can be reflected in your change lead time. 

So the problem is if you can't actually see it, how do you know which thing is being impacted? 

Don: Right. Change lead time is a good one. And just for those who aren't as familiar with DORA metrics, change lead time is a measurement of how long it takes you to get a change out to production from the first code commit. Knowing how long it takes to get something out is a key part of that process because efficiency is all about improving and getting more things out with what you have.
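
As a rough illustration of that split (the timestamps below are made up, standing in for whatever your Git host and deploy pipeline actually report), the coding-versus-review breakdown Leigh Ann described might be computed like this:

```python
from datetime import datetime

# Hypothetical timestamps for a single change.
first_commit_at = datetime(2022, 10, 3, 9, 0)   # developer starts committing
pr_opened_at = datetime(2022, 10, 3, 11, 0)     # pull request opened
pr_merged_at = datetime(2022, 10, 4, 16, 0)     # review finished, change merged
deployed_at = datetime(2022, 10, 4, 17, 30)     # change reaches production

coding_time = pr_opened_at - first_commit_at      # 2 hours
review_time = pr_merged_at - pr_opened_at         # 29 hours
deploy_time = deployed_at - pr_merged_at          # 1.5 hours
change_lead_time = deployed_at - first_commit_at  # the DORA metric itself

# The surprise described above: review time more than three times coding time.
print(review_time > 3 * coding_time)  # True for these numbers
```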

So, often people have a specific question or pain point that starts them on the DORA metrics journey. It's not that they want to adopt a cutting edge tool and don't really know what they want from it. The teams that are most successful with DORA metrics have a specific thing in mind. For example, when I was at one company, our top priority was to increase our deployment frequency and deploy code once every day. 

Leigh Ann: Identifying that one thing is so helpful, and it tends to open up some doors. So, once you know what that one metric is and start to understand it, then it's time to start building that out and seeing what else there is to improve.

Level 2: See the connections among all DORA metrics

Don: That leads into the second part. Once you’ve identified what you want to improve, started measuring it and finding ways to improve, that's not the end of the journey. That's really just the beginning. 

One thing I like about the DORA metrics is that they look beyond how fast you go. Anyone who's been in any kind of leadership position knows if you go really fast in one direction, but it's the wrong direction, then it doesn’t matter how fast you were going, because you didn’t improve anything.

The next step is going from the one metric that you're interested in and starting to look at the other metrics. This is where tracking DORA metrics becomes really powerful, because you start to understand their impact across other aspects of your team.

In talking with customers, what’s an example of success they’ve had going from one metric to another?

Leigh Ann: I’ve been working closely with an engineering director as they've been adopting DORA metrics and learning how to improve, and manage their teams and performance better. We had a conversation about a tool they were using, but they'd stopped looking at it because it was just noise. We talked about what they had expected to be getting out of it, and I suggested connecting it to the stack to see what happens, because they were already paying for it.

Once we did that, we saw some patterns in errors they were catching that didn't need to be there. They were easy fixes that actually made the tool more valuable, and now they're able to see what's happening. Their developers are happier because they don't feel like they're being bombarded by this mess of things coming in. It's been really cool to watch that happen and make a huge difference in their entire engineering department.

Don: That’s a great story of how sometimes we just want to buy tools to solve all the problems, but we don't tie them all together, and we don't really see the net effect of all these things working together toward our goal of improving engineering efficiency and failing less frequently. Getting these DORA metrics views is a great way to tie these disparate things together so that you can see the forest for the trees.

Leigh Ann: It's really bringing that whole picture together and all in one place. So instead of having to go and look at seven different places, you start at one and you've got an overview of everything that's happening. Then you can start drilling down, and you see something that's not right, or something that's fantastic. 

We talk about errors and issues all the time. Then we forget to talk about things that are improving, and how to make them happen across the entire org. So, it can go both ways, but getting more value out of the tech stack that you already have in place by tracking DORA metrics is amazing. 

Don: That’s a good point that we often talk about what we need to improve. There are problems we need to fix, but there are also things worth celebrating.

Tracking DORA metrics can help us celebrate how we've been able to keep shipping things even when people are on vacation, for example. They’re also a way to communicate internally about how things are going and build a stronger culture, which is really important to do, especially as more and more teams work remotely. 

Leigh Ann: Another important part of this is setting goals and understanding measures. It's easy to set a goal for change lead time or deployment frequency based on something you’ve read, but is it really the right thing for your team, and how feasible is it? 

Once you start tracking and seeing what's happening for real, you can investigate bottlenecks or pinpoint where the team does really well. Then, you have data and actual results to inform how to set a realistic goal and a plan for getting there.
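
As a hypothetical sketch (the numbers are made up, not recommended targets), a realistic change lead time goal can be derived from your own baseline rather than from a figure you've read somewhere:

```python
from statistics import median, quantiles

# Hypothetical change lead times (in hours) observed over the last month.
observed_lead_times = [6, 9, 14, 20, 26, 30, 41, 55, 72, 96]

baseline = median(observed_lead_times)        # where the team is today
p75 = quantiles(observed_lead_times, n=4)[2]  # the slower tail of changes

# One reasonable first goal: pull the slow tail toward today's median,
# rather than adopting a number from a report.
goal_hours = (baseline + p75) / 2
print(f"baseline ~{baseline}h, p75 ~{p75}h, first goal ~{goal_hours}h")
```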

Don: That’s an important point because when people hear that management is going to start measuring engineering, it’s an opportunity to explain that it’s not about measuring people. What people miss is that they take these measurements and make them the goal. That's dangerous because when you tell a dev their job is to improve this number, they’ll find a way to move that number.

If everything else burns, they kind of don't care, because their job is to hit that number and they want to get that performance raise. Then people start trying to game it. Having multiple measures from the DORA metrics can counteract that. You can measure aspects of engineering if the metrics are used in the right way.

To review, we’ve talked about level one, where you find a specific pain to solve and start tracking for it. Level two is where you start to look at the fuller picture, track the other metrics, communicate them, and use them as baselines for setting goals.

Level 3: DORA metrics as a team effort

Don: Level three is where it gets really fun, because you go beyond the metrics as numbers, and into uncovering actual areas to improve and getting everybody involved. Tell me more about your experience, observing teams who make it to this level.

Leigh Ann: I agree with you, this level is incredibly fun. These are the light bulb moments that everybody gets excited about. When leadership talks about setting and reaching goals, the people who make that happen are the individual contributors — our developers out there. Empowering them to effect positive change can be so incredible.

Many of my engineering peers really understand that exposing developers to what's happening, and allowing them to see their performance and how things are going with their teams, makes developers happy.

Putting the metrics in front of them allows them to see things in real time, to see how the work they're doing impacts the numbers and the metrics, and how close they are to meeting goals. It also allows them to make changes before managers have to ask for them.

I see our own Sleuth team referencing DORA metrics from our tracker all the time. I see them talking about changes that they see and immediately taking actions to improve them. This means that they didn't wait for our head of engineering to come to them. They made changes before she even had the chance to see it, which is pretty powerful. 

Obviously, that's something that takes some time, work and trust to get to. It's almost a cultural thing at that point, but we all want to be really good at our jobs. We don't like surprises when it comes to how we're doing as an individual contributor or as a team. The last thing anybody wants to do is get into a review and hear they’re not meeting expectations.

Building transparency into processes and cultures by using metrics and reviewing dashboards during team stand-ups really helps the team be more effective. It encourages talking about performance, understanding what's happening, understanding the changes, and not using the data as a weapon.

It's about looking at accomplishments, talking about what was different from last week to this week, learning and changing all the time. I mean, this is tech. Everything is changing, and there are always new ways to do things and new things to try. Having real-time data helps us feel good about what we're doing, but also understand it and understand how it impacts the bigger picture.

Don: I think the key thing you talked about in the beginning, which enables what you're talking about now, is that it's an entire team effort. Now, to be clear, there are companies that are staying on level two and they’re good with that. They just wanted to get some numbers, wrap them up in an easy to understand way to see trend graphs on what’s improving or not, and report that. 

But in my experience, where it gets exciting is when you work to improve things, and the best way to do that is, as you said, to get everybody involved. This isn't just a manager or team lead or tech lead thing. This is a dev team thing. We learned this lesson at Sleuth with deployment. The more you get developers involved in deployment, the more ownership they feel, the higher quality of code they produce, and the better experience customers have. Once you get the devs involved, the people on the front lines, they're the ones who know what to change and how to improve it. The more visibility you offer to empower them, that's where the needle really starts moving.

Leveling up to predictability

Leigh Ann: Absolutely. When you really get everybody involved, and everybody's participating and working with what they're seeing, the next part is predictability. It's something a lot of people are looking for. They can start addressing things that they see, but they want to know how to get in front of it and even stop things before they start. 

As you start setting all these baselines, setting goals, and working with your teams to understand what's happening, you can start to see patterns. You can start to understand that when this one thing happens, we always see this other thing the next week. Why is that?

Here's a really recent example. An engineering lead I've been working with took vacation, which we all need to do. When he came back, we dug into a working session, comparing the week before his vacation to the week he was gone, and we instantly noticed a big dip. But there weren't many things that happened, so why did it make such a difference for him to be out a week?

Within the first five minutes of us digging into the data, we noticed some things that were out of process. So now he’s refining some processes, documenting things, and doing some extra training to make sure that next time he or his other leaders need to be out, they don't have to wonder if everything is going okay when they’re gone.

If you aren't tracking things, you’ll have an awfully hard time figuring out how to predict and prevent issues from happening. 

Don: I remember at a previous company we were adopting a new technology, but not everyone was brought up to speed at the same time. That caused things to get dragged down. We didn’t have DORA metrics at the time, and people were arguing out of emotion. We weren't saying, look, the deployment frequency dropped from this to this, or we're spending much more time in reviews than we did before. 

If we had DORA metrics, we could have pointed at the deployment frequency or change lead time or change failure rate. It was an interesting case where the culture would have benefited from having numbers and not just impressions and feelings. 

Leigh Ann: Yes, absolutely. And that emotion takes up a lot of brain space that should be used for more creative, bigger things.

DORA metrics enhance remote cultures

Don: Another element is, more and more teams are remote, like our company. Before, I could look over at somebody's desk and see that they're frustrated, and we could go have a coffee to talk through it. But when you're remote, it’s more difficult to discover these issues, and you end up spending energy in these online flame wars. And because you're remote, you're not able to repair those relationships, so that sucks even more energy. Being able to see and prevent that is a huge step in trying to find a problem before it becomes too big.

Leigh Ann: Being able to open up that visibility to the whole company, to your execs, your founders, your product team, marketing, it's so valuable to understand what's really happening. It adds a human element to metrics, because it helps you know what to expect, understand what’s going on around you, and where you might be able to help. 

For me in customer success, understanding what projects the product team is working on and being able to connect them with customers who might be really interested in it helps us share information about what somebody's looking for. 

Don: Communication is so important. When we're talking about tools and metrics, it's easy to jump into buying a tool or adopting new technology and think everything is solved. But the really hard part in business is communication, especially as you go remote.

Level 4: DORA metrics offer a constant baseline

Don: And that leads us to that last level, which is monitoring as work continues. If you're starting with metrics, you might have a specific thing you're trying to fix. Then you start looking at all the metrics to get a clearer picture and communicate about it. The third level is where you're getting everyone involved, including the developers to make changes. 

But at some point you get everything the way you like it, and you might think you don't need metrics anymore. But as you mentioned earlier, things change. Where I found metrics to offer a fourth level of value is to be a baseline in the midst of change.

As things change and get a little crazy, I can see how that change affects actual numbers — how it affects delivery performance, how it affects what the team is shipping, or how it affects the failure of what we’re shipping. I can always go back to the DORA metrics to be the source of truth. What have you seen as you've talked to customers who have reached this fourth level?

Leigh Ann: So far, what I've seen is some of the small stuff of just understanding the changes they see when they bring on a new developer, for example, or there’s a change in leadership. One thing that isn't so fun to talk about is downsizing or reorganizing teams. 

In those instances, having tracking in place is almost more valuable, because there are fewer managers, fewer people who are actually making sure everything is running smoothly. It allows a temporarily slimmed-down management team to see things quicker, to feel like they still have a hold on what's going on, and maybe even make a case for why they need to rehire or why positions were important. They can actually show it with the data because they can show changes — a decline in frequency, more errors, or a longer lead time to solve issues.

I spoke with someone recently who was able to keep some budget by showing DORA metrics. It's also allowed them to bring on new leads faster and show them where the team has been and where it’s expected to go. Those new people understand so much more about the landscape of this engineering team than they would have otherwise. 

And you can actually show that by shrinking a team or moving people, things take more time. When an exec asks why a new feature isn't out or the team is missing a deadline, you can show the difference in time things took before the personnel change to after. Simple as that. And they don't even have to come to you to ask. They have access and can look for themselves.

Don: Having a downturn is definitely a thing. I love that story, where the person you're talking about retained their budget because they were able to show how things were going. That’s the whole value of metrics. Yes, you can make statistics lie, but they can also be a tool for good to help illuminate a situation, good or bad, to help you get to that next level.

Leigh Ann: And you didn't have to spend 80 hours building out spreadsheets, trying to guess and pull information together. We've all been there. But once you've gotten to this level and something comes up, you can have a few reports ready tomorrow to help facilitate discussions.

Don: So, no matter where you are on the DORA metrics journey, these are the four levels that we came up with based on our past and present experience, sales calls, and customer success conversations. It's such a freeing experience to go from guessing, arguing from emotion, and letting whoever's loudest win, to putting some numbers behind it so you can understand better and start to quantify some of the things you might have guessed at previously.
