Katie Weddle Langer
August 22nd, 2023
Sleuth helps drive continuous improvement culture at Gigpro
Quick facts:
- Gigpro’s engineering team wanted metrics in place that would validate or invalidate gut feelings and help prioritize what improvements to make
- Reviewing DORA metrics, like lead time, in Sleuth gave the business insight into the time required at each stage of the software development process and improved planning
- Gigpro has incorporated Sleuth and DORA metrics into its sprint review process to help establish a culture of continuous improvement
It’s one thing to have a gut feeling that your team is moving slowly. It’s another to have the data to prove it, uncover specific sources of slowdowns, and validate the improvements you want to invest in.
“It's hard to get everyone in the business rallied around you focusing on making things better if you're just talking about a feeling,” says Tucker LoCicero, software engineer and team lead at Gigpro. “You need metrics to support the things you're trying to improve.”
Here’s how the Gigpro team got the data they needed with Sleuth and built a business-wide culture of improvement.
Buying into industry-standard DORA metrics
The Gigpro team had been struggling to release faster, deliver requests to customers quickly, and help the business understand the time required to make seemingly simple fixes. They were a small team with a desire to improve and sought an efficient way to do it.
Like many teams we talk with, the Gigpro team came across the “Accelerate” book by Nicole Forsgren, PhD, Jez Humble and Gene Kim. They used it as inspiration to start tracking the DORA metrics and understand what wasn’t working.
“The DORA metrics and Sleuth helped us get insight into that, explain it to people who are not in engineering, make improvements, and prove that we were getting better,” Tucker says.
Rick Cabrera, Gigpro’s VP of engineering, adds, “The DORA metrics data from Sleuth allowed us to zero in on improvement opportunities, and then march toward them in iteration, which fed into things like more frequent releases and smaller batches.”
Visibility drives desirable behaviors
With Sleuth in place, Gigpro saw a breakdown of where they spent their time – things like time spent iterating on a design before coding, and between coding and release. That accomplished two key things:
- It helped the business as a whole understand the time that goes into different types of requests, and answered questions about why certain fixes aren’t as easy to implement as they seem.
“We could look at our lead time metrics, which told us how many days a change would take. And that really helped product teams to plan better,” Tucker says. “It created business-wide buy-in for tracking and caring about DORA metrics.”
And with that level of trust and understanding across the business, engineering has the space and freedom to invest time in hygiene and maintenance improvements that ultimately help the team deliver faster.
“When you can show the business that the enhancements we made to our build process to make it faster to deploy code reduced this time by 20 percent, it's a clear metric and a clear level of understanding that time spent on development will pay dividends into the foreseeable future,” Rick says.
- It motivated the team and drove behaviors for success. When developers saw that they had made a 10 or 15 percent improvement from one sprint to the next, it helped instill the behaviors required to be a successful team. From there, the trickle-down effect kicked in.
The team went from deploying once every two weeks to deploying once a day. Their batch sizes shrank, and their review times improved – and they knew they had improved because they had the data in Sleuth to prove it.
“Small achievements over and over again feel good and make us confident,” Tucker says. “And we have the data to back it up that we are actually improving.”
Incorporating Sleuth into existing processes
With Sleuth’s engineering efficiency data at the ready, Gigpro has incorporated reviewing and reporting on the DORA metrics into its sprint review process. The team regularly assesses frequency metrics, including batch size, and lead time metrics, including coding time, review lag, review time, and deploy time.
“Every sprint, we're looking at how it compares to the last sprint,” Rick says. “I like to think of data as leading you to questions, not always answers. Did our code review time lag? Why did that happen? What does this data tell us? And does it make sense?”
If a metric takes a sharp turn in the wrong direction, it's an opportunity to investigate the anomaly. For example, an increase in review time led the team to dig into the data and realize it was because a key contributor was on vacation during that time.
“In the sprint review, we'll also cover our batch size for the sprint,” Rick explains. “Maybe it went up by 10%, but our gigantic sizes went down by 50%. So we've made improvements on the most extreme side, but we want to continue to break those down. We highlight those things in the retro, and for our next grooming and planning, we make sure we break down those tickets so we can keep the batch sizes small.”
Business stakeholders also participate in sprint reviews, so they can see how the team's health compares to prior sprints.
Adjacent benefits
As with our other customers, like Puma, Gigpro realized unexpected benefits in the process of implementing Sleuth and measuring the DORA metrics to improve their engineering efficiency.
“For example, it’s not to say we wouldn't have used feature flags if we didn't have Sleuth or paid attention to DORA metrics,” Rick says, “but it puts more pressure on doing those things. Combined with the business visibility of how time spent on different areas is beneficial and has ROI, it makes it easier to prioritize finding a tool for feature flags, finding the best way to implement them, and having talks about how you do this in code. It has had the adjacent benefit of basically upskilling the team.”
The team has seen cultural improvements, too.
“The feeling has changed and releases are less stressful,” Tucker says. “It feels like we're on an efficient, lean, mean team now rather than a slow, heavy team. That feels good to people, and it’s another improvement outside of just the metrics that helps the team work better.”