We had a PR problem at Atlassian. Our median time from PR to merge had crept up to over 3 days, with engineers waiting an average of 18 hours just to get that first review comment. That’s 18 hours of context-switching or sitting idle, neither of which we can afford at the scale of Atlassian.
And while developers can now generate code and PRs faster than ever, we still only have a fixed number of senior engineers able to review them. Our bottleneck for assessing whether a PR is production-ready and meets the original requirements was only getting worse.
We needed help, and it was our Developer AI team that came to the rescue with Rovo Dev: Atlassian’s developer agent with context of the entire Atlassian platform, available in Bitbucket, GitHub, IDEs, and the CLI.
This context was Rovo Dev’s secret sauce, helping us immediately accelerate code review in three ways:
- Providing instant, actionable feedback on overall code quality, thanks to cutting-edge LLMs
- Enforcing engineering standards, including lint rules, preferred design patterns, security and compliance policies, and API contracts tailored to our organization
- Ensuring PRs meet their acceptance criteria by connecting them to Jira work items
Since we built and adopted Rovo Dev in early 2025, we've cut our PR cycle time by 45%, more than a full day! Rovo Dev is now the automated first reviewer on every PR, reducing wait times for reviews, shrinking our overall PR backlog, and freeing our team to focus on reviewing larger, more critical code changes.
Rovo Dev also supercharged new Atlassian engineers; those who used Rovo Dev merged their first PR five days faster than those who didn’t.
Here’s how the Dev AI team did it.
Three challenges (and opportunities) in code review
The common theme across the factors slowing down code review at scale was that so much of it was manual. No individual, not even a senior technical lead, could hold all the knowledge needed to consistently write and review code for every team and every project.
For us, the three biggest challenges were:
- Overall code quality checks: Code reviews often got stuck in back-and-forth over small issues, such as general code consistency, typos, and other stylistic nits. Despite being simple, easy-to-fix problems, they added up and lengthened the review process.
- Not following engineering standards: At our scale, it’s not uncommon for engineers to forget to follow certain engineering standards, even if the code they commit technically works.
- Not meeting the acceptance criteria: It’s also not uncommon for engineers to forget to check whether the code meets the feature requirements.
Our Developer AI team recognized these challenges as opportunities. Many instances of these problems are easy-to-fix mistakes; engineering standards, for example, are often harder to find (and document and keep up to date!) than they are to follow.
Let’s look at an example of how the team built Rovo Dev to turn each challenge into an opportunity.
Immediate, effortless feedback
We noticed that 26% of total PR cycle time was spent waiting for the first code review comment – an average of 18 hours per PR. With Rovo Dev, that wait dropped to zero.
Built on cutting-edge LLMs, Rovo Dev automatically reviews every PR commit for general-purpose issues such as typos, logic errors, anti-patterns, and even potential bugs. It also surfaces code suggestions to fix these issues, which engineers can accept with one click.
Incorporate engineering standards into code review
One of the gaps in most AI code review tools is that they lack context on an organization's development workflow. Critically, this includes the engineering organization's coding standards, such as those covering security, compliance, and accessibility.
At the same time, enforcing those standards is a common struggle for enterprise teams, and Atlassian is no exception.
Rovo Dev solved this problem for us on both ends. As part of automatic code review, Rovo Dev pulls from Confluence documents that capture customized standards for the repo, the project, or the whole company, and checks that PRs conform to those stated standards.
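One way to picture "standards distilled from docs, applied to a diff" is as a rule set checked against changed lines. The sketch below is a hypothetical simplification: the `StandardRule` type and both rules are invented for illustration, and the actual product applies standards written in prose, not hand-curated regexes.

```python
import re
from dataclasses import dataclass

@dataclass
class StandardRule:
    """One engineering standard, as it might be distilled from a team doc."""
    name: str
    pattern: str   # regex that flags a violation in a changed line
    advice: str

# Hypothetical rules; in practice these would be curated per repo or org.
RULES = [
    StandardRule("no-print-logging", r"\bprint\(",
                 "Use the structured logger instead of print()"),
    StandardRule("no-http", r"http://",
                 "Internal services must be called over https"),
]

def check_standards(changed_lines: list[str],
                    rules: list[StandardRule]) -> list[tuple[int, str, str]]:
    """Return (line number, rule name, advice) for each violation found."""
    violations = []
    for i, line in enumerate(changed_lines, start=1):
        for rule in rules:
            if re.search(rule.pattern, line):
                violations.append((i, rule.name, rule.advice))
    return violations
```

The design choice worth noting is that each finding links back to a named standard, so the PR comment can cite the policy (and its source doc) rather than just flagging a line.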
Incorporate acceptance criteria from Jira
Another gap in most AI code review tools is that they can't check whether the code does what we promised it would do.
Naturally, we track our acceptance criteria and business objectives in Jira work items, and since Rovo Dev is part of the Atlassian platform, it can check PRs against those same work items and notify the PR author of criteria that haven't been met. This ensures our code changes actually meet the project's goals, and reduces the breadth of manual code review.
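The flow above can be sketched as "compare a work item's criteria against the PR's content and report what's missing." Everything below is a toy stand-in: the `WorkItem` type, the work-item key, and the keyword-overlap heuristic are all invented, a crude proxy for the semantic matching an LLM-based reviewer would actually perform.

```python
import re
from dataclasses import dataclass, field

STOPWORDS = {"the", "a", "an", "is", "are", "to", "of", "and", "when", "should"}

def key_terms(text: str) -> set[str]:
    """Lowercased content words of a sentence, minus common stopwords."""
    return {w for w in re.findall(r"[a-z0-9]+", text.lower()) if w not in STOPWORDS}

@dataclass
class WorkItem:
    """A stand-in for a Jira work item carrying acceptance criteria."""
    key: str
    criteria: list[str] = field(default_factory=list)

def unmet_criteria(item: WorkItem, pr_text: str) -> list[str]:
    """Flag criteria that share no key terms with the PR description."""
    pr_terms = key_terms(pr_text)
    return [c for c in item.criteria if not (key_terms(c) & pr_terms)]
```

For example, a PR described as "Adds CSV export for reports" would satisfy a criterion like "Export report as CSV" but leave "Show error on invalid date" flagged for the author to address.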
Looking forward: AI for the whole software lifecycle
Rovo Dev is now an essential part of our code review process, and it’s already transforming how our customers review and ship code too. In beta, Rovo Dev cut customer PR cycle time by 32%, or by more than a day on average (from 4.18 to 2.85 days)!
As the front-line, AI-powered code reviewer, Rovo Dev is the first to provide immediate feedback to our engineers, speeding up code review cycles while improving code quality, enforcing our company’s engineering standards, and ensuring that features always meet their acceptance criteria.
And now that Rovo Dev is generally available, we're excited to iterate on and improve it in the coming months, with more customized support for enforcing complex standards, improved review quality, and automated quality checks extended to more stages of the development lifecycle, such as CI/CD.
Learn more about how you can try Rovo Dev in your Bitbucket and GitHub repositories today.
