Cover of the 2024 Accelerate State of DevOps (DORA) Report, marking a decade of research into software development performance

Shocking DORA Report: Your AI Code May Be Making Things Worse!

In a stunning revelation from Google’s latest DevOps research, the massive rush to adopt AI in software development might be backfiring. The landmark 10th annual DORA report dropped a bombshell: while 75% of developers report feeling more productive with AI tools, their actual software delivery performance is declining. Even more alarming, delivery stability falls as AI adoption rises, and teams leaning on AI-focused internal developer platforms show a 14% decrease in change stability.

“AI adoption brings some detrimental effects,” warns the report. “We have observed reductions to software delivery performance, and the effect on product performance is uncertain.” This concerning trend comes as 81% of organizations are racing to integrate AI into their development processes, despite nearly 40% of developers admitting they have little to no trust in AI-generated code.

What is DORA?

The DevOps Research and Assessment (DORA) team, now part of Google Cloud, has been the industry’s most trusted voice in measuring software development performance for the past decade. Founded by Dr. Nicole Forsgren, Jez Humble, and Gene Kim, DORA’s annual reports have become the gold standard for understanding what really works in software development. With data from over 39,000 professionals across ten years, when DORA raises a red flag about AI adoption, the industry takes notice.

The Unsettling Findings

While the DORA report does not focus solely on AI, here are some of its more disturbing AI-related findings.

AI’s Hidden Costs

While developers are enthusiastically embracing AI for coding (74.9%), summarizing information (71.2%), and documentation (60.8%), the data reveals a troubling paradox. “The negative impact on delivery stability is larger,” the report warns, showing “an estimated 7.2% reduction for every 25% increase in AI adoption.”
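To make that figure concrete, here is a minimal back-of-the-envelope sketch. It assumes the report’s estimate scales linearly with adoption, which is our simplification rather than anything DORA states, and the function name is ours:

```python
# Illustrative only: the 7.2%-per-25-points figure is from the 2024 DORA
# report; treating it as linear across any adoption increase is an assumption.
STABILITY_DROP_PER_25_POINTS = 0.072

def estimated_stability_change(adoption_increase_points: float) -> float:
    """Estimated fractional drop in delivery stability for a given
    increase in AI adoption, measured in percentage points."""
    return (adoption_increase_points / 25.0) * STABILITY_DROP_PER_25_POINTS

# Example: a team that moves from 25% to 75% AI adoption (+50 points)
print(f"{estimated_stability_change(50):.1%} estimated stability reduction")
# -> 14.4% estimated stability reduction
```

Under that (assumed) linear reading, even a modest ramp-up in AI usage implies a double-digit hit to stability.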

Trust Gap Widens

Perhaps most concerning is the trust deficit: “39.2% of respondents reported having little or no trust in AI.” This skepticism appears well-founded, as the data shows increasing instability in software delivery as AI adoption rises.

Figure: developer trust levels in AI-generated code, with most respondents expressing moderate to low trust (error bars show 89% uncertainty intervals). Source: 2024 DORA Report.

The Platform Problem

Even more surprising, the study found that organizations’ attempts to control AI through internal developer platforms might be compounding the problems:

  • An 8% boost in individual productivity
  • A shocking 14% decrease in change stability
  • An 8% decrease in deployment throughput

The Human Factor

“Unstable organizational priorities lead to meaningful decreases in productivity and substantial increases in burnout,” the report reveals, suggesting that the rush to adopt AI might be creating organizational chaos.

Warning Signs for Leadership

The report identified several red flags that leadership should watch for:

  • Increasing code instability despite faster development
  • Growing disconnect between perceived and actual productivity
  • Rising burnout rates in teams heavily relying on AI
  • Deteriorating software delivery metrics despite increased AI usage

Recommendations

Despite these alarming findings, DORA’s report isn’t all doom and gloom. It offers several critical recommendations for organizations:

  1. Slow Down the AI Rush: “Adopting AI at scale might not be as easy as pressing play,” the report cautions. Organizations need to “Define a clear AI mission and policies” rather than rushing to adopt AI everywhere.
  2. Focus on Fundamentals: “The best teams are those that achieve elite improvement, not necessarily elite performance.” The report emphasizes focusing on core development practices rather than chasing AI integration.
  3. Trust But Verify: Organizations should “Implement a measurement framework that evaluates AI not by sheer adoption but by meaningful downstream impacts.”
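The third recommendation is the most actionable. As one hedged illustration of what “meaningful downstream impacts” could look like in practice, the sketch below compares a core DORA delivery metric, change failure rate, between AI-assisted and other changes; the record fields and helper names are hypothetical, not taken from the report:

```python
# A minimal sketch of outcome-based AI measurement, assuming you already log
# deployments and incidents. Field names and thresholds are illustrative.
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class Deployment:
    lead_time: timedelta     # commit merged -> running in production
    caused_incident: bool    # did this change trigger a rollback or incident?
    ai_assisted: bool        # was AI used to author the change?

def change_failure_rate(deploys: list[Deployment]) -> float:
    """Share of deployments that caused an incident (a core DORA metric)."""
    return sum(d.caused_incident for d in deploys) / len(deploys)

def compare_ai_impact(deploys: list[Deployment]) -> dict[str, float]:
    """Judge AI by delivery outcomes, not adoption: compare change failure
    rate for AI-assisted changes against everything else."""
    ai = [d for d in deploys if d.ai_assisted]
    rest = [d for d in deploys if not d.ai_assisted]
    return {
        "ai_assisted_cfr": change_failure_rate(ai) if ai else float("nan"),
        "other_cfr": change_failure_rate(rest) if rest else float("nan"),
    }

# Example usage with toy data
history = [
    Deployment(timedelta(hours=4), caused_incident=False, ai_assisted=True),
    Deployment(timedelta(days=1), caused_incident=True, ai_assisted=True),
    Deployment(timedelta(hours=6), caused_incident=False, ai_assisted=False),
]
print(compare_ai_impact(history))
```

The point of the comparison is exactly what the report asks for: evaluating AI by its effect on delivery metrics rather than by how widely it has been adopted.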

Conclusion

The 2024 DORA Report serves as a wake-up call for the software development industry. While AI tools promise increased productivity and efficiency, the data suggests a more complex and potentially troubling reality. As one interview participant in the study noted, “We are, as a company, under pressure to deliver. So, all of these, like, nice shiny things… we’re focusing on delivery, not quality.”

The message is clear: organizations need to approach AI adoption with far more caution and strategic thinking than many currently do. The technology’s promise remains bright, but its perils are becoming increasingly apparent. As the report concludes, “The adoption of AI is just starting. Some benefits and detriments may take time to materialize, either due to the inherent nature of AI’s impact or the learning curve associated with its effective utilization.”
